Output resolution (PPI, DPI, and LPI) is commonly misunderstood. Here is a short, non-technical, non-scientific, simple explanation. (I am sure someone will leave comments about what else could have been said, but this is designed to get people started. If you want more information, it's readily available with basic online searching skills.)
Terms like PPI, DPI, and LPI describe how resolution is measured on different devices. You will often hear them used interchangeably, which is incorrect. The Byzantine jumble of acronyms and jargon used to describe resolution is crazy. But never fear. Just memorize the three distinctions below, and you will understand more about resolution than most so-called experts.
When measuring DESKTOP PRINTER OUTPUT RESOLUTION, use dots per inch. DPI refers to the resolution of an output device such as a laser printer or imagesetter. For example, a standard office laser printer is 300 dpi, while a standard imagesetter is 2,540 dpi.
When measuring PROFESSIONAL PRESS OUTPUT RESOLUTION, use lines per inch. LPI is the resolution (line screen) of a professional printing press, which determines how much detail the press (and the paper) can hold.
When measuring MONITOR/SCANNER OUTPUT RESOLUTION, use pixels per inch. PPI is the resolution (or detail) of an image in a scanning or graphics program, and the resolution of computer monitors. For instance, you might make it a practice to generate 72 PPI photos for the Web.
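The practical payoff of PPI is simple arithmetic: pixel dimensions divided by PPI give the physical print size in inches. Here is a minimal sketch of that calculation (the function name and the sample 3000 x 2000 pixel photo are my own illustrative choices, not from this post):

```python
def print_size_inches(pixels_wide, pixels_high, ppi):
    """Physical size, in inches, of an image printed at a given pixels-per-inch."""
    return pixels_wide / ppi, pixels_high / ppi

# A hypothetical 3000 x 2000 pixel photo printed at 300 PPI
# comes out to a 10 x 6.67 inch print.
width_in, height_in = print_size_inches(3000, 2000, 300)
print(f"{width_in:.2f} x {height_in:.2f} inches")
```

The same formula works in reverse: to fill a target print size, multiply inches by the desired PPI to find how many pixels you need.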