Output resolution, whether expressed as PPI, DPI, or LPI, is commonly misunderstood. Here is a short, non-technical, non-scientific, simple explanation.
Terms like PPI, DPI, and LPI describe how resolution is measured, and the right one depends on the device. You will often hear them incorrectly used interchangeably. The Byzantine jumble of acronyms and jargon used to describe resolution is crazy. But never fear: memorize these distinctions, and you will understand more about resolution than most so-called experts.
When measuring DESKTOP PRINTER OUTPUT RESOLUTION, use dots per inch (DPI). DPI refers to the resolution of an output device such as a laser printer or inkjet printer. For example, a standard office laser printer prints at 300 dpi, while a standard imagesetter outputs at 2,540 dpi.
When measuring PROFESSIONAL PRESS OUTPUT RESOLUTION, use lines per inch (LPI). LPI is the resolution (line screen) of a professional printing press, which determines how much detail the press (and the paper) can hold.
When measuring MONITOR/SCANNER OUTPUT RESOLUTION, use pixels per inch (PPI). PPI is the resolution (or detail) of an image in a scanning or graphics program, and the resolution of computer monitors. For instance, you might make it a practice to generate 72 PPI photos for the Web.
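To see how PPI actually behaves, here is a minimal sketch in Python. It uses only basic arithmetic (the function name and the example dimensions are my own, not from the article): the printed size of an image is simply its pixel dimensions divided by the PPI you assign.

```python
def print_size_inches(width_px, height_px, ppi):
    """Printed dimensions in inches for an image at a given pixels-per-inch."""
    return width_px / ppi, height_px / ppi

# A hypothetical 3000 x 2000 pixel photo at 300 PPI prints at 10 x ~6.67 inches.
w, h = print_size_inches(3000, 2000, 300)
print(w, h)
```

Note that changing the PPI does not change the pixels themselves, only how large each pixel is rendered: the same photo at 72 PPI would print much larger, at roughly 41.7 by 27.8 inches.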