When you make a photograph using a digital camera – particularly a DSLR – what happens next?
Most people don’t think about this. They just press the shutter, download the image to their computer, run it through some sort of cataloging or processing software, and finally share it with someone.
Whether we think about it or not, a bunch of stuff is going on in the camera as soon as we press the shutter.
Using image processing algorithms, miniature computers within the camera start doing all sorts of things to the image. The processor is responsible for gathering the raw image data and then quickly applying everything from Bayer pattern demosaicing and real-time chromatic aberration correction to picture controls, sharpening filters, and color and gamma correction, to name a few.
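To make one of those steps concrete, here is a minimal sketch of Bayer pattern demosaicing — the step that turns the sensor's one-color-per-pixel mosaic back into a full RGB image. This is a simple bilinear-style interpolation written in Python with NumPy; real camera processors use far more sophisticated, hardware-accelerated algorithms, and the function names here are my own illustration, not any manufacturer's API.

```python
import numpy as np

def bayer_mosaic(rgb):
    """Sample an RGB image through an RGGB Bayer pattern (one value per pixel),
    mimicking what the sensor actually records before demosaicing."""
    h, w, _ = rgb.shape
    mosaic = np.zeros((h, w))
    mosaic[0::2, 0::2] = rgb[0::2, 0::2, 0]  # red sites
    mosaic[0::2, 1::2] = rgb[0::2, 1::2, 1]  # green sites (even rows)
    mosaic[1::2, 0::2] = rgb[1::2, 0::2, 1]  # green sites (odd rows)
    mosaic[1::2, 1::2] = rgb[1::2, 1::2, 2]  # blue sites
    return mosaic

def demosaic_bilinear(mosaic):
    """Rebuild full RGB by averaging each channel's known samples over a
    3x3 neighborhood (normalized convolution)."""
    h, w = mosaic.shape
    # Mark which pixels carry which color in an RGGB layout.
    masks = np.zeros((h, w, 3))
    masks[0::2, 0::2, 0] = 1
    masks[0::2, 1::2, 1] = 1
    masks[1::2, 0::2, 1] = 1
    masks[1::2, 1::2, 2] = 1

    def box3(x):
        # Sum each pixel's 3x3 neighborhood, using reflective padding at edges.
        p = np.pad(x, 1, mode="reflect")
        return sum(p[i:i + h, j:j + w] for i in range(3) for j in range(3))

    out = np.zeros((h, w, 3))
    for c in range(3):
        known = mosaic * masks[:, :, c]
        # Neighborhood sum of known samples / count of known samples.
        out[:, :, c] = box3(known) / box3(masks[:, :, c])
    return out
```

Even this toy version shows why the processor matters: demosaicing touches every pixel, and the camera does it (plus all the other corrections) in the time between shutter presses.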
Each company has its own processing technology. Nikon calls theirs EXPEED. Canon calls theirs DIGIC. In each case, the processor in the camera impacts everything from autofocus to card read and write times.
The signal processing in modern cameras sets them apart from even far more expensive models built 10 years ago.
Features like face detection, 14-bit imaging, and improved noise reduction at high ISO settings all happen because of the image processor.
In the future, look for as many as four such processors in each camera. As manufacturers find ways to make bigger, more powerful, faster processors that can work in tandem with each other, cameras will just get faster and better.
In my opinion, we’ve reached the effective limit of sensor technology with regard to megapixels. In the future, it will be less about the sensor and more about the processor(s).
This Post Sponsored by Animoto