Once in a while I choose to write something technical about photography. In this post, I will discuss the science behind resolution and how it relates to sharpness and detail. I will also try to clear up some very common misconceptions. My background is in Physics, where I concentrated on optics and digital signal processing. I won’t go deep into the maths; instead, I will explain things in such a way that even those who fear numbers will understand.
First things first. When photographers say resolution, they usually mean the number of megapixels in a camera. For example, a 36Mp camera has a higher resolution compared to a 24Mp camera. This is a very simplistic way of looking at things. The assumption is that a 36Mp camera is capable of capturing more details given the same subject. Let’s see why this is not always true:
Suppose you are photographing very fine parallel lines with a gap of 1mm between them. If the pixel size of your sensor is 2mm wide (yeah, that’s really HUGE), it won’t be able to distinguish two adjacent lines because it is too large: it will “step” over two lines at a time and record them as if they were just one big line. The number of lines your sensor detects will be less than the total number of lines in the actual subject. In fact, it will detect less than half the number of lines. If your pixel is 1mm wide, it will detect more lines but still not all of them (i.e. a pixel that falls entirely in a gap will detect no line at all). This process of measuring the subject at discrete pixel positions is called sampling. According to the Nyquist–Shannon sampling theorem, in order to capture all the lines we must sample at least twice as finely as the finest detail in our subject, meaning our sensor pixel width must be 1/2mm (0.5mm) or smaller. The capacity to distinguish individual lines is called resolving power. When we undersample (bigger sensor pixels), we lose detail, and worse, the fine detail we fail to capture can reappear as false, coarser patterns. This misrepresentation of detail is called aliasing (moiré is the familiar example), and the overall result is a blurrier, less faithful image.
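The twice-as-fine sampling rule can be demonstrated numerically. Here is a minimal sketch in Python; the line spacing of 1.25mm (0.8 cycles/mm), the 40mm strip length, and the pixel widths are my illustrative choices, not figures from the text. We sample a line pattern at different pixel spacings and measure the spatial frequency the samples actually record:

```python
import numpy as np

# Illustrative test pattern: lines 1.25 mm apart = 0.8 cycles/mm.
TRUE_FREQ = 0.8  # cycles per mm

def detected_freq(pixel_mm, length_mm=40):
    """Sample the pattern at pixel positions and return the dominant
    spatial frequency actually visible in the samples (via an FFT)."""
    x = np.arange(0, length_mm, pixel_mm)        # pixel positions in mm
    samples = np.cos(2 * np.pi * TRUE_FREQ * x)  # what each pixel records
    spectrum = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), d=pixel_mm)
    return freqs[np.argmax(spectrum[1:]) + 1]    # skip the DC term

for pixel_mm in (2.0, 1.0, 0.5):
    print(f"{pixel_mm} mm pixels see the pattern at "
          f"{detected_freq(pixel_mm):.2f} cycles/mm")
```

Only the 0.5mm pixels, which sample more than twice per cycle of the pattern, report the true 0.8 cycles/mm; the 2mm and 1mm pixels both report a false, coarser pattern at 0.2 cycles/mm. That false pattern is aliasing in action.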
From the example above, it is quite obvious that given a fixed sensor area, we need to have smaller, tighter packed pixels to have better sampling rate. To cram 36 million pixels in a full frame sensor, the pixels have to be smaller compared to cramming (only) 24 million pixels into the same full frame sensor area. The 36Mp sensor, by virtue of the smaller pixels, has better sampling and thus result in more details. We therefore say that the 36Mp sensor is capable of more resolving power vs the 24Mp sensor. The 36Mp sensor has higher pixel density (number of pixels per area) than the 24Mp sensor. Higher pixel density means more resolving power.
Now here’s why this isn’t always true in photography. In the case of the 36Mp and 24Mp sensors, we are comparing two full frame sensors. Not all sensors are full frame though. APS-C and m4/3 sensors are smaller. A 16Mp APS-C sensor (as in the Nikon D7000 or Pentax K5) contains pixels that are almost exactly the same width as those of a 36Mp full frame sensor (Nikon D800): roughly 4.8µm vs 4.9µm. They have essentially the same pixel density. Therefore the 16Mp sensor has the same capacity to resolve details as its 36Mp big brother! They have the same resolving power! It also follows that the 16Mp APS-C has better resolving power than the 24Mp full frame sensor!
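You can check the pixel-size claim with a back-of-the-envelope calculation. A sketch, assuming nominal sensor dimensions of 36×24mm for full frame and 23.6×15.6mm for Nikon DX APS-C, and square pixels tiling the whole area:

```python
import math

def pixel_pitch_um(width_mm, height_mm, megapixels):
    """Approximate pixel width in microns, assuming square pixels
    that tile the whole sensor area."""
    area_um2 = (width_mm * 1000) * (height_mm * 1000)
    return math.sqrt(area_um2 / (megapixels * 1e6))

print(f"36Mp full frame: {pixel_pitch_um(36.0, 24.0, 36):.1f} um")  # D800-class
print(f"24Mp full frame: {pixel_pitch_um(36.0, 24.0, 24):.1f} um")
print(f"16Mp APS-C:      {pixel_pitch_um(23.6, 15.6, 16):.1f} um")  # D7000/K5-class
```

The 16Mp APS-C pitch (~4.8µm) is essentially the same as the 36Mp full frame’s (~4.9µm), so they share the same pixel density and resolving power, while the 24Mp full frame sits at a coarser ~6.0µm.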
A bit of explanation is required when comparing the resolving power of sensors of different sizes. As we have mentioned, a 16Mp APS-C has more resolving power than a 24Mp full frame. The assumption is that everything else is constant: both are using the same lens (i.e. the same focal length). The same focal length results in the same optical magnification (another factor that affects resolution), although it gives a narrower angle of view on the smaller sensor. If you instead keep the angle of view constant, a 24Mp full frame will of course resolve more detail than a 16Mp APS-C.
Now before you decide to replace your 24Mp Canon 5D3 with an 18Mp Canon 7D, read further because the story does not end here. Sensor resolution is not everything.
Let’s properly define resolution first:
1. Resolution or resolving power is the capacity to detect details.
2. Image size is NOT necessarily equivalent to resolution. We have seen from the example above that a 16Mp APS-C sensor has better resolving power than a 24Mp full frame sensor even if the latter has a bigger image size.
Make sure you understand the concepts above. Photographers mistakenly equate megapixels with resolution. They are not the same, and as our 16Mp vs 24Mp example shows, the assumption can be downright wrong! What the megapixel count will ALWAYS tell you is image size: more megapixels means larger image dimensions and more disk space.
Let’s continue. Why should you not replace your 24Mp full frame 5D3 with an 18Mp 7D? Because pixel density is not everything. There is another factor involved in resolution, and that’s your lens. I am not talking about glass quality per se. I am talking about lens aperture.
Assume for a moment that all lenses are created equal. That they are perfectly “sharp”. Here’s how aperture comes into play. You should know by now that an aperture of f8 has a smaller opening than f4. Here’s the problem: light “bends”. It bends when it passes through small openings. We call this diffraction. The tighter the opening, the more light diffracts. It is like a hose with running water. If you squeeze the hose, the water will spread (bend). Squeeze it tighter and the water spreads even more.
How does diffraction affect resolution? Well, it “spreads” the details. In our line example above, instead of the lines being 1mm apart, because of diffraction the lines become thicker and thus the gaps become narrower. It will come to a point where individual lines appear to merge and two lines become one. As diffraction worsens, four lines become one and so on.
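This “spreading” has a standard measure: a point of light passing through a circular aperture blurs into an Airy disk whose diameter is roughly 2.44 times the wavelength times the f-number. A quick sketch; the 550nm green wavelength is my illustrative choice, not a figure from the text:

```python
# Diameter of the Airy disk (out to the first dark ring) for a
# circular aperture: d ~ 2.44 * wavelength * f_number
WAVELENGTH_UM = 0.55  # green light, ~550 nm

def airy_disk_um(f_number):
    """Approximate diffraction blur spot diameter in microns."""
    return 2.44 * WAVELENGTH_UM * f_number

for n in (4, 8, 11, 22):
    print(f"f{n}: blur spot ~{airy_disk_um(n):.1f} um across")
```

At f22 the blur spot is roughly 30µm wide, about six times the ~4.9µm pixels of a 36Mp full frame sensor, so many adjacent pixels end up recording the same smeared detail.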
In photography, as your aperture becomes narrower, diffraction worsens. On a full frame sensor, for example, diffraction at f11 has already worsened to the point that the maximum resolving power is equivalent to about 16Mp. It means that even if your sensor is 36Mp, the lines have spread so much that the sensor is no longer capable of detecting every single one of them. Your 36Mp sensor has effectively dropped in resolution as if it were only 16Mp. If the aperture narrows further to f22, diffraction becomes so bad that most of the lines will have merged, and your 36Mp full frame sensor has now dropped to the equivalent of a 4Mp sensor! To fully utilise a 36Mp full frame sensor, one has to shoot at f5.6 or wider. At f5.6, a perfect lens is capable of fully utilising a 60Mp full frame sensor; at f8, around 30Mp. We therefore say that the Nikon D800’s 36Mp sensor is diffraction limited at f8. Compare this with the Nikon D700, which is diffraction limited at f11 because it only has a 12Mp sensor.
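These figures follow a simple scaling: since the blur spot diameter grows linearly with the f-number, the resolvable pixel count falls as one over the f-number squared. A sketch anchored on the f5.6 ≈ 60Mp full frame figure above (an approximation, not an exact optical model):

```python
ANCHOR_F, ANCHOR_MP = 5.6, 60  # full frame, perfect lens

def diffraction_limited_mp(f_number):
    """Approximate full frame resolving power in megapixels,
    scaling as 1/N^2 from the f5.6 anchor."""
    return ANCHOR_MP * (ANCHOR_F / f_number) ** 2

for n in (5.6, 8, 11, 16, 22):
    print(f"f{n}: ~{diffraction_limited_mp(n):.0f} Mp")
```

The 1/N² rule reproduces the numbers in the text: about 30Mp at f8, 16Mp at f11, and 4Mp at f22.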
So what now? Well, it means that if you are shooting landscapes with a D800, which typically calls for apertures of f11-f22 for maximum depth of field, you are just wasting disk space. Your image size remains the same but you are not really resolving any more detail. Again, remember that image size is a direct result of the number of pixels; it does not change. Resolution, however, is affected by aperture. If you are shooting at f22 with a D800, you are no better off than shooting with an older D700 at the same aperture. Both will have the equivalent resolving power of 4Mp. In this case, you can use Photoshop to resize the D700’s 12Mp image up to 36Mp without losing or gaining anything vs the D800’s image. At f11 and narrower, the D800 has no resolution advantage whatsoever over the D700. Further to that, the D800 has the disadvantage of wasting disk space.
Ok, let’s go back to why you should not replace your 5D3 with a 7D (yet). Let’s use our water hose analogy again. Think of the APS-C sensor as a bucket and the full frame sensor as a much bigger bucket. Point the hose with running water at them. Now squeeze the hose. Notice that the smaller bucket will soon fail to catch all of the spreading water; some of the water coming from the hose will now miss it. The bigger bucket will still catch all of the water at the same “squeeze strength”. We can say that the smaller bucket has lost some water (resolution). The same is true in photography. This is why landscape photographers who shoot with large format film cameras can shoot at f64 and still get very sharp images. You can’t do this with your measly full frame. This is also why pinhole photography with 135 film sucks. So how about that APS-C sensor? At f22, an APS-C sensor is only capable of about 2Mp maximum resolution vs 4Mp for a full frame sensor. Does this mean that APS-C is inferior to full frame? Not necessarily. If you shoot with APS-C, you only need to shoot at f16 to get the same depth of field as a full frame sensor at f22. They just balance out, really.
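The balancing act can be sketched with the crop factor (here Nikon’s 1.5× APS-C; Canon’s is 1.6×). A smaller sensor catches a crop-factor-squared smaller share of the diffracted light, but it also reaches the same depth of field at a wider aperture (the f-number divided by the crop factor). This sketch reuses the f5.6 ≈ 60Mp full frame anchor from the text:

```python
CROP = 1.5                     # Nikon DX APS-C crop factor
ANCHOR_F, ANCHOR_MP = 5.6, 60  # full frame anchor

def effective_mp(f_number, crop=1.0):
    """Approximate diffraction-limited Mp, scaling as 1/N^2 from the
    anchor and as 1/crop^2 for the smaller sensor area."""
    return ANCHOR_MP * (ANCHOR_F / f_number) ** 2 / crop ** 2

def equivalent_aperture(f_number_apsc, crop=CROP):
    """Full frame aperture giving roughly the same depth of field."""
    return f_number_apsc * crop

print(f"APS-C at f22:      ~{effective_mp(22, CROP):.1f} Mp")
print(f"Full frame at f22: ~{effective_mp(22):.1f} Mp")
print(f"APS-C f16 is roughly full frame "
      f"f{equivalent_aperture(16):.0f} for depth of field")
print(f"APS-C at f16:      ~{effective_mp(16, CROP):.1f} Mp")
```

APS-C at f16 and full frame at f22 end up with roughly the same depth of field and roughly the same resolving power, which is exactly the trade-off described above.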
Aperture affects resolution in ways that most photographers are not even aware of. In the case of the Nikon D800, the 36Mp sensor is just a waste of disk space starting at f11. Whether this really affects the final output depends on how close you want to pixel peep or how big you print. Fact is, at tighter apertures a D700 at 12Mp, with a bit of Photoshop magic, is as capable as a D800. It may even surpass the D800 by virtue of its better light gathering properties due to its larger pixels.
I hope this post has made you think. Until next time.
Thanks to marceldezor for the link (http://www.talkemount.com/showthread.php?t=387).
The shots in there are very good examples of the effects of diffraction. Notice that the 4Mp at f5.6 is showing more details (resolution) than the 16Mp at f22. The latter has been reduced to effectively 2Mp. In this case, the 4Mp f5.6 shot can be upsized to 16Mp in Photoshop and it will produce a better print than the 16Mp f22 image. The 16Mp shot is practically no better than a 4Mp camera at very small apertures.