Understanding the Effects of Diffraction

This post is a continuation of my previous article about resolution and diffraction. I highly suggest reading that one first to gain a basic understanding of these concepts.

One thing that a lot of people still fail to understand is the effect of diffraction on image resolution. A common argument for buying a higher megapixel camera is that it will “always” resolve more detail than a lower megapixel camera. That is true, but only until you hit the diffraction limit. For example, a full frame camera shot at f/16 will not resolve detail beyond roughly 8Mp. That is, a 36Mp D800 will not give more detail than a 12Mp D700 when both are shot at f/16; both will have an effective resolution of only about 8Mp.

To explain this, consider a simple analogy. When you are driving at night in complete darkness, it is very difficult to tell whether an oncoming vehicle is a small car or a big truck judging only by its headlights. The apparent separation between the left and right headlights depends heavily on the distance of the vehicle from you: the headlights appear closer together the farther away the vehicle is. If the vehicle is far enough, the two headlights seem to merge into one and you might think it’s a motorbike instead of a car. The reason is simple: light spreads. The left and right headlights spread until they appear to merge, and by then they become indistinguishable from each other. Diffraction works the same way: it spreads light, and you lose the details. It doesn’t matter if you have two eyes or eight eyes like a spider; you still won’t be able to distinguish two separate headlights if the vehicle is far enough away. In this case, eight eyes are no better than two. Both sets of eyes still see only one headlight, not two. Think of the “number of eyes” as your sensor resolution. Whether you have 8Mp or 2Mp, both cameras will detect only one headlight. Did the 8Mp sensor lose resolution? No, it remained an 8Mp sensor. Did it manage to detect two headlights? No. So in our example, 8Mp is no better than 2Mp at resolving the number of headlights.

The point is that diffraction destroys detail. When there is nothing left to resolve, sensor resolution does not matter. Suppose you have two lines that are very close together: diffraction will spread both lines until they appear to merge into one thick line. If there is only one line to resolve, it does not matter whether you have a 2Mp camera or a 100Mp camera; both will detect one line. The 100Mp camera will of course take more samples of that single line, but it is still just one line. Diffraction does not reduce the sensor’s resolving power; it changes how the subject is presented to the sensor. Diffraction blurs the subject in a way that limits what the sensor can detect.

With that in mind, let us look at practical examples. For a full frame sensor, diffraction at f/8 is enough to blur the subject so that anything higher than approximately 30Mp will not resolve any more detail. The effective resolution drops by roughly half for each stop: at f/11 the limit is about 15Mp, at f/16 about 8Mp, and at f/22 a measly 4Mp. These numbers are approximations and assume a perfect lens; real-world values are lower.

How about smaller sensors like APS-C or m43? The effective resolution drops further with the crop factor. An APS-C sensor shot at f/8 tops out at roughly 15Mp of effective resolution, m43 at roughly 8Mp, and so on.
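These rules of thumb can be sketched with a quick back-of-the-envelope script. This is an illustration only, under common simplifying assumptions: green light at 550 nm, an Airy disk diameter of 2.44·λ·N, two pixels per Airy disk diameter, and the approximate sensor dimensions listed below; real lenses and Bayer sensors will land lower.

```python
# Illustrative diffraction-limit estimate. Assumptions: green light (550 nm),
# Airy disk diameter = 2.44 * wavelength * f-number, and two pixels per
# Airy disk diameter. Real-world effective resolution is lower than this.
WAVELENGTH_UM = 0.55

SENSORS_UM = {  # approximate sensor dimensions in micrometres
    "full frame": (36000, 24000),
    "APS-C":      (23600, 15700),
    "m43":        (17300, 13000),
}

def diffraction_limited_mp(width_um, height_um, f_number):
    pitch_um = 2.44 * WAVELENGTH_UM * f_number / 2  # 2 pixels per Airy disk
    return width_um * height_um / pitch_um ** 2 / 1e6

for name, (w, h) in SENSORS_UM.items():
    limits = ", ".join(
        f"f/{n}: ~{diffraction_limited_mp(w, h, n):.0f}Mp" for n in (8, 11, 16, 22)
    )
    print(f"{name}: {limits}")
```

For full frame this prints roughly 30, 16, 7 and 4 Mp at f/8 through f/22, matching the halving rule above to within rounding, and the smaller formats scale down with sensor area.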

Here are MTF graphs for a Nikon 50/1.4 lens comparing a 16Mp D7000 (crop sensor) with a 36Mp D800 (full frame) at f/5.6 and f/16 respectively. Notice that the resolutions at those settings are very similar.


So what are the implications? If you are a landscape photographer with a 36Mp Nikon D800 and you shoot at f/8, f/11 or maybe f/16 to gain enough depth of field, you are basically wasting disk space. At f/8, your 36Mp sensor is no better than a 30Mp sensor. At f/11 it’s no better than a 16Mp D4. At f/16 it is no better than a very old 12Mp D700. A 36Mp sensor shot at small apertures cannot capture enough detail, yet the image size stays the same and consumes 36Mp worth of disk space. If you shoot at f/16, for example, you are better off shooting with a 12Mp D700. If you want to print as big as a 36Mp camera allows, upsize your 12Mp image in Photoshop to the equivalent of a 36Mp image. The upsized image will not gain any detail, but that doesn’t matter because the 36Mp camera hasn’t resolved any more detail anyway.

A related analogy is that of scanning photos. Good prints are usually done at 300dpi. When scanning photos, it does not make sense to scan higher than that because you won’t gain anything. Scanners are capable of 4800dpi, 7200dpi, or even higher. If you scan a print at 7200dpi you will get a huge image, but with no more detail than a scan at 4800dpi or lower. You could have scanned it at 600dpi and you would not notice any difference. The 7200dpi scan is a waste of time and disk space.
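To put rough numbers on the scanning analogy (illustrative arithmetic only, assuming a 4×6 inch print): pixel counts balloon with scan dpi while the print’s own ~300dpi of detail stays fixed.

```python
# Pixel counts for scanning a 4x6 inch print at different resolutions.
# Detail beyond the print's own ~300 dpi is not recovered; only file size grows.
PRINT_W_IN, PRINT_H_IN = 6, 4

for dpi in (300, 600, 4800, 7200):
    megapixels = PRINT_W_IN * dpi * PRINT_H_IN * dpi / 1e6
    print(f"{dpi:>5} dpi: {megapixels:8.1f} Mp")
```

A 300dpi scan of that print is about 2.2Mp; a 7200dpi scan balloons past 1.2 gigapixels while carrying the same detail.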

Another common argument is that a sensor with lots of megapixels allows more cropping possibilities. Again, that is true only if you are not diffraction limited. Otherwise you could just shoot with a lower Mp camera, upsize the image and then crop and it will make no difference in terms of details.

This is why I have absolutely no interest in the D800, or in insanely high-Mp APS-C cameras like the D7100, K-3 and A6000. I shoot mostly landscape. I stop down to f/11 and sometimes even to f/22. At those f-stops these cameras are just a waste of space, time and processing power. Again, a 36Mp full frame camera does not make sense unless you shoot mostly at f/5.6 and wider. A 24Mp APS-C is stupid unless you mostly shoot at f/5.6 and wider. Manufacturers keep increasing sensor resolution instead of improving noise performance because most photographers are gullible. Megapixels sell.

Having said that, do not be afraid to shoot at smaller f-stops if the shot calls for it. Even 4Mp effective resolution is a lot if you print at reasonable sizes. And since most people never print at all, 4Mp for web viewing is GIGANTIC!

For a more comprehensive explanation of the effects of diffraction refer to this article: http://www.luminous-landscape.com/tutorials/resolution.shtml

Shoot and shop wisely. 🙂


63 thoughts on “Understanding the Effects of Diffraction”

    1. If only his words had any truth in them.
      If he bothered to check a few lens tests and then calculate things from the MTFs, he’d notice how wrong he’s been.

      1. Sorry, I’ll leave it to people who really test lenses and resolution.

        Or let’s compare the D700 at f/16 and the D800 at f/16.


        As you can see there is no diffraction limit as you say, just less and less gain going to a higher-Mp camera, yet we still see a large gain between the D800 and the D700 at f/16.

      2. Something is very wrong with their tests. They are inconsistent. Very inconsistent. Notice in the D700 graph, resolution peaks at f/8. In the D800 though it peaks at f/5.6. If their tests were uniform and correct both cameras would have peaked at f/5.6.

        Learn to analyse before swallowing.

      3. I’ve been analysing the first graph. It’s still inconsistent. Supposing that the 5D3 has achieved max resolution at f/4 which is approx 960foo then the 12Mp D700 should have maxed out at approx 700foo. But here, the D700 reached more than 750foo! The D800 would have peaked at 1400foo but it maxed out at barely 1100foo. The test chart is probably insufficient to measure the true resolution of a 36Mp sensor.

        The graph only covers up to f/8 where only the D800 is diffraction limited at 30Mp max which should have hit 1100foo. Comparing that graph with the last graph is also saying that the Nikon lens is better than the Zeiss?

        Now looking at the last two graphs comparing the D700 and D800, at f/11 the D800 would max out theoretically at 15Mp or around 850foo (close enough) and the D700 (still not diffraction limited at 12Mp) at around 750foo. So that f/11 D700 data point is an experimental error or the lens is crap at that aperture. Basically at f/11 you are comparing 15Mp (D800) vs 12Mp (D700). The graphs, although not that good are almost consistent with theory.

        Still very confusing though.

  1. ” If their tests were uniform and correct both cameras would have peaked at f/5.6. ”

    Why would both the D800 and the D700 peak at the same f-stop?
    Even the reference material you linked to shows that they peak at different f-stops.
    Time to rethink your theory.
    A higher resolution sensor is able to see diffraction start to affect IQ at a lower f-stop because of its resolution (the D800 more accurately measures the f-stop at which diffraction becomes visible).
    The D700, with its lower resolution, is unable to see (measure) diffraction at f/5.6; not until f/8 does the D700 start to see the effect of diffraction, and this is why the D700 peaks at f/8.

    1. Wrong. Very wrong.

      The graph of the D800 (3rd one) is expected but the graph of the D700 (2nd one) is incorrect. Have a look at the first graph and notice how the D700 peaks at f/5.6. Your graphs are even in conflict with each other.
      Again, learn to analyse. If an experiment conflicts with physical laws, then either the experiment is wrong or physics is wrong. I know which one I’m believing.

      1. Maybe you should learn to analyse, or at the very least notice that the first graph uses another lens (the 100 f/2) rather than the 50 1.4 G, so of course the resolution at which the sensor peaks will be different.

        If you look closely at graph 1 you would also see that the higher resolution sensor peaks earlier

        Looking at graphs 1 & 2 agrees with physics
        The point where we see diffraction starts earlier with the D800: at f/5.6 we start to see diffraction having an effect at a higher resolution (or, as you like to say it, we hit the diffraction limit).
        With the D700, the point where we see diffraction is f/8 (we hit the diffraction limit of the D700).
        The diffraction limit of a lens is the point just beyond its maximum resolution as it is closed down. If the sensor’s resolution is too low to see the lens’s maximum resolution, then the point at which the sensor and lens together capture maximum resolution is just before the sensor becomes diffraction limited (just before the point at which the sensor can capture diffraction), and this is what we see with the D700 at f/8 in graph 1.
        With the D800, the pixel count is closer to the diffraction limit of the lens, so it peaks at a higher resolution and sooner than the D700. This is the physics of why the D800 peaks at f/5.6 and the D700 at f/8.

      2. LOL!

        So the same Nikon lens that is able to maximise the D800 at f/5.6 suddenly does not work at the same f-stop on a D700?! ROFL!!!

        Ok, let’s summarise this a bit. Clearly you did not understand the luminous landscape article on diffraction. Honestly, I didn’t expect you to, given that you did not understand my very simple article.

        Let me guess: you bought an overpriced D800 and I bet you had to upgrade your computer as well?

        I will leave you with your delusions. Please keep on bending physical laws to suit your gear 🙂

  2. Clearly you don’t understand that the sensor’s resolution determines at what f-stop it is able to detect diffraction.
    Clearly you don’t understand that the maximum resolution a lens can project is reached just before the point where it shows diffraction. Where the system (lens & sensor) captures peak resolution depends on how these overlap.
    “So the same Nikon lens that is able to maximise the D800 at f/5.6 suddenly does not work at the same f-stop on a D700?! ROFL!!!”
    With regard to captured resolution, yes.
    If the sensor is unable to see the diffraction limit of the lens, then it is unable to capture any detail between the point where the sensor is diffraction limited and the point where the lens is diffraction limited. If the sensor is able to see diffraction at a lower f-stop, and is thus closer to the lens’s diffraction limit, it records more resolution at a wider f-stop.

    1. So if max res is the f-stop before a sensor is diffraction limited then the D700 should peak at f/11 because it’s only 12Mp. So with older cameras like the D100 with only 6Mp then they should peak at f/16?! OMG!!! So your graphs are still very wrong. The graphs are bent the wrong way even if you are now bending physics ROFL!!! The article hits you squarely. You do not understand diffraction!
      Yeah, as I have said, keep on believing in your delusions. You don’t understand the very simple physics of light. Ironic that photography is all about light. 🙂
      Here’s what, why don’t you go to the luminous landscape website and tell them how wrong they are. ROFL!!!

      1. The maximum resolution one can achieve from a lens is at the point diffraction starts to degrade the image. The same applies for a sensor and lens together the sharpest image one can capture is just before the image is degraded by diffraction, it’s as simple as that

    1. I have a feeling that you guys are talking about the LENS being diffraction limited vs the SENSOR being diffraction limited (i.e. the lens is close to perfect).

      1. “Clearly you don’t understand that the sensors resolution is what determines what F stop it is able to detect diffraction.
        Clearly you don’t understand that the max. resolution that a lens can project is just before the point of it showing diffraction. Where the system ( lens & sensor) both capture the peak resolution depends on how these overlap”

        the point just before they both see diffraction as a system is the maximum resolution that system can capture and it is demonstrated by graphs 2&3

      2. Your theory is correct but the graphs are wrong. If a lens peaks at f/5.6 then it will peak in ALL sensors regardless of megapixels at the same f-stop. The lens is therefore diffraction limited at that f-stop but different sensors are diffraction limited at different f-stops depending on sensel density.

  3. The resolution of a lens is limited by diffraction and by aberrations.
    The best resolution you can obtain from a lens is at the point where the blur from aberrations is equal to the blur you get from diffraction (stopping a lens down to the point just before you see diffraction)

    Blur from aberrations-> optimal resolution <-blur from diffraction

    1. The resolution of a lens is NOT limited by diffraction. It’s only limited by aberrations. When a lens is said to be diffraction limited at a particular f-stop it means that aberrations at that f-stop are so minimal that any loss of resolution is purely caused by diffraction. For example, a very good lens is diffraction limited at f/4 which means it is capable of resolutions up to 120+Mp. If you stop it down to f/8 the lens actually improves because now you are using only the center of the glass. However diffraction kicks in causing a drop in resolution (to 30Mp effective) as detected on the sensor. It is the sensor that is essentially limited by diffraction. The same lens at f/8 on a sensor larger than FF is still capable of 120+Mp.

      But I’m sure you will continue to bend physics so you could justify your D800. I’m tired of educating you. When a 50Mp sensor comes out please don’t hesitate to buy it.

      Over and out.

      1. Lens resolution is limited by diffraction, when you close the diaphragm, and by aberrations, which worsen with focal length and the opening of the diaphragm.

  4. Why the D700’s resolution peaks near f/8:
    The smallest Airy disk diameter (using 2 pixels per Airy disk diameter for green light) that the D700 can capture with its 8.4 µm pixel pitch occurs around f/11, an Airy disk size of 10-15 µm. Any Airy disk from apertures wider than f/11 is lost on the D700’s 8.4 µm pixel pitch. But this assumes a diffraction-limited lens, so the best ballpark guess is f/8-10 for where you would see the greatest resolution from the D700, when one Airy disk is projected onto 2 pixels of the D700 sensor.
    Funny how that falls in line with graph #2.
    Or funny how this sounds like me saying “the sharpest image one can capture is just before the image is degraded by diffraction.”
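The Airy-disk arithmetic in this comment can be sanity-checked with a short script. This is a sketch only, assuming green light at 550 nm, the 2.44·λ·N diameter formula, and the D700’s roughly 8.4 µm pixel pitch:

```python
# Check of the two-pixels-per-Airy-disk criterion for a D700-like sensor.
# Assumptions: ~8.4 um pixel pitch, green light (0.55 um), disk = 2.44 * wl * N.
PITCH_UM = 8.4
WAVELENGTH_UM = 0.55

def airy_diameter_um(f_number):
    return 2.44 * WAVELENGTH_UM * f_number

# f-number at which the Airy disk first spans two pixels
f_two_pixels = 2 * PITCH_UM / (2.44 * WAVELENGTH_UM)
print(f"Airy disk at f/11: {airy_diameter_um(11):.1f} um")   # ~14.8 um
print(f"Disk spans two pixels near f/{f_two_pixels:.1f}")    # ~f/12.5
```

The crossover lands near f/12.5, in the same ballpark the comment argues over; the exact figure depends on the wavelength and the pixels-per-disk criterion chosen.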

      1. You have a very simplistic view of diffraction and its effect on image sampling, especially on a device with a color filter array.

        You said: “the limit is 15Mp” (for FF at f/11).

        A quick check of the f/11 measurement on the A7r http://www.photozone.de/sonyalphaff/865-zeiss35f28ff?start=1 gives 3701 lines per image height, thus about 5551 lines per image width; at least 20.5 million pixels are needed for such resolving power.

        I think you should create a new account on DPReview and discuss this POLITELY this time in the science/technology forum. If you have a scientifically curious mind, then surely a calm and polite discussion would be good, along with accepting errors and mistakes.

        And as I pointed out above, your resolution limits are false.

      2. How can the D700 peak anywhere other than at the very smallest Airy disk that it can resolve?
        Show me, on this table (your reference material), the smallest Airy disk the D700 can resolve.

      3. See, you clearly did not understand the statements that you quoted out of context. You have contradicted yourself in just two replies. ROFL!!!

        I’ll give you some hints straight from wikipedia nyahaha!!!

        in the case where the spread of the IRF is small with respect to the spread of the diffraction PSF, in which case the system may be said to be essentially diffraction limited (so long as the lens itself is diffraction limited).

        in the case where the spread of the diffraction PSF is small with respect to the IRF, in which case the system is instrument limited.

        in the case where the spread of the PSF and IRF are of the same order of magnitude, in which case both impact the available resolution of the system.

        Need a bigger cluebat? ROFL!

  5. dtmateojr wrote:
    “You are free to accept or reject the principles of physics. Here are some examples: http://www.talkemount.com/showthread.php?t=387

    The problem with those test images is that there is no explanation of what kind of blur function he used to create them, or whether he considered the colour filter array (as it increases the effective pixel size). So it’s a rather amateurish and superficial test.

    A sidenote: there is one thing where you are right - the lens performance isn’t influenced by the sensor, thus the lens performance peaks at the same aperture regardless of the image sensor used.

    However, you said “The resolution of a lens is NOT limited by diffraction. It’s only limited by aberrations”. This is naturally false: if it weren’t, a 36MP sensor would sample 36MP even at f/22, an idea you are strongly against. You can’t have it both ways; something is reducing the ability of the lens to render the image with high resolution at f/22, and the great majority of that something is diffraction. The sensor, on the other hand, is not limited by diffraction (unless we have tiny pixels, where the wave nature of light needs to be considered as well as the particle nature, but that is beyond this topic).

    1. As stated so many times, diffraction affects how the subject is PRESENTED TO THE SENSOR. If your sample has N lines and diffraction reduces it to N-M lines, then obviously the sensor is not able to detect N lines no matter what. Please reread the entire article.

      1. Diffraction happens before sampling, not after it. If more information comes out of sampling than what you claim to be possible (according to many different lens test sites), then could it be that your hypothesis is wrong?
        Also, you ignore the color filter array.
        Also, deconvolution can be used to resolve beyond diffraction (at the cost of increased noise due to imperfect knowledge of the point spread function).

  6. You should really rejoin DPR as it is quite impossible to have a discussion if your answers are like “Funny coz at f/5.6 it is only able to resolve 30Mp ” when I mentioned that your “fact” of 15MP at f/11 was proven wrong by simple lens measurement.
    Of course if discussion of science is not your cup of tea…

    1. How positive are you that no sharpening was performed? And besides, 20Mp is not much bigger than 15Mp. Compare with the idiot in DPR who said that f/16 is capable of resolving up to 89Mp?! ROFL!!! And you want me to rejoin them? I haven’t lost my mind yet.

      1. Sharpening increases contrast, not detail. If sharpening is used, the detail in the data is easier to see due to increased contrast, but the amount of detail is the same.

        Also have you considered CFA? For green light the sampling rate is only half (1.414 diagonally) of nominal. For blue and red a quarter.

        Also have you considered the benefits of oversampling (instead of just saying like 36MP sensor is no better than 30MP at f/8) – isn’t oversampling beneficial if you like to have an image without digital artifacts?

      2. The effect of diffraction is loss of contrast, so yes, sharpening can increase resolution.

        Oversampling does not increase resolution. The only requirement is that you satisfy Nyquist sampling. Oversampling only improves the accuracy of sampling. Re-read the analogy on scanning photos in my article: scanning at 7200dpi is no better than scanning at 600dpi.

  7. Sharpening does not increase detail. It only increases contrast, which may appear as a higher MTF response in many measurements. Deconvolution, on the other hand, can increase detail, but it’s not really sharpening in the same sense.

    Oversampling reduces digital artifacts, thus it increases resolution. An artifact is contra-resolution.

    For kicks, I just did a small test at f/2.8 and f/16; my camera has only one anti-alias filter (it reduces horizontal resolution). Interestingly, the AA filter that remains reduces resolution at f/2.8 at least as much as stopping down to f/16 does on the other axis. Anyhow, quick measurements showed that the 24MP camera resolves about 18.6MP worth at f/16 without the AA filter, and about 21MP at f/2.8 in this test.

    The raw conversion in Lightroom uses my normal capture sharpening, which uses modest deconvolution instead of unsharp mask.

    Why not do the same test yourself? Print Bart van der Wolf’s sinusoidal siemens test target and measure the Nyquist response. Googling finds the target.

    1. May I know which lens you used that is diffraction limited at f/2.8? Must be a perfect lens with near zero aberration. Must be better than the Zeiss lens or the Nikon lens posted above. If you stopped down to f/4 you might get 24Mp. If you sharpen it you might get 36Mp from your 24Mp. 🙂

      Seriously, this is getting beyond ridiculous.

  8. And the 89MP was mentioned by DSPographer who is very knowledgeable about these issues, as is Jim Kasson who also participates in that thread. You could learn lots from them.

    1. So why don’t you ask them to go to the luminous landscape website and tell the author of the article how wrong he is, or better yet tell the people with real PhDs in Physics that their theories in optics are bollocks, instead of arguing with a nobody like me? Even better, do it yourself if you truly believe that those guys whose names you dropped are correct. If they can produce a peer-reviewed formal paper that proves their “theory”, then they should get published in optics journals for their game-changing findings.
      Next time you drop a name, please make sure they are credible.

  9. HTS
    “Lens resolution is limited by diffraction, when you close the diaphragm, and by aberrations, which worsen with focal length and the opening of the diaphragm.”

    dtmateojr
    October 18, 2014 at 18:13
    “Wrong. I’ll leave the explanation to you as an exercise. You just won’t get it no matter how much I explain.”

    If this is wrong then you had better write to LL, as this was a direct quote from the article that you reference.

    1. Pffft! You can’t read, no? Let’s try to dissect that:

      Lens resolution IS LIMITED BY DIFFRACTION, WHEN YOU CLOSE THE DIAPHRAGM…

      The LL article is discussing the effects of diffraction on the sensor. The first condition assumes that there are no aberrations. Then it added:

      … AND BY ABERRATIONS…

      saying that when you are not diffraction limited as the first condition then aberrations will kill resolution. And this is consistent with the rest of the article.

      You quoted the article out of context just to prove a point. Lots of idiots in dpreview do that. Why don’t you join them if you haven’t already.

  10. You should learn to read.
    They are describing Lens resolution basics, as the title of that block would suggest

    “Lens resolution is limited by diffraction, when you close the diaphragm, and by aberrations, which worsen with focal length and the opening of the diaphragm.”

    Here in a single paragraph the text is not taken out of context

    They are defining how the resolution of a lens is limited

    Lens resolution is limited by diffraction ( they then go and describe what causes diffraction) when you close the diaphragm, ( now they add the word “and” this would imply that both diffraction and aberrations limit resolution) and by aberrations, ( they then go and describe why aberrations limit resolution ) , which worsens with focal length and the opening of the diaphragm.
    You say
    “saying that when you are not diffraction limited as the first condition then aberrations will kill resolution. And this is consistent with the rest of the article.”

    I would say that this is you putting your own spin on the article

    I don’t see how that was taken out of context as they are defining what limits the resolution of a lens

    1. Teh LULZ!!! Then you showed a table that has a list of max theoretical res for each f-stop. So if a lens at f/16 on aps-c resolves 4Mp how come the same lens resolves 8Mp on FF if diffraction limits a LENS?!! Mind boggling, no? But you still won’t accept it. Doh?!

      1. Here’s your problem:
        all lenses at all f-stops show diffraction; it’s the resolution you want to capture that determines the level of diffraction you will see.

        Here is one that goes down to f/1 for a diffraction-limited lens:
        at f/1 the smallest Airy disk diameter (the smallest level of detail that a lens can project) is 1.3 microns.

      2. You are hopeless. If only you would open up your mind to physics and basic common sense. Try to forget that you bought an expensive D800 just for a while.
        I own several cameras in different formats. I even own one that has twice the resolving power of your D800. The only difference is that I am not delusional.

        Let’s end this discussion. I’m tired. You go your own merry ways and I’ll go my own path. You keep believing the myth and ignore my blog posts. If you really think that you are absolutely correct then it won’t matter what I write in my blog, it won’t affect the truth. Right?

        If you are genuinely interested to learn then go to those whom you trust. I’m not the guy that you want. My purpose is to debunk the bullshit that gear whores are spreading in forums by using physical laws. I am not expecting you to agree with me but think before you ignore.

        Good bye.

  11. You
    “So if a lens at f/16 on aps-c resolves 4Mp how come the same lens resolves 8Mp on FF if diffraction limits a LENS?!! Mind boggling, no? But you still won’t accept it. Doh?!”
    You have to enlarge the APS-C image more than the FF image to view them both at the same size.

    The more you enlarge, the more of the diffraction you will see.

  12. We should be continuing this discussion here
    http://www.luminous-landscape.com/forum/index.php?topic=94395.0
    quote from that post
    “I have a question about this article
    Under the ” Lens resolution basics” block

    In this paragraph
    “Lens resolution is limited by diffraction, when you close the diaphragm, and by aberrations, which worsen with focal length and the opening of the diaphragm.”

    Are you saying that a lens’s resolution is limited by both Diffraction & Aberrations at the same time?
    Would this imply that the greatest resolution from a lens would lie at the point where the blur from Diffraction and the blur from Aberrations are the same?”
    If you feel that anything I have said here is incorrect, here’s your chance to correct me.
    It’s up to you whether you’re man enough or chicken.
    I am sure that if this is incorrect, this is the place that would disagree with me.

    1. First you need to understand what a diffraction-limited lens actually means. Failure to understand this leads to nonsense. Then understand what diffraction does to the sensor. Consider both cases one at a time; say, assume that a lens is free of aberrations. If you can handle them separately, then consider both of them together. Anyway, I’ll let the people at LL educate you.

      1. I know what a diffraction-limited lens is.
        But you fail to understand that all lenses at all f-stops show diffraction; it’s the resolution you want to capture that determines the level of diffraction you will see.

        As seen in this graph diffraction limits the resolution at all f-stops

        I understand what effect diffraction has on a sensor. What you don’t understand is that the Airy disk diameter sets the resolution that the sensor can capture.
        Just as in my example with the D700, the smallest Airy disk (resolution) is the Airy disk just before you see the onset of diffraction. Or another way to look at it: you need a pixel with a diagonal at least as large as the diameter of the Airy disk in order to detect the spot’s size, position and brightness. If the Airy disk is smaller than the pixel, you end up projecting multiple spots onto a single pixel without it being able to detect each spot, its position and brightness; they are all clumped together, creating lower resolution. So opening up the lens gives you no further resolution advantage, and peak resolution was already reached at a higher f-stop.

      2. Here to stay
        “Are you saying that a lens’s resolution is limited by both Diffraction & Aberrations at the same time?
        Would this imply that the greatest resolution from a lens would lie at the point where the blur from Diffraction and the blur from Aberrations are the same?”

        Petrus
        Sr. Member
        “Yes.

        Aberrations are the greatest at full open and diminish when stopped down. Diffraction is minimal at full open and starts to get worse with smaller apertures. Where the two meet is the maximum resolution from this particular lens. Better lenses have this point at larger apertures, worse lenses at smaller apertures.”

        ErikKaffehr
        Sr. Member
        “+1, Petrus is right.”

        ErikKaffehr
        Sr. Member

        “By and large, I would say it is so. The issue is probably a bit more complex, but what you say is a nice way of describing it.

        Best regards
        Erik”

        dtmateojr
        “Anyway I’ll let the people in LL educate you. “

  13. Just to let you know, my employer at this time still uses large format film for aerial photography and is in the process of going digital. At this time I don’t own a D800 but rather a classic 5D 😉

    1. Strange that you don’t understand why large format photographers shoot at f/64 when f/22 looks like shite on full frame. If diffraction limited the lens, then large format photography would be worse than shite.

      Anyway…

  14. What effect does this have on your explanations when I shoot a 100 macro quality Canon L series lens on my 5D3, or even my 5DSr, at close proximity (9-12 inches) with 2 large strobes (equidistant and parallel to my lens), and then regularly stop down to between f/18 and f/25? I am not seeing noticeable diffraction…

    Bruce

    1. “Noticing” diffraction with the naked eye is very subjective and inaccurate, and highly dependent on the kind of subject you are shooting. Try shooting a test chart with your lens and see how the very fine lines “merge”.

      Shooting a 5DS at f/22 effectively reduces its resolution to about 4Mp. In most cases, 4Mp is big enough. We look at photos displayed on 60″ full HD TVs and do not complain about sharpness; a full HD TV is only 2Mp.

      My point is, if it looks acceptable to you then just keep doing it.
