Leica Megapixel Mania, Diffraction, and Diminishing Returns

uhoh7

One of my pet endeavors is getting clean, crisp images at f/11, and my experience is that some lenses are better at this than others. But it keeps coming up in discussion: at some f-stop, diffraction levels the playing field and nothing stays crisp.

What point is that? When I try to chase it down, I'm suddenly confronted with pixel size. Diffraction isn't caused by the pixels, but smaller pixels show its blur sooner than big ones:
the 24MP diffraction limit is f/11 and the 12MP limit is f/13, according to one site. I wonder, Brian, and anyone else: what do you think about this, and what would be the corresponding diffraction limit of film?
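None of what follows is from that site; it's just a back-of-the-envelope sketch of the usual arithmetic, assuming a full-frame 36x24mm sensor, green light at 550nm, and a soft convention k for how many pixels the Airy disk may span before you call the sensor "limited" (usually quoted somewhere between 2 and 3):

```python
# Back-of-the-envelope diffraction-limit estimate.
# Assumptions: full-frame 36x24mm sensor, 550nm light,
# Airy disk first-minimum diameter = 2.44 * wavelength * N.
def pixel_pitch_um(megapixels, width_mm=36.0, height_mm=24.0):
    """Pixel pitch in microns for a given sensor size and pixel count."""
    pixels_across = (megapixels * 1e6 * width_mm / height_mm) ** 0.5
    return width_mm * 1000.0 / pixels_across

def diffraction_limited_fstop(pitch_um, k=2.5, wavelength_um=0.55):
    """f-stop where the Airy disk spans k pixels; k ~ 2-3 by convention."""
    return k * pitch_um / (2.44 * wavelength_um)

for mp in (12, 24, 42):
    p = pixel_pitch_um(mp)
    print(f"{mp} MP: pitch {p:.1f} um, limit ~ f/{diffraction_limited_fstop(p):.0f}")
```

With k = 2.5 the 24MP case lands right at f/11; with k closer to 2 you land nearer the site's f/13 figure for 12MP. The "limit" is a gradual onset, not a wall, which is why the quoted numbers vary.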

Next, is there anything left to discriminate between lenses at f/11, and at what point do most lenses converge in performance? One expert at another forum says that by f/11 the only aberration left is astigmatism, though he later suggested that grinding quality and coatings may have an effect as well.

At the moment the hot camera is the A7R II, and the first thing people mention about it is always "42 megapixels." Eric Fossum invented the CMOS image sensor, and trying to understand the issues above led me to a YouTube lecture he gave at Yale. First, it's a great introduction to digital sensors for anyone who wants an easy route to understanding them. But at 38 minutes in, he drops a bomb: "The force of marketing is greater than the force of engineering," a personal favorite saying of his. Megapixels sell: put a bigger number on the box and people will buy it, despite the fact that tiny pixels hit the diffraction limit sooner. People won't notice the image issues; they'll just pay the money. That pays the engineers and gives them something to do. This is the industry's thinking right now, he says. And everybody in the industry loves megapixels, because then all the other tech has to keep up: computers, screens, storage, and so on. But it's essentially a scam, he implies.

Here is the video:

Now, I have no idea whether a 42MP shot looks better or worse than an 18MP shot on, say, a 5K display, using a good lens at f/11. But I sure would like to know LOL
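For scale (my own aside, not from the lecture): a 5K display is only about 14.7MP, so both files get downsampled for viewing anyway.

```python
# Pixel budget of a 5K (5120x2880) display vs. the two sensors in question.
display_mp = 5120 * 2880 / 1e6  # ~14.7 MP
for sensor_mp in (18, 42):
    print(f"{sensor_mp} MP -> 5K: downsample factor {sensor_mp / display_mp:.1f}x")
```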
 
It seems like once you've picked the body, that's fixed, whereas choosing lenses is where you still have control.
 
"3mm" is what I've always read about the effects of diffraction creeping into an image. The corresponding F-stop depends on focal length and the optical formula of a lens.

When I worked with digital imagers in the 1980s, the optical engineers wanted "matched optics": lenses designed to match the resolution of the sensor. Adding more megapixels just so the blur of the lens requires more space to store never made sense to me.

Most lenses are tested for contrast at 50 lp/mm. Nyquist says that requires 100 pixels/mm to resolve, but you are better off going slightly higher for resolution; we went 10%~20% higher. Color mosaic filters complicate matters: you would like to treat a 2x2 Bayer cell as one pixel to account for every possibility, but figure on doubling the pixel count. So a monochrome camera with 8.6MPixels is a good match for a typical lens, and ~16MPixels is good for color. The M Monochrom at 18MPixels is a match for my Micro-Nikkor stopped down, and the Df at 16MPixels is a good match for most of my lenses. I prefer the light-gathering power of a large pixel; small pixels bring an entirely different set of issues to worry about, like dark current, readout noise, and fill factor.
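A sketch of that arithmetic on a 36x24mm frame (the 50 lp/mm test figure, the 10%~20% margin, and the "double it for Bayer" rule are all from the paragraph above; the function name is mine):

```python
# Megapixels needed to Nyquist-sample a lens tested at a given lp/mm,
# on a full-frame 36x24mm sensor.
def matched_megapixels(lp_per_mm, width_mm=36.0, height_mm=24.0,
                       margin=0.0, bayer_factor=1.0):
    """Nyquist: 2 pixels per line pair, plus an optional sampling margin;
    bayer_factor=2 is the rule of thumb above for color mosaic sensors."""
    pixels_per_mm = 2.0 * lp_per_mm * (1.0 + margin)
    return pixels_per_mm**2 * width_mm * height_mm * bayer_factor / 1e6

print(f"Monochrome, 50 lp/mm: {matched_megapixels(50):.1f} MP")                  # ~8.6
print(f"Color (x2), 50 lp/mm: {matched_megapixels(50, bayer_factor=2):.1f} MP")  # ~17
print(f"With 15% margin:      {matched_megapixels(50, margin=0.15):.1f} MP")
```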

I never understood why someone would spend $7000 on a 50mm f/2 lens with 125 lp/mm resolution for a 24MPixel camera. My Jupiter-3 stopped down to f/4 is a perfect match for the M Monochrom. But I'm an engineer who's been in optical sciences for 37 years, so what do I know. Okay, I know there are no plans in my future to upgrade my 16MPixel Df or my 18MPixel M9 and M Monochrom. Not necessary for my selection of ~200 lenses.
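The flip side of the same arithmetic, a sketch showing why 125 lp/mm outruns these sensors (full-frame 36x24mm assumed; Nyquist is half the pixels per mm):

```python
# Highest spatial frequency (lp/mm) a full-frame sensor can resolve (Nyquist).
def sensor_nyquist_lp_mm(megapixels, width_mm=36.0, height_mm=24.0):
    pixels_across = (megapixels * 1e6 * width_mm / height_mm) ** 0.5
    return pixels_across / width_mm / 2.0

for mp in (16, 18, 24):
    print(f"{mp} MP full frame: Nyquist ~ {sensor_nyquist_lp_mm(mp):.0f} lp/mm")
```

Even the 24MP sensor tops out around 83 lp/mm, well short of a 125 lp/mm lens.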
 
"3mm" is what I've always read about the effects of diffraction creeping into an image. The corresponding F-stop depends on focal length and the optical formula of a lens.

When I worked with Digital Imagers in the 1980s, the optical engineers wanted "Matched Optics". Lenses designed to match the resolution of the sensor. Adding more megapixels to have the blur of the lens require more space to store? Never made sense to me. Most lenses are tested for contrast at 50lp/mm. Nyquest states that requires 100 pixels/mm to resolve, but you are better to go slightly higher for resolution- we did 10%~20% higher.Color Mosaic Filters complicate matters, you would like to treat a 2x2 Bayer cell as one pixel to account for every possibility, but figure to double it. So a monochrome camera with 8.6MPixels is a good match for a typical lens, and 16MPixels is good for color. So the M Monochrom with 18MPixels is a match for my Micro-Nikkor stopped down and the Df at 16MPixels is a good match for most of my lenses. I prefer the light gathering power of a large pixel, small pixels have an entirely different set of issues to worry about- like dark current, readout noise, and fill-factor.

I never understood why someone would spend $7000 for a 50mm F2 lens with 125lp/mm resolution for a 24MPixel camera. My Jupiter-3 stopped down to F4 is a perfect match for the M Monochrom. But, I'm an Engineer that's been in Optical Sciences for 37 years- what do I know. Okay, I know their are no plans in my future to upgrade my 16MPixel Df and 18MPixel M9 and M Monochrom. Not necessary for my selection of ~200 lenses.

I keep a DOF calculator handy in the documents app on my iPhone and iPad, and I wonder whether there's a handy document available for these calculations as well?
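In the meantime, a minimal sketch of the standard depth-of-field formulas (thin-lens approximations; the 0.030mm circle of confusion is the usual full-frame assumption, not a figure from this thread):

```python
# Thin-lens depth-of-field sketch: hyperfocal distance and near/far limits.
def hyperfocal_mm(focal_mm, fstop, coc_mm=0.030):
    """Hyperfocal distance: H = f^2 / (N * c) + f."""
    return focal_mm**2 / (fstop * coc_mm) + focal_mm

def dof_limits_mm(focal_mm, fstop, subject_mm, coc_mm=0.030):
    """Near and far limits of acceptable sharpness (inf = to infinity)."""
    h = hyperfocal_mm(focal_mm, fstop, coc_mm)
    near = subject_mm * (h - focal_mm) / (h + subject_mm - 2 * focal_mm)
    if subject_mm >= h:
        return near, float("inf")
    far = subject_mm * (h - focal_mm) / (h - subject_mm)
    return near, far

# Example: 50mm at f/11 focused at 3m.
near, far = dof_limits_mm(50, 11, 3000)
print(f"Hyperfocal: {hyperfocal_mm(50, 11) / 1000:.1f} m")
print(f"In focus from {near / 1000:.2f} m to {far / 1000:.2f} m")
```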
 