
Almost all current DSLR-type cameras are based on a Bayer-filter array sensor. This means that each pixel on a CCD or CMOS chip has its own color filter. The array is usually arranged in a square pattern where each block of four pixels has one pixel with a red filter, one pixel with a blue filter and two with green filters (on one of the two diagonals, for obvious reasons). There are two green ones because the human eye is most sensitive to green light, and therefore to noise in the green channel. So a 10 MP camera has 2.5 million red pixels, 2.5 million blue ones and 5 million green ones. The job of the RAW processor, in the camera or in the computer software if you shoot RAW, is to interpolate between the single colors to generate the missing color values at all pixel locations. Most RAW processors do a pretty good job at this, but there is a physical limitation imposed by it: the actual resolution of these sensors will never be as high as what is claimed. If you removed all the filters, you would get the claimed resolution in black and white.
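To make that interpolation step concrete, here is a minimal sketch of the most naive possible demosaicing, assuming an RGGB layout and plain bilinear interpolation. Real RAW converters use much smarter algorithms, but the principle is the same: at every pixel, two of the three color values are estimated rather than measured.

```python
import numpy as np
from scipy.signal import convolve2d

def demosaic_bilinear(mosaic):
    """Naive demosaic of an RGGB Bayer mosaic (H x W) into an H x W x 3 image.

    Each channel is reconstructed by normalized convolution: at photosites that
    carry that color the measured value is kept, everywhere else it becomes a
    weighted average of the nearest measured neighbours (bilinear interpolation).
    """
    h, w = mosaic.shape
    rows, cols = np.indices((h, w))

    # Masks marking which photosites carry which filter (RGGB layout assumed).
    r_mask = (rows % 2 == 0) & (cols % 2 == 0)
    b_mask = (rows % 2 == 1) & (cols % 2 == 1)
    g_mask = ~(r_mask | b_mask)

    # Bilinear interpolation kernel.
    kernel = np.array([[0.25, 0.5, 0.25],
                       [0.5,  1.0, 0.5],
                       [0.25, 0.5, 0.25]])

    out = np.empty((h, w, 3))
    for i, mask in enumerate((r_mask, g_mask, b_mask)):
        num = convolve2d(mosaic * mask, kernel, mode="same")
        den = convolve2d(mask.astype(float), kernel, mode="same")
        out[..., i] = num / den
    return out
```

This kind of interpolation is exactly where the resolution loss, and the moiré that shows up near the resolution limit, come from.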
Another alternative is the Foveon sensor, such as the ones used by Sigma, which has three color pixels stacked in layers at the same site. This gives a resolution exactly equal to the number of photosites. Unfortunately, Sigma chooses to simply count the number of photodetectors, which artificially inflates the resolution number by a factor of 3. For example, the SD14 is marketed as a 14.1 Mpixel camera. This is extremely misleading, as the actual resolution is only 4.6 Megapixels. The camera generates JPEGs at 14.1 megapixels, but the pixels in between the actual photosites are simply interpolated and do not add any extra information. Silly marketing!
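Spelling out the arithmetic (the only assumption is that the marketed count is split evenly over the three stacked layers):

```python
photodetectors = 14.1e6   # the number Sigma quotes for the SD14
layers = 3                # red, green and blue stacked at every photosite
pixel_locations = photodetectors / layers
print(pixel_locations / 1e6)   # ~4.7 million, i.e. roughly the 4.6 MP actual resolution
```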
Anyway, cameras with Bayer-array sensors suffer from a similar problem, as I explained above. The excellent site dpreview.com actually tests the resolution of every camera they can get their hands on, using high-resolution primes to make the comparison fair. Of course, comparing cameras this way is the hallmark of a measurebator, as Ken Rockwell likes to say, so I am not going to compare cameras, just see if we can draw some conclusions about the Bayer technology. Dpreview gives the actual resolution of the cameras they test in lines per picture height (LPH). For example, for the Nikon D200 the horizontal LPH is 2100 and the vertical LPH is 1700. Since this is a 6x4 (3:2) sensor, the actual resolution is 6/4*2100*1700 ≈ 5.36 Mpixels, about half the number of photosites on the sensor. Dpreview also gives an extinction resolution, at which all detail disappears but moiré artefacts are still visible; for the D200 this occurs at 7.4 Mpixels.
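The conversion from dpreview's LPH figures to megapixels is easy to check; the only assumption is the 3:2 aspect ratio of the D200 sensor:

```python
# dpreview's resolution figures for the Nikon D200, in lines per picture height (LPH)
horizontal_lph = 2100    # resolved lines measured along the horizontal direction
vertical_lph   = 1700    # resolved lines measured along the vertical direction
aspect_ratio   = 6 / 4   # 3:2 sensor, so the width is 1.5x the height

# Horizontal LPH is normalised to the picture height, so scale it by the aspect
# ratio to get the number of resolved "columns", then multiply by the "rows".
actual_resolution = aspect_ratio * horizontal_lph * vertical_lph
print(actual_resolution / 1e6)   # ~5.36 megapixels
```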
To see if we can learn some more about these Bayer-array sensors, I've taken the MTF data from dpreview.com for a large set of Nikon and Canon DSLR cameras and plotted the actual measured resolution against the number of photosites on the sensor (on the right), together with the "extinction resolution". The green line shows where the measured resolution would equal the camera's megapixel count. A clear trend is visible in both the real resolution and the extinction resolution. For all array densities, the actual resolution obeys:
actual resolution = 0.58*MP
and the extinction resolution:
ext. res. = 0.8*MP.
So the actual resolution of your camera, if you have a Bayer sensor, is only about half of what is claimed! Extraordinarily good RAW processors might be able to squeeze slightly more out of it, but never more than 0.8*MP. With Foveon sensors it is even worse: you only get 1/3 of the number the manufacturer claims, because they quote the number of photodetectors, while the only thing that matters is the number of pixel locations, which is one third of that.
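As a rule of thumb, the two fitted factors above (and the factor of 1/3 for Foveon) boil down to something like the sketch below; the factors are the rough trends from the plot, not exact per-camera figures, and the helper name is just for illustration.

```python
def effective_resolution(marketed_mp, sensor="bayer"):
    """Rule-of-thumb estimate of real resolving power, in megapixels.

    bayer:  measured resolution trends at ~0.58x the photosite count,
            with an extinction (moire) limit near 0.8x.
    foveon: the marketed number counts photodetectors, so divide by the
            three stacked layers to get the number of pixel locations.
    """
    if sensor == "bayer":
        return {"actual": 0.58 * marketed_mp, "extinction": 0.8 * marketed_mp}
    if sensor == "foveon":
        return {"actual": marketed_mp / 3}
    raise ValueError("unknown sensor type")

print(effective_resolution(10.2))            # a 10.2 MP Bayer sensor: ~5.9 MP real, ~8.2 MP extinction
print(effective_resolution(14.1, "foveon"))  # Sigma SD14: ~4.7 MP of real pixel locations
```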