Sensor resolution vs Nyquist sampling theorem
  • So, if I want to sample audio, everybody knows that f = 0.5 fs. So to sample 20 kHz I need a sampling frequency of more than 40 kHz.

    So if I have a camera system with, e.g., an HD raster of 1920 x 1080 pixels, the actual resolution of this sampling raster is only 960 x 540 pixels.

    Yes or no? Or actually somewhere in between?
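
    As a quick numeric sanity check of the audio half of the question, here is a minimal sketch (the 48 kHz sampling rate and the two test tones are arbitrary choices, not from the thread): a tone below fs/2 comes through intact, while a tone above fs/2 folds back to a false lower frequency.

    ```python
    # Minimal aliasing sketch: sample two tones at fs = 48 kHz (assumed value)
    # and check which frequency the samples actually contain.
    import numpy as np

    fs = 48_000                          # sampling rate, Hz
    t = np.arange(0, 0.1, 1 / fs)        # 0.1 s of samples

    for f in (18_000, 30_000):           # below and above fs/2 = 24 kHz
        samples = np.sin(2 * np.pi * f * t)
        spectrum = np.abs(np.fft.rfft(samples))
        freqs = np.fft.rfftfreq(len(samples), 1 / fs)
        print(f"input {f} Hz -> samples contain {freqs[spectrum.argmax()]:.0f} Hz")

    # input 18000 Hz -> samples contain 18000 Hz  (below Nyquist: preserved)
    # input 30000 Hz -> samples contain 18000 Hz  (above Nyquist: folds to fs - f)
    ```

    The spatial (1920 x 1080 vs 960 x 540) question is messier, as the replies below explain.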

  • @EspenB

    You formulate it a little strangely.

    Look at the original Nyquist sampling theorem: it talks mostly about sine functions (and other signals can be represented as sums of those via the Fourier transform). So it tells you how to sample an analog signal and how to restore the original from the sampled data.

    So the second question has very little resemblance to this and even less meaning as stated, because it requires digging deep into MTF. What you really need to ask is: "given a lens with such and such resolution, defined by its edge response (MTF is derived from it, etc.), what resolution of Bayer sensor do we need so that we can fully restore the original picture, edge response included?"
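
    To make the edge-response -> MTF link concrete, here is a rough sketch on a synthetic 1-D edge (the Gaussian blur standing in for the lens, and its width, are assumptions): blur a step edge, differentiate the edge spread function to get the line spread function, and take its Fourier magnitude. This is the core of the slanted-edge method, minus the corrections real measurement tools apply.

    ```python
    # Sketch: derive an MTF from an edge response (synthetic data, no real lens).
    import numpy as np

    n = 512
    edge = (np.arange(n) >= n // 2).astype(float)    # ideal step edge

    # Stand-in for the lens: Gaussian blur with sigma = 1 pixel (assumed).
    k = np.arange(-8, 9)
    kernel = np.exp(-0.5 * (k / 1.0) ** 2)
    kernel /= kernel.sum()
    esf = np.convolve(edge, kernel, mode="valid")    # edge spread function

    lsf = np.diff(esf)                               # line spread function
    mtf = np.abs(np.fft.rfft(lsf))
    mtf /= mtf[0]                                    # normalise so MTF(0) = 1
    freqs = np.fft.rfftfreq(len(lsf), d=1.0)         # cycles/pixel, Nyquist = 0.5

    for target in (0.1, 0.25, 0.5):
        i = np.argmin(np.abs(freqs - target))
        print(f"MTF at {freqs[i]:.2f} cycles/px = {mtf[i]:.2f}")
    ```

    In this toy case the contrast left at the 0.5 cycles/pixel Nyquist frequency is tiny; a sharper lens would leave more, and that residue is what can alias on the sensor.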

  • I think it is a little bit subjective, or a mix of subjective and objective.

    I was at the cinema with a friend; she arrived earlier than me and sat in a front row. The image is soft if you view it from up close, so I prefer to sit in a back row.

    The 1920x1080 video from the old Canon T3i camera is very soft and low resolution; when you view it at 50% size it gets sharp, because it is then shown at 960x540.

    The GH2 camera is 1920x1080, but it can rival 4K. When I view GH2 footage on a 28-inch 4K monitor the resolution is great and I do not miss the 4K.

    Distance helps a lot. If you like sharp images, find the correct viewing distance.

    Some great 1080p cameras are the Canon C100, C200, and C300, because they have a 4K sensor and downsample to 1080p for recording. Very sharp 1080p video; search on Vimeo to see examples.

    You can also try recording 4K with your camera, downsampling it to 1080p, and comparing it with a native 1080p recording. The downsample from 4K is much sharper; the sketch at the end of this post shows why.

    A Bayer-array sensor also does not deliver 1080 lines of resolution from 1080 pixel rows; it is around 720 lines at most, roughly a 66% factor.
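
    A minimal sketch of the downsampling point (not any camera's actual pipeline; the test pattern and filter are made up): one row of "4K" detail that is too fine for HD either folds into false coarser detail when you just drop samples, or disappears cleanly when you low-pass filter first.

    ```python
    # Why a 4K -> 1080p downsample needs a low-pass filter first (toy example).
    import numpy as np

    n_4k, n_hd = 3840, 1920
    x = np.linspace(0, 1, n_4k, endpoint=False)
    detail = np.sin(2 * np.pi * 1300 * x)   # 1300 cycles/width: fine for 4K
                                            # (Nyquist 1920), too fine for HD (960)

    def amplitude_at(sig, cycles):
        """Amplitude of the component at `cycles` cycles per picture width."""
        spec = 2 * np.abs(np.fft.rfft(sig)) / len(sig)
        freqs = np.fft.rfftfreq(len(sig), d=1 / len(sig))
        return spec[np.argmin(np.abs(freqs - cycles))]

    # 1) Naive decimation: just drop every other sample.
    naive = detail[::2]

    # 2) Low-pass (windowed-sinc FIR, cutoff at the new Nyquist), then decimate.
    taps = 0.5 * np.sinc(0.5 * np.arange(-32, 33)) * np.hamming(65)
    filtered = np.convolve(detail, taps, mode="same")[::2]

    # The 1300-cycle detail cannot exist in 1920 samples; naive decimation turns
    # it into a false 620-cycle pattern (1920 - 1300), filtering removes it.
    print("alias at 620 cy/width, naive   :", round(amplitude_at(naive, 620), 3))
    print("alias at 620 cy/width, filtered:", round(amplitude_at(filtered, 620), 3))
    ```

    A real scaler uses a better kernel than this toy one, but the principle is the same: remove what the smaller raster cannot carry before throwing samples away.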

  • @Vitaliy_Kiselev I'm not talking about lens performance here, just the basics of motion imaging, which in a sense is a sampling system on three axes: the X and Y dimensions at frame level, and a z-axis for time (the frame rate).

    Also, z-axis sampling is known for aliasing ("mirroring") effects, e.g. when the wheels on a wagon or car appear to rotate backwards, against the direction of the object's travel, because the motion is too fast for 24 fps sampling. Here 48 fps or 60 fps would surely be better, but somehow 24 fps is now seen as giving a "cinematic" feel to motion images, and the attempt to move cinema to 48 fps has been a failure; even The Hobbit films were ridiculed for their use of HFR, as I recall.
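
    A tiny sketch of that temporal (z-axis) aliasing, with made-up numbers: any periodic rate above half the frame rate folds back into the band the camera can represent, and a negative folded rate is exactly the backwards-spinning-wheel illusion.

    ```python
    # Wagon-wheel effect as frequency folding (illustrative numbers only).
    def apparent_rate(true_rate_hz: float, fps: float) -> float:
        """Fold a periodic rate into the [-fps/2, +fps/2] band the camera can show."""
        return (true_rate_hz + fps / 2) % fps - fps / 2

    spoke_rate = 5 * 4.0   # 5-spoke wheel at 4 revolutions/s -> 20 Hz spoke pattern
    for fps in (24, 48, 60):
        print(f"{fps:>2} fps: 20 Hz spoke pattern appears as "
              f"{apparent_rate(spoke_rate, fps):+.1f} Hz")

    # At 24 fps the 20 Hz pattern folds to -4 Hz (the wheel seems to spin
    # backwards); at 48 or 60 fps it stays at +20 Hz.
    ```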

    Also, the limited frame rate places restrictions on how fast you can pan the camera without introducing objectionable artifacts for the viewer. (Here the field of view of the given lens is also a factor: a wide field of view means you can pan faster than with a narrow framing before getting panning artifacts. This is one of the things a cinematographer has to keep under control.)
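
    The pan-speed point can be put into numbers with a small-angle sketch (all values are made up, and real judder thresholds also depend on shutter angle and display): the same angular pan speed moves the image far fewer pixels per frame on a wide lens than on a long one.

    ```python
    # Image motion per frame for a given pan speed (small-angle approximation).
    def pixels_per_frame(pan_deg_per_s: float, fps: float,
                         hfov_deg: float, width_px: int) -> float:
        px_per_degree = width_px / hfov_deg      # roughly constant across the frame
        return pan_deg_per_s / fps * px_per_degree

    for hfov in (75.0, 25.0):                    # wide lens vs. long lens
        px = pixels_per_frame(pan_deg_per_s=10.0, fps=24,
                              hfov_deg=hfov, width_px=1920)
        print(f"{hfov:>4.0f} deg FOV: a 10 deg/s pan moves the image "
              f"~{px:.0f} px per frame")
    ```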

    As for the intra-frame sampling in the X/Y dimensions: at least until recently, an OLPF (optical low-pass filter) was placed in front of the sensor to suppress detail that could create objectionable aliasing ("mirroring") artifacts in the image. But as I recall, this filter is usually set at a 0.7 factor rather than the 0.5 suggested by Nyquist.
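
    To illustrate the 0.7-vs-0.5 remark, here is a sketch using the standard two-spot model of a birefringent OLPF (the split distances are illustrative, not any camera's spec): along the split axis its MTF is |cos(pi * f * d)|, so a one-pixel split puts the null exactly at the 0.5 cycles/pixel Nyquist frequency, while a weaker filter puts the null near 0.7 cycles/pixel and keeps more contrast below Nyquist at the cost of letting some aliasing through.

    ```python
    # Two-spot birefringent OLPF model: MTF = |cos(pi * f * d)|, null at 1/(2d).
    import numpy as np

    def olpf_mtf(f_cycles_per_px: np.ndarray, split_px: float) -> np.ndarray:
        return np.abs(np.cos(np.pi * f_cycles_per_px * split_px))

    f = np.array([0.25, 0.35, 0.5])        # 0.5 cycles/px is the sensor Nyquist
    strong = olpf_mtf(f, split_px=1.0)     # null exactly at Nyquist ("0.5 factor")
    weak = olpf_mtf(f, split_px=1 / 1.4)   # null near 0.7 cycles/px ("0.7 factor")

    for fi, s, w in zip(f, strong, weak):
        print(f"f = {fi:.2f} cy/px   strong OLPF: {s:.2f}   weak OLPF: {w:.2f}")

    # The weaker filter keeps more contrast at and below Nyquist, at the price of
    # passing some detail above Nyquist that can then alias.
    ```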

    Then come the recent OLPF-free designs, which trade the optical low-pass filter for DSP that suppresses such artifacts "digitally" instead, and thus generally preserve more resolution across the image.

    Downsampling algorithms (e.g. 4K -> HD) may or may not take such problems into consideration at the consumer level; cutting corners can cause shimmering artifacts when high detail is reproduced, while the result may still be a sharper and more pleasing picture overall. There were clear examples of this in the early DVD days: DVD players could squeeze the 16:9 image on the disc down to a letterbox for a 4:3 TV, but the downsampling algorithms were very crude and a lot of detail shimmering could be seen from those early players.

    Anyway, the point of the discussion here is that the optical resolution of the image needs to be lower than the pixel-raster resolution (which can be seen as the sampling frequency of the system). And it usually is.

  • This issue is generally known as the Kell factor in the analog TV/camera industry.

    https://en.wikipedia.org/wiki/Kell_factor

  • Things were even worse in the days of interlace. The actual perceptible resolution would change between the full number of scan lines and half of them, depending on the amount of image motion between the two fields of the interlaced frame.

    E.g. a fast pan with an interlaced TV camera would drop the resolution to only half the number of system scan lines, and you still had to apply the Kell factor (a modified Nyquist number) on top of that.
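
    Putting the Kell factor and the interlace penalty together as a back-of-envelope sketch (0.7 is the commonly quoted Kell value, not a measurement):

    ```python
    # Rough effective vertical resolution: Kell factor for static material,
    # Kell factor applied to half the scan lines for interlaced fast motion.
    KELL = 0.7   # commonly quoted Kell factor

    for lines in (480, 576, 1080):
        static = KELL * lines          # progressive, or interlaced with no motion
        moving = KELL * (lines / 2)    # interlaced during a fast pan
        print(f"{lines:>4} scan lines: ~{static:.0f} lines static, "
              f"~{moving:.0f} lines in fast interlaced motion")
    ```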