
With higher resolution sensors, small imperfections are more visible. To extract the most from a high-megapixel camera, you need spot-on focusing, a higher quality (more expensive) lens, and more care (e.g. higher shutter speed or tripod) to avoid camera shake during exposure.

If you nail all these, I wouldn't call it crappy even at 1:1.

But there is a residual imperfection from the Bayer pattern on the sensor: each pixel records only one color (R, G, or B), and the other two values for that pixel have to be guesstimated from neighboring pixels, so the de-Bayering (demosaicing) process isn't perfect.
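
To make the guesswork concrete, here's a minimal bilinear demosaic in Python/NumPy - a sketch of the simplest possible approach, not what any real raw converter actually ships, and it assumes an RGGB layout:

```python
import numpy as np

def demosaic_bilinear(raw):
    """Naive bilinear demosaic of an RGGB Bayer mosaic.

    Each photosite recorded one channel; the two missing channels at
    each pixel are estimated by averaging whichever 3x3 neighbors did
    record them. Real converters use much smarter edge-aware methods.
    """
    h, w = raw.shape
    masks = {c: np.zeros((h, w)) for c in "RGB"}
    masks["R"][0::2, 0::2] = 1   # R photosites
    masks["B"][1::2, 1::2] = 1   # B photosites
    masks["G"][0::2, 1::2] = 1   # G photosites (two per 2x2 block)
    masks["G"][1::2, 0::2] = 1

    def box3(img):
        # Sum over each pixel's 3x3 neighborhood (zero-padded edges).
        p = np.pad(img, 1)
        return sum(p[i:i + h, j:j + w] for i in range(3) for j in range(3))

    out = np.empty((h, w, 3))
    for c, name in enumerate("RGB"):
        m = masks[name]
        interp = box3(raw * m) / np.maximum(box3(m), 1)
        # Keep the measured value where this channel was actually sampled.
        out[..., c] = np.where(m == 1, raw, interp)
    return out
```

On a flat gray patch this reconstructs the scene exactly; it's on edges and fine detail that the averaging produces the color fringing and softness you see when pixel peeping.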

One way to fix this is to use a monochrome sensor and color filters, taking 3-4 exposures (luminance/mono plus R/G/B) and stacking them.

A few cameras on the market have a pixel-shift feature that does something similar - multiple exposures, shifting the sensor one pixel between shots so each pixel location gets a true R, G, and B sample, then stacking them in camera or in post.
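
A toy simulation of the 4-shot variant (names and shift pattern are my assumptions: an RGGB mosaic, sensor shifts of (0,0), (0,1), (1,0), (1,1) pixels, and no subject motion between frames):

```python
import numpy as np

def pixel_shift_stack(shots, shifts):
    """Merge four raw RGGB mosaics into one RGB image.

    shots[k][y, x] was recorded with the sensor shifted by
    shifts[k] = (dy, dx), so the photosite over scene location (y, x)
    carried the filter color of Bayer position (y + dy, x + dx).
    Across the four shifts every location collects one R, one B, and
    two G samples, so no interpolation is needed (the Gs are averaged).
    """
    h, w = shots[0].shape
    ys, xs = np.mgrid[0:h, 0:w]
    acc = np.zeros((h, w, 3))
    cnt = np.zeros((h, w, 3))
    for raw, (dy, dx) in zip(shots, shifts):
        py, px = (ys + dy) % 2, (xs + dx) % 2
        # RGGB: R where both parities are even, B where both odd, else G.
        chan = np.where((py == 0) & (px == 0), 0,
                        np.where((py == 1) & (px == 1), 2, 1))
        for c in range(3):
            m = chan == c
            acc[..., c][m] += raw[m]
            cnt[..., c][m] += 1
    return acc / np.maximum(cnt, 1)
```

Under these assumptions the stack reconstructs the scene exactly; in a real camera the four frames must also be perfectly registered, which is why pixel shift is mostly a tripod-and-static-subject feature.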

Edit: Forgot to mention the anti-aliasing filter. It sits in front of the sensor and deliberately blurs the image at the pixel level. This is done to avoid aliasing and moiré artifacts, but it obviously has the side effect of not-so-great image quality at the pixel-peeping level. The fix is to get a camera without an AA filter; many modern high-resolution cameras don't have one.
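
The moiré the AA filter guards against is plain aliasing: detail finer than half the pixel pitch (the Nyquist limit) folds into a false coarser pattern after sampling. A small 1-D NumPy illustration (the specific frequencies are just my example values):

```python
import numpy as np

# Sample a 1-D pattern at the photosite centers (pitch = 1 pixel).
# The Nyquist limit is 0.5 cycles/pixel; 0.7 is above it.
x = np.arange(32)
f_true = 0.7
sampled = np.cos(2 * np.pi * f_true * x)

# The samples are indistinguishable from a 0.3 cycles/pixel pattern:
f_alias = 1.0 - f_true
assert np.allclose(sampled, np.cos(2 * np.pi * f_alias * x))
```

The AA filter blurs away content above that limit before it's sampled, so the fold-over (seen in 2-D as moiré) can't happen; leaving the filter out trades that safety for per-pixel sharpness.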

https://www.outdoorphotographer.com/photography-gear/cameras...


