What is moiré? How can we avoid it?

  • What is the image defect called "moiré"? What causes it, and how can we avoid or reduce it? Is it related to "false color"?

    When the moon hits your eye like a big pizza pie, that's a moiré

    @Kyle, I'm tempted to say "When a guy drops a pun, and he thinks it's big fun, that's a moron". But then I'd be calling myself a moron, so I won't.

    I recently purchased a Nikon D7100 with 24.3 MP; they have done away with the AA filter altogether to sharpen up the image. I have taken about 200 pictures and so far found no evidence of moiré; however, I cannot see any improvement in sharpness over my old Nikon D90 when displaying them on my PC! Perhaps you only notice the increase in sharpness when blowing them up, or it's all a conspiracy! But hey, I'm happy with the camera and its resulting images, and that's what photography is all about, isn't it?

    @KyleCronin see the xkcd comic strip https://www.xkcd.com/1814/ Color Pattern: " When a grid's misaligned / with another behind / That's a moiré..."

  • Matt Grum

    Correct answer

    10 years ago

    Moiré is a form of aliasing whereby false patterns can be observed in an image.

    Imagine a lighthouse which sends a pulse of light every 5 seconds, and a camera (or other observer) which sees the lighthouse for three seconds and is then blocked from seeing it for three seconds:

    lighthouse:        *....*....*....*....*....*....*....*...
    observer:          ***...***...***...***...***...***...***
    observed pattern:  *...................*....*....*........

    What is actually a regular pulse is observed as a highly uneven pattern due to temporal aliasing, caused by the sampling frequency being close to, but different from, the frequency of the phenomenon being observed. This is why wagon wheels can appear to spin backwards when filmed with a movie camera at a fixed frame rate.
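    The timeline above can be simulated in a few lines (a sketch; the 5-second pulse and the 3-seconds-on/3-seconds-off window are taken from the analogy above):

```python
# Sketch of the lighthouse analogy: a pulse every 5 seconds, watched
# through a window that is open for 3 seconds and blocked for 3 seconds.
def pulses(t):
    return t % 5 == 0          # lighthouse flashes on every 5th second

def window_open(t):
    return t % 6 < 3           # observer sees it for 3 s, then not for 3 s

ticks = range(39)
lighthouse = "".join("*" if pulses(t) else "." for t in ticks)
observer   = "".join("*" if window_open(t) else "." for t in ticks)
observed   = "".join("*" if pulses(t) and window_open(t) else "." for t in ticks)

print("lighthouse:       ", lighthouse)
print("observer:         ", observer)
print("observed pattern: ", observed)
# The regular 5 s pulse comes out as an uneven pattern: one flash, a long
# gap, then three flashes in relatively quick succession.
```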

    Moiré is exactly the same phenomenon, but is an example of spatial rather than temporal aliasing. When a regular pattern of light and dark, such as man-made fibre cloth, is imaged by a sensor with a fixed pattern of pixels, the evenness of the fibres gives way to an artificial pattern at a much lower frequency (more widely spaced):

    Moiré on parrot feathers

    Image by Fir0002/Flagstaffotos. License
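    The same arithmetic works spatially (a toy sketch with made-up numbers, not part of the original answer): a stripe pattern with 9 cycles across the frame, point-sampled by a row of only 10 pixels, produces samples identical to those of a single slow cycle.

```python
import math

# Toy spatial aliasing: 9 fine stripe cycles sampled by only 10 pixels.
# Each pixel's point sample matches a 1-cycle pattern exactly, so the
# sensor records a false, widely spaced stripe instead of the fine cloth.
stripe_cycles, pixels = 9, 10
fine = [math.cos(2 * math.pi * stripe_cycles * i / pixels) for i in range(pixels)]
slow = [math.cos(2 * math.pi * 1 * i / pixels) for i in range(pixels)]

# The two patterns are indistinguishable at these sample points:
assert all(abs(a - b) < 1e-9 for a, b in zip(fine, slow))
```

This is the identity cos(2π · 9i/10) = cos(2π · i · (10 − 9)/10): a frequency just below the sampling rate folds down to the difference frequency.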

    Aliasing is at its worst when the frequency of the pattern is close to the sampling frequency (the density of pixels in the sensor); low-frequency patterns aren't a problem. An anti-aliasing filter blurs the image, thereby reducing the frequency of the input and the chance of aliasing. The downside is that it also blurs non-repeating patterns which would not have caused noticeable aliasing.
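    The effect of the filter can be sketched with toy numbers (my illustration, not the answer's): averaging the light across each pixel's full width before sampling — a crude box-blur stand-in for an AA filter — suppresses a near-Nyquist stripe pattern instead of letting it alias.

```python
import math

cycles, pixels, sub = 9, 10, 200   # 9 stripe cycles read by 10 pixels

def point_sample(i):
    """No AA filter: read the pattern at a single point per pixel."""
    return math.cos(2 * math.pi * cycles * i / pixels)

def box_blurred(i):
    """Crude AA filter: average the pattern across the pixel's width."""
    return sum(math.cos(2 * math.pi * cycles * (i + k / sub) / pixels)
               for k in range(sub)) / sub

sharp   = [point_sample(i) for i in range(pixels)]
blurred = [box_blurred(i) for i in range(pixels)]

print(max(abs(v) for v in sharp))    # 1.0: a full-strength false stripe
print(max(abs(v) for v in blurred))  # ≈0.1: the false stripe is largely gone
```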

    Instead of reducing the image frequency by blurring, aliasing can also be mitigated by increasing the sampling frequency, i.e. having more megapixels (without increasing sensor size or lens sharpness). Digital medium format cameras have lots of megapixels but still suffer from aliasing due to sharper lenses.
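    Continuing the toy numbers (again my sketch, not from the answer): reading the same 9-cycle stripes with 24 pixels — above the 18 samples the Nyquist criterion requires — the pattern no longer coincides with a slow false stripe.

```python
import math

cycles, pixels = 9, 24              # now more than 2 samples per cycle
fine = [math.cos(2 * math.pi * cycles * i / pixels) for i in range(pixels)]
slow = [math.cos(2 * math.pi * 1 * i / pixels) for i in range(pixels)]

# Unlike the undersampled case, the fine pattern's samples are clearly
# distinct from any 1-cycle pattern, so no false stripe is recorded.
print(max(abs(a - b) for a, b in zip(fine, slow)))  # well above zero
```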

    Anti-aliasing filters are fitted to almost all 35mm DSLR sensors as standard. They can be removed; I know of one company that does this:


    The advantage is greater sharpness but more moiré. If you only shoot random natural textures then this is probably a good idea. Some digital MF cameras and backs lack an AA filter. Opinion is divided on why, given how bad aliasing can be on fabrics and the fact that MF is frequently used for fashion. My view is that these images are probably shot at f/32–f/45 with giant lighting power packs, and the diffraction acts as an AA filter.

    Moiré can be reduced in software, but only to an extent; it's usually much better to use an AA filter. You can get screw-on anti-aliasing filters for lenses if your sensor lacks an AA filter.

    Moiré is related to false colour due to the demosaicing process. As (most) digital sensors are monochrome devices, alternating colour filters are placed over the pixels and the colours interpolated to produce a full-colour image. When observing a high-frequency monochrome pattern, adjacent pixels with different colour filters may see peaks and troughs in the signal respectively, and this can be (falsely) interpreted as different colours being present in the input.
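    A toy sketch of that interaction (the RGGB layout and one-pixel stripe width are my assumptions for illustration): monochrome stripes exactly one pixel wide land bright on the red photosites and dark on the green ones, so a naive demosaic reports colour where the scene has none.

```python
# Monochrome scene: alternating bright/dark columns, one pixel wide.
width = 8
scene = [1.0 if x % 2 == 0 else 0.0 for x in range(width)]

# One row of an RGGB Bayer mosaic alternates red and green filters.
filters = ["R" if x % 2 == 0 else "G" for x in range(width)]

red   = [v for v, f in zip(scene, filters) if f == "R"]
green = [v for v, f in zip(scene, filters) if f == "G"]

print(sum(red) / len(red))      # 1.0 — red channel fully bright
print(sum(green) / len(green))  # 0.0 — green channel fully dark
# Interpolating these channels yields a strong red cast from a scene
# that contains no colour at all: false colour.
```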

    This is one of the best explanations I've ever read. Concur, +1 for the lighthouse analogy.

    why is the observed pattern "on" when the observer cannot see the lighthouse?

    the observed pattern is not on the same time scale; perhaps this is an easier-to-understand version of it: `*...................*....*....*........`

License under CC-BY-SA with attribution

Content dated before 7/24/2021 11:53 AM