Talk:Chromatic aberration


Purple Fringing isn't CA?

In digital photography it is common to see color fringes around high-contrast areas. Imagine a dark spot against a light background. Chromatic aberration in the main lens of any camera (digital or otherwise) arises because different wavelengths of light refract through the lens at slightly different angles, displacing the "red" part of the image from the "green" (say) and leaving a red fringe on one side of the dark spot and a green fringe on the opposite edge.
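
As a rough illustration of where that wavelength dependence comes from, the standard thin-lens (lensmaker's) relation makes the focal length depend on the refractive index, which itself varies with wavelength:

\[ \frac{1}{f(\lambda)} = \bigl(n(\lambda) - 1\bigr)\left(\frac{1}{R_1} - \frac{1}{R_2}\right) \]

where n(λ) is the refractive index and R₁, R₂ are the surface radii of curvature. With normal dispersion n is larger for blue than for red, so blue is focused more strongly, and off-axis the colors end up at slightly different image sizes -- hence the opposite-colored fringes described above.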

Purple fringing is usually used, in my experience as a professional photographer, to refer to a separate artifact in which all sides of the spot I described above would have a fringe of the same color, but the source of that particular type of error would seem to be somewhat different than what's usually meant by "chromatic aberration."

Lacking citations, I'm making no changes to the text, but I encourage interested authors to consider and if appropriate apply the distinction I'm making.


CAs are color fringing caused by the objective (lens)

Purple fringing is the same as transverse chromatic aberration, which does not always have to show more than one smudged color. In most cases it is overlaid with longitudinal chromatic aberration, and it also manifests as a difference in magnification between wavelengths, for example green fringing.

CA is always in pairs of colours

Which you can actually see in the examples. 'Real' optical CA is a violet edge on one side of a sharp detail and red on the other side. The graph at the bottom explains why those pairs are red and violet: opposite ends of the spectrum. The top image has purple/green fringing, which would be caused by a YUV-based video system with a phase error somewhere in the U/V (chroma) processing. The top image shows a video error, not optical CA.

Mirrors

Do mirrors show CA? —Preceding unsigned comment added by 130.194.13.105 (talk) 03:57, 8 November 2007 (UTC)


Someone would have to confirm this, but I believe CA occurs in lenses and refracting telescopes because of their refractive properties. Since a mirror only reflects light, CA shouldn't be an issue; or if it is, it's called something else.

Absolutely not. Mirrors don't suffer from chromatic aberration since they don't refract light; they reflect it. This was shown centuries ago by Isaac Newton, and it is the reason modern telescopes (including the Hubble telescope) use mirrors.

I agree. I'm surprised this article never mentions mirrors, so I added a short sentence. --DavidCary (talk) 17:56, 10 December 2021 (UTC)

Chromatic Aberration in Photography

I noticed that a large part of the article is dedicated to the effects of CA in photography. This is great, as CA affects photography, but the article (in my opinion) should explain the causes of chromatic aberration before discussing its effects, so I'm going to move the two paragraphs in the intro dedicated to photography and digital cameras under a new sub-heading called "Chromatic Aberration in Photography". The paragraphs may look out of context, so anyone with deep knowledge of photography and chromatic aberration effects, please edit them; try to include technical detail on how digital cameras correct some of these problems, which should be interesting. —Preceding unsigned comment added by 189.163.205.77 (talk) 05:05, 4 March 2008 (UTC)

References to Nikon and Panasonic

The references to Nikon's and Panasonic's in-camera correction capabilities are correct, to my knowledge, but they seem more appropriate in a buyer's guide than in a discussion of CA. The comments made me wonder if they were included as brand name promotion rather than as impartial information. Perhaps the brand name references should be removed. I'm new here, so I'm not jumping in and making edits myself at this point. I just wanted to bring it up for consideration. —Preceding unsigned comment added by Dave21742 (talkcontribs) 16:48, 17 March 2010 (UTC)

Disambiguation needed

I think this article requires disambiguation links for color fringing, AKA "color aberration", in CMYK offset printing due to misaligned 4-color templates, and for color fringing, AKA "color shifting" or "color drooping", a similar-looking issue in composite video due to poor-quality copies of the video signal's chroma subsampling (I'm aware that neither of the two issues has its own article yet). --79.193.44.240 (talk) 21:45, 25 May 2010 (UTC)

better examples needed

The examples given make no reference to the type of aberration, hence are of limited value. We need a comprehensive set of illustrations of the two types and their superposition. smartcat (talk) 04:00, 19 July 2010 (UTC)

Examples: Axial vs. Lateral

The examples show lateral chromatic aberration, but all of the ray diagrams show axial chromatic aberration. I've never seen an example of an image suffering from axial CA; we should have one if it can be detected. We should also have a ray diagram showing how lateral chromatic aberration comes about. —Ben FrantzDale (talk) 13:36, 14 April 2011 (UTC)

Is image processing possible or not?

The section 'Image processing to reduce the appearance of lateral chromatic aberration' starts out with a forceful statement that post-processing can never correct CA. Then later paragraphs talk about existing systems that do correct CA in post-processing. I believe the later paragraphs more than the first paragraph, but none have citations. I'm inclined to remove the first paragraph since it sounds ranty and is inconsistent, but I hesitate since none of it is verifiable as written. In fact I'm just going to mark this whole article as needing citations, because it does. — Preceding unsigned comment added by Leopd (talkcontribs) 04:51, 26 July 2011 (UTC)

It's complicated. Let's talk about lateral CA -- where you get rainbows at sharp edges toward the edge of an image because the red image is a different size than the blue image. Now, to a first approximation, you can scale the red, green, and blue channels separately to line them up -- this is what the later paragraphs are talking about. But what if you had a black & white camera? Well, you'd still have CA; it would just show up as smearing rather than looking like a rainbow. Now you don't have the color information to help you fix things. You could try something like deconvolution, but it would be ill-conditioned.
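
A minimal sketch of that first-approximation channel rescaling (illustrative only -- the file name and scale factors below are made up; real correction software derives per-lens factors from calibration data or by minimizing edge misregistration between channels):

    # Illustrative lateral-CA reduction: rescale the red and blue channels
    # about the image centre relative to green. Scale factors are hypothetical.
    import numpy as np
    from PIL import Image

    def rescale_channel(channel, scale):
        """Magnify (scale > 1) or shrink (scale < 1) a 2-D channel about the image centre."""
        h, w = channel.shape
        yy, xx = np.mgrid[0:h, 0:w]
        cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
        # For each output pixel, sample the source at coordinates scaled about the centre
        # (nearest-neighbour resampling, to keep the sketch short).
        src_y = np.clip(np.round(cy + (yy - cy) / scale).astype(int), 0, h - 1)
        src_x = np.clip(np.round(cx + (xx - cx) / scale).astype(int), 0, w - 1)
        return channel[src_y, src_x]

    img = np.asarray(Image.open("photo_with_lateral_ca.jpg"), dtype=np.float32)
    r, g, b = img[..., 0], img[..., 1], img[..., 2]

    # Hypothetical corrections: shrink red slightly, enlarge blue slightly,
    # keeping green as the reference channel.
    corrected = np.stack([rescale_channel(r, 0.999),
                          g,
                          rescale_channel(b, 1.001)], axis=-1)
    Image.fromarray(np.clip(corrected, 0, 255).astype(np.uint8)).save("corrected.jpg")

This only re-registers the channels against each other; it cannot restore detail that has been smeared within a single channel, which is the limitation discussed in the rest of this thread.
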
Considering the color case again, "red" isn't just a single wavelength, so the red channel of an image actually has the same problem as a B&W image. So while having R, G, and B helps you reconstruct the image, it's not perfect, because you have no way of warping the 620 nm part of the image separately from the 740 nm part. With a multispectral image you could in principle align all wavelengths, but that's not a standard camera.
Furthermore, there's axial chromatic aberration. If you have spherical aberration in the blue wavelengths, but less so in the red wavelengths, then fully reconstructing the image would mean deconvolving the spherical aberration, which again is ill-conditioned.
In the very special case that you illuminate with a few lasers, you could design an optical system and image sensor such that the image was sharp at each wavelength, and then you could rectify any remaining radial CA -- for example, with a standard color sensor, a diffraction-limited apochromat, and red, green, and blue lasers. That would be a case in which you could remove all of the CA in post-processing, but it's a very special case. —Ben FrantzDale (talk) 12:58, 26 July 2011 (UTC)
What you're saying tells me the first paragraph is incorrect. Statements like "This detail cannot be restored other than by recapturing the original scene with a better lens." and "camera software producers falsely claim the ability" are ill-informed and misleading. Even if it's complicated, as you say, under the right conditions some amount of CA can be corrected in post-processing. Moreover, the limited circumstances in which this is somewhat possible (having a color sensor, having detailed knowledge about the lens that took the picture) are pretty much the norm these days, with RGB sensors and EXIF data.
Unless somebody stands up for it, I'm going to remove the first paragraph of this section. Leopd (talk) 16:53, 28 July 2011 (UTC)
No, I still agree with the first paragraph. It really is impossible in general to fully correct CA, even with complete information about the light spectrum and about the optics. So if I have a lens exhibiting CA, I completely agree that "This detail cannot be restored other than by recapturing the original scene with a better lens." I am sure some camera software claims to correct CA, but some information is really lost: while you can warp the R, G, and B channels together, even if you go nuts and do some sort of deconvolution, you've already lost some information that you'll never get back. The details are pretty deep, so it may make sense to explain these claims more in the article. —Ben FrantzDale (talk) 20:07, 28 July 2011 (UTC)
I have been shooting Nikons for many years, and you are probably right in theory, but in practice these cameras remove CA from the image very well. It works with any lens, even manual ones; they don't use a database. 84.178.29.222 (talk) 18:57, 30 July 2020 (UTC)

"different wavelengths of light"[edit]

The opening paragraph contends that chromatic aberration "occurs because lenses have a different refractive index for different wavelengths of light." Would it be more accurate to say "different frequencies of light", because the frequency of the light does not change as it passes through the medium? GreenMachine86 (talk) 17:55, 20 August 2011 (UTC)

I suppose that, to be pedantic, yes, that's probably better. The frequency phrasing is uncommon enough, though, that it might want a parenthetical "(different wavelengths in a vacuum)" or something. —Ben FrantzDale (talk) 12:52, 22 August 2011 (UTC)
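
For what it's worth, the underlying relation (standard optics, stated here just as a reminder) is that the frequency is fixed by the source while the wavelength shortens inside the medium:

\[ \nu = \frac{c}{\lambda_0}, \qquad \lambda_{\text{medium}} = \frac{\lambda_0}{n} \]

where λ₀ is the vacuum wavelength and n the refractive index, so "frequency" (or equivalently "vacuum wavelength") is the unambiguous label for a color.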

Content is being used in a bogus website advert

Maybe this is off-topic, but I think it is worth mentioning. A website (kinotehnik.com) selling LCD viewfinders is using the photographs that illustrate this article - and is actually claiming that the photograph without chromatic aberration was taken using their product, and the one with chromatic aberration was taken using a rival product! The original image description on Wikimedia Commons states that the image without chromatic aberration was taken using a Sony V3 camera, and the one with chromatic aberration was taken using the same camera but with a wide-angle lens attachment. Meowy 13:10, 16 June 2012 (UTC)

possibility of chromatic aberration being useful?

In MediBang Paint there's a filter option that applies chromatic aberration to the selected layer, and I've found that it results in the same visual effect as when you look at the screen through a diffraction grating. I know this only applies to the RGB color space, which is how color is displayed on the screen itself, but could this theoretically be useful for visualizing spectra?

[Image captions: chromatic aberration in RGB, done with a diffraction grating; chromatic aberration as calculated digitally]

— Preceding unsigned comment added by Gridzbispudvetch (talkcontribs) 02:57, 2 October 2019 (UTC)

Gridzbispudvetch (talk) 03:39, 2 October 2019 (UTC)

You could use actual chromatic aberration to make spectra apparent. Computer-simulated chromatic aberration is not going to be helpful, because you're limited to the data you already have: an RGB image does not capture the full spectrum. --Srleffler (talk) 19:17, 6 October 2019 (UTC)