Diffraction
#1
I know it's a silly question, but anyway :-)

I just thought - when diffraction kicks in at a specific aperture, red light is affected first (it has the longest wavelength). And, as I understand it, the red channel will suffer some light loss as well, so technically the camera should raise the red sensitivity to counter that, correct? Given that the red and blue channels are already quite bad noise-wise, would that mean that at a specific aperture (before diffraction affects the blue channel) red noise will look worst? :-)
#2
[quote name='Lomskij' timestamp='1291199421' post='4636']

I know it's a silly question, but anyway :-)

I just thought - when diffraction kicks in at a specific aperture, red light is affected first (it has the longest wavelength). And, as I understand it, the red channel will suffer some light loss as well, so technically the camera should raise the red sensitivity to counter that, correct? Given that the red and blue channels are already quite bad noise-wise, would that mean that at a specific aperture (before diffraction affects the blue channel) red noise will look worst? :-)

[/quote]



No. The total amount of light energy remains the same across all pixels (assuming that the shutter speed is adjusted to match the aperture).

It is only diffused - so the "give and take" is constant.
#3
[quote name='Klaus' timestamp='1291200297' post='4637']

No. The total amount of light energy remains the same across all pixels (assuming that the shutter speed is adjusted to match the aperture).

It is only diffused - so the "give and take" is constant.

[/quote]



So does that mean that light loss due to "bending" (when light starts falling outside the sensor area) is negligible?
#4
[quote name='Lomskij' timestamp='1291201921' post='4641']

So does that mean that light loss due to "bending" (when light starts falling outside the sensor area) is negligible?

[/quote]



This is not how diffraction works.



Diffraction does not "bend" the light so that it falls outside the area covered by the sensor ... it distributes the light in a way that makes the remaining image less sharp, but it does not take away any light.



Imagine a single very small photocell and a single very small light beam ... if this light beam is distributed (by diffraction) over a larger area, its intensity indeed decreases ... but ... in reality there are all those neighbouring light beams that now also get distributed over larger areas ... so ... what you lose from one beam, you gain from another.
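For illustration (the f-number and wavelengths below are my own picks, not from the thread): the diameter of the diffraction blur spot (the Airy disk) scales linearly with wavelength and f-number, roughly d ≈ 2.44 · λ · N, which is why red is affected before blue at any given aperture:

```python
# Back-of-the-envelope sketch: Airy disk diameter grows with wavelength,
# so at the same f-number red light is blurred more than blue.

def airy_disk_diameter_um(wavelength_nm, f_number):
    """Diameter of the first Airy minimum, in micrometres (d = 2.44 * lambda * N)."""
    return 2.44 * (wavelength_nm / 1000.0) * f_number

# Representative wavelengths for the three channels (illustrative values)
for name, wl in [("blue", 470), ("green", 530), ("red", 640)]:
    d = airy_disk_diameter_um(wl, 11)  # e.g. at f/11
    print(f"{name:5s} ({wl} nm) at f/11: {d:.2f} um")
```

On a camera with ~4 µm photosites, all three spot sizes above already span several pixels, with red roughly a third larger than blue.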
#5
Concerning my previous diffraction-related posts, I have to hold myself back, but I can't help thinking that I lack a good deal of basic knowledge of electronics and optics :unsure: ...



We have a Bayer filter in front of the photosites so that color information can be recorded at all... And there's also a process called "demosaicing" which does the interpolation (guessing the color values from the adjacent cells) and produces the image data. And on top of those, if the diffraction of red, green and blue light has different characteristics, I can only say I am very amazed that these little marvels can produce a decent-looking image at all... I hope it is my misconception (again) :huh: ...



Kind regards,



Serkan
#6
[quote name='Rainer' timestamp='1291205074' post='4644']

This is not how diffraction works.



Diffraction does not "bend" the light so that it falls outside the area covered by the sensor ... it distributes the light in a way that makes the remaining image less sharp, but it does not take away any light.



Imagine a single very small photocell and a single very small light beam ... if this light beam is distributed (by diffraction) over a larger area, its intensity indeed decreases ... but ... in reality there are all those neighbouring light beams that now also get distributed over larger areas ... so ... what you lose from one beam, you gain from another.

[/quote]



Right. Ok, it probably will sound really stupid now, but does the image circle projected by the lens expand due to diffraction?
#7
[quote name='Lomskij' timestamp='1291210018' post='4650']

Right. Ok, it probably will sound really stupid now, but does the image circle projected by the lens expand due to diffraction?

[/quote]

What you see with diffraction is energy that is "supposed" to end up on one sensel ending up on neighbouring sensels instead. Edges therefore get more and more blurred.



But the sensels are so very small that the light does not really end up in a totally different place. Yes, some light will end up in a slightly wider area than without any diffraction, but that too is a very small difference ... so you will not be able to detect it (you will not start getting underexposed images due to diffraction compared to images without it).
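The point made above — blur only redistributes light, the total stays constant — can be sketched numerically (a 1-D toy model of my own, not an actual diffraction point-spread function):

```python
# Blurring a row of pixels with a normalised kernel: each pixel's light is
# redistributed to its neighbours, but the total is conserved.

def blur_1d(pixels, kernel):
    """Convolve a 1-D row of pixel values with a normalised blur kernel."""
    assert abs(sum(kernel) - 1.0) < 1e-9  # kernel redistributes, never absorbs
    r = len(kernel) // 2
    n = len(pixels)
    out = [0.0] * n
    for i in range(n):
        for k, w in enumerate(kernel):
            j = i + k - r
            if 0 <= j < n:  # light landing off-frame would be lost,
                out[i] += w * pixels[j]  # so keep sources away from the edges
    return out

row = [0, 0, 100, 0, 0, 50, 0]             # two bright "point sources"
blurred = blur_1d(row, [0.25, 0.5, 0.25])  # mild diffraction-like spread
print(sum(row), sum(blurred))              # both total 150: nothing taken away
```

The peaks drop (100 becomes 50) and the edges smear, but the summed "exposure" is unchanged — exactly the "give and take" described earlier in the thread.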
#8
[quote name='Lomskij' timestamp='1291210018' post='4650']

... does the image circle projected by the lens expand due to diffraction?

[/quote]



No. What happens is that the amount of vignetting usually decreases when stopping down, so the brightness is more evenly distributed over the image circle ... and by stopping down, the usable image circle might well expand a little ... but this is not caused by diffraction.
  

