Why are camera sensors green?



























When I look at a CMOS sensor, it's green.

But photos of CCD sensors on the internet show pink sensors.

So what exactly defines the color of a camera sensor? In particular, what defines the colors of the sensors in a gorgeous 3CCD camcorder?

I looked at the CMOS sensor in sunlight. Would there be a difference if I looked at it in a dark room with a perfectly white flashlight in my hand?

Tags: sensor cmos-image-sensor ccd
asked Feb 7 at 10:15, edited Feb 7 at 14:10 · Jonathan Irons

4 Answers






The color you see when you look at a "sensor" is usually determined by the combined colors of the color filter array placed directly in front of the actual silicon chip, together with the combination of other filters (low-pass, IR, UV) placed in the "stack" in front of the sensor.



Although we call them "red", "green", and "blue", the colors of most Bayer masks are as follows (a short illustrative sketch appears after the list):




          • 50% "green" pixels that are centered on around 530-540 nanometers and significantly sensitive to light ranging from about 460nm to past 800nm and the edge of the infrared range. The "color" of 540nm light is a slightly bluish green color.

          • 25% "blue" pixels that are centered on around 460nm and significantly sensitive to light ranging from the non-visible ultraviolet range to about 560 nm. The "color" of 460nm light is a bluish-violet color.

          • 25% "red" pixels that are centered on around 590-600nm and significantly sensitive to light ranging from about 560nm to well into the infrared range. The "color" of 600nm light is a yellowish-orange color. (What we call "red" is on the other side of orange at about 640nm).


          The "color" components of the Bayer mask can be seen by looking at spectral response curves for various sensors:



[image: spectral response curves for one sensor]



[image: spectral response curves for another sensor]



          The "colors" each type of cone in the human retina are most sensitive to are similar:



[image: spectral sensitivity curves of the human cone types]



Here is a representation of the "colors" humans perceive at various wavelengths of light:



[image: the visible spectrum, showing perceived color by wavelength]



Please compare the peaks of the sensitivities above with the "colors" of those wavelengths along the visible spectrum.



Most tri-color imaging sensors have no coating centered on what we call "red", all of the drawings of CMOS sensors with Bayer filter arrays found on the internet notwithstanding.



[image: a typical internet illustration of a Bayer filter array]



Most CMOS sensors placed in cameras used for taking the types of images we consider "photography" here have a "stack" of filters that include both infrared (IR) and ultraviolet (UV) cut filters in front of the Bayer color filter array. Most also include a low pass "anti-aliasing" filter. Even sensor designs that are said to have "no low pass filter" tend to have either a cover glass with the same refractive index or the two components of a low pass filter oriented to each other so that the second one cancels the first one.



[image: diagram of the filter stack in front of a sensor]



[image: another filter stack diagram]



What one sees when looking into the front of a camera at an exposed CMOS sensor is the combined effect of light reflecting off all of these filters. It is dominated by the slightly bluish-green tint of the "green" filtered portions of the Bayer mask, combined with half as many blue-violet and orange-yellow filtered portions that we call "blue" and "red". When the sensor is sitting inside an actual camera, most of the light striking it and the stack in front of it arrives from a fairly narrow range of angles and is usually fairly uniform in color. (The purple tint on the edge of the Sony sensor is probably due to reflections of light at just the right angles off the UV and/or IR cut filters.)



[images: exposed camera sensors, including a Sony sensor with a purple tint at its edge]



When light from a wide range of angles falls on such a sensor without the filter "stack" in front of it, a prismatic effect is also evident, showing a fuller range of colors. It is due to the shapes of the microlens surfaces on top and the colors of the Bayer mask sandwiched between the microlenses and the sensor.



[image: a bare sensor showing a prismatic range of colors]






answered Feb 9 at 19:48, edited Feb 9 at 20:00 · Michael C

An unfiltered CCD or CMOS sensor looks very similar to any other silicon integrated circuit with a very regular, repeating structure of similar feature size: semi-metallic gray (from the silicon, quartz, and aluminum), with some iridescence probably resulting from diffraction-grating effects in the fine, repeating structures. Compare a bare DRAM or flash memory chip.



A filtered sensor typical of a color video or still camera will appear greenish because heavily green-biased colour filter matrices (two green pixels for every one red and one blue pixel) are very commonly used, since a similar perception bias is well known to exist in the human eye (even in non-green-eyed individuals :) ).
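
As a rough, editor-added illustration of that perception bias (not a claim from the original answer): the luma weighting used in standard-definition video, ITU-R BT.601, approximates perceived brightness with green carrying roughly twice the weight of red and about five times the weight of blue:

    Y' = 0.299 R' + 0.587 G' + 0.114 B'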






answered Feb 7 at 14:07, edited Feb 7 at 14:58 · rackandboneman

Comments:

• How do you figure that "diffraction grating" effects explain any of this; light wavelengths are on the order of 400-800nm, an order of magnitude smaller than any of the lateral features on the top level of a typical photographic sensor. – Shamtam, Feb 7 at 21:22

• The bayer filters are usually dyed, not dichroic AFAIK - a dyed green filter is green :) – rackandboneman, Feb 7 at 21:42

• @Shamtam there's plumbing around the pixels ... and many silicon ICs manufactured in few-micrometer processes (think of a 27C128 EPROM) iridesce intensely.... – rackandboneman, Feb 7 at 21:44

• Ah, I was not aware of that about typical Bayer filters. However, I still don't buy that diffraction grating of the sensor itself would cause any visible iridescence after traveling through the microlenses and color filters in front of the Si sensor itself. It's tough to say for sure without an actual diagram of the design of the sensor. I'm mostly at contention with the "semi-metallic gray" being the base color of an IC. Two different Si wafers with different thickness oxides grown on them can look vastly different, neither being anything close to gray. – Shamtam, Feb 7 at 21:58

• This doesn't explain why some sensors look pink even though they have green-biased bayer filters. – xiota, Feb 8 at 0:11

I've personally seen sensors of various colors in different cameras: green, pink, blue, etc. Without the specific dimensions and construction details it is tough to say, but I'd imagine that the color of most sensors is given by the thickness of the coatings on top of the sensor. Different thicknesses produce different colors due to thin-film interference: depending on the coating thickness, certain wavelengths of light (i.e. colors) destructively interfere with themselves in the coating, and the wavelengths that do not are reflected back to give you the color you see.
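
For reference, here is the textbook relation behind thin-film interference (an editor-added note, not part of the original answer). For a single coating of refractive index n and thickness d, the rays reflected from its top and bottom surfaces differ by an optical path length of 2 n d cos(θt), where θt is the angle of the light inside the film. Reflected wavelengths λ for which

    2 n d cos(θt) = m λ,   m = 0, 1, 2, ...

are either reinforced or suppressed; which of the two it is depends on the phase shifts at the two interfaces, i.e. on how the refractive indices of the layers are ordered. Real cover glasses are usually multi-coated, so the visible tint is the combined result of several such conditions.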






answered Feb 7 at 21:33 · Shamtam

Photographic film is naturally sensitive only to violet and blue light. Hermann Vogel, a professor at the Berlin Technical Institute, was attempting to solve the problem of "halation": he had some emulsions dyed yellow to keep blue light reflected from the emulsion-base interface from re-exposing the film. It worked, but to his amazement the film also gained sensitivity to green light (orthochromatic). His graduate students later discovered other dyes that sensitize emulsions to red light. This was an important step: emulsions sensitive to red, green, and blue yielded correct monochromatic rendering. These tweaked emulsions made future color films possible.



As CCD and CMOS sensors evolved, it was also necessary to tweak their RGB sensitivity. Bryce Bayer of Eastman Kodak developed a subpixel matrix scheme that coats the individual photosites with strong additive color filters, approximately 50% green, 25% blue, and 25% red. This scheme tweaks the overall sensitivity so that a more faithful image results.



Because the image sensor is highly sensitive to infrared radiation, the entire imaging surface is covered by a filter. This flat cover glass does dual duty, also protecting the fragile surface from abrasion. A cover glass is highly polished, so, like a polished lens, it loses some light to surface reflection.



Robert Taylor, a London optician, discovered that aged lenses acquired a natural coating of grime from air pollution. These "bloomed" lenses reflected away only 2% of the light, whereas a new lens reflected away 8%. Artificial blooming (coating) took hold in the 1930s.



The coated lens or cover glass appears dichroic: it looks one color by transmission and the complementary color by reflection. If, say, the coating is meant to suppress red and blue reflections, the glass appears green by reflected light and magenta by transmitted light. Because most such glass is multi-coated, casual observation gives little clue as to which color is being mitigated.






answered Feb 8 at 20:41 · Alan Marcus