A CCD is a stamp-sized silicon sensor that acts as an eye for digital imaging devices. When light passes through a camera’s lens, it hits the CCD’s millions of photoelectric cells, which convert that light into electrons. The more light a photocell is exposed to, the more electrons it holds. (This effect, called the photoelectric effect, earned Albert Einstein a Nobel Prize in 1921.) The charge in each cell of the CCD is then read out row by row, translated into binary code, and finally recreated on a display as pixels. Since the CCD only captures images in black and white, colored filters placed over the cells determine the color of each pixel.
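The readout process described above can be sketched as a toy simulation. This is only an illustration, not real sensor firmware: the full-well capacity, 8-bit quantization, and the function name are all assumptions chosen for the example.

```python
import numpy as np

def read_out_ccd(light, full_well=50000, bit_depth=8):
    """Toy CCD readout: turn a 2D light-exposure map into pixel values.

    Each cell holds electrons in proportion to the light it received
    (the photoelectric effect), capped at an assumed full-well
    capacity. Charges are then read out row by row and quantized to
    a binary pixel value (0-255 for 8 bits).
    """
    electrons = np.clip(light, 0, full_well)   # charge stored per photocell
    max_value = 2 ** bit_depth - 1             # brightest representable pixel
    rows = []
    for row in electrons:                      # row-by-row charge transfer
        rows.append(np.round(row / full_well * max_value).astype(int))
    return np.array(rows)

# A tiny 2x3 sensor: cells exposed to more light yield brighter pixels,
# and anything beyond the full-well capacity saturates at the maximum.
exposure = np.array([[0, 25000, 50000],
                     [60000, 10000, 5000]])
pixels = read_out_ccd(exposure)
```

In a real camera a color filter array sits over the cells, so each photocell records only red, green, or blue light; the grayscale values above would then be interpolated into full-color pixels.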
Boyle and Smith first came up with the idea of the CCD as a form of electronic memory in a brief brainstorming session at Bell Labs in 1969. Its potential as an imaging technology was quickly realized, and the technology has been refined over the past 40 years. The first camera to include a CCD appeared in 1981, and the first fully digital camera was released in 1995. Today, the CCD has proven itself to be a valuable tool outside of consumer digital photography, particularly in medical imaging and astronomy. Doctors use CCD endoscopes to look inside the human body without major surgery. The Hubble Space Telescope uses four 0.64-megapixel CCD sensors to capture its photographs of deep space.
CCD technology is not without competition. In recent years, CMOS (Complementary Metal Oxide Semiconductor) sensors have surged in popularity due to their lower energy consumption and cheaper production costs.