
3.2 Digital Camera Systems

3.2.1 Camera Setup and Components

The two most important components of a digital camera are the lens system and the imaging sensor, which should ideally be tailored to each other. Several mechanical components such as the aperture and the shutter, as well as the camera electronics and host software, also influence system performance, but will not be discussed in detail.

Imaging Sensor

An imaging sensor converts irradiance incident on a light-sensitive area into an electrical signal whose exact nature depends on the sensor technology. Imaging sensors in digital cameras generally consist of a one- or two-dimensional array of sensor elements (sensels) or picture elements (pixels).

1 There is some confusion about the meaning of the word calibration: In some areas such as color science, calibration means to set a device into a defined state. In an additional profiling step, this state is characterized, and the characterization is later taken into account in order to achieve a defined behavior of the device. In computer graphics, calibration means tuning a general model of the physical device in order to best describe the specific instance of the device. Unless otherwise noted, we use this second meaning in this thesis.


Figure 3.3: The Bayer pattern [Bayer76] for color image capture. Left: Arrangement of the individual color sensors. Middle: Color interpolation artifacts in an image acquired with a Bayer pattern sensor (Jenoptik ProgRes C14, single shot, fast reconstruction). Right: Ground truth image acquired using 4-shot mode. Note also the visually higher resolution of the ground truth image.

In the case of a charge coupled device (CCD) sensor, light is absorbed and creates free electrons that are stored inside the sensor element, which serves as a capacitor (e.g., [Janesick01]). At the end of an exposure, the collected charge is shifted towards one end of the sensor using an analog shift register and digitized using an analog-digital converter. Due to the physical principles involved, CCD sensors are highly linear, i.e., the amount of charge collected is proportional to the irradiance. The main performance limitations are leakage currents caused by chip defects and thermal noise. Cooling the sensor greatly improves its performance, as the noise doubles with every temperature increase of about 6 Kelvin.
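The doubling rule above implies an exponential scaling of dark noise with temperature. A minimal sketch, assuming the 6 K doubling interval from the text and an arbitrary 20 °C reference point chosen for illustration:

```python
def relative_dark_noise(temp_c, ref_temp_c=20.0, doubling_interval_k=6.0):
    """Dark noise relative to the level at the reference temperature,
    assuming noise doubles every `doubling_interval_k` Kelvin."""
    return 2.0 ** ((temp_c - ref_temp_c) / doubling_interval_k)

# Cooling the sensor from 20 degC to 2 degC (18 K colder) should cut the
# dark noise by roughly a factor of 2**3 = 8.
print(relative_dark_noise(2.0))  # -> 0.125
```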

Both the Kodak DCS 560 and the Jenoptik ProgRes C14 digital cameras (see Section 3.2.4) employ CCD imaging sensors.

The dynamic range, i.e., the ratio of the brightest to the darkest non-zero recordable intensity value of a CCD sensor, is limited not only by noise but also by the well capacity. A sensel of the Sony ICX085AK chip used in the Jenoptik ProgRes C14 camera can typically store about 25,000 electrons, so that the dynamic range even of a noise-free sensor is limited to about 1:25,000, requiring less than 15 bits for digitization. Very bright spots in an image can also lead to blooming artifacts if the irradiance creates a strong current and charge overflows from one sensel to neighboring sensels.
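The bit-depth figure follows directly from the well capacity; a quick numerical check (assuming, for simplicity, a noise floor of a single electron):

```python
import math

# A noise-free sensel storing at most 25,000 electrons spans a dynamic
# range of 1:25,000, i.e. log2(25,000) bits of information.
full_well_electrons = 25_000
bits = math.log2(full_well_electrons)
print(round(bits, 2))  # -> 14.61, i.e. fewer than 15 bits suffice
```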

In contrast to CCD sensors, complementary metal oxide semiconductor (CMOS) sensors are based on the technology used for modern random access memory chips, processors, and analog circuitry. They are more versatile and flexible but often suffer from stronger image noise. In particular, modern CMOS cameras can achieve a dynamic range of more than 140 dB, which is often required for computer graphics applications (see, e.g., Kang et al. [Kang03] for an overview of camera systems with extended dynamic range). The Silicon Vision Lars III camera is such a camera with a CMOS imaging sensor.
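To relate the dB figure to a linear contrast ratio, the following sketch assumes the 20·log10 convention that is commonly used when quoting image sensor dynamic range:

```python
def db_to_contrast_ratio(db):
    """Convert a dynamic range in dB to a linear contrast ratio,
    assuming the 20*log10 convention common for image sensors."""
    return 10.0 ** (db / 20.0)

print(db_to_contrast_ratio(140.0))  # -> 10000000.0, i.e. a 1:10^7 range
```

For comparison, the 1:25,000 range of the CCD sensor discussed above corresponds to only about 88 dB under the same convention.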

Color image acquisition is difficult as a single sensor element can capture only a single value (defined by the convolution of its spectral sensitivity with the spectrum of the incident light) but not a full set of RGB tristimulus values. The most common solution is to cover individual sensels with color filters arranged in a Bayer pattern [Bayer76] (see Figure 3.3) and to capture the individual color components at different spatial locations. Spatial resolution is in this case traded for color information. Tristimulus color values (RGB) are reconstructed for each pixel using information from neighboring sensels. This reconstruction can introduce color artifacts, so that the pixel values do not necessarily correspond to the irradiance incident at the pixel location.
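The reconstruction step can be illustrated with a deliberately naive bilinear demosaicing sketch on a toy RGGB mosaic; real cameras use more elaborate edge-aware methods, and the fast reconstruction shown in Figure 3.3 is not necessarily this scheme:

```python
def bayer_channel(x, y):
    """Color filter at sensel (x, y) for an RGGB Bayer pattern."""
    if y % 2 == 0:
        return 'R' if x % 2 == 0 else 'G'
    return 'G' if x % 2 == 0 else 'B'

def demosaic_bilinear(raw):
    """raw: 2D list of sensel values; returns 2D list of (R, G, B) tuples,
    averaging each missing channel from the 3x3 neighborhood."""
    h, w = len(raw), len(raw[0])

    def avg(x, y, channel):
        vals = [raw[j][i]
                for j in range(max(0, y - 1), min(h, y + 2))
                for i in range(max(0, x - 1), min(w, x + 2))
                if bayer_channel(i, j) == channel]
        return sum(vals) / len(vals)

    return [[(avg(x, y, 'R'), avg(x, y, 'G'), avg(x, y, 'B'))
             for x in range(w)] for y in range(h)]

# A flat gray scene (every sensel reads 100) reconstructs to gray pixels;
# color artifacts only appear near edges and fine detail.
flat = [[100] * 4 for _ in range(4)]
rgb = demosaic_bilinear(flat)
print(rgb[1][1])  # -> (100.0, 100.0, 100.0)
```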

Alternatively, the image can be captured simultaneously by three sensors with corresponding color filters (3-chip cameras), or at different times using, e.g., a color filter wheel or tunable filters. Only recently, a new technology was introduced that makes use of the different penetration depths of light into the chip and stacks three sensels on top of each other [Lyon02]. It allows capturing tristimulus color values without spatial or temporal aliasing artifacts.

Lens System

The lens system projects an image of a scene onto the imaging sensor (see Figure 3.4). A lens with focal length f focuses a point at distance g onto a point at distance b according to

1/f = 1/g + 1/b

leading to a sharp image. At a point b′ farther away, the image is blurred within a circle of confusion whose diameter d is proportional to the radius of the aperture.
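The thin-lens equation can be checked numerically; the focal length and object distance below are example values, not parameters of any of the cameras discussed:

```python
def image_distance(f, g):
    """Image distance b solving the thin-lens equation 1/f = 1/g + 1/b
    (f and g in the same units)."""
    return 1.0 / (1.0 / f - 1.0 / g)

# Example: f = 50 mm lens, object at g = 5000 mm.
b = image_distance(50.0, 5000.0)
print(round(b, 2))  # -> 50.51, i.e. just over one focal length behind the lens
```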

While achieving a sharp image is desirable for an analog, film-based camera system, it can be harmful for the sampled imaging system of a digital camera:

If the light-sensitive portion of a sensor element does not cover its full area, a small feature could be focused onto a light-insensitive part and would not be captured (see Figure 3.5). There is also the potential of severe aliasing if the fill factor, the percentage of light-sensitive area of a pixel, is much smaller than 100 %. The fill factor can be increased by adding small lenses on top of the sensor elements, as shown in Figure 3.5. This is, however, not sufficient for color sensors with a Bayer pattern [Bayer76], where even a small feature should be registered by red, green, and blue color sensors in order to minimize interpolation artifacts. An additional antialiasing filter can ensure this by slightly blurring the projected image.

To achieve good image quality, it is therefore important that the lens system

matches the imaging sensor. A way to check this is to determine the modulation transfer function (MTF) [Williams99, Burns01] of an imaging system, which describes its properties in frequency space.
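One common route to the MTF is the normalized magnitude of the Fourier transform of a measured line spread function (LSF). The sketch below uses synthetic Gaussian LSFs in place of a real edge-based measurement:

```python
import math

def mtf(lsf):
    """Return |DFT(lsf)| for the non-negative frequencies,
    normalized so that MTF(0) = 1."""
    n = len(lsf)
    mags = []
    for k in range(n // 2 + 1):
        re = sum(v * math.cos(2 * math.pi * k * i / n) for i, v in enumerate(lsf))
        im = sum(-v * math.sin(2 * math.pi * k * i / n) for i, v in enumerate(lsf))
        mags.append(math.hypot(re, im))
    return [m / mags[0] for m in mags]

# A wider LSF (a blurrier system) attenuates high frequencies more strongly,
# i.e. its MTF falls off faster.
narrow = [math.exp(-((i - 16) / 1.0) ** 2) for i in range(32)]
wide = [math.exp(-((i - 16) / 3.0) ** 2) for i in range(32)]
print(mtf(narrow)[8] > mtf(wide)[8])  # -> True
```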

The lens system can furthermore lead to a variety of artifacts such as chromatic aberration (the focal length f varies for different wavelengths of light, leading to color artifacts at image edges), geometric distortions (the acquired image is distorted compared to an image generated using a pin-hole projection), or diffraction at the aperture blades (visible as star-shaped artifacts around bright features). It is therefore mandatory to employ high quality lenses and to apply correction techniques where possible in order to acquire high quality data.
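Geometric distortion correction is often done with a polynomial radial model (e.g., the Brown model); the sketch below uses hypothetical coefficients k1, k2, which a calibration step would normally have to estimate:

```python
def undistort_point(x, y, k1, k2, cx=0.0, cy=0.0):
    """Map a radially distorted normalized image point to its corrected
    position using x_u = cx + (x - cx) * (1 + k1*r^2 + k2*r^4),
    where r is the distance from the distortion center (cx, cy)."""
    dx, dy = x - cx, y - cy
    r2 = dx * dx + dy * dy
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return cx + dx * scale, cy + dy * scale

# A point at the distortion center is unaffected; off-center points move
# outward for positive k1 (correcting barrel distortion).
print(undistort_point(0.0, 0.0, k1=0.1, k2=0.01))  # -> (0.0, 0.0)
```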