
Background Theory

2.1 Hyperspectral Remote Sensing

In Hyperspectral Remote Sensing (HRS), the satellite combines imaging and spectrometry to capture information across the electromagnetic spectrum [15]. Ideally, the satellite captures an image of a geographical scene where each pixel contains information over a spectrum. The data is stored in a three-dimensional dataset (x, y, λ), often called a hypercube, where the first two dimensions represent the spatial x- and y-directions of the geographical scene, and the third dimension represents the spectral information (often called bands). This is illustrated in Fig. 2.1, which shows a hypercube with 10×10 pixels and 14 different bands, together with the reflectance spectrum of one pixel.
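The hypercube maps directly onto a three-dimensional array. As a minimal sketch (the 10×10×14 dimensions follow Fig. 2.1, but the values here are random placeholders, not real imagery), a pixel's reflectance spectrum is a one-dimensional slice along the band axis:

```python
import numpy as np

# Hypercube with 10x10 spatial pixels and 14 spectral bands (cf. Fig. 2.1).
# Random values in [0, 1) stand in for calibrated reflectance data.
rng = np.random.default_rng(0)
cube = rng.random((10, 10, 14))  # shape: (spatial x, spatial y, band)

# The reflectance spectrum of the pixel at spatial position (3, 7)
# is a 1-D vector with one value per band.
spectrum = cube[3, 7, :]
print(spectrum.shape)  # (14,)
```

Slicing along the first two axes instead (e.g. `cube[:, :, 5]`) yields the single-band image for band 5.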

Each band corresponds to a narrow wavelength range in which the satellite captures information related to the chemical composition of materials by measuring how the received power varies with the wavelength of the light. How the reflectance or emissivity of the materials within an image pixel varies with wavelength is recorded in the different bands, and this variation is often enough to characterize the observed material [15]. A plot of the reflectance of four different materials as a function of wavelength is shown in Fig. 2.2 to illustrate how each material has a unique spectrum.

Figure 2.1: Illustration of the hypercube data structure. The data is captured in the spatial x- and y-directions for a sensor-specific number of bands, each of which represents a narrow wavelength range of the electromagnetic spectrum (spectral band).

[Figure 2.2: plot of reflectance versus wavelength (0-14 µm) for basalt_h10 (fine), gneiss5 (coarse), seafoam (liquid), and marble4 (fine)]

Figure 2.2: A plot of reflectance as a function of wavelength for three rocks (basalt, gneiss, and marble) and sea foam. The data is obtained from the ECOSTRESS spectral library (formerly the ASTER spectral library). The numbers in the labels refer to the sample numbers, which can be used to find further information about the chemical composition of each material [16, 17].

A typical hyperspectral remote sensing hypercube consists of about 100-200 spectral bands with bandwidths of around 5 nm. Another widely used imaging technique in remote sensing is multispectral imaging. It differs from HRS by having fewer bands (normally 5-10) with wider bandwidths (70-400 nm) [18]. In contrast to imaging systems that often capture reflected and/or emitted electromagnetic radiation integrated over the visible band, HRS also measures wavelength regions not visible to humans. These regions can be divided into the visible (VIS), near-infrared (NIR), shortwave (SWIR), midwave (MWIR), and longwave (LWIR) infrared. The contributions measured by remote sensing systems in these regions (0.4-14 µm) can be divided into reflected sunlight and thermal emission from objects in the scene. The different wavelength regions, their abbreviations, ranges, and dominant contributions are specified in Table 2.1. Ocean color remote sensing normally utilizes the VIS region, i.e. wavelengths between 400 nm and 700 nm, and NIR light, i.e. wavelengths from 700 nm to just under 2000 nm [13].

Table 2.1: Wavelength regions, with their abbreviations, ranges, and the dominant contribution in each region.

Wavelength Region         Abbreviation   Wavelength     Dominated by

Visible                   VIS            0.4 - 0.8 µm   reflected sunlight

Near Infrared             NIR            0.7 - 1.1 µm   reflected sunlight

Visible + Near Infrared   VNIR           0.4 - 1.1 µm   reflected sunlight

Shortwave Infrared        SWIR           1.1 - 3 µm     reflected sunlight

Midwave Infrared          MWIR           3 - 5 µm       thermal emission

Longwave Infrared         LWIR           5 - 14 µm      thermal emission
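Table 2.1 amounts to a simple lookup from wavelength to region. A minimal sketch (the function name and data structure are illustrative, not from the source):

```python
# Wavelength regions from Table 2.1, as (name, lower, upper) in micrometres.
# VIS and NIR overlap between 0.7 and 0.8 um, so the first match wins;
# VNIR (0.4 - 1.1 um) simply spans VIS and NIR and is omitted here.
REGIONS = [
    ("VIS", 0.4, 0.8),
    ("NIR", 0.7, 1.1),
    ("SWIR", 1.1, 3.0),
    ("MWIR", 3.0, 5.0),
    ("LWIR", 5.0, 14.0),
]

def region_of(wavelength_um):
    """Return the first region of Table 2.1 containing the wavelength."""
    for name, low, high in REGIONS:
        if low <= wavelength_um <= high:
            return name
    return None

print(region_of(0.55))   # VIS  (reflected sunlight)
print(region_of(10.0))   # LWIR (thermal emission)
```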

The precision of the sensors is typically described by two quantities: spectral and spatial resolution. The spectral resolution depends on two factors, the sampling interval and the bandwidth.

The sampling interval is the spectral distance between the centers, or peaks, of spectral channels along a spectrum, and the bandwidth is the full width at half maximum (FWHM) of a spectral channel. This is illustrated in Fig. 2.3. The blue curves are typical of multispectral sensors, where the sampling interval is larger than the bandwidth. Hyperspectral sensors look more like the orange curves, where the bandwidth is narrow and smaller than the sampling interval. A large number of narrow bands makes it possible to do material analysis with few pixels, as each pixel contains more information, as previously illustrated in Fig. 2.2. The spatial resolution represents the size of each pixel, which can range from as small as 1 m for airborne systems to more than 1000 m for satellite systems [13]. Large pixels could result in pixels containing multiple objects, making it difficult to identify individual objects. On the other hand, if the pixels are too small, the reliability of the measured features could be reduced due to a decreased signal-to-noise ratio [15].
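Given a sampled sensitivity curve for one channel, the bandwidth can be estimated as the width of the interval where the sensitivity stays above half its maximum. A sketch assuming a Gaussian channel response (the Gaussian shape and all parameter values are illustrative):

```python
import numpy as np

# Sampled sensitivity of one spectral channel: a Gaussian centred at
# 550 nm with standard deviation 5 nm (illustrative values only).
wavelengths = np.linspace(500.0, 600.0, 2001)  # nm, 0.05 nm grid
sigma = 5.0
sensitivity = np.exp(-0.5 * ((wavelengths - 550.0) / sigma) ** 2)

# FWHM: width of the interval where sensitivity >= half its maximum.
above_half = wavelengths[sensitivity >= 0.5 * sensitivity.max()]
fwhm = above_half[-1] - above_half[0]

# For a Gaussian, the analytic FWHM is 2*sqrt(2*ln 2)*sigma ~ 11.8 nm;
# the grid-based estimate is accurate to within one grid step.
print(round(fwhm, 1))  # 11.7
```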

The goal of remote sensing is to convert the sensor measurements into useful and desired information [13]. It is of interest that the sensors provide standardized measurements which can be compared to other sensors and to reference values measured in the field for validation. Therefore, the measurements must be independent of the sensor.

[Figure 2.3: sensitivity versus wavelength, annotated with the bandwidth and sampling interval]

Figure 2.3: Illustration of sensitivity, sampling interval, and bandwidth. The blue curves represent the bands and spacing typically used in multispectral imaging, whereas the orange bands are more typical of hyperspectral imaging.

2.1.1 Radiometry

Radiometry is the science of measuring electromagnetic energy. However, which quantity of electromagnetic energy to measure is an important decision in remote sensing. The detectors of optical instruments in the satellite often register the light energy they receive during an observation period. In some cases, the energy received depends not only on the properties of the observed scene, but also on the instrument itself. Therefore, it is important to measure physical quantities that are not dependent on the sensor, so that comparisons of measurements between sensors are possible [19]. The total spectral energy measured by the sensor, Q, during a time, t, is provided by photons of different wavelengths and is expressed in joules. The energy received by the sensor during the integration time is given as [19]:

Q = ∫_{λ1}^{λ2} S(λ) Q(λ) dλ    (2.1)

where S(λ) is the unit-less instrument spectral sensitivity between λ1 and λ2, i.e. the relative efficiency of detecting light as a function of the wavelength of the light. To remove the time dependency, the quantity spectral flux is introduced; it is expressed in watts per unit wavelength as:

φ(λ) = dQ(λ)/dt    (2.2)

To remove the dependency on the detector surface, the irradiance is introduced. This quantity is the spectral flux reaching the detector per unit area and is given as:

E(λ) = dφ(λ)/dA    (2.3)
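Eq. (2.1) can be evaluated numerically once the sensitivity and the incoming spectral energy are sampled on a wavelength grid. A sketch with an idealized flat sensitivity, using a simple trapezoidal rule (all numerical values are illustrative):

```python
import numpy as np

# Wavelength grid between lambda_1 = 0.4 um and lambda_2 = 0.8 um.
wavelengths = np.linspace(0.4, 0.8, 401)  # um

# Idealized instrument: unit-less sensitivity S(lambda) = 0.5 everywhere,
# and a flat incoming spectral energy Q(lambda) = 2.0 J/um (illustrative).
S = np.full_like(wavelengths, 0.5)
Q_spectral = np.full_like(wavelengths, 2.0)

# Eq. (2.1): Q = integral of S(lambda) * Q(lambda) d(lambda),
# approximated with the trapezoidal rule.
integrand = S * Q_spectral
Q_total = np.sum((integrand[:-1] + integrand[1:]) / 2.0 * np.diff(wavelengths))

# For constant integrand 1.0 over [0.4, 0.8] um: Q = 0.4 J.
print(round(Q_total, 6))  # 0.4
```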