What is an imaging system in remote sensing?

Remote sensing is the process of detecting and monitoring the physical characteristics of an area by measuring its reflected and emitted radiation at a distance [typically from a satellite or aircraft]. Special cameras collect remotely sensed images, which help researchers "sense" things about the Earth. Some examples are:

  • Cameras on satellites and airplanes take images of large areas on the Earth's surface, allowing us to see much more than we can see when standing on the ground.
  • Sonar systems on ships can be used to create images of the ocean floor without needing to travel to the bottom of the ocean.
  • Cameras on satellites can be used to make images of temperature changes in the oceans.

Some specific uses of remotely sensed images of the Earth include:

  • Large forest fires can be mapped from space, allowing rangers to see a much larger area than from the ground.
  • Tracking clouds to help predict the weather, watching erupting volcanoes, and monitoring dust storms.
  • Tracking the growth of a city and changes in farmland or forests over several years or decades.
  • Discovery and mapping of the rugged topography of the ocean floor [e.g., huge mountain ranges, deep canyons, and the “magnetic striping” on the ocean floor].

In 2015, one of the most remarkable events in the space industry was when SpaceX demonstrated the reusability of its rocket for the first time. Additionally, in June 2014, Russia used one rocket to launch 37 satellites at the same time. At present, many countries have the capability to launch multiple satellites in one mission. For example, NASA and the US Air Force launched 29 satellites in a single mission in 2013; at the time, that mission represented the most satellites ever launched at once []. In 2015 and 2016, China and India, respectively, each launched 20 satellites in a single mission. At present, six organizations have the capability to launch multiple satellites in a single mission: Russia, the USA, China, India, Japan, and ESA. This trend indicates that, in the future, the cost of sending satellites to space will decrease greatly, and more and more remote sensing resources are becoming available. It is therefore of great importance to have a comprehensive survey of the available remote sensing technology and to utilize inter- or trans-disciplinary knowledge and technology to create new applications.

Remote sensing is considered a primary means of acquiring spatial data. Remote sensing measures electromagnetic radiation that interacts with the atmosphere and objects. Interactions of electromagnetic radiation with the surface of the Earth can provide information not only on the distance between the sensor and the object but also on the direction, intensity, wavelength, and polarization of the electromagnetic radiation []. These measurements can offer positional information about the objects and clues as to the characteristics of the surface materials.

Satellite remote sensing consists of one or multiple remote sensing instruments located on a satellite or satellite constellation collecting information about an object or phenomenon on the Earth's surface without being in direct physical contact with it. Compared to airborne and terrestrial platforms, spaceborne platforms are the most stable carriers. Satellites can be classified by their orbital geometry and timing. Three types of orbits are typically used for remote sensing satellites: geostationary, equatorial, and sun-synchronous orbits. A geostationary satellite has a period of rotation equal to that of the Earth [24 hours], so the satellite always stays over the same location on Earth. Communications and weather satellites often use geostationary orbits, with many of them located over the equator. In an equatorial orbit, a satellite circles the Earth at a low inclination [the angle between the orbital plane and the equatorial plane]. The Space Shuttle used an equatorial orbit with an inclination of 57°. Sun-synchronous satellites have orbits with high inclination angles, passing nearly over the poles. The orbits are timed so that the satellite always passes over the equator at the same local sun time; in this way, these satellites maintain the same relative position to the sun for all of their orbits. Many remote sensing satellites are sun-synchronous, which ensures repeatable sun illumination conditions during specific seasons. Because a sun-synchronous orbit does not pass directly over the poles, it is not always possible to acquire data for the extreme polar regions. The frequency at which a satellite sensor can acquire data of the entire Earth depends on the sensor and orbital characteristics []. For most remote sensing satellites, the total coverage frequency ranges from twice a day to once every 16 days. Another orbital characteristic is altitude. The Space Shuttle had a low orbital altitude of 300 km, whereas other common remote sensing satellites typically maintain higher orbits ranging from 600 to 1000 km.
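The link between orbital period and altitude can be made concrete with Kepler's third law. The following is a minimal sketch [Python, using standard values for the Earth's gravitational parameter and radius; the function name is our own] that recovers the familiar ~35,786 km geostationary altitude from one sidereal rotation period.

```python
# Minimal sketch: estimating the geostationary orbital altitude from the
# orbital period using Kepler's third law. Constants are standard values;
# names are illustrative, not from any particular library.
import math

MU_EARTH = 3.986004418e14   # Earth's gravitational parameter [m^3/s^2]
R_EARTH = 6.378137e6        # Earth's equatorial radius [m]
SIDEREAL_DAY = 86164.1      # one rotation of the Earth [s]

def orbital_radius(period_s: float) -> float:
    """Radius of a circular orbit with the given period (Kepler's third law)."""
    return (MU_EARTH * (period_s / (2.0 * math.pi)) ** 2) ** (1.0 / 3.0)

altitude_km = (orbital_radius(SIDEREAL_DAY) - R_EARTH) / 1000.0
print(f"Geostationary altitude: {altitude_km:.0f} km")  # ~35786 km
```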

The interaction between a sensor and the surface of the Earth has two modes: active or passive. Passive sensors utilize solar radiation to illuminate the Earth’s surface and detect the reflection from the surface. They typically record electromagnetic waves in the range of visible [~430–720 nm] and near-infrared [NIR] [~750–950 nm] light. Some systems, such as SPOT 5, are also designed to acquire images at middle-infrared [MIR] wavelengths [1580–1750 nm]. The power measured by passive sensors is a function of the surface composition, physical temperature, surface roughness, and other physical characteristics of the Earth []. Examples of passive satellite sensors are those aboard the Landsat, SPOT, Pléiades, EROS, GeoEye, and WorldView satellites. Active sensors provide their own source of energy to illuminate the objects and measure the returned signal. These sensors use electromagnetic waves in the range of visible light and near-infrared [e.g., a laser rangefinder or a laser altimeter] and radar waves [e.g., synthetic aperture radar [SAR]]. A laser rangefinder uses a laser beam to determine the distance between the sensor and the object and is typically used in airborne and ground-based laser scanning. A laser altimeter uses a laser beam to determine the altitude of an object above a fixed level and is typically utilized on satellite and aerial platforms. SAR uses microwaves to illuminate a ground target with a side-looking geometry and measures the backscatter and travel time of the transmitted waves reflected by objects on the ground. The distance that the SAR device travels over a target during the time taken for the radar pulses to return to the antenna is what synthesizes the large effective aperture that produces the SAR image. SAR can be mounted on a moving platform, such as a spaceborne or airborne platform. According to the combination of frequency bands and polarization modes used in data acquisition, sensors can be categorized as single frequency [L-band, C-band, or X-band], multiple frequency [a combination of two or more frequency bands], single polarization [VV, HH, or HV], and multiple polarization [a combination of two or more polarization modes]. Currently, there are three commercial SAR missions in space: Germany’s TerraSAR-X and TanDEM-X [X-band with a ~3.5 cm wavelength], Italy’s COSMO-SkyMed [X-band with a ~3.5 cm wavelength], and Canada’s RADARSAT-2 [C-band with a ~6 cm wavelength]. In addition, ESA’s ERS-1, ERS-2, and Envisat also carried SAR, although these missions have ended. The latest SAR satellites from ESA include Sentinel-1A, Sentinel-1B, and Sentinel-3A. Typical SAR parameters are repeat frequency, pulse repetition frequency, bandwidth, polarization, incidence angle, imaging mode, and orbit direction [].
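As a rough illustration of the ranging principle shared by laser rangefinders, laser altimeters, and radar, a minimal sketch follows; the round-trip time used is an invented example value, not taken from any specific sensor.

```python
# Minimal sketch of time-of-flight ranging: range = c * round-trip time / 2.
C = 299_792_458.0  # speed of light in vacuum [m/s]

def range_from_round_trip(t_round_trip_s: float) -> float:
    """Distance to the target from the two-way travel time of a pulse."""
    return C * t_round_trip_s / 2.0

# Example: a pulse returning after 4.0 microseconds corresponds to ~600 m.
print(f"{range_from_round_trip(4.0e-6):.1f} m")
```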

As sensor technology has advanced, the integration of passive and active sensors into one system has emerged. This trend makes it difficult to categorize sensors in the traditional way as either passive or active. In this paper, we instead introduce sensors in terms of imaging or non-imaging functionality. Imaging sensors typically employ optical imaging systems, thermal imaging systems, or SAR. Optical imaging systems use the visible, near-infrared, and shortwave infrared spectra and typically produce panchromatic, multispectral, and hyperspectral imagery. Thermal imaging systems employ mid- to longwave infrared wavelengths. Non-imaging sensors include microwave radiometers, microwave altimeters, magnetic sensors, gravimeters, Fourier spectrometers, laser rangefinders, and laser altimeters [].

It has been decades since Landsat-1, the first Earth resources technology satellite, was launched in 1972. Satellite platforms have evolved from single satellites to multi-satellite constellations. Sensors have experienced unprecedented development over the years, from Landsat-1 in 1972, the first multispectral satellite with four spectral bands, to Lewis in 1997, the first hyperspectral satellite with 384 spectral bands. Spatial resolution has also improved significantly over the decades, from 80 m in Landsat-1 to 31 cm in WorldView-3. A number of studies on satellite imagery processing methods and applications have been conducted, but only a few papers providing sensor overviews have been published, including [, , ]. Blais [] reviewed the range sensors developed over the preceding two decades, including single-point and laser scanners, slit scanners, pattern projection systems, and time-of-flight systems, as well as related commercial systems. Melesse et al. [] provided a survey of remote sensing sensors for typical environmental and natural resources mapping purposes, such as urban studies, hydrological modeling, land-cover and floodplain mapping, fractional vegetation cover and impervious surface area mapping, surface energy flux and micro-topography correlation, remotely sensed rainfall, and potential evapotranspiration for estimating crop water requirement satisfaction indexes. Recently, a survey on remote sensing platforms and sensors was provided by Toth and Jóźków []. The authors gave a general review of current remote sensing platforms, including satellites, airborne platforms, UAVs, ground-based mobile and static platforms, sensor georeferencing, and supporting navigation infrastructure, and provided a short summary of imaging sensors.

In the literature, we found that overviews of remote sensing sensors are quite rare. One reason is that the topic is fairly broad: usually, one can find either detailed knowledge in thick books or a very simple overview on webpages. As most readers need to obtain relevant knowledge within a reasonable amount of time and at a modest depth, the contribution of our paper is valuable. In this paper, we review the history of remote sensing, the interaction of the electromagnetic spectrum [EMS] with objects, imaging sensors and non-imaging sensors [e.g., laser rangefinders/altimeters], and commonly used satellites and their characteristics. In addition, future trends and potential applications are addressed. Although this paper is mainly about satellite sensors, there is no clear boundary between satellite sensors and airborne, UAV-based, or ground-based sensors except that satellite sensors have more interaction with the atmosphere. Therefore, we use the term “remote sensing sensors” generally.


2. Remarkable development in spaceborne remote sensing

Although the term ‘remote sensing’ was introduced in 1960, the practice of remote sensing has a much longer history. In the 1600s, Galileo used optical enhancements to survey celestial bodies []. An early exploration of prisms was conducted by Sir Isaac Newton in 1666. Newton discovered that a prism dispersed light into a spectrum of red, orange, yellow, green, blue, indigo, and violet and that the spectrum could be recombined into white light. More than a century later, in 1800, Sir William Herschel explored thermal infrared electromagnetic radiation for the first time, measuring the temperature of light that had been split with a prism into the spectrum of visible colors. In the following decades, some attempts at aerial photography were made using cameras attached to balloons. However, the results were not satisfactory until 1858, when Gaspard-Félix Tournachon successfully took the first aerial photograph from a captive balloon at an altitude of 1200 feet over Paris. Later, in 1889 in Labruguière, France, Arthur Batut attached a camera and an altimeter to kites for the first time so that the image scale could be determined; he is therefore considered the father of kite aerial photography. At the beginning of the twentieth century, cameras became small enough [e.g., 70 g] to be easily carried by pigeons, and the Bavarian Pigeon Corps took the first aerial photos using a camera attached to a pigeon in 1903. During the First World War, the use of aerial photography grew. Later, in 1936, Albert W. Stevens took the first photograph of the actual curvature of the earth from a free balloon at an altitude of 72,000 feet. The first space photograph, from V-2 rockets, was acquired in 1946. Table 1 addresses the evolution of remote sensing, excluding this early development stage. The table starts with the use of aerial photographs for surveying and mapping as well as for military use. The milestones in this evolution [see Table 1] were referenced to [, ]. Additionally, recent developments in microsatellites and satellite constellations are also listed in Table 1.

Phases [time period] and remarks:

  • Airborne remote sensing [during the First and Second World Wars]: The use of photographs for surveying, mapping, reconnaissance, and military surveillance.
  • Rudimentary spaceborne satellite remote sensing [late 1950s]: The launch of Sputnik 1 by Russia in 1957 and Explorer 1 by the US in 1958.
  • Spy satellite remote sensing [during the Cold War, 1947–1991]: Remote sensing for military use spilled over into mapping and environmental applications.
  • Meteorological satellite sensor remote sensing [1960 onwards]: The launch of the first meteorological satellite [TIROS-1] by the US in 1960; since then, data in digital formats and the use of computer hardware and software.
  • Landsat [1972 onwards]: Landsat 1, 2, and 3 carried a multispectral scanner; Landsat 4 and 5 carried a Thematic Mapper sensor; Landsat 7 carries an Enhanced Thematic Mapper; Landsat 8 carries the Operational Land Imager. Landsat satellites have high resolution and global coverage. Applications were initially local and have since become global.
  • European Space Agency’s first Earth observing satellite program [1991 onwards]: ESA launched its first satellite, ERS-1, in 1991, carrying a variety of earth observation instruments: a radar altimeter, ATSR-1, SAR, a wind scatterometer, and a microwave radiometer. A successor, ERS-2, was launched in 1995.
  • Earth observing system [EOS] [since the launch of the Terra satellite in 1999]: Terra/Aqua satellites carrying sensors such as MODIS and taking measurements of pollution in the troposphere [MOPITT]. Global coverage, frequent repeat coverage, a high level of processing, and easy, mostly free access to data.
  • New millennium [around the same time as EOS]: The next generation of satellites and sensors, such as Earth Observing-1, which acquired the first spaceborne hyperspectral data.
  • Private industry/commercial satellite systems [2000 onwards]:
    1. Very high-resolution data, such as from the IKONOS and QuickBird satellites.
    2. A revolutionary means of data acquisition: daily coverage of any spot on earth at a high resolution, such as RapidEye.
    3. Google streaming technology allows rapid data access to very high-resolution images.
    4. The launch of GeoEye-1 in 2008 for very high-resolution imagery [0.41 m].
  • Microsatellite era and satellite constellations [2008 onwards]:
    1. Small satellites and satellite constellations [RapidEye and Terra Bella, formerly Skybox]: RapidEye was launched in August 2008 with five Earth observation satellites, the first commercial satellites to include the red-edge band, which is sensitive to changes in chlorophyll content. On March 8, 2016, Skybox Imaging was renamed Terra Bella; its satellites provided the first-ever commercial high-resolution video of Earth from a satellite as well as high-resolution color and near-infrared imagery.
    2. For the first time, Russia carried out a single mission to launch 37 satellites, in June 2014.
    3. ESA launched the first satellite of the Sentinel constellation in April 2014.
    4. SpaceX has demonstrated reusable rocket capability since December 2015.
    5. Current satellites offer short revisit periods, large coverage, and high spatial resolution, up to 31 cm.

Table 1.

Evolution and advancement in remote sensing satellites and sensors.


3. Characteristics of materials in the electromagnetic spectrum [EMS]

Remote sensors interact with objects on the surface of the Earth from a distance. These objects generally include terrain, buildings, roads, vegetation, and water. The typical materials of these objects are categorized into two groups according to how they interact with the EMS: transparent and opaque [partly or fully absorbing].

3.1. Electromagnetic spectrum

Figure 1 shows the EMS, ranging from gamma rays to radio waves. In remote sensing, typical applications use the visible light [380–780 nm], infrared [780 nm–0.1 mm], and microwave [0.1 mm–1 m] ranges. This paper treats the terahertz range [0.1–1 mm] as an independent spectral band separate from microwaves. Remote sensing sensors interact with objects remotely, and the atmosphere lies between the sensors and the Earth's surface. It is estimated that only 67% of sunlight directly heats the Earth []; the remainder is absorbed and reflected by the atmosphere, which strongly absorbs infrared and UV radiation. In visible light, typical remote sensing applications use the blue [450–495 nm], green [495–570 nm], and red [620–750 nm] spectral bands for panchromatic, multispectral, or hyperspectral imaging. Current bathymetric and ice LIDAR generally uses green light [e.g., NASA's HSRL-1 LIDAR, operating at 532 nm]. However, new experiments have shown that in the blue spectrum, such as at 440 nm, the absorption coefficient of water is approximately an order of magnitude smaller than at 532 nm; 420–460 nm light can therefore penetrate relatively clear water and ice much deeper, offering substantial improvements in sensing through water for the same optical power output and thus reducing power requirements []. The red spectrum, together with the near-infrared [NIR], is typically used for vegetation applications; for example, the Normalized Difference Vegetation Index [NDVI] is used to evaluate whether a target contains live green vegetation [see the sketch below].

Infrared is invisible radiant energy. It is usually divided into different regions: near IR [NIR, 0.75–1.4 μm], shortwave IR [SWIR, 1.4–3 μm], mid-IR [MIR, 3–8 μm], longwave IR [LWIR, 8–15 μm], and far IR [FIR, 15–1000 μm]. Alternatively, the ISO 20473 scheme divides it into NIR [0.78–3 μm], MIR [3–50 μm], and FIR [50–1000 μm]. Most of the infrared radiation in sunlight is in the NIR range, whereas most of the thermal radiation emitted by objects near room temperature is infrared []. In nature, on the surface of the Earth, almost all thermal radiation consists of infrared in the mid-infrared region, at much longer wavelengths than those in sunlight. Of natural thermal radiation processes, only lightning and natural fires are hot enough to produce much visible energy, and fires produce far more infrared than visible light energy. NIR is mainly used in medical imaging and physiological diagnostics. One typical application of MIR and FIR is thermal imaging, for example, in night vision devices. In the MIR and FIR spectral bands, water shows high absorption, and biological systems are highly transmissive.
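A minimal sketch of the NDVI computation mentioned above follows; it assumes two co-registered reflectance bands [red and NIR], and the array values are invented toy numbers rather than data from any particular sensor.

```python
# Minimal NDVI sketch: NDVI = (NIR - Red) / (NIR + Red).
# Values near +1 suggest dense green vegetation; values near 0 suggest bare soil.
import numpy as np

def ndvi(red: np.ndarray, nir: np.ndarray) -> np.ndarray:
    """Per-pixel NDVI from red and near-infrared reflectance arrays."""
    red = red.astype(np.float64)
    nir = nir.astype(np.float64)
    denom = nir + red
    # Guard against division by zero over dark pixels.
    return np.where(denom != 0, (nir - red) / denom, 0.0)

# Toy example: vegetation reflects strongly in NIR and weakly in red.
red = np.array([0.05, 0.30])   # [vegetated pixel, bare-soil pixel]
nir = np.array([0.50, 0.35])
print(ndvi(red, nir))          # ~[0.82, 0.08]
```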

Figure 1.

The electromagnetic spectrum. Image from UC Davis ChemWiki, CC-BY-NC-SA 3.0.

With regard to the terahertz spectral band, terahertz frequencies are useful for investigating biological molecules. Unlike more commonly used forms of radiated energy, this range has rarely been studied, partly because no one knew how to make these frequencies bright enough [] and partly because practical applications have been impeded by the fact that ambient moisture interferes with wave transmission []. Nevertheless, terahertz light [also called T-rays] has remarkable properties. T-rays are safe, non-ionizing electromagnetic radiation. This light poses little or no health threat and can pass through clothing, paper, cardboard, wood, masonry, plastic, and ceramics; it can also penetrate fog and clouds. THz radiation transmits through almost anything except metal and liquids [e.g., water]. T-rays can be used to reveal explosives or other dangerous substances in packaging, corrugated cardboard, clothing, shoes, backpacks, and book bags. However, the technique cannot detect materials that might be concealed in body cavities [].

The terahertz region is technically the boundary between electronics and photonics []. The wavelengths of T-rays [shorter than those of microwaves, longer than those of infrared] correspond to biomolecular vibrations. This light can provide imaging and sensing capabilities not available through conventional technologies, such as microwaves []. For example, T-rays can penetrate fabrics, and many common materials and living tissues are semi-transparent and have ‘terahertz fingerprints’, permitting them to be imaged, identified, and analyzed []. In addition, terahertz radiation has the unique ability to non-destructively image physical structures and perform spectroscopic analysis without any contact with valuable and delicate paintings, manuscripts, and artifacts. Terahertz radiation can also be utilized to measure objects that are opaque in the visible and near-infrared regions. Terahertz pulsed imaging techniques operate in much the same way as ultrasound and radar to accurately locate embedded or distant objects []. Current commercial terahertz instruments include terahertz 3D medical imaging, security scanning systems, and terahertz spectroscopy. A recent breakthrough [September 2016] in terahertz applications was a terahertz camera, invented at MIT, that can read a closed book by distinguishing ink from blank regions on paper. The article indicates that ‘In its current form the terahertz camera can accurately calculate distance to a depth of about 20 pages’ []. It is expected that in the future, this technology can be used to explore and catalog historical documents without having to touch or open them and risk damage.

Regarding microwaves, the shorter wavelengths are typically used in remote sensing; for example, radar uses wavelengths of just a few inches. Microwaves are typically used for obtaining information on the atmosphere, land, and ocean: Doppler radar is used in weather forecasting, and unique information on sea wind and wave direction can be derived from frequency characteristics [including the Doppler effect, polarization, and backscattering] that cannot be observed by visible and infrared sensors []. In addition, microwave energy can penetrate haze, light rain and snow, clouds, and smoke []. Microwave sensors therefore work in any weather condition and at any time.
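As a rough illustration of how Doppler radar extracts motion information, the sketch below uses the standard two-way Doppler-shift relation; the rain speed and C-band wavelength are illustrative values, not taken from a specific instrument.

```python
# Minimal sketch of the Doppler shift exploited by weather and wind radars:
# a target moving with radial velocity v shifts the return by f_d = 2*v/lambda.
def doppler_shift_hz(radial_velocity_m_s: float, wavelength_m: float) -> float:
    """Two-way Doppler shift for a monostatic radar."""
    return 2.0 * radial_velocity_m_s / wavelength_m

# Example: rain moving at 10 m/s toward a C-band radar [~5.6 cm wavelength].
print(f"{doppler_shift_hz(10.0, 0.056):.0f} Hz")  # ~357 Hz
```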

3.2. Objects and spectrum

When light encounters an object, the two can interact in several ways: transmission, reflection, and absorption. The interaction depends on the wavelength of the light and the nature of the object's material.

Most materials exhibit all three behaviors when interacting with light: partial transmission, partial reflection, and partial absorption. According to the dominant optical property, we categorize objects into two typical types: transparent materials and opaque materials.

A transparent material allows light to pass through without being scattered or absorbed. Typical transparent objects include plate glass and clean water. Figure 2 shows the transmission spectrum of soda-lime glass with a 2-mm thickness. Soda-lime glass is typically used in windows [also called flat glass] and glass containers. From Figure 2, it can be seen that soda-lime glass nearly blocks UV radiation but has high transmittance at visible light and NIR wavelengths. It is thus easy to understand that when a laser scanner with a wavelength of 905, 1064, or 1550 nm hits a flat glass window or a glassy balcony, over 80% of the laser energy passes through the glass and hits the objects behind the window. Another typical example of a transmissive material is clear water. Water transmittance is very high in the blue-green part of the spectrum but diminishes rapidly at near-infrared wavelengths [see Figure 3]. Absorption, on the other hand, is notably low in the shorter visible wavelengths [less than 418 nm] but increases abruptly in the range of 418–742 nm. A laser beam with a wavelength of 532 nm [green laser] is typically applied in bathymetric measurements because this wavelength has a high water transmittance. According to the Beer-Lambert law, the relation between absorbance and transmittance is as follows: Absorbance = −log [Transmittance].
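A minimal sketch of these two relations follows: the absorbance-transmittance conversion quoted above and the exponential decay of light with depth in a medium. The water absorption coefficients are rough illustrative values chosen only to contrast green and near-infrared light, not measurements from a specific source.

```python
# Minimal sketch of the Beer-Lambert relations discussed above.
import math

def absorbance(transmittance: float) -> float:
    """A = -log10(T), for 0 < T <= 1."""
    return -math.log10(transmittance)

def transmitted_fraction(absorption_coeff_per_m: float, depth_m: float) -> float:
    """Fraction of light remaining after travelling depth_m through the medium."""
    return math.exp(-absorption_coeff_per_m * depth_m)

print(absorbance(0.5))                   # ~0.30
# Illustrative comparison: green light penetrates clear water far better than NIR
# (rough, assumed absorption coefficients of 0.05/m and 2.5/m, respectively).
print(transmitted_fraction(0.05, 10.0))  # green (~532 nm): ~0.61 left after 10 m
print(transmitted_fraction(2.5, 10.0))   # NIR: essentially nothing left after 10 m
```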

Figure 2.

Transmission spectrum of soda-lime glass with a 2-mm thickness. Obtained from Wikipedia [22].

Figure 3.

Liquid water absorption spectrum. Obtained from Wikipedia [23].

Opacity arises from the reflection and absorption of light waves at the surface of an object. The reflectance depends on the material of the surface that the light encounters. There are two types of reflection: specular and diffuse. In specular reflection, light from a single incoming direction is reflected into a single outgoing direction. In diffuse reflection, an incident ray is reflected at many angles rather than at just one angle. Most objects have mixed reflective properties []. Representative reflective materials include metals, such as aluminum, gold, and silver. From Figure 4, it can be seen that aluminum has high reflectivity across a wide range of wavelengths. At visible light and NIR wavelengths, the reflectance of aluminum reaches up to 92%, and this value increases to 98% in the MIR and FIR. Silver has a higher reflectance than aluminum when the wavelength is longer than 450 nm, but at a wavelength of 310 nm, the reflectance of silver drops to zero []. The reflectance of gold increases significantly at wavelengths of approximately 500 nm, reaching very high reflectance in the infrared. This figure indicates that regardless of the wavelength at which a sensor operates, high reflection from aluminum surfaces is inevitable.
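To make the distinction concrete, the following minimal sketch contrasts idealized models of the two reflection types: a perfect mirror for specular reflection and a Lambertian surface [Lambert's cosine law] for diffuse reflection. The angles and intensity are illustrative values only.

```python
# Minimal sketch contrasting idealized specular and diffuse (Lambertian) reflection.
import math

def specular_reflection_angle(incidence_deg: float) -> float:
    """For a perfect mirror, the outgoing angle equals the incidence angle."""
    return incidence_deg

def lambertian_radiant_intensity(i0: float, viewing_angle_deg: float) -> float:
    """Ideal diffuse surface: intensity falls off with the cosine of the viewing angle."""
    return i0 * math.cos(math.radians(viewing_angle_deg))

print(specular_reflection_angle(30.0))          # 30.0 degrees out for 30 degrees in
print(lambertian_radiant_intensity(1.0, 60.0))  # 0.5 of the normal-direction intensity
```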

Figure 4.

Reflective spectrum of metals: aluminum, gold, and silver.

The physical characteristics of a material determine which electromagnetic waves will and will not pass through it. Figure 5 shows examples of the reflectance spectra of dry bare soil, green vegetation, and clear water. The reflectance of dry bare soil increases as the wavelength increases from 400 to 1800 nm. Green vegetation has high reflectance in the near-infrared region and low reflectance in the red region, and these characteristics have been applied to distinguish green vegetation from other objects. In addition, the previous figure shows that water has a low absorbance in the visible light region, and Figure 5 shows that water reflects visible light at a low rate.
