What is an imaging system in remote sensing?

1. Introduction

Remote sensing is the process of detecting and monitoring the physical characteristics of an area by measuring its reflected and emitted radiation at a distance (typically from a satellite or aircraft). Special cameras collect remotely sensed images, which help researchers "sense" things about the Earth. Some examples are:

  • Cameras on satellites and airplanes take images of large areas on the Earth's surface, allowing us to see much more than we can see when standing on the ground.
  • Sonar systems on ships can be used to create images of the ocean floor without needing to travel to the bottom of the ocean.
  • Cameras on satellites can be used to make images of temperature changes in the oceans.

Some specific uses of remotely sensed images of the Earth include:

  • Large forest fires can be mapped from space, allowing rangers to see a much larger area than from the ground.
  • Tracking clouds to help predict the weather, watching erupting volcanoes, and monitoring dust storms.
  • Tracking the growth of a city and changes in farmland or forests over several years or decades.
  • Discovery and mapping of the rugged topography of the ocean floor (e.g., huge mountain ranges, deep canyons, and the “magnetic striping” on the ocean floor).

In 2015, one of the most remarkable events in the space industry was SpaceX's first successful landing and recovery of an orbital rocket booster, demonstrating reusability for the first time. Earlier, in June 2014, Russia used a single rocket to launch 37 satellites at the same time. At present, many countries have the capability to launch multiple satellites in one mission. For example, NASA and the US Air Force launched 29 satellites in a single mission in 2013; at that time, this represented the most satellites ever launched at once []. In 2015 and 2016, China and India launched 20 satellites in a single mission, respectively. Currently, six organizations have the capability to launch multiple satellites in a single mission: Russia, the USA, China, India, Japan, and ESA. This trend indicates that the cost of sending satellites to space will decrease considerably in the future, and more and more remote sensing resources are becoming available. It is therefore of great importance to have a comprehensive survey of the available remote sensing technology and to utilize inter- or trans-disciplinary knowledge and technology to create new applications.

Remote sensing is considered a primary means of acquiring spatial data. It measures electromagnetic radiation that interacts with the atmosphere and with objects. Interactions of electromagnetic radiation with the surface of the Earth can provide information not only on the distance between the sensor and the object but also on the direction, intensity, wavelength, and polarization of the electromagnetic radiation []. These measurements can offer positional information about the objects and clues as to the characteristics of the surface materials.

Satellite remote sensing consists of one or more remote sensing instruments located on a satellite or satellite constellation collecting information about an object or phenomenon on the Earth's surface without being in direct physical contact with it. Compared to airborne and terrestrial platforms, spaceborne platforms are the most stable carriers. Satellites can be classified by their orbital geometry and timing. Three types of orbits are typically used for remote sensing satellites: geostationary, equatorial, and sun-synchronous. A geostationary satellite has a period of rotation equal to that of the Earth (24 hours), so the satellite always stays over the same location on Earth. Communications and weather satellites often use geostationary orbits, with many of them located over the equator. In an equatorial orbit, a satellite circles the Earth at a low inclination (the angle between the orbital plane and the equatorial plane); the Space Shuttle, for example, used orbits with inclinations of up to 57°. Sun-synchronous satellites have orbits with high inclination angles, passing nearly over the poles. These orbits are timed so that the satellite always passes over the equator at the same local sun time, maintaining the same relative position to the sun on every orbit. Many remote sensing satellites are sun-synchronous, which ensures repeatable sun illumination conditions during specific seasons. Because a sun-synchronous orbit does not pass directly over the poles, it is not always possible to acquire data for the extreme polar regions. The frequency at which a satellite sensor can acquire data of the entire Earth depends on the sensor and orbital characteristics []. For most remote sensing satellites, the total coverage frequency ranges from twice a day to once every 16 days. Another orbital characteristic is altitude. The Space Shuttle had a low orbital altitude of 300 km, whereas most remote sensing satellites maintain higher orbits ranging from 600 to 1000 km.
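To make the relationship between altitude and orbital timing concrete, the short Python sketch below (our illustration, not from any cited source) applies Kepler's third law for a circular orbit, T = 2π√(a³/μ), showing why a geostationary satellite must sit near 35,786 km while typical imaging satellites at 600–1000 km circle the Earth in roughly 97–105 minutes:

```python
import math

# Minimal sketch: circular-orbit period from altitude via Kepler's third law.
MU_EARTH = 3.986004418e14  # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6371e3           # mean Earth radius, m

def orbital_period_minutes(altitude_km: float) -> float:
    """Period of a circular orbit at the given altitude, in minutes."""
    a = R_EARTH + altitude_km * 1e3  # orbital radius (semi-major axis), m
    return 2 * math.pi * math.sqrt(a ** 3 / MU_EARTH) / 60

for alt in (300, 600, 1000, 35786):
    print(f"{alt:>6} km altitude -> {orbital_period_minutes(alt):7.1f} min")
# 300 km -> ~90 min (Space Shuttle), 600-1000 km -> ~97-105 min,
# 35786 km -> ~1436 min (about 24 h, i.e., geostationary)
```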

The interaction between a sensor and the surface of the Earth has two modes: active or passive. Passive sensors utilize solar radiation to illuminate the Earth's surface and detect the reflection from the surface. They typically record electromagnetic waves in the visible (~430–720 nm) and near-infrared (NIR) (~750–950 nm) ranges. Some systems, such as SPOT 5, are also designed to acquire images in middle-infrared (MIR) wavelengths (1580–1750 nm). The power measured by passive sensors is a function of the surface composition, physical temperature, surface roughness, and other physical characteristics of the Earth []. Examples of passive satellite sensors are those aboard the Landsat, SPOT, Pléiades, EROS, GeoEye, and WorldView satellites. Active sensors provide their own source of energy to illuminate the objects and measure the returned signal. These sensors use electromagnetic waves in the visible and near-infrared range (e.g., a laser rangefinder or a laser altimeter) and radar waves (e.g., synthetic aperture radar (SAR)). A laser rangefinder uses a laser beam to determine the distance between the sensor and the object and is typically used in airborne and ground-based laser scanning. A laser altimeter uses a laser beam to determine the altitude of an object above a fixed level and is typically utilized on satellite and aerial platforms. SAR uses microwaves to illuminate a ground target with a side-looking geometry and measures the backscatter and travel time of the transmitted waves reflected by objects on the ground. The distance the SAR device travels over a target in the time taken for the radar pulses to return to the antenna synthesizes the large aperture that produces the SAR image. SAR can be mounted on a moving platform, such as a spaceborne or airborne platform. According to the combination of frequency bands and polarization modes used in data acquisition, sensors can be categorized as single frequency (L-band, C-band, or X-band), multiple frequency (a combination of two or more frequency bands), single polarization (VV, HH, or HV), and multiple polarization (a combination of two or more polarization modes). Currently, there are three commercial SAR missions in space: Germany's TerraSAR-X and TanDEM-X (X-band, ~3.5 cm wavelength), Italy's COSMO-SkyMed (X-band, ~3.5 cm wavelength), and Canada's RADARSAT-2 (C-band, ~6 cm wavelength). In addition, ESA's ERS-1, ERS-2, and Envisat also carried SAR, although these missions have ended. The latest SAR satellites from ESA include Sentinel-1A, Sentinel-1B, and Sentinel-3A. Typical SAR parameters are repeat frequency, pulse repetition frequency, bandwidth, polarization, incidence angle, imaging mode, and orbit direction [].

As sensor technology has advanced, the integration of passive and active sensors into one system has emerged. This trend makes it difficult to categorize sensors in the traditional way, into passive sensors and active sensors. In this paper, we instead introduce sensors in terms of imaging or non-imaging functionality. Imaging sensors typically employ optical imaging systems, thermal imaging systems, or SAR. Optical imaging systems use the visible, near-infrared, and shortwave infrared spectra and typically produce panchromatic, multispectral, and hyperspectral imagery. Thermal imaging systems employ mid- to longwave infrared wavelengths. Non-imaging sensors include microwave radiometers, microwave altimeters, magnetic sensors, gravimeters, Fourier spectrometers, laser rangefinders, and laser altimeters [].

It has been decades since Landsat-1, the first Earth resources technology satellite, was launched in 1972. Satellite platforms have evolved from single satellites to multi-satellite constellations. Sensors have experienced unprecedented development over the years, from the first multispectral satellite, Landsat-1, with four spectral bands in 1972, to the first hyperspectral satellite, Lewis, with 384 spectral bands in 1997. Spatial resolution has also significantly improved over the decades, from 80 m for Landsat-1 to 31 cm for WorldView-3. A number of studies on satellite imagery processing methods and applications have been conducted, but only a few papers providing sensor overviews have been published, including [, , ]. Blais [] reviewed the range sensors developed over the preceding two decades, including single-point and laser scanners, slit scanners, pattern projection, and time-of-flight systems, as well as related commercial systems. Melesse et al. [] provided a survey of remote sensing sensors for typical environmental and natural resource mapping purposes, such as urban studies, hydrological modeling, land-cover and floodplain mapping, fractional vegetation cover and impervious surface area mapping, surface energy flux and micro-topography correlation, remotely sensed rainfall, and potential evapotranspiration for estimating crop water requirement satisfaction indexes. Recently, a survey of remote sensing platforms and sensors was provided by Toth and Jóźków []. The authors gave a general review of current remote sensing platforms, including satellites, airborne platforms, UAVs, ground-based mobile and static platforms, sensor georeferencing, and supporting navigation infrastructure, and provided a short summary of imaging sensors.

In the literature, we found that overviews of remote sensing sensors are quite rare. One reason is that the topic is fairly broad: detailed knowledge is usually found in comprehensive books, while webpages offer only very simple overviews. As most readers need to obtain relevant knowledge within a reasonable time and at a modest depth, this paper aims to fill that gap. We review the history of remote sensing, the interaction of the electromagnetic spectrum (EMS) with objects, imaging sensors and non-imaging sensors (e.g., laser rangefinders/altimeters), and commonly used satellites and their characteristics. In addition, future trends and potential applications are addressed. Although this paper is mainly about satellite sensors, there is no sharp boundary between satellite sensors and airborne, UAV-based, or ground-based sensors, except that satellite sensors have more interaction with the atmosphere. Therefore, we use the term "remote sensing sensors" generally.


2. Remarkable development in spaceborne remote sensing

Although the term 'remote sensing' was introduced in 1960, the practice of remote sensing has a long history. In the 1600s, Galileo used optical enhancements to survey celestial bodies []. An early exploration of prisms was conducted by Sir Isaac Newton in 1666: Newton discovered that a prism dispersed light into a spectrum of red, orange, yellow, green, blue, indigo, and violet, and that the spectrum could be recombined into white light. Over a century later, in 1800, Sir William Herschel explored thermal infrared electromagnetic radiation for the first time, measuring the temperature of light that had been split with a prism into the spectrum of visible colors. In the following decades, attempts were made at aerial photography using cameras attached to balloons, but the results were not satisfactory until 1858, when Gaspard-Félix Tournachon took the first successful aerial photograph from a captive balloon at an altitude of 1200 feet over Paris. Later, in 1889 in Labruguière, France, Arthur Batut attached a camera and an altimeter to kites for the first time so that the image scale could be determined; he is therefore considered the father of kite aerial photography. By the beginning of the twentieth century, cameras had been miniaturized (e.g., to 70 g) so that they could be carried by pigeons, and the Bavarian Pigeon Corps took the first aerial photos using a camera attached to a pigeon in 1903. During the First World War, the use of aerial photography grew. Later, in 1936, Albert W. Stevens took the first photograph showing the actual curvature of the Earth, from a free balloon at an altitude of 72,000 feet. The first space photograph, from a V-2 rocket, was acquired in 1946. Table 1 addresses the evolution of remote sensing, excluding this early development stage. The table starts with the use of aerial photographs for surveying and mapping as well as for military use. The milestones in this evolution (see Table 1) are referenced from [, ]. Recent developments in microsatellites and satellite constellations are also listed in Table 1.

Phases | Time series | Remarks
Airborne remote sensing | During the First and Second World Wars | The use of photographs for surveying, mapping, reconnaissance, and military surveillance
Rudimentary spaceborne satellite remote sensing | Late 1950s | The launch of Sputnik 1 by Russia in 1957 and Explorer 1 by the US in 1958
Spy satellite remote sensing | During the Cold War (1947–1991) | Remote sensing for military use spilled over into mapping and environmental applications
Meteorological satellite sensor remote sensing | 1960~ | The launch of the first meteorological satellite (TIROS-1) by the US in 1960; since then, data in digital formats and the use of computer hardware and software
Landsat | 1972~ | Landsat 1, 2, and 3 carried a multispectral scanner; Landsat 4 and 5 carried a Thematic Mapper sensor; Landsat 7 carries an Enhanced Thematic Mapper; Landsat 8 carries the Operational Land Imager. Landsat satellites have high resolution and global coverage; applications were initially local and have since become global
European Space Agency's first Earth observation satellite program | 1991~ | ESA launched its first satellite, ERS-1, in 1991, carrying a variety of Earth observation instruments: a radar altimeter, ATSR-1, SAR, a wind scatterometer, and a microwave radiometer. A successor, ERS-2, was launched in 1995
Earth observing system (EOS) | Since the launch of the Terra satellite in 1999 | Terra/Aqua satellites carrying sensors such as MODIS and taking measurements of pollution in the troposphere (MOPITT). Global coverage, frequent repeat coverage, a high level of processing, and easy and mostly free access to data
New millennium | Around the same time as EOS | Next generation of satellites and sensors, such as Earth Observing-1, acquiring the first spaceborne hyperspectral data
Private industry/commercial satellite systems | 2000~ | (1) Very high-resolution data, such as from the IKONOS and QuickBird satellites; (2) a revolutionary means of data acquisition: daily coverage of any spot on Earth at high resolution, such as RapidEye; (3) Google streaming technology allowing rapid access to very high-resolution images; (4) the launch of GeoEye-1 in 2008 for very high-resolution imagery (0.41 m)
Microsatellite era and satellite constellations | 2008~ | (1) Small satellites and satellite constellations (RapidEye and Terra Bella, formerly Skybox): RapidEye was launched in August 2008 with five EOS; these were the first commercial satellites to include the red-edge band, which is sensitive to changes in chlorophyll content. On March 8, 2016, Skybox Imaging was renamed Terra Bella; its satellites captured the first-ever commercial high-resolution video of Earth from a satellite along with high-resolution color and near-infrared imagery; (2) for the first time, Russia carried out a single mission launching 37 satellites, in June 2014; (3) ESA launched the first satellite of the Sentinel constellation in April 2014; (4) SpaceX demonstrated reusable rocket capability in December 2015; (5) current satellites offer short revisit periods, large coverage, and high spatial resolution, up to 31 cm

Table 1.

Evolution and advancement in remote sensing satellites and sensors.


3. Characteristics of materials in electromagnetic spectrum (EMS)

Remote sensors interact with objects on the surface of the Earth from a distance. These objects generally include terrain, buildings, roads, vegetation, and water. The typical materials of these objects are categorized into two groups according to how they interact with the EMS: transparent and opaque (partly or fully absorbing).

3.1. Electromagnetic spectrum

Figure 1 shows the EMS range from gamma rays to radio waves. In remote sensing, typical applications use the visible light (380–780 nm), infrared (780 nm–0.1 mm), and microwave (0.1 mm–1 m) ranges. This paper treats the terahertz range (0.1–1 mm) as an independent spectral band separate from microwaves. Remote sensing sensors interact with objects remotely; between the sensor and the Earth's surface lies the atmosphere. It is estimated that only 67% of sunlight directly heats the Earth []; the remainder is absorbed and reflected by the atmosphere, which strongly absorbs infrared and UV radiation. In visible light, typical remote sensing applications use the blue (450–495 nm), green (495–570 nm), and red (620–750 nm) spectral bands for panchromatic, multispectral, or hyperspectral imaging. Current bathymetric and ice LIDAR generally uses green light (e.g., NASA's HSRL-1 LIDAR at 532 nm). However, new experiments have shown that in the blue spectrum, such as at 440 nm, the absorption coefficient of water is approximately an order of magnitude smaller than at 532 nm, and 420–460 nm light can penetrate relatively clear water and ice much deeper, offering substantial improvements in sensing through water for the same optical power output and thus reducing power requirements []. The red spectrum, together with near-infrared (NIR), is typically used for vegetation applications. For example, the Normalized Difference Vegetation Index (NDVI) is used to evaluate whether targets contain live green vegetation.

Infrared is invisible radiant energy. It is usually divided into regions: near IR (NIR, 0.75–1.4 μm), shortwave IR (SWIR, 1.4–3 μm), mid-IR (MIR, 3–8 μm), longwave IR (LWIR, 8–15 μm), and far IR (FIR, 15–1000 μm). Alternatively, the ISO 20473 scheme divides it into NIR (0.78–3 μm), MIR (3–50 μm), and FIR (50–1000 μm). Most of the infrared radiation in sunlight is in the NIR range, whereas most of the thermal radiation emitted by objects near room temperature is infrared [] in the mid-infrared region, at much longer wavelengths than in sunlight. Of natural thermal radiation processes on the surface of the Earth, only lightning and natural fires are hot enough to produce much visible energy, and fires produce far more infrared than visible light energy. NIR is mainly used in medical imaging and physiological diagnostics. One typical application of MIR and FIR is thermal imaging, for example, in night vision devices. In the MIR and FIR spectral bands, water shows high absorption, and biological systems are highly transmissive.
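As an illustration of the NDVI mentioned above, the following Python sketch (with made-up reflectance values) computes NDVI = (NIR − Red)/(NIR + Red) per pixel; values near +1 indicate dense green vegetation, bare soil typically falls around 0.1–0.2, and water is usually negative:

```python
import numpy as np

# Minimal NDVI sketch with illustrative 2x2 reflectance arrays.
red = np.array([[0.08, 0.30],
                [0.05, 0.25]])  # red-band reflectance
nir = np.array([[0.50, 0.35],
                [0.45, 0.28]])  # near-infrared reflectance

ndvi = (nir - red) / (nir + red + 1e-10)  # small epsilon avoids division by zero
print(ndvi)  # left column (~0.72, 0.80) suggests vegetation; right column is soil-like
```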


Figure 1.

The electromagnetic spectrum. Image from UC Davis ChemWiki, CC-BY-NC-SA 3.0.

With regard to the terahertz spectral band, terahertz frequencies are useful for investigating biological molecules. Unlike more commonly used forms of radiated energy, this range has rarely been studied, partly because no one knew how to make these frequencies bright enough [] and partly because practical applications have been impeded by the fact that ambient moisture interferes with wave transmission []. Nevertheless, terahertz light (also called T-rays) has remarkable properties. T-rays are safe, non-ionizing electromagnetic radiation: they pose little or no health threat and can pass through clothing, paper, cardboard, wood, masonry, plastic, and ceramics. They can also penetrate fog and clouds. THz radiation transmits through almost anything except metal and liquids (e.g., water). T-rays can be used to reveal explosives or other dangerous substances in packaging, corrugated cardboard, clothing, shoes, backpacks, and book bags. However, the technique cannot detect materials that might be concealed in body cavities [].

The terahertz region is technically the boundary between electronics and photonics []. The wavelengths of T-rays, shorter than microwaves and longer than infrared, correspond to biomolecular vibrations. This light can provide imaging and sensing capabilities not available through conventional technologies such as microwaves []. For example, T-rays can penetrate fabrics, and many common materials and living tissues are semi-transparent and have 'terahertz fingerprints', permitting them to be imaged, identified, and analyzed []. In addition, terahertz radiation has the unique ability to non-destructively image physical structures and perform spectroscopic analysis without any contact with valuable and delicate paintings, manuscripts, and artifacts, and it can be utilized to measure objects that are opaque in the visible and near-infrared regions. Terahertz pulsed imaging techniques operate in much the same way as ultrasound and radar to accurately locate embedded or distant objects []. Current commercial terahertz instruments include terahertz 3D medical imaging, security scanning systems, and terahertz spectroscopy. In a recent breakthrough (September 2016), researchers at MIT demonstrated a terahertz camera that can read a closed book by distinguishing ink from blank regions on paper. The article indicates that 'in its current form the terahertz camera can accurately calculate distance to a depth of about 20 pages' []. It is expected that in the future, this technology can be used to explore and catalog historical documents without having to touch or open them and risk damage.

Regarding microwaves, the shorter microwave wavelengths (a few inches long) are typically used in remote sensing, for example, in radar. Microwaves are used for obtaining information on the atmosphere, land, and ocean, such as in the Doppler radar used in weather forecasting, and for gathering unique information on sea wind and wave direction, derived from frequency characteristics, including the Doppler effect, polarization, and backscattering, that cannot be observed by visible and infrared sensors []. In addition, microwave energy can penetrate haze, light rain and snow, clouds, and smoke []; microwave sensors work in any weather condition and at any time.

3.2. Objects and spectrum

When light encounters an object, the two can interact in several different ways: transmission, reflection, and absorption. The interaction depends on the wavelength of the light and the nature of the object's material.

Most materials exhibit all three behaviors when interacting with light: partial transmission, partial reflection, and partial absorption. According to the dominant optical property, we categorize objects into two typical types: transparent materials and opaque materials.

A transparent material allows light to pass through without being scattered or absorbed. Typical transparent objects include plate glass and clean water. Figure 2 shows the transmission spectrum of soda-lime glass with a 2-mm thickness. Soda-lime glass is typically used in windows (also called flat glass) and glass containers. From Figure 2, it can be seen that soda-lime glass blocks nearly all UV radiation but has high transmittance in the visible light and NIR wavelengths. Accordingly, when a laser scanner with a wavelength of 905, 1064, or 1550 nm hits a flat glass window or a glassy balcony, over 80% of the laser energy passes through the glass and hits the objects behind it. Another typical example of a transmissive material is clear water. Water transmittance is very high in the blue-green part of the spectrum but diminishes rapidly in the near-infrared wavelengths (see Figure 3). Absorption, on the other hand, is notably low in the shorter visible wavelengths (less than 418 nm) but increases abruptly in the range of 418–742 nm. A laser beam with a wavelength of 532 nm (green laser) is typically applied in bathymetric measurements, as this wavelength has a high water transmittance. According to the Beer-Lambert law, the relation between absorbance and transmittance is: Absorbance = −log10(Transmittance).
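A quick numerical illustration of this relation (using base-10 logarithms and illustrative transmittance values):

```python
import math

# Beer-Lambert relation quoted above: Absorbance = -log10(Transmittance).
def absorbance(transmittance: float) -> float:
    return -math.log10(transmittance)

for t in (0.92, 0.80, 0.50, 0.10):
    print(f"T = {t:4.2f} -> A = {absorbance(t):.3f}")
# T = 0.92 -> A = 0.036 (e.g., visible light through clear flat glass)
# T = 0.10 -> A = 1.000 (a strongly absorbing medium)
```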


Figure 2.

Transmission spectrum of soda-lime glass with a 2-mm thickness. Obtained from Wikipedia [22].


Figure 3.

Liquid water absorption spectrum. Obtained from Wikipedia [23].

Opacity results from the reflection and absorption of light waves at the surface of an object. The reflectance depends on the material of the surface that the light encounters. There are two types of reflection: specular and diffuse. In specular reflection, light from a single incoming direction is reflected into a single outgoing direction. In diffuse reflection, an incident ray is reflected at many angles rather than at just one. Most objects have mixed reflective properties []. Representative reflective materials include metals such as aluminum, gold, and silver. From Figure 4, it can be seen that aluminum has a high reflectivity over a wide range of wavelengths: in the visible light and NIR wavelengths, its reflectance reaches up to 92%, increasing to 98% in the MIR and FIR. Silver has a higher reflectance than aluminum at wavelengths longer than 450 nm, although at a wavelength of about 310 nm the reflectance of silver drops to nearly zero []. The reflectance of gold increases significantly at wavelengths around 500 nm, reaching a very high reflectance in the infrared. The figure indicates that regardless of the wavelength at which a sensor operates, high reflection from aluminum surfaces is unavoidable.


Figure 4.

Reflective spectrum of metals: aluminum, gold, and silver.

The physical characteristics of a material determine which electromagnetic waves will and will not pass through it. Figure 5 shows examples of the reflectance spectra of dry bare soil, green vegetation, and clear water. The reflectance of dry bare soil increases as the wavelength increases from 400 to 1800 nm. Green vegetation has a high reflectance in the red light and near-infrared regions; these characteristics are used to distinguish green vegetation from other objects. In addition, Figure 3 showed that water has low absorbance in the visible light region, and Figure 5 shows that water reflects visible light at a low rate (<5%), indirectly indicating that water has high transmittance in the visible light range.


Figure 5.

Examples of reflective materials. Image referenced from Wikimedia [26].


4. Spaceborne sensors

Spaceborne sensors have been under development for over 40 years. Currently, approximately 50 countries operate remote sensing satellites []. More than 1000 remote sensing satellites are available in space; among these, approximately 593 are from the USA, over 135 are from Russia, and approximately 192 are from China [].

Conventionally, remote sensors are divided into two groups, passive sensors and active sensors, as described in the first section. However, as sensor technology has advanced, this division is no longer absolute. For example, an imaging camera is usually regarded as a passive sensor, yet in 2013 a new approach was developed that integrates active and passive infrared imaging capability into a single chip, enabling lighter, simpler dual-mode active/passive cameras with lower power dissipation []. Alternatively, remote sensing sensors can be classified into imaging sensors and non-imaging sensors. In terms of their spectral characteristics, imaging sensors include optical imaging sensors, thermal imaging sensors, and radar imaging sensors. Figure 6 illustrates this categorization into imaging and non-imaging sensors.


Figure 6.

Spaceborne remote sensing sensors.

4.1. Optical imaging sensors

Optical imaging sensors operate in the visible and reflective IR ranges. Typical optical imaging systems on space platforms include panchromatic, multispectral, and hyperspectral systems. In a panchromatic system, the sensor is a monospectral channel detector sensitive to radiation within a broad wavelength range; the image is black and white or gray scale. A multispectral sensor is a multichannel detector with a few spectral bands; each channel is sensitive to radiation within a narrow wavelength band, and the resulting multilayer image contains both the brightness and the spectral (color) information of the targets observed. A hyperspectral sensor collects and processes information from tens to hundreds of spectral bands; each narrow band forms an image, and the resulting image set can be utilized to recognize objects, identify materials, and detect elemental components. Table 2 gives a more detailed description of these optical imaging systems. Note that when light is split into multiple spectral bands, the greater the number of bands, the lower the imaging resolution; that is, a panchromatic image usually has a higher spatial resolution than a multispectral or hyperspectral image. A pan-sharpening technique was presented by Padwick et al. in 2010 [] for improving the quality of multispectral images. This kind of method combines the spectral (color) information of the multispectral data with the spatial information of the panchromatic data, resulting in a higher resolution color product equal in resolution to the panchromatic data (see the sketch after Table 2).

Attribute | Panchromatic systems | Multispectral systems | Hyperspectral systems
Spectral range (nm) | ~430–720 | ~430–720 and ~750–950 | ~470–2000
Satellites | QuickBird, SPOT, IKONOS | SPOT, QuickBird, IKONOS | TRW Lewis, EO-1
Spectral bands | Monospectral; black-and-white (gray-scale) image | Several spectral bands | Tens to hundreds of spectral bands
Spatial resolution | Submeter | Up to 1–2 m | Up to 2 m
Applications | Earth observation and reconnaissance | Red-green-blue (true color): visual analysis; green-red-infrared: vegetation and camouflage detection; blue-NIR-MIR: visualizing water depth, vegetation coverage, soil moisture content, and the presence of fires in a single image | (i) Agriculture; (ii) eye care; (iii) food processing; (iv) mineralogy; (v) surveillance; (vi) physics; (vii) astronomy; (viii) chemical imaging; (ix) environment
Advantages | High applicability in (i) imaging multiple targets; (ii) mosaicking strips over large areas; (iii) stereo and tri-stereo acquisition; (iv) linear feature acquisition, such as coastlines, pipelines, roads, and borders
Disadvantages | Affected by sun illumination and cloud coverage, e.g., polar areas with seasonal changes in sun illumination and the equatorial belt with persistent cloud coverage

Table 2.

Satellite optical imaging systems.
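As a simple illustration of the pan-sharpening idea referenced above, the sketch below implements a basic Brovey-style ratio transform, one of the oldest pan-sharpening approaches; note that this is an illustrative stand-in, not the specific algorithm of Padwick et al.:

```python
import numpy as np

def brovey_pansharpen(ms: np.ndarray, pan: np.ndarray) -> np.ndarray:
    """Brovey-style sharpen: rescale each multispectral band by the ratio of
    the panchromatic value to the per-pixel sum of the multispectral bands.
    Assumes ms (bands, H, W) is already upsampled to the pan grid (H, W)."""
    intensity = ms.sum(axis=0) + 1e-10  # per-pixel band sum; epsilon avoids /0
    return ms * (pan / intensity)

ms = np.random.rand(3, 4, 4)   # toy R, G, B cube upsampled to pan resolution
pan = np.random.rand(4, 4)     # toy panchromatic image at full resolution
print(brovey_pansharpen(ms, pan).shape)  # (3, 4, 4): color at pan resolution
```

The ratio preserves the relative band proportions (color) at each pixel while injecting the panchromatic image's finer spatial detail.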

4.2. Thermal IR imaging sensors

A thermal sensor typically operates in the mid- to far-infrared part of the electromagnetic spectrum, roughly between 9 and 14 μm. Any object with a temperature above absolute zero emits infrared radiation and can produce a thermal image; a warm object emits more thermal energy than a cooler one and therefore appears brighter in the image. Because a thermal image is independent of scene lighting and is available day or night, thermal sensing is especially useful for tracking living creatures, including animals and humans, and for detecting volcanoes and forest fires. Commonly used thermal imaging sensors include IR imaging radiometers, imaging spectroradiometers, and IR imaging cameras. Satellite IR sensors currently in use include ASTER, MODIS, ASAS, and IRIS. Table 3 lists the thermal IR sensors and their applications; a short illustration of why thermal imagers use this wavelength window follows the table.

Sensor | Operational wave band | Definition | Satellite sensors | Applications
IR imaging radiometer | UV, mid-to-far-infrared, or microwave | Measures the intensity of electromagnetic radiation | ASTER | Volcanological, mineralogical, and hydrothermal studies; forest fires; glacier, limnological, and climatological studies; DEM generation
Imaging spectroradiometer | Infrared | Measures the intensity of radiation in multiple spectral bands | MODIS, ASAS, IRIS | Sea surface temperature, cloud characteristics, ocean color, vegetation, trace chemical species in the atmosphere
Infrared imaging camera | Mid-to-far infrared | Measures radiant energy from the surface | — | Volcanology, determining thunderstorm intensity, identifying fog and low clouds

Table 3.

Thermal IR imaging sensors and their applications.
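As a side illustration of why thermal imagers favor roughly the 9–14 μm window, Wien's displacement law, λ_max = b/T, places the emission peak of objects near room temperature squarely in that range:

```python
# Wien's displacement law: peak emission wavelength of a blackbody at T kelvin.
WIEN_B = 2.898e-3  # Wien's displacement constant, m*K

def peak_wavelength_um(temp_k: float) -> float:
    return WIEN_B / temp_k * 1e6  # result in micrometers

print(peak_wavelength_um(300))   # ~9.7 um: room-temperature ground surfaces
print(peak_wavelength_um(310))   # ~9.3 um: human body
print(peak_wavelength_um(1400))  # ~2.1 um: fire/lava, shifted toward SWIR
```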

4.3. Radar imaging sensors

A radar (microwave) imaging sensor is usually an active sensor operating in the electromagnetic spectrum range of 1 mm–1 m. The sensor transmits microwave energy toward the ground, and the energy reflected from the target back to the radar antenna produces an image at microwave wavelengths. The radar moves along a flight path, and the area illuminated by the radar, or footprint, is moved along the surface in a swath. Each pixel in the radar image represents the radar backscatter for that area on the ground. A microwave instrument can operate in cloudy or foggy weather and can also penetrate sand, water, and walls. Unlike infrared data, which help us identify different minerals and vegetation types from reflected sunlight, radar only shows differences in surface roughness and geometry and in the moisture content of the ground (the complex dielectric constant). Radar and infrared sensors are complementary instruments and are often used together to study the same types of Earth surfaces []. Frequently used microwave spectral bands for remote sensing include the X-band, C-band, S-band, L-band, and P-band. Specific characteristics of each band can be found in Table 4.

Band | Frequency (GHz) | Wavelength (cm) | Key characteristics
Ka | 40–27 | 0.75–1.11 | Usually for astronomical observations
K | 27–18 | 1.11–1.67 | Used for radar, satellite communications, astronomical observations, automotive radar
Ku | 18–12 | 1.67–2.5 | Typically used for satellite communications
X | 12.5–8 | 2.4–3.75 | Widely used for military reconnaissance, mapping, and surveillance
C | 8–4 | 3.75–7.5 | Penetration capability for vegetation or solids is limited and restricted to the top layers; useful for sea-ice surveillance
S | 4–2 | 7.5–15 | Used for medium-range meteorological applications, for example, rainfall measurement and airport surveillance
L | 2–1 | 15–30 | Penetrates vegetation to support observation applications over vegetated surfaces and for monitoring ice sheet and glacier dynamics
P | 1–0.3 | 30–100 | So far, only for research and experimental applications; significant penetration capabilities for vegetation canopy, sea ice, soil, and glaciers

Table 4.

Commonly used frequency and spectrum bands of radar imaging sensors.

Referenced from Born and Wolf [].
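The frequency and wavelength columns of Table 4 are linked by λ = c/f, which the following short sketch verifies for a few representative frequencies:

```python
C_LIGHT = 2.998e8  # speed of light, m/s

def wavelength_cm(freq_ghz: float) -> float:
    return C_LIGHT / (freq_ghz * 1e9) * 100  # wavelength in centimeters

print(wavelength_cm(10.0))  # ~3.0 cm  (X-band)
print(wavelength_cm(5.4))   # ~5.6 cm  (C-band, e.g., used by Sentinel-1)
print(wavelength_cm(1.3))   # ~23.1 cm (L-band)
```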

Conventional passive microwave imaging instruments (such as cameras or imaging radiometers) provide imagery with a relatively coarse spatial resolution compared to optical instruments. The diffraction-limited angular resolution of an aperture is directly proportional to the wavelength and inversely proportional to the aperture dimension []. To achieve a spatial resolution similar to that of optical instruments, a very large antenna aperture (e.g., tens of kilometers) would be needed; clearly, it is not feasible to carry such a large antenna on a space platform. SAR is an active microwave instrument that resolves this problem by utilizing the motion of the spacecraft to emulate a large antenna from the small craft itself. The longer the synthesized antenna, the narrower the beam, and a fine ground resolution results from a narrow beam width. At present, a synthesized aperture can be several orders of magnitude larger than the physical transmitter and receiver antenna, and it has become possible to produce a SAR image with half-meter resolution [].
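To see the scale of the problem, the diffraction limit quoted above gives a ground resolution of roughly ρ ≈ hλ/D for altitude h, wavelength λ, and aperture D; solving for D shows why a real (non-synthetic) microwave aperture is impractical (a back-of-the-envelope sketch, ours):

```python
def required_aperture_m(altitude_m: float, wavelength_m: float,
                        resolution_m: float) -> float:
    """Aperture needed for a diffraction-limited resolution rho ~ h * lambda / D."""
    return altitude_m * wavelength_m / resolution_m

# C-band (~5.6 cm) from an 800-km orbit:
print(required_aperture_m(800e3, 0.056, 10.0))  # ~4,480 m for 10-m resolution
print(required_aperture_m(800e3, 0.056, 0.5))   # ~89,600 m for optical-like 0.5 m
# The same 0.5-m resolution in visible light (~0.5 um) needs under a meter:
print(required_aperture_m(800e3, 0.5e-6, 0.5))  # ~0.8 m
```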

As noted earlier, SAR uses microwaves to illuminate a ground target with a side-looking geometry and measures the backscatter and travel time of the transmitted waves reflected by objects on the ground; the distance the SAR device travels over a target while the radar pulses return to the antenna defines the synthetic aperture that produces the SAR image. Typically, SAR is mounted on a moving platform, such as a spaceborne or airborne platform. According to the combination of frequency bands and polarization modes used in data acquisition, SAR can be categorized into []:

  • Single frequency (L-band, C-band, or X-band);

  • Multiple frequency (Combination of two or more frequency bands);

  • Single polarization (VV, HH, or HV);

  • Multiple polarization (Combination of two or more polarization modes).

The main parameters in designing and operating a SAR include the transmitted electromagnetic power, frequency, phase, polarization, incidence angle, spatial resolution, and swath width. There are different types of SAR techniques, including ultra-wideband SAR, terahertz SAR, interferometric SAR (InSAR), and differential interferometric SAR (D-InSAR). Ultra-wideband SAR utilizes a very wide range of radio frequencies, which yields better resolution and more spectral information on target reflectivity; it can therefore be applied to scanning smaller objects or closer areas. Terahertz SAR works in the spectral range from 0.3 to 10 THz, between infrared and microwave. Typical characteristics of this wavelength range include transmission through plastics, ceramics, and even paper. Terahertz radiation is extraordinarily sensitive to water content: if a material contains even a small amount of water, it will be fairly absorptive to terahertz light, so this radiation can be applied to detecting lake shores or coastlines. InSAR is a technique that derives measurements from two or more SAR images and is widely applied in DEM production and in monitoring glaciers, earthquakes, and volcanic eruptions []. D-InSAR requires at least two images plus a DEM, which can be acquired from GPS measurements; it is mainly used for monitoring subsidence movements, slope stability analysis, landslides, glacier movement, and 3D ground movement []. Doppler radar is used to acquire a distant object's velocity relative to the radar; its main applications include aviation, sounding satellites, and meteorology. In general, interferometric SAR techniques can measure ground displacement on the order of millimeters.
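As a worked illustration of the millimeter-level sensitivity of D-InSAR, the line-of-sight displacement corresponding to an interferometric phase change is |Δr| = λΔφ/(4π) (the sign depends on convention), so even a small fraction of a fringe represents measurable motion:

```python
import math

def los_displacement_mm(delta_phase_rad: float, wavelength_m: float) -> float:
    """Magnitude of line-of-sight displacement for a given phase change."""
    return wavelength_m * delta_phase_rad / (4 * math.pi) * 1000

# One full fringe (2*pi) at C-band (~5.6 cm) equals half a wavelength of motion:
print(los_displacement_mm(2 * math.pi, 0.056))    # 28.0 mm
# A tenth of a fringe is already ~2.8 mm of ground motion:
print(los_displacement_mm(0.2 * math.pi, 0.056))  # ~2.8 mm
```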

4.4. Non-imaging sensors

A non-imaging sensor measures a signal based on the intensity of the whole field of view, mainly acting as a profile recorder. In contrast to imaging sensors, this type of sensor does not record how the input varies across the field of view. In the remote sensing field, commonly used non-imaging sensors include radiometers, altimeters, spectrometers, spectroradiometers, and LIDAR; they typically work in the visible, IR, and microwave spectral bands. Table 5 provides detailed information about conventional non-imaging sensors. Applications of non-imaging sensors mainly focus on measurements of height, temperature, wind speed, and other atmospheric parameters.

Sensor | Operational wave band | Definition | Application
Radiometer | Ultraviolet, IR, microwave | Measures the amount of electromagnetic energy within a specific wavelength range | Calculating various surface and atmospheric parameters
Altimeter | IR, microwave/radio wave, sonic | Measures the altitude of an object above a fixed level | Mapping ocean-surface topography and the hills and valleys of the sea surface
Spectrometer | Visible, IR, microwave | Measures the spectral content of the incident electromagnetic radiation | Multispectral and hyperspectral imaging
Spectroradiometer | Visible, IR, microwave | Measures the intensity of radiation in multiple spectral bands | Monitoring sea surface temperature, cloud characteristics, ocean color, vegetation, trace chemical species in the atmosphere
LIDAR | Ultraviolet, visible, NIR | Measures distance and intensity; Doppler LIDAR measures speed, and polarization effects indicate shape | Ocean, land, and 3D topographic mapping; meteorology, cloud measurements, wind profiling, and air quality monitoring
Sonar | Acoustic | Measures the distance to an object; determines the depth of water beneath ships and boats | Navigation, communication, and security (e.g., vessels) and underwater object detection, for example, handheld sonar for divers
Sodar | Acoustic | As a wind profiler, measures wind speeds at various heights above the ground and the thermodynamic structure of the lower layer of the atmosphere | Meteorology: atmospheric research and wind monitoring (typically 50–200 m above ground level)
Radio acoustic sounding system (RASS) | Radio wave and acoustic wave | Measures the atmospheric lapse rate using backscattering of radio waves from an acoustic wave front to measure the speed of sound at various heights above the ground | Added to a radar wind profiler or a sodar system

Table 5.

Commonly used non-imaging sensors.

Lasers are applied to measuring the distance and height of targets in the remote sensing field. A laser scanning system is generally called a LIDAR (light detection and ranging) system; satellite, airborne, mobile mapping, and terrestrial LIDAR differ in their carrier platforms. Laser sources include solid-state lasers, liquid (dye) lasers, gas lasers, semiconductor lasers, and chemical lasers (see Table 6). Typical laser sources for laser rangefinders and laser altimeters are semiconductor and solid-state lasers. Semiconductor lasers typically produce light at wavelengths of 400–500 nm and 850–1500 nm, while solid-state lasers generate light at wavelengths of 700–820 nm, 1064 nm, and 2000 nm. Satellite and airborne LIDAR systems typically operate at wavelengths of 905, 1064, and 1550 nm. One of the main considerations in wavelength selection is the atmospheric transmission between the sensor and the surface of the Earth; lower transmittance at a given wavelength also means less solar background radiation at that wavelength. The transmittance at 905 nm is approximately 0.6, while the wavelengths of 1064 and 1550 nm have similar transmittances of approximately 0.85. Wavelength selection can also be a cost issue: diode lasers at 905 nm are inexpensive compared to Nd:YAG solid-state lasers at 1064 nm and diode lasers at 1550 nm. In 2007, the cost of diode lasers at 1550 nm was 2.5 times higher than that of lasers at 905 nm. However, the 1550 nm wavelength is a good candidate for invisible, eye-safe LIDAR: the absorption of 1550 nm light by water, approximately 175 times greater than that of 905 nm light, makes it eye safe. In addition, the solar background level at 1550 nm is approximately 50% lower than at 905 nm, so measurements at 1550 nm achieve a higher signal-to-noise ratio than those at 905 nm. Overall, cost aside, a wavelength of 1550 nm has a clear advantage over 905 nm [].
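The ranging principle behind the laser rangefinders and altimeters discussed above is simple pulse time-of-flight: the range is R = cΔt/2, since the pulse travels to the target and back. A minimal sketch:

```python
C_LIGHT = 2.998e8  # speed of light, m/s

def range_m(round_trip_time_s: float) -> float:
    """One-way range from a measured round-trip pulse time."""
    return C_LIGHT * round_trip_time_s / 2

print(range_m(6.67e-6))  # ~1000 m: typical airborne laser scanning altitude
print(range_m(4.0e-3))   # ~600 km: the scale of satellite laser altimetry
```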

Laser type | Pump source | Typical applications
Gas laser | Electrical discharge | Interferometry, holography, spectroscopy, material processing
Chemical laser | Chemical reaction | Military use
Dye laser | Other laser, flashlamp | Research, laser medicine
Metal-vapor laser | Electrical discharge | Printing and typesetting applications, fluorescence excitation examination, scientific research
Solid-state laser (e.g., fiber laser, Nd:YAG) | Flashlamp, laser diode | Material processing, rangefinding, laser target designation
Semiconductor laser | Electrical current | Telecommunications, holography, printing, weapons, machining

Table 6.

Commonly used laser types.

In general, at a wavelength of 1064 nm, vegetation has a stronger reflectance than soil, while at 1550 nm, soil shows a greater reflectance than vegetation. Taking measurements at different wavelengths is therefore beneficial for object classification. Green lasers with a wavelength of 532 nm are usually frequency-doubled solid-state (Nd:YAG) lasers; this type of laser is widely used for bathymetric measurement. Table 7 lists the typical applications of different laser wavelengths.

Name | Wavelength (nm) | Laser sources | Typical applications
UV laser | 355 | Gas laser | Cutting and drilling
Violet laser | 405 | Semiconductor laser or solid-state laser | Laser printing, data recording, laser microscopy, laser projection displays, spectroscopic measurements
Blue laser | 488 | Solid-state laser | Environmental monitoring, medical diagnostics, handheld projectors and displays, telecommunications
Green laser | 532 | Solid-state laser (Nd:YAG) | Bathymetric measurement
Red laser | 640 | Semiconductor laser | Vegetation measurement
NIR laser | 1064 | Semiconductor laser or solid-state laser (fiber laser) | Airborne laser scanning
NIR laser | 1550 | Semiconductor laser or solid-state laser (fiber laser) | Airborne laser scanning

Table 7.

Commonly used laser wavelengths.

Referenced from Hey [].

4.5. Commonly used remote sensing satellites

So far, more than 1000 remote sensing satellites have been launched, and these have been continually replaced by new generations. The few-band sensors of the earliest missions have been upgraded to hyperspectral sensors with hundreds of spectral bands, spatial and spectral resolutions have improved on the order of 100-fold, and revisit times have been shortened from months to a day. In addition, more and more remote sensing data are available as open data sources. Table 8 gives an overview of commonly used remote sensing satellites and their parameters.

Mission | Country | Launch year | Sensors | Height of orbit (km) | Swath (km) | Revisit (days) | Channels | Spatial resolution
Landsat | USA | 1972, 1975, 1978, 1982, 1984, 1993, 1999, 2013, 2020 | Panchromatic and multispectral sensors | 705 | 185, 183 | 16 | 7–11 | 120 m, 100 m, 60 m, 30 m, 15 m
SPOT | France | 1986, 1990, 1993, 1998, 2002, 2012 | Imaging spectroradiometer | 694 | 60 | 1–3 | Panchromatic, B, G, R, NIR | 2.5 m, 5 m, 10 m, 20 m
ERS | ESA | 1991, 1995 | IR radiometer, microwave sounder, radiometer, SAR | 782–785 | 5–100 (AMI) to 500 (ATSR) | 3, 35, 336 | SAR | 26 m across track and 6–30 m along track
RADARSAT | Canada | 1995, 2007, 2018 | SAR | 793–821, 798, 592.7 | 45–100, 18–500, 5–500 | 1 | SAR | 8–100 m, 3–100 m, 3–100 m
MODIS | USA | 1999, 2002 | Imaging spectroradiometer | 705 | 2330 | 1 | 36 | 1000 m, 500 m, 250 m
IKONOS | USA | 1999 | Imaging spectroradiometer | 681 | 11.3 | 3 | Panchromatic, B, G, R, NIR | Panchromatic: 80 cm; B, G, R, NIR: 3.2 m
QuickBird | USA | 2000, 2001 | Imaging spectroradiometer | 482, 450 | 16.8–18 | 2.4–5.9 | Panchromatic, B, G, R, NIR | Panchromatic: 65 cm/61 cm; B, G, R, NIR: 2.62 m/2.44 m
Envisat | ESA | 2002 | ASAR, MERIS, AATSR, RA-2, MWR, GOMOS, MIPAS, SCIAMACHY, DORIS, LRR | 790 | 1150, 100, 400 | 35 | 15 bands (VIS, NIR), C-band | 300 m, 30–150 m
GeoEye | USA | 2008 | Imaging spectroradiometer | 681 | 15.2 | 8.3 | Panchromatic, B, G, R, NIR | Panchromatic: 41 cm; B, G, R, NIR: 1.65 m
WorldView | USA | 2007, 2009, 2014, 2016 | Imaging spectroradiometer, laser altimeter | 496, 770, 617, 681 | 17.6, 16.4, 13.1, 14.5 | 1.7, 1.1, <1, 3 | Panchromatic; panchromatic and eight multispectral bands; panchromatic and eight multispectral bands; panchromatic, B, G, R, NIR | Panchromatic 0.5 m; panchromatic and stereo images 0.46 m, multispectral 1.84 m; panchromatic 0.34 m and multispectral 1.36 m
Sentinel 1–6 | ESA | 2014, 2015, 2016, 2017, 2021 | Radar and super-spectral imaging | 693, 786, 814 | 250, 290, 250 | 12, 10, 27 | C-SAR; 12 bands (VIS, NIR, SWIR); 21 bands (VIS, NIR); S-band and X-band | 5–20 m, 5–40 m, 10 m/20 m/60 m

Table 8.

Remote sensing satellites.

Referenced from Refs. [, , , ].


5. Future and discussions

A common expectation in the remote sensing community is the ability to acquire data at high resolution (spatial, spectral, radiometric, and temporal) and low cost, with open resource support, and to create new applications through the integration of spatial/aerial and ground-based sensors.

The development of smaller, cheaper satellite technologies in recent years has led many companies to explore new ways of using low Earth orbit satellites. Many have focused on remote imaging, for example, gathering optical or infrared imagery. In the future, a low-cost communications network between low Earth orbit satellites could be established to form a spatial remote sensing network. Such a network would integrate with a large number of distributed ground sensors to establish ground-space remote sensing. In addition, satellites can easily cover large swaths of territory, thereby supplementing ground-based platforms, and data distribution and sharing would become very easy.

Openness and resource sharing can promote the utilization of remote sensing and maximize its output. In recent years, open remote sensing resources have made great progress. Beginning on April 1, 2016, all Earth imagery from a widely used Japanese remote sensing instrument operating aboard NASA's Terra spacecraft since late 1999 became available to users everywhere at no cost []. On April 8, 2016, ESA announced that a 40-cm resolution WorldView-2 European cities dataset would be available for download free of charge through the Lite Dissemination Server. This dataset was collected by ESA, in collaboration with European Space Imaging, over the most populated areas of Europe, with products acquired between February 2011 and October 2013; it is available to ESA member states (including Canada) and European Union member states []. Among open remote sensing resources, NASA (USA) was a pioneer in sharing its imagery data. NASA has been cooperating with the open source community, many NASA projects are themselves open source, and NASA has set up a special website to present these projects. In addition, some commercial companies, such as DigitalGlobe (USA), have also partly opened their data to the public. In the future, more and more open resources will become available.

Future applications in remote sensing will combine the available resources from space/aerial/UAV platforms with ground-based data. The prerequisites of such resource integration are as follows: (i) the spatial resolution of satellite data must be high enough to match ground-based data, that is, spatial data and ground data should be of a similar order of accuracy; WorldView-3 has achieved a 30-cm spatial resolution, which is approaching ground-based data accuracy (e.g., ~2 cm for mobile laser scanning point clouds); (ii) cloud-based computation must support the big datasets produced by crowd-sourced remote sensing resources. The current situation shows promising support for the integration of multiple sources of remote sensing data, and we expect to see new applications developing in the coming years.


6. Conclusions

This paper investigated remote sensing sensor technology both broadly and in depth. First, we reviewed fundamental knowledge about the electromagnetic spectrum and the interaction of objects with the spectrum; this helps in understanding how environmental objects respond when a sensor operates at a certain wavelength. We also highlighted the terahertz region of the spectrum; since little research has been done on this range, future research efforts on new applications of terahertz radiation may be worth exploring. On the interaction of sensors with the environment, typical examples were provided for glass, metal, water, soil, and vegetation. Remote sensors were presented in terms of imaging sensors and non-imaging sensors, with optical, thermal, and radar imaging sensors and laser scanning highlighted. In addition, commonly used remote sensing satellites, especially those from NASA and ESA, were detailed in terms of launch time, sensors, swath width, spectral bands, revisit time, and spatial resolution.


Acknowledgments

We would like to thank TEKES for its funding support through the COMBAT project and the EU project 6Aika for financial support.
