3.3 Satellite data

General introduction to Earth Observation data

Earth Observation (EO) data, sometimes also called Remote Sensing (RS) data, can provide important spatial and temporal information for monitoring hazards such as flooding, landslides, soil erosion and volcanic eruptions. EO can be described as the process of making measurements or observations without direct contact with the object being measured or observed. Thus, while in the geo-informatics context satellites often come to mind, even amateur photography is a form of Remote Sensing.

If you would like to know more about remote sensing, you can consult the ITC textbook on GI Science and Earth Observation, which is available for download.

Introduction to remote sensing 

Remote sensing is the science and technology of observing an object or phenomenon with instrument-based techniques. Remote means that the observation is done at a distance, without physical contact with the object of interest, and sensing means the detection of energy, such as light or another form of electromagnetic energy. Remote sensing can provide important spatial and temporal data for the different layers in a geographic information system, such as infrastructure maps and land use maps.

The first requirement for remote sensing is an energy source to illuminate the target. Sunlight is one such energy source, providing electromagnetic radiation for sensors. The sun's radiation reaches the Earth's surface, and the reflected radiation is recorded by the sensors. Ground processing stations receive the recorded signals from the sensors via satellite communication links. After analysis, the images can be applied in many fields.

The total range of wavelengths of EM radiation is called the EM spectrum. Figure 1 illustrates the range of the EM spectrum and Figure 2 shows the relationship between wavelength, frequency and energy. The different portions of the spectrum are named gamma rays, X-rays, UV radiation, visible radiation (light), infrared radiation, microwaves and radio waves. Each of these named portions represents a range of wavelengths, not one specific wavelength. The EM spectrum is continuous and does not have clear-cut class boundaries. A short wavelength implies a high frequency, while long wavelengths correspond to low frequencies. Blue light therefore has a higher frequency than red light.

Figure 1: The electromagnetic (EM) spectrum.

Different portions of the spectrum have differing relevance for Earth Observation, both in the type of information that we can gather and in the volume of geospatial data acquisition (GDA). The majority of GDA is accomplished by sensing in the visible and infrared range. The UV portion covers the shortest wavelengths that are of practical use for Earth Observation; UV radiation can reveal some properties of minerals and the atmosphere. Microwaves are at the other end of the useful range for Earth Observation; they can, among other things, provide information about surface roughness and the moisture content of soils.

The visible portion of the spectrum, with wavelengths producing colour (blue, green and red), is only a very small fraction of the entire EM wavelength range. Radiation beyond red light, with longer wavelengths, is referred to as infrared (IR). We can distinguish vegetation types and the stress state of plants by analysing near-infrared (and mid-infrared) radiation; this works much better than trying to do so by colour. Infrared radiation with a wavelength longer than 3 µm is termed thermal infrared (TIR) because it produces the sensation of heat; near-IR and mid-IR do not produce a sensation of something being hot. Thermal emission from the Earth's surface (288 K) has a peak wavelength of about 10 µm.

Figure 2: Relationship between wavelength, frequency and energy.
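The relationship illustrated in Figure 2 can be summarised by two standard relations, the wave equation and Planck's relation, and Wien's displacement law reproduces the 10 µm thermal peak quoted above. The constants used below (the speed of light c, Planck's constant h and Wien's constant b) are standard textbook values, not given in this section.

\[
  c = \lambda \nu, \qquad E = h\nu = \frac{hc}{\lambda}
\]
\[
  \lambda_{\max} = \frac{b}{T} \approx \frac{2898\ \mu\mathrm{m\,K}}{288\ \mathrm{K}} \approx 10\ \mu\mathrm{m}
\]

Because frequency is inversely proportional to wavelength, shorter wavelengths correspond to higher frequencies and higher photon energies, which is why blue light has a higher frequency than red light.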

Introduction to the sensors and platforms

For Earth Observation a sensor (e.g. a camera or scanner) is required, and sensors can be operated at altitudes ranging from just a few centimeters above the ground - using field equipment - to far beyond the atmosphere. Very often the sensor is mounted on a moving vehicle - which we call the platform - such as an aircraft or a satellite (see Figure 3). Platforms are classified into three categories: ground-based, airborne and space-borne. Ground-based sensors are often used to record detailed information about the surface, which is compared with information collected from aircraft or satellite sensors; such sensors may be placed on a ladder, scaffolding, a tall building, a cherry-picker, a crane, etc. Airborne platforms are primarily fixed-wing aircraft, although helicopters are occasionally used. In space, remote sensing is sometimes conducted from the space shuttle or, more commonly, from satellites.

Satellites move along a fixed orbit and are thus less flexible, but they can provide data at regular intervals. Polar orbits and geostationary orbits are the two main classes of orbit. In a polar orbit, the satellite continuously circles the Earth at an altitude of some 500-900 km, passing over or near the poles. In a geostationary orbit, the satellite is always directly above a designated place on the equator, moving with the rotating Earth at an altitude of about 36,000 km.
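As a rough sketch of where the 36,000 km figure comes from, Kepler's third law can be applied with standard values for the Earth's gravitational parameter, sidereal day and radius (these constants are general reference values assumed here, not given in this text):

\[
  a = \left(\frac{\mu T^2}{4\pi^2}\right)^{1/3} \approx \left(\frac{3.986\times10^{14}\ \mathrm{m^3\,s^{-2}} \times (86\,164\ \mathrm{s})^2}{4\pi^2}\right)^{1/3} \approx 42\,164\ \mathrm{km}
\]
\[
  h = a - R_E \approx 42\,164\ \mathrm{km} - 6\,378\ \mathrm{km} \approx 35\,786\ \mathrm{km} \approx 36\,000\ \mathrm{km}
\]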

Figure 3: Airborne and space-borne platforms.

EO data types

The data types we obtain depend primarily on the sensor type, just as you might take colour or black-and-white photos with your camera.

Optical sensors rely on reflected sunlight as their source of energy; sunlight contains visible light, but also ultraviolet (UV), infrared (IR), thermal and other energy. Sensors that use reflected sunlight are called passive. RADAR sensors use their own source of energy, with different wavelengths in the microwave part of the electromagnetic spectrum, and Light Detection and Ranging (LiDAR) uses laser beams emitted from an airplane. These sensors are called active.

The data recorded by a sensor form an image in the shape of a grid of pixels (see Figure 4) with rows and columns. Each pixel contains reflectance information, expressed as a Digital Number (DN). In the so-called panchromatic mode all reflectance information is stored in one single grid (band), comparable to a digital camera. An optical sensor can also record the data in several bands, where each band contains information from a specific wavelength range: the so-called multi-spectral mode. Multiple bands can be combined in different ways to create (false-) colour images for specific purposes, because different materials on the ground reflect energy in a characteristic spectral pattern. For example, vegetation is characterised by high energy in the Near Infra-Red (NIR), while for water the energy is very low. In the image this results in high DN values for vegetation and low values for water in the band corresponding to the NIR; a small numerical illustration of this principle follows Figure 4. Sensors on the Landsat satellites have a pixel size of 30 m in the multi-spectral bands and 15 m in the panchromatic band. Other sensors, like GeoEye-2, have a pixel size of less than 0.4 m in the panchromatic mode.

                   

Figure 4: Grid structure of a multi-band image.
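To illustrate how band values can be combined to separate vegetation from water, the sketch below computes the widely used Normalised Difference Vegetation Index (NDVI) from a near-infrared and a red band. The small reflectance arrays and the classification thresholds are hypothetical values chosen only for illustration; in practice the bands would be read from an actual satellite image.

import numpy as np

# Hypothetical reflectance values (0-1) for a 3 x 3 subset of a red and a NIR band.
red = np.array([[0.08, 0.10, 0.30],
                [0.07, 0.09, 0.28],
                [0.05, 0.06, 0.25]])
nir = np.array([[0.45, 0.50, 0.05],
                [0.48, 0.52, 0.04],
                [0.40, 0.47, 0.03]])

# NDVI = (NIR - red) / (NIR + red): high for vegetation, low (often negative) for water.
ndvi = (nir - red) / (nir + red)

# Illustrative rule of thumb: NDVI above 0.3 suggests vegetation, below 0 suggests water.
vegetation = ndvi > 0.3
water = ndvi < 0.0

print(ndvi.round(2))
print("vegetation pixels:", int(vegetation.sum()), "water pixels:", int(water.sum()))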

 


Radar and LiDAR

Active remote sensing technologies have their own source of energy and can provide accurate information about the terrain surface. Examples of active systems are Radar and LiDAR.

Radar systems are based on the microwave remote sensing principle. Microwave remote sensing uses electromagnetic waves with wavelengths between 1 cm and 1 m. These relatively long wavelengths have the advantage that they can penetrate clouds and are not affected by atmospheric scattering. The radar antenna transmits microwave signals to the Earth's surface, where they are backscattered; the backscattered signals are detected by the sensor. Because radar can look through clouds, it is very suitable for hazard mapping during cloudy weather conditions. Space-borne radar systems include Radarsat, TerraSAR-X and ESA's freely downloadable Sentinel-1 data. A well-known example of radar data was acquired during the Shuttle Radar Topography Mission (SRTM) of February 2000; this elevation data can be downloaded for free at a horizontal resolution of 30 m.

Light Detection And Ranging (LiDAR) emits laser beams at a high frequency and collects the reflections. LiDAR data can be acquired during the day and at night, can be used to create very accurate surface models, and allows the height of buildings and tree structures to be measured and analysed.
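One common way to derive the heights of buildings and trees from LiDAR is to subtract a digital terrain model (DTM, bare ground) from a digital surface model (DSM, ground plus objects), giving a normalised DSM (nDSM). The sketch below uses small hypothetical elevation grids and an assumed 2 m threshold; real products would be derived from a LiDAR point cloud.

import numpy as np

# Hypothetical 3 x 3 elevation grids in metres.
dsm = np.array([[102.0, 110.5, 103.0],
                [101.5, 111.0, 102.5],
                [101.0, 101.2, 115.0]])   # surface model: ground plus buildings/trees
dtm = np.array([[101.8, 101.6, 102.7],
                [101.3, 101.4, 102.3],
                [100.9, 101.0, 102.0]])   # terrain model: bare ground

# Normalised DSM: height of objects above the ground.
ndsm = dsm - dtm

# Flag pixels higher than an (assumed) 2 m threshold as buildings or trees.
objects = ndsm > 2.0

print(ndsm.round(1))
print("object pixels:", int(objects.sum()))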

Optical satellite systems

Optical satellite systems, with passive sensors, detect reflected sunlight and emitted thermal radiation in the visible and infrared parts of the electromagnetic spectrum. Examples are Landsat, ASTER, GeoEye-1, GeoEye-2, WorldView-1, WorldView-2, IKONOS, QuickBird, SPOT and the Pleiades 1A & 1B satellite constellation. ESA's Sentinel-2 data is nowadays freely downloadable at a resolution of up to 10 m.

Landsat represents the world's longest continuously acquired collection of space-based moderate-resolution land remote sensing data. Four decades of imagery provide a unique resource for those who work in agriculture, geology, forestry, regional planning, education, mapping and global change research. Landsat images are also invaluable for emergency response and disaster relief.

The ASTER sensor is one of the five state-of-the-art instrument systems on board the Terra platform, launched on December 18, 1999. ASTER records 14 bands of information. The ASTER Global Digital Elevation Model (GDEM) is made from the VNIR bands 3N (nadir) and 3B (backward-looking) and has a spatial resolution of 30 m.

The GeoEye-1 satellite was launched in September 2008 and has an average revisit time of between 2 and 8 days, depending on the off-nadir look angle. The spatial resolution is 0.46 m in the panchromatic mode and 1.84 m in the multi-spectral mode.

The GeoEye-2 satellite sensor will be held in storage until needed. This satellite will be capable of discerning objects on the Earth's surface as small as 34 cm in the panchromatic mode and 1.36 m in the multi-spectral mode. Due to its high resolution and stereo capability, GeoEye-1 is already very suitable for mapping, for instance, landslides, earthquake damage and infrastructure at risk.

The WorldView-1 satellite was launched in September 2007 and has an average revisit time of 1.7 days. It is capable of collecting up to 750,000 square kilometers of 0.46 m imagery per day.

The WorldView-2 satellite was launched in October 2009 and provides 0.46 m panchromatic mono and stereo satellite image data. The multi-spectral bands have a resolution of 1.8 m.

The satellite acts like a paintbrush, sweeping back and forth to collect very large areas of multispectral imagery in a single pass. WorldView-2 alone is able to collect nearly 1 million km2 every day, with a revisit time of only 1.1 days, providing the most same-day passes of any commercial high-resolution satellite. Due to its high resolution and stereo capability, this satellite is very suitable for mapping, for instance, landslides and infrastructure at risk.

The IKONOS satellite sensor was successfully launched as the first commercially available high-resolution satellite sensor in September 1999 from Vandenberg Air Force Base, California, USA. Its capabilities include capturing 4 m multi-spectral and 1.0 m panchromatic resolution imagery. Its applications include urban and rural mapping of natural resources and natural disasters, tax mapping, agriculture and forestry analysis, mining, engineering, construction and change detection; it can yield relevant data for nearly all aspects of environmental study. The IKONOS sensor can be programmed to acquire stereo image data for the production of digital elevation models (DEMs).

The QuickBird satellite sensor was launched in October 2001 from Vandenberg Air Force Base, California, USA. QuickBird collects image data at 0.7 m pixel resolution in the panchromatic mode and 2.9 m in the multi-spectral mode. The revisit time is 1 to 3.5 days. The satellite is an excellent source of environmental data useful for analysing changes in land use and for hazard studies.

The SPOT-5 satellite sensor was decommissioned on March 31, 2015. SPOT-5 was placed into orbit in May 2002, and archived SPOT-5 satellite imagery remains available. SPOT-5 had a spatial resolution of 2.5/5 m for the panchromatic band and 10 m for the multi-spectral bands. The revisit time was 2 to 3 days.

The SPOT-6 satellite sensor, built by AIRBUS Defence & Space, was launched in September 2012. SPOT-7 is the twin satellite of SPOT-6 and was launched in June 2014. The spatial resolution of the panchromatic band is 1.5 m and that of the multi-spectral bands 6.0 m. Both satellites cover wide areas in record time (six million square km per day, an area ten times the size of France), and the constellation provides a daily revisit capability everywhere.

The Pleiades-1A satellite sensor was launched in December 2011 and provides ortho-rectified high-resolution image data. The spatial resolution of the panchromatic band is 0.5 m and that of the multi-spectral bands 2 m. The image location accuracy can be improved to an exceptional 1 m by the use of ground control points (GCPs). Because the satellite has been designed with urgent tasking in mind, images can be requested less than six hours before they are acquired. This functionality is invaluable in situations where the expedited collection of new image data is crucial, such as crisis monitoring.

The Pleiades-1B satellite, twin of Pleiades-1A, was launched in December 2012 and has the same sensor properties as Pleiades-1A. The constellation enables daily revisits to any location on the planet and also has a stereo imaging capability covering 20 km x 280 km.

The Sentinel-2 satellite sensor was launched by the European Space Agency (ESA) in mid-2015. The freely downloadable data contain three visible bands and a near-infrared band with a spatial resolution of 10 m, red-edge and shortwave-infrared bands at 20 m, and three atmospheric correction bands at 60 m.

 

 

Satellite data applications for natural hazards

For a number of hazards, satellite-based information is the major source for generating hazard inventories and for hazard monitoring (e.g. tropical cyclones, forest fires and drought). For others it supports ground-based measurements (e.g. earthquakes, volcanic eruptions, coastal hazards). Some hazard types cannot be recorded by a network of measurement stations, as they do not have specific measurable characteristics (such as landslides, forest fires and snow avalanches), and there are also many areas where recorded information is not available. Thus the identification of hazardous phenomena may require techniques such as automatic classification or expert visual interpretation of remote sensing data. However, ground-based data remains crucial to support the information extracted from EO data.

One way to obtain changes over time - for instance to identify flooded areas or landslides - is to apply automatic classification methods with dedicated software. These methods make use of reflectance information in different parts of the electromagnetic spectrum, captured by different bands in the optical and infrared domain. It is, however, advisable to also use additional ground measurements, if available. For the detection of, for instance, small to medium-sized landslides, high spatial resolution EO data are needed, with pixel sizes of 1 to 5 m. An advisable approach is to combine the multi-spectral EO data with elevation models. In this way attribute data of slopes, such as steepness or length, can give important additional information on the surface processes that are occurring. It is even possible to create 3-D visualisations of the terrain, with the classified images draped over them.
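A minimal sketch of this combination is given below: slope steepness is derived from an elevation model and used to filter a classification result. The DEM values, the pixel size, the classified array and the 15-degree threshold are all hypothetical and serve only to illustrate the idea.

import numpy as np

# Hypothetical DEM (metres) and pixel size (metres); normally read from an elevation dataset.
dem = np.array([[500.0, 498.0, 490.0, 480.0],
                [502.0, 499.0, 488.0, 470.0],
                [503.0, 500.0, 485.0, 460.0],
                [504.0, 501.0, 482.0, 455.0]])
pixel_size = 30.0

# Slope in degrees from the elevation gradients.
dz_dy, dz_dx = np.gradient(dem, pixel_size)
slope_deg = np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))

# Hypothetical classification result: 1 = candidate landslide pixel, 0 = other.
classified = np.array([[0, 0, 1, 1],
                       [0, 0, 1, 1],
                       [0, 0, 1, 0],
                       [0, 0, 0, 0]])

# Keep only candidate pixels on slopes steeper than an assumed 15-degree threshold.
landslide_on_steep_slope = (classified == 1) & (slope_deg > 15.0)

print(slope_deg.round(1))
print("retained pixels:", int(landslide_on_steep_slope.sum()))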

The use of optical satellite data is often hampered by the presence of clouds. Microwave sensors are therefore a better tool for the study of river flooding during cloudy conditions. Although classification of radar data is limited because of the absence of multi-spectral bands, the extent of the flooded area can in most situations be detected well. Sequential inundation phases, including duration, depth of inundation and direction of the current, can be analysed, preferably together with pre-event optical imagery and additional elevation data of, for instance, the terrain next to the active river bed.
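Flooded areas typically appear dark in radar images, because smooth open water reflects the transmitted signal away from the sensor. A minimal sketch of flood detection by thresholding backscatter is given below; the backscatter values (in decibels) and the -15 dB threshold are assumptions chosen only to illustrate the principle, not calibrated values for any particular sensor.

import numpy as np

# Hypothetical radar backscatter (sigma0, in dB) for a small area.
sigma0_db = np.array([[-6.0,  -7.5, -18.0, -19.5],
                      [-5.5,  -8.0, -20.0, -21.0],
                      [-6.5, -16.5, -19.0, -20.5],
                      [-7.0,  -9.0, -10.0, -17.5]])

# Very low backscatter suggests smooth open water; the threshold is an assumption.
flooded = sigma0_db < -15.0

print(flooded.astype(int))
print("flooded fraction: {:.0%}".format(flooded.mean()))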

Mapping of forest fires with satellite information is done either by mapping the fires themselves using thermal sensors, by mapping burnt areas, e.g. using MODIS or AVHRR data which have a high temporal resolution, or by using synthetic aperture radar.
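A very simplified illustration of thermal fire detection is to flag pixels whose mid-infrared brightness temperature is both high in absolute terms and well above the local background. The temperatures and thresholds below are assumptions for illustration only; they are not the operational MODIS or AVHRR fire-detection algorithm.

import numpy as np

# Hypothetical mid-infrared brightness temperatures (Kelvin) for a small scene.
bt_mir = np.array([[295.0, 296.0, 298.0],
                   [294.0, 340.0, 299.0],
                   [293.0, 297.0, 296.0]])

# Flag pixels that are both hot in absolute terms and clearly hotter than the
# scene background (both thresholds are illustrative assumptions).
background = np.median(bt_mir)
fire = (bt_mir > 320.0) & (bt_mir > background + 20.0)

print(fire.astype(int))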

For visual interpretation of hazard phenomena that cannot be automatically extracted from satellite images (such as landslides), and for geomorphological interpretation of hilly and mountainous areas, stereoscopic imagery with a high to very high resolution is required. Very high resolution imagery (QuickBird, IKONOS, WorldView, GeoEye, SPOT-5, Resourcesat, Cartosat, Formosat and ALOS-PRISM) has become the best option for visual mapping from satellite images, and the number of operational sensors with similar characteristics is growing year by year, as more countries launch earth observation satellites with stereo capabilities and spatial resolutions of 3 m or better. The high costs may still be a limitation for obtaining these very high resolution images for particular study areas, especially for multiple dates after the occurrence of main triggering events such as tropical storms or cyclones. One might therefore consider using, as an alternative, the freely downloadable data of ESA's Sentinel-2 satellite, with a spatial resolution of 10 m and a revisit time of about one week.

Satellite data can be downloaded from the websites of the respective data providers.

For further reading

A review of the status of satellite remote sensing and image processing techniques for mapping natural hazards and disasters.

Remote Sensing and GIS for Natural Hazards Assessment and Disaster Risk Management.
