
Theoretical Basis

 

Electromagnetic Radiation


In remote sensing, information on the object concerned is conveyed to the observer through electromagnetic energy, which is the information carrier and thus provides the communication link.
Remote sensing data are basically records of electromagnetic radiation reflected and emitted from the object/earth features under investigation.
The proportions of energy reflected, absorbed and transmitted will vary for different earth features, depending on their material type and condition. These differences permit us to distinguish different features on an image.
However, even within a given feature type, the proportion of energy which is reflected, absorbed and transmitted will vary at different wavelengths.
These two basic characteristics of electromagnetic radiation enable us to identify and study an object/earth feature, or in other words, to apply remote sensing.
Electromagnetic radiation is a form of energy transfer in free space which exhibits both wave and particle properties.
The wave can be described in terms of its wavelength (λ), which is the distance of separation between adjacent wave peaks, or its frequency (f), which is the number of wave peaks passing a fixed point in a given time.
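The two descriptions are linked by the speed of light, c = λf, so either quantity determines the other. A minimal sketch (the 0.55 μm value is just an illustrative mid-visible wavelength):

```python
# Wavelength-frequency relation for electromagnetic waves: c = lambda * f.
C = 299_792_458.0  # speed of light in vacuum, m/s

def frequency_hz(wavelength_m: float) -> float:
    """Frequency (Hz) of an EM wave with the given wavelength (m)."""
    return C / wavelength_m

f_green = frequency_hz(0.55e-6)  # green light, roughly 5.45e14 Hz
```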
Sensing systems collect information through several portions of the electromagnetic spectrum.
In remote sensing, it is most common to categorise electromagnetic waves by their wavelength location within the electromagnetic spectrum. Although names are generally assigned to regions of the electromagnetic spectrum for convenience, there is no clear-cut dividing line between one nominal spectral region and the next. Divisions of the spectrum have grown out of the various methods for sensing each type of radiation more than from inherent differences in the energy characteristics of various wavelengths.
The portions of the electromagnetic spectrum commonly used in remote sensing are:


- gamma rays,
- x-rays,
- ultraviolet, which adjoins the blue end of the visible portion of the spectrum,
- visible (0.4–0.7 μm),
- infrared, which adjoins the red end of the visible region and is further divided into three portions: near-IR (0.7–1.3 μm), mid-IR (1.3–3 μm) and thermal IR (beyond 3 μm),
- microwaves (1 mm – 1 m).

Spectral responses measured by remote sensors over various features at various wavelengths often permit an assessment of the type and/or condition of the features and are often referred to as "spectral signatures". Spectral signatures enable us to distinguish snow from water, vegetation from soil and so on.
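One simple way to use spectral signatures is to compare a measured pixel spectrum against a set of reference signatures and assign the closest one. The sketch below is illustrative only: the band reflectance values are invented for the example, not taken from a real signature library.

```python
# Nearest-signature classification sketch (illustrative reflectance values).
import math

SIGNATURES = {            # reflectance in (green, red, near-IR) bands
    "water":      (0.05, 0.03, 0.01),
    "vegetation": (0.12, 0.08, 0.50),
    "soil":       (0.15, 0.20, 0.30),
    "snow":       (0.90, 0.88, 0.80),
}

def classify(pixel):
    """Return the class whose reference signature is nearest (Euclidean)."""
    return min(SIGNATURES, key=lambda name: math.dist(pixel, SIGNATURES[name]))
```

For example, a pixel that is dark in the visible bands but bright in the near infrared matches the vegetation signature, reflecting the strong IR reflectance of healthy plants.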

References:
Rokos, D., Photointerpretation and Remote Sensing, NTUA, 1979.
Lo, C. P., Applied Remote Sensing, University of Georgia, 1986.

 

Basic Concepts and Magnitudes of E.M.R.



Electromagnetic radiation is the basis of photointerpretation and remote sensing: the sensing, registration and measurement of the energy (natural or artificial) which leaves a region or surface of interest, whether reflected or emitted, determine the amount, precision and completeness of the information we can remotely acquire about it, through a proper study of its basic physical, chemical and biological properties and of the "effects" of the multi-dimensional interventions of humankind on earth, soil, vegetation, water and the built environment.
The radiant flux incident on the sensor is a very important factor in remote sensing. In practice, we are not interested (except in some limited cases) in the radiant flux density which leaves a specific surface in various directions. Rather, we are interested in the E.M.R. which is sensed and registered by a remote sensor and which comes from the solid angle of its instantaneous field of view (IFOV).
Another concept related to the radiant flux incident on the remote sensor/system (not a strictly scientific one), which is frequently used, is the so-called brightness.
A human being, as an integrated living remote sensing system, can perceive a scene directly (for example by observing an area of the earth's surface vertically or obliquely) or indirectly (by observing an aerial photograph or some other remotely sensed image), and can discern a bright surface or appearance from a less bright one even without knowing the exact quantity of radiance received by the sensor.
The radiant intensity and the radiation incident on the sensor are attributed to the E.M.R., which relates to a specific angle of view. It becomes obvious that even the same elements, characteristics and appearances of the environment will be presented differently on the image plane when they are viewed from different angles of view.
Since the interaction of each element, characteristic and appearance of the Natural Earth Surface with the energy of the E.M.R. depends on the amount of incident radiation as well as on temperature, measurements of reflected energy and thermal emission cannot safely determine the identity, characteristics and state of an object solely by interpreting a remote sensing image on which these quantities have been registered.

 

Region of Sensitivity for an Artificial System of Photointerpretation and Remote Sensing



a) in the ultraviolet region (λ = 0.3–0.4 μm)
b) in the near infrared region (λ = 0.7–1.3 μm)
c) in the middle infrared region (λ = 1.3–3.0 μm)
d) in the thermal infrared region (λ = 3.0–14.0 μm)
e) in the microwave radiation region (λ = 5–500 mm)

This makes it possible to observe the respective images in the part of the electromagnetic spectrum covering the region from 0.3 to 15.0 μm.

Reference: Rokos, D., in "Environmental Crisis", pp. 215-259, ELKAM, 1993.

 

Artificial System of Acquisition of Analogue and Digital Remote Sensing Images based on the registration of the reflected Electromagnetic Radiation (E.M.R.)



It consists of:

a. The natural source of the E.M.R., the Sun. (1)

b. The interaction with the atmosphere of the E.M.R. emitted from the sun towards the earth (3) and of the part of the E.M.R. reflected from the earth (2).

c. The interaction of the emitted radiation (1) (already influenced by the atmosphere) with the Natural Earth Surface (N.E.S.) (3).

d. The passive sensor/system (4) which is sensitive to the radiation reflected from the N.E.S. (after its interaction with the atmosphere) and registers, in an analogue form (photographic camera) or in a digital form (radiometer, scanner), the respective remotely sensed images.

These sensors are sensitive to wavelengths between 0.3 μm and 3.0 μm.

Reference: Rokos, D., in "Environmental Crisis", pp. 215-259, ELKAM, 1993.

 

Artificial System of Remote Sensing on the basis of the registration of the emitted (thermal) Electromagnetic Radiation



It consists of:

a. The N.E.S. as a source of emitted thermal radiation (which has either been absorbed from the sun's radiation and/or comes from physical, chemical and biological processes of the interior and/or the surface of the Earth); it can also be a consequence of human activities. This radiation lies at wavelengths of λ = 3.0–14.0 μm.

b. The interaction of the thermal radiation emitted from the N.E.S. with the atmosphere (2) on its path towards the sensor/system (4) which is sensitive to that part of radiation.

c. The sensor/system (4).

A sensor/system (multispectral scanner) can be simultaneously sensitive to the electromagnetic radiation reflected and emitted from the N.E.S., which is registered in distinct characteristic zones (channels) covering specific wavelengths (e.g. the Thematic Mapper (TM) of the Landsat US satellite registers the E.M.R. in three regions of the visible part of the spectrum, one in the near infrared, two in the middle infrared and one in the thermal infrared).
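The TM band layout just described can be written out as a small lookup table. The wavelength ranges below are the commonly published approximate values (in μm) and should be checked against official Landsat documentation before use:

```python
# Thematic Mapper bands: number -> (spectral region, lower um, upper um).
# Ranges are approximate published values, included here for illustration.
TM_BANDS = {
    1: ("visible (blue)",   0.45, 0.52),
    2: ("visible (green)",  0.52, 0.60),
    3: ("visible (red)",    0.63, 0.69),
    4: ("near infrared",    0.76, 0.90),
    5: ("middle infrared",  1.55, 1.75),
    6: ("thermal infrared", 10.4, 12.5),
    7: ("middle infrared",  2.08, 2.35),
}

def bands_in_region(region: str):
    """Band numbers whose region description contains the given keyword."""
    return [b for b, (name, lo, hi) in TM_BANDS.items() if region in name]
```

Counting the bands per region reproduces the text's breakdown: three visible, one near-infrared, two middle-infrared and one thermal-infrared channel.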

Reference: Rokos, D., in "Environmental Crisis", pp. 215-259, ELKAM, 1993.

 

Active System of acquisition of Remote Sensing data on the basis of the transmittance of artificial E.M.R. and the registration of the radiation backscattered from the earth



It consists of:

a. A source transmitting artificial E.M.R. (e.g. microwave side looking radar (1)) and the respective sensor/system for the registration of the back scattered radiation, (after its interaction with the atmosphere) (4).

b. The interaction of the radiation transmitted with the atmosphere (2) on its path from the source (1) and its back scattering towards the sensor (4) within the system (1)+(4).

c. The interaction of the artificial E.M.R. transmitted with the N.E.S. (3).


A sensor/system of this type, operating in the spectral interval between 0.8 and 100.0 cm, can operate independently of weather conditions.

Reference: Rokos, D., in "Environmental Crisis", pp. 215-259, ELKAM, 1993.

 

Digital Images



A digital image consists of a grid of cells, each called a picture element or pixel, representing the brightness of each area with a numeric value; in an 8-bit system the values range from 0 to 255 (2^8 = 256 levels). A grayscale or panchromatic image is displayed by using the pixel values to control screen brightness (intensity): the lower the pixel value, the darker the gray. When we display more than one channel, each as a different primary colour, the brightness levels may be different for each channel and they combine to form a colour image.
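This pixel model can be sketched directly with nested lists: one grid of 0–255 values is a grayscale image, and three such grids, displayed as red, green and blue, combine into a colour image. The values below are arbitrary example data:

```python
# A 2x3 single-channel 8-bit image; 0 = black, 255 = white.
gray = [
    [0,  64, 128],
    [32, 96, 255],
]

# Displaying the same grid in all three primaries gives a neutral gray image;
# different grids per channel would combine into a colour image.
red, green, blue = gray, gray, gray

def rgb_pixel(r, g, b, row, col):
    """Combine three channel values into one (R, G, B) colour pixel."""
    return (r[row][col], g[row][col], b[row][col])
```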



 

Digital Image Formats



Generally, scanning saves an image to a file. Thus, each row of the array or matrix will normally correspond to one scan line. In grayscale scanning, a value ranging from zero (for black) to 255 (for white) is stored for each cell. In colour scanning mode, a value from 0 to 255 is stored for each of the three primary colours (red, green and blue).
Digital images can be packaged mainly in the following three generic formats:
- Band SeQuential (BSQ)
- Band Interleaved by Line (BIL) and
- Band Interleaved by Pixel (BIP)
In BSQ format the digital values of the bands of the image are stored sequentially. First band, second band... and so on.
In BIL format the digital values of the relative rows of every image band are stored sequentially. First row of first band, first row of second band..., second row of first band, second row of second band... and so on.
In BIP format the digital values of the relative pixels of every image band are stored sequentially. First pixel of first band, first pixel of second band...  second pixel of first band, second pixel of second band... and so on.
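The three orderings can be demonstrated on a tiny 2-band, 2×2-pixel image (the digital values are arbitrary example data):

```python
# bands[b][row][col] holds the digital value of band b at (row, col).
bands = [
    [[10, 11],
     [12, 13]],   # band 1
    [[20, 21],
     [22, 23]],   # band 2
]

def bsq(bands):
    """Band sequential: all of band 1, then all of band 2, ..."""
    return [v for band in bands for row in band for v in row]

def bil(bands):
    """Band interleaved by line: row 1 of each band, row 2 of each band, ..."""
    n_rows = len(bands[0])
    return [v for r in range(n_rows) for band in bands for v in band[r]]

def bip(bands):
    """Band interleaved by pixel: pixel 1 of each band, pixel 2 of each band, ..."""
    n_rows, n_cols = len(bands[0]), len(bands[0][0])
    return [band[r][c] for r in range(n_rows) for c in range(n_cols)
            for band in bands]
```

Comparing the outputs makes the difference concrete: BSQ keeps each band contiguous, BIL interleaves at row granularity, and BIP interleaves at pixel granularity.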

(Verbyla et al., Processing Digital Images in GIS, 1997)

 

Photorecognition elements


- Tone refers to the relative brightness or colour of objects in an image. Tone and colour are the most fundamental elements of image interpretation.
- Size, shape, texture and pattern are spatial arrangements of tone/colour.


- Shadow may provide an idea of the profile and relative height of a target. It is useful for enhancing or identifying topography and landforms, particularly in radar imagery.
- Association takes into account the relationship between the target of interest and other recognizable objects or features in proximity.
 

The photos are from CCRS, 1998

 

Individual’s perceptive ability



Depending on individual perceptive ability, experience and the type of imagery, human beings are adept to varying degrees at visually interpreting images, especially those produced by cameras. When the interpreter can identify what he sees, he can better understand the spatial arrangements of the objects represented and the context of an image. Through photointerpretation, the image data become usable information.
But there are certain limitations and constraints to human interpretation. For example, the human eye sees only in the visible part of the electromagnetic spectrum, and the human brain assigns characteristic colours - the "true" colours - to the features of the earth's surface. Satellite imaging devices additionally record parts of the electromagnetic spectrum to which the human eye is not sensitive, resulting, after the necessary processing, in images that reproduce the world in unfamiliar or "false" colours. Specialist knowledge and experience are needed to improve the interpreter's capacity to understand and analyse satellite remotely sensed images.

 

Human ability in detecting gray-shade differences




Source :
NOAA/NESDIS Forecast Products
Development Team 1998

 

Conventional aerial photographs



Panchromatic aerial photographs display the record of terrain brightness across the entire visible part of the electromagnetic spectrum as a grayscale image.



Infrared aerial photographs use film sensitive to the entire 0.3 to 0.9 μm wavelength range and are useful for detecting differences in vegetation cover, due to their sensitivity to IR reflectance.

Colour photography involves the use of three-layer film, each layer being sensitive to a different range of light (blue, green and red).
SPOT Panchromatic image, date 21/09/93, resolution 10m, CNES

In false colour (colour infrared, CIR) photography, the three emulsion layers are sensitive to green, red and near-infrared radiation, which are processed to appear as blue, green and red respectively; this is valuable in scientific studies of vegetation.

(CCRS, Fundamentals of Remote Sensing, 1998)


Multispectral photography uses multi-lens systems with different film-filter combinations to acquire photos in a number of different spectral ranges simultaneously. A multispectral image provides the possibility of obtaining a more refined classification than is possible with a single spectral band.
The concept of subdividing spectral ranges of radiation into intervals of continuous wavelengths (bands) is illustrated in the following four panels, showing a coloured map and the resulting black-and-white photos made with blue, green and red filters.



Colour photo                  Blue                Green                Red

 

(Short 1998, Goddard Space Flight Center, NASA)

 

Thermal infrared spectrum



Apart from the visible and near-infrared, other commonly used bands of the spectrum include the thermal infrared (heat) and microwave (radar) regions.



The major subdivisions of the EM spectrum (Conventional separations) [Ramsey 1998]


Thermal infrared radiation is emitted rather than reflected and can only be detected using electro-optical sensors: the radiation is emitted from the Earth's surface in the form of heat. Because atmospheric gases absorb electromagnetic energy in specific regions of the spectrum, the wavelengths we can use most effectively for remote sensing in the thermal infrared portion of the spectrum are the atmospheric windows at 3.5–5.5 μm and 8–14 μm.



Source: CCRS, 1998

 

What about Landsat Imagery?



Orbital characteristics of Landsat: the satellites' orbital inclination is nearly polar. Landsat sensor systems observe the globe with approximately 26 km of sidelap between successive orbits at the equator (about 14%), rising to a maximum at 81° north and south latitude (about 85%), which is useful for some stereoscopic analysis applications.
For a polar orbiting satellite, scanning is achieved by having the axis of rotation of the mirror along the direction of motion of the satellite, so that the scan lines are at right angles to the direction of motion.
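The growth of sidelap with latitude follows from orbit geometry: adjacent ground tracks converge roughly as the cosine of the latitude. A hedged sketch of this geometric approximation (it is not an official Landsat formula, just the cos-latitude argument made explicit):

```python
import math

def sidelap(lat_deg: float, s_equator: float = 0.14) -> float:
    """Approximate sidelap fraction at a given latitude, assuming adjacent
    ground tracks converge as cos(latitude): s = 1 - (1 - s_eq) * cos(lat)."""
    return 1.0 - (1.0 - s_equator) * math.cos(math.radians(lat_deg))

s_81 = sidelap(81.0)  # roughly 0.85-0.87, consistent with the quoted maximum
```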

A number of sensors have been on board the Landsat series of satellites, including the Return Beam Vidicon (RBV) camera systems, the MultiSpectral Scanner (MSS) systems, and the Thematic Mapper (TM).


 

What is an Imaging Radar ?



Imaging RADAR (Radio Detection And Ranging) systems are active sensors which provide their own source of electromagnetic energy; therefore, images can be acquired day or night. Also, microwave energy is able to penetrate clouds and most rain, making radar an all-weather sensor.
Radar images are composed of many dots, or picture elements, each representing the radar backscatter for that area on the ground.



In analysing radar images, the higher or brighter the backscatter on the image, the rougher the surface being imaged.
Backscatter is also sensitive to the target’s electrical properties, including water content.
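Because backscattered power spans a very wide dynamic range, it is usually converted to decibels before being mapped to image brightness. A minimal sketch of that mapping; the 8-bit scaling range of -25 to 0 dB is an illustrative choice, not a standard:

```python
import math

def to_db(power_linear: float) -> float:
    """Convert linear backscatter power to decibels."""
    return 10.0 * math.log10(power_linear)

def db_to_gray(db: float, lo: float = -25.0, hi: float = 0.0) -> int:
    """Linearly scale backscatter in dB to an 8-bit brightness value,
    clipping outside the chosen display range."""
    clipped = max(lo, min(hi, db))
    return round(255 * (clipped - lo) / (hi - lo))
```

With this mapping, rougher surfaces (stronger backscatter, higher dB) render as brighter pixels, matching the interpretation rule stated above.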

 

Imaging different types of surface with radar [Wong 1996]



National Technical University of Athens
Dept. of Rural & Surveying Engineering
Laboratory of Remote Sensing