
Introduction to Photointerpretation and Remote Sensing

1.1 Concepts and Definitions
1.2 Object of Photointerpretation and Remote Sensing
1.3 Applications of Photointerpretation and Remote Sensing
1.4 Imagery used for Photointerpretation purposes
1.5 Parameters influencing aerial photograph acquisition
1.6 Platforms
1.7 Human beings and Remote Sensing Systems. Possibilities and Constraints

1.1 Concepts and Definitions

Photointerpretation is the methodology for acquiring information from photograms or stereopairs. It was developed in parallel with photogrammetry and is the first application of remote sensing familiar to human beings, in the sense that photograms and stereoviews constitute an analogue optical-mechanical and photochemical equivalent of the eye's sensitivity to visible light, optical perception and stereoscopic vision.

Reference: Rokos, D. “Photointerpretation and Remote Sensing”, NTUA, 1979


Remote Sensing
is the science and art of obtaining information about an object, area or phenomenon without physical contact, through the analysis of data acquired from a distance.
Different kinds of imagery are processed and interpreted to produce data useful in agriculture, archaeology, forestry, geography, geology and planning. However, the prime objective of remote sensing is to extract environmental and natural-resource data related to our earth.

References:
Lillesand, T.M. and Kiefer, R.W., "Remote Sensing and Image Interpretation", 2nd edition.
Lo, C.P., "Applied Remote Sensing", University of Georgia.




1.2 Object of Photointerpretation and Remote Sensing



Photointerpretation’s prime objective is the intensive and rational use of aerial photograms (more than terrestrial ones) for the acquisition and exploitation of information concerning all scientific and technical fields that require knowledge of the multiple elements, models, conditions, preconditions and parameters of the physical and socio-economic reality of an area. Planning of construction works, monitoring a system or a phenomenon, or even managing a development project at a state or regional level can therefore rely on the above-mentioned elements.

Reference: Rokos, D. “Photointerpretation and Remote Sensing”, NTUA, 1979

 

1.3 Applications of Photointerpretation and Remote Sensing



Photointerpretation and remote sensing are used in scientific and technical fields, such as:

- Geology and hydrogeology.
- Soil science.
- Forestry.
- Ecology and environmental sciences.
- Road network, traffic and railway projects and, more generally, transportation and access routes.
- Planning, monitoring and supervision of technical works.
- Exploration, inventory, mapping and management of the natural and human resources of a country or region.
- Agriculture.
- Urban planning.
- Physical planning.
- Hydrology.
- Archaeology and monument protection.
- Geography.
- Cadastre.
- Exploration, detection and mapping of land uses and of their changes.
- Social sciences.
- Creation of databases of qualitative and quantitative information, as infrastructure for development planning, resulting from multi-disciplinary co-operation.


Reference: Rokos, D. “Photointerpretation and Remote Sensing”, NTUA, 1979

 

1.4 Imagery used for photointerpretation purposes



Of the three types of imagery:

- aerial photographs,
- terrestrial photographs, and
- satellite images,

the most widely used in photointerpretation applications are aerial photographs, that is, images taken from an aircraft carrying a photogrammetric camera with a vertical or an oblique optical axis.
Terrestrial photographs are used only on special occasions, but since 1972 there has been increasing use of satellite images for photointerpretation applications such as the exploration of natural resources (the LANDSAT, SPOT, SOYUZ and MOS programmes).
Although images with an oblique optical axis give an aspect of the ground more familiar to our sight, images with a vertical optical axis are generally used.
In any case, the parallel and complementary exploitation of vertical and oblique images is particularly useful where such images are available or can be taken during field surveys.
Aerial photographs with a vertical optical axis are taken with the well-known photogrammetric cameras, which have the following characteristics:
a. focal length f = 88 mm, f = 152 mm or f = 305 mm, with photograph dimensions 23 cm × 23 cm, and
b. focal length f = 70 mm, f = 115 mm or f = 210 mm, with photograph dimensions 18 cm × 18 cm.

Photogrammetric cameras 70/18 and 88/23 are called ultra wide-angle cameras, because their field of view is 120°–139°.
Photogrammetric cameras 115/18 and 152/23 are called wide-angle cameras, because their field of view is 100°–105°, while photogrammetric cameras 210/18 and 305/23 are called normal-angle, because their field of view is 60°–70°.
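The field of view quoted for each camera class follows directly from the focal length and photograph format. A minimal sketch of the geometry, measured across the format diagonal (values land within or near the quoted ranges; the exact figures depend on the convention used, e.g. diagonal versus side of the format):

```python
import math

def field_of_view_deg(focal_mm: float, format_cm: float) -> float:
    """Angular field of view across the diagonal of a square photograph.

    focal_mm  -- camera focal length in millimetres
    format_cm -- side of the square photograph format in centimetres
    """
    diagonal_mm = format_cm * 10 * math.sqrt(2)  # square format: d = s * sqrt(2)
    return math.degrees(2 * math.atan(diagonal_mm / (2 * focal_mm)))

# The 23 cm x 23 cm camera classes listed above:
for f_mm in (88, 152, 305):
    print(f"f = {f_mm} mm, 23x23 cm -> {field_of_view_deg(f_mm, 23):.0f} deg")
```

For example, the 88/23 camera gives roughly 123° (ultra wide-angle), the 152/23 camera roughly 94° (wide-angle) and the 305/23 camera roughly 56° (normal-angle).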

Reference: Rokos, D. “Photointerpretation and Remote Sensing”, NTUA, 1979

 

1.5 Parameters influencing aerial photograph acquisition



The main parameters influencing the acquisition of an aerial photograph, and consequently the photointerpretation possibilities (that is, both the quality of the photograms and the appearance of the different elements that may be identified on them), are:

- The exploitation of the general, bibliographic, topographic and cartographic material, aerial photographs and remotely sensed data available, in order to identify the general types of existing models and characteristics or to explore them in detail.
- The selection of the aircraft which will be used as a platform for the camera.
- The weather conditions.
- The choice of the season and of the acquisition time.
- The sun's position during acquisition.
- The selection of the flight axis direction.
- The selection of the film's photo-emulsion and the appropriate filters.
- The flight height.
- The selection of photogrammetric cameras or other remote sensing sensors/systems providing imagery with the appropriate overlap.
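The flight height and the overlap requirement in the list above are linked through the standard photogrammetric relations: the scale of a vertical photograph over flat terrain is 1:m with m = H/f, and the air base (distance flown between exposures) follows from the desired forward overlap. A small sketch with illustrative numbers (the flying height and 60% overlap here are assumptions, not values from the source):

```python
def photo_scale_denominator(focal_m: float, flying_height_m: float) -> float:
    """Scale denominator m of a vertical photograph over flat terrain: m = H / f."""
    return flying_height_m / focal_m

def air_base_m(ground_side_m: float, forward_overlap: float) -> float:
    """Distance flown between successive exposures for a given forward overlap."""
    return ground_side_m * (1 - forward_overlap)

f = 0.152        # wide-angle camera focal length, metres (152 mm)
H = 1520.0       # flying height above ground, metres (illustrative)
m = photo_scale_denominator(f, H)        # scale 1:10 000
ground_side = 0.23 * m                   # a 23 cm photograph covers 2300 m on the ground
base = air_base_m(ground_side, 0.60)     # 60% forward overlap -> air base of 920 m
print(f"scale 1:{m:.0f}, ground coverage {ground_side:.0f} m, air base {base:.0f} m")
```

Doubling the flying height with the same camera halves the scale denominator's reciprocal, i.e. produces a photograph at half the scale covering twice the ground distance per side.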


Reference: Rokos, D. “Photointerpretation and Remote Sensing”, NTUA, 1979

 

1.6 Platforms



In order to record electromagnetic energy, which exists over a wide range of wavelengths and frequencies, a variety of remote sensing instruments is required. For these different imaging systems to work properly, the choice of the sensor platform is crucial.
In the 1950s and early 1960s, balloons, tethered balloons, helicopters and even rockets were used as aerial platforms for photographic systems. Since then, aircraft have normally been employed as platforms for conventional photographic systems.
In recent years, increasing attention has been paid to spacecraft as sensor platforms, because they overcome the difficulties of ceiling limit and operation duration. Spacecraft orbiting regularly around the earth at a height of several hundred kilometres provide regular surveillance of the earth with suitable remote sensing devices.
The term “spacecraft” refers to artificial satellites, manned or unmanned, space vehicles, space shuttles, etc. (the Nimbus, TIROS, ERTS/LANDSAT, NOAA, Mariner, Mercury, Apollo, Gemini, Skylab, Meteor, SPOT, MOS and METEOSAT programmes).
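The "regular" orbiting mentioned above follows from Kepler's third law: a circular orbit at a given altitude has a fixed period, so the satellite revisits each latitude band on a predictable schedule. A minimal sketch (the 705 km altitude is the approximate height of the later LANDSAT satellites, used here for illustration):

```python
import math

MU_EARTH = 3.986004418e14   # Earth's gravitational parameter, m^3 / s^2
R_EARTH = 6.371e6           # mean Earth radius, m

def orbital_period_min(altitude_km: float) -> float:
    """Period of a circular orbit at the given altitude (Kepler's third law)."""
    a = R_EARTH + altitude_km * 1e3                      # semi-major axis, m
    return 2 * math.pi * math.sqrt(a**3 / MU_EARTH) / 60  # seconds -> minutes

print(f"{orbital_period_min(705):.1f} min")  # roughly 99 minutes per revolution
```

At such a period the satellite completes about 14 to 15 revolutions per day, which is what makes systematic, repeated coverage of the whole earth possible.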

Reference: Lo, C.P., “Applied Remote Sensing”, University of Georgia.

 

1.7 Human beings and Remote Sensing Systems. Possibilities and Constraints



The human eye, like the photographic camera, converts differences/changes in reflected or emitted radiation into differences of tone, shade and colour. Both the human eye and the camera register space in detail and in geometric integrity. Remote sensing scanners, on the other hand, being more expensive than cameras and more complicated in structure, convert differences/changes in reflected or emitted radiation into electrical signals.

The human eye is limited as a remote sensor because of:

- its sensitivity only in the visible part of the electromagnetic spectrum,
- its incapacity to discern many differences of shade, and
- its inability to analyse, at the same time, more than one image from different portions of the spectrum.

In the process of analysing and interpreting remotely sensed images, however, the well-trained specialist photointerpreter has a significant, insurmountable advantage over any machine, because of his superior intelligence in evaluating, in an integrated way, the qualitative and quantitative characteristics of the objects, facts, phenomena, events, appearances, etc. of the natural and socio-economic reality.

Reference: Rokos, D. “Photointerpretation and Remote Sensing”, NTUA, 1979




National Technical University of Athens
Dept. of Rural & Surveying Engineering
Laboratory of Remote Sensing