International Journal of Environmental Monitoring and Analysis
Volume 3, Issue 6, December 2015, Pages: 420-424

Robust Method for Deforestation Analysis of Satellite Images

Ioan Ispas1, Eduard Franti1, 2, Florin Lazo1, Elteto Zoltan1

1Centre for New Electronic Architecture, Research Institute for Artificial Intelligence, Bucharest, Romania

2National Institute for Research and Development in Microtechnologies, Micromachined Structures, Microwave Circuits and Devices Laboratory, Bucharest, Romania

Email address:

(I. Ispas)
(E. Franti)
(F. Lazo)
(E. Zoltan)

To cite this article:

Ioan Ispas, Eduard Franti, Florin Lazo, Elteto Zoltan. Robust Method for Deforestation Analysis of Satellite Images. International Journal of Environmental Monitoring and Analysis. Vol. 3, No. 6, 2015, pp. 420-424. doi: 10.11648/j.ijema.20150306.16


Abstract: The aim is to design a robust method for real-time tracking of deforestation in a local area under satellite observation. Deforested areas are obtained by a procedure of differentiating between two temporally successive images. The resulting method proves to be robust even when the analyzed satellite image shows multiple alterations: cutting (minus 3-10%), translation (5-10%), rotation (2-10 degrees), parasitic random noise (5-15%), different brightness and contrast (5-10%) and cloudy areas (15-20%).

Keywords: Satellite Images, Digital Image Processing, Deforestation, Forest Satellite Surveillance


1. Introduction

The importance and seriousness of deforestation are detailed in the document provided by the FAO (Food and Agriculture Organization) commission of the United Nations [1], covering the period 1990-2005. The conclusions of this report need not be presented here. However, it is necessary to note that the immense effort of mapping and classifying agricultural and forest land across the planet's surface has been enabled by automatic procedures based on the FRA (Forest Resources Assessment) system of satellite surveillance and on the Global FRA 2010 Remote Sensing Survey program.

The reference work on remote monitoring of environmental systems (Environmental Remote Sensing and Systems Analysis [2]) reviews the main programs for automatic tracking of soil changes, especially in forest areas. Most such programs are based on multi-band imaging data provided by the Landsat satellite system. This information can be accessed in archives on the Internet, some even free of charge, but these archives have not been updated recently.

The Copernicus Global Land Service [14], part of the European Earth Observation Programmes, is among the most cited automatic surveillance systems. One of the services it provides relates to the surveillance of forest areas for the detection and prosecution of logging.

At the European level, the EEA (European Environment Agency) implements programs such as Corine Land Cover and GIO Land [15], which provide databases, some even free of charge (but no newer than 2006), that can serve as starting points for establishing automatic forest surveillance systems.

Such surveillance programs exist on all continents. NASA's very extensive LCLUC (Land Cover & Land-Use Change) program should be mentioned, with an annual budget of $7.5 million, covering US territory as well as other continents [16].

Additionally, in the US, the NOAA Hazard Mapping System Fire and Smoke Product [17], a forest fire detection system, runs in real time with online access. This national system for tracking forest fires and triggering automatic alarms could not be operational without specialized automatic computerized tracking applications. These applications process and analyze, in real time, multi-band images provided by satellites such as Landsat, giving information about the nature of the disturbances that occur in the forest.

Unlike the above-mentioned global territory surveillance systems, the local systems for monitoring deforestation are flexible, can be updated, and are, obviously, cheaper.

2. Basic Elements in Designing a Forest Surveillance Satellite Application

In designing a forest satellite surveillance application it is necessary to meet the following five basic requirements [8]:

1)  Automation: allow automatic or semi-automatic surveillance.

2)  Generality: cover a wide range of forest patterns (forests of different types).

3)  Reactivity: detect in real time any deforestation exceeding a predetermined threshold.

4)  Robustness: take account of possible obstructions in satellite images (clouds, smoke, fog).

5)  Accessibility: can be used on common personal computers by staff without specialist IT training.

Fig. 1. Functional diagram of a semiautomatic forest surveillance application.

The basic stages of a forest satellite surveillance application are:

1)  Preprocessing of satellite imagery: geometric distortion correction, multi-band information integration [10].

2)  Masking (eliminating the obscuring influence of) clouds, mist and smoke [11, 12].

3)  Extraction / detection of forest areas [8,9].

4)  Comparing (by difference) successive images [8,13].

5)  Classification of different elements for the detection of deforested areas [8,13].

Each stage contains specific algorithms and methods that may be regarded as independent procedures in themselves. The cloud-masking step has a high degree of difficulty when large portions of the image (over 40%) are covered by clouds or when a diffuse fog layer is present, which is why this step may require semi-automatic methods [12].

The functional diagram of a multisensor forest surveillance application [13], integrating satellite data and images acquired in multiple frequency bands, is given in Figure 1.
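To make the chain of stages concrete, the following is a minimal sketch of how such an application could be organised in MATLAB; the helper functions extract_cloud_mask and extract_forest_mask are hypothetical placeholders for stages 2 and 3, not existing toolbox functions.

% Illustrative skeleton of the five-stage surveillance pipeline (sketch only).
Im0 = imread('forest0.bmp');              % previous satellite image
Im1 = imread('forest1.bmp');              % new satellite image
% 1) Preprocessing (geometric correction, band integration) is assumed already done.
mask2 = extract_cloud_mask(Im1);          % 2) mask clouds / mist / smoke (hypothetical helper)
mask0 = extract_forest_mask(Im0);         % 3) extract forest areas (hypothetical helper)
mask1 = extract_forest_mask(Im1);
maskdif = mask0 & ~mask1 & ~mask2;        % 4) compare successive images by difference of masks
regs = regionprops(maskdif, 'Area');      % 5) classify / measure candidate deforested regions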

3. Designing the Application for Deforestation Detection

The following method provides real-time tracking of local deforestation based on updated satellite images. It was designed using MATLAB libraries (the Image Processing Toolbox and the Computer Vision System Toolbox).

The procedure for differentiating two successive satellite images proves to be robust, even though the new satellite image shows multiple alterations: cutting (minus 3-10%), translation (5-10%), rotation (2-10 degrees), parasitic random noise (5-15%), different brightness and contrast (5-10%), and cloudy areas (15-20%).

The satellite images are taken from Google Maps and have a resolution of 3 m (one pixel corresponds to a 3x3 meter square on the ground).

The proposed five-step method is simple, robust, and fast. It can be a starting point for a local tracking system which can then incorporate elements of assistance and supervision by learning the characteristics of the surveilled areas.

4. Case Study

The extraction of cleared areas is performed on the basis of two satellite images (source: Google Earth).

Fig. 2. The source image forest0.bmp RGB24 630x796.

Fig. 3. Image forest1.bmp RGB24 517x740, rotation +4°, cloud area 6%, brightness/contrast +5%, random noise +1%.

4.1. Reading the Images for Comparison

Im0 = imread('forest0.bmp');
Im1 = imread('forest1.bmp');

The input images are stored as matrices Im0 and Im1, each with three RGB color layers (Fig. 2 and Fig. 3).

4.2. Extraction of Wooded Areas Using a Vegetation Mask

im = Im0; imrange = im;
imrange(:,:,1) = histeq(im(:,:,1), IntervalRed);
imrange(:,:,2) = histeq(im(:,:,2), IntervalGreen);
imrange(:,:,3) = histeq(im(:,:,3), IntervalBlue);
mask0 = imrange(:,:,1) ~= 255 & imrange(:,:,2) ~= 255 & imrange(:,:,3) ~= 255;
mask0 = bwmorph(mask0, 'dilate');
mask0 = bwmorph(mask0, 'close');

Fig. 4. Forest extraction: binary image mask0.

To mask the forest vegetation in the initial image (Figure 2), intervals (with a lower and an upper threshold) were used for each of the three color layers:

IntervalRed = [ … ];

IntervalGreen = [ … ];

IntervalBlue = [ … ];

These three parameters are adjustable; they were obtained through supervised learning procedures for the specific local areas monitored by satellite.
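As an alternative sketch of this masking step, the vegetation mask can also be built directly from the lower and upper thresholds of the three intervals; the numeric values below are illustrative placeholders only, not the intervals learned by the authors.

% Direct interval-based vegetation masking (sketch; placeholder thresholds).
IntervalRed   = [30 110];                 % illustrative values
IntervalGreen = [50 160];
IntervalBlue  = [20 100];
R = im(:,:,1); G = im(:,:,2); B = im(:,:,3);
mask0 = R >= IntervalRed(1)   & R <= IntervalRed(2) & ...
        G >= IntervalGreen(1) & G <= IntervalGreen(2) & ...
        B >= IntervalBlue(1)  & B <= IntervalBlue(2);
mask0 = bwmorph(mask0, 'dilate');         % smooth the mask, as in the listing above
mask0 = bwmorph(mask0, 'close');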

The mask obtained after vegetation extraction on the basis of the three color ranges contains an excess of forest vegetation outside the forest contour (Figure 4). This excess will be corrected by differentiating between the two successive images taken at a relatively small time interval.

While an excess of extracted vegetation can be corrected, a deficit of selected vegetation, caused by choosing intervals that are too narrow, should be avoided.

Fig. 5. Forest extraction: Binary image mask1.

This step is repeated identically for the new image, producing mask1 in a similar way (Fig. 5).

4.3. Overlapping the Altered Image on the Initial One (Image Registration)

This is the most laborious stage; it is known in the specialised literature as image registration. The new satellite image, compared to the previous one, may display a certain degree of alteration due to different weather and visibility conditions.

The white frame is just for highlighting purposes.

Fig. 6. forest1 superimposed on forest0.

In the present case, the new image is rotated, translated and partially covered by a cloud (Figure 6), but, overall, the alterations do not exceed 20% of the initial content. Ultimately, the degree of image alteration sets the level of robustness of the deforestation analysis method.

The algorithms [3, 4, 5] used by the MATLAB functions are designed for a comprehensive range of possible image transformations: rotation, translation, affine or projective transformations, reflection, etc.

Im0 = rgb2gray(imread('forest0.bmp'));
Im1 = rgb2gray(imread('forest1.bmp'));
ptsIn  = detectSURFFeatures(Im0);    % detect SURF interest points [4] used by extractFeatures below
ptsOut = detectSURFFeatures(Im1);
[featuresIn, validPtsIn] = extractFeatures(Im0, ptsIn);
[featuresOut, validPtsOut] = extractFeatures(Im1, ptsOut);
index_pairs = matchFeatures(featuresIn, featuresOut);
matchedPtsIn = validPtsIn(index_pairs(:,1));
matchedPtsOut = validPtsOut(index_pairs(:,2));
t_concord = cp2tform(double(matchedPtsOut.Location), double(matchedPtsIn.Location), 'nonreflective similarity');
Imres = imtransform(Im1, t_concord, 'XData', [1 size(Im0,2)], 'YData', [1 size(Im0,1)]);

An alternative way of achieving this key step (image registration) would be to use the estimateGeometricTransform function instead of cp2tform (spatial transformation from control point pairs).

estimateGeometricTransform is a MATLAB function from the Computer Vision System Toolbox which implements an alternative image matching algorithm belonging to the RANSAC (Random Sample Consensus) family [6].
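A minimal sketch of this alternative path is given below, assuming SURF interest points [4] and a similarity transform; the choice of transform type and the use of imwarp/imref2d for resampling are assumptions, not the authors' listing.

% Registration sketch based on estimateGeometricTransform (RANSAC-type estimator).
ptsIn  = detectSURFFeatures(Im0);
ptsOut = detectSURFFeatures(Im1);
[featuresIn,  validPtsIn]  = extractFeatures(Im0, ptsIn);
[featuresOut, validPtsOut] = extractFeatures(Im1, ptsOut);
index_pairs   = matchFeatures(featuresIn, featuresOut);
matchedPtsIn  = validPtsIn(index_pairs(:,1));
matchedPtsOut = validPtsOut(index_pairs(:,2));
tform = estimateGeometricTransform(matchedPtsOut, matchedPtsIn, 'similarity');
Imres = imwarp(Im1, tform, 'OutputView', imref2d(size(Im0)));   % resample onto the grid of Im0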

It is essential to note that an error of only a few pixels in overlaying the new, altered image on the initial (source) image can lead to undesirable results. Therefore, this stage is semi-automatic and may require supervision from a human decision maker.

The resulting image (Figure 6) contains the restored new image (transformed by rotation, translation, etc.) superimposed on the initial image. This composite image will be subtracted from the original image to obtain the missing green areas.

4.4. Extracting Cloud Covered Areas

GrayThres = 135;
mask2 = Im1 > GrayThres;
mask2 = mask2(:,:,1) & mask2(:,:,2) & mask2(:,:,3);

Clouds or cloudy areas in digital images are characterized by values in the upper half of the range [0-255]; in addition, the values of the three color layers are very close to one another, which is why shades of gray are obtained.

The threshold parameter proposed in this case is GrayThres = 135, which satisfies this condition well, as shown in Figure 7.
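A possible refinement, sketched below, combines the brightness threshold with an explicit "grayness" test (the three color layers must be nearly equal, as noted above) and also reports the cloud-covered fraction, which, per Section 2, signals when semi-automatic masking may be needed. The tolerance GrayTol is an illustrative assumption, and Im1 is assumed to be the RGB version of the new image.

% Cloud mask combining brightness and near-equality of the RGB layers (sketch).
GrayThres = 135;
GrayTol   = 15;                               % illustrative tolerance on channel spread
Imd = double(Im1);
bright  = all(Imd > GrayThres, 3);            % values in the upper half of [0-255]
grayish = (max(Imd, [], 3) - min(Imd, [], 3)) < GrayTol;   % R, G, B nearly equal
mask2 = bright & grayish;
cloudFraction = nnz(mask2) / numel(mask2);    % portion of the image covered by clouds
% coverage above roughly 40% may require semi-automatic masking [12]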

Fig. 7. mask2: extraction of the cloudy area.

4.5. The Extraction and Measurement of Cleared Areas

AreaThres = 300;
maskfinal = (mask0 - mask1 - mask2) > 0;
maskfinal = ~bwmorph(~maskfinal, 'dilate');
maskfinal = bwareaopen(maskfinal, AreaThres);
regs = regionprops(maskfinal, 'Area');
allArea = [regs.Area];

To find the deforested areas, seen as cutouts in the green forest area of the original picture, the difference between the masks is computed, obtaining the final mask (Figure 8).

Further processing is then required to eliminate all areas (false alarms) below a certain threshold, AreaThres, which sets the minimum cut surface that triggers an alert.

In this particular case, the AreaThres parameter has the value 300 pixels, which corresponds to an area of approximately 2700 m2 = 0.27 ha.
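For reference, a short sketch of the conversion between the pixel scale (1 pixel ≈ 9 m2) and the alert threshold and region areas expressed in hectares:

% Converting pixel counts to ground areas (3 m resolution, 9 m2 per pixel).
pixelArea  = 3 * 3;                        % m2 covered by one pixel
AreaThres  = 300;                          % minimum deforested area, in pixels
minArea_ha = AreaThres * pixelArea / 1e4;  % = 0.27 ha (2700 m2)
allArea_ha = allArea * pixelArea / 1e4;    % detected region areas in hectares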

The cloud shape is observed in the central part.

Fig. 8. The image difference between masks.

5. Results

The result in this case (Figure 9) consists in triggering alert notifications for four cleared areas, with surfaces (in pixels) of 2857, 1365, 1154 and 715 pixels. Using a scale of 1 pixel ≈ 9 m2, the approximate surfaces of the four cleared areas are: 2.5 ha, 1.2 ha, 1 ha and 0.6 ha.

Fig. 9. Extracting the four rectangular deforested areas, with surfaces of 2.5 hectares, 1.2 ha, 1 ha and 0.6 ha.

6. Conclusions

The proposed method for detecting local deforestation from consecutive satellite images is designed in five successive steps.

The automated method for detecting local deforestation is based on two consecutive satellite images and depends on only five parameters, grouped as the first three and the remaining two, which enables the accuracy to be adjusted:

IntervalRed, IntervalGreen, IntervalBlue – are used for selecting (masking) the green forest area in the images;

GrayThres – is used for selecting (masking) cloud covered areas;

AreaThres – is used as the minimum-area threshold that a deforested surface must exceed to be reported.

These parameters were adjusted experimentally, but they can be improved through supervised procedures.
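As a sketch, the five parameters can be grouped into a single configuration structure passed to the processing steps; the color interval values shown are illustrative placeholders, while GrayThres and AreaThres are the values used in the case study.

% Configuration structure grouping the five adjustable parameters (sketch).
params.IntervalRed   = [30 110];   % illustrative placeholder values
params.IntervalGreen = [50 160];
params.IntervalBlue  = [20 100];
params.GrayThres     = 135;        % cloud brightness threshold (case study value)
params.AreaThres     = 300;        % minimum deforested area in pixels (~0.27 ha)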

Step three, overlapping the images (image registration), is a sensitive step in the case of massive alterations of the satellite images due to weather conditions.

This stage, image registration, determines the level of robustness the method provides in detecting deforestation; in this case, the new satellite image had multiple alterations: cutting minus 10%, translation 10%, rotation 8°, parasitic random noise 10%, different brightness and contrast 16%, and cloudy areas 18%.

Acknowledgement

This work was supported by a grant of the Romanian National Authority for Scientific Research, Programme for research - Space Technology and Advanced Research - STAR, project number 82/2013.


References

  1. www.fao.org - FAO Document Repository: Forestry Paper 169.
  2. Ni-Bin Chang (Editor), Environmental Remote Sensing and Systems Analysis, CRC Press, 2012.
  3. Ioannis Manakos, Mathias Braun. Land Use and Land Cover Mapping in Europe Practices & Trends, Springer, 2014.
  4. Herbert Bay, Andreas Ess, Tinne Tuytelaars, Luc Van Gool, "SURF: Speeded Up Robust Features", Computer Vision and Image Understanding (CVIU), Vol. 110, No. 3, pp. 346-359, 2008.
  5. Goshtasby, Ardeshir, "Piecewise linear mapping functions for image registration," Pattern Recognition, Vol. 19, 1986, pp. 459-466.
  6. Goshtasby, Ardeshir, "Image registration by local approximation methods," Image and Vision Computing, Vol. 6, 1988, pp. 255-261.
  7. O. Chum and J. Matas, Optimal randomized RANSAC, IEEE Transactions on Pattern Analysis and Machine Intelligence, 30(8): 1472-1482, 2008.
  8. Gregory P. Asner, David E. Knapp, Aravindh Balaji, Guayana Páez-Acosta. Automated mapping of tropical deforestation and forest degradation: CLASlite, Journal of Applied Remote Sensing, Vol. 3, 033543, 2009.
  9. Ben DeVries, Jan Verbesselt, Lammert Kooistra and Martin Herold, Detecting Tropical Deforestation and Degradation Using Landsat Time Series, papers submitted to Geoscience and Remote Sensing Symposium (IGARSS), 2014 IEEE International.
  10. J. G. Masek, E. F. Vermote, N. E. Saleous, R. Wolfe, F. G. Hall, K. F. Huemmrich, F. Gao, J. Kutler, and T.-K. Lim, "A Landsat Surface Reflectance Dataset," Geoscience and Remote Sensing Letters, vol. 3, no. 1, pp. 68-72, 2006.
  11. C. Huang, N. Thomas, S. N. Goward, J. G. Masek, Z. Zhu, J. R. G. Townshend, and J. E. Vogelmann, "Automated masking of cloud and cloud shadow for forest change analysis using Landsat images," International Journal of Remote Sensing, vol. 31, no. 20, pp. 5449-5464, Oct. 2010.
  12. A. K. Sah, B. P. Sah, K. Honji, N. Kubo, S. Senthil. Semi-automated cloud/shadow removal and land cover change detection using satellite imagery, International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, Volume XXXIX-B7, 2012 XXII ISPRS Congress, Melbourne, Australia.
  13. M.A. Wulder, J.A. Dechka, M.A. Gillis, J.E. Luther, R.J. Hall, A. Beaudoin, S.E. Franklin. Operational mapping of the land cover of the forested area of Canada with Landsat data: EOSD land cover program, The Forestry Chronicle, Vol. 79, No. 6, pp. 1075-1083, November/December 2003.
  14. Land.copernicus.eu - Copernicus Global Land Service.
  15. www.eea.europa.eu - European Environment Agency; GIO Land Service; CORINE Land Cover.
  16. cce.nasa.gov - NASA Carbon Cycle & Ecosystems - LCLUC Program.
  17. www.ospo.noaa.gov – NOAA (National Oceanic and Atmospheric Administration) Office of Satellite and Product Operations; Hazard Mapping System Fire and Smoke Product.
