Issues in species classification of trees in old growth conifer stands

Donald G. Leckie, Sally Tinis, Trisalyn Nelson, Charles Burnett, François A. Gougeon, Ed Cloney, Dennis Paradine

Research output: Contribution to journal › Article › peer-review



Old growth temperate conifer forest canopies are composed of assemblages of tree crowns that vary by species, height, size, and intercrown distance. The challenge this complexity presents to species classification is formidable. In this paper we describe the exploration of spectral properties of old growth tree crowns as captured on two independent acquisitions of 0.7 m ground resolution compact airborne spectrographic imager (CASI) airborne multispectral imagery. Underlying spectral separability is examined, and classifications of manually delineated crowns are compared against field-surveyed ground truth. Technical issues and solutions addressing individual tree species classification of old growth conifer stands are discussed. The study site is a western hemlock, amabilis fir, and red cedar dominated old growth area on Vancouver Island on the west coast of Canada. Within-species spectral variability is large because of illumination and view-angle conditions, openness of trees, natural variability, shadowing effects, and a range of crown health. As well, spectral differences between species are not large. An object-oriented illumination and view-angle correction was effective at reducing the effect of view angle on spectral variability. Radiometric normalization between imagery from flight lines of the same site and time period was successful, but normalization with data from other sites and days was not. The use of spectral samples from sunlit areas of the tree crowns (mean-lit signature) produced the best spectral separability and species classification. Because of the wide within-species variability and spectral overlap among species, it was also found useful to create internal subclasses within a species (e.g., normal and bright crowns). 
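The mean-lit signature described above, averaging only the sunlit pixels of each delineated crown, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the brightness proxy (mean across bands) and the `lit_fraction` threshold are assumptions introduced here.

```python
import numpy as np

def mean_lit_signature(crown_pixels, lit_fraction=0.3):
    """Compute a 'mean-lit' spectral signature for one delineated crown.

    crown_pixels : (n_pixels, n_bands) array of multispectral values
                   for the pixels inside a manually delineated crown.
    lit_fraction : fraction of brightest pixels treated as sunlit --
                   an illustrative assumption, not the paper's parameter.
    """
    crown_pixels = np.asarray(crown_pixels, dtype=float)
    brightness = crown_pixels.mean(axis=1)        # per-pixel brightness proxy
    n_lit = max(1, int(round(lit_fraction * len(brightness))))
    lit_idx = np.argsort(brightness)[-n_lit:]     # keep only the brightest pixels
    return crown_pixels[lit_idx].mean(axis=0)     # mean spectrum of the sunlit set

# Toy example: a 5-pixel crown with 3 spectral bands
crown = [[10, 20, 30],
         [50, 60, 70],
         [12, 22, 32],
         [11, 21, 31],
         [40, 55, 65]]
sig = mean_lit_signature(crown, lit_fraction=0.4)  # averages the 2 brightest pixels
```

Restricting the signature to sunlit pixels sidesteps the shadowed portions of the crown, which the abstract identifies as a major source of within-species spectral variability.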
It was not feasible to consistently classify species of shaded crowns or stressed trees, and it was necessary to create overall shaded tree and unhealthy classes to prevent these trees from corrupting the final species classification. Classification results also depend on the visibility of trees in the imagery. This was demonstrated by different visibility and shade conditions of trees between the two image dates. This effect is particularly strong in old growth stands because of variations in stem density and spacing and the range of tree heights and sizes. Old growth stands will have shaded, unhealthy, and visually or spectrally unusual trees. Excluding these and considering species classification of manually delineated trees of the normal and bright spectral classes, modest success was achieved, in the order of 78% accuracy. Hemlock and balsam were confused, and cedar classification was mostly confused by the presence of unhealthy trees. If speciation of shaded and unhealthy trees was required, overall species classification was weak. It was shown, however, that shaded and unhealthy trees could be identified well using a classification with shaded and unhealthy classes. Species classification of the remaining trees, including the unusual trees, was 67% for the 1996 imagery and 77% for the 1998 imagery. Despite the difficulties in classifying species in an old growth environment, practical solutions to issues are available and viable spectral classifications are possible. Fully automated species isolation and classification add further complications, however, and new approaches beyond simple spectral techniques need to be explored.
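The overall accuracies quoted above (78%, 67%, 77%) are the standard overall-accuracy measure: the fraction of crowns on the diagonal of a species confusion matrix. A minimal sketch with hypothetical counts (the matrix values are illustrative, not the paper's data):

```python
import numpy as np

# Hypothetical confusion matrix for three species classes:
# rows = field-surveyed ground truth, columns = classified label.
# Off-diagonal mass between rows 0 and 1 mimics the reported
# hemlock/balsam confusion; the counts themselves are invented.
labels = ["hemlock", "balsam", "cedar"]
cm = np.array([[40,  8,  2],
               [ 9, 35,  1],
               [ 3,  2, 25]])

overall_accuracy = np.trace(cm) / cm.sum()  # correct crowns / total crowns
print(f"overall accuracy: {overall_accuracy:.1%}")
```

Per-species producer's accuracy (diagonal divided by row sum) would expose exactly the hemlock/balsam and cedar/unhealthy confusions the abstract reports.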

Original language: English (US)
Pages (from-to): 175-190
Number of pages: 16
Journal: Canadian Journal of Remote Sensing
Issue number: 2
State: Published - Apr 2005
Externally published: Yes

ASJC Scopus subject areas

  • Earth and Planetary Sciences (all)


