Advancing Forest Health Monitoring: Harnessing the Power of Deep Learning Computer Vision for Remote Sensing Applications

  • Author / Creator
    Kapil, Rudraksh
  • Forests provide immense economic, ecological, and societal value, making forest health monitoring (FHM) a crucial task for guiding the conservation and management of these essential ecosystems. Drones have grown increasingly popular in this domain because they can collect high-resolution, multi-modal images over a large area of interest (AOI). Additional sensors (e.g., thermal) capture information beyond what RGB cameras alone provide, leading to a more comprehensive understanding of the AOI. However, the processing and analysis of these images have largely been done manually or with hand-crafted indices, posing a severe bottleneck in terms of the size of the AOI that can be covered and the generalizability of results to locations with dissimilar tree species. Computer vision techniques, particularly those relying on deep learning (DL), have the potential to overcome these issues and yield more effective FHM, especially when information from multiple sensors is combined. The overarching goal of this thesis is therefore to apply DL and computer vision techniques to process and analyze multi-modal drone images for FHM.
    Towards achieving this goal, first, a new workflow to generate high-quality thermal orthomosaics is proposed. Orthomosaicking removes distortions from nadir (i.e., downward-facing) images and stitches them together to produce one broader image encompassing the entire AOI. Typical thermal-only orthomosaicking workflows suffer from gaps and swirling artifacts due to the poor structure-from-motion (SfM) performance on the low-contrast, low-resolution thermal images. Instead, the proposed workflow leverages the superior SfM results from simultaneously acquired, higher-quality RGB images and performs image co-registration using a learned affine transformation, generating thermal orthomosaics that are free of these issues and precisely aligned with their RGB counterparts, without disturbing the radiometric information of the original images. Second, the focus shifts to precisely detecting individual tree crowns from the aligned RGB-thermal imagery. Shorter trees hidden in RGB images by the shadows of neighbouring larger trees become apparent in thermal images. Detecting these trees correctly is critical in many monitoring tasks; for example, bark beetles preferentially attack smaller, younger trees during their endemic population stages. To appropriately leverage both image modalities, a novel unsupervised domain adaptation (UDA) strategy is proposed to adapt an existing state-of-the-art RGB-only detection model to thermal data and to fuse the features extracted from both modalities prior to detection. The proposed method outperforms existing UDA and image-level fusion techniques without requiring any annotations for training. Finally, the vital FHM task of bark beetle attack stage classification is considered. In sufficiently large numbers, these insects pose a devastating threat to forest ecosystems by exacerbating tree mortality. Infested trees gradually show crown discoloration across four distinct "attack" stages, and effectively distinguishing between these stages over a wide area can drastically expedite the early detection of bark beetle outbreaks. Traditionally, identification is done manually by experts through helicopter surveys or inspection of collected imagery, both of which are arduous tasks. Instead, the proposed method in this thesis leverages a transfer learning technique to train a deep attack-stage classification model that distinguishes between all visible stages with near-perfect accuracy despite limited training data.

    Across all three objectives, the novel methods proposed in this thesis show significant improvements over previous state-of-the-art techniques. These results are supported by extensive experimentation on multiple datasets: for the first two objectives, a newly collected RGB-thermal drone image dataset over a forested region in central Alberta, Canada, is used; for the third, an existing bark beetle attack stage classification dataset collected from a forested region in northern Mexico is used. (Minimal illustrative code sketches of the three techniques follow below.)
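
    The first sketch illustrates the co-registration idea behind the initial objective: a 2x3 affine matrix is treated as a learnable parameter and optimized so that the warped thermal image lines up with the RGB reference. It is illustrative only and not the thesis's actual workflow; it assumes PyTorch, uses a simple edge-based similarity loss as a stand-in for whatever objective the thesis optimizes, and the function names are hypothetical.

      # Illustrative only: learn an affine transform that aligns a thermal
      # image to its RGB counterpart. The edge-based loss is an assumed
      # stand-in for the actual alignment objective.
      import torch
      import torch.nn.functional as F

      def edge_map(img):
          # Gradient magnitude, a modality-robust alignment signal.
          gx = img[..., :, 1:] - img[..., :, :-1]
          gy = img[..., 1:, :] - img[..., :-1, :]
          return F.pad(gx.abs(), (0, 1)) + F.pad(gy.abs(), (0, 0, 0, 1))

      def register_thermal_to_rgb(thermal, rgb_gray, iters=500, lr=1e-3):
          # thermal, rgb_gray: (1, 1, H, W) tensors scaled to [0, 1].
          theta = torch.tensor([[1.0, 0.0, 0.0],
                                [0.0, 1.0, 0.0]], requires_grad=True)  # identity init
          opt = torch.optim.Adam([theta], lr=lr)
          target = edge_map(rgb_gray)
          for _ in range(iters):
              grid = F.affine_grid(theta.unsqueeze(0), list(thermal.shape),
                                   align_corners=False)
              warped = F.grid_sample(thermal, grid, align_corners=False)
              loss = F.mse_loss(edge_map(warped), target)
              opt.zero_grad()
              loss.backward()
              opt.step()
          # The learned theta changes geometry only; thermal pixel values are
          # resampled, not rescaled, so radiometric information is preserved.
          return theta.detach()

    In use, the learned transform would be applied to the full thermal mosaic with grid_sample, keeping the RGB SfM geometry as the reference frame.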
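
    For the second objective, the abstract describes fusing features extracted from RGB and thermal inputs before detection. The sketch below shows one simple way such mid-level fusion could look; the two-stream encoder, the concatenation-plus-1x1-convolution fusion, and all layer sizes are assumptions, and the UDA component (adapting the RGB-only detector to thermal without labels) is not shown.

      # Illustrative two-stream RGB-thermal feature fusion ahead of a
      # detection head (assumed architecture, not the thesis model).
      import torch
      import torch.nn as nn

      class DualStreamFusion(nn.Module):
          def __init__(self, channels=256):
              super().__init__()
              # Separate lightweight encoders, one per modality.
              self.rgb_encoder = nn.Sequential(
                  nn.Conv2d(3, channels, 3, stride=2, padding=1), nn.ReLU(),
                  nn.Conv2d(channels, channels, 3, stride=2, padding=1), nn.ReLU(),
              )
              self.thermal_encoder = nn.Sequential(
                  nn.Conv2d(1, channels, 3, stride=2, padding=1), nn.ReLU(),
                  nn.Conv2d(channels, channels, 3, stride=2, padding=1), nn.ReLU(),
              )
              # Fuse by concatenation followed by a 1x1 projection.
              self.fuse = nn.Conv2d(2 * channels, channels, 1)

          def forward(self, rgb, thermal):
              f_rgb = self.rgb_encoder(rgb)            # (N, C, H/4, W/4)
              f_thermal = self.thermal_encoder(thermal)
              fused = self.fuse(torch.cat([f_rgb, f_thermal], dim=1))
              return fused  # would feed a detection head for tree crowns

      # Example usage with dummy tensors:
      model = DualStreamFusion()
      rgb = torch.randn(2, 3, 256, 256)
      thermal = torch.randn(2, 1, 256, 256)
      print(model(rgb, thermal).shape)  # torch.Size([2, 256, 64, 64])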
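
    Finally, the attack-stage classifier is described as being trained via transfer learning with limited data. A common pattern for this, sketched below under assumed details (the backbone choice, the frozen layers, and the four stage labels are all illustrative, not taken from the thesis), is to reuse an ImageNet-pretrained backbone and train only a new classification head.

      # Illustrative transfer learning setup for a four-stage classifier
      # (assumed backbone and hyperparameters, not the thesis pipeline).
      import torch
      import torch.nn as nn
      from torchvision import models

      NUM_STAGES = 4  # illustrative crown-discoloration stages

      def build_classifier(freeze_backbone: bool = True) -> nn.Module:
          model = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V2)
          if freeze_backbone:
              for p in model.parameters():
                  p.requires_grad = False  # keep pretrained features; helps with scarce data
          model.fc = nn.Linear(model.fc.in_features, NUM_STAGES)  # new, trainable head
          return model

      model = build_classifier()
      optimizer = torch.optim.Adam(
          [p for p in model.parameters() if p.requires_grad], lr=1e-4)
      criterion = nn.CrossEntropyLoss()

      # One illustrative training step on dummy data:
      images = torch.randn(8, 3, 224, 224)
      labels = torch.randint(0, NUM_STAGES, (8,))
      optimizer.zero_grad()
      loss = criterion(model(images), labels)
      loss.backward()
      optimizer.step()

    Freezing the backbone keeps the number of trainable parameters small, which is one standard way to avoid overfitting when only limited labelled imagery is available.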

  • Subjects / Keywords
  • Graduation date
    Fall 2023
  • Type of Item
    Thesis
  • Degree
    Master of Science
  • DOI
    https://doi.org/10.7939/r3-nnb4-bd55
  • License
    This thesis is made available by the University of Alberta Libraries with permission of the copyright owner solely for non-commercial purposes. This thesis, or any portion thereof, may not otherwise be copied or reproduced without the written consent of the copyright owner, except to the extent permitted by Canadian copyright law.