UAVs and Machine Learning to Advance Antarctic Vegetation Monitoring

In a paper published in the journal Sensors, researchers highlighted the vulnerability of East Antarctic vegetation, mainly moss and lichen, to climate change and ozone depletion. They introduced an innovative workflow utilizing deep learning and machine learning (ML) techniques, specifically extreme gradient boosting (XGBoost) and U-Net, to analyze high-resolution multispectral imagery collected from Antarctic Specially Protected Area (ASPA) 135 near Casey Station.

Ground image of the study area depicting the distribution of moss and lichen vegetation. Study: https://www.mdpi.com/1424-8220/24/4/1063

Results showed robust performance by XGBoost and enhanced accuracy with U-Net. These approaches offer a promising non-invasive monitoring solution for Antarctic ecosystems and emphasize the potential of unmanned aerial vehicles (UAVs) and artificial intelligence (AI) in remote sensing applications.

Related Work

Previous studies have underscored the significance of monitoring Antarctica's terrestrial ecosystems, which are dominated by moss and lichen vegetation, amidst extreme climate conditions. The Windmill Islands coastline in East Antarctica hosts extensive moss forests that experience diverse environmental stressors. While traditional remote sensing methods have been pivotal, the emergence of UAV-mounted advanced sensors offers unprecedented detail, revolutionizing monitoring techniques. However, few studies have integrated UAVs and AI for vegetation mapping in Antarctica, particularly using multispectral imagery.

ML Classifier Training and Verification

The training phase for the ML classifiers began with preparing training data, followed by implementation in Python 3.8.10. Researchers utilized various libraries for data processing and ML tasks, including the Geospatial Data Abstraction Library (GDAL) 3.0.2, XGBoost 1.5.0, Scikit-learn 0.24.2, the Open Source Computer Vision Library (OpenCV) 4.6.0.66, and Matplotlib 3.8.2. These libraries facilitated tasks ranging from geospatial data handling to computer vision and ML algorithms.
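
As a rough illustration of how these libraries fit together, the sketch below reads a multispectral region-of-interest raster into a NumPy array with GDAL. The file name and band layout are assumptions for illustration, not details taken from the paper.

```python
# Minimal sketch: read a multispectral GeoTIFF into a NumPy array with GDAL.
# The file name and band count are hypothetical placeholders.
import numpy as np
from osgeo import gdal

dataset = gdal.Open("roi_multispectral.tif")                 # hypothetical ROI raster
image = dataset.ReadAsArray()                                # shape: (bands, rows, cols)
image = np.transpose(image, (1, 2, 0)).astype(np.float32)    # reorder to (rows, cols, bands)
print("Loaded ROI with shape:", image.shape)
```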

The U-Net model was trained in Google Colab, leveraging an NVIDIA T4 graphics processing unit (GPU). The proposed processing pipeline for ML model training integrates both the XGBoost and U-Net classifiers. XGBoost, known for its efficiency and flexibility, underwent model training and fine-tuning using region of interest (ROI) multispectral and mask files. The process involved loading the ROI, calculating spectral indices to enhance classification accuracy, and tuning hyperparameters to optimize model performance. Feature importance analysis was then performed on the trained XGBoost model, which was validated against test data to ensure effectiveness.
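
Continuing from the loading sketch above, the following snippet illustrates one way the per-pixel XGBoost stage could look: computing an example spectral index (NDVI is shown; the paper's full index set is not reproduced here), flattening the labelled pixels, tuning hyperparameters with a grid search, and inspecting feature importances. Band positions, the label raster name, and the parameter grid are all assumptions.

```python
# Hedged sketch of a per-pixel XGBoost workflow; band order, file names, and
# the hyperparameter grid are illustrative assumptions, not the authors' settings.
import numpy as np
import xgboost as xgb
from osgeo import gdal
from sklearn.model_selection import GridSearchCV

mask = gdal.Open("roi_mask.tif").ReadAsArray()         # hypothetical per-pixel class labels
red, nir = image[..., 2], image[..., 4]                # assumed red/NIR band positions
ndvi = (nir - red) / (nir + red + 1e-6)                # example spectral index
features = np.dstack([image, ndvi])                    # stack raw bands with the index

X = features.reshape(-1, features.shape[-1])
y = mask.reshape(-1)
labelled = y >= 0                                      # assume negative values mark unlabelled pixels
X, y = X[labelled], y[labelled]

param_grid = {"max_depth": [4, 6, 8], "n_estimators": [200, 400]}
search = GridSearchCV(xgb.XGBClassifier(learning_rate=0.1), param_grid, cv=3)
search.fit(X, y)
print(search.best_params_)
print(search.best_estimator_.feature_importances_)     # feature importance analysis
```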

The U-Net model, characterized by its U-shaped encoder-decoder structure, underwent similar training and fine-tuning procedures. The process included cropping the collected data into tiles, designing a custom U-Net architecture, defining loss functions for evaluation, and optimizing hyperparameters for model training. Table 5 of the paper outlines the key parameters and configurations employed during model development.
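
The paper's exact architecture and framework are not detailed here, but a compact Keras sketch of a U-shaped encoder-decoder of this kind is shown below; the tile size, channel count, network depth, filter counts, and number of classes are illustrative assumptions.

```python
# A compact U-Net-style encoder-decoder in Keras, shown only to illustrate the
# structure described above; sizes and filter counts are illustrative assumptions.
import tensorflow as tf
from tensorflow.keras import layers, Model

def conv_block(x, filters):
    x = layers.Conv2D(filters, 3, padding="same", activation="relu")(x)
    return layers.Conv2D(filters, 3, padding="same", activation="relu")(x)

def build_unet(input_shape=(256, 256, 6), num_classes=4):
    inputs = tf.keras.Input(shape=input_shape)
    c1 = conv_block(inputs, 32)                                          # encoder level 1
    p1 = layers.MaxPooling2D()(c1)
    c2 = conv_block(p1, 64)                                              # encoder level 2
    p2 = layers.MaxPooling2D()(c2)
    b = conv_block(p2, 128)                                              # bottleneck
    u2 = layers.Conv2DTranspose(64, 2, strides=2, padding="same")(b)
    c3 = conv_block(layers.concatenate([u2, c2]), 64)                    # decoder + skip connection
    u1 = layers.Conv2DTranspose(32, 2, strides=2, padding="same")(c3)
    c4 = conv_block(layers.concatenate([u1, c1]), 32)                    # decoder + skip connection
    outputs = layers.Conv2D(num_classes, 1, activation="softmax")(c4)
    return Model(inputs, outputs)

model = build_unet()
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])
```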

Both algorithms were verified using evaluation metrics such as precision, recall, F1-score, and intersection over union (IoU). The trained models were applied to test datasets, and K-fold cross-validation was employed to assess their ability to handle new, unseen data. Finally, the prediction phase involved running inference on individual tiles and then stitching the resulting predicted tiles together to generate final segmented maps for both the XGBoost and U-Net models.
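
A hedged sketch of this evaluation step, computing per-class precision, recall, F1-score, and IoU with Scikit-learn, is shown below; the random placeholders stand in for a held-out tile's ground-truth and predicted masks and are not taken from the study.

```python
# Illustrative evaluation sketch: per-class precision, recall, F1, and IoU (Jaccard)
# computed on flattened ground-truth and predicted masks.
import numpy as np
from sklearn.metrics import precision_recall_fscore_support, jaccard_score

# Hypothetical placeholders for a held-out tile's labels and predictions.
mask_test = np.random.randint(0, 4, size=(256, 256))
pred_test = np.random.randint(0, 4, size=(256, 256))

y_true, y_pred = mask_test.ravel(), pred_test.ravel()
precision, recall, f1, _ = precision_recall_fscore_support(y_true, y_pred, average=None)
iou = jaccard_score(y_true, y_pred, average=None)
print("Precision:", precision, "Recall:", recall, "F1:", f1, "IoU:", iou)
```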

Advancing UAV-Based Ecological Monitoring

This study aimed to validate a methodology for assessing moss health and lichen coverage in Antarctica using multispectral imagery captured by UAVs and ML classifiers. By employing ML techniques like XGBoost and U-Net, the research focused on segmenting moss health classes and lichen across Antarctic moss beds. The findings highlighted the effectiveness of XGBoost in accurately classifying moss health and lichen, achieving an F1-score of 89%. Researchers identified challenges such as limited training data and the resolution constraints of the multispectral camera, suggesting the need for higher-resolution imagery and more extensive ground-truth data to improve model performance.

Moreover, the study introduced a novel two-stage ensemble methodology, integrating XGBoost predictions into the U-Net model to enhance segmentation accuracy. This approach significantly improved precision, recall, F1-score, and IoU values across all classes, demonstrating the synergistic potential of combining ML algorithms for enhanced predictive performance. Despite these advancements, researchers noted limitations such as weather-dependent UAV operations and the need for more diverse training data, pointing to directions for future research and technological development in UAV-based ecological monitoring in polar environments.
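
The paper's exact ensemble mechanics are not reproduced here, but one plausible reading of such a two-stage design is sketched below: the per-pixel XGBoost class probabilities are appended to each multispectral tile as extra channels before the tile is passed to the U-Net. The function and variable names are hypothetical.

```python
# Hedged sketch of one possible two-stage ensemble: enrich each tile with the
# XGBoost class-probability maps before U-Net segmentation. Illustrative only.
import numpy as np

def add_xgb_channels(tile, xgb_model):
    """tile: (H, W, B) array; returns (H, W, B + num_classes) with probability maps appended."""
    h, w, b = tile.shape
    proba = xgb_model.predict_proba(tile.reshape(-1, b))   # per-pixel class probabilities
    return np.concatenate([tile, proba.reshape(h, w, -1)], axis=-1)

# Example usage, building on the earlier sketches:
# enriched_tile = add_xgb_channels(features[:256, :256, :], search.best_estimator_)
# The U-Net input shape would then grow to (256, 256, B + num_classes).
```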

Overall, the study underscores the transformative potential of UAV technology and ML techniques in advancing the non-invasive monitoring of polar ecosystems. By leveraging high-resolution imagery and sophisticated ML algorithms, researchers can gain valuable insights into vegetation dynamics and health, contributing to biodiversity conservation efforts and enhancing our understanding of ecosystem responses to climate change in remote and challenging environments like Antarctica. Future advancements in UAV technology and data processing methods promise to overcome current limitations and expand the scope and applicability of UAV-based ecological research in polar regions.

Conclusion

To summarize, this paper introduced a novel workflow combining UAVs, multispectral and red, green, and blue (RGB) imagery, and ML to monitor the health of Antarctic vegetation. It compared two ML classifiers, XGBoost and U-Net, with XGBoost demonstrating robust classification performance. While U-Net showed moderate performance on its own, the two-stage ensemble approach (Method 2) delivered clear improvements, highlighting the influence of training samples on segmentation outcomes.

Given the scarcity of training data in Antarctica, leveraging XGBoost for initial predictions followed by deep learning models such as U-Net presents a viable strategy. Future research should focus on capturing more high-resolution RGB data for enhanced labeling and higher-resolution multispectral data for more accurate segmentation. Additionally, exploring other ML algorithms could further improve segmentation outcomes, contributing to the non-invasive monitoring of Antarctic ecosystems facilitated by UAV technology.

Journal reference:
  • Raniga, D., et al. (2024). Monitoring Antarctica’s Fragile Vegetation Using Drone-Based Remote Sensing, Multispectral Imagery and Artificial Intelligence. Sensors, 24(4), 1063. DOI: 10.3390/s24041063, https://www.mdpi.com/1424-8220/24/4/1063.


Written by

Silpaja Chandrasekar

Dr. Silpaja Chandrasekar has a Ph.D. in Computer Science from Anna University, Chennai. Her research expertise lies in analyzing traffic parameters under challenging environmental conditions. Additionally, she has gained valuable exposure to diverse research areas, such as detection, tracking, classification, medical image analysis, cancer cell detection, chemistry, and Hamiltonian walks.

