A recent study published in the journal Computers, Environment and Urban Systems demonstrates a methodology that leverages artificial intelligence (AI) and remote sensing data to enable automated, large-scale mapping of urban street trees. Because conventional field-based urban tree surveys are costly and difficult to scale, the research points toward more efficient, data-driven mapping approaches that harness geographic imagery and deep learning algorithms.
The Value of Urban Forests
Urban trees and green spaces provide invaluable ecosystem services and enhance human health and well-being in cities globally. They reduce air pollution, mitigate urban heat island effects, absorb stormwater runoff, enhance biodiversity, and provide soothing natural views that reduce stress. Trees also increase property values and promote social cohesion through community greening programs.
However, urban centers often lack comprehensive, up-to-date data on tree locations, numbers, species composition, structural characteristics, and canopy health. Field surveys to inventory urban forests require immense manual effort and are limited in scale due to the costs of sending trained personnel to map trees block-by-block across cities. As urban greening initiatives expand globally, this data deficit severely constrains scientific understanding of urban forest extent, benefits, and vulnerabilities.
Potential of Geospatial AI Innovations
Recent advances in geographic imaging, remote sensing, and AI offer immense potential to make urban forest assessment and monitoring far more cost-effective and scalable. The proliferation of public aerial and satellite imagery provides extensive overhead views of urban canopies. Global mapping firms are compiling vast databases of street-level imagery, capturing ground views along most public rights of way. Meanwhile, deep learning algorithms enable the automatic extraction of semantic information from visual data through multi-layered neural networks trained on labeled examples.
The confluence of these technologies could enable automated urban tree mapping by applying optimized AI models to diverse remote-sensing imagery. The study demonstrates this potential through a methodology that combines ground-level and overhead perspectives.
Combining Remote Sensing and Deep Learning
The study presents a methodology that integrates state-of-the-art deep convolutional neural networks (CNNs), geographic imagery databases, and multi-perspective remote sensing data to accurately detect and map urban street trees.
Two distinct CNN models are optimized through supervised training on labeled urban tree imagery: YOLOv5 for identifying trees in ground-level photos and DeepForest for detecting tree crowns in overhead imagery. Transfer learning adapts these architectures, pretrained on large generic datasets, to the urban tree mapping task.
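As a rough illustration of this setup, the sketch below loads the two open-source detectors named in the study. The fine-tuned weights file and class handling are placeholder assumptions, not the authors' exact training configuration.

```python
# Minimal sketch of instantiating the two detectors named in the study.
# The fine-tuned weights file ("street_trees.pt") is a hypothetical
# placeholder; the paper's actual training setup is not reproduced here.
import torch
from deepforest import main as deepforest_main

# Ground-level detector: a YOLOv5 model fine-tuned (transfer learning)
# on labeled street-level tree images, loaded via the Ultralytics hub.
street_model = torch.hub.load("ultralytics/yolov5", "custom",
                              path="street_trees.pt")

# Overhead detector: DeepForest ships a prebuilt tree-crown model
# trained on airborne imagery; use_release() downloads those weights.
crown_model = deepforest_main.deepforest()
crown_model.use_release()

# Run the street-level detector on one image and collect the boxes.
results = street_model("street_view.jpg")
boxes = results.pandas().xyxy[0]  # columns: xmin, ymin, xmax, ymax, confidence, name
```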
The optimized models are applied to corresponding geographic imagery. Google Street View panoramas provide detailed horizontal views of street trees from four cardinal directions at queried locations. Satellite and aerial images are tiled into windows for overhead analysis.
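On the imagery side, one plausible pipeline queries the Google Street View Static API for one image per cardinal heading at each sampling point and uses DeepForest's predict_tile() for the windowed overhead analysis. The endpoint and parameter names below belong to the real APIs; the key, image size, and window settings are illustrative assumptions.

```python
# Sketch: four cardinal ground-level views per sampling point via the
# Google Street View Static API, plus windowed overhead inference.
# API_KEY, image size, and patch settings are illustrative assumptions.
import requests
from deepforest import main as deepforest_main

API_KEY = "YOUR_KEY"  # placeholder; a real Maps Platform key is required
BASE_URL = "https://maps.googleapis.com/maps/api/streetview"

def fetch_cardinal_views(lat, lon, size="640x640", fov=90):
    """Download one Street View image per cardinal heading (N, E, S, W)."""
    images = {}
    for heading in (0, 90, 180, 270):
        resp = requests.get(BASE_URL, params={
            "size": size, "location": f"{lat},{lon}",
            "heading": heading, "fov": fov, "key": API_KEY,
        }, timeout=30)
        resp.raise_for_status()
        images[heading] = resp.content  # raw JPEG bytes
    return images

# Overhead analysis: slide a detection window across a large orthoimage.
crown_model = deepforest_main.deepforest()
crown_model.use_release()
crowns = crown_model.predict_tile(raster_path="orthophoto.tif",
                                  patch_size=400, patch_overlap=0.1)
```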
Detected trees are matched between the data sources by estimating each detection's geographic coordinates and viewing angles. This fuses the aerial canopy perspective with ground-level views to pinpoint tree locations precisely. The multi-modal approach provides localized validation through street views while enabling large-scale automated analysis.
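One way to realize this matching step is to project each street-level detection to geographic coordinates from the panorama's position and the detection's bearing, then pair it with the nearest overhead crown within a tolerance radius. The great-circle formulas below are standard; the bearing, distance estimate, and 10 m radius are illustrative assumptions rather than the paper's exact values.

```python
# Sketch of the fusion step: project a street-level detection to map
# coordinates, then match it to the nearest aerial crown detection.
import math

EARTH_R = 6_371_000.0  # mean Earth radius in metres

def project_point(lat, lon, bearing_deg, dist_m):
    """Great-circle destination point from (lat, lon) along a bearing."""
    lat1, lon1 = math.radians(lat), math.radians(lon)
    brg, d = math.radians(bearing_deg), dist_m / EARTH_R
    lat2 = math.asin(math.sin(lat1) * math.cos(d)
                     + math.cos(lat1) * math.sin(d) * math.cos(brg))
    lon2 = lon1 + math.atan2(math.sin(brg) * math.sin(d) * math.cos(lat1),
                             math.cos(d) - math.sin(lat1) * math.sin(lat2))
    return math.degrees(lat2), math.degrees(lon2)

def haversine_m(p, q):
    """Distance in metres between two (lat, lon) pairs."""
    (la1, lo1), (la2, lo2) = map(
        lambda t: (math.radians(t[0]), math.radians(t[1])), (p, q))
    a = (math.sin((la2 - la1) / 2) ** 2
         + math.cos(la1) * math.cos(la2) * math.sin((lo2 - lo1) / 2) ** 2)
    return 2 * EARTH_R * math.asin(math.sqrt(a))

def match_to_aerial(street_point, aerial_points, radius_m=10.0):
    """Return the closest aerial detection within radius_m, else None."""
    best = min(aerial_points, key=lambda q: haversine_m(street_point, q),
               default=None)
    if best is not None and haversine_m(street_point, best) <= radius_m:
        return best
    return None
```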
By leveraging extensive public satellite and aerial imagery together with street-level photo databases, the methodology enables wide-area urban tree mapping at a fraction of the cost of manual surveying. It also facilitates continuous monitoring as updated remote sensing feeds become available over time.
Rigorous Validation Across Diverse Urban Environments
The methodology was extensively validated across nearly 60 hectares of heterogeneous urban terrain in Lleida, Spain, spanning varied street layouts, building configurations, and tree densities. The models were optimized by tuning parameters such as grid spacing, detection radius, and image resolution to maximize accuracy against ground-truth data.
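In the spirit of that tuning, one could sweep candidate values of a single parameter, say the matching radius, and keep whichever maximizes F1 against surveyed tree positions. This toy sweep reuses match_to_aerial() from the earlier sketch; the candidate radii and sample coordinates (placed near Lleida) are assumptions for illustration only.

```python
# Hypothetical parameter sweep: choose the matching radius that
# maximizes F1 against ground truth. Reuses match_to_aerial() from
# the previous sketch; radii and coordinates are illustrative.
def f1_at_radius(predicted, ground_truth, radius_m):
    matched, tp = set(), 0
    for p in predicted:
        hit = match_to_aerial(p, [g for g in ground_truth if g not in matched],
                              radius_m=radius_m)
        if hit is not None:
            matched.add(hit)
            tp += 1
    precision = tp / len(predicted) if predicted else 0.0
    recall = tp / len(ground_truth) if ground_truth else 0.0
    denom = precision + recall
    return 2 * precision * recall / denom if denom else 0.0

preds = [(41.6176, 0.6200), (41.6180, 0.6210)]  # toy detected positions
truth = [(41.6176, 0.6201), (41.6191, 0.6230)]  # toy surveyed positions
best_radius = max((3.0, 5.0, 8.0, 10.0),
                  key=lambda r: f1_at_radius(preds, truth, r))
```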
Key Results Demonstrating Transformative Potential
The optimized models achieved 79% accuracy in detecting visible trees from aerial imagery and 62% accuracy in pinpointing tree locations within street-level panoramas. These results, obtained entirely from existing geographic imagery and open-source AI architectures, underscore the feasibility of large-scale automated urban tree mapping.
Towards Intelligent Urban Forest Management
By enabling comprehensive, scalable, and dynamic urban tree mapping, the study demonstrates the immense potential of geospatial and AI innovations to provide actionable intelligence guiding data-driven urban forest planning, management, and stewardship.
The resulting insights can help prioritize investment in targeted tree planting, preservation, and maintenance where need and impact are the greatest. Automated mapping can also help cities set measurable canopy cover goals and quantify progress towards climate, equity, and sustainability targets.
Future Outlook
As geographic data accuracy and AI techniques continue to advance, the presented methodology could become a standard worldwide for building smarter, more resilient urban forests. Future research could focus on detecting occluded trees using lidar and hyperspectral data and on improving species classification capabilities. The fusion of multiple data sources, AI methods, and validation mechanisms provides a roadmap for an integrated digital ecosystem supporting sustainable, equitable urban forests globally.
Journal reference:
- Velasquez-Camacho, L., Etxegarai, M., & de-Miguel, S. (2023). Implementing Deep Learning algorithms for urban tree detection and geolocation with high-resolution aerial, satellite, and ground-level images. Computers, Environment and Urban Systems, 105, 102025. https://doi.org/10.1016/j.compenvurbsys.2023.102025