In a paper published in the journal Humanities and Social Sciences Communications, researchers analyzed 30,000 responses from Chat Generative Pre-trained Transformer (ChatGPT) to an interview about ecological restoration. The findings revealed a heavy reliance on male academics affiliated with United States (US) universities, overlooking evidence from lower-income countries and Indigenous restoration experiences.
The chatbot focused primarily on tree planting and reforestation, neglecting non-forest ecosystems and species diversity. The study underscores how artificial intelligence (AI)-driven knowledge production reinforces Western science biases. The researchers stress the need for safeguards in chatbot development so that the global environmental crisis can be addressed inclusively.
Related Work
Past research underscores the growing impact of AI on global conservation efforts, with AI-driven techniques increasingly used to enhance environmental monitoring and management. However, concerns persist regarding potential biases and misrepresentations in AI innovations, particularly in the context of conservation science. These concerns include amplifying existing inequalities in decision-making processes and perpetuating misleading information, hindering effective conservation strategies. Addressing these challenges is essential to ensure that AI technologies contribute equitably to worldwide conservation efforts.
ChatGPT Ecological Restoration Analysis
This study comprehensively analyzed ChatGPT's responses to a 30-question interview, focusing on the distributive, recognition, procedural, and epistemic justice dimensions of AI-generated information about ecological restoration. The researchers drew on the international principles and standards for ecological restoration to formulate questions across three thematic areas: knowledge systems, stakeholder engagement, and technical approaches.
Data collection involved posing each question 1,000 times, yielding a dataset of 30,000 answers collected from June to November 2023. The responses were processed with ATLAS.ti for Mac (version 22.0.6.0), a qualitative data analysis package; the analysis examined factors such as geographical representation, expertise validation, organizational engagement, and the sentiment associated with technical approaches.
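The repeated-questioning design described above can be sketched as follows. This is a minimal illustration, not the authors' code: `ask_chatbot` is a hypothetical stand-in for a real chatbot API call and simply returns a canned string so the sketch runs offline.

```python
def ask_chatbot(question: str) -> str:
    """Hypothetical placeholder for a real chatbot API call."""
    return f"Answer to: {question}"

def collect_responses(questions, repetitions=1000):
    """Pose each question `repetitions` times, keeping every answer."""
    dataset = []
    for q in questions:
        for run in range(repetitions):
            dataset.append({"question": q, "run": run,
                            "answer": ask_chatbot(q)})
    return dataset

# A 30-question interview repeated 1,000 times yields 30,000 records.
demo = collect_responses([f"Q{n}" for n in range(30)], repetitions=1000)
print(len(demo))  # 30000
```

Collecting many repetitions per question lets the analysis capture variability in the chatbot's answers rather than a single arbitrary response.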
The knowledge systems analysis focused on understanding how ChatGPT incorporates diverse dimensions of restoration knowledge, including experts, affiliations, literature, experiences, and projects. Researchers assessed the geographical representation, expertise validation, and distribution of the mentioned countries based on income level and region-based categories. They analyzed stakeholder engagements to recognize influential organizations and assess community-led involvement through social network analysis.
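The income-level breakdown of country mentions can be illustrated with a small tally. This is a sketch under assumptions: the `INCOME_GROUP` mapping below is a tiny hypothetical sample in the style of World Bank income categories, not the paper's actual country classification.

```python
from collections import Counter

# Hypothetical sample mapping; the study's full country-to-income
# classification is not reproduced here.
INCOME_GROUP = {
    "United States": "high",
    "Canada": "high",
    "Australia": "high",
    "Brazil": "upper-middle",
    "India": "lower-middle",
    "Ethiopia": "low",
}

def tally_by_income(mentions):
    """Count how often countries in each income group are mentioned."""
    counts = Counter()
    for country in mentions:
        counts[INCOME_GROUP.get(country, "unknown")] += 1
    return counts

mentions = ["United States", "United States", "Canada", "Ethiopia"]
print(tally_by_income(mentions))  # high-income countries dominate
```

Aggregating mentions this way makes over- and under-representation across income groups directly countable.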
Furthermore, the technical approaches analysis examined ChatGPT's treatment of ecosystem diversity, plant life forms, restoration approaches, and environmental outcomes. It employed sentiment analysis to gauge the sentiments associated with each technical approach and its ecological consequences.
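One simple way to score sentiment for a technical approach is a lexicon-based count, sketched below. The word lists are hypothetical toy examples; the study's actual sentiment analysis method is not specified here, so this should be read only as an illustration of the general idea.

```python
# Toy sentiment lexicons (hypothetical, for illustration only).
POSITIVE = {"restore", "improve", "benefit", "resilient"}
NEGATIVE = {"degrade", "loss", "risk", "invasive"}

def sentiment_score(text: str) -> int:
    """Return (positive hits - negative hits) for a piece of text."""
    score = 0
    for word in text.lower().split():
        w = word.strip(".,;:")
        if w in POSITIVE:
            score += 1
        elif w in NEGATIVE:
            score -= 1
    return score

print(sentiment_score("Tree planting can restore and improve soils."))  # 2
```

Applied across thousands of responses, such scores let researchers compare how favorably the chatbot frames different restoration interventions.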
ChatGPT's Geographic Knowledge Bias
ChatGPT's responses on ecological restoration expertise reflect a significant reliance on sources from the Global North, particularly the United States, Europe, Canada, and Australia, with limited representation from low- and lower-middle-income countries. Geographical analysis reveals disparities, with high-income countries being overrepresented compared to their lower-income counterparts.
Moreover, the chatbot provides little information from countries with official restoration pledges, particularly those in Africa. Additionally, it relies heavily on content produced by male researchers, predominantly from the United States, and inaccuracies were noted in how experts were represented. These findings underscore the need for more inclusive and accurate sourcing of restoration knowledge in AI-driven systems such as ChatGPT.
Community-Led Restoration Overlooked
ChatGPT primarily focuses on well-established international organizations and government agencies from high-income nations, overshadowing Indigenous and community-led restoration efforts. Only a small fraction (2%) of the listed organizations represent Indigenous and community initiatives, and these are often marginalized within the restoration network analysis. ChatGPT portrayed these grassroots efforts generically, lacking specific details and context about their diverse engagements and experiences across different landscapes.
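The 2% figure above corresponds to a simple proportion over the organization types ChatGPT mentions. The sketch below is a hypothetical illustration of that computation; the category labels are assumptions, not the study's coding scheme.

```python
def community_share(org_types):
    """Fraction of listed organizations that are Indigenous or community-led."""
    community = sum(1 for t in org_types if t in {"indigenous", "community"})
    return community / len(org_types)

# Hypothetical list: 2 community-led groups among 100 organizations.
orgs = ["indigenous"] * 2 + ["international", "government"] * 49
print(community_share(orgs))  # 0.02
```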
Restoration Bias Analysis
ChatGPT exhibits biases towards North American and European sources, emphasizing tree planting and forest-focused restoration interventions while overlooking holistic techniques and diverse ecosystems. This bias perpetuates environmental injustices by reinforcing power imbalances in conservation decision-making and knowledge production.
Furthermore, the chatbot pays little attention to Indigenous and community-led restoration efforts and underplays the importance of non-forest ecosystems and non-tree plant species. These limitations highlight the urgent need for more inclusive and accurate representation in AI-generated restoration information to address colonial legacies and promote equitable conservation practices.
Ethical AI Conservation
Ensuring responsible contributions from AI chatbots to just conservation requires urgent measures to prioritize ethical practices, including disclosing sources and authorship and adopting decolonial formulations that embrace diverse histories and worldviews. Negotiating knowledge systems within digital practices should account for gender, race, and ethnicity, drawing on insights from co-production mechanisms for inclusive knowledge sharing.
Additionally, addressing issues of data access, control, and ownership, particularly regarding community and Indigenous knowledge, demands reworking data sourcing and modeling around specific contexts and demands. These efforts challenge existing ethical approaches to data governance, necessitating safeguards as large language models expand, to promote transparency and accountability in embracing environmental justice perspectives.
Conclusion
To sum up, ensuring responsible contributions from AI chatbots to just conservation necessitates urgent action to prioritize ethical practices. This includes disclosing sources and authorship, adopting decolonial formulations, and accounting for gender, race, and ethnicity in negotiating knowledge systems.
Addressing issues of data access, control, and ownership, especially concerning community and Indigenous knowledge, requires reworking data sourcing and modeling around specific contexts and demands. These efforts challenge existing ethical approaches to data governance and emphasize the need for safeguards in expanding large language models, so that transparency, accountability, and environmental justice perspectives are upheld.
Journal reference:
- Urzedo, D., Sworna, Z. T., Hoskins, A. J., & Robinson, C. J. (2024). AI chatbots contribute to global conservation injustices. Humanities and Social Sciences Communications, 11(1), 1–8. https://doi.org/10.1057/s41599-024-02720-3, https://www.nature.com/articles/s41599-024-02720-3.