As AI becomes essential in newsrooms, this study uncovers how closer collaboration between journalists and AI technologists can enhance news production while ensuring editorial control and ethical standards remain intact.
Research: "It Might be Technically Impressive, But It's Practically Useless to Us": Practices, Challenges, and Opportunities for Cross-Functional Collaboration around AI within the News Industry
*Important notice: arXiv publishes preliminary scientific reports that are not peer-reviewed and, therefore, should not be regarded as definitive, used to guide development decisions, or treated as established information in the field of artificial intelligence research.
In an article recently posted on the arXiv preprint* server, researchers conducted a detailed investigation of the integration of artificial intelligence (AI) into journalistic workflows within news organizations. They aimed to understand current practices, challenges, and opportunities for collaboration between journalists and AI professionals, with a specific focus on Chinese newsrooms.
Through interviews with journalists, AI technologists, and AI workers from leading Chinese news organizations, the study explored how these collaborations work and offered recommendations to enhance future teamwork in the news industry. It also discussed the unique challenges faced by Chinese newsrooms, including concerns over editorial autonomy and outsourcing to technology companies.
Background
The increasing use of AI across industries has transformed business operations, including news production. AI can potentially change how news is created, distributed, and consumed. A 2023 global survey by the London School of Economics found that over 75% of news organizations in 46 countries now use AI in their processes. Technologies like computer vision (CV) and natural language processing (NLP) are applied at different stages of news production and distribution.
However, integrating AI into newsrooms requires collaboration between journalists and AI experts, which is often challenging due to their different backgrounds. This integration has also raised ethical concerns, such as how AI systems may affect fairness, transparency, and journalistic integrity. Additionally, challenges such as limited shared expertise, misaligned goals, and power imbalances highlight the need for effective collaboration to ensure AI systems align with journalistic principles.
About the Research
In this paper, the authors conducted a two-stage study involving 26 professionals from five major news organizations. Participants included 17 journalists, six AI technologists, and three AI workers (primarily data annotators), all actively collaborating on AI-related tasks within their organizations. Because these organizations had already established teams of AI researchers, algorithm engineers, data scientists, and journalists, they offered a suitable setting for exploring perceptions of collaboration and how it might be improved.
The research used semi-structured interviews and workshops to gather insights into current collaboration strategies and identify areas for improvement. A key focus was addressing the knowledge gap between these groups, which was seen as a critical factor in successful collaboration. In the first stage, interviews were conducted to understand participants' experiences and perceptions of collaboration, uncovering the challenges and opportunities involved.
In the second stage, three workshops were held, during which participants worked together to imagine more effective and equitable collaboration methods. These workshops also highlighted the often-overlooked contributions of AI workers, whose roles are critical yet underappreciated. They discussed ways to bridge the knowledge gap between journalists and AI experts, encouraging productive teamwork.
Furthermore, collaboration dynamics were assessed at different stages of AI-driven news production. Three main stages were identified: defining the task, designing the tool, and applying it in the newsroom.
Initially, journalists proposed task requirements, while AI professionals translated these into technical goals. During the design phase, AI experts led development and solicited feedback from journalists, but journalists were not deeply involved, which raised concerns about misaligned expectations and AI tools failing to meet practical journalistic needs. In the final stage, journalists used the AI tools and evaluated their effectiveness in news production.
Key Findings
This work revealed several key findings about collaboration between journalists and AI professionals. First, the complexity of AI and concerns about losing editorial control through outsourcing led many newsrooms to hire their own AI experts and form internal collaboration teams. Both journalists and AI professionals recognized the need for close teamwork due to AI's growing influence on news production.
Second, the study emphasized the importance of closing the knowledge gap between journalists and AI experts. Formal activities such as workshops and lectures helped both groups better understand each other's fields. The study also found that informal methods, such as using analogies to explain complex concepts, were crucial in fostering collaboration.
Third, the researchers highlighted the need for more inclusive collaboration. AI workers, who are often marginalized as "invisible labor," were shown to be vital contributors to AI-driven journalism and should be better integrated into collaboration efforts. More involvement of these workers would not only improve collaboration but also the overall outcomes of AI integration.
The study also pointed out that AI-driven news production requires journalists to develop new skills and adapt to AI-driven workflows. AI tools can free journalists to focus on tasks like investigative reporting and editorial decisions, but this shift demands significant changes in how journalists work and raises concerns about the quality of AI-generated content, which may not always meet journalistic standards.
Applications
By improving collaboration between journalists and AI experts, newsrooms can develop AI tools that enhance various stages of news production, such as content generation, audience analysis, and curation. These tools can improve efficiency by automating repetitive tasks, like data analysis. They can also help journalists better understand audience behavior, allowing them to tailor content to meet reader preferences.
The recommendations for more inclusive collaboration can help newsrooms promote innovation and efficiency. Adopting these strategies can improve AI integration and enhance the overall news production process. Additionally, AI can promote diversity and inclusion by providing tools that support underrepresented groups in the newsroom. This could be particularly important in fostering greater equity for AI workers.
Conclusion
In summary, integrating AI into the news industry offers both opportunities and challenges. Effective collaboration between journalists and AI experts is essential to develop AI-driven news systems that align with ethical and journalistic standards. The authors highlighted the importance of filling the knowledge gap between different roles and promoting teamwork. Moreover, addressing power imbalances and creating clear communication channels between journalists, AI technologists, and AI workers is crucial to ensure successful collaboration.
The findings have significant implications for the future of journalism. As AI continues transforming the industry, collaboration between journalists and AI professionals is key to creating transparent, accountable, and equitable news systems. By following the recommendations, news organizations can improve AI integration and streamline news production and distribution.
Future work should continue exploring how AI impacts newsroom workflows and investigate further how underrepresented workers, such as AI annotators, can be more fully included in the collaboration process. Overall, this research provided a strong foundation for further investigation into the challenges and opportunities of AI in the news industry.
Journal reference:
- Preliminary scientific report.
Xiao, Q., et al. "It Might be Technically Impressive, But It's Practically Useless to Us": Practices, Challenges, and Opportunities for Cross-Functional Collaboration around AI within the News Industry. arXiv, 2024. DOI: 10.48550/arXiv.2409.12000, https://arxiv.org/abs/2409.12000