Emotional Pathways in User Experiences: Integrating ChatGPT into Knowledge Work

A study published in the AIS eLibrary (AISeL) as part of the ICIS 2023 proceedings explored how artificial intelligence (AI) technologies such as ChatGPT are experienced by users and integrated into knowledge work. Because decentralized, open-ended large language models like ChatGPT differ radically from past enterprise systems and are rapidly gaining popularity, studying user experiences, especially those of the early adopters shaping the technology, is vital to understanding their impact.

Study: Emotional Pathways in User Experiences: Integrating ChatGPT into Knowledge Work. Image credit: Generated using DALL.E.3

Through interviews with 31 ChatGPT users across knowledge sectors, this exploratory research identified distinct phases in the user experience, spotlighting their emotional dimensions. Before use, users displayed curiosity but also fear. Playful, emotion-driven tinkering then helped them explore the system’s capabilities. Next, targeted, work-centric experimentation realized ChatGPT’s potential as a collaborative assistant. Ultimately, a tight intertwinement of user and system emerged. The findings highlight AI as an increasingly autonomous teammate, with individual, organizational, and societal implications.

Pre-use Curiosity and Anxiety

Before adopting ChatGPT, users’ pivotal emotions were curiosity and anxiety. Curiosity stemmed from ChatGPT’s unexpectedly advanced capabilities, which suggested a disruptiveness comparable to the advent of the internet and enticed users to explore the technology.

However, some knowledge workers also voiced an existential fear that clients might soon prefer AI-generated outputs over human services. This anxiety triggered pre-emptive experimentation to build early expertise and remain competitive. Curiosity and fear thus jointly drove initial system exploration.

Capability Awareness and More

Early user interactions involved playful tinkering to gauge ChatGPT’s capabilities. With no predefined workflow targets, users posed varied questions without particular expectations. This undirected play unveiled the system’s strengths and limitations.

Notably, adversarial play intentionally presented boundary use cases to test the system’s constraints. Users felt clever when “cracking” ChatGPT with problematic requests that revealed its guarded design. Inquisitive play, by contrast, drove more open-ended engagement for delight: users personalized prompts about travel, gifts, and similar topics to evaluate the responses. Tinkering boosted system understanding and revealed new possibilities.

The excitement of playful discoveries led users to consider practical applications, transitioning them to purposeful work-centric experimentation with ChatGPT as an assistant.

Work-Centric Tinkering

More intentional, work-related tinkering matched emergent user needs to ChatGPT’s capabilities. Targeted trials addressed limitations around information, time, inspiration, and competencies, for instance using prompts for ideation, drafting content, and polishing existing texts.

Sophisticated, iterative prompt engineering, combined with growing self-awareness of needs, shaped a customized assistant. For example, a user struggling to name a book asked ChatGPT to suggest a title fitting their description, which met the requirement exactly. As such solutions relieved user struggles, emotional attachment strengthened.

Notably, growing tinkering proficiency reduced barriers to daily integration. Perceived usefulness prompted reliance on, and routine incorporation of, ChatGPT in knowledge workflows.

Intertwinement as Collaborative Assistant

With increased fluency, ChatGPT transformed into an indispensable colleague for users. Constant availability and prompt mastery made ChatGPT a “normal” aspect of work. Users felt less effective without this AI partner creating content, managing information, ideating, and editing. 

Multi-prompt threads evolved interactions with the system over time, with user and system mutually customizing outputs. Such tight intertwinement resembles collaboration in human teams: users shape ChatGPT’s responses, while its prompt suggestions and directions concurrently mold user behavior.

This symbiotic partnership led users to describe emotional attachment or even addiction to ChatGPT. For instance, consultants spent over 10 hours interacting with the system daily as an “irreplaceable” work component. Integration depth signifies AI as an autonomous collaborator.

Consequences of User-AI Collaboration 

The emergent vision of AI systems as conversant colleagues working alongside humans prompts crucial debates about individual, organizational, and societal consequences. Individually, tight emotional and operational intertwinement risks users embodying AI tools to the point of dependency. Potential long-term deskilling also endangers job security and workplace relevance.

Organizationally, private, confidential collaboration with AI tools prevents collective learning, an issue that grows as implementations shape workflows. Prioritizing machine teammates over human peers may additionally erode communal knowledge. Societally, generative AI relies on user participation to improve its capabilities. With business models still evolving, the exploitation of user effort, and even addiction, is concerning. Comprehending and governing AI’s unfolding integration across these levels is pivotal.

Future Outlook

This exploratory research highlights how emotional experiences shape user pathways for emerging Generative AI integration. Additional studies should expand samples to capture more reluctant adopters and management perspectives on organizational impacts.

Longitudinal tracking of early users and future ethnographies examining updated system versions can reveal evolving emotional responses and infusion consequences over time as capabilities heighten. Holistic in-practice examinations spanning individual usage, team dynamics, and leadership viewpoints will illuminate AI’s transformative yet potentially destabilizing workplace role.

As autonomous tools increasingly permeate knowledge work, rethinking simplistic technology “use” is vital. Uncovering complex experiences, fluid human-AI arrangements, and multidimensional effects will guide appropriate collaborations with AI capabilities surpassing prior expectations.

Journal reference:
  • Retkowsky, J., Hafermalz, E., & Huysman, M. (2023). From Playmate to Assistant: User Experiences of Integrating ChatGPT into Knowledge Work. Rising like a Phoenix: Emerging from the Pandemic and Reshaping Human Endeavors with Digital Technologies, ICIS 2023. 16. https://aisel.aisnet.org/icis2023/techandfow/techandfow/16

Written by

Aryaman Pattnayak

Aryaman Pattnayak is a Tech writer based in Bhubaneswar, India. His academic background is in Computer Science and Engineering. Aryaman is passionate about leveraging technology for innovation and has a keen interest in Artificial Intelligence, Machine Learning, and Data Science.
