Augmented Reality (AR) is a technology that overlays digital information, such as images, videos, or sounds, onto the real world, enhancing the user's perception and interaction with their environment. It's used in various applications, from gaming and entertainment to education, navigation, and industrial design.
This study examines the impact of different peripheral vision multiplexing configurations on augmented information detection, focusing on head-mounted displays and smart glasses. The research, involving 19 participants across three experiments, reveals that bilateral see-through setups consistently outperform unilateral configurations, offering insights for improved design in vision multiplexing technologies, especially in real-world scenarios involving mobility.
Researchers have developed the SmartGlove framework, which connects animal electronic identification (EID) with augmented reality smart glasses (ARSGs) for hands-free visualization of specific animal data in the field. This innovative approach offers potential benefits for precision livestock farming by improving data consultation and interpretation, ultimately enhancing farm management and efficiency.
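The core data flow — scan an electronic ear tag, look up the animal's record, and render a few lines of text on the glasses — can be sketched as below. The tag code, record fields, and function names are illustrative assumptions rather than the SmartGlove framework's actual API.

```python
# Minimal sketch of the EID-to-display data flow described above. The tag
# format, record fields, and function names are illustrative assumptions,
# not the SmartGlove framework's actual API.

ANIMAL_RECORDS = {
    "0981 000012345678": {"id": "Ewe 214", "breed": "Sarda",
                          "last_weight_kg": 52.4,
                          "vaccination_due": "2024-09-01"},
}

def read_eid_tag() -> str:
    """Stand-in for an RFID stick reader returning an ISO 11784/11785 code."""
    return "0981 000012345678"

def format_hud_text(record: dict) -> str:
    """Build the short text block a smart-glasses HUD would overlay in the field."""
    return (f"{record['id']} ({record['breed']})\n"
            f"Weight: {record['last_weight_kg']} kg\n"
            f"Vaccination due: {record['vaccination_due']}")

if __name__ == "__main__":
    eid = read_eid_tag()
    record = ANIMAL_RECORDS.get(eid)
    print(format_hud_text(record) if record else f"No record for tag {eid}")
```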
Researchers have introduced an innovative approach, Augmented Reality in Human-Robot Collaboration (AR-HRC), to automate construction waste sorting (CWS) and enhance the safety and efficiency of waste management. By integrating AR technology, this method allows remote human assistance and minimizes direct exposure to hazards, ultimately improving occupational safety and the quality of waste sorting processes.
This study explores the development and usability of the AIIS (Artificial Intelligence, Innovation, and Society) collaborative learning interface, a metaverse-based educational platform designed for undergraduate students. The research demonstrates the potential of immersive technology in education and offers insights and recommendations for enhancing metaverse-based learning systems.
This paper explores the potential of metaverse technology, including augmented reality (AR), virtual reality (VR), and mixed reality (MR), in the field of plant science. It discusses how extended reality (XR) technologies can transform learning, research, and collaboration in plant science while addressing the challenges and hurdles in adopting these innovative approaches.
Researchers have developed eco-friendly, flexible electret loudspeakers that offer high-quality sound and sustainability. With versatility in shape and size, these speakers open doors for immersive human-machine interactions, enhancing audio experiences in various applications.
Researchers have introduced NeRF-Det, a cutting-edge method for indoor 3D object detection using RGB images. By integrating Neural Radiance Fields (NeRF) with 3D detection, NeRF-Det significantly enhances the accuracy of object detection in complex indoor scenes, making it a promising advancement for applications in robotics, augmented reality, and virtual reality.
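The central idea — lifting 2D image features into a 3D voxel grid and letting a NeRF-style opacity estimate suppress empty space before detection — can be illustrated with a toy sketch. The camera model, tensor shapes, and opacity heuristic below are simplified assumptions, not the NeRF-Det implementation.

```python
import numpy as np

# Conceptual sketch: build a voxel feature volume from a single RGB view by
# projecting voxel centres into the image, sampling 2D features there, and
# down-weighting voxels that a NeRF-style head judges to be empty space.

H, W, C = 60, 80, 16                       # feature-map size and channels
feat_2d = np.random.rand(H, W, C)          # stand-in for a CNN feature map
K = np.array([[70.0, 0, W / 2],            # toy pinhole intrinsics
              [0, 70.0, H / 2],
              [0, 0, 1.0]])

# Voxel grid covering a small region in front of the camera (camera coords).
xs, ys, zs = np.meshgrid(np.linspace(-1, 1, 20),
                         np.linspace(-1, 1, 20),
                         np.linspace(1, 3, 20), indexing="ij")
voxels = np.stack([xs, ys, zs], axis=-1).reshape(-1, 3)      # (N, 3)

# Project voxel centres into the image plane and sample 2D features there.
proj = (K @ voxels.T).T
px = np.clip((proj[:, 0] / proj[:, 2]).round().astype(int), 0, W - 1)
py = np.clip((proj[:, 1] / proj[:, 2]).round().astype(int), 0, H - 1)
vox_feat = feat_2d[py, px]                                   # (N, C)

# NeRF-style opacity weighting: a toy projection squashed to (0, 1). In
# NeRF-Det this comes from a learned density field, so free space gets ~0 weight.
w_opacity = 1.0 / (1.0 + np.exp(-vox_feat @ np.random.randn(C)))
vox_feat = vox_feat * w_opacity[:, None]

grid = vox_feat.reshape(20, 20, 20, C)     # feature volume for a 3D detection head
print("voxel feature grid:", grid.shape)
```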
This paper presents a novel approach to pupil tracking using event camera imaging, a technology known for its ability to capture rapid and subtle eye movements. The research employs machine-learning-based computer vision techniques to enhance eye tracking accuracy, particularly during fast eye movements.
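As a rough illustration of why event streams suit fast eye movements, the sketch below accumulates a burst of synthetic events into a count image and takes the weighted centroid of the most active region as a crude pupil estimate. The paper's own method is learning-based, so this is only a conceptual baseline on made-up data.

```python
import numpy as np

# Conceptual baseline: accumulate a short window of events into a 2D count
# image and take the centroid of the strongest activity blob as a crude pupil
# estimate. A saccade produces a dense burst of events around the moving
# pupil edge, which is what event cameras capture well.

H, W = 180, 240                            # small event-sensor resolution
rng = np.random.default_rng(0)

# Synthetic event burst clustered around a "pupil" at (150, 90), plus noise.
pupil_events = rng.normal(loc=(150, 90), scale=4.0, size=(2000, 2)).astype(int)
noise_events = np.column_stack([rng.integers(0, W, 300), rng.integers(0, H, 300)])
events_xy = np.vstack([pupil_events, noise_events])
events_xy[:, 0] = np.clip(events_xy[:, 0], 0, W - 1)
events_xy[:, 1] = np.clip(events_xy[:, 1], 0, H - 1)

# Accumulate events into a count image (one temporal window).
count_img = np.zeros((H, W))
np.add.at(count_img, (events_xy[:, 1], events_xy[:, 0]), 1)

# Keep only strongly active pixels and take their intensity-weighted centroid.
mask = count_img > count_img.mean() + 3 * count_img.std()
ys, xs = np.nonzero(mask)
cx = np.average(xs, weights=count_img[ys, xs])
cy = np.average(ys, weights=count_img[ys, xs])
print(f"estimated pupil centre: ({cx:.1f}, {cy:.1f})")
```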
Researchers discuss how artificial intelligence (AI) is reshaping higher education. The integration of AI in universities, known as smart universities, enhances efficiency, personalization, and student experiences. However, challenges such as job displacement and ethical concerns demand careful attention as AI's transformative potential in education unfolds.
Researchers explore the transformative potential of ARGUS, a visual analytics tool designed to enhance the development and refinement of intelligent augmented reality (AR) assistants. By offering real-time monitoring, retrospective analysis, and comprehensive visualization, ARGUS empowers developers to understand user behavior, AI model performance, and physical environment interactions, revolutionizing the precision and effectiveness of AR assistance across diverse domains.
Researchers examine AI-powered intelligent packaging for monitoring food freshness, offering insights into global advancements. The study highlights the potential of AI-driven solutions for freshness monitoring, though challenges in sensor technology and algorithm optimization remain.
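As a toy illustration of the kind of model such packaging could feed, the sketch below classifies "fresh" versus "spoiled" from two synthetic gas-sensor readings; the features, data, and classifier choice are assumptions made for illustration, not taken from the study.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Classify "fresh" vs. "spoiled" from two gas-sensor readings (e.g. CO2 and
# volatile amines). The data is synthetic and the feature choice is an
# illustrative assumption.

rng = np.random.default_rng(1)
fresh   = rng.normal([0.4, 0.1], 0.05, size=(200, 2))   # low gas levels
spoiled = rng.normal([0.9, 0.6], 0.10, size=(200, 2))   # elevated gas levels
X = np.vstack([fresh, spoiled])
y = np.array([0] * 200 + [1] * 200)                      # 0 = fresh, 1 = spoiled

clf = LogisticRegression().fit(X, y)
sample = np.array([[0.85, 0.5]])                         # one new sensor reading
print("spoilage probability:", clf.predict_proba(sample)[0, 1].round(3))
```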
A recent study proposes a system that combines optical character recognition (OCR), augmented reality (AR), and large language models (LLMs) to revolutionize operations and maintenance tasks. By leveraging a dynamic virtual environment powered by Unity and integrating ChatGPT, the system enhances user performance, ensures trustworthy interactions, and reduces workload, providing real-time text-to-action guidance and seamless interactions between the virtual and physical realms.
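The "OCR text in, action steps out" stage can be sketched in a few lines. The paper's pipeline runs inside Unity with AR overlays, so the sketch below only shows how OCR output might be passed to OpenAI's chat completions endpoint to obtain numbered guidance; the model name and prompt are assumptions.

```python
import os
import requests

# Minimal sketch of the LLM step only: turn OCR'd equipment text into short
# numbered maintenance steps via OpenAI's chat completions endpoint. The
# model name and prompt wording are assumptions, not the paper's setup.

OCR_TEXT = "PUMP P-101  STATUS: FAULT E12  PRESSURE 8.4 BAR"

def text_to_actions(ocr_text: str) -> str:
    resp = requests.post(
        "https://api.openai.com/v1/chat/completions",
        headers={"Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}"},
        json={
            "model": "gpt-4o-mini",
            "messages": [
                {"role": "system",
                 "content": "You are a maintenance assistant. Reply with short numbered steps."},
                {"role": "user",
                 "content": f"Equipment label read by OCR: {ocr_text}. What should the technician do?"},
            ],
        },
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["choices"][0]["message"]["content"]

if __name__ == "__main__":
    print(text_to_actions(OCR_TEXT))
```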
The integration of AIoT and digital twin technology in aquaculture holds the key to revolutionizing fish farming. By combining real-time data collection, cloud computing, and AI functionalities, intelligent fish farming systems enable remote monitoring, precise fish health assessment, optimized feeding strategies, and enhanced productivity. This integration presents significant implications for the industry, paving the way for sustainable practices and improved food security.
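A toy version of the digital-twin loop — sensor readings refresh a virtual tank, and a simple rule stands in for the AI layer's feeding optimization — is sketched below; field names and thresholds are illustrative assumptions, not values from the article.

```python
from dataclasses import dataclass

# Toy digital-twin update loop: IoT sensor readings refresh a virtual tank
# state, and a simple rule adjusts the feeding plan. Fields and thresholds
# are illustrative assumptions.

@dataclass
class TankTwin:
    water_temp_c: float = 0.0
    dissolved_o2_mg_l: float = 0.0
    avg_fish_weight_g: float = 0.0

    def update(self, reading: dict) -> None:
        """Mirror the latest sensor reading into the virtual tank."""
        self.water_temp_c = reading["water_temp_c"]
        self.dissolved_o2_mg_l = reading["dissolved_o2_mg_l"]
        self.avg_fish_weight_g = reading["avg_fish_weight_g"]

    def feeding_advice(self) -> str:
        """Very simple stand-in for the AI layer's feeding optimization."""
        if self.dissolved_o2_mg_l < 5.0:
            return "reduce feeding: low dissolved oxygen stresses the stock"
        ration = 0.02 * self.avg_fish_weight_g   # roughly 2% of body weight per day
        return f"feed about {ration:.1f} g per fish today"

twin = TankTwin()
twin.update({"water_temp_c": 24.5, "dissolved_o2_mg_l": 6.8, "avg_fish_weight_g": 310})
print(twin.feeding_advice())
```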
Researchers propose novel adaptive control methods for collaborative robotic arms, reducing task completion time and mode switches while decreasing perceived workload. Through a VR study, continuous and threshold-based approaches showed promising results, highlighting the need for customizable configurations and individualized options to enhance user acceptance and usability in real-world applications.
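The two adaptation strategies can be contrasted in a short sketch: a continuous approach scales the arm's speed with the distance to the target, while a threshold-based approach switches between coarse and fine modes, with hysteresis to avoid chattering. The gains and thresholds below are illustrative assumptions, not values from the study.

```python
# Contrast of the two adaptation strategies mentioned above. All gains and
# thresholds are illustrative assumptions.

def continuous_gain(distance_m: float) -> float:
    """Speed scale shrinks smoothly as the gripper approaches the target."""
    return min(1.0, 0.2 + distance_m)            # clamp to the arm's max speed


class ThresholdController:
    """Coarse/fine mode switching with hysteresis."""

    def __init__(self, enter_fine_m: float = 0.10, exit_fine_m: float = 0.15):
        self.enter_fine_m = enter_fine_m
        self.exit_fine_m = exit_fine_m
        self.mode = "coarse"

    def gain(self, distance_m: float) -> float:
        if self.mode == "coarse" and distance_m < self.enter_fine_m:
            self.mode = "fine"
        elif self.mode == "fine" and distance_m > self.exit_fine_m:
            self.mode = "coarse"
        return 1.0 if self.mode == "coarse" else 0.2


ctrl = ThresholdController()
for d in (0.50, 0.20, 0.12, 0.08, 0.13, 0.18):
    print(f"distance {d:.2f} m -> continuous {continuous_gain(d):.2f}, "
          f"threshold {ctrl.gain(d):.2f} ({ctrl.mode})")
```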
This article explores the challenges of performing surgery in space during Moon and Mars missions and highlights advancements in surgical robotics to address these challenges. Reduced gravity, radiation exposure, and limited medical support pose unique obstacles. The development of miniaturized medical devices, robotic surgery simulations, and autonomous surgical robots, along with the application of AI, haptic sensors, minimally invasive techniques, and 3D printing, offer potential solutions.