Effects of Peripheral Vision Multiplexing Configurations on Augmented Information Detection

In a paper published in the journal Scientific Reports, researchers examined how different peripheral vision multiplexing configurations affect the detection of augmented information. This optical approach, used in recent head-mounted displays (HMDs) and smart glasses, superimposes multiple views over an observer's natural field of view to aid mobility.


The study compared unilateral opaque, unilateral see-through, and bilateral see-through setups, considering real-world factors such as target position and gaze movement for a more realistic assessment. Results consistently showed lower target detection rates in the unilateral configurations than in the bilateral setup, indicating a more pronounced impact of binocular rivalry on target visibility in the unilateral cases. However, incorporating naturalistic elements mitigated this effect, offering insights for improved vision multiplexing design and suggesting avenues for future research.

Impact of Visual Rivalry in Displays

Vision multiplexing integrates augmented information into a user’s natural field of view, notably in recent HMDs and smart glasses, aiding tasks like driving and walking. It takes two primary forms: bi-ocular vision with an opaque display on one eye and monocular vision using see-through displays. Both have practical applications, assisting those with visual field loss. Yet, this technique can trigger visual rivalry, potentially impacting peripheral display visibility, especially with moving backgrounds.

Experimental Setup and Methodology Overview

The study involved 19 subjects, including three of the authors, aged 23 to 51. Of these, 12 participated in Experiment 1 and 13 participated in Experiments 2 and 3. All subjects had normal or corrected-to-normal visual acuity and normal stereo vision, providing a consistent baseline across the experiments; the overlap of participants across experiments was due to availability.

All participants provided informed consent in compliance with ethical guidelines, and the experiments, approved by the Massachusetts Eye and Ear Human Studies Committee, adhered to the principles of the Declaration of Helsinki. The stimuli featured a simulated virtual corridor, generated with the Unity game engine, to mimic forward walking.

The corridor moved at a typical adult walking speed while peripheral targets were presented under the various multiplexing configurations. The targets, roughly 10° by 10° and larger than those used in prior studies, consisted of horizontally drifting gratings representing practical augmented information in existing multiplexing displays. Subjects reported whenever a considerable portion of the peripheral target disappeared, reflecting a substantial loss of information. Before the main experiments, participants completed a practice session to familiarize themselves with the task demands and the criteria for target disappearance.
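As a rough illustration of the kind of target described above, the minimal Python sketch below generates a horizontally drifting sinusoidal grating patch. The spatial frequency, drift rate, and patch resolution are assumed placeholder values, not parameters reported by the authors.

```python
# Minimal sketch (not the authors' code): a horizontally drifting sinusoidal
# grating patch of the kind used as the peripheral target. All numeric
# parameters below are assumptions for illustration only.
import numpy as np

def drifting_grating(t, size_px=200, cycles_per_patch=4.0, drift_hz=2.0):
    """Return a 2D luminance patch (values in [0, 1]) at time t seconds."""
    x = np.linspace(0.0, 1.0, size_px)            # normalized horizontal axis
    phase = 2.0 * np.pi * drift_hz * t            # phase advances over time -> horizontal drift
    row = 0.5 + 0.5 * np.sin(2.0 * np.pi * cycles_per_patch * x - phase)
    return np.tile(row, (size_px, 1))             # vertical bars that drift horizontally

# Example: render three frames at 60 Hz
frames = [drifting_grating(i / 60.0) for i in range(3)]
print(frames[0].shape)  # (200, 200)
```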

Experiment 1 investigated the impact of multiplexing configuration and target eccentricity on target visibility, exploring variations in peripheral display location. It was conducted under typical laboratory conditions, using a recorded 2D video of the moving corridor shown on a virtual screen in a Meta Quest 2 HMD. Experiment 2 expanded on this by examining the effects of depth conditions (2D versus 3D moving backgrounds) on target visibility across the different multiplexing configurations. The 3D condition simulated continuously varying depth cues, mimicking real-world scenarios in which background depth changes during mobility. Both experiments presented the multiplexing configurations and target eccentricities or depth conditions in randomized order to analyze their effects comprehensively.
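The randomized presentation can be illustrated with a short, hypothetical Python sketch that crosses multiplexing configuration with eccentricity (Experiment 1) or with background depth (Experiment 2) and shuffles the resulting trial list. The repetition count and seed are placeholders, not details taken from the study.

```python
# Hypothetical sketch of a randomized trial schedule, assuming a simple
# fully crossed design; repetition count and seed are placeholders.
import itertools
import random

CONFIGURATIONS = ["unilateral_opaque", "unilateral_see_through", "bilateral_see_through"]
ECCENTRICITIES_DEG = [5, 10, 15]      # eccentricities reported in the article
DEPTH_CONDITIONS = ["2D", "3D"]

def randomized_schedule(factor_a, factor_b, repeats=2, seed=0):
    """Cross two factors, repeat each cell, and shuffle the trial order."""
    cells = list(itertools.product(factor_a, factor_b)) * repeats
    rng = random.Random(seed)
    rng.shuffle(cells)
    return cells

exp1_trials = randomized_schedule(CONFIGURATIONS, ECCENTRICITIES_DEG)  # Experiment 1
exp2_trials = randomized_schedule(CONFIGURATIONS, DEPTH_CONDITIONS)    # Experiment 2
print(exp1_trials[:3])
```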

Experiment 3 studied how different eye movements affect target visibility across the multiplexing configurations in a 3D depth setting. Subjects performed central fixation, saccadic eye movement, and smooth pursuit conditions by following a fixation cross while their gaze was monitored with eye tracking. Data analysis used statistical tests to determine how these eye movements influenced target visibility across the multiplexing conditions, shedding light on the relationship between eye movements and the perception of augmented information in dynamic visual environments.
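The article does not name the specific statistical tests used. As one plausible illustration only, the hedged Python sketch below runs a two-way repeated-measures ANOVA (configuration by eye movement) on per-subject visibility scores using statsmodels; the synthetic data stand in for the study's measurements and are not real results.

```python
# Illustrative sketch only: a two-way repeated-measures ANOVA is one plausible
# analysis of configuration and eye-movement effects on visibility. The data
# below are randomly generated placeholders, NOT the study's data.
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(0)
configs = ["unilateral_opaque", "unilateral_see_through", "bilateral_see_through"]
eye_movements = ["fixation", "saccade", "smooth_pursuit"]

rows = []
for subject in range(1, 14):                  # 13 subjects, as in Experiment 3
    for cfg in configs:
        for em in eye_movements:
            rows.append({
                "subject": subject,
                "configuration": cfg,
                "eye_movement": em,
                "visibility": rng.uniform(0.4, 1.0),   # placeholder visibility score
            })
data = pd.DataFrame(rows)

# Main effects of configuration and eye movement plus their interaction,
# with subject as the repeated-measures unit.
anova = AnovaRM(data, depvar="visibility", subject="subject",
                within=["configuration", "eye_movement"]).fit()
print(anova)
```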

Experiment Findings: Target Visibility Analysis

In Experiment 1, which investigated the impact of target eccentricity and multiplexing configuration on target visibility, a significant interaction between eccentricity and configuration influenced visibility: the unilateral multiplexing configurations showed a greater decrease in target visibility. Comparing eccentricities revealed significantly higher visibility at 5° than at 10° or 15°, in line with previous research showing reduced task performance with increasing eccentricity.

Both multiplexing configuration and eccentricity significantly modulated target visibility. The bilateral see-through configuration notably outperformed unilateral configurations, emphasizing a possible influence of binocular rivalry on visibility. Additionally, lower visibility in the unilateral see-through configuration than in the opaque one suggested the role of both target contrast and monocular rivalry.

Experiment 2, which examined the impact of background depth and multiplexing configuration on target visibility, revealed no interaction effect but a significant main effect of configuration. Once again, the bilateral see-through configuration showed superior target visibility compared with the unilateral configurations, emphasizing the potential impact of binocular rivalry. Visibility also varied across depth conditions, with notably lower visibility for the bilateral see-through configuration in 3D than in 2D, indicating an influence of depth on target visibility.

In Experiment 3, which analyzed the effect of eye movements and multiplexing configuration on target visibility, the interaction between configuration and eye movement was not significant. However, configuration and eye movement each significantly affected target visibility. Once more, the bilateral see-through configuration showed better visibility, and executing saccadic or smooth pursuit eye movements increased target visibility relative to central fixation, reinforcing the influence of eye movements on the perception of augmented information.

The findings across all experiments highlight the substantial roles of multiplexing configuration, eccentricity, depth, and eye movements in determining target visibility in augmented reality settings.

Conclusion

To summarize, this exploration of vision multiplexing configurations in augmented reality during mobility highlighted the prevalence of binocular rivalry's impact on target visibility. The bilateral see-through design yielded the highest visibility but introduced depth-switching concerns, whereas the unilateral displays, while less visible, showed fewer depth-switching issues. Both types of configuration, however, raised potential concerns about inattentional blindness and distraction from rivalry suppression.

Real-world scenarios with natural eye movements may partially alleviate these challenges, yet the accommodation-vergence conflict in HMDs and the interplay of depth cues remain ongoing problems. Further research in augmented reality settings is needed to devise practical design solutions, such as multiplexing displays that resist visual rivalry or employ dynamic contrast manipulation to sustain visibility.


Written by

Silpaja Chandrasekar

Dr. Silpaja Chandrasekar has a Ph.D. in Computer Science from Anna University, Chennai. Her research expertise lies in analyzing traffic parameters under challenging environmental conditions. Additionally, she has gained valuable exposure to diverse research areas, such as detection, tracking, classification, medical image analysis, cancer cell detection, chemistry, and Hamiltonian walks.


