Augmented Reality-Enabled Human-Robot Collaboration for Construction Waste Sorting

In a recent research article published in the Journal of Environmental Management, the authors proposed integrating augmented reality into human-robot collaboration (AR-HRC) to automate construction waste sorting (CWS) and reduce the risks faced by manual workers.

Study: Augmented Reality-Enabled Human-Robot Collaboration for Construction Waste Sorting. Image credit: Vladyslav Horoshevych/Shutterstock

Background

Efficiently managing construction waste (CW) is crucial for sustainable urban development. Current manual sorting practices pose safety risks, and although robotic sorting has been explored, concerns persist about its real-world accuracy. The authors introduce AR-HRC for CW sorting, which uses AR to improve efficiency and occupational safety by enabling remote human assistance and reducing workers' direct exposure to hazards.

The HRC waste sorting model

In the HRC sorting system, robots handle the bulk of the sorting tasks while human workers assist when needed. AR technology is integrated into this system, offering visualization, communication, and ergonomic benefits: it enables workers to monitor and instruct robots, improving safety and efficiency. This approach enhances occupational safety and waste sorting quality while promoting the adoption of robotic sorting technology in the industry.

The proposed system comprises four interlinked modules: Perception, Communication, Robotic Sorting (RS), and AR. The Perception module relies on a Red, Green, Blue, and Depth (RGB-D) sensor to detect waste objects and gather information such as their location and category. This data falls into two levels: the first includes both location and waste category, allowing fully automatic sorting, while the second offers only location information, requiring manual operator intervention.
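To make the two-level split concrete, here is a minimal Python sketch of how such detections might be routed; the class and field names are illustrative assumptions, not the paper's actual interface.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class WasteDetection:
    """One object reported by the RGB-D perception module (hypothetical schema)."""
    position: Tuple[float, float, float]   # (x, y, z) in the robot's base frame, metres
    category: Optional[str] = None          # e.g. "plastic"; None if classification failed

def route_detection(det: WasteDetection) -> str:
    """Level 1 (location + category) is sorted automatically;
    level 2 (location only) is escalated to the AR operator."""
    if det.category is not None:
        return "auto_sort"        # robot picks and places without help
    return "request_instruction"  # operator labels the object via the AR headset

# Example: a detection with no category falls back to the human worker.
print(route_detection(WasteDetection(position=(0.42, -0.10, 0.03))))  # request_instruction
```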

The RS module employs a robotic arm with a vacuum gripper to execute waste sorting tasks based on information provided by the Perception module. Precise calibration ensures the accuracy of object recognition, and the system focuses on 2D planar grasping for simplicity.
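The grasping step can be illustrated with a short sketch: back-projecting a detected pixel and its depth reading into a 3D pick point via the standard pinhole camera model, with the straight-down approach that 2D planar grasping implies. The intrinsics and values below are hypothetical, not taken from the paper.

```python
import numpy as np

def pixel_to_planar_grasp(u, v, depth_m, fx, fy, cx, cy):
    """Back-project a pixel (u, v) with measured depth into a 3D point in the
    camera frame using the pinhole model. Intrinsics fx, fy, cx, cy come from
    camera calibration (e.g. the depth sensor's factory calibration)."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    z = depth_m
    # 2D planar grasping: the gripper approaches straight down onto the plane,
    # so only (x, y) and an optional in-plane rotation matter for the pick.
    return np.array([x, y, z])

# Hypothetical intrinsics for a 640x480 depth stream.
point = pixel_to_planar_grasp(u=412, v=237, depth_m=0.85,
                              fx=615.0, fy=615.0, cx=320.0, cy=240.0)
print(point)  # pick point in the camera frame; transform to base frame before grasping
```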

The AR module, deployed on a Microsoft HoloLens 2, enables operators to monitor and direct the sorting process. Augmented information, such as waste categories, is displayed as semi-transparent cubes, and operators can interact through gestures and buttons to mark and re-mark waste. The Communication module, built on the Robot Operating System (ROS), facilitates seamless data exchange among all modules in the system.
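As an illustration of the ROS-based exchange, the following minimal rospy node publishes a pick target of the kind the Perception module might hand to the RS module; the topic name and message type are assumptions for illustration, not the paper's actual interface.

```python
#!/usr/bin/env python
# Minimal sketch of the message exchange the Communication module performs.
# "/cws/pick_target" and the PointStamped payload are illustrative assumptions.
import rospy
from geometry_msgs.msg import PointStamped

def publish_pick_target():
    rospy.init_node("perception_bridge")
    pub = rospy.Publisher("/cws/pick_target", PointStamped, queue_size=10)
    rate = rospy.Rate(1)  # 1 Hz, purely for illustration
    while not rospy.is_shutdown():
        msg = PointStamped()
        msg.header.stamp = rospy.Time.now()
        msg.header.frame_id = "base_link"
        msg.point.x, msg.point.y, msg.point.z = 0.42, -0.10, 0.03
        pub.publish(msg)  # the RS module subscribes and plans a grasp
        rate.sleep()

if __name__ == "__main__":
    publish_pick_target()
```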

The key features of the AR module include Augmented Monitoring, Human Instruction, and Safety Alerts. A coordinate transformation aligns the AR environment with the real world, reconciling Unity's left-handed coordinate system with ROS's right-handed one.
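The handedness fix can be sketched as a simple axis remapping. The mapping below is the one commonly used when bridging ROS (x forward, y left, z up) and Unity (x right, y up, z forward), for example in ros-sharp; the paper's exact transform may additionally include a calibration offset.

```python
def ros_to_unity_position(x, y, z):
    """Convert a point from ROS's right-handed frame (x forward, y left, z up)
    to Unity's left-handed frame (x right, y up, z forward)."""
    return (-y, z, x)

def unity_to_ros_position(x, y, z):
    """Inverse mapping: Unity (x right, y up, z forward) back to ROS."""
    return (z, -x, y)

# Round-trip check: a point 1 m in front of the robot stays "forward" in Unity.
p_ros = (1.0, 0.0, 0.0)
p_unity = ros_to_unity_position(*p_ros)   # (0.0, 0.0, 1.0)
assert unity_to_ros_position(*p_unity) == p_ros
```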

Experimental demonstrations

The experimental setup involved a UR5e robot arm and an OnRobot VG10 vacuum gripper, equipped with a RealSense D435i depth sensor and operated through ROS on Ubuntu 20.04. The AR component was developed in Unity on Windows 11 and deployed to a Microsoft HoloLens 2 head-mounted display.

In laboratory experiments, the system effectively sorted non-inert construction waste (e.g., plastics) from mixed waste. The AR-enabled system allowed workers to monitor and improve the sorting process while the robot executed tasks autonomously based on detected waste information, and workers could intervene when the system misclassified waste.

In terms of occupational safety and health (OSH), feedback from professionals was positive. A techno-economic assessment suggested that the system could retrofit existing waste sorting lines, with the estimated initial investment recouped within about a year.
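A payback claim of this kind typically rests on a simple (undiscounted) payback-period calculation, sketched below. All figures here are purely illustrative placeholders, not numbers from the paper.

```python
def simple_payback_years(initial_investment, annual_labour_savings,
                         annual_operating_cost):
    """Simple payback period: years until cumulative net savings
    cover the up-front retrofit cost (no discounting)."""
    net_annual_saving = annual_labour_savings - annual_operating_cost
    if net_annual_saving <= 0:
        raise ValueError("system never pays back")
    return initial_investment / net_annual_saving

# Hypothetical figures, NOT from the paper: a retrofit pays back in under
# a year if net annual savings exceed the initial investment.
print(simple_payback_years(initial_investment=100_000,
                           annual_labour_savings=130_000,
                           annual_operating_cost=20_000))  # ~0.91 years
```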

Conclusion and discussion

Current CWS operations rely heavily on manual labor, exposing workers to risks that affect their safety and health: ergonomic strain from repetitive and awkward sorting motions, physical stressors such as noise and vibration, potential exposure to harmful chemicals, and mechanical hazards such as being caught in conveyor systems. The complexity of CWS, particularly when visually similar materials have different properties, poses a significant challenge for vision-based robotic sorting systems. To address these dilemmas, the research introduces an AR-HRC approach aimed at enhancing both CWS efficiency and occupational safety and health.

The primary contribution of this study is the development of an AR-HRC method for CWS, capitalizing on the strengths of human workers and robots. While the core sorting tasks are performed by robots, human workers are responsible for instructing robots based on their visual acumen and experience when necessary. The AR medium facilitates seamless and effective communication between workers and robotic sorting systems, enhancing safety and overall work performance. Notably, this method minimizes the physical proximity of workers to the machinery and waste, ensuring safer and healthier working conditions.

However, some limitations persist. The object segmentation process relies on traditional image analysis, which may introduce errors; future work could replace it with advanced computer vision methods. Additionally, human involvement is currently limited to correcting sensor misclassifications, and AR could further be used to help optimize grasping poses. Validating the system in dynamic, real-world CWS settings remains essential.
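For context, "traditional image analysis" typically means non-learning pipelines such as thresholding plus contour extraction. A minimal OpenCV sketch of that style of segmentation follows; the threshold method and size cutoff are illustrative assumptions, not the paper's pipeline.

```python
import cv2
import numpy as np

def segment_objects(bgr_image):
    """Classical (non-learning) segmentation: threshold the image to separate
    waste from the belt/background, then extract contours as object candidates.
    Parameters here are illustrative and would need tuning per installation."""
    gray = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2GRAY)
    # Otsu's method picks a global foreground/background threshold automatically.
    _, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    # Morphological opening removes small speckle noise from the mask.
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    # Keep reasonably sized blobs and return their centroids in pixel coordinates.
    centroids = []
    for c in contours:
        if cv2.contourArea(c) > 500:
            m = cv2.moments(c)
            centroids.append((int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])))
    return centroids
```

Pipelines like this are fast but brittle when materials look alike, which is exactly the failure mode the authors flag and the reason they suggest moving to learned segmentation.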

In conclusion, the AR-enabled HRC method offers a promising solution to the intricate challenges of CWS by combining the strengths of humans and robots, fostering efficiency and occupational safety and health in this domain.


Written by

Soham Nandi

Soham Nandi is a technical writer based in Memari, India. His academic background is in Computer Science Engineering, specializing in Artificial Intelligence and Machine learning. He has extensive experience in Data Analytics, Machine Learning, and Python. He has worked on group projects that required the implementation of Computer Vision, Image Classification, and App Development.

