In a recent research article published in the Journal of Environmental Management, the authors proposed the integration of Augmented Reality in Human-Robot Collaboration (AR-HRC) to automate the process of construction waste sorting (CWS) and decrease risks to manual workers.
Background
Efficiently managing construction waste (CW) is crucial for sustainable urban development. Current manual sorting practices pose safety risks; although robotic sorting has been explored, concerns persist about its real-world accuracy. The authors introduce a new approach, AR-HRC for CW sorting, which leverages AR to improve efficiency and occupational safety by allowing remote human assistance and reducing workers' direct exposure to hazards.
The HRC waste sorting model
In the HRC sorting system, robots handle the bulk of the sorting tasks while human workers assist when needed. AR technology is integrated into this system, offering visualization, communication, and ergonomic benefits. AR enables workers to monitor and instruct robots, improving safety and efficiency. This approach enhances occupational safety and waste sorting quality while promoting the adoption of robotic sorting technology in the industry.
The proposed system comprises four interlinked modules: Perception, Communication, Robotic Sorting (RS), and AR. The Perception module relies on a Red, Green, Blue, and Depth (RGB-D) sensor to detect waste objects and gather information about them, such as their location and category. This data is divided into two levels: the first includes both location and waste category, allowing automatic sorting, while the second offers only location information, necessitating manual operator intervention.
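The two-level split can be illustrated with a minimal sketch. The `Detection` type, field names, and routing labels below are assumptions for illustration, not the paper's actual data schema:

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class Detection:
    """One waste object reported by the RGB-D perception module."""
    position: Tuple[float, float, float]  # object location in the workspace (m)
    category: Optional[str] = None        # None when the sensor cannot classify

def route(detection: Detection) -> str:
    """Level 1 (location + category) can be sorted automatically;
    level 2 (location only) is deferred to a human operator."""
    return "auto_sort" if detection.category is not None else "ask_operator"
```

A fully classified object would thus be routed as `route(Detection((0.4, 0.1, 0.02), "plastic"))` → `"auto_sort"`, while an unclassified one falls back to the operator.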
The RS module employs a robotic arm with a vacuum gripper to execute waste sorting tasks based on information provided by the Perception module. Precise calibration ensures the accuracy of object recognition, and the system focuses on 2D planar grasping for simplicity.
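The calibration step typically means the detected pixel must be back-projected into a metric frame before the arm can grasp. A minimal sketch of that step, using the standard pinhole camera model (the intrinsics and frame conventions here are assumptions, not values from the paper):

```python
def pixel_to_camera(u, v, depth, fx, fy, cx, cy):
    """Back-project pixel (u, v) with measured depth Z (m) into the
    camera frame via the pinhole model:
        X = (u - cx) * Z / fx,  Y = (v - cy) * Z / fy.
    For 2D planar grasping, the vacuum gripper then descends
    vertically onto the resulting (X, Y) at the table plane."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return (x, y, depth)
```

A pixel at the principal point maps to the optical axis: `pixel_to_camera(320, 240, 0.5, 600, 600, 320, 240)` returns `(0.0, 0.0, 0.5)`. In practice, this camera-frame point would still be transformed into the robot's base frame using the hand-eye calibration.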
The AR module, deployed on a Microsoft HoloLens 2, enables operators to monitor and direct the sorting process. Augmented information, such as waste categories, is displayed as semi-transparent cubes, and operators can interact through gestures and buttons to mark and re-mark waste. The Communication module, built on the Robot Operating System (ROS), facilitates seamless data exchange among all modules in the system.
The key features of the AR module include Augmented Monitoring, Human Instruction, and Safety Alerts. Coordinate transformation aligns the AR environment with the real world, mitigating disparities between Unity's left-handed coordinate system and ROS's right-handed coordinate system.
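The Unity/ROS mismatch is a well-known issue: Unity uses a left-handed frame (X right, Y up, Z forward) while ROS uses a right-handed one (X forward, Y left, Z up). The paper does not publish its exact transform; the sketch below shows the axis remapping commonly used in Unity-ROS bridges:

```python
def unity_to_ros(x, y, z):
    """Map a point from Unity's left-handed frame (X right, Y up,
    Z forward) into ROS's right-handed frame (X forward, Y left, Z up):
    ROS X <- Unity Z, ROS Y <- -Unity X, ROS Z <- Unity Y."""
    return (z, -x, y)

def ros_to_unity(x, y, z):
    """Inverse mapping, back into Unity coordinates."""
    return (-y, z, x)
```

Round-tripping a point through both functions returns it unchanged, which is a quick sanity check for any such alignment code.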
Experimental demonstrations
The experimental setup involved the UR5e robot and OnRobot VG10, equipped with a RealSense D435i depth sensor, operated through ROS on Ubuntu 20.04. The AR component was developed using Unity on Windows 11 and was experienced through a Microsoft HoloLens 2 AR head-mounted display.
In laboratory experiments, the system effectively sorted non-inert construction waste (e.g., plastics) from mixed waste. The AR-enabled system allows workers to monitor and improve the sorting process, and the robot executes tasks autonomously based on detected waste information. Workers could intervene when the system misclassified waste.
In terms of occupational safety and health (OSH), feedback from professionals was positive. A techno-economic assessment suggested that the system could retrofit existing waste sorting lines, with the estimated initial investment recouped within a year.
Conclusion and discussion
Current CWS operations rely heavily on manual labor, exposing workers to various risks to their safety and health. These risks encompass ergonomic issues from repetitive and awkward sorting motions, physical stressors such as noise and vibration, potential exposure to harmful chemicals, and mechanical hazards like being caught in conveyor systems. The complexity of CWS, particularly when visually similar materials have different properties, poses a significant challenge for vision-based robotic sorting systems. To address these dilemmas, this research introduces an AR-HRC approach aimed at enhancing both CWS efficiency and occupational safety and health.
The primary contribution of this study is the development of an AR-HRC method for CWS, capitalizing on the strengths of human workers and robots. While the core sorting tasks are performed by robots, human workers are responsible for instructing robots based on their visual acumen and experience when necessary. The AR medium facilitates seamless and effective communication between workers and robotic sorting systems, enhancing safety and overall work performance. Notably, this method minimizes the physical proximity of workers to the machinery and waste, ensuring safer and healthier working conditions.
However, some limitations persist. The object segmentation process relies on traditional image analysis, which may introduce errors; future work could leverage more advanced computer vision methods. Additionally, human involvement is currently limited to rectifying incorrect sensor categorizations, with the potential to extend AR interaction to grasping pose optimization. Further validation in dynamic, real-world CWS settings is also essential.
In conclusion, the AR-enabled HRC method offers a promising solution to the intricate challenges of CWS by combining the strengths of humans and robots, fostering efficiency and occupational safety and health in this domain.
Journal reference:
- Chen, J., Fu, Y., Lu, W., & Pan, Y. (2023). Augmented reality-enabled human-robot collaboration to balance construction waste sorting efficiency and occupational safety and health. Journal of Environmental Management, 348, 119341. https://doi.org/10.1016/j.jenvman.2023.119341