A three-dimensional mapping and virtual reality-based human–robot interaction for collaborative space exploration. (3rd June 2020)
- Record Type:
- Journal Article
- Title:
- A three-dimensional mapping and virtual reality-based human–robot interaction for collaborative space exploration. (3rd June 2020)
- Main Title:
- A three-dimensional mapping and virtual reality-based human–robot interaction for collaborative space exploration
- Authors:
- Xiao, Junhao
Wang, Pan
Lu, Huimin
Zhang, Hui
- Abstract:
- Human–robot interaction is a vital part of human–robot collaborative space exploration: it bridges the high-level decision and path-planning intelligence of the human and the accurate sensing and modelling ability of the robot. However, most conventional human–robot interaction approaches rely on video streams for the operator to understand the robot's surroundings, which limits situational awareness and leaves the operator stressed and fatigued.
This research aims to improve the efficiency and naturalness of interaction for human–robot collaboration. We present a human–robot interaction method based on real-time mapping and online virtual-reality visualization, which is implemented and verified for rescue robotics. On the robot side, a dense point-cloud map is built in real time by tightly coupled LiDAR–IMU fusion; the resulting map is then converted into a three-dimensional normal distributions transform (NDT) representation. Wireless communication transmits the three-dimensional NDT map to the remote control station incrementally. At the remote control station, the received map is rendered in virtual reality using parameterized ellipsoid cells. The operator controls the robot in three modes. In complex areas, the operator can use interactive devices to give low-level motion commands. In less cluttered regions, the operator can specify a path or even a target point, after which the robot follows the path or navigates to the target point autonomously; these two modes rely more on the robot's autonomy. By virtue of the virtual-reality visualization, the operator gains a more comprehensive understanding of the space being explored, so the human's high-level decision and path-planning intelligence and the robot's accurate sensing and modelling ability can be well integrated as a whole. Although the method is proposed for rescue robots, it can also be used in other out-of-sight, teleoperation-based human–robot collaboration systems, including but not limited to manufacturing, space, undersea, surgery, agriculture and military operations.
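The NDT representation described in the abstract compresses a dense point cloud into per-voxel Gaussians (a mean and a 3×3 covariance), which is what makes incremental wireless transmission and ellipsoid rendering practical. The paper's own pipeline is not reproduced here; the following is a minimal illustrative sketch of the general NDT-cell idea, with the function name, voxel size, and minimum-point threshold chosen for illustration only:

```python
import numpy as np

def ndt_cells(points, voxel_size):
    """Group a point cloud (N x 3 array) into voxels and compute, per voxel,
    the mean and covariance that define one NDT cell. In a VR viewer each
    cell could be drawn as an ellipsoid whose axes follow the covariance
    eigenvectors (illustrative sketch, not the paper's implementation)."""
    buckets = {}
    keys = np.floor(points / voxel_size).astype(int)
    for key, p in zip(map(tuple, keys), points):
        buckets.setdefault(key, []).append(p)

    cells = {}
    for key, pts in buckets.items():
        pts = np.asarray(pts)
        if len(pts) < 4:  # too few points for a stable 3x3 covariance
            continue
        mu = pts.mean(axis=0)          # cell centre
        cov = np.cov(pts.T)            # 3x3 covariance -> ellipsoid shape
        cells[key] = (mu, cov)
    return cells
```

Transmitting only `(key, mu, cov)` triples for new or changed cells, rather than raw points, is one way the incremental map update described above can stay within a wireless link's bandwidth.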
- Is Part Of:
- International journal of advanced robotic systems. Volume 17:Number 3(2020:May/Jun.)
- Journal:
- International journal of advanced robotic systems
- Issue:
- Volume 17:Number 3(2020:May/Jun.)
- Issue Display:
- Volume 17, Issue 3 (2020)
- Year:
- 2020
- Volume:
- 17
- Issue:
- 3
- Issue Sort Value:
- 2020-0017-0003-0000
- Page Start:
- Page End:
- Publication Date:
- 2020-06-03
- Subjects:
- Human–robot space exploration -- human–robot interaction -- 3D mapping -- virtual reality -- rescue robotics
Robotics -- Periodicals
Robotics
Periodicals
629.892
- Journal URLs:
- http://arx.sagepub.com/
http://search.epnet.com/direct.asp?db=bch&jid=13CR&scope=site
http://www.intechweb.org/journal.php?id=3
http://www.uk.sagepub.com/home.nav
- DOI:
- 10.1177/1729881420925293
- Languages:
- English
- ISSNs:
- 1729-8806
- Deposit Type:
- Legal deposit
- View Content:
- Available online (eLD content is only available in our Reading Rooms)
- Physical Locations:
- British Library DSC - BLDSS-3PM
British Library HMNTS - ELD Digital store
- Ingest File:
- 13859.xml