- Record Type:
- Journal Article
- Title:
- Stereo camera visual SLAM with hierarchical masking and motion-state classification at outdoor construction sites containing large dynamic objects
- Main Title:
- Stereo camera visual SLAM with hierarchical masking and motion-state classification at outdoor construction sites containing large dynamic objects
- Authors:
- Bao, Runqiu
Komatsu, Ren
Miyagusuku, Renato
Chino, Masaki
Yamashita, Atsushi
Asama, Hajime
- Abstract:
- At modern construction sites, utilizing GNSS (Global Navigation Satellite System) to measure the real-time location and orientation (i.e. pose) of construction machines and navigate them is very common. However, GNSS is not always available. Replacing GNSS with on-board cameras and visual simultaneous localization and mapping (visual SLAM) to navigate the machines is a cost-effective solution. Nevertheless, at construction sites, multiple construction machines will usually work together and side by side, causing large dynamic occlusions in the cameras' view. Standard visual SLAM cannot handle large dynamic occlusions well. In this work, we propose a motion segmentation method to efficiently extract static parts from crowded dynamic scenes to enable robust tracking of camera ego-motion. Our method utilizes semantic information combined with object-level geometric constraints to quickly detect the static parts of the scene. Then, we perform a two-step coarse-to-fine ego-motion tracking with reference to the static parts. This leads to a novel dynamic visual SLAM formulation. We test our proposals through a real implementation based on ORB-SLAM2, and datasets we collected from real construction sites. The results show that when standard visual SLAM fails, our method can still retain accurate camera ego-motion tracking in real time. Compared to state-of-the-art dynamic visual SLAM methods, ours shows outstanding efficiency and competitive trajectory accuracy.
- Is Part Of:
- Advanced robotics. Volume 35, Number 3/4 (2021)
- Journal:
- Advanced robotics
- Issue:
- Volume 35, Number 3/4 (2021)
- Issue Display:
- Volume 35, Issue 3/4 (2021)
- Year:
- 2021
- Volume:
- 35
- Issue:
- 3/4
- Issue Sort Value:
- 2021-0035-NaN-0000
- Page Start:
- 228
- Page End:
- 241
- Publication Date:
- 2021-02-16
- Subjects:
- Dynamic visual SLAM -- motion segmentation -- hierarchical masking -- object motion-state classification -- ego-motion tracking
Robotics -- Periodicals
Robotics -- Japan -- Periodicals
Robotics
Japan
Periodicals
629.89205
- Journal URLs:
- http://www.catchword.com/rpsv/cw/vsp/01691864/contp1.htm
http://catalog.hathitrust.org/api/volumes/oclc/14883000.html
http://www.tandfonline.com/toc/tadr20/current
http://www.tandfonline.com/
http://firstsearch.oclc.org
http://firstsearch.oclc.org/journal=0169-1864;screen=info;ECOIP
http://www.ingentaselect.com/vl=16659242/cl=11/nw=1/rpsv/cw/vsp/01691864/contp1.htm
- DOI:
- 10.1080/01691864.2020.1869586
- Languages:
- English
- ISSNs:
- 0169-1864
- Deposit Type:
- Legal deposit
- View Content:
- Available online (eLD content is only available in our Reading Rooms)
- Physical Locations:
- British Library DSC - 0696.926500
British Library DSC - BLDSS-3PM
British Library STI - ELD Digital store
- Ingest File:
- 22679.xml