Deep learning-based artificial vision for grasp classification in myoelectric hands. (3rd May 2017)
- Record Type:
- Journal Article
- Title:
- Deep learning-based artificial vision for grasp classification in myoelectric hands. (3rd May 2017)
- Main Title:
- Deep learning-based artificial vision for grasp classification in myoelectric hands
- Authors:
- Ghazaei, Ghazal
Alameer, Ali
Degenaar, Patrick
Morgan, Graham
Nazarpour, Kianoush
- Abstract: Objective. Computer vision-based assistive technology solutions can revolutionise the quality of care for people with sensorimotor disorders. The goal of this work was to enable trans-radial amputees to use a simple, yet efficient, computer vision system to grasp and move common household objects with a two-channel myoelectric prosthetic hand. Approach. We developed a deep learning-based artificial vision system to augment the grasp functionality of a commercial prosthesis. Our main conceptual novelty is that we classify objects with regard to the grasp pattern without explicitly identifying them or measuring their dimensions. A convolutional neural network (CNN) structure was trained with images of over 500 graspable objects. For each object, 72 images, at 5° intervals, were available. Objects were categorised into four grasp classes, namely: pinch, tripod, palmar wrist neutral and palmar wrist pronated. The CNN setting was first tuned and tested offline, and then in real time with objects or object views that were not included in the training set. Main results. The classification accuracy in the offline tests reached 85% for the seen and 75% for the novel objects, reflecting the generalisability of grasp classification. We then implemented the proposed framework in real time on a standard laptop computer and achieved an overall score of 84% in classifying a set of novel as well as seen but randomly rotated objects. Finally, the system was tested with two trans-radial amputee volunteers controlling an i-limb Ultra™ prosthetic hand and a Motion Control™ prosthetic wrist, augmented with a webcam. After training, subjects successfully picked up and moved the target objects with an overall success rate of up to 88%. In addition, we show that with training, subjects' performance improved in terms of time required to accomplish a block of 24 trials despite a decreasing level of visual feedback. Significance. The proposed design constitutes a substantial conceptual improvement for the control of multi-functional prosthetic hands. We show for the first time that deep learning-based computer vision systems can enhance the grip functionality of myoelectric hands considerably.
- Is Part Of:
- Journal of neural engineering. Volume 14:Number 3(2017:Jun.)
- Journal:
- Journal of neural engineering
- Issue:
- Volume 14:Number 3(2017:Jun.)
- Issue Display:
- Volume 14, Issue 3 (2017)
- Year:
- 2017
- Volume:
- 14
- Issue:
- 3
- Issue Sort Value:
- 2017-0014-0003-0000
- Page Start:
- Page End:
- Publication Date:
- 2017-05-03
- Subjects:
- myoelectric hand prosthesis -- convolutional neural network -- grasp classification
Neurosciences -- Periodicals
Biomedical engineering -- Periodicals
612.8
- Journal URLs:
- http://iopscience.iop.org/1741-2552/
http://ioppublishing.org/
- DOI:
- 10.1088/1741-2552/aa6802
- Languages:
- English
- ISSNs:
- 1741-2560
- Deposit Type:
- Legal deposit
- View Content:
- Available online (eLD content is only available in our Reading Rooms)
- Physical Locations:
- British Library DSC - BLDSS-3PM
British Library STI - ELD Digital store
- Ingest File:
- 11487.xml