International Journal of Computer & Software Engineering Volume 3 (2018), Article ID 3:IJCSE-130, 7 pages
https://doi.org/10.15344/2456-4451/2018/130
Research Article
Object Shape Classification Using Spatial Information in Myoelectric Prosthetic Control

Ryusei Shima1, Yunan He1*, Osamu Fukuda1, Nan Bu2, Hiroshi Okumura1 and Nobuhiko Yamaguchi1

1Department of Information Science, Saga University, Saga, Japan
2Department of Control and Information Systems Engineering, NIT, Kumamoto College, Kumamoto, Japan
*Corresponding Author: Yunan He, Department of Information Science, Saga University, Saga, Japan; E-mail: heyunan@live.com
Received: 31 January 2018; Accepted: 14 March 2018; Published: 16 March 2018
Citation: Shima R, He Y, Fukuda O, Bu N, Okumura H, et al. (2018) Object Shape Classification Using Spatial Information in Myoelectric Prosthetic Control. Int J Comput Softw Eng 3: 130. doi: https://doi.org/10.15344/2456-4451/2018/130
