Fingerspelling Classification for Robot Control

Kevin McCready, Sonya Coleman, Dermot Kerr, Nazmul Siddique, Emmett Kerr, Yiannis Aloimonos, Cornelia Fermüller

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Improvements to human-robot interaction methods could increase the ease of use of robots in manufacturing environments. Many of these environments are noisy and therefore preclude the use of audio communication between humans or in human-robot interactions. Therefore, this paper proposes a gesture-based communication system for robot control. To that end, the VGG16 and VGG19 convolutional neural network (CNN) architectures are used for gesture classification, along with three datasets of American Sign Language (ASL) fingerspelling images. The models' performance is evaluated, and modifications are made to their parameters to improve performance, before they are applied to robot control tasks. The results show that, with parameter tuning, test accuracies of up to 100% are achievable.
Original language: English
Title of host publication: 2024 IEEE 22nd International Conference on Industrial Informatics (INDIN)
Publisher: IEEE
Pages: 1-6
Number of pages: 6
ISBN (Electronic): 979-8-3315-2747-1
ISBN (Print): 979-8-3315-2748-8
Publication status: Published online - 12 Dec 2024

Publication series

ISSN (Print): 1935-4576
ISSN (Electronic): 2378-363X
