Implementing Gesture Recognition in a Sign Language Learning Application

Daphne Tan, Kevin Meehan

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

2 Citations (Scopus)

Abstract

Artificial Intelligence (AI) has become increasingly prevalent, with a wide variety of application areas in which it can closely replicate tasks that humans would normally perform. Many companies using this form of technology gain efficiencies by replacing human effort with AI agents, yet researchers continue to look for ways to make artificial intelligence more 'human-like'. Gesture recognition is a form of human-computer interaction that AI has the potential to improve: like humans, AI systems can 'see' and recognise gestures. Sign language is known and used by only a small proportion of the population, but it is slowly gaining popularity and more resources are becoming available for learning it. Some people prefer in-person classes, whereas others learn online or self-teach using applications. This research investigates how effectively technology such as gesture recognition can assist in learning sign language; the main aim is to determine whether gesture recognition can support self-learners. The research explored the use of Convolutional Neural Networks (CNNs) to detect the hand shapes that represent sign language forms. With a small sample of 10 participants, it compared accuracies across three types of dataset: non pre-processed, skin mask, and Sobel filtered images. The CNN model trained on the skin mask dataset was overall the most suitable for identifying gestures from still images; however, the model trained on the non pre-processed dataset was slightly more accurate at recognising American Sign Language (ASL) gestures in real time. All CNN models achieved accuracy levels above 70%, demonstrating that a CNN can recognise sign language gestures.
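This record does not include implementation details, but as a rough sketch of the pipeline the abstract describes (skin-mask preprocessing, Sobel filtering, and a CNN classifier), the steps might look roughly like the following in Python with OpenCV and TensorFlow. The HSV skin thresholds, input resolution, network layers, and 26-class ASL alphabet output are illustrative assumptions, not values taken from the paper.

```python
import cv2
import numpy as np
import tensorflow as tf

IMG_SIZE = 64       # assumed input resolution, not specified in the paper
NUM_CLASSES = 26    # assumed: one class per ASL letter

def skin_mask(bgr_image: np.ndarray) -> np.ndarray:
    """Keep only skin-coloured pixels (illustrative HSV thresholds)."""
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
    lower = np.array([0, 40, 60], dtype=np.uint8)      # assumed lower bound
    upper = np.array([25, 255, 255], dtype=np.uint8)   # assumed upper bound
    mask = cv2.inRange(hsv, lower, upper)
    return cv2.bitwise_and(bgr_image, bgr_image, mask=mask)

def sobel_edges(bgr_image: np.ndarray) -> np.ndarray:
    """Sobel gradient magnitude, scaled back to an 8-bit image."""
    gray = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2GRAY)
    gx = cv2.Sobel(gray, cv2.CV_64F, 1, 0, ksize=3)
    gy = cv2.Sobel(gray, cv2.CV_64F, 0, 1, ksize=3)
    magnitude = cv2.magnitude(gx, gy)
    return cv2.normalize(magnitude, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)

def build_cnn() -> tf.keras.Model:
    """A small CNN classifier; the paper's actual architecture is not described here."""
    return tf.keras.Sequential([
        tf.keras.layers.Input(shape=(IMG_SIZE, IMG_SIZE, 1)),
        tf.keras.layers.Conv2D(32, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Conv2D(64, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(128, activation="relu"),
        tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
    ])
```

In this sketch, either preprocessing function (or neither, for the non pre-processed variant) would be applied to the training images before they are resized, converted to grayscale, and fed to the CNN, mirroring the three dataset variants compared in the abstract.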

Original language: English
Title of host publication: 2020 31st Irish Signals and Systems Conference, ISSC 2020
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Electronic): 9781728194189
Publication status: Published - Jun 2020
Event: 31st Irish Signals and Systems Conference, ISSC 2020 - Letterkenny, Ireland
Duration: 11 Jun 2020 - 12 Jun 2020

Publication series

Name: 2020 31st Irish Signals and Systems Conference, ISSC 2020

Conference

Conference: 31st Irish Signals and Systems Conference, ISSC 2020
Country/Territory: Ireland
City: Letterkenny
Period: 11/06/20 - 12/06/20

Keywords

  • Computer Vision
  • Convolutional Neural Network
  • Gesture Recognition
  • Sign Language

