User trainable sign language to speech glove using KNN classifier

V. Shwetha, Vijayalaxmi, Dhanin Anoop Asarpota, Himanshu Verma

Research output: Contribution to journal › Article

Abstract

A sizable population around the world has some form of hearing or speech disability, which creates a communication barrier between them and the rest of the world. Sign language was introduced to bridge this gap. The objective of this work is to design a glove that can translate sign language to text and that the user can retrain if required. To achieve this, a glove was built with five flex sensors, three contact sensors and an accelerometer. Flex sensors were chosen because they are resistive devices whose resistance changes when they are bent, and their compactness allows them to be mounted easily on a glove alongside the contact sensors and the accelerometer. The data from these sensors is fed to an Arduino, where it is read and processed before being sent to MATLAB via Bluetooth. Once the values are received, a smart gesture-detection algorithm is designed to improve accuracy: the data from the Arduino is first used to train a KNN classification model, and the trained model is then used to classify gestures. A GUI with control signals allows the user to form a word from these gestures, which is then converted to speech. The glove accurately provides data points that can be used to classify various gestures of American Sign Language, and the interactive GUI developed in MATLAB lets the user make and/or edit a word with the glove and then play it out on a speaker.
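
The pipeline outlined in the abstract (serial acquisition from the Arduino, KNN training, prediction in MATLAB) can be illustrated with a minimal MATLAB sketch. This is not the authors' implementation: the port name, baud rate, sample count, feature layout (five flex, three contact and three accelerometer values per frame) and k = 5 are assumptions, and fitcknn/predict require the Statistics and Machine Learning Toolbox.

% Minimal sketch of the acquisition-and-classification pipeline described
% in the abstract (not the authors' code); names and constants are assumed.
device = serialport('COM5', 9600);        % assumed Bluetooth serial (SPP) port
numSamples = 50;                          % assumed training samples per gesture
numFeatures = 11;                         % assumed: 5 flex + 3 contact + 3 accelerometer axes
X = zeros(numSamples, numFeatures);
for i = 1:numSamples
    frame = readline(device);             % one comma-separated sensor frame
    X(i, :) = str2double(split(frame, ','))';
end
Y = repmat({'A'}, numSamples, 1);         % label chosen by the user while training

mdl = fitcknn(X, Y, 'NumNeighbors', 5);   % user-specific KNN model

newFrame = str2double(split(readline(device), ','))';
gesture = predict(mdl, newFrame);         % classify a fresh gesture frame
disp(['Detected gesture: ' gesture{1}]);

In the described system, the GUI would drive such a loop, collecting labelled frames for each gesture during training and running the prediction step continuously during recognition, before passing the assembled word to a text-to-speech stage.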

Original language: English
Pages (from-to): 3053-3058
Number of pages: 6
Journal: Compusoft
ISSN: 2320-0790
Publisher: National Institute of Science Communication and Information Resources (NISCAIR)
Volume: 8
Issue number: 2
Publication status: Published - 01-01-2019
Externally published: Yes

Fingerprint

Contact sensors
Classifiers
Graphical user interfaces
Accelerometers
MATLAB
Sensors
Bluetooth
Audition
Communication

All Science Journal Classification (ASJC) codes

  • Computer Science(all)

Cite this

Shwetha, V., Vijayalaxmi, Asarpota, D. A., & Verma, H. (2019). User trainable sign language to speech glove using KNN classifier. Compusoft, 8(2), 3053-3058.