A Real-Time Sign-Language Learning Tool
Fathima Nuzla Ismail (Asia Pacific Institute of Information Technology)
An estimated 360 million people worldwide suffer from hearing loss, and sign language is the primary language for many of them. Human sign-language interpreters are not always available, and since most hearing people do not understand sign language, daily communication with people who are deaf or mute is often nearly impossible. Researchers have examined the potential of input sensors such as data gloves and special cameras for recognizing sign language. Data gloves provide good recognition results but are inconvenient to wear and have proven too expensive for mass use, while web cameras struggle with cluttered real-world backgrounds and uncontrolled illumination, making accurate hand tracking difficult. This poster presents a real-time sign-language learning tool for the deaf community in Sri Lanka: users express words through sign language, and the system attempts to recognize the gestures and convert them to text and speech in real time. The research is based on a gesture recognition system that uses a Support Vector Machine (SVM) to identify static gestures and Hidden Markov Models (HMMs) to identify dynamic gestures.
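As a rough illustration of the static-gesture path described above, the sketch below trains a Support Vector Machine on hand-shape feature vectors. The feature layout (flattened landmark coordinates) and the gesture labels are illustrative assumptions, not the authors' actual feature extraction pipeline, and the training data here is synthetic.

```python
# Minimal sketch: SVM classification of static sign-language gestures.
# Assumes features are flattened hand-landmark coordinates (21 points x 2),
# which is a common but here hypothetical choice.
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

def fake_samples(center, n=50):
    # Synthetic stand-in for extracted hand features near a given shape.
    return center + 0.05 * rng.standard_normal((n, 42))

# Two hypothetical static signs, labeled 0 and 1.
X = np.vstack([fake_samples(np.zeros(42)), fake_samples(np.ones(42))])
y = np.array([0] * 50 + [1] * 50)

# Standardize features, then fit an RBF-kernel SVM.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X, y)

# Classify a new frame's feature vector.
pred = clf.predict(fake_samples(np.ones(42), n=1))
```

Dynamic gestures would instead feed per-frame feature sequences to an HMM, which scores how likely each gesture model is to have generated the observed motion; that path is not sketched here.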