Information Terminal for the Disabled (Engelliler İçin Bilgi Terminali)

In this project, we have conducted research on gesture and speech recognition methodologies to aid communication between hearing-impaired and visually-impaired individuals, with the support of the Scientific and Technological Research Council of Turkey. The main objective of the project is to design and implement an information terminal system that translates fingerspelling to speech and vice versa, using recognition and synthesis techniques for each modality. Through multimodal input and output methods, the information terminal acts as a communication medium between deaf and blind people: it converts fingerspelled words to speech and spoken words to fingerspelling using fingerspelling recognition, fingerspelling synthesis, speech recognition and speech synthesis modules for the Czech, Russian and Turkish languages (a simplified sketch of this pipeline follows the list below). Throughout this project, we have worked in the following areas:

  • Development of multi-lingual fingerspelling databases for the Czech, Russian and Turkish languages
  • Research on the analysis of hand gestures from video and the development of a prototype fingerspelling recognizer for the Czech, Russian and Turkish languages
  • Research on the development of a fingerspelling synthesis avatar driven by motion capture data
  • Development of a multimodal information kiosk and integration of the fingerspelling recognition, fingerspelling synthesis, speech recognition and speech synthesis modules
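
The integrated kiosk essentially forms a bidirectional pipeline: fingerspelled video input is recognized as text and then spoken aloud, while spoken input is recognized as text and rendered by the fingerspelling synthesis avatar. The sketch below illustrates this structure in Python; all class and function names are illustrative placeholders rather than the project's actual module interfaces.

# Minimal sketch of the bidirectional fingersign <-> speech pipeline.
# All names are hypothetical placeholders for the kiosk's four modules.

class FingerspellingRecognizer:
    """Video-based recognizer for fingerspelled letters (hypothetical wrapper)."""
    def __init__(self, language: str):
        self.language = language  # e.g. "cs", "ru" or "tr"

    def recognize(self, video_frames) -> str:
        # Real module: hand segmentation, feature extraction and
        # per-language letter classification; only the interface is sketched.
        raise NotImplementedError

class SpeechRecognizer:
    """Speech-to-text front end (hypothetical wrapper)."""
    def __init__(self, language: str):
        self.language = language

    def recognize(self, audio) -> str:
        raise NotImplementedError

class SpeechSynthesizer:
    """Text-to-speech back end (hypothetical wrapper)."""
    def __init__(self, language: str):
        self.language = language

    def speak(self, text: str) -> None:
        raise NotImplementedError

class FingerspellingAvatar:
    """Signing avatar driven by motion-capture data (hypothetical wrapper)."""
    def __init__(self, language: str):
        self.language = language

    def spell(self, text: str) -> None:
        raise NotImplementedError

def fingersign_to_speech(video_frames, language: str = "tr") -> str:
    """Deaf-to-blind direction: recognize fingerspelled input, then speak it aloud."""
    text = FingerspellingRecognizer(language).recognize(video_frames)
    SpeechSynthesizer(language).speak(text)
    return text

def speech_to_fingersign(audio, language: str = "tr") -> str:
    """Blind-to-deaf direction: recognize speech, then render it on the signing avatar."""
    text = SpeechRecognizer(language).recognize(audio)
    FingerspellingAvatar(language).spell(text)
    return text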

The output of the project is in the form of theses, reports, journal papers and conference papers. In addition, we have produced databases and demonstrator programs.

This work is supported by the RFBR and TÜBİTAK foundations in the framework of the bilateral Russian-Turkish project (# 09-07-91220 / 108E113), and by the Grant of the President of Russia (# MK-64898.2010.8).

eNTERFACE Information

  • The integration part of the project was completed at the eNTERFACE’10 workshop in Amsterdam, Netherlands. Results of the “Automatic Fingersign to Speech Translator” project are available on the eNTERFACE'10 Final Presentation page.

Sample figures and screenshots from the developed software

To download the demo video, click on the image below:

Publications related to the project

Theses

  • Turan Can Gürel, “Turkish Sign Language Animation with Articulated Body Model”, MSc Thesis, Boğaziçi University, 2010

Journal Articles

Proceedings in International Conferences

  • A. A. Kindiroglu, H. Yalcin, O. Aran, M. Hruz, P. Campr, L. Akarun, A. Karpov, “A Multi-Lingual Fingerspelling Recognition for Handicapped Kiosk”, PRIA 2010, Pattern Recognition and Image Analysis: New Information Technologies, St. Petersburg, December 2010, accepted.
  • Pavel Campr, Erinç Dikici, Marek Hruz, Alp Kindiroglu, Zdenek Krnoul, Alexander Ronzhin, Hasim Sak, Daniel Schorno, Lale Akarun, Oya Aran, Alexey Karpov, Murat Saraclar, Milos Zelezny, “Automatic Fingersign to Speech Translator”, Proceedings of eNTERFACE 2010, The Summer Workshop on Multimodal Interfaces, 2010
  • Oya Aran, Cem Keskin, Lale Akarun, “Sign Language Tutoring Tool”, EUSIPCO’05, Antalya, September 2005.
  • Cem Keskin, Oya Aran, Lale Akarun, “Real Time Gestural Interface For Generic Applications”, EUSIPCO’05, Antalya, September 2005.

Proceedings in Local Conferences

Database