The Nippon Foundation Launches World’s First Online Sign Language Learning Game SignTown (2) [2021/10/25]
SignTown was developed as an easy and enjoyable way for both deaf and hearing people to learn and experience sign language. We hope people will be encouraged to work on their sign language skills and use them in their daily lives. Beyond enabling some to act as interpreters in social and work situations, this could encourage more people, such as doctors, teachers, and store employees, to communicate in sign language with patients, students, and customers who are deaf.

I believe the launch of SignTown is a great step in the direction of a more inclusive society as it will lay a solid foundation for the further development of a sign language recognition model while, at the same time, raising public awareness about sign language and promoting the social inclusion of the deaf community.

Players make signs in front of a camera to complete tasks tied to daily activities, such as packing a bag for a trip, finding a hotel to stay in, or ordering food at a café.

In response, the AI-powered recognition model gives immediate feedback on the accuracy of their signing. Cute hand-shaped characters scattered throughout the game also explain concepts of sign language and deaf culture.
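
To make this gameplay loop concrete, here is a minimal sketch in Python of the idea described above. It is not the project's actual code: recognize_sign() is a hypothetical stand-in for the AI recognition model, and the feedback thresholds are purely illustrative.

```python
from typing import Tuple


def recognize_sign(camera_frames) -> Tuple[str, float]:
    """Hypothetical stand-in for the AI recognition model.

    A real implementation would run landmark tracking and a trained
    classifier over the captured frames; here we return a fixed dummy
    prediction so the sketch is runnable.
    """
    return "hotel", 0.92


def give_feedback(camera_frames, target_sign: str) -> str:
    """Return immediate feedback on how the player's sign was recognized."""
    predicted, confidence = recognize_sign(camera_frames)
    if predicted == target_sign and confidence >= 0.8:
        return f"Great! Your sign for '{target_sign}' was recognized."
    if predicted == target_sign:
        return f"Almost: '{target_sign}' was recognized, but try signing it more clearly."
    return f"Not quite: the model saw '{predicted}'. Try '{target_sign}' again."


# Example: a task that asks the player to sign "hotel".
print(give_feedback(camera_frames=None, target_sign="hotel"))
```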

In this way, both hearing and deaf people can learn about deaf culture and the sign languages of Japan and Hong Kong in a fun and relaxing manner.

Previous models of sign language recognition have not achieved satisfactory accuracy because linguistic analysis of sign language has not yet been fully applied to the visual-gestural language data.

In sign language, apart from hand movements, other gestural information such as body movements, facial expressions, head positions and movements, and mouth shapes play an equally important role in grammar. Exclusion of any of these parameters in a sign, a phrase or a clause could result in ungrammatical or uninterpretable messages.

While sign languages vary from one country to another, phonetic features like handshapes, orientations and movements are universal, and the number of possible combinations is finite, hence recognition models are possible.  
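
As a rough illustration of why a finite parameter inventory makes recognition tractable, the following Python sketch describes a sign as a bundle of manual and non-manual parameters. The categories and values here are simplified placeholders, not the project's actual linguistic model.

```python
from dataclasses import dataclass
from enum import Enum


# Small illustrative inventories; real phonological inventories are larger,
# but they are still finite.
class Handshape(Enum):
    FLAT = "flat"
    FIST = "fist"
    INDEX = "index"


class Orientation(Enum):
    PALM_UP = "palm_up"
    PALM_DOWN = "palm_down"
    PALM_OUT = "palm_out"


class Movement(Enum):
    STRAIGHT = "straight"
    CIRCULAR = "circular"
    NONE = "none"


@dataclass(frozen=True)
class SignDescription:
    """One sign as a combination of manual and non-manual parameters."""
    handshape: Handshape
    orientation: Orientation
    movement: Movement
    mouth_shape: str          # non-manual markers carry grammatical meaning too
    facial_expression: str


# A hypothetical entry; the parameter values are illustrative only.
example_sign = SignDescription(
    handshape=Handshape.FLAT,
    orientation=Orientation.PALM_UP,
    movement=Movement.STRAIGHT,
    mouth_shape="open",
    facial_expression="raised_eyebrows",
)
print(example_sign)
```

Because every parameter ranges over a finite inventory, the space of describable signs is finite, which is what allows a classifier to learn a mapping from observed features to sign labels.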

Our project team has successfully constructed the first machine-learning-based model that can recognize 3D sign language movements, and track and analyze hand and body movements as well as facial expressions using a standard camera.
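
The sketch below shows the kind of camera-based landmark tracking such a model could build on. It assumes Google's open-source MediaPipe Holistic solution and OpenCV purely for illustration; it is not the project's actual model, only an example of how hand, body, and face landmarks can be extracted from a standard webcam.

```python
import cv2
import mediapipe as mp

mp_holistic = mp.solutions.holistic

cap = cv2.VideoCapture(0)  # any standard webcam
with mp_holistic.Holistic(min_detection_confidence=0.5,
                          min_tracking_confidence=0.5) as holistic:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break

        # MediaPipe expects RGB input; OpenCV captures BGR frames.
        results = holistic.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))

        # The results contain normalized 3D landmarks for the hands, body
        # pose, and face, which a sign-recognition model could use as features.
        if results.right_hand_landmarks:
            wrist = results.right_hand_landmarks.landmark[0]
            print(f"right wrist: x={wrist.x:.2f} y={wrist.y:.2f} z={wrist.z:.2f}")

        if cv2.waitKey(1) & 0xFF == ord("q"):  # press 'q' to stop
            break

cap.release()
```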

The next step in the project is to create a sign dictionary that not only incorporates a search function but also provides a virtual platform to facilitate sign language learning and documentation based on AI technology.

The Nippon Foundation’s ultimate goal is to develop an automatic translation model that can recognize natural conversations in sign language and convert them into spoken language using the cameras of commonly used computers and smartphones.

I am hopeful that people will play SignTown and someday be able to use sign language in their daily lives, lowering barriers to employment for people who are deaf or hard of hearing.

Now that the Tokyo Paralympic Games this summer have given added momentum to global efforts toward a more inclusive society, it would be wonderful if more deaf and hearing people were to learn and experience sign language through SignTown.

To try “SignTown”, please go to: https://signtown.org/

(End)