The ability to conduct daily communication is of critical importance to individuals as well as society. People with complex communication needs lack the cognitive abilities and/or motor skills required for verbal conversation. They often rely on augmentative and alternative communication (AAC), that is, approaches other than speech and text that compensate for communication impairments, to supplement their verbal interactions. They also undergo regular treatments and interventions to acquire non-verbal daily communication skills.
The aim of this research project is to apply recent and ongoing breakthroughs in AI and robotics to support non-verbal language use by people with complex communication needs, through the design, implementation, and validation of new AI-based speech and language technologies for universal usability, especially for users with communicative impairments. Our objectives include:
In this demonstration, we show one of our AAC client applications, which targets users with cerebral palsy and normal intelligence. We have integrated a Tobii eye tracker into the application as a gaze-based input method, so that users with severe physical limitations can express themselves through eye movements. The client application connects to our cloud AAC server, which hosts a language model and provides intelligent AAC symbol recommendations based on the patterns of users' input. We also collaborate with a local company to develop AAC clients on social robots for young children with special needs.
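To make the recommendation flow concrete, the following is a minimal sketch of the kind of server-side logic described above: a language model ranks candidate AAC symbols from the user's input history. The class name, training data, and bigram approach are all illustrative assumptions, not the project's actual implementation.

```python
from collections import Counter, defaultdict


class SymbolRecommender:
    """Hypothetical sketch of a server-side model that ranks AAC symbols
    given the user's recent input (names and method are assumptions)."""

    def __init__(self):
        # Bigram counts: previous symbol -> Counter of symbols that followed it.
        self.bigrams = defaultdict(Counter)

    def train(self, sessions):
        """Accumulate bigram counts from past symbol sequences."""
        for seq in sessions:
            for prev, nxt in zip(seq, seq[1:]):
                self.bigrams[prev][nxt] += 1

    def recommend(self, last_symbol, k=3):
        """Return up to k symbols most frequently seen after last_symbol."""
        return [s for s, _ in self.bigrams[last_symbol].most_common(k)]


# Illustrative usage with made-up session data.
rec = SymbolRecommender()
rec.train([
    ["I", "want", "water"],
    ["I", "want", "food"],
    ["I", "want", "water"],
])
print(rec.recommend("want"))  # → ['water', 'food']
```

In a deployment like the one described, the gaze-based client would send the user's selected symbols to the cloud server, which would run a (far richer) model of this shape and return the ranked candidates for display.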