SUITCEYES stands for Smart, User-friendly, Interactive, Tactual, Cognition-Enhancer that Yields Extended Sensosphere.

SUITCEYES proposes a new, intelligent, flexible and expandable mode of haptic communication via soft interfaces. Based on user needs and informed by disability studies, the project combines smart textiles, sensors, semantic technologies, image processing, face and object recognition, machine learning, and gamification. It will address three challenges: perception of the environment; communication and exchange of semantic content; and learning and joyful life experiences. SUITCEYES will extract and map the inner structure of high-dimensional environmental and linguistic cues to low-dimensional spaces, which are then translated into haptic signals. It will also use image processing to map environmental data for enriched semantic reasoning. SUITCEYES’ intelligent haptic interface will help users learn activation patterns via a new medium. With this interface, users will be able to take a more active part in society, improving possibilities for inclusion in social life and employment.
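The mapping from high-dimensional cues to a low-dimensional haptic space can be illustrated with a minimal sketch. All names and numbers here are illustrative assumptions, not the project's actual pipeline: it uses a generic principal-component projection to compress simulated 64-dimensional cue vectors down to three components, one per hypothetical vibration actuator, and quantizes each component into a discrete drive intensity.

```python
import numpy as np

# Illustrative sketch only: SUITCEYES' real pipeline is not specified here.
rng = np.random.default_rng(0)
cues = rng.normal(size=(200, 64))   # 200 samples of 64-D environmental/linguistic features

# Generic dimensionality reduction: principal-component projection via SVD.
centered = cues - cues.mean(axis=0)
_, _, vt = np.linalg.svd(centered, full_matrices=False)
low_dim = centered @ vt[:3].T       # keep 3 components -> 3 hypothetical actuators

# Quantize each component into a haptic drive level in [0, 255].
lo, hi = low_dim.min(axis=0), low_dim.max(axis=0)
intensity = np.round(255 * (low_dim - lo) / (hi - lo)).astype(int)

print(intensity.shape)              # per-sample drive levels for 3 vibration motors
```

Any real system would replace the random data with sensor and language features, and the linear projection with a learned mapping, but the shape of the idea is the same: many input dimensions, few tactile output channels.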

The solution will be developed in a user-centred iterative design process, with frequent evaluations and optimizations. The users’ learning experiences will be enriched through gamification and mediated social interactions. The proposed solution will take into account potential differences in levels of impairment and user capabilities and adapt accordingly.









Video: SUITCEYES Documentary