Using the Pepper Robot to Support Sign Language Communication

Published: arXiv 2509.09889v1
Authors

Giulia Botta, Marco Botta, Cristina Gena, Alessandro Mazzei, Massimo Donini, Alberto Lillo

Abstract

Social robots are increasingly being tested in public and assistive settings, but their accessibility for Deaf users remains largely underexplored. Italian Sign Language (LIS) is a fully-fledged natural language that relies on complex manual and non-manual components. Enabling robots to communicate using LIS could foster more inclusive human-robot interaction, especially in social environments such as hospitals, airports, or educational settings. This study investigates whether a commercial social robot, Pepper, can produce intelligible LIS signs and short signed LIS sentences. With the help of a Deaf student and his interpreter, an expert in LIS, we co-designed and implemented 52 LIS signs on Pepper using either manual animation techniques or a MATLAB-based inverse kinematics solver. We conducted an exploratory user study involving 12 participants proficient in LIS, both Deaf and hearing. Participants completed a questionnaire featuring 15 single-choice video-based sign recognition tasks and 2 open-ended questions on short signed sentences. Results show that the majority of isolated signs were recognized correctly, although full-sentence recognition was significantly lower due to Pepper's limited articulation and temporal constraints. Our findings demonstrate that even commercially available social robots like Pepper can perform a subset of LIS signs intelligibly, offering opportunities for more inclusive interaction design. Future developments should address multi-modal enhancements (e.g., screen-based support or expressive avatars) and involve Deaf users in participatory design to refine robot expressivity and usability.
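
The abstract notes that the 52 signs were implemented either as manual animations or via a MATLAB-based inverse kinematics solver. As a rough, illustrative sketch (not the authors' code), the snippet below shows how a hand-authored sign animation could be replayed on Pepper using the NAOqi Python SDK's ALMotion.angleInterpolation call; the robot address and all keyframe values are invented placeholders, not the paper's sign data.

```python
# Minimal sketch: replaying a hand-authored sign animation on Pepper via the
# NAOqi Python SDK. ALProxy and angleInterpolation are part of the real
# ALMotion API; the IP address and keyframe values below are made-up
# placeholders, not the paper's actual LIS sign data.
from naoqi import ALProxy

PEPPER_IP, PEPPER_PORT = "192.168.1.10", 9559  # assumed robot address

motion = ALProxy("ALMotion", PEPPER_IP, PEPPER_PORT)

# Right-arm joints used for a simple one-handed sign (illustrative keyframes).
joint_names = ["RShoulderPitch", "RShoulderRoll", "RElbowYaw", "RElbowRoll", "RWristYaw"]
keyframes = [
    [0.2, -0.4, 0.2],    # RShoulderPitch angles (rad) at each keyframe
    [-0.3, -0.1, -0.3],  # RShoulderRoll
    [1.2, 1.5, 1.2],     # RElbowYaw
    [0.5, 1.3, 0.5],     # RElbowRoll
    [0.0, -0.5, 0.0],    # RWristYaw
]
times = [[0.8, 1.6, 2.4]] * len(joint_names)  # keyframe timestamps (seconds)

# Blocking call: interpolate through the keyframes using absolute angles.
motion.angleInterpolation(joint_names, keyframes, times, True)
```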

Paper Summary

Problem
The paper addresses the lack of sign language support in social robots, which limits their accessibility for Deaf users. Sign languages are natural languages with their own grammars and vocabularies, yet they are rarely supported by interactive technologies in everyday life. This creates communication barriers and can contribute to social isolation for Deaf individuals.
Key Innovation
This paper investigates whether the commercial Pepper robot can produce intelligible Italian Sign Language (LIS). Working with a Deaf student and an LIS interpreter, the authors co-designed and implemented 52 LIS signs on Pepper, using either manual animation or a MATLAB-based inverse kinematics solver, and evaluated how well LIS-proficient users recognized the resulting signs and short signed sentences (see the sketch below).
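
To give some intuition for what an inverse kinematics solver like the MATLAB-based one mentioned above computes, here is a generic damped-least-squares IK sketch for a planar two-link arm, written in Python purely for illustration. The link lengths, damping factor, and target point are assumptions; the paper's actual solver, and Pepper's real 3D arm kinematics, are more involved.

```python
# Generic damped-least-squares inverse kinematics for a planar 2-link arm.
# Illustrative only: not the paper's solver, and a simplification of Pepper's arm.
import numpy as np

L1, L2 = 0.18, 0.15  # assumed link lengths (m), roughly upper arm / forearm


def forward(q):
    """End-effector (x, y) for joint angles q = [shoulder, elbow]."""
    x = L1 * np.cos(q[0]) + L2 * np.cos(q[0] + q[1])
    y = L1 * np.sin(q[0]) + L2 * np.sin(q[0] + q[1])
    return np.array([x, y])


def jacobian(q):
    """2x2 Jacobian of the planar arm's end-effector position."""
    s1, c1 = np.sin(q[0]), np.cos(q[0])
    s12, c12 = np.sin(q[0] + q[1]), np.cos(q[0] + q[1])
    return np.array([[-L1 * s1 - L2 * s12, -L2 * s12],
                     [ L1 * c1 + L2 * c12,  L2 * c12]])


def solve_ik(target, q0, iters=200, damping=0.05):
    """Iterate q += J^T (J J^T + lambda^2 I)^-1 * error until the hand reaches target."""
    q = np.array(q0, dtype=float)
    for _ in range(iters):
        err = target - forward(q)
        if np.linalg.norm(err) < 1e-4:
            break
        J = jacobian(q)
        dq = J.T @ np.linalg.solve(J @ J.T + damping ** 2 * np.eye(2), err)
        q += dq
    return q


# Example: find joint angles that place the "hand" at a reachable point.
q_solution = solve_ik(np.array([0.20, 0.15]), q0=[0.3, 0.3])
print(q_solution, forward(q_solution))
```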
Practical Impact
This research has the potential to improve communication and social interaction between Deaf individuals and the broader community. A robot that can produce sign language could help Deaf users access information and services in public settings such as hospitals, airports, and schools, supporting greater inclusion in education, employment, and other areas of life.
Analogy / Intuitive Explanation
Imagine an assistant in a public space that can address you with its hands rather than only with its voice. The Pepper robot acts a bit like a signing front-desk attendant: it has been taught to reproduce LIS gestures so that Deaf users can receive information in a language that is natural and intuitive for them, instead of relying solely on written or spoken Italian.
Paper Information
Categories:
cs.RO, cs.HC
Published Date:

arXiv ID:
2509.09889v1
