Socially Assistive Robots and Embodied Conversational Agents for children with Autism Spectrum Disorders and Language Disorders to rehabilitate their linguistic skills

In recent years, researchers and therapists have identified a number of critical issues in current speech-language interventions. Several studies have explored the use of technology to overcome these barriers and to support speech therapy in children with language impairments (e.g., DLD and ASD). We propose a conceptual framework for designing linguistic activities (for both assessment and training), grounded in advances in psycholinguistics. Starting from this theoretical framework, we identified a development process – from UX design to the coding of activities – based on a novel set of Design Patterns at multiple layers of abstraction. We then put this framework into practice by implementing these patterns in two technological solutions – tablets and robots – and performing an empirical study to evaluate their benefits.
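As an illustration of what an activity-level Design Pattern might look like in code, the sketch below models a single linguistic exercise as a reusable template that can be rendered by either a tablet UI or a robot. All class and method names here are hypothetical and are not taken from the actual implementation.

```python
from dataclasses import dataclass, field
from typing import List, Protocol


class Presenter(Protocol):
    """Abstract output channel: a tablet UI or a robot could implement this."""
    def say(self, text: str) -> None: ...
    def show(self, image: str) -> None: ...


@dataclass
class LinguisticActivity:
    """Hypothetical activity-level pattern: one exercise template
    (prompt + stimuli + expected answers) reusable across platforms."""
    prompt: str
    stimuli: List[str]
    expected: List[str]
    log: List[dict] = field(default_factory=list)

    def run(self, presenter: Presenter, answer_provider) -> bool:
        presenter.say(self.prompt)
        for stimulus in self.stimuli:
            presenter.show(stimulus)
        answer = answer_provider()  # e.g., touch input or speech recognition
        correct = answer.strip().lower() in self.expected
        self.log.append({"answer": answer, "correct": correct})
        return correct


class ConsolePresenter:
    """Minimal stand-in presenter used only for this example."""
    def say(self, text: str) -> None:
        print(f"[say] {text}")

    def show(self, image: str) -> None:
        print(f"[show] {image}")


if __name__ == "__main__":
    activity = LinguisticActivity(
        prompt="Which picture shows the dog chasing the cat?",
        stimuli=["dog_chasing_cat.png", "cat_chasing_dog.png"],
        expected=["first", "dog chasing cat"],
    )
    activity.run(ConsolePresenter(), answer_provider=lambda: "first")
```

The same `LinguisticActivity` instance could be driven by a robot-backed presenter instead of the console one, which is the sense in which the pattern separates the activity's linguistic content from the platform that delivers it.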
Modeling Empathy for Socially Assistive Robots

A SAR system should have the capacity to put itself in the emotional position of a user (or of another agent) and to behave with that emotional understanding in mind. In other words, a SAR should act as an «empathetic agent» not only as a target of empathy (triggering empathy in the user) but also as an observer (empathizing with the user). This research investigates the role of the SAR as a target of empathy and evaluates the influence of narrative choice (first-person versus third-person narrative voice) on the user's empathy. With this purpose in mind, we conducted an empirical study to examine and compare user behavior during stories told by a SAR in the first- or third-person narrative voice. We also collected video and audio data for modeling empathic behaviors during storytelling interactions.
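As a minimal sketch of how such a between-condition comparison could be analyzed, the snippet below contrasts self-reported empathy scores from two hypothetical groups (first-person vs. third-person narration) with an independent-samples t-test. The scores and group sizes are invented for illustration and do not come from the study.

```python
import numpy as np
from scipy import stats

# Hypothetical self-reported empathy scores (e.g., on a 1-7 scale),
# one group per narrative-voice condition; not real study data.
first_person = np.array([5.2, 6.1, 4.8, 5.9, 6.3, 5.5, 4.9, 6.0])
third_person = np.array([4.1, 5.0, 4.6, 4.3, 5.2, 4.8, 4.0, 4.7])

# Independent-samples t-test (Welch's variant, not assuming equal variances)
t_stat, p_value = stats.ttest_ind(first_person, third_person, equal_var=False)

print(f"first-person mean = {first_person.mean():.2f}")
print(f"third-person mean = {third_person.mean():.2f}")
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```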
HARMONI for composing social interaction with Socially Assistive Robots
The research and development of socially interactive robots is a complex challenge because of the wide variety of capabilities needed for effective social human-robot interaction (HRI). Many of these capabilities, including perception, dialog, and control, have state-of-the-art methods and solutions, but combining them into a comprehensive and seamless interaction is still an open challenge. We describe HARMONI, a multi-modal, open-source tool for rapid social HRI development and deployment. HARMONI is centered around a ROS package for interaction development, including decision management and node orchestration. HARMONI systematically integrates the disparate functionalities needed to conduct a meaningful social human-robot interaction, such as external cloud services, AI models, and modules for sensing, planning, and acting, on a variety of platforms. HARMONI was applied to the QT robot platform, and usability tests were conducted to evaluate the ease and speed of development and deployment.
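To make the idea of ROS-based node orchestration more concrete, the sketch below shows a minimal rospy node that subscribes to a perception topic and publishes an action command based on a simple decision rule. The topic names, message types, and decision logic are placeholders chosen for illustration and are not HARMONI's actual API.

```python
#!/usr/bin/env python
# Minimal illustration of ROS-style node orchestration for a social interaction:
# a "decision manager" subscribes to a perception topic and publishes an action.
# Topic names and the decision rule are placeholders, not HARMONI's interface.
import rospy
from std_msgs.msg import String


class SimpleDecisionManager:
    def __init__(self):
        # Publisher toward an acting module (e.g., a speech or gesture controller).
        self.action_pub = rospy.Publisher("/interaction/action", String, queue_size=10)
        # Subscriber to a sensing module (e.g., a face-detection or speech node).
        rospy.Subscriber("/interaction/perception", String, self.on_perception)

    def on_perception(self, msg):
        # Trivial decision rule: greet when a person is detected, otherwise idle.
        if msg.data == "person_detected":
            self.action_pub.publish(String(data="say:Hello, nice to meet you!"))
        else:
            self.action_pub.publish(String(data="idle"))


if __name__ == "__main__":
    rospy.init_node("simple_decision_manager")
    SimpleDecisionManager()
    rospy.spin()  # keep the node alive, processing perception callbacks
```

In a full system, each capability (sensing, planning, acting, external services) would run as its own node or service, and a decision manager of this kind would coordinate them over topics and actions; the sketch only shows the shape of that coordination, not its actual implementation in HARMONI.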