Interpretación de expresiones faciales en adultos mayores utilizando la visión artificial del robot humanoide NAO

Main Author: Calvopiña Iglesias, Francisco Rafael
Other Authors: Valladares Romero, Pedro Esteban
Format: bachelorThesis
Language: spa
Published: 2017
Subjects:
Online Access: http://dspace.ups.edu.ec/handle/123456789/14020
Summary: The project covers the development of a human-robot interaction application between an older adult and the humanoid robot NAO from the laboratories of the Salesian Polytechnic University. Its goal is to recognize the older adult's facial expressions and, based on them, schedule activities consisting of movement routines and basic dialogues. The application starts with a graphical interface developed in wxPython in which a face is registered and personal data are entered; it then continues with facial expression recognition and activity classification, both developed in Python. To design the activities, the needs of the elderly were investigated and the therapies that could be carried out with the robot were analyzed, such as reality-orientation, reminiscence and play therapies, which were implemented with artificial vision and voice recognition libraries. The activities developed in Python are songs, stories, exercises, phrases and jokes, while Choregraphe was also used for the figure-recognition, movement and dance activities. At the end of each session, the robot NAO sends the personal data, the facial expressions detected and the activities carried out with the older adult by e-mail to the older adult's caregiver, and reports them locally when the robot's head is touched.
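
The abstract names only generic "artificial vision libraries" for the expression-recognition step. A minimal sketch of that stage, assuming OpenCV (the opencv-python package) with a Haar cascade for face detection; the expression classifier `classify_expression` is a hypothetical placeholder, not the thesis' actual model:

```python
# Sketch of the face-detection stage, assuming OpenCV (cv2).
# classify_expression is a hypothetical placeholder for the trained
# expression classifier described in the thesis.
import cv2

detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def detect_faces(frame):
    """Return bounding boxes (x, y, w, h) of faces found in a BGR frame."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    return detector.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5)

def classify_expression(face_roi):
    # Placeholder: a trained model would map the cropped face to a label
    # such as "happy", "sad" or "neutral".
    return "neutral"

capture = cv2.VideoCapture(0)   # local webcam; NAO's camera would be accessed via NAOqi
ok, frame = capture.read()
if ok:
    for (x, y, w, h) in detect_faces(frame):
        label = classify_expression(frame[y:y + h, x:x + w])
        print("Detected face at (%d, %d, %d, %d): %s" % (x, y, w, h, label))
capture.release()
```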
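The Python-side activities (songs, stories, exercises, phrases, jokes) and the head-touch report trigger would run through the NAOqi Python SDK. A minimal sketch assuming the `naoqi` module, with a placeholder robot IP address, showing speech output and polling of the front head tactile sensor:

```python
# Sketch of driving an activity on NAO through the NAOqi Python SDK.
# ROBOT_IP is a placeholder; 9559 is NAOqi's default port.
import time
from naoqi import ALProxy

ROBOT_IP = "192.168.1.10"   # placeholder address of the NAO robot
PORT = 9559

tts = ALProxy("ALTextToSpeech", ROBOT_IP, PORT)
memory = ALProxy("ALMemory", ROBOT_IP, PORT)

def tell_joke():
    # One of the Python-side activities mentioned in the abstract.
    tts.say("Aqui va un chiste para alegrar el dia.")

def wait_for_head_touch(timeout=30.0):
    """Poll the front head tactile sensor; return True if it was touched."""
    deadline = time.time() + timeout
    while time.time() < deadline:
        if memory.getData("FrontTactilTouched"):
            return True
        time.sleep(0.1)
    return False

tell_joke()
if wait_for_head_touch():
    tts.say("Sesion terminada. Enviando el reporte.")
```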
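For the end-of-session report sent to the caregiver, a minimal sketch assuming Python's standard smtplib and email modules; the addresses, SMTP server and credentials are placeholders, not values from the thesis:

```python
# Sketch of e-mailing the session report; all addresses, the server and
# the credentials are placeholders.
import smtplib
from email.mime.text import MIMEText

def send_session_report(expressions, activities, caregiver_address):
    body = ("Expresiones detectadas: %s\n"
            "Actividades realizadas: %s\n" % (", ".join(expressions),
                                              ", ".join(activities)))
    message = MIMEText(body, "plain", "utf-8")
    message["Subject"] = "Reporte de sesion con el robot NAO"
    message["From"] = "nao.reportes@example.com"        # placeholder sender
    message["To"] = caregiver_address

    server = smtplib.SMTP_SSL("smtp.example.com", 465)  # placeholder server
    server.login("nao.reportes@example.com", "password")  # placeholder credentials
    server.sendmail(message["From"], [caregiver_address], message.as_string())
    server.quit()

send_session_report(["feliz", "neutral"], ["cancion", "chiste"],
                    "cuidador@example.com")
```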