Geometry and handwriting rely heavily on the visual representation of basic shapes. Students with visual impairments can find it challenging to perceive these shapes and to understand complex spatial constructs. For instance, knowing how to draw …
We presented our work on the design of tangible devices for classroom orchestration.
I gave a short talk at the HRI mini-symposium on the orchestration of robot-populated classrooms, recent work started with Sina Shahmoradi, Jauwairia Nasir, and Utku Norman.
Advancing intuitive human-machine interaction with human-like social capabilities for education in schools
Tangible Haptic-Enabled Swarm for Learners
Context: Visuo-motor coordination problems can impair children's academic achievement and everyday life. Gross visuo-motor skills, in particular, are required in a range of social and educational activities that contribute to children's physical and cognitive development, such as playing a musical instrument, practicing ball-based sports, or dancing. Children with visuo-motor coordination difficulties are typically diagnosed with developmental coordination disorder or cerebral palsy and need to undergo physical therapy. The therapy sessions are often conducted individually and are not engaging for children.
Context: Online learning presents several advantages: it lowers costs, allows more flexibility, and provides access to remote training resources. However, studies have found that it also limits communication between peers and teachers, restricts physical interaction, and requires a significant commitment from students to plan their learning and remain diligent.
Goals & Milestones: In this project, we aim to design and test a novel way to engage students in collaborative online learning using haptic-enabled tangible robots.
Context: Natural language is an important part of communication, since it offers an intuitive and efficient way of conveying ideas to another individual. Enabling robots to use language efficiently is essential for human-robot collaboration. In this project, we aim to develop an interface between a dialogue manager (i.e., DialogFlow) and ROS (Robot Operating System). This will let us use powerful dialogue systems in human-robot interaction scenarios.
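The bridge described above could be sketched as follows. This is a minimal illustration only: a stubbed intent-detection client stands in for DialogFlow, and plain Python callbacks stand in for ROS topics. A real implementation would use `rospy` subscribers/publishers and DialogFlow's `detect_intent` session API; all class, topic, and intent names below are hypothetical.

```python
# Minimal sketch of a dialogue-manager <-> robot-middleware bridge.
# StubDialogClient stands in for a DialogFlow session; the callbacks
# stand in for ROS topic pub/sub. All names are illustrative.

from dataclasses import dataclass
from typing import Callable, Dict, List


@dataclass
class IntentResult:
    intent: str            # matched intent name
    fulfillment_text: str  # reply the robot should speak


class StubDialogClient:
    """Stands in for a DialogFlow session: maps utterances to intents."""

    def __init__(self) -> None:
        self._rules: Dict[str, IntentResult] = {}

    def add_intent(self, phrase: str, intent: str, reply: str) -> None:
        self._rules[phrase.lower()] = IntentResult(intent, reply)

    def detect_intent(self, text: str) -> IntentResult:
        # Unmatched utterances fall through to a fallback intent,
        # mirroring DialogFlow's default fallback behavior.
        return self._rules.get(
            text.lower(), IntentResult("fallback", "Sorry, I didn't get that.")
        )


class DialogRosBridge:
    """Routes recognized speech to the dialogue manager and publishes replies."""

    def __init__(self, client: StubDialogClient) -> None:
        self._client = client
        self._reply_callbacks: List[Callable[[IntentResult], None]] = []

    def on_reply(self, callback: Callable[[IntentResult], None]) -> None:
        # In ROS this would be a Publisher on e.g. /robot/speech_out.
        self._reply_callbacks.append(callback)

    def speech_callback(self, utterance: str) -> None:
        # In ROS this would be the Subscriber callback on /robot/speech_in.
        result = self._client.detect_intent(utterance)
        for cb in self._reply_callbacks:
            cb(result)


if __name__ == "__main__":
    client = StubDialogClient()
    client.add_intent("hello robot", "greeting", "Hello! Ready to help.")
    bridge = DialogRosBridge(client)
    bridge.on_reply(lambda r: print(r.fulfillment_text))
    bridge.speech_callback("hello robot")  # prints "Hello! Ready to help."
```

The design keeps the dialogue manager behind a single `detect_intent` call, so swapping the stub for a real DialogFlow session client would not change the bridge's ROS-facing logic.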