CSE Honours Projects

Find Your Voice: Use of Voice Assistant for Learning

Context Many children struggle to find their voice in social situations. They may be shy, suffer from social anxiety, be new to a culture (migrants) or have impairments in communication due to atypical development (e.g., autism spectrum disorder, speech or hearing disorders). The voices of these children often go unheard, as they find it hard to contribute to a conversation. The Find Your Voice (FyV, http://wafa.johal.org/project/fyv/) project was initiated to investigate how joke-telling could help children to speak up and gain confidence.

Human Action Recognition from AD Movies

Context Action recognition is crucial for robots that operate around humans. Indeed, robots need to assess human actions and intentions in order to assist people in everyday life tasks and to collaborate with them efficiently. The field of action recognition has aimed to use the typical sensors found on robots to recognise agents, objects and the actions being performed. A typical approach is to record a dataset of various actions and label them, but these actions are often not natural, and it can be difficult to capture the variety of ways an action can be performed with a lab-built dataset.
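As a rough illustration of the labelled-dataset approach mentioned above, the sketch below trains a simple classifier on fixed-length pose windows; the data, window size and class labels are placeholders, not part of the project.

```python
# Minimal sketch: classify short pose sequences into action labels.
# Assumes each sample is a fixed-length window of joint positions
# (e.g. extracted with an off-the-shelf pose estimator); all names are illustrative.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Placeholder data: 200 windows of 30 frames x 17 joints x 3 coordinates,
# with 4 action classes (e.g. wave, pick up, sit down, walk).
X = rng.normal(size=(200, 30, 17, 3))
y = rng.integers(0, 4, size=200)

# Simple feature: flatten each window into one vector.
X_flat = X.reshape(len(X), -1)

X_train, X_test, y_train, y_test = train_test_split(
    X_flat, y, test_size=0.25, random_state=0, stratify=y)

clf = SVC(kernel="rbf").fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```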

I Spy - An AR Game with a Robot in a Home Environment

Context In the context of the [HURL project](https://wafa.johal.org/project/hurl/), we investigate various ways to enable novice users to train robots to perform everyday life tasks. In particular, we employ gamification and Augmented Reality (AR) to make the training more engaging for the human trainer. In this project, your goal will be to implement an “I spy” game to be played with a robot in a home environment. For this game, we would like to use AR with the HoloLens 2.
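For concreteness, one possible shape for the core game logic is sketched below, independent of the HoloLens/Unity front end; object detection and user input are stubbed, and all names are hypothetical.

```python
# Sketch of the core "I spy" loop, decoupled from the AR front end.
# Detected objects and the user's guess would come from the robot's
# perception stack and the HoloLens interface; here they are stubbed.
import random
from dataclasses import dataclass

@dataclass
class SceneObject:
    name: str
    colour: str

def play_round(objects, get_guess, give_clue, max_guesses=3):
    """Robot picks an object, gives a colour clue, and checks the user's guesses."""
    target = random.choice(objects)
    give_clue(f"I spy, with my little eye, something {target.colour}.")
    for _ in range(max_guesses):
        guess = get_guess()           # e.g. name of the object the user points at
        if guess == target.name:
            give_clue("Yes! You found it.")
            return True
        give_clue("No, that's not it. Try again!")
    give_clue(f"It was the {target.name}.")
    return False

if __name__ == "__main__":
    scene = [SceneObject("mug", "red"), SceneObject("book", "blue"),
             SceneObject("plant", "green")]
    play_round(scene, get_guess=lambda: input("Your guess: ").strip(),
               give_clue=print)
```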

Machine Learning for Social Interaction Modelling

Context The field of social human-robot interaction is growing. Understanding how communication between humans (human-human) generalises to human-robot communication is crucial for building fluent and enjoyable interactions with social robots. Every day, more datasets featuring social interaction between humans and between humans and robots are made freely available online. In this project, we propose to take a data-driven approach to build predictive models of human-human (HH) and human-robot (HR) social interactions using three different datasets.
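As a minimal illustration of the intended data-driven approach, the sketch below fits a simple classifier to predict a social cue (here, who speaks next) from windowed interaction features; the features and labels are synthetic placeholders rather than data from the actual datasets.

```python
# Sketch: a data-driven predictive model of a social cue (e.g. "who speaks next")
# from windowed interaction features (gaze, speech activity, head pose).
# Real features would come from the HH/HR datasets; placeholders are used here.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)

# 500 windows x 20 features (hypothetical per-window statistics of both partners).
features = rng.normal(size=(500, 20))
next_speaker = rng.integers(0, 2, size=500)   # 0 = human, 1 = partner/robot

model = LogisticRegression(max_iter=1000)
scores = cross_val_score(model, features, next_speaker, cv=5)
print("cross-validated accuracy:", scores.mean())
```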

Persuasive Robots - Exploring Behavioural Styles

Context Social robots are foreseen to be encountered in our everyday life, playing roles such as assistant or companion (to mention a few). Recent studies have shown the potential harmful impacts of overtrust in social robotics [1], as robots may collect sensitive information without the user’s knowledge. Behavioural styles allow robots to express themselves differently within the same context. Given a specific gesture, keyframe manipulation can be used to generate style-based variations of that gesture.
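As a rough sketch of keyframe manipulation, the example below scales the amplitude and timing of a keyframed gesture to produce “energetic” and “calm” variants; the gesture, joints and style parameters are illustrative only.

```python
# Sketch: style-based variation of a gesture by manipulating its keyframes.
# A gesture is a list of (time, joint-angle vector) keyframes; scaling the
# amplitude around a neutral pose and stretching the timing gives simple
# "style" variants. Joint values and style parameters are illustrative.
import numpy as np

def restyle(keyframes, amplitude=1.0, speed=1.0, neutral=None):
    """Return a new keyframe list with scaled amplitude and playback speed."""
    times = np.array([t for t, _ in keyframes], dtype=float)
    poses = np.array([q for _, q in keyframes], dtype=float)
    if neutral is None:
        neutral = poses[0]                      # treat the first pose as neutral
    new_poses = neutral + amplitude * (poses - neutral)
    new_times = times / speed                   # speed > 1 => faster gesture
    return list(zip(new_times, new_poses))

# A hypothetical 3-joint waving gesture.
wave = [(0.0, np.array([0.0, 0.0, 0.0])),
        (0.5, np.array([0.8, 0.2, -0.3])),
        (1.0, np.array([0.0, 0.0, 0.0]))]

energetic = restyle(wave, amplitude=1.5, speed=1.5)   # larger and faster
calm = restyle(wave, amplitude=0.6, speed=0.7)        # smaller and slower
print(energetic[1][1], calm[1][1])
```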

Robot Writing

Context Handwriting is one of the most important motor skills we learn as humans. Children with handwriting difficulties can find their academic performance impacted and often succeed less at school. In this project, we propose to explore new methods for a robot to write using a pen. The project will use the Fetch Robot and aims to evaluate how a robot could learn handwriting trajectories from human demonstrations. Goals & Milestones During this project, the student will:

Tangible e-Ink Paper Interfaces for Learners

Context While digital tools are increasingly used in classrooms, teachers’ common practice remains to use photocopied paper documents to share and collect learning exercises from their students. With the Tangible e-Ink Paper (TIP) system, we aim to explore the use of tangible manipulatives interacting with paper sheets as a bridge between digital and paper traces of learning. Featuring an e-Ink display, a paper-based localisation system and a wireless connection, TIPs are envisioned as a versatile tool across various curriculum activities.
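Purely as an illustration of how the listed components might fit together, the sketch below shows one possible status message a TIP device could send over its wireless link; all field names are hypothetical and do not reflect the actual TIP protocol.

```python
# Sketch: the kind of status message a TIP device might send to a companion
# app over its wireless link. All field names are hypothetical; the real
# protocol would depend on the paper-based localisation system used.
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class TipStatus:
    device_id: str
    sheet_id: str        # identifier of the worksheet the TIP is placed on
    x_mm: float          # position on the sheet, from the localisation system
    y_mm: float
    screen_content: str  # what the e-Ink display is currently showing
    timestamp: float

def encode(status: TipStatus) -> bytes:
    """Serialise a status update for transmission over the wireless link."""
    return json.dumps(asdict(status)).encode("utf-8")

msg = TipStatus("tip-01", "worksheet-fractions-3", 42.0, 118.5,
                "3/4 + 1/8 = ?", time.time())
print(encode(msg))
```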

Tangible Human Swarm Interaction

Context: Visuo-motor coordination problems can impair children in their academic achievements and in their everyday life. Gross visuo-motor skills, in particular, are required in a range of social and educational activities that contribute to children’s physical and cognitive development, such as playing a musical instrument, ball-based sports or dancing. Children with visuo-motor coordination difficulties are typically diagnosed with developmental coordination disorder or cerebral palsy and need to undergo physical therapy. The therapy sessions are often not engaging for children and are conducted individually.

Tangible Robots for Collaborative Online Learning

Context: Online learning presents several advantages: it decreases costs, allows more flexibility and gives access to remote training resources. However, studies have found that it also limits communication between peers and teachers, limits physical interaction, and requires a significant commitment from students to plan and stay assiduous in their learning. Goals & Milestones In this project, we aim to design and test a novel way to engage students in collaborative online learning by using haptic-enabled tangible robots.
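One common way to couple remote haptic devices is a virtual spring-damper between their positions; the sketch below illustrates that idea in simulation and is only an assumption about how the tangible robots might be linked, not the project’s actual design.

```python
# Sketch: keeping two remote haptic tangibles in sync with a virtual
# spring-damper coupling, so each learner feels the other's movements.
# Gains and the update loop are illustrative; real devices would exchange
# positions over the network and render the force through their motors.
import numpy as np

K, D = 40.0, 4.0          # spring and damping gains (illustrative)

def coupling_force(local_pos, local_vel, remote_pos, remote_vel):
    """Force applied to the local tangible to pull it toward the remote one."""
    return K * (remote_pos - local_pos) + D * (remote_vel - local_vel)

# Toy simulation: tangible A is moved by a user, tangible B follows.
dt, mass = 0.01, 0.1
pos_b = np.zeros(2)
vel_b = np.zeros(2)
for step in range(300):
    t = step * dt
    pos_a = np.array([np.sin(t), 0.5 * np.cos(t)])   # user-driven trajectory
    vel_a = np.array([np.cos(t), -0.5 * np.sin(t)])
    f = coupling_force(pos_b, vel_b, pos_a, vel_a)
    vel_b += (f / mass) * dt                          # semi-implicit Euler step
    pos_b += vel_b * dt
print("final tracking error:", np.linalg.norm(pos_a - pos_b))
```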

TeleOp Game - Explore Gamification of Robot Teleoperation

Context: Service robots are becoming more common in households. Tasks that they may be assigned include vacuuming the floor or grabbing objects from the top shelf. Augmented reality (AR) is one of the many interfaces through which the robot and the human user may communicate. During teleoperation or robot training, human disengagement can be an issue, as the tasks demonstrated by the human are often repetitive. As a preliminary step, we aim to explore how gamification can affect human engagement in an AR game setting where the objective of the game is to complete a common household task (such as clearing the table).
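As a starting point for the gamification idea, the sketch below shows a simple scoring scheme an AR overlay could display during such a task; the events and point values are invented for illustration.

```python
# Sketch: a simple scoring scheme that an AR overlay could display during a
# teleoperated "clear the table" task. Events (object placed, object dropped)
# would come from the robot/AR system; the point values here are arbitrary.
import time

class TeleopGame:
    def __init__(self):
        self.score = 0
        self.streak = 0
        self.start = time.monotonic()

    def object_placed(self):
        """Reward a successful placement; consecutive successes build a streak bonus."""
        self.streak += 1
        self.score += 10 + 5 * (self.streak - 1)

    def object_dropped(self):
        """Penalise a drop and reset the streak."""
        self.streak = 0
        self.score = max(0, self.score - 5)

    def finish(self, n_objects):
        """Add a time bonus when the table is cleared."""
        elapsed = time.monotonic() - self.start
        self.score += max(0, int(60 * n_objects - elapsed))
        return self.score

game = TeleopGame()
for event in ["placed", "placed", "dropped", "placed"]:
    game.object_placed() if event == "placed" else game.object_dropped()
print("final score:", game.finish(n_objects=3))
```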