Research Theme: Robotics

SARAS - Smart Autonomous Robotic Assistant Surgeon

The goal of SARAS is to develop a next-generation surgical robotic platform that allows a single surgeon (i.e., without the need for an expert assistant surgeon) to execute robotic minimally invasive surgery (R-MIS), thereby increasing the social and economic efficiency of a hospital while guaranteeing the same level of safety for patients. This platform is called a solo-surgeon system.

Road Event Detection for Autonomous Driving

Autonomous cars need to be able to understand complex, human-centred environments. In particular, this requires recognising significant events taking place in the road scene in real time. We are annotating a significant fraction of the Oxford RobotCar Dataset with action and activity labels, together with bounding boxes indicating where each action/event takes place in the image plane.
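To make the labelling idea concrete, the sketch below shows one possible way of representing a single annotated road event in Python. The field names and label vocabulary (agent, action, activity, bounding box) are illustrative assumptions for this example and do not describe the project's actual annotation schema.

from dataclasses import dataclass, field
from typing import List, Tuple

# Hypothetical annotation record for one road event; names and labels
# are illustrative only, not the schema used for the RobotCar annotations.
@dataclass
class RoadEventAnnotation:
    agent: str        # who performs the action, e.g. "pedestrian", "cyclist"
    action: str       # atomic action label, e.g. "crossing road"
    activity: str     # higher-level activity label, e.g. "walking"
    bbox: Tuple[float, float, float, float]  # (x_min, y_min, x_max, y_max) in image-plane pixels

@dataclass
class AnnotatedFrame:
    frame_id: int
    events: List[RoadEventAnnotation] = field(default_factory=list)

# Example: a pedestrian crossing in front of the vehicle in frame 1042.
frame = AnnotatedFrame(frame_id=1042)
frame.events.append(
    RoadEventAnnotation(
        agent="pedestrian",
        action="crossing road",
        activity="walking",
        bbox=(412.0, 188.0, 467.0, 310.0),
    )
)
print(frame)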

Artificial Intelligence for Autonomous Driving

The project concerns the design and development of novel ways for robots and autonomous machines to interact with humans in a variety of emerging scenarios, including human-robot interaction, autonomous driving, and personal (virtual or robotic) assistants. In particular, we believe that novel, disruptive applications of AI require much more sophisticated forms of communication between humans and machines, going far beyond the conventional, explicit, linguistic exchange of information towards implicit, non-verbal communication and mutual understanding of each other's behaviour.

AVATAR - Towards emotional robot surrogates

In partnership with the Cognitive Robotics group and the University of Malta, we are working towards designing 'emotional' robotic avatars able both to enhance the experience of remote personal presence and to provide realistic body surrogates for people with disabilities. The emotional state of an individual is detected and recognised in real time in their natural environment, and communicated through appropriate facial expressions and bodily gestures enacted by the robot's head, face, limbs and torso.
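As a purely illustrative sketch of the final step, the snippet below maps a recognised emotion label to a set of expression and gesture parameters for the avatar. The emotion labels, parameter names and values are assumptions made for this example, not the interface of the actual AVATAR platform.

from dataclasses import dataclass
from typing import Dict

# Hypothetical expression/gesture targets for the avatar; all fields
# and values are illustrative assumptions only.
@dataclass
class AvatarExpression:
    eyebrow_raise: float   # 0.0 (neutral) .. 1.0 (fully raised)
    mouth_curve: float     # -1.0 (frown) .. 1.0 (smile)
    head_tilt_deg: float   # head tilt angle in degrees
    arm_openness: float    # 0.0 (arms closed) .. 1.0 (arms wide open)

EXPRESSION_MAP: Dict[str, AvatarExpression] = {
    "happy":     AvatarExpression(0.4, 0.9, 5.0, 0.8),
    "sad":       AvatarExpression(0.1, -0.7, -10.0, 0.2),
    "surprised": AvatarExpression(1.0, 0.3, 0.0, 0.6),
    "neutral":   AvatarExpression(0.0, 0.0, 0.0, 0.4),
}

def expression_for(emotion: str) -> AvatarExpression:
    # Fall back to a neutral pose for unknown emotion labels.
    return EXPRESSION_MAP.get(emotion, EXPRESSION_MAP["neutral"])

# Example: the (hypothetical) recogniser reports "happy"; a controller
# would then drive the head, face and limbs towards these targets.
print(expression_for("happy"))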