Theory of Mind in Human-Robot Interactions

RoboRabbitLabs

Challenges and opportunities that human perception presents for robot design

This is an essay that I wrote for a course in 2015.

Navigating a social world with robot partners: A quantitative cartography of the Uncanny Valley

Introduction

In this essay I will describe some aspects of human perception and cognition that have an impact on human-robot interaction. I will discuss the findings on the Uncanny Valley as described in the provided article, Navigating a social world with robot partners: A quantitative cartography of the Uncanny Valley (Mathur and Reichling, 2015), and the parallels we might draw to other aspects of human-robot interaction. I will then consider how theory of mind applies to the Uncanny Valley concept. Finally, I will discuss the possibility of robots exploiting human cognitive and perceptual limitations and biases to pursue their goals more effectively.

Perception

Perception is the…


Robots spotted in the wild: BellaBot by Pudu

RoboRabbitLabs

In hindsight, I regret not blogging more about the robots I’ve discovered on my travels and in work situations. That is where my joy and enthusiasm for robots really gets fuelled: when I interact with them spontaneously. Recently, with friends at our favourite Asian restaurant (Mchi in Ijburg; FYI, the food was great and every plate was left empty!), we encountered some BellaBots working as assistants to the restaurant staff. I was with my buddy Vikram Radhakrishnan, who is also crazy about robots, and my partner Renze de Vries, who has made quite a few robots himself (check out his YouTube channel here). We worked on the Anki Vector project together back in 2019; time flies when you’re in lockdown. Imagine our excitement at finding these amazing robots serving dinner to patrons as if it were the most natural thing in the world! That really made…


Emotions in Robots – Prof Hatice Gunes for the Royal Institution

RoboRabbitLabs

I recently subscribed to the Royal Institution lecture series, and when I am able to catch some of the lectures, the content and moderation are always incredibly good. You can watch the Royal Institution’s science videos here on YouTube.

Go to their website to watch some of these talks live.

Recently, I watched an excellent lecture on Creating Emotionally Intelligent Technology by Professor Hatice Gunes, leader of the Affective Intelligence & Robotics Lab at the University of Cambridge.

Here is a link to the recording if you’d like to watch the video yourself: Vimeo video link

To start off, Prof Gunes does an amazing job of introducing emotions in technology, the work done so far, and why we would want to achieve this goal. She then covers the work her own lab has been doing to take the field further, which is quite interesting.
