Ohbot and Microsoft Cognitive Services
We took two Ohbots along to an evening event at the Manchester Museum of Science and Industry. One was controlled by me, typing as fast as I could from a laptop behind a screen. The other was hooked up to Microsoft Cognitive Services.
Ohbot is a robot head that's most often used to teach children to code through a "Scratch"-like block-programming interface. At the Museum event we programmed our Ohbots to speak and move in response to what they were "seeing" through their cameras.
For the manual Ohbot, the eye and head movements were controlled using the motion sensors on a micro:bit, which was attached to my knee with duct tape. For the autonomous Ohbot, the eyes were programmed to follow the movements of the visitor, and the head followed the movement of the eyes.
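One way the gaze-following could work is to map the position of a detected face in the camera frame to eye positions. The sketch below assumes the face detector returns a pixel bounding box; the `map_face_to_gaze` function, the 640×480 frame size and the 0-10 servo scale are illustrative assumptions, not the actual Ohbot API.

```python
# Sketch: mapping a detected face's position to eye turn/tilt.
# Frame size and 0-10 servo scale are assumed, not Ohbot's real values.

FRAME_W, FRAME_H = 640, 480  # assumed camera resolution

def map_face_to_gaze(box):
    """Map a face bounding box (left, top, width, height) in pixels
    to (turn, tilt) eye positions on a 0-10 scale, 5 being centred."""
    cx = box[0] + box[2] / 2          # face centre, x
    cy = box[1] + box[3] / 2          # face centre, y
    turn = round(cx / FRAME_W * 10, 1)        # left-right
    tilt = round((1 - cy / FRAME_H) * 10, 1)  # up-down (image y grows downward)
    return turn, tilt

# A face centred in the frame gives a centred gaze.
print(map_face_to_gaze((270, 190, 100, 100)))  # → (5.0, 5.0)
```

In a real loop, these positions would be sent to the eye servos each time a new frame is analysed, with the head then easing toward wherever the eyes are pointing.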
Both Ohbots had lip synchronisation code running. The manual robot was saying what I was typing, while the autonomous one selected phrases from a database, matched to each visitor's detected age, gender, emotion, hair colour, makeup and facial hair.
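The attribute-matched phrase selection could look something like the sketch below. It assumes the face analysis returns a dictionary of attributes, loosely in the style of Cognitive Services' face-attribute results; the phrase database and the scoring rule are illustrative, not Ohbot's actual code.

```python
# Sketch: picking a phrase that best matches the detected face attributes.
# The phrase list and attribute names are hypothetical examples.

PHRASES = [
    ({"emotion": "happiness"}, "You look cheerful today!"),
    ({"facialHair": True}, "That is a splendid beard."),
    ({"emotion": "neutral"}, "Hello there, human."),
]

def pick_phrase(attributes):
    """Return the phrase whose conditions match the most detected attributes."""
    best, best_score = "Hello!", 0  # fallback when nothing matches
    for conditions, phrase in PHRASES:
        score = sum(1 for k, v in conditions.items() if attributes.get(k) == v)
        if score > best_score:
            best, best_score = phrase, score
    return best

print(pick_phrase({"emotion": "happiness", "facialHair": False}))
# → You look cheerful today!
```

The chosen phrase would then be passed to the text-to-speech and lip-sync code, just as the typed text was on the manual robot.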
There’s a film of the results here. You’ll see that the majority of visitors thought that the autonomous Ohbot was more accurate and human-like than the one that I was controlling.
We've incorporated Microsoft Cognitive Services features into the latest release of our Ohbot software on the Windows App Store. Using these features requires creating a Microsoft Cognitive Services account and accepting its terms and conditions. As Ohbot is primarily used with children, we have been particularly mindful of privacy issues.