
Robotics

 
Progress in various technologies - hardware, software and artificial intelligence - brings the vision of personal robotics closer. While much work is being done on the technology, many open questions need to be answered about the social impact of having a personal robot in a domestic environment. Philips Research is investigating the technical and social aspects of user-interface robots in an 'Ambient Intelligence' environment, in which technology is seamlessly integrated into everyday life.

Project Goal
Our goal is to stimulate Human-Robot Interaction research by building a research community through supporting a common hardware and software platform.

iCat
iCat is a research platform for studying human-robot interaction topics. The robot is 38 cm tall and is equipped with 13 servos that control different parts of the face, such as the eyebrows, eyes, eyelids, mouth and head position. With this setup iCat can generate many different facial expressions - happy, surprised, angry, sad - that are needed to create social human-robot interaction dialogues.
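To illustrate how such expressions can be handled in software, the minimal C++ sketch below stores an expression as a named set of servo targets. The type names, servo ordering and value range are assumptions made for illustration only and are not part of the OPPR software described further down.

    // Hypothetical sketch: an iCat facial expression as a set of servo targets.
    // The Expression type, servo ordering and [0,1] value range are illustrative
    // assumptions, not part of the actual iCat/OPPR interface.
    #include <array>
    #include <string>

    constexpr int kNumServos = 13;  // eyebrows, eyelids, eyes, mouth, head

    struct Expression {
        std::string name;
        // Normalised servo positions in [0.0, 1.0]; 0.5 is the neutral pose.
        std::array<double, kNumServos> servoTargets;
    };

    // Example poses; the real mapping from servos to facial features is
    // hardware-specific and would normally come from the animation editor.
    const Expression kNeutral{"neutral", {0.5, 0.5, 0.5, 0.5, 0.5, 0.5, 0.5,
                                          0.5, 0.5, 0.5, 0.5, 0.5, 0.5}};
    const Expression kHappy{"happy",     {0.7, 0.7, 0.5, 0.5, 0.5, 0.5, 0.8,
                                          0.8, 0.5, 0.5, 0.5, 0.5, 0.5}};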

iCat expressions

A camera installed in iCat's head can be used for different computer vision capabilities, such as recognizing objects and faces. Each foot contains a microphone to record sounds, perform speech recognition and determine the direction of the sound source. A speaker and sound card are installed to play sounds and speech. Finally, touch sensors and multi-colour LEDs are installed in the feet and ears to sense whether the user touches the robot and to communicate further information through coloured light. For instance, the operation mode of the iCat (e.g. sleeping, awake, busy, listening) can be encoded by the colour of the LEDs in the ears.
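As a sketch of this colour encoding, the C++ fragment below maps each operation mode to an RGB colour for the ear LEDs. The OperationMode and Rgb types, the colourForMode function and the chosen colours are illustrative assumptions, not the actual iCat interface.

    // Hypothetical sketch of encoding the iCat's operation mode in the ear LEDs,
    // as described above. Names and colours are assumptions for illustration.
    #include <cstdint>

    enum class OperationMode { Sleeping, Awake, Busy, Listening };

    struct Rgb { std::uint8_t r, g, b; };

    // Choose a colour per mode; the actual colour scheme is a design choice.
    Rgb colourForMode(OperationMode mode) {
        switch (mode) {
            case OperationMode::Sleeping:  return {0, 0, 64};    // dim blue
            case OperationMode::Awake:     return {0, 255, 0};   // green
            case OperationMode::Busy:      return {255, 128, 0}; // orange
            case OperationMode::Listening: return {0, 128, 255}; // light blue
        }
        return {0, 0, 0};  // off, for safety
    }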

Social Intelligence
Various user studies on social intelligence aspects of user-interface robots have been performed in HomeLab, our test facility at Philips Research Eindhoven, the Netherlands. HomeLab is a smart-home environment with various means to observe the behavior of users. In some of our studies we investigated the perceived personality of the iCat by letting users interact with it in a game setting (TicTacToe) or a task setting (programming a VCR). These studies show measurable differences in the effectiveness and enjoyability of the tasks the users had to perform, depending on iCat's personality.

Videos on iCat:
+ Katy as game buddy
+ Comparison between two characters
 

iCat

OPPR
Programming user-interface robots like the iCat is easy with our Open Platform for Personal Robotics ('OPPR') software development environment. We have developed the Dynamic Module Library ('DML'), which allows you to easily build software components that can be distributed across several PCs.

OPPR contains a Robot Animation Editor for creating believable robot animations. This graphical tool gives you precise control over every individual servo and multi-colour LED. Once you have developed animations, they can be used to control the iCat through the Robot Animation Engine. This engine blends pre-scripted animations and robot behaviors (sensor-actuator loops) at runtime. With this setup you can create separate animations and control algorithms for head movements, facial expressions, eye blinking and lip synchronization. OPPR also contains a ready-to-use library of animations for building human-robot dialogues, and a scripting engine makes developing these dialogues easy.
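The C++ sketch below illustrates the kind of per-servo blending such an engine can perform: each servo channel mixes a frame of a pre-scripted animation with the output of a reactive behavior, weighted per channel. The Pose type, weighting scheme and function name are assumptions for illustration only and do not reflect the actual Robot Animation Engine code.

    // Illustrative sketch of runtime blending between a pre-scripted animation
    // and a reactive behavior, in the spirit of the engine described above.
    #include <array>

    constexpr int kNumServos = 13;
    using Pose = std::array<double, kNumServos>;

    // Per-servo linear blend: weight = 0 follows the scripted animation,
    // weight = 1 follows the behavior (e.g. a face-tracking loop for the head).
    Pose blend(const Pose& animationFrame, const Pose& behaviorOutput,
               const std::array<double, kNumServos>& weight) {
        Pose result{};
        for (int i = 0; i < kNumServos; ++i) {
            result[i] = (1.0 - weight[i]) * animationFrame[i]
                      + weight[i] * behaviorOutput[i];
        }
        return result;
    }

With per-channel weights, the head servos could, for example, follow a face-tracking behavior while the mouth servos keep following the lip-synchronization animation.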

OPPR can be used by both beginners and advanced programmers. Scripting and graphical tools make developing animated dialogues easy, while advanced programmers can build their own DML software components, such as vision and speech recognition components, in C++.
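As a rough idea of what such a component could look like, the sketch below defines a minimal C++ interface for a vision component that turns camera frames into face detections. This is not the actual DML interface; the class, method names and frame types are assumptions chosen for illustration.

    // Hypothetical sketch of a user-defined vision component. NOT the real
    // DML API; all names and types here are illustrative assumptions.
    #include <string>
    #include <vector>

    struct CameraFrame { int width = 0, height = 0; std::vector<unsigned char> pixels; };
    struct FaceDetection { int x = 0, y = 0, w = 0, h = 0; };

    class VisionComponent {
    public:
        virtual ~VisionComponent() = default;
        // Called once at start-up, e.g. to load a face-detection model.
        virtual void initialise(const std::string& configFile) = 0;
        // Called for every camera frame; returns the detected faces.
        virtual std::vector<FaceDetection> process(const CameraFrame& frame) = 0;
    };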

iCat research

 

Purchase Information
Philips Research is currently producing a limited edition of the iCat platform. These platforms are sold to universities and research laboratories for research purposes only. By providing the iCat user-interface robot platform, the OPPR development software and a community support website (online from June), we give researchers a kick-start for their research.

The iCat research platform will become available from Q3 2005. If you are interested in purchasing this platform for your research, download the purchase flyer or contact us at robotics@philips.com.

FAQ
Q.   Why investigate human-robot interaction?
A.   Technology has helped us become more productive and makes our lives easier. However, up to now we have had to adapt ourselves to technology instead of being able to communicate with it in a human way. Human-robot interaction is a promising route for bridging the gap between humans and technology.
Q.   Where can I see a demonstration of the iCat?
A.   The iCat will be demonstrated during the AISB 2005, ICRA 2005 and AAMAS 2005 conferences.



Further Information

iCat & OPPR community site:
+ iCat community website

Publications:

  • A.J.N. van Breemen, iCat: Experimenting with Animabotics, AISB 2005 Creative Robotics Symposium, Hatfield, England, April 2005.
  • C. Bartneck, J. Reichenbach, A.J.N. van Breemen, In your face, robot! The influence of a character's embodiment on how users perceive its emotional expressions, Design and Emotion, Ankara, Turkey, 2004.
  • A.J.N. van Breemen, Animation Engine for Believable Interactive User-Interface Robots, IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2004), Sendai, Japan, 2004.
  • A.J.N. van Breemen, Bringing Robots to Life: Applying principles of animation to robots, CHI 2004, Vienna, Austria, 2004.
 

