Tuesday, 29 June 2010

Robots and Avatars Round Table Debate with Noel Sharkey at NESTA

I took part in an interesting lunchtime debate yesterday at NESTA, organised by bodydataspace.

"Robots and Avatars is an innovative project exploring how young people will work and play with new representational forms of themselves and others in virtual and physical life in the next 10-15 years."

There were fifteen of us from creative, academic and scientific backgrounds sitting around the table discussing our thoughts on the near future of Artificial Intelligence, robots and avatars. I'm relatively new to all this in a practical sense: I've created live interactive characters as part of Alternate Reality Games, including the Star Trek movie ARG, and written characters/scripts for AI chat-bots for the Sherlock Holmes movie game (the BBC covered it)... and I've just finished a prototype empathetic robot who chats, Daemon (you can see all my blog posts on that).
In science-fiction terms I've been fascinated by our relationships with robots and AI since I was a kid. Now I'm talking with people who are trying to make it all a reality, and I love it. My angle is as a storyteller... a narrative and character creator looking for a wider audience. I'm keen to take my ideas and experiments to the film, games and publishing industries to see if they'd like to run with them. My work in the past few years has been in making those story-world elements become physically real in people's everyday lives. The great quote from the Star Trek ARG I worked on was "Oh my God, Star Trek's real everyone!" from a fan who found our Romulan crash site in an Oxfordshire field.

Here's a rough write-up of my notes from the session, which serves as a good brain dump for now:

Professor of Artificial Intelligence Kerstin Dautenhahn took part in the discussion. I was so impressed by her, and I'm hoping to visit her at the University of Hertfordshire if she'll be kind enough to spare me some time to pick her brains and get some inspiration from her.
Also attending was Paul Harter, programmer and digital artist, who struck a chord with me when he said that he's more interested in 'Artificial Humanity' than 'Artificial Intelligence', because he's interested in the imperfections of human personality rather than a logical, perfect model. This is very much my approach: as a writer I know that people prefer characters who are flawed and who behave illogically.
Professor Noel Sharkey, something of a celebrity in robot and AI circles and Professor of AI, Robotics and Public Engagement at Sheffield Uni., gave us a useful overview:
He talked about taking ideas to the film industry, which, as I say, I'm keen on. Other roles for robots, Sharkey points out, are service robots and robots for care, companionship, education and therapy. He mentioned our tendency towards 'anthropomorphic projection', where we just can't help letting our imaginations fill in the gaps where robots are concerned, giving them human-like emotions and behaviour. An important point for me is that robots don't need to be humanoid for us to do this: the very simplest movement from a very basic shape will lead us to see 'life' in an object, so we can create 'artificial artefacts'.

I'm very interested in exploiting this, what I think of as the 'theatre' of AI, whereby people are playing along with AIs. It leads me to think again about my intention to write up a kind of 'rules of engagement' for role-playing for people who aren't professional actors. Online communities already have strict self-moderating behaviour, which I'm sure is being well documented by academics... I'd like to see what people want to do and when, and how to avoid feeling embarrassed or annoying other people when playing and role-playing face to face. I'll start by analysing the success of the Natural Theatre Company's performance approach (I performed with them for twelve years). I'll also be looking at film robot characters, how people relate to them, and their part in the narrative.


Friday, 18 June 2010

Daemon robot prototype: video


Here's the video of how the user experiences the Daemon in its current prototype form. In a relaxing 'living room' installation the Daemon plays music through iTunes and chooses different music for you when you hold its hand, depending on your mood. The Daemon instant chats with you using your TV monitor whilst you use your laptop. The Daemon reacts when you enter the room, tracks your face, moving and tilting its head when listening and reacting to your face and your touch. The Daemon's goal is to allow you to relax and express how you feel as you would with a therapist, a trusted friend or indeed your own inner voice.
If you find this vid takes too long to load, you can watch it on YouTube.

Tuesday, 15 June 2010

It's alive! Daemon therapeutic prototype looks, listens, chats, plays music & loses its head.





So last week I was able to present a working prototype, well, a demo-video working prototype, of my Daemon robot research project to an audience at Watershed, Bristol. Working with Tarim and Dan Williams, colleagues at the Pervasive Media Studio, we managed to get together something that resembled my drawings and which could do some of the things we hoped we could achieve, and all within a couple of weeks.
We were able to build a simple prototype robot with a little added theatre; the end result is an installation with a relaxing atmosphere and a robotic entity that looks at you, listens to you, converses with you through instant chat and plays music it thinks will help with your mood.
I tried to create a 'living room' feel, so that you are sitting comfortably next to your Daemon in low light, with music playing. This creates a relaxed atmosphere in which you might like to confide private feelings to your Daemon, as you might with a trusted friend or therapist, or indeed, as was my original idea, as you would converse with yourself, the Daemon being your own inner voice made visible.
In this first very early stage, with two months' R&D and a very small budget, I asked Tarim and Dan to help me make a simple prototype in a few days. We used an Arduino and servos within the robot body to allow its head to move from side to side and tilt into a 'listening' position. A webcam and face tracking allow it to follow your face as you move, again to give the impression of 'listening' and being interested. Tarim made a text interface so that the user and Daemon could chat with each other instantaneously. This works automatically, using prewritten replies and questions I had written for the Daemon in the style of a very simple chat-bot. The user types into their laptop and the Daemon speaks on its own monitor next to it, as if this were your domestic TV screen or monitor. Having identified my desire to be able to manually chat and operate or 'puppet' the Daemon too, Tarim built a manual override into the system. For me as a writer/performer who loves improvising with the audience via characters, this is great fun. It means that what the Daemon says and does can be 'puppeted' either entirely manually (via computer commands) or as a mixture with the automated chat-bot. This is a very important tool for this platform.
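I haven't published Tarim's code, but the idea of a prewritten-reply chat-bot with a hidden 'puppet' override can be sketched very simply. Everything below — the function names, keywords and replies — is my illustrative guess at the shape of such a system, not the actual code:

```python
import random

# Illustrative sketch only: a tiny keyword chat-bot with a manual
# "puppet" override, in the spirit of the Daemon's chat interface.

REPLIES = {
    "hello": ["Hello. I'm listening.", "Hi. How are you feeling today?"],
    "sad": ["I'm sorry to hear that. Tell me more?", "Would some music help?"],
}
FALLBACKS = ["Go on...", "Tell me more about that.", "I see."]

puppet_queue = []  # lines typed live by a hidden human operator


def puppet(line):
    """Operator queues a hand-written reply that overrides the bot."""
    puppet_queue.append(line)


def daemon_reply(user_text):
    """Return the Daemon's next line: puppeted if queued, else automated."""
    if puppet_queue:
        return puppet_queue.pop(0)  # manual override always wins
    for keyword, replies in REPLIES.items():
        if keyword in user_text.lower():
            return random.choice(replies)
    return random.choice(FALLBACKS)


# Mixing automated and puppeted replies in one conversation:
print(daemon_reply("Hello there"))            # automated keyword match
puppet("I noticed you sighed just now. Long day?")
print(daemon_reply("I'm feeling a bit sad"))  # operator's line jumps in
print(daemon_reply("I'm feeling a bit sad"))  # back to the automated bot
```

The point of the mixture is exactly what I describe above: the bot handles the routine turns, and the writer/performer can drop in at any moment without the user noticing the seam.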
We also have the Daemon's ability to change music when you squeeze its hand. Dan was able to set this up (we have a button hidden in the robot's hand) and his next step is to make it work with iTunes. I'm keen that the Daemon works as an interface with iTunes, as Daemon is about communicating emotion, particularly through music. Perhaps in future the Daemon will read your emotions via a BCI. For now I like the 'fortune teller' approach, whereby it tells you to squeeze its hand and think your most private thoughts, desires and fears, and it will understand them, playing you the music and giving you the suggestions and comforts it thinks will help you.
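As a sketch of the behaviour (again illustrative only — the playlist names, the mood guess and the class are all my own stand-ins, and the real version would drive iTunes rather than print track names):

```python
import random

# Illustrative sketch: the hidden button in the Daemon's hand fires an
# event, and the Daemon changes to a track matching its current guess
# at the user's mood. All names here are hypothetical.

PLAYLISTS = {
    "calm":   ["Gymnopedie No.1", "Clair de Lune"],
    "low":    ["Adagio for Strings", "The Blower's Daughter"],
    "upbeat": ["Here Comes the Sun", "Walking on Sunshine"],
}


class Daemon:
    def __init__(self):
        self.mood = "calm"       # current guess at the user's mood
        self.now_playing = None

    def on_hand_squeeze(self):
        """Button-press handler: pick a new track for the guessed mood,
        avoiding an immediate repeat of the current one."""
        choices = [t for t in PLAYLISTS[self.mood] if t != self.now_playing]
        self.now_playing = random.choice(choices)
        return self.now_playing


d = Daemon()
d.mood = "low"
print(d.on_hand_squeeze())  # a track from the "low" playlist
print(d.on_hand_squeeze())  # squeezing again changes the track
```

Swapping the `print` calls for actual playback control (on a Mac, iTunes can be scripted via AppleScript) is the step Dan is working on next.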
And here's one I made earlier:
I made the prototype this time to look like the aliens from Close Encounters of the Third Kind, partly because as a child I desperately wanted them to come and take me away like they did Roy Neary. I sat at my piano playing that little five-note tune over and over. That movie has influenced me in many ways, so here it pops up again. Since I wanted to put pretty much all of my scant budget into paying the programming tech brains, I had to make the robot body in a Blue Peter-ish way using Tupperware boxes and hummus pots covered in fabric. The head is papier-mâché. I tell you, I had to do a lot of learning to make that bloody thing, mainly in how many glues, no matter how expensive, do NOT stick polyethylene and polythene. Nothing on God's Earth would stick the different plastics inside the head to those in the body long enough to stop the damn thing's head falling off every 15 minutes. Therefore it wasn't ready to test on public users by last Wednesday's evening presentation. Instead we made a demo video, which I was able to shoot while it still had its head on.
The video is on YouTube and you can see interviews and photos on the D-Shed website as well as my Pervasive Media Studio blog and Flickr pages. I'm writing up a proper report of the project and where we're going next with it, so look out for that too.