A team of students from the HIT Lab introduced the next generation of scientists to our Nao robots at the Christchurch Big Science Day on October 31st, 2015.
Kids got a chance to watch the robots perform, interact with them (the knock-knock jokes were hard to hear over the noise and laughter), and see how the magic is done: actually programming the robots and watching them respond. A great time was had by all.
More details about the Big Science Day here
Can people form emotional connections with robots? Come and find out for yourself at the University of Canterbury Human Interface Technology Lab’s (HITLab) showcase at the Big Science Day, this Saturday.
‘Humanoids’ Kant, Aristotle and Socrates will dish out hugs, dance Gangnam Style and even sing Happy Birthday as part of demonstrations by UC Master’s students at the event in Cathedral Square, from 10am to 4pm.
Human Interface Technology Master’s degree student Helen Figg says the event is an exciting chance to test what the group has learned in the real world.
“We’ve been working towards this event as part of our investigation into how people interact with robots. As well as children wanting to experience an emotional connection, we have found they want to understand the robots’ personalities. The interactions we will demonstrate reflect this.”
Kant, Aristotle and Socrates are programmed with different personalities. Their actions are directed through computer software. Students have coded instructions for dancing, hugging, singing, walking around obstacles, performing knock-knock jokes, arm wrestling and answering general knowledge questions for this project.
The student group, made up of Helen Figg, Omprakash Rudhru, Qi Min Ser, Sakthi Priya Balaji Ranganathan and Sathya Kumar Barathan, have completed two test demonstrations in preparation for Saturday’s event. They first took the robots to meet children from Cubs (part of Scouts), before refining their interactions and conducting further testing with Burnside Primary School students.
“The most popular interactions so far with the children have been the dancing and the knock-knock jokes. It’s about that one-to-one interaction,” Figg says.
UC’s Colleges of Engineering and Science will also have a rocket, a drone, an electric go-kart and the Formula SAE race car on show at the Big Science Day. An interactive chemistry show will take place on the main stage, in association with UC students from ChemSoc.
For further information contact:
Margaret Agnew, UC Senior External Relations Advisor
p 027 503 0168 or (03) 364 2775
Helen Figg, Master of Human Interface Technology student
p 022 645 1140
Work in the HIT Lab over the summer! We're looking for a few graduate students to take up UC Summer Scholarships to work on projects in the lab.
A UC Summer Research Scholarship provides the opportunity for a student to work on a supervised research project for 400 hours (approximately 10 weeks) over the summer period (November 2015 – February 2016), to complete a short research skills programme (November/December 2015) and to give a presentation at the Summer Research Scholarship Feedback Day (February 2016).
We're looking for interested students with the necessary skills to work with our research staff on a range of projects.
Spend summer doing something cool and learning new skills in the HIT Lab!
Find out more about the summer program here.
HITLab NZ associate professor Dr Christoph Bartneck has been working with a visiting student from Japan, Shogo Nishiguchi, on getting a Lego robot to hold a conversation. The project has attracted the media's attention, and Dr Bartneck was interviewed by Radio NZ last week. The interview follows below.
Christchurch's Imagination Station has enlisted a friendly robot fireman to appeal for donations to keep the centre running.
The all-singing, dancing, talking robot, which stands just 20 centimetres high, was designed by Shogo Nishiguchi, a Japanese Master's student from Osaka University, together with New Zealand researchers from the University of Canterbury.
"We used a fireman with a helmet, so we could easily put a camera inside the helmet and it didn't look too awkward," said Canterbury University's Christoph Bartneck.
The robot was specially programmed to be able to hold a conversation - provided there wasn't too much interruption.
"People talk over each other in normal conversation - humans can do that. But for a robot, that's quite hard, because distinguishing between an audio signal from the speech of a user and the speech that it generates itself or even the sound of its motors is very very hard.
"So therefore it is necessary for the robot to either listen or speak, but not both at the same time."
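The half-duplex turn-taking Dr Bartneck describes can be sketched as a simple loop in which the robot alternates between listening and speaking, so its own speech and motor noise never reach the recogniser. This is a minimal illustration, not the project's actual code; the `listen` and `speak` functions are hypothetical stand-ins for the robot's audio pipeline.

```python
import queue

def listen(audio_in: "queue.Queue[str]") -> str:
    """Block until the user finishes an utterance (stand-in for speech recognition)."""
    return audio_in.get()

def speak(utterance: str, spoken_log: list) -> None:
    """Play an utterance; recognition stays disabled for its duration (stand-in)."""
    spoken_log.append(utterance)

def dialogue_turn(audio_in: "queue.Queue[str]", spoken_log: list) -> str:
    heard = listen(audio_in)       # microphone active, speaker idle
    reply = f"You said: {heard}"   # decide on a response
    speak(reply, spoken_log)       # speaker active, microphone ignored
    return reply
```

Because the two phases never overlap, the robot cannot mistake its own output for user speech, at the cost of being unable to handle people talking over it.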
It was programmed with a series of pre-scripted conversations, although it responded to whatever visitors said to it.
"You can essentially say anything, and the computer will come back with an answer. It doesn't always make sense, but it's pretty good," said Dr Bartneck.
"But at the end of the day, people, and particularly children, are quite unpredictable, and they can come up with all sorts of questions that you can't anticipate."
Mr Nishiguchi said that to solve the problem they embedded an online database of conversation, called Chatbot, which gives proper answers to questions.
"The advantage of scripting is that the conversation is heading to a goal set by the user. On the other hand, Chatbot can answer any random questions. [The robot] makes use of both advantages," he said.
"That's where the chatbot technology comes in: that whenever the robot doesn't really understand what the person is saying, it doesn't really fit into this dialogue that we've pre-scripted, it would send it off to Chatbot, and Chatbot would come back with a response," said Dr Bartneck.
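The hybrid strategy described above can be sketched as a two-tier lookup: try the pre-scripted dialogue first, and only when the input matches no script, hand the utterance to a general chatbot. The script table and `fallback_chatbot` below are hypothetical illustrations of the idea, not the system Mr Nishiguchi built.

```python
# Pre-scripted exchanges steer the conversation toward a goal.
SCRIPT = {
    "knock knock": "Who's there?",
    "can i donate": "I only accept $50 or $100 notes. No, just kidding!",
}

def fallback_chatbot(utterance: str) -> str:
    """Stand-in for the online chatbot service that handles open-ended input."""
    return "That's interesting - tell me more."

def respond(utterance: str) -> str:
    key = utterance.lower().strip("?!. ")
    if key in SCRIPT:                      # scripted path: goal-directed
        return SCRIPT[key]
    return fallback_chatbot(utterance)     # open path: answers anything
```

The scripted table keeps the interaction on track, while the fallback guarantees the robot always has some reply, even to questions nobody anticipated.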
They even programmed it to have its very own sense of humour.
"When the people were actually trying to put in some money, we'd say, 'Oh, I'm sorry - I only accept $50 or $100 notes. No, just kidding, you can put in whatever you want.' Keeping it light-hearted."
But has it got enough heart to replace a human?
"Ah, this is a very nasty comparison! Usually people in the robotics community try to steer away from comparing humans directly to robots, for the very simple reason that humans usually always win, and it always showcases how horrible robots still are in comparison to humans.
"But what we have to look at is the progress that robots make - it's much fairer to compare a robot that we're making right now to one we made two or three years ago. We can monitor the programmes we have made, and see what further directions we should take."