Welcome to UBC’s Robot Kindergarten!

VANCOUVER (NEWS 1130) – Within the next 100 years, it’s predicted that helpful, intelligent, self-organizing, self-replicating robots will play a much bigger role in all aspects of our lives… So, we’d better teach them to behave!

That’s the aim of UBC’s “robot kindergarten,” which is laying out and teaching the rules for engagement between humans and robots.

Mechanical engineering professor and robotics expert Elizabeth Croft says that in a world of Wall-Es and Rosies, walking-and-talking avatars, smart driverless cars and automated medical assistants, it is crucial to get the game right.

“We have all the necessary ingredients to create devices that can compute quickly, can learn and match patterns, are broadly connected to communicate with each other and many other devices. There’s also the ‘maker movement’ — the ability to rapidly prototype even biological materials and integrate them into silicon-based materials,” says Croft.

“We are poised for a sort of a Pre-Cambrian explosion of robotic devices and the rules of engagement are very important.”

Before that happens, Croft believes we need to precisely lay out the differences between robots and humans.

“We have an ethical responsibility to prioritize the lives of human beings. Robots are not people and we need to treat them as a different category and make sure the safeguarding of human life is always above the missions or requirements of a robotic device.”

To that end, Croft’s lab welcomes researchers from different disciplines — ethics, law, machine learning and human-computer interaction — as well as different international cultures to tackle some big questions in the Open Roboethics Initiative.

“It’s very important that when we design and program robots, the fundamental rules about what we allow them to do and not do are well thought out,” Croft tells NEWS 1130.

For example, if a human and a robot are accessing the same resource — the same roadway, same tools, same power source — who yields? Does the person always get their way? What if the robot is doing something for the greater good, such as a robotic ambulance?

Croft says there are important day-to-day questions, too. How should a robot act when handing over a bottle of water?

“We are also teaching robots in order to make them friendlier and more acceptable to people. When a robot and person are reaching for the same object, we tend to hesitate automatically and it’s kind of a signal. We don’t go mashing in there and grab whatever it is,” she explains.

“We’ve been seeing how people perceive the behaviour of the robot and whether they like the fact it yields to them, whether they find it more collaborative when the robot communicates it needs a resource and whether they are willing to let a robot that needs that resource have priority.”
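The behaviour Croft describes amounts to a simple yielding policy for a shared resource. The sketch below is a hypothetical illustration of that idea only; the sensor inputs, task-priority flags and decision thresholds are assumptions for the example and are not the lab’s actual implementation.

```python
# A minimal, hypothetical sketch of the kind of yielding behaviour described above.
# The inputs and thresholds are illustrative assumptions, not the lab's code.

from dataclasses import dataclass
from enum import Enum, auto


class Action(Enum):
    PROCEED = auto()   # reach for the shared object
    HESITATE = auto()  # pause briefly to signal awareness, then re-evaluate
    YIELD = auto()     # withdraw and let the human take the resource


@dataclass
class SharedResourceState:
    human_reaching: bool       # e.g. from hand tracking (assumed available)
    robot_task_critical: bool  # e.g. a robotic ambulance on an urgent mission
    human_acknowledged: bool   # human has signalled the robot may go first


def yielding_policy(state: SharedResourceState) -> Action:
    """Decide who gets a shared resource when human and robot both reach for it."""
    if not state.human_reaching:
        return Action.PROCEED
    # Default: the person gets their way, unless the robot's task is critical
    # *and* the human has signalled that the robot may take priority.
    if state.robot_task_critical and state.human_acknowledged:
        return Action.PROCEED
    # Hesitation mirrors the human signal Croft describes: pause and communicate
    # the need rather than "mashing in there", then withdraw if the human keeps reaching.
    return Action.HESITATE if state.robot_task_critical else Action.YIELD
```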

Croft says they are also teaching robots how to take turns, how to share, and how to give up an object.

“When people hand things back and forth, there are actually unspoken rules around the handover. The person who is giving is responsible for the safety of the object, to not drop it. The person who is taking the object is the person who handles the speed of that handover.”
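Those two unspoken rules — the giver keeps the object safe, the taker sets the pace — can be read as a small control rule for the robot when it is the one handing over. The following sketch is a hedged illustration under assumed sensing (a load estimate and pull speed, say from a wrist force/torque sensor); the thresholds and gains are made up for the example.

```python
# A minimal, hypothetical sketch of the handover rules quoted above, from the
# giver's (robot's) side: keep the object safe until the receiver has it, and
# let the receiver's pull set the pace of release. Constants are illustrative.

FULL_GRIP = 1.0        # normalized grip force while the robot still owns the object
MIN_LOAD_SHARE = 0.95  # receiver must carry nearly all the load before release begins
RELEASE_GAIN = 0.8     # how strongly the receiver's pull speeds up the release


def handover_grip(receiver_load_share: float, receiver_pull_speed: float,
                  current_grip: float, dt: float) -> float:
    """Return the robot's next grip force (0..1) during a handover.

    receiver_load_share: fraction of the object's weight the receiver supports (0..1).
    receiver_pull_speed: how fast the receiver is drawing the object away (m/s).
    """
    # Giver's rule: never drop the object. Until the receiver carries the load,
    # hold (or restore) a full grip.
    if receiver_load_share < MIN_LOAD_SHARE:
        return FULL_GRIP
    # Taker's rule: the receiver controls the speed. Once the load has transferred,
    # release grip at a rate that follows how quickly the receiver pulls away.
    release_rate = RELEASE_GAIN * max(receiver_pull_speed, 0.0)
    return max(0.0, current_grip - release_rate * dt)
```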

Croft says they took those rules and applied them to human-robot interactions.

“When we controlled the robot in that way, people liked it much better. The safer they felt about the handover, the more they trusted the robot.”

If the robot held onto the object too tightly, people’s trust tended to evaporate. Croft says it really is like teaching a small child.

“Absolutely. Hand it over, but don’t be grabby,” she laughs.
