Paraplegic’s Robotic Exoskeleton to Take First Kick of World Cup


There are some World Cup football teams that you may think would need a miracle to win the coveted trophy. But a miracle, of sorts, will take place on the football pitch of the Arena Corinthians in São Paulo at the opening ceremony of the World Cup, at 5pm on 12 June (local time), when a young Brazilian takes a kick of a football from the centre spot.

But this is no ordinary young man, nor is the kick trivial. The kicker, yet to be chosen from a shortlist of nine candidates aged between 20 and 40, will be a paraplegic. He will rise from his wheelchair, walk to midfield and kick the ball. How?

It’s down to Miguel Nicolelis and his team of neuro-engineers and scientists at Duke University in North Carolina. And if the event works as intended, it could spell the end for wheelchairs and herald the rise of mind-controlled exoskeleton robots.


The mind-controlled robotic exoskeleton is a complex robotic suit built from lightweight alloys and powered by hydraulics. When a paraplegic person straps themselves in, the machine does the job that their leg muscles no longer can. But there are no buttons, levers or controls to tell the robot what to do. Only the human brain.

The exoskeleton is the culmination of years of work by an international team of scientists and engineers on the Walk Again Project. The robotics work was coordinated by Gordon Cheng at the Technical University of Munich, and French researchers built the exoskeleton itself. Nicolelis’ team focused on what many say is the most difficult part: finding ways to read people’s brain waves and use those signals to control robotic limbs.

To operate the exoskeleton, the person is helped into the suit and given a cap to wear that is fitted with electrodes to pick up their brain waves. These signals are passed to a computer worn in a backpack, where they are decoded and used to move the hydraulic drivers on the suit. Everything is powered by a battery with a two-hour life before it needs recharging.
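For the technically minded, that signal chain can be pictured as a simple loop: read the brain waves, decode an intention, drive the hydraulics. The Python sketch below is purely illustrative, with hypothetical stand-ins (EEGCap, Decoder, HydraulicSuit) for the real hardware; it is not the Walk Again Project’s actual software.

```python
import time

class EEGCap:
    """Stand-in for the electrode cap: yields raw brain-wave samples."""
    def read_window(self):
        # A real cap would return a multi-channel EEG buffer.
        return [0.0] * 64  # 64 hypothetical electrode channels

class Decoder:
    """Stand-in for the decoding software on the backpack computer."""
    def decode(self, samples):
        # A real decoder runs trained signal-processing models; here
        # we simply return a fixed walking command.
        return "step_left"

class HydraulicSuit:
    """Stand-in for the exoskeleton's hydraulic drivers."""
    def actuate(self, command):
        print(f"driving hydraulics: {command}")

cap, decoder, suit = EEGCap(), Decoder(), HydraulicSuit()

for _ in range(5):                              # a few control-loop ticks
    intent = decoder.decode(cap.read_window())  # brain waves -> movement intent
    suit.actuate(intent)                        # intent -> hydraulic motion
    time.sleep(0.1)
```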

Nicolelis releases video with robotic exoskeleton in action

The operator’s feet rest on plates fitted with sensors that detect when contact is made with the ground. With each footstep, a signal is transmitted to a vibrating device sewn into the forearm of the wearer’s shirt. The device fools the brain into thinking that the sensation came from the foot. In virtual-reality simulations, patients felt that their legs were moving and touching something.
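In code terms, that feedback loop is easy to picture. The toy sketch below uses invented stand-ins (foot_plate_contact, vibrate_forearm) for the real pressure sensor and vibrating device, and simply fires one buzz per touch-down rather than buzzing continuously.

```python
import random

def foot_plate_contact() -> bool:
    """Stand-in for the pressure sensor in the foot plate."""
    return random.random() > 0.5   # pretend reading: is the foot on the ground?

def vibrate_forearm() -> None:
    """Stand-in for the vibrating device sewn into the shirt sleeve."""
    print("buzz")                  # the wearer feels this as a footstep

was_in_contact = False
for _ in range(10):                # ten simulated gait samples
    in_contact = foot_plate_contact()
    if in_contact and not was_in_contact:
        vibrate_forearm()          # fire once per touch-down event
    was_in_contact = in_contact
```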

Last month, Nicolelis and his colleagues went to football matches in São Paulo to check whether mobile-phone radiation from the crowds might interfere with the suit. Electromagnetic interference could make the exoskeleton misbehave, but the tests were reassuring.


This is ground-breaking robotics, artificial intelligence and mind-control technology all rolled into one. Let’s keep our fingers crossed that we will all witness the miracle first kick of the World Cup on 12 June.


iCub is Growing up Fast!


Financed by the European Union’s cognition unit (€8.5m), iCub is a robot with an impressive AI. It has sophisticated motor skills, which include the ability to grasp and manipulate objects. However, unlike other “programmed” robots, iCub acquires its skills naturally, using its body to sense, gather information and explore the world. This is, of course, similar to the way a two-year-old child learns and interacts with the environment around it, which may be why iCub has a baby-like face.

iCub is not a single entity anymore; a crèche of 25 iCubs has been rolled out to centres across Europe, the US and Japan that are collaborating in the programme. Most specialise in some aspect of AI and robotics, such as Dominey’s facility in Lyon, which concentrates on iCub’s ability to interact with humans on shared tasks through language and action. Others focus on the more practical side of manipulating objects.

All the various parts of this ambitious series of projects are connected by a common thread: how a child learns as he or she grows. By 18 months, a human can already understand a gesture asking them to pick up a pencil. This is not about language (at least not to begin with). Young children are born equipped to explore their environment and to interact and co-operate with their parents before they can speak or understand speech. These social developmental drives are built into iCub’s operating system. Then it is simply a question of interacting with iCub and letting its own body and sensors guide it. Using a bow and arrow? No problem!

Through repetitive play, one team has taught iCub to distinguish a stuffed toy octopus from a purple car, despite iCub never having seen the objects before. New gestures are learned by grasping iCub’s arm and rotating it in a certain way; these are then recorded in its autobiographical memory, so that next time it can make the gesture without being prompted. In the same way, iCub can be taught new words and concepts through interacting with its human tutor, much in the way a baby interacts with its mother during play.
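A rough way to picture that record-and-replay style of teaching is the Python sketch below. Everything in it is hypothetical, from the autobiographical_memory dictionary to the gesture data; iCub’s real software is vastly more sophisticated.

```python
# Hypothetical names throughout; this only illustrates the idea of
# kinesthetic teaching: record a guided arm movement, then replay it.
autobiographical_memory: dict[str, list[list[float]]] = {}

def record_gesture(name: str, demonstration: list[list[float]]) -> None:
    """Store the sequence of joint angles the tutor moved the arm through."""
    autobiographical_memory[name] = demonstration

def replay_gesture(name: str) -> None:
    """Reproduce a remembered gesture without being physically guided."""
    for pose in autobiographical_memory[name]:
        print(f"moving arm joints to {pose}")  # would drive motors on a robot

# The tutor grasps the arm and rotates it; the joint angles are logged...
record_gesture("wave", [[0.0, 0.3], [0.2, 0.5], [0.0, 0.3]])
# ...and next time the gesture can be made without prompting.
replay_gesture("wave")
```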

Recent developments include a touch-sensitive skin to enhance iCub’s ability to gauge when it is getting too close to an object and is in danger of hitting it. This is obviously a prerequisite for persuading people that it is safe to interact with robots at close quarters!
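The safety behaviour such a skin makes possible boils down to a simple rule: if any tactile patch reports imminent contact, stop moving. Here is a toy sketch of that rule, with invented sensor names and threshold values standing in for the real tactile system.

```python
SKIN_THRESHOLD = 0.8   # hypothetical normalised tactile reading

def skin_readings() -> list[float]:
    """Stand-in for the array of tactile sensors covering the robot."""
    return [0.10, 0.05, 0.92, 0.00]   # one patch is almost touching something

def arm_command(readings: list[float]) -> str:
    """Halt the arm if any skin patch senses imminent contact."""
    if max(readings) > SKIN_THRESHOLD:
        return "halt: contact imminent"
    return "continue motion"

print(arm_command(skin_readings()))   # -> halt: contact imminent
```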

Watch out for more news about iCub as it learns and grows, perhaps into an iWolf?
