Will neuromorphic-controlled robots soon become a reality?
With today’s rapid pace of innovation, technologies considered little more than sci-fi a decade ago are becoming reality. Mind-controlled robotics is one example. Emily Newton reports
Many home and industrial robots today respond to voice commands, but what if you could control them with just a thought? While it may sound far-fetched, this technology is more developed than people realise. Mind-controlled robots may not populate store shelves or be an enterprise solution right now, but they could transform some fields soon.
As the name suggests, mind-controlled robots are operated by thought rather than by physical or vocal inputs. In the future, that could mean switching on a robotic vacuum and directing it to clean a room in your house simply by thinking about it. For now, though, the field focuses mainly on mechanical body parts.
Neuromorphic devices (those that mimic the neuro-biological architectures of the nervous system), such as prosthetics that connect to the user’s nervous system, are the earliest and most common example.
Mind-controlled neuroprosthetic limbs emerged as early as 2012, when a team of scientists used an implanted sensor to let a patient control a robotic arm with their brain waves.
Other mind-controlled robots include thought-to-speech translators, brain-controlled wheelchairs and even industrial bots you can adjust with your thoughts. As the technology advances, new possibilities will emerge, too. It could offer game-changing improvements in the medical field, for instance, helping people with disabilities that limit their motor function and creating a more equitable future.
Wearable robots, such as exoskeletons, already help patients with paralysis walk again by assisting their movements. Mind-controlled robots go a step further by requiring no physical input at all. Robotic prosthetics could also replace traditional ones, giving patients with amputations fully functioning limbs or other body parts that work just like their natural ones.
Outside the medical field, mind-controlled robots could provide similar benefits. Workers would no longer need to rely on fine motor skills or vocal commands to guide machines in the workplace.
While these machines are highly complex, the basics of how they work are straightforward. It all starts with a system called a brain-computer interface (BCI). BCIs measure the brain and spinal cord’s electrical activity and translate it into signals that control a physical device.
There’s no direct one-to-one translation from brain signals to machine commands, but artificial intelligence (AI) can bridge the gap.
BCIs today typically use machine learning algorithms to recognise and classify brain waves. This AI-driven approach lets these systems become more accurate over time and adapt to each individual user’s signals.
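To make that step concrete, here’s a minimal sketch in Python of what such a classifier might look like, built with scikit-learn. The feature count, labels and classifier choice are illustrative assumptions rather than any particular BCI’s actual pipeline; real systems train on recordings from the individual user.

```python
# A minimal sketch of the classification step, assuming band-power
# features have already been extracted from brain-wave recordings.
# All data here is synthetic and stands in for real user recordings.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)

# Hypothetical training set: 200 trials, 8 band-power features each,
# labelled with the command the user was thinking about at the time.
X_train = rng.normal(size=(200, 8))
y_train = rng.integers(0, 2, size=200)  # 0 = "rest", 1 = "move arm"

# Linear discriminant analysis is a common lightweight choice for
# brain-signal classification because it copes with small datasets.
clf = LinearDiscriminantAnalysis()
clf.fit(X_train, y_train)

# At run time, each new window of brain activity is reduced to the
# same features and mapped to a command for the robot.
new_window = rng.normal(size=(1, 8))
print("decoded:", "move arm" if clf.predict(new_window)[0] == 1 else "rest")
```

Retraining on each new session’s recordings is what lets a system like this adapt to an individual user over time.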
Some of these systems are remarkably fast, too, with a 2017 algorithm from MIT taking just 10 to 30 milliseconds to classify these signals.
Right now, this process takes far more conscious effort than moving a natural limb. Machine learning algorithms need guidance, so users have to mentally signal agreement or disagreement with the system’s choices so it can learn and prevent future mistakes. As these algorithms improve, though, that extra step could fade.
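As a rough illustration of that feedback loop, the sketch below uses an incrementally trainable classifier that is corrected and updated whenever the user signals disagreement. The feature vectors and the user_confirms stand-in are hypothetical placeholders; a real BCI would derive both from brain activity.

```python
# A minimal sketch of the agree/disagree feedback loop, using a
# classifier that supports incremental updates. Everything here is
# simulated; a real BCI reads both signals from the user's brain.
import numpy as np
from sklearn.linear_model import SGDClassifier

rng = np.random.default_rng(1)
classes = np.array([0, 1])  # 0 = "rest", 1 = "move arm"

# Initial fit on a short calibration session of synthetic trials.
clf = SGDClassifier()
clf.partial_fit(rng.normal(size=(50, 8)), rng.integers(0, 2, size=50),
                classes=classes)

def user_confirms(prediction):
    """Stand-in for the user's mental 'agree/disagree' response."""
    return rng.random() > 0.2  # pretend the decoder is right 80% of the time

# When the user signals disagreement, the trial is relabelled and fed
# back, so the decoder gradually adapts to that user's brain waves.
for _ in range(20):
    trial = rng.normal(size=(1, 8))
    guess = clf.predict(trial)[0]
    if not user_confirms(guess):
        clf.partial_fit(trial, np.array([1 - guess]))
```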
While many studies have shown the potential for mind-controlled robotics, it’s not ready for widespread real-world application just yet. One of the biggest challenges is that brain waves aren’t uniform, which makes it hard for machines to understand them.
As BCI researcher José del R. Millán points out, “If I move my hand, the brain is not only focused on that, the brain is processing many other things.”
This variability makes it difficult to achieve 100% accuracy. BCIs must also clean a lot of noise from your brain waves to determine what you’re actually trying to do, which can slow the process and hinder accuracy.
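To give a flavour of that clean-up step, here’s a small sketch of a standard band-pass filter of the kind often used to strip slow drift and muscle noise out of raw recordings. The sampling rate and frequency band are assumptions for illustration, not values from any system mentioned above.

```python
# A minimal sketch of the noise-cleaning step, assuming a raw signal
# sampled at 250 Hz. The band-pass keeps 8-30 Hz, a range commonly
# associated with motor imagery, and discards drift and fast noise.
import numpy as np
from scipy.signal import butter, filtfilt

fs = 250.0             # assumed sampling rate in Hz
low, high = 8.0, 30.0  # assumed pass band in Hz

# Fourth-order Butterworth band-pass filter (frequencies normalised
# to the Nyquist rate, fs / 2).
b, a = butter(4, [low / (fs / 2), high / (fs / 2)], btype="band")

# One second of fake raw signal standing in for a real recording.
raw = np.random.default_rng(2).normal(size=int(fs))

# filtfilt runs the filter forwards and backwards so the cleaned
# signal isn't shifted in time relative to the user's intent.
cleaned = filtfilt(b, a, raw)
```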
Measuring brain activity in the first place can be tricky, too. Traditional BCIs are invasive, often requiring surgery, which makes them expensive, inaccessible and potentially risky. Non-invasive sensors are becoming more common, but sweat, skin thickness and hair can interfere with them.
Despite these obstacles, researchers across the globe are making some impressive strides. A team in Australia created a graphene-based sensor with a 75% reduction in skin impedance, meaning it’s better at detecting brain waves while being non-invasive. The sensor was also more resistant to sweat, making it more practical.
Machine learning algorithms are also getting better at interpreting users’ brain signals. In one 2021 study, researchers developed a model that takes just three to five attempts to correct errors in its interpretation. Further training of similar algorithms could produce fast-learning AI that enables more accurate, easier-to-use BCI systems.
So while fully functional mind-controlled robot arms may still sound futuristic, they’re a reality that’s drawing closer every day. As this research continues, mind-controlled robotics will become more accurate, reliable and affordable, a shift that may help make the world a more accessible and convenient place.