Today’s technological frontiers are paving the way for real RoboCop patrols in the future.
In the 27 years since the original RoboCop movie was released, technology has advanced by leaps and bounds. This year’s reboot of the popular science fiction franchise arrives at a time when augmented reality, facial recognition technology, bionic prosthetics and neural sensors are making headlines.
While the technology depicted in the new remake is beginning to take shape in reality, Charles Higgins, associate professor of neuroscience and electrical engineering at the University of Arizona, believes we still have a ways to go before it becomes science fact.
“In the next twenty or thirty years, you will start to see people who are unable to walk do so with functional prosthetics,” he told Discovery News. “But to be superhuman is going to be much farther off. To be able to do the things that RoboCop can do, I think is 100 years off or more.”
Still, it’s worth taking inventory of RoboCop’s features in a contemporary light. He may be a century away from patrolling the streets, but today’s technological frontiers are paving his way. So before you see the new movie, click through to check out some of RoboCop’s specs and see how they stack up.
To function, Alex Murphy -- the man who becomes RoboCop -- needs neural implants.
The only human body parts that remain of Alex Murphy -- the man who becomes RoboCop -- are the brain and spinal cord of his central nervous system, two lungs, one arm, his face and most of his head. To function, neural implants are inserted into his brain.
Today, laboratories are using an array of sharp electrodes inserted into the brains of monkeys to stimulate various functions. “That would be the closest thing we’ve seen in the real world to something like RoboCop,” Higgins said. “Basically, one of the monkey’s arms could be paralyzed and it could feed itself with the robotic arm.”
However, the problem with this approach is that the brain surrounds these sharp electrodes and eventually renders them ineffective. “The longest anyone’s ever been able to keep them effective is a few years,” Higgins said. “It works fine in monkeys, in the laboratory, but not for clinical purposes.”
Most promising is electrocorticography technology (ECoG), where less invasive electrodes are laid on top of the brain. The procedure still requires brain surgery, but the electrodes don’t penetrate the brain.
“You can stimulate and record from those electrodes. That, I think, is the most promising technology that will lead us to something like RoboCop,” Higgins said. “Right now, it’s being used for stopping epileptic seizures.”
Huge improvements in battery technology are needed to power an exoskeleton and untether a real RoboCop from an electrical outlet.
Like RoboCop’s bionic limbs, Raytheon’s exoskeleton suits are capable of giving soldiers superhuman strength and advanced mobility. However, Higgins points out a glaring problem.
“There’s a really important piece of technology missing to enable something like RoboCop, and that’s power supply technology,” he said. “The problem is that Raytheon’s exoskeleton has a huge power cable going to the wall. There’s no battery with enough energy density that can power that.”
Higgins says battery technology simply isn’t up to snuff to power an untethered RoboCop. To mimic the muscles in our torso and legs with motors or actuators -- let alone provide the superhuman equivalent -- would require an enormous amount of energy.
“Where’s the battery for all this?” Higgins asked. “Is RoboCop going to run out of batteries in the middle of a car crash? When he’s chasing a criminal, is he going to have to stop and recharge?”
The fact that we humans use food and convert it to energy so efficiently still makes us, in a way, superior. “Our technology is nowhere near being able to take something as dumb as food and convert it to the kind of energy density we get out of our muscles,” Higgins said.
Heads-up displays and Google Glass-like devices could be the first wearable technology to include the kind of advanced facial recognition systems seen in the movie.
Thanks to the facial recognition technology hardwired into RoboCop’s operating system, it takes him only a matter of seconds to identify potential perpetrators. As he scans the crowd, statistics and each person’s record pop up in his field of vision. While facial recognition is upon us, it’s still in its infancy as an external device.
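At its core, this kind of identification works by turning each face into a numeric "embedding" and comparing it against a database of known faces. Below is a minimal, purely illustrative sketch of that matching step; the names, embedding values, and distance threshold are all invented, and real systems would compute far longer embeddings with a neural network.

```python
import math

# Toy database of known faces: name -> embedding vector.
# Real systems derive these vectors from images with a neural network;
# the numbers and names here are made up for illustration.
KNOWN_FACES = {
    "Suspect A": [0.11, 0.62, 0.48],
    "Suspect B": [0.90, 0.13, 0.77],
}

def identify(embedding, threshold=0.3):
    """Return the closest known identity, or None if nothing is close enough."""
    best_name, best_dist = None, float("inf")
    for name, known in KNOWN_FACES.items():
        dist = math.dist(embedding, known)  # Euclidean distance between vectors
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name if best_dist <= threshold else None

print(identify([0.12, 0.60, 0.50]))  # very close to Suspect A's vector
print(identify([0.50, 0.50, 0.50]))  # far from everyone: no match
```

The threshold is the crucial knob: too loose and innocent bystanders get flagged, too strict and the real perpetrator walks by unrecognized.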
Heads-up displays and Google Glass-like devices could be the first wearable technology to include facial recognition, but Higgins says to truly step into RoboCop territory, “what you really want is to have it directly in your brain -- something that can’t be removed by an enemy or removed by a fall.”
Systems intended for the visually impaired -- that connect with the optic nerve -- do exist, but the vision attained is hazy at best. For example, Higgins says a person might only be able to distinguish the presence of a door.
“To get to the kind of technology you see in RoboCop, you’re talking not about an optic nerve interface, but a brain interface and a really detailed understanding of what the visual cortex does,” he said. “We are not at that point. We do not understand the visual brain well enough to impose that information on top of existing visual information.”
RoboCop's aim is true, but his emotions get in the way.
Once the perps have been identified, it’s time to take them out. In one scene, RoboCop trains to shoot humanoid robots before they shoot him. Equipped with motion tracking, scopes and sights, his visor gives him precision aim. However, since he’s still prone to human emotions, nervousness makes him less effective, causing his shot to be off target. When RoboCop is comfortable, his emotional focus synchronizes with his suit’s system for a more precise shot.
With existing motion tracking technologies, facial recognition and the targeting precision of military drones, it’s not so hard to imagine these technologies applied to heads-up displays and augmented reality glasses. In fact, lesser versions of this tech are already being applied to the helmets of pilots, soldiers and even skiers.
And thanks to electrocardiography sensors and heart rate monitors, predictive technology that gauges levels of emotion is starting to appear. Along the same lines as RoboCop’s training, the video game “Nevermind” monitors a player’s heart rate to establish a fear level as they navigate a nerve-wracking labyrinth. The game becomes more difficult if a player starts panicking. On the other hand, if the player remains calm, the game gets easier.
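The biofeedback loop described for “Nevermind” can be boiled down to a simple rule: measure how far the heart rate has climbed above a resting baseline, map that to a fear level, and scale difficulty with fear. Here is a toy sketch of that idea; the baseline, thresholds, and scaling factor are all invented for illustration and are not the game’s actual tuning.

```python
# Toy version of the heart-rate-driven difficulty scaling described above.
# All numbers are illustrative assumptions, not values from "Nevermind".

RESTING_BPM = 70

def fear_level(current_bpm, resting_bpm=RESTING_BPM):
    """Fear as a 0.0-1.0 fraction of how far heart rate exceeds resting."""
    excess = max(0, current_bpm - resting_bpm)
    return min(1.0, excess / 50)  # 50+ bpm over resting = maximum fear

def difficulty(current_bpm):
    fear = fear_level(current_bpm)
    if fear < 0.2:
        return "easy"    # calm player: the game backs off
    elif fear < 0.6:
        return "normal"
    return "hard"        # panicking player: the game pushes harder

print(difficulty(72))   # near resting: calm
print(difficulty(95))   # moderately elevated
print(difficulty(130))  # racing heart
```

Note the feedback works in the same direction as the article describes: staying calm keeps the game gentle, while panic makes it harder, rewarding emotional control rather than raw reflexes.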
Between shifts, RoboCop goes into full-body docking stations to have essential fluids replaced and circulated.
Like any human or machine, RoboCop must be maintained. Between shifts, he goes into full-body docking stations to have essential fluids replaced and circulated. While docked, his neurotransmitters are manipulated -- specifically dopamine, the neurotransmitter that controls the brain’s reward and pleasure centers.
Higgins pointed out that there have already been a number of studies showing animals can be easily controlled in this manner.
“You can stimulate the pain and pleasure centers in rats and pretty much make them do anything you want by remote control. If ethics allowed, you could do it in humans,” he said. “Even more so, you could give a person pleasure by setting off dopamine pathways if they did what you wanted them to. You could give them pain if they did the wrong thing and you could rapidly condition them to do whatever you want.”
RoboCop isn’t aware of which functions he controls himself and which are controlled by his neural implants.
Part man, part machine, RoboCop isn’t aware of which functions he controls himself and which are controlled by his neural implants. However, in one scene, his human emotions override the operating system and he becomes more autonomous.
The common sci-fi assumption is that the machines will overtake the humans. But could the human brain fight back?
“I think definitely so. This is what the human brain is best at -- adapting to changing conditions,” Higgins said. “Although we may not do it for ethical reasons, we have a pretty good idea of how to manipulate a human brain. If you set a machine to do that, I think a person would eventually figure it out and try to overcome it.”