By now you’ve probably seen some robot-inspired tech products, and that’s not even counting the ones that have been created using 3D printers.
We’ve also seen the advent of augmented reality glasses and other tech that allows you to see virtual images of real world objects.
But what’s the big deal if the technology can’t see your hands? The simple answer is that, until now, it couldn’t. That’s about to change.
With a device that can accurately sense the position and orientation of your body, you’re going to be able to navigate around obstacles, manipulate objects, and even train your brain to learn new tricks.
It’s a concept called “accelerated vision”.
Accelerated vision has been around for decades, but it was only in the past few years that the concept has been embraced by companies like Google, which is using it to create “a real-time augmented reality experience” for its Android devices.
In short, it allows your hand movements to be tracked in real time.
It could be used to simulate a robotic arm or even to help you control a robotic car.
But how does it work?
The key to the technology is an advanced camera and sensor system called a “motorized eye”.
It’s basically a pair of sensors that detect the position of your hands and feed that data to a brain-computer interface, allowing you to “see” your hands.
The interface interprets the data and runs an algorithm that allows the computer to recognise your hand position.
The algorithm then allows you – the human – to move the hand at the desired speed.
The eye can be programmed to respond to movements of your hand with different types of force.
For example, if you hold a tennis racket in your hand and suddenly start swinging it around, the motor would fire a trigger to let the racket go, and the eye would be able to tell that the racket is moving.
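The article doesn’t describe the actual algorithm behind that trigger, but the racket example could be modelled, very roughly, as a speed-threshold check over successive sensor samples. Everything below – the names, the sample format and the threshold value – is invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class HandSample:
    t: float  # timestamp in seconds
    x: float  # tracked hand position along one axis, in metres

def speed_between(a: HandSample, b: HandSample) -> float:
    """Estimate hand speed from two consecutive sensor samples."""
    return abs(b.x - a.x) / (b.t - a.t)

def should_release(samples: list[HandSample], threshold: float = 2.0) -> bool:
    """Fire the 'let go' trigger once the tracked hand exceeds a speed
    threshold, mimicking the racket example: a sudden fast swing trips it."""
    return any(
        speed_between(a, b) > threshold
        for a, b in zip(samples, samples[1:])
    )

# A slow movement followed by a sudden swing:
swing = [HandSample(0.0, 0.0), HandSample(0.1, 0.05), HandSample(0.2, 0.5)]
print(should_release(swing))  # prints True: the last step is 4.5 m/s
```

A real system would of course work on full 3D poses and filtered sensor data; the point is only that “detect the swing, then trigger the motor” reduces to comparing an estimated speed against a threshold.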
That’s the basic idea behind the technology.
But it’s only part of the story.
The technology also has applications for robotics, so the team behind Accelerated Vision is now working on a system that would allow robots to interact with their environment.
For instance, a car could be programmed with a sensor that detects the shape of objects in the environment, and then move around them.
It might even be able to “feel” the environment around it, and act accordingly.
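As a minimal sketch of that idea – not anything described in the article – obstacle avoidance can be reduced to reading clearance distances in a few directions and steering toward the clearest one. The direction names and the clearance threshold are assumptions made up for this example:

```python
# Hypothetical sketch: a robot reads obstacle distances (in metres) from
# three direction sensors and steers toward the direction with the most room.
def choose_heading(distances: dict[str, float], min_clearance: float = 1.0) -> str:
    """Pick the direction with the greatest clearance; stop if all are blocked."""
    best = max(distances, key=distances.get)
    return best if distances[best] >= min_clearance else "stop"

readings = {"left": 0.4, "ahead": 2.5, "right": 1.1}
print(choose_heading(readings))  # prints "ahead"
```

Production systems use far richer representations (occupancy grids, planned trajectories), but the core loop is the same: sense the environment, compare against a safety margin, act.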
What’s more, the device would allow a robot to be given an artificial sense of smell.
If it’s detecting a specific smell, then it could use this to detect the location of objects it’s supposed to be tracking.
The idea is that if a robot is able to recognise objects, it could also be programmed by humans to be aware of the smell.
“It’s an idea that is very much in the realm of science fiction, but this is actually possible,” says Prof Simon Cocks, from the University of Warwick.
“This could potentially be the future of the human brain.”
He says the technology would be relatively simple, and could be integrated into existing prosthetic and robotics hardware.
“The idea is to make it so that the brain is working on things, but not the other way around,” he says.
“So for example, the eyes of a robot could detect a specific object, and send a signal to the brain that it’s an object and it would respond accordingly.”
This is a similar concept to the idea that a car might recognise the shape and location of other cars.
“But this is also going to happen on a human level,” says Cocks.
“There’s a way in which you could make a robot with a human-level sense of proprioception and feel a particular sensation.”
If all goes to plan, you’ll be able to get your hands on a pair in the near future.
The company behind the Accelerated Eye has developed a first prototype, and is looking to commercialise it in the next few years.
In the meantime, you can learn more about the technology in the video below.
Watch: A car’s driver learns to use his own hand, from scratch.