In movies, we sometimes see characters use telekinesis, moving objects without touching them. Of course, special effects, whether simple wire rigs or full CGI, are what is really behind the movement.
You can even create the illusion of telekinesis in real life with the right combination of wires, fake walls, willing stunt participants, and a setting that draws in an audience unaware of the setup.
That’s what you see in this video of a setup for an elaborate prank at what looks like an ordinary coffee shop:
But telekinesis is possible with some help from technology - so long as you limit yourself to manipulating objects in virtual reality. That is what the invention from CTRL-labs enables, as you can see in this video demonstration:
The CTRL-kit makes it possible to use one’s hands to pick up and finely control the speed of objects in 3D space. It empowers people to interact with virtual environments the way we naturally interact with the real world.
The technology was unveiled in December 2018, when CTRL-labs CEO Thomas Reardon introduced Slush attendees to CTRL-kit, a non-invasive electromyography device that translates neural signals into control. Using the CTRL-kit SDK and API, developers can integrate neural control into XR, robotics, productivity, and clinical research applications.
See the video of his presentation below:
To obtain further insight into this human-machine interface, I sent some questions to Adam Berenzweig, Director of Research and Development at CTRL-labs. They are shown as the headers above his answers below.
Is your wrist-strapped device your main focus?
Yes, our main focus and product is CTRL-kit, which is a wearable, non-invasive neural interface device and SDK that directly decodes neural signals with single-neuron resolution. The device sits snugly on the forearm, capturing intention at the surface of the skin. This hardware, in concert with our SDK, allows XR developers, roboticists, designers, and researchers to build personalized neural control experiences.
What prompted your company to pursue this invention?
We invented CTRL-kit to address the fundamental problem that human output -- our capacity to do and produce things -- is detrimentally limited by the unnatural and impersonal control schemes of existing devices. While humans have a remarkable capacity to consume and process information, current technologies restrict human output bandwidth with an unavoidable layer of friction.
We saw the need for an entirely new type of technology that decodes the nervous system and adapts to your intentions. Current pervasive input paradigms require steep learning curves.
Touch screens are error-prone and struggle to perceive pressure. Computer vision requires line-of-sight to the subject, leading to latency and occlusion.
To eradicate this problem, we’re building a device that captures intention via the body’s innate output signaling source: the neuromuscular system. We developed CTRL-kit to decode neural signals on the surface of the forearm to effortlessly and reliably transform intentions into actions.
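To give a sense of the signal-processing idea behind surface electromyography, the sketch below shows a classic rectify-and-envelope approach to detecting muscle activation. This is a hypothetical toy example, not CTRL-labs' actual decoding pipeline (which resolves activity at single-neuron resolution); the sample rate, synthetic signal, and threshold are all assumptions chosen for illustration.

```python
import numpy as np

# Hypothetical illustration only: NOT CTRL-labs' decoder.
# Classic EMG onset detection: rectify, smooth, threshold.

rng = np.random.default_rng(0)
fs = 1000                      # sample rate in Hz (assumed)
t = np.arange(0, 2.0, 1 / fs)  # two seconds of signal

# Simulate baseline noise, with a burst of muscle activity from t = 1.0 s
emg = rng.normal(0, 0.05, t.size)
emg[t >= 1.0] += rng.normal(0, 0.5, int((t >= 1.0).sum()))

# 1) Rectify: muscle activity shows up as amplitude, regardless of sign
rectified = np.abs(emg)

# 2) Envelope: smooth with a 100 ms moving average
win = int(0.1 * fs)
envelope = np.convolve(rectified, np.ones(win) / win, mode="same")

# 3) Threshold: declare "intent" when the envelope exceeds a multiple
#    of the resting baseline (first 0.5 s of the recording)
threshold = 5 * envelope[: int(0.5 * fs)].mean()
onset_idx = int(np.argmax(envelope > threshold))
print(f"activation detected at t = {t[onset_idx]:.2f} s")
```

Real systems go much further (per-electrode decomposition, machine-learned gesture models), but the chain of capture, feature extraction, and event detection is the same basic shape.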
The video above shows that in action. “CTRL-kit detects the ‘intent to jump’ before a finger even hits the keyboard. Dino Game can be played without a keyboard, with your arms crossed, or even with your hands in your pockets.”
How long did it take to perfect the prototype?
CTRL-labs was founded in 2015, and we publicly announced the current version of CTRL-kit in 2018, which is now in the hands of select developers.
Do people complain about having to hold a device to interact in a VR environment?
Yes, many VR developers complain about having to hold a device. Existing hardware such as Valve's Knuckles controllers reveals that this is a core pain point in the VR industry. Highly articulated and expressive hands are critical to a great immersive experience.
This is why we designed CTRL-kit to be worn at the forearm, to give users more freedom in their movements and solve the problem of feeling restricted by having to hold a device in their hands.
In the video above, you can see how it allows people to play video games without any form of handheld controller. “Stop playing the controller, play the game” is the appealing proposition that CTRL-kit offers.
Is it useful only for VR games, or do you anticipate other uses for it?
VR is one of many use cases for CTRL-kit. In addition to XR developers, our early access partners include roboticists, designers, and scientific researchers. CTRL-kit is designed for anyone looking to reimagine human-machine interactions and the way we control and do things.
Long-term, we are paving the way for mass consumer adoption of non-invasive neural interface technology within all human-machine interactions, which would capture a global smartphone audience of over 3 billion people.
People would use CTRL-kit in their everyday lives to control all types of devices such as their computer, phone and smart devices at home. It would replace less intuitive hardware like the mouse and keyboard and eliminate the need to type, swipe, click, drag, zoom, etc. to control and do things.
The video above shows how hand movements can be used to generate colorful visuals and sound in Electric Dots, a real-time fractal particle system.
At what point do you expect it would be available on the consumer market?
CTRL-kit will be commercially available in 2020. It is also currently in preview with select developer partners, so those who are interested in obtaining it can apply for early access by joining the company's waitlist.
However, he could not offer definitive answers about the cost of the product or whether it will be available in stores. No matter the price, they'd have some very willing customers, as indicated by the first comment on the video below.
It shows a CTRL-kit experience that enables one “to engage force and dynamic range to control things at a distance.” The virtual environment offers various objects that can be manipulated in different ways. That allows the player to “explore hand poses, and add custom events to ‘flick’ items away or ‘snap’ objects into assembly.”
The enthusiastic response to the video from at least one viewer: "Take my money please."