Mind. Blown. That is how I feel every time I see the demo of the CTRL-labs product. CTRL-labs is creating the future of neural interfaces, interpreting neural signals to control machines. No chips implanted in the skull. Nothing invasive. A simple armband combines signal processing with the latest in ML/AI to interpret activity down to the single-neuron level and determine your intent. Their product literally knows you're thinking about moving your fingers before your muscles fire! It is one of the most startlingly impressive demos I've ever seen; I couldn't stop thinking about it. And the technology could be one of the most important innovations around, with applications in medicine, robotics, productivity, communication, gaming, AR/VR, and more.
The demo. Imagine for a minute: you slide on the armband, rigged with sensors and connected to a mini computer "interpreter" nearby. As you type on a keyboard, the sensors detect the unique neural signals your brain sends to activate the muscles in your fingers. Then, after a minute or two of typing, once the system has learned your unique neural pattern, you push the keyboard away, type on the bare table as if you were hitting the keys, and the typing continues on screen.
Remember the game Asteroids? In the second demo, you lay your hand flat on the table, and as you think about moving your fingers, they twitch almost imperceptibly. The sensors pick up your neural signals, and the latest in ML/AI translates them into near-perfect gameplay on screen without your hands ever moving.
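CTRL-labs has not published how its decoder works, but the two phases the demos describe — a brief calibration while you type on a real keyboard, then decoding signals once the keyboard is gone — can be sketched with a toy model. Everything here is an assumption for illustration: the synthetic 4-channel "signal", the key set, and the nearest-centroid classifier stand in for whatever the real system does.

```python
# Toy sketch of a calibrate-then-decode loop. Purely illustrative:
# the signals are synthetic and the classifier is a simple stand-in.
import math
import random

random.seed(0)

KEYS = ["a", "s", "d", "f"]  # hypothetical set of keys to decode


def synthetic_signal(key, noise=0.3):
    """Fake 4-channel sensor reading: each key excites one channel most."""
    base = [1.0 if k == key else 0.1 for k in KEYS]
    return [x + random.gauss(0, noise) for x in base]


# --- Calibration: type on the real keyboard while signals are recorded ---
centroids = {}
for key in KEYS:
    samples = [synthetic_signal(key) for _ in range(50)]
    # Average each channel across the samples for this key.
    centroids[key] = [sum(ch) / len(samples) for ch in zip(*samples)]


# --- Decoding: keyboard pushed away; classify each new signal ---
def decode(signal):
    def dist(c):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(signal, c)))
    return min(centroids, key=lambda k: dist(centroids[k]))


typed = "".join(decode(synthetic_signal(k)) for k in "adds")
```

The real system presumably works on far richer muscle-activity data and far more capable models; the point of the sketch is only the shape of the demo: learn a per-user mapping during a short calibration, then invert it on live signals.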
These demos are just a proof of concept, but they give you a glimpse into the future. Before long the armband might be miniaturized to fit on your watch, or embedded in your clothing, making it seamless to use.
Imagine the potential in the medical field: giving perception back to amputees. Or controlling robotics, where your hand movements are mimicked perfectly in a warehouse across the country or by robots on Mars. Or in a virtual world, where you no longer need controllers because your body's movements map directly to in-game actions. Or maybe you want to send a text or a note without anyone knowing you're doing it. No keyboard needed. In just a few short years, the way we interact with machines and each other will change dramatically. It starts to look like ESP.
The company was founded by Thomas Reardon, an entrepreneur whom Brad Silverberg (Fuel's co-founder) worked with at Microsoft. Reardon worked on Brad's Windows 95 team and was integral to the success of Internet Explorer. When he founded a wireless networking company called Avogadro in 2000, Brad was an early backer at Ignition. After selling Avogadro to Openwave in 2001, Reardon became the CTO, and his team developed the first mobile web browser.
Reardon left Openwave in 2004 to attend Columbia, earning a degree in Classics, followed by a Masters in Neurobiology from Duke and a PhD from Columbia. During his PhD, he teamed up with a few PhD colleagues and engineers and started CTRL-labs to develop a better human-computer interface.
Lux Capital and Google Ventures are co-leading a new round of funding in the company alongside Vulcan Capital, Founders Fund, Amazon Alexa Fund, and a handful of notable angels. The existing investors, including Fuel, are also investing. We invested in the company’s first round of funding alongside Spark Capital, Matrix Partners (the original investors in the virtual reality company Oculus, which was acquired by Facebook) and Breyer Capital.
We are thrilled to be partners with Reardon and the team at CTRL-labs and can’t wait to see what applications of the technology the team and developers dream up next! If you're a developer and want to integrate neural control into your hardware or applications, you can learn more here.