Here’s a proposition for you… I’d like you to program a computer to understand thoughts in real time, and use those thoughts to extend the definition of the body to anything digital. Oh, and you have to do it in three months.
That’s the situation we found ourselves in back in 2010. We built one of the world’s first intracortical neural interfaces. At the beginning, I thought it might be an impossible task. I mean, that’s all science-fiction stuff, right? But, three months later, it was working.
It turns out, like most things we work with, it is just a big data analysis problem. Here’s the trick… neurons vote.
Each stream of what are called spike trains in the literature is basically time-series data of neuron firing events. It’s really no different from data coming from customers: every action a customer takes emits an event. Just as you spot trends in how customers vote with their actions, when neurons fire faster or slower than their baseline, you can pick that up and act on it.
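To make that concrete, here’s a minimal sketch of the idea (not the actual system we built): bin spike timestamps into fixed-width windows to estimate a firing rate, then flag windows where the rate departs from baseline. The function names and numbers are illustrative.

```python
def spikes_per_bin(spike_times, bin_width, duration):
    """Count spikes in fixed-width time bins -- a simple firing-rate estimate."""
    n_bins = round(duration / bin_width)
    counts = [0] * n_bins
    for t in spike_times:
        idx = int(t / bin_width)
        if 0 <= idx < n_bins:
            counts[idx] += 1
    return counts

def rate_deviations(counts, baseline, factor=2.0):
    """Flag bins where the firing rate is much higher or lower than baseline."""
    return [i for i, c in enumerate(counts)
            if c > baseline * factor or c < baseline / factor]

# A neuron that fires quietly, then bursts in the last 100 ms window:
spike_times = [0.05, 0.15, 0.21, 0.22, 0.23, 0.24, 0.25]
counts = spikes_per_bin(spike_times, bin_width=0.1, duration=0.3)
burst_bins = rate_deviations(counts, baseline=1)  # bins firing 2x above/below normal
```

The same pattern (windowed counts plus a deviation test) works on customer event streams, which is why it felt like a familiar data problem.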
By directly connecting to neurons in the motor cortex and basal ganglia, we got streams of these neurons voting to control a video game, but the principle would work the same for controlling a robotic neural-prosthetic arm.
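The “neurons vote” idea can be sketched as a toy population decoder. This is an illustration of the principle, not our implementation: each neuron is assigned a preferred direction, its vote is weighted by how far its firing rate sits from baseline, and the sign of the tally drives the cursor. All names and values here are hypothetical.

```python
def population_vote(rates, baselines, preferred):
    """Toy population decoder: each neuron votes for its preferred direction
    (+1 = right, -1 = left), weighted by its firing-rate change from baseline.
    The sign of the tally decides the movement command."""
    tally = sum((r - b) * p for r, b, p in zip(rates, baselines, preferred))
    if tally > 0:
        return "right"
    if tally < 0:
        return "left"
    return "hold"

# Two neurons: the first (prefers right) fires well above baseline,
# the second (prefers left) fires below baseline -- both votes say "right".
command = population_vote(rates=[10, 2], baselines=[5, 5], preferred=[+1, -1])
```

Real decoders fit those weights from data rather than assigning them by hand, but the voting intuition is the same.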
Event data is powerful.
If you’d like to know more about how this no-longer-science-fiction project was built, or how to build your own neural interface, we have a case study you can download.