Now, Cognixion is bringing its AI communication app to the Vision Pro, which Forsland says has more functionality than the purpose-built Axon-R. "The Vision Pro gives you all of your apps, the app store, everything you want to do," he says.
Apple opened the door to BCI integration in May, when it announced a new protocol to allow users with severe mobility disabilities to control the iPhone, iPad, and Vision Pro without physical movement. Another BCI company, Synchron, whose implant is inserted into a blood vessel adjacent to the brain, has also integrated its system with the Vision Pro. (Apple is not known to be developing its own BCI.)
In Cognixion's trial, the company has swapped out Apple's headband for its own, which is embedded with six electroencephalographic, or EEG, sensors. These collect information from the brain's visual and parietal cortices, located at the back of the head. Specifically, Cognixion's system identifies visual fixation signals, which occur when a person is maintaining their gaze on an object. This allows users to select from a menu of options in the interface using mental attention alone. A neural computing pack worn at the hip processes brain data outside of the Vision Pro.
"The philosophy of our approach is around reducing the amount of burden that is being generated by the person's communication needs," says Chris Ullrich, Cognixion's chief technology officer.
Current communication tools can help but aren't ideal. For instance, low-tech handheld letterboards allow patients to look at certain letters, words, or pictures so that a caregiver can guess their meaning, but they're time-consuming to use. And eye-tracking technology is still expensive and not always reliable.
"We actually build an AI for each individual participant that is customized with their history of speaking, their style of their humor, anything they've written, anything they've said, that we can gather. We crunch all that down into something that is a user proxy," Ullrich says.
