
6 Questions with Andreas Forsland, CEO of Cognixion

It's our pleasure to catch up with Andreas Forsland, CEO of Cognixion, a company aiming to unlock speech for hundreds of millions of people worldwide by combining hardware and AI.

1. What inspired you to pursue a career in AI, and how did you get started in the field?

My background was in Natural User Experience Design and CX (Customer Experience). I've always been fascinated by the mind and body, and by how humans interact with the world around us.

I've been exposed to AI for a couple of decades, but mostly in the realm of big data, personalized content, and web/mobile experiences. Only over the past decade have processing power and the underlying enabling technologies matured enough to be used or adapted for wearable and truly human-augmentation use cases.

2. Can you give us an overview of how you started Cognixion?

We began as a small startup focused on translating wearable sensor data into easily memorable wireless input controls for mobile devices. In 2015, we imagined being able to use AI to assist with a direct brain interface to control mobile device applications.

In 2016, we ran our first formal IRB-approved study testing a "thought to speech" interface with people with disabilities. We found that they could learn to use a wearable neural interface, but many user, market, and technical requirements made it a daunting challenge to build and launch such a product.

We learned that creating a bionic neural interface to control dynamic mobile content requires an extremely low-latency closed-loop system with tight integration between the sensors, the computer, and the interface, and that AI would be critical at various points in that loop to create a delightful experience anyone could start using out of the box.

We chose to create a fully integrated AR (Assisted Reality) headset with various sensors, including a BCI (brain-computer interface), and wireless network connectivity, giving those who wear it and interact with its applications absolute mobile freedom.

This means that we must have AI running at low levels of the embedded system as well as in the applications on and off the device, specifically for decoding user intentions and reducing the time required to generate outcomes, ranging from spoken words, to prompts for an embedded AI assistant, to controlling the environment around the user.
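To make the closed-loop idea above concrete, here is a minimal, hypothetical sketch in Python. It is not Cognixion's implementation; the sensor source, intent decoder, and output actions are all assumed placeholders, and the latency budget is an illustrative number only.

```python
# Hypothetical sketch (not Cognixion's actual code): a minimal closed loop where
# sensor data is decoded into a user intention and turned into an outcome,
# with a latency budget checked on every pass through the loop.
import time
from dataclasses import dataclass
from typing import Callable, List


@dataclass
class Intent:
    label: str         # e.g. "speak", "prompt_assistant", "control_environment"
    confidence: float   # decoder confidence in [0, 1]


def run_closed_loop(
    read_sensors: Callable[[], List[float]],          # assumed BCI/sensor source
    decode_intent: Callable[[List[float]], Intent],   # assumed on-device decoder
    act: Callable[[Intent], None],                    # speech, prompt, or environment output
    confidence_threshold: float = 0.8,
    max_latency_ms: float = 50.0,                     # illustrative budget, not a real spec
) -> None:
    """Read -> decode -> act, warning whenever one pass exceeds the latency budget."""
    while True:
        start = time.perf_counter()
        sample = read_sensors()
        intent = decode_intent(sample)
        if intent.confidence >= confidence_threshold:
            act(intent)  # e.g. synthesize speech or send a prompt to an assistant
        elapsed_ms = (time.perf_counter() - start) * 1000
        if elapsed_ms > max_latency_ms:
            print(f"warning: loop took {elapsed_ms:.1f} ms (budget {max_latency_ms} ms)")
```

The point of the sketch is simply that decoding and acting happen inside one tight loop, which is why latency and sensor-to-interface integration dominate the design.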

3. What are some of the most exciting and challenging aspects of working in AI today?

We were starved for robust, reliable, and fast AI capabilities for years, and now we have the opposite problem: determining which ones are the best to integrate. Things have accelerated so greatly in the past year, and we anticipate an explosion of capabilities and services will emerge.

We are very selective and critical about how we evaluate or integrate other companies' AI versus our proprietary technology. Because our company, Cognixion, is mission-driven to create a future where people with disabilities are "AI-Powered People", I'd say the most exciting and challenging aspects are in the realm of User Experience and Human Factors Testing, specifically using AI to augment someone's abilities, their speed of communication, and their agility in interacting with things in their world.

Accessible AI is the next frontier, and I'm happy that Cognixion is at the forefront of this convergent aspect of AI technology and society.

4. How do you see AI evolving in the coming years, and what role do you think it will play in society?

It's less about what role it will play, and more about what barriers we see getting in its way, especially for people with disabilities, but this can also include more tech-independent children and senior citizens, too.

We envision that technology, and the user interface to knowledge, services, and the built environment, will become more and more invisible or ambient. We call this Ambient Intelligence: buildings are getting smarter, machines are getting smarter, networks are getting smarter, and devices are getting smaller, smarter, and more humanistic, as an extension of the human experience.

We envision a REAL WORLD in which the human experience is augmented. In fully immersive experiences like XR/VR, the role of AI is to more fully abstract what it's like to be human and to enhance human capabilities in ways that defy the physics or constraints of the real world.

As a result, the real world will be exponentially more interactive and accessible, and virtual worlds will be so convincingly realistic, that the role of AI will need to be extremely ethical, in service of the human, and able to provide people with clarity about what is real and what is not. In a world where AI-generated content makes it extremely difficult to discern fact from fiction, it will be imperative to have fact-checking or other methods to validate information, including which aspects were generated by a human versus the machine.

Lastly, we see a notion of shared autonomy, where the system learns the user even as the user learns the system, and can adapt to the user's situation to accelerate the mutual learning and adaptation of human and machine acting together as an agent.

5. What AI tools have you tried and been impressed with recently?

We don't disclose confidential R&D activities or endorse technologies that we've not yet commercialized. That said, we have fully integrated Amazon Alexa into our Cognixion ONE device.

We are exploring how generative AI could play a valuable role in enhancing the quality of life and daily active living capabilities of people with disabilities.

This means we are focused on AI that provides a meaningful, robust capability for making it easier (or even possible) to interact with other people, places, and things, while also exploring generative AI for creative self-expression and cognitive development.

6. What can we expect to see coming out of Cognixion in the near future?

We are planning to release a version of Cognixion ONE, the assisted reality device with an integrated brain-computer interface, as a medical device that we aim to have cleared by the FDA and fundable by medical insurance.

We will also be releasing a special version for researchers working on next-generation experiences and services, as well as clinical translational research into novel use cases in healthcare.