Design & implementation for Assisted Reality at Cognixion

Company Overview

Cognixion is a startup that blends augmented reality with assistive technology, such as brain-computer interfaces, to help people with a wide range of physical and neurological abilities communicate and interact more effectively with the world around them.

Key definitions include:

Brain Computer Interface (BCI): “Brain Computer Interface (BCI) is a computer-based system that acquires brain signals, analyzes them, and translates them into commands that are relayed to an output device to carry out a desired action.” - National Institutes of Health

Visual Stimulus: A visual stimulus is a picture or color shown in front of a user to stimulate the brain and generate brain signals.

My Contributions

  • Informed product direction

  • Defined and implemented visual stimulus

  • Built interactive prototypes based on a complex code base in Unity

  • Created spatial design (AR) guidelines tailored to BCI

  • Shortened development cycles by implementing visual stimuli in the production branch

  • Advocated for design within Cognixion and the XR Access community

How It All Started…

Cognixion hired me as a UX Consultant in June 2022 to lead research efforts for their AR + BCI software. I conducted user interviews, synthesized the interview data, and formed a design point-of-view recommendation to inform strategic product decisions being made at the time. At the end of the project, I moved on to a full-time prototyping role at a VR company.

Boomeranged Back - Bridging Gaps

In November 2022, Cognixion was ready to build their new AR + BCI software. Beyond all the context I had already gathered, I was hired (and thrilled!) to help execute on the latest product strategy and close two important gaps:

  • Bridge design and bio-signal work by collaborating with the bio-signal team to define research-informed visual stimulus animations for user focus and attention detection

  • Bridge design and engineering work by partnering with engineering to prototype on top of a complex code base and implement visual stimuli in production

Visual Stimulus Definition & Testing

I joined the bio-signal team's daily standups to understand their world and quickly learn how to create effective yet user-friendly stimuli.

Then, I started exploring visual stimulus animations based on bio-signal research conducted by the team. All animations were built using Unity Shader Graph.
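To give a sense of what these animations do, here is a minimal C# sketch of a frequency-based flicker, the kind of stimulus used for attention detection. This is a conceptual analogue only: the production animations were authored in Shader Graph, and the field and property names below (flickerFrequencyHz, _BaseColor) are illustrative assumptions, not the actual implementation.

```csharp
using UnityEngine;

// Conceptual sketch: pulse a stimulus between two colors at a target frequency.
// In production this logic lived in Shader Graph nodes, not a MonoBehaviour.
public class FlickerStimulus : MonoBehaviour
{
    [SerializeField] private float flickerFrequencyHz = 10f; // illustrative attention-detection frequency
    [SerializeField] private Color colorA = Color.black;
    [SerializeField] private Color colorB = Color.white;

    private Material _material;
    private static readonly int BaseColor = Shader.PropertyToID("_BaseColor"); // assumed property name

    private void Awake()
    {
        // Instantiate a per-renderer material copy so edits don't leak to other objects.
        _material = GetComponent<Renderer>().material;
    }

    private void Update()
    {
        // Sinusoidal blend between the two colors at the target frequency.
        float t = 0.5f * (1f + Mathf.Sin(2f * Mathf.PI * flickerFrequencyHz * Time.time));
        _material.SetColor(BaseColor, Color.Lerp(colorA, colorB, t));
    }
}
```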

To validate the efficacy of the visual stimulus designs, I ran multiple user testing sessions.

After iterating based on user testing feedback, I documented the top-performing results as recommended values for each parameter.

I further expanded the documentation into spatial design guidelines for BCI.

We focused on the medium- and top-performing variants in further design work.

Visual Stimulus Prototyping & Implementation

I joined the engineering team's daily standups to learn how they worked together and to build rapport.

Pair programming with developers in Unity was key to integrating my prototypes seamlessly with their code base. I created branches from the develop branch so I could work with the latest code in the repo while isolating my own work.

This approach allowed me to quickly navigate their code, subscribe to relevant interaction events, and identify the selected visual stimulus using existing arguments and methods in my own code.
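As an illustration of that event-driven approach, here is a hedged C# sketch of subscribing to a selection event and reading the selected stimulus from its arguments. The event, class, and argument names (InteractionEvents, OnStimulusSelected, StimulusSelectedArgs) are hypothetical stand-ins for the ones in the actual code base.

```csharp
using System;
using UnityEngine;

// Sketch of hooking prototype code into existing interaction events.
public class StimulusSelectionLogger : MonoBehaviour
{
    private void OnEnable()
    {
        InteractionEvents.OnStimulusSelected += HandleStimulusSelected;
    }

    private void OnDisable()
    {
        InteractionEvents.OnStimulusSelected -= HandleStimulusSelected;
    }

    private void HandleStimulusSelected(StimulusSelectedArgs args)
    {
        // Identify which visual stimulus the user focused on,
        // using arguments the existing methods already provide.
        Debug.Log($"Selected stimulus: {args.StimulusId}");
    }
}

// Hypothetical shapes of the existing types, shown only so the sketch compiles.
public static class InteractionEvents
{
    public static event Action<StimulusSelectedArgs> OnStimulusSelected;
}

public class StimulusSelectedArgs
{
    public string StimulusId;
}
```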

This foundation allowed me to prototype with more autonomy. For example, I added keyboard controls to adjust visual stimulus parameters (e.g., size, spacing) at runtime based on user testing feedback.
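Below is a rough sketch of what those runtime controls might look like, assuming the stimulus parameters are exposed as material properties; the key bindings and property names (_Size, _Spacing) are illustrative, not the actual ones.

```csharp
using UnityEngine;

// Sketch of runtime keyboard controls for tuning a stimulus during user testing,
// so variants can be compared mid-session without rebuilding the app.
public class StimulusTuningControls : MonoBehaviour
{
    [SerializeField] private Material stimulusMaterial;
    [SerializeField] private float step = 0.05f;

    private static readonly int Size = Shader.PropertyToID("_Size");       // assumed property name
    private static readonly int Spacing = Shader.PropertyToID("_Spacing"); // assumed property name

    private void Update()
    {
        // Arrow keys adjust size; bracket keys adjust spacing.
        if (Input.GetKeyDown(KeyCode.UpArrow))      Adjust(Size, +step);
        if (Input.GetKeyDown(KeyCode.DownArrow))    Adjust(Size, -step);
        if (Input.GetKeyDown(KeyCode.RightBracket)) Adjust(Spacing, +step);
        if (Input.GetKeyDown(KeyCode.LeftBracket))  Adjust(Spacing, -step);
    }

    private void Adjust(int propertyId, float delta)
    {
        stimulusMaterial.SetFloat(propertyId, stimulusMaterial.GetFloat(propertyId) + delta);
    }
}
```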

Beyond prototyping, I also collaborated with engineers to implement stimulus animations and other UI assets in production.

I focused on the frontend and needed to make sure my implementation worked seamlessly with the backend logic.

For example, documenting the parameters and their references (such as frequency) that I used in Shader Graph was key so that engineers could pass them as arguments in their code.
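The sketch below shows the spirit of that handoff: one documented place that maps the exposed Shader Graph reference names to setters the backend can call. _Frequency reflects the parameter mentioned above; _Contrast and the Apply signature are illustrative assumptions.

```csharp
using UnityEngine;

// Sketch of a documented bridge between frontend shader parameters and backend code.
public static class StimulusShaderParams
{
    // Exposed Shader Graph reference names, documented for backend engineers.
    public static readonly int Frequency = Shader.PropertyToID("_Frequency"); // flicker frequency, Hz
    public static readonly int Contrast  = Shader.PropertyToID("_Contrast");  // illustrative, 0..1

    // Backend logic can call this with values it computes or configures.
    public static void Apply(Material stimulusMaterial, float frequencyHz, float contrast)
    {
        stimulusMaterial.SetFloat(Frequency, frequencyHz);
        stimulusMaterial.SetFloat(Contrast, contrast);
    }
}
```

Centralizing the property IDs like this keeps the shader reference names in one place, so a rename in Shader Graph only needs one corresponding code change.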

Below is an example of a visual stimulus I implemented, along with its material instance showing the parameters adjustable via the backend.


Product Demos

Here is an experience that lets users with limited motor abilities respond to a yes/no question quickly by focusing on the visual stimulus.

Yes/No/Rest BCI Demo

The following video demonstrates a conversation between a caregiver and a user with limited motor abilities. The user types out responses using a BCI keyboard, enabling them to communicate freely.

BCI Keyboard (5-key constraint)

What’s Next …

Establishing a strong foundation by defining spatial design guidelines and developing familiarity with the tech stack was critical.

To go further, I will leverage this foundation to explore LLM-driven BCI experiences that make responses more relevant to the conversation and reduce user effort.
