Your Mind is All You Need: Controlling virtual environments with your brain.

December 4, 2023

Introduction

Virtual environments act as a transition phase from idea to reality. Before deploying a safe and robust real-life solution, we can use virtual worlds to test our implementations accurately, explore the edge cases, and finally bring our ideas to fruition.

All of this serves one goal: enabling people with disabilities (especially quadriplegics) to regain a level of independence. Playing Mario Kart using only your brain is cool, but imagine the impact such a technology - like controlling one's wheelchair with the brain - can have on someone's life.

This article explores how we create virtual environments that enable people to overcome their physical limitations by empowering the mind.

Our virtual world

In our group, we have two types of virtual environments. The first one is used for data acquisition and the second one is used to test the brain-computer interface (BCI). For implementation, we are using the Unity game engine, which is an established environment for game development.

Data Collection Environment


Imagine moving your arms without actually moving them. This is called motor imagery, and research shows that patterns in the neuronal signals can be identified and linked to the imagined action. By feeding a lot of neuronal data to a machine learning algorithm, we hope it learns to recognize these patterns and classify the corresponding action. This is why we need an environment for collecting data.
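To make the idea concrete, here is a minimal, illustrative sketch of how a motor imagery trial could be reduced to features and classified. This is not our actual pipeline: the log-variance feature (a common proxy for band power after band-pass filtering, which is omitted here) and the nearest-centroid classifier are textbook placeholders, and all names are our own.

```python
import math

def log_variance_features(trial):
    """Per-channel log-variance of a trial (list of channels,
    each a list of samples). After band-pass filtering around the
    mu/beta rhythms, variance approximates band power; the
    filtering step is omitted in this sketch."""
    feats = []
    for channel in trial:
        mean = sum(channel) / len(channel)
        var = sum((x - mean) ** 2 for x in channel) / len(channel)
        feats.append(math.log(var + 1e-12))  # small offset avoids log(0)
    return feats

def nearest_centroid(feats, centroids):
    """Classify a feature vector by squared Euclidean distance to
    per-class centroids (a stand-in for a trained model)."""
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: sq_dist(feats, centroids[label]))
```

In practice the centroids (or any other model parameters) would be learned from the labeled data collected in the environment described above.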

The requirements to enable the collection of data are straightforward:

  • Show the person the task to be done.
  • Give the person time to imagine the task.
  • Give the person a bit of time to reset.
  • Collect and label the data.
  • Repeat.
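The steps above can be scripted as a trial schedule. The sketch below is a simplified illustration: the timings, the cue set, and the function names are assumptions for this example, not our exact protocol.

```python
import random

# Assumed timings in seconds; the actual experiment may differ.
CROSS_S, CUE_S, REST_S = 1.0, 4.0, 2.0
TASKS = ["left_hand", "right_hand", "feet", "rest"]  # example cue set

def make_schedule(trials_per_task, seed=0):
    """Return a list of (onset_time, event) tuples for one session,
    cycling yellow cross -> cue icon -> black screen per trial,
    with the cue order shuffled to avoid anticipation effects."""
    rng = random.Random(seed)
    cues = TASKS * trials_per_task
    rng.shuffle(cues)
    schedule, t = [], 0.0
    for cue in cues:
        schedule.append((t, "cross")); t += CROSS_S
        schedule.append((t, f"cue:{cue}")); t += CUE_S
        schedule.append((t, "blank")); t += REST_S
    return schedule
```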

We built a game in Unity that shows the subject icons which uniquely identify the tasks to be performed. The cycle of images displayed on the screen is: yellow cross – cue icon – black screen. To collect the signals, we use the lab streaming layer (LSL), a middleware for synchronizing data streams that was built specifically with neuronal signals in mind. Every time an icon is shown, an associated marker is sent to the LSL stream and can then be matched to the EEG data. This way we can collect labeled data, use it to gain insight into our experimental design, and train our machine learning algorithms.
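Because the marker and EEG streams share the LSL clock, labeling the data reduces to cutting an epoch of samples around each marker timestamp. A minimal sketch, assuming a fixed sampling rate and that the first EEG sample sits at time zero (both simplifications of the real synchronization):

```python
def epoch_indices(marker_time, fs, t_min, t_max):
    """Map a marker timestamp (seconds, on the shared LSL clock)
    to a [start, stop) sample index range in the EEG recording.
    fs is the sampling rate; t_min/t_max define the window
    relative to the marker (e.g. 0.0 to 4.0 s after a cue)."""
    start = int(round((marker_time + t_min) * fs))
    stop = int(round((marker_time + t_max) * fs))
    return start, stop
```

For example, with a 250 Hz recording, a cue marker at t = 2.0 s and a 0–4 s window yields samples 500 to 1500.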


Figure 1: Image of a cue shown during the data collection experiment.

BCI Environment

Let’s use the example of controlling a wheelchair. One should be able to move it forward, backward and rotate it left and right. In the case of quadriplegic people, this is currently done via a joystick controlled with the chin or by detecting head gestures. Our aim is to control the wheelchair with the mind. There are risks involved with this technology, such as wrongly classifying the signals and moving the wheelchair in an undesired direction. This can lead to accidents, which we wish to eliminate. To ensure the robustness of our BCI, we test it in a virtual environment that simulates moving a wheelchair inside a room. Here, the pilot must move the wheelchair from point A to point B without colliding with the obstacles spread across the defined space.
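One common safeguard against misclassification is to act only when the classifier is confident. The sketch below illustrates this idea with a rejection threshold; the label-to-command mapping, the threshold value, and the function name are hypothetical, not our actual control scheme.

```python
# Hypothetical mapping from classifier labels to wheelchair commands.
ACTIONS = {"left_hand": "rotate_left", "right_hand": "rotate_right",
           "feet": "forward", "rest": "stop"}

def decide(probs, threshold=0.7):
    """Translate class probabilities into a wheelchair command,
    stopping whenever the top class falls below the rejection
    threshold - one simple way to reduce accidental movements."""
    label = max(probs, key=probs.get)
    if probs[label] < threshold:
        return "stop"
    return ACTIONS[label]
```

The same idea transfers from the virtual wheelchair to a real one: an uncertain prediction should default to the safest action.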

We read EEG data in real time via the LSL, filter it with our signal processing pipeline, classify it with our pretrained models, and translate the predictions into actions in our Unity game. A glimpse of our environment can be seen in Figure 2. We opted for this environment because of the list of tasks released by the Cybathlon committee, which will be presented in the following section.
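Conceptually, this real-time loop reduces to: buffer a window of samples, classify it, and translate the prediction into a game action. The stripped-down sketch below shows only that skeleton; in the real system the samples arrive through an LSL inlet, the classifier is a pretrained model, and the action is sent to Unity.

```python
def bci_loop(sample_stream, window, classify, act):
    """Minimal sliding-window decoding loop: accumulate `window`
    samples, classify the buffer, translate the label into an
    action, then start a fresh window. Returns all emitted
    actions so the loop can be tested offline."""
    buf, actions = [], []
    for sample in sample_stream:
        buf.append(sample)
        if len(buf) == window:
            actions.append(act(classify(buf)))
            buf.clear()
    return actions
```

Replacing the iterable with a live LSL inlet and `act` with a call into the game is what turns this offline skeleton into the online BCI.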


Figure 2: Screenshot from our wheelchair prototype game.

Cybathlon Tasks

To test the quality of our brain-computer interface, we aim to participate in the ETH Cybathlon 2024 in the BCI discipline. There, our quadriplegic pilot will use our BCI to solve virtual tasks using only neuronal signals as input. The virtual tasks are related to daily use cases [1]. One of them is steering a wheelchair through a room; another is moving a cursor on a screen and clicking different buttons. The beauty of these tasks is that, when translated into real life, they can be of great assistance to quadriplegic people.


Figure 3: Screenshot of wheelchair task from the Cybathlon 2024 Races and Rules Document.


Figure 4: Screenshot of cursor task from the Cybathlon 2024 Races and Rules Document.

We let our creativity flourish inside the virtual world, explore unknown lands, and gain insights we have never seen before. And, with the knowledge we acquire, we hope to bring the real world one step forward.

Article by Iusti

Edited by Carlota and Leona


References and Thumbnail:

[1] https://cybathlon.ethz.ch/documents/races-and-rules/CYBATHLON%202024/CYBATHLON_RacesAndRules_2024.pdf
