GitHub link for this project:
https://github.com/jackparsons93/BCI-Shooter-Logic
Building a Unity BCI Shooter Where EEG Controls Bullet Power
I’ve been interested in building game mechanics that respond to real-time brain activity instead of just using EEG as a graph on a screen. For this project, I wanted something simple and visual: a 2D shooter where the strength of each projectile changes based on my brain state.
That’s what this repository does. It combines a Python EEG server using OpenBCI and BrainFlow with a Unity project written in C#. The Python side reads EEG data, extracts band-power features, and sends them to Unity over UDP. Then the Unity side turns those values into gameplay by scaling bullet size, damage, UI feedback, and overall weapon power in real time.
The core idea
The main idea is to connect neurofeedback-style metrics to a very direct game mechanic. Instead of showing the player a raw EEG dashboard, the game translates the signal into something easy to feel immediately: stronger shots when the brain-state score is better, weaker shots when it is worse.
In the repo README, I describe the project as a 2D Unity shooter where bullet power is controlled by real-time SMR and TBR brainwave ratios using OpenBCI. The pipeline is Python for EEG acquisition and DSP, then Unity for the game logic and presentation.
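To make the two metrics concrete: TBR (theta/beta ratio) is commonly defined as theta power over beta power, and an SMR ratio compares the 12-15 Hz sensorimotor band against a slower band. The helpers below are my illustrative definitions using the band names from the server's output, not the repo's exact formulas:

```python
def tbr(bands):
    """Theta/beta ratio: lower values usually indicate better focus.
    Assumes the band-power dict produced by the Python server."""
    return bands["theta"] / (bands["betaL"] + bands["betaH"])

def smr_ratio(bands):
    """SMR relative to theta: higher values usually indicate calm alertness.
    Illustrative definition; the repo's exact ratio may differ."""
    return bands["smr"] / bands["theta"]

# Toy band powers for a quick check:
bands = {"theta": 12.0, "betaL": 4.0, "betaH": 2.0, "smr": 3.0}
print(tbr(bands))        # 2.0
print(smr_ratio(bands))  # 0.25
```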
How the EEG gets into Unity

The Python script bci_server.py is the bridge between the OpenBCI hardware and the Unity game. It opens the board through BrainFlow, pulls recent EEG data, computes band powers, and packages the results into a UDP payload for Unity.
self.channel_map = { "F3": 0, "Fz": 1, "F4": 2, "C3": 3, "Cz": 4, "C4": 5, "Pz": 6, "Oz": 7}
I like that the channel map uses real scalp-site names instead of only raw channel numbers. That makes it easier to reason about what site is driving the game logic.
The script then computes band powers from the incoming signal using a bandpass filter plus Welch spectral estimation.
b, a = butter(4, [1.0 / (fs / 2.0), 45.0 / (fs / 2.0)], btype="band")
x = filtfilt(b, a, x)
freqs, psd = welch(x, fs=fs, nperseg=min(len(x), len(x)))
After that, it extracts the actual bands the game cares about.
return {
    "theta": OpenBCIClient._bandpower(freqs, psd, 4.0, 8.0),
    "alpha": OpenBCIClient._bandpower(freqs, psd, 8.0, 12.0),
    "smr": OpenBCIClient._bandpower(freqs, psd, 12.0, 15.0),
    "betaL": OpenBCIClient._bandpower(freqs, psd, 15.0, 20.0),
    "betaH": OpenBCIClient._bandpower(freqs, psd, 20.0, 30.0),
    "gamma": OpenBCIClient._bandpower(freqs, psd, 30.0, 45.0),
}
That is the signal-processing core of the whole project. Without that step, Unity would just be receiving raw EEG samples instead of meaningful features.
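The `_bandpower` helper itself isn't shown above; a plausible stand-in (my sketch, not the repo's code) integrates the Welch PSD over the band with a simple rectangular rule on the uniform frequency grid:

```python
import numpy as np

def bandpower(freqs, psd, lo, hi):
    """Hypothetical stand-in for the repo's _bandpower helper:
    integrate the PSD over [lo, hi) Hz, assuming a uniform frequency grid
    (which is what scipy.signal.welch returns)."""
    mask = (freqs >= lo) & (freqs < hi)
    df = freqs[1] - freqs[0]  # grid spacing in Hz
    return float(psd[mask].sum() * df)

# Sanity check: a flat PSD of 1.0 integrates to the band width in Hz.
freqs = np.arange(0.0, 51.0, 1.0)
psd = np.ones_like(freqs)
print(bandpower(freqs, psd, 8.0, 12.0))  # 4.0 (the alpha band is 4 Hz wide)
```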
Finally, the Python side sends the data over UDP to the local Unity game.
sock.sendto(json.dumps(payload).encode('utf-8'), ("127.0.0.1", 5005))
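The wire format is easy to sanity-check end to end. Here is a minimal loopback demo of the receive side in Python (Unity's receiver does the equivalent in C# on a background thread); the payload fields mirror the names the Unity scripts read, but the values are illustrative:

```python
import json
import socket

HOST, PORT = "127.0.0.1", 5005

def receive_one(sock):
    """Receive a single UDP datagram and decode its JSON payload,
    mirroring what the Unity-side receiver does per packet."""
    data, _addr = sock.recvfrom(4096)
    return json.loads(data.decode("utf-8"))

# Bind a receiver first, then send a payload the way bci_server.py does.
recv_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
recv_sock.bind((HOST, PORT))

payload = {"signal_ok": 1, "performance_score": 0.72}  # illustrative values
send_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_sock.sendto(json.dumps(payload).encode("utf-8"), (HOST, PORT))

received = receive_one(recv_sock)
print(received)

recv_sock.close()
send_sock.close()
```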
I like this separation because it keeps the heavy EEG and DSP work on the Python side while Unity focuses on gameplay and presentation.
The player script
On the Unity side, the main gameplay entry point is PlayerShooter.cs. This script handles horizontal movement and automatic firing. The player can move left and right with the keyboard while the weapon fires on a timed rate.
if (Keyboard.current.dKey.isPressed || Keyboard.current.rightArrowKey.isPressed) moveInput += 1f;
if (Keyboard.current.aKey.isPressed || Keyboard.current.leftArrowKey.isPressed) moveInput -= 1f;
The position is clamped so the player stays inside the lane of play.
float newX = transform.position.x + (moveInput * moveSpeed * Time.deltaTime);
float clampedX = Mathf.Clamp(newX, -8f, 8f);
transform.position = new Vector3(clampedX, fixedYPosition, transform.position.z);
The more important part is the shooting logic. Every time the weapon fires, the script reads the current BCI performance score and passes that value into the bullet.
GameObject newBullet = Instantiate(bulletPrefab, firePoint.position, Quaternion.identity);
float currentFocus = bciReceiver.performance_score;
newBullet.GetComponent<BCIBullet>().PowerUp(currentFocus);
This is really the heart of the gameplay idea. The bullet is not just spawned; it is born with a power level based on the current EEG-derived score.
The script also refuses to fire if the BCI signal is lost.
if (bciReceiver.signal_ok == 0) return;
I like that detail because it makes signal quality part of the game logic instead of pretending the data is always valid.
How the bullet changes with brain state
The script BCIBullet.cs is where the EEG score actually becomes visible. When a bullet is fired, the PowerUp() method scales its damage, size, and color based on the current focus score.
damage = 10 + Mathf.RoundToInt(40f * focusScore);
That means each shot starts with a base damage of 10 and can gain up to 40 more based on the brain-state score.
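That mapping is easy to verify with a quick table. A Python stand-in for the C# line (using `round`, which matches `Mathf.RoundToInt`'s round-half-to-even behavior):

```python
def bullet_damage(focus_score):
    """Python equivalent of the C# line:
    damage = 10 + Mathf.RoundToInt(40f * focusScore)"""
    return 10 + round(40.0 * focus_score)

# Damage across the score range: 10 at the floor, 50 at full focus.
for score in (0.0, 0.25, 0.5, 1.0):
    print(score, bullet_damage(score))  # 10, 20, 30, 50
```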
The same score also controls the projectile size.
float size = Mathf.Lerp(0.5f, 2.0f, focusScore);
transform.localScale = new Vector3(size, size, 1f);
And it changes the bullet color from white toward a brighter yellow-orange as the score rises.
GetComponent<SpriteRenderer>().color = Color.Lerp(Color.white, new Color(1f, 0.8f, 0f), focusScore);
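Both the size and color mappings are plain linear interpolation. A quick Python stand-in for `Mathf.Lerp` (which clamps `t` to [0, 1], as Unity does) shows the ramp:

```python
def lerp(a, b, t):
    """Mathf.Lerp equivalent: linear interpolation with t clamped to [0, 1].
    Unity uses the same clamping for Color.Lerp, channel by channel."""
    t = max(0.0, min(1.0, t))
    return a + (b - a) * t

# Bullet size ramps from 0.5 at score 0 to 2.0 at score 1:
print([lerp(0.5, 2.0, s) for s in (0.0, 0.5, 1.0)])  # [0.5, 1.25, 2.0]
```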
I like this design because it is immediate and easy to understand. The player does not need to stare at a separate EEG graph to know the system is responding. The projectile itself carries the feedback.
Enemy logic
The enemy side is intentionally simple. Enemy.cs moves each enemy downward, destroys it if it falls off-screen, and handles taking damage from bullets.
transform.Translate(Vector3.down * fallSpeed * Time.deltaTime);
if (transform.position.y < -6f)
{
    Destroy(gameObject);
}
When an enemy is hit, its health drops and it flashes red briefly before either recovering or dying.
health -= damageAmount;
GetComponent<SpriteRenderer>().color = Color.red;
Invoke("ResetColor", 0.1f);
I think this works well for a BCI prototype because the game logic stays simple enough that the neurofeedback mechanic remains the main star.
Spawning enemies
EnemySpawner.cs handles the spawn loop. At regular intervals it picks a random X position across the top of the screen and instantiates a new enemy.
if (Time.time >= nextSpawnTime)
{
    SpawnEnemy();
    nextSpawnTime = Time.time + spawnRate;
}
float randomX = Random.Range(minX, maxX);
Vector3 spawnPosition = new Vector3(randomX, spawnY, 0f);
Instantiate(enemyPrefab, spawnPosition, Quaternion.identity);
This is pretty standard arcade spawning logic, but it fits the project well. The important thing is that the enemies give the player something to react to while the BCI-controlled weapon system does its work.
The UI as a live BCI monitor
I also like the way BCIUIManager.cs turns the HUD into a live BCI monitor. It reads values from the receiver and shows signal state, active site, active metric, current raw metric value, baseline, and the final performance score.
string metricLabel = bci.activeMetric == "TBR" ? "TBR" : "SMR Ratio";
float rawValue = bci.currentVal;
Then it formats the actual UI text block:
uiText.text = $"[ BCI MONITOR ]\n" +
              $"Signal: {sigColor}\n" +
              $"Site: {bci.activeSite} | Metric: {bci.activeMetric}\n" +
              $"Current {metricLabel}: {rawValue:0.000}\n" +
              $"Baseline: {bci.baselineVal:0.000}\n" +
              $"Power Level: {(bci.performance_score * 100f):0}%";
I think this is important because it keeps the game readable. The bullet visuals already communicate the feedback indirectly, but the UI lets me also inspect what the system thinks is happening numerically.
The controls shown in the UI make it clear that the player can switch sites and metrics during play.
$"1-4: Change Site (F3, Fz, F4, Cz)\n" +
$"Q: TBR Mode (Focus)\n" +
$"W: SMR Mode (Calm)\n" +
$"SPACE: Reset Baseline"
That turns the project into more than a fixed demo. It becomes a small experimentation platform for trying different EEG control modes.
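The baseline reset hints at how the performance score is likely produced: the current metric is compared against a stored baseline, with the sign flipped for TBR (where lower means better focus). The exact mapping lives in the repo; this is only a plausible sketch of a clamped, baseline-relative score, with `gain` as a made-up sensitivity parameter:

```python
def performance_score(current, baseline, metric="SMR", gain=1.0):
    """Hypothetical baseline-relative score in [0, 1]. Not the repo's code.
    For SMR-style metrics, above-baseline is better; for TBR (theta/beta),
    below-baseline is better, so the relative change is negated."""
    if baseline <= 0:
        return 0.0
    rel = (current - baseline) / baseline  # fractional change vs. baseline
    if metric == "TBR":
        rel = -rel
    # Center at 0.5 when current == baseline, then clamp to [0, 1].
    return max(0.0, min(1.0, 0.5 + gain * rel))

# 20% above baseline SMR, or 20% below baseline TBR, both score ~0.7:
print(performance_score(1.2, 1.0, metric="SMR"))
print(performance_score(0.8, 1.0, metric="TBR"))
```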
Why I like this architecture
What I like most about this repo is the separation of roles. Python handles BrainFlow, filtering, and band-power extraction. Unity handles movement, projectiles, enemies, UI, and the feel of the game. That makes the system easier to reason about and easier to extend.
It also means I can improve the EEG side later without rewriting the shooter logic, or improve the game side later without changing how the signal processing works.
Final thoughts
For me, this project is a fun example of how neurotechnology can become part of gameplay instead of just a research graph. The player moves normally, enemies spawn normally, and bullets fire normally, but the strength of those bullets depends on what the EEG system thinks my brain state is doing in real time.
That is what makes the project interesting to me. It turns EEG into something interactive and visible. Instead of just watching brain data, I get to play with it.
And honestly, that is where BCI game design starts to get exciting.