Wunderstory dev@wunderstory.io

We gave a fruit fly a drone.

A fruit fly's brain has been mapped, neuron by neuron. Run that map on a GPU and you have a working fly mind in software. We have given it a body, and a drone for it to pilot into the physical world.

Left, the drone's view of CrazySim. Right, all 127,000 fly neurons firing in real time as the connectome flies it.

A real brain, made to run.

FlyWire is a neuron-level map of a fruit fly's brain, charted from a real Drosophila melanogaster. We run all 127,000 cells as a leaky integrate-and-fire network on a single GPU using Brian2 and GeNN. Sensory inputs inject current into identified neuron populations using FlyWire IDs. In the drone, descending-neuron output becomes velocity setpoints that the Crazyflie firmware turns into rotor commands. Brain state persists when the body changes: voltages, refractory windows, plasticity, hunger.
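The leaky integrate-and-fire dynamics are simple enough to sketch. Below is a minimal pure-Python version of a single neuron's update, not our Brian2/GeNN code; the time constant, thresholds, and current values are illustrative placeholders, not FlyWire-calibrated parameters.

```python
# Illustrative leaky integrate-and-fire update for one neuron.
# All parameter values are placeholders, not the production ones.

def lif_step(v, i_in, dt=1e-4, tau=0.02, v_rest=-0.065,
             v_thresh=-0.050, v_reset=-0.070, r_m=1e8):
    """Advance one membrane voltage by dt seconds.

    Returns (new_voltage, spiked). i_in is injected current in amps,
    e.g. sensory drive targeted at an identified population.
    """
    dv = (-(v - v_rest) + r_m * i_in) * dt / tau
    v = v + dv
    if v >= v_thresh:
        return v_reset, True   # spike, then reset
    return v, False

# With no input the voltage leaks back toward rest; with enough
# injected current it crosses threshold and fires.
v, spiked = -0.065, False
for _ in range(1000):
    v, spiked = lif_step(v, i_in=2e-10)
    if spiked:
        break
```

In the real system this update runs for all 127,000 cells in parallel on the GPU, with synaptic input replacing the constant drive shown here.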

Live spike activity. Every dot is a neuron; brightness is its recent firing rate, log-scaled. Captured during a drone flight.
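The brightness mapping in that view can be sketched in a few lines. This is a hypothetical version, assuming a sliding-window rate and a saturation rate we chose for illustration, not the visualizer's actual constants.

```python
import math

# Hypothetical sketch of the heatmap's brightness mapping: count a
# neuron's spikes in a recent window, convert to a rate, and log-scale
# into [0, 1]. Window length and rate_max are illustrative assumptions.

def brightness(spike_times, now, window=0.5, rate_max=100.0):
    """Log-scaled brightness for one neuron from its recent spike times (s)."""
    recent = [t for t in spike_times if now - window <= t <= now]
    rate = len(recent) / window                     # spikes per second
    return math.log1p(rate) / math.log1p(rate_max)  # 0 = dark, rate_max = full
```

Log scaling keeps sparsely firing neurons visible next to bursting ones, which is why the heatmap shows structure rather than a few hot pixels.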

Bodies in use.

i.

Virtual fly. running.

NeuroMechFly v2: a biomechanical fruit-fly model, micro-CT-scanned and rigged for MuJoCo physics.

ii.

Drone, in simulation. running.

A Crazyflie 2.1 inside CrazySim's software-in-the-loop arena. The same descending-neuron output, mapped to velocity setpoints. The drone runs the unmodified Crazyflie firmware over the same transport a physical drone uses.
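As a sketch of that mapping, here is one way firing rates could become a velocity setpoint. The population names and gains are hypothetical, for illustration only; the real routing from FlyWire descending neurons is more involved.

```python
# Hypothetical sketch of turning descending-neuron firing rates into a
# Crazyflie velocity setpoint. Population names ("forward", "turn_left",
# ...) and gains are illustrative assumptions; the firmware receives the
# result as (vx, vy, vz, yaw_rate) over the same transport CrazySim uses.

def rates_to_setpoint(rates, v_gain=0.01, yaw_gain=2.0):
    """Map per-population mean firing rates (Hz) to a velocity setpoint.

    rates: dict of population name -> firing rate in Hz.
    Returns (vx, vy, vz, yaw_rate) in m/s and deg/s.
    """
    vx = v_gain * rates.get("forward", 0.0)
    vz = v_gain * (rates.get("climb", 0.0) - rates.get("descend", 0.0))
    yaw = yaw_gain * (rates.get("turn_left", 0.0) - rates.get("turn_right", 0.0))
    return (vx, 0.0, vz, yaw)
```

Because the firmware is unmodified, swapping the simulated body for a physical one only changes where these setpoints are sent, not how they are produced.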

iii.

Drone, in your room. forthcoming.

A physical Crazyflie 2.1, Bitcraze's open-source nano drone, with the AI-deck streaming 320×240 grayscale frames up to the GPU. Descending-neuron output streams back as velocity setpoints; the local socket is replaced by a radio link.
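The uplink side of that loop can be sketched too: pooling a camera frame into a coarse grid of injected currents for visual-input populations. The grid size and current scale here are illustrative assumptions, not the values we use.

```python
# Hypothetical sketch of the vision path: pool a 320x240 grayscale frame
# (as streamed from the AI-deck camera) into a coarse grid, and scale each
# cell's mean intensity into an injected current for the corresponding
# visual-input population. Grid size and i_max are illustrative.

def frame_to_currents(frame, grid=(16, 12), i_max=1e-9):
    """frame: 240 rows x 320 cols of 0-255 ints.

    Returns a 12x16 (rows x cols) grid of currents in amps.
    """
    rows, cols = len(frame), len(frame[0])
    gh, gw = rows // grid[1], cols // grid[0]       # pixels per grid cell
    currents = []
    for gy in range(grid[1]):
        row = []
        for gx in range(grid[0]):
            block = [frame[y][x]
                     for y in range(gy * gh, (gy + 1) * gh)
                     for x in range(gx * gw, (gx + 1) * gw)]
            mean = sum(block) / len(block)
            row.append(i_max * mean / 255.0)        # brighter -> more current
        currents.append(row)
    return currents
```

Each grid cell's current would then be injected into its matching population with the same mechanism the simulator uses, so the brain cannot tell the two bodies apart at this interface.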

Where this goes.

The work is moving outward from the simulator. Each step keeps the same brain state, plasticity, and motivation across the change.

Next. Physical Crazyflie, brain off-board. Vision over the AI-deck, descending-neuron output back as setpoints.

Further. Plasticity in the embodied loop. Synaptic learning during a closed-loop run, with the plasticity layer carrying across body changes.

Long-term. Multiple connectomes in a shared world. Several flies in one virtual space, each with independent state, plasticity, and motivation.

We are building this carefully and in the open. Subscribe for early access. Write us at dev@wunderstory.io to collaborate or see it run.