Biological Data Centers Are Coming, And I Support The Shift
AI is slamming into hard limits of power, water, and cost. That is why I am paying close attention to a radical alternative that uses living human neurons for computation. The idea is not science fiction. It is the core of what Cortical Labs is building, and it is now headed to real-world data centers.
Last year the team showed CL1, a code-deployable biological computer made of roughly 200,000 living human neurons. In February, they demonstrated that these neurons could learn to play Doom. That is a leap beyond their earlier Pong experiments in 2022 and a sign that adaptive, low-power computation may be possible at scale.
Now the effort is expanding from a single dish of brain cells to entire facilities. According to reporting, Cortical Labs is developing biological data centers in Melbourne and Singapore. I support this direction because it confronts the environmental cost of current AI while exploring a complementary path for intelligent systems.
What A Biological Computer Actually Is
The company calls the approach wetware. The core idea is simple to state and complex to execute. Neurons derived from human blood stem cells are grown on a chip. The system sends electrical signals in, then records the neurons' responses as outputs.
In practice, the neurons form networks that self-organize and adapt. They change their activity patterns in response to feedback, a hallmark of learning. The CL1 system wraps this living network with electrodes, sensors, and software so developers can deploy tasks and collect results. You are not writing lines of code for a CPU. You are shaping inputs and rewards to guide a neural culture toward useful behavior.
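To make that concrete, here is a minimal sketch in Python of what such a feedback loop could look like. The `WetwareCulture` class and its methods are hypothetical stand-ins I made up for illustration, not Cortical Labs' actual API; the point is the shape of the loop, where task state is encoded as a stimulation pattern, the culture's response is read back, and feedback closes the loop.

```python
import random


class WetwareCulture:
    """Hypothetical stand-in for a biological compute node's interface."""

    def stimulate(self, pattern):
        # Deliver an electrical stimulation pattern to the electrode array
        # and return the recorded activity. Faked here with random values.
        return [random.random() for _ in pattern]

    def reward(self, value):
        # Deliver structured feedback that nudges the culture toward
        # the desired behavior. A no-op in this sketch.
        pass


def encode(observation):
    # Map task state onto electrode channels as a binary stimulation pattern.
    return [1.0 if x > 0.5 else 0.0 for x in observation]


def decode(activity):
    # Map recorded activity back into a task action (here: a binary choice).
    return 1 if sum(activity) > len(activity) / 2 else 0


culture = WetwareCulture()
for step in range(1000):
    observation = [random.random() for _ in range(8)]  # stand-in task state
    activity = culture.stimulate(encode(observation))
    action = decode(activity)
    # Score the action against the task and close the loop with feedback.
    culture.reward(1.0 if action == int(observation[0] > 0.5) else -1.0)
```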
That makes wetware very different from conventional chips. It is not clocked silicon with fixed instruction sets. It is a living substrate that learns, generalizes, and sometimes surprises. That may sound unsettling, but it is also what makes biological computation so promising for pattern recognition and adaptation.
From Pong To Doom, Why That Matters
Early work focused on Pong. Training a culture to track a bouncing ball proved that simple sensorimotor loops could be learned. The more recent Doom demo matters because it shows progress on complex, dynamic tasks. Doom requires navigation, prediction, and fast reactions in a richer environment.
I do not claim that playing a game equals general intelligence. It does not. But this milestone demonstrates that biological systems can be guided to perform nontrivial tasks under constraints. It also signals that the interface between living neurons and digital systems is maturing. Better inputs, better feedback, and better monitoring lead to better learning.
In other words, we are past the novelty phase. This is starting to look like a platform you can iterate on and scale.
Scaling Into Biological Data Centers
Here is where the plans get concrete. Cortical Labs is partnering with DayOne Data Centers to build two facilities. The Melbourne site is set to house around 120 CL1 units. The Singapore site is slated for as many as 1,000 units.
These will not replace racks of Nvidia accelerators overnight. Instead, they will run side by side with conventional infrastructure. Think of them as specialized nodes for tasks where biological computation excels, especially where low power and on-the-fly learning matter.
To me, the most compelling part of the plan is the shift from a single dish to a network of standardized nodes. That is how you move from a research curiosity to an operational service. You define I/O, calibrate performance, and build orchestration tools that schedule workloads across many biological units.
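As a thought experiment, the orchestration layer for such a fleet might look something like the sketch below. Everything in it, from the `BioNode` fields to the calibration threshold, is my own assumption about what health-aware scheduling would need, not a description of any existing tool.

```python
from collections import deque
from dataclasses import dataclass, field


@dataclass
class BioNode:
    node_id: str
    healthy: bool = True             # culture viability check passed
    calibration_score: float = 1.0   # recent score on a reference task
    queue: deque = field(default_factory=deque)


def schedule(jobs, nodes, min_calibration=0.8):
    """Assign jobs to viable nodes, preferring the best-calibrated ones."""
    eligible = sorted(
        (n for n in nodes if n.healthy and n.calibration_score >= min_calibration),
        key=lambda n: n.calibration_score,
        reverse=True,
    )
    if not eligible:
        raise RuntimeError("no viable biological nodes available")
    for i, job in enumerate(jobs):
        eligible[i % len(eligible)].queue.append(job)  # simple round-robin
    return eligible


nodes = [BioNode("cl1-001"), BioNode("cl1-002", calibration_score=0.6)]
busy = schedule(["anomaly-detection", "adaptive-control"], nodes)
print({n.node_id: list(n.queue) for n in busy})
```

The design choice that matters is the gate on viability and calibration. Living cultures drift in ways silicon does not, so any scheduler for biological nodes has to be health-aware before it can be performance-aware.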
Energy Use Is The Killer Feature
Power is where wetware shines. According to the company, each CL1 node draws less power than a handheld calculator, orders of magnitude below a modern GPU. If future biological data centers truly consume a fraction of the power used by conventional AI processors, the implications for cost and sustainability are substantial.
AI data centers today face tough constraints. They generate noise. They require immense amounts of water for cooling. They can push up local electricity prices and strain grids. Every watt saved cascades through the entire facility, from power distribution to chillers and backup systems. A lower-power compute substrate directly reduces emissions and operating costs.
I am not naive about trade-offs. Wetware will need its own environmental controls to keep cultures viable. But the baseline energy difference is so large that even with overhead, the net footprint could be far lower than silicon-heavy clusters trying to keep GPUs cool around the clock.
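To put rough numbers on the claim, here is a back-of-envelope sketch with assumed figures: a calculator-class draw of about one watt per node versus roughly 700 watts for a current data-center GPU at full load. These are illustrative inputs, not measurements, and they cover the compute substrate only.

```python
# Back-of-envelope comparison with assumed figures, not measured data.
NODE_WATTS = 1.0        # assumed calculator-class draw per biological node
GPU_WATTS = 700.0       # rough TDP of a current data-center GPU
UNITS = 1_000           # planned scale of the larger facility
HOURS_PER_YEAR = 8_760

node_mwh = NODE_WATTS * UNITS * HOURS_PER_YEAR / 1e6
gpu_mwh = GPU_WATTS * UNITS * HOURS_PER_YEAR / 1e6
print(f"1,000 biological nodes: ~{node_mwh:.1f} MWh per year")
print(f"1,000 GPUs at full load: ~{gpu_mwh:.0f} MWh per year")
# Compute substrate only. Incubation, cooling, and power-delivery
# overheads on both sides would change the totals.
```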
What These Systems Might Be Good At
There is a lot we still do not know about practical workloads. That said, I see several near-term fits where biological computation could be compelling:
- Adaptive control and reinforcement learning. Systems that must learn quickly from sparse feedback and adapt to changing conditions.
- Low-power pattern recognition. Continuous signal processing where energy is at a premium, from sensor fusion to anomaly detection.
- Few-shot and online learning. Tasks that benefit from rapid generalization without massive pretraining runs.
- Search in noisy environments. Heuristics that exploit emergent dynamics for efficient exploration.
These examples do not demand peak floating point throughput. They demand data efficiency, adaptation, and robustness under uncertainty. That is where wetware may punch above its weight, while leaving brute-force matrix math to GPUs and specialized accelerators.
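To illustrate the shape of such a workload, here is an ordinary Python version of an always-on anomaly detector that learns its baseline from the stream with constant memory. It runs on any CPU, of course; the point is that this class of continuous, data-efficient task, rather than giant matrix multiplications, is where a low-power adaptive substrate would plausibly compete.

```python
import math


class StreamingAnomalyDetector:
    """Constant-memory online detector: the kind of always-on,
    low-footprint workload described above."""

    def __init__(self, threshold=3.0):
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0            # running sum of squared deviations (Welford)
        self.threshold = threshold

    def update(self, x):
        # Flag the sample if it sits far outside the distribution seen so far,
        # then fold it into the running statistics.
        anomalous = False
        if self.n > 1:
            std = math.sqrt(self.m2 / (self.n - 1))
            anomalous = std > 0 and abs(x - self.mean) / std > self.threshold
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)
        return anomalous


detector = StreamingAnomalyDetector()
readings = [20.1, 20.3, 19.9, 20.0, 20.2, 35.7, 20.1]
print([detector.update(r) for r in readings])  # the spike at 35.7 is flagged
```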
Limitations We Have To Acknowledge
The biggest open question is simple. Can biological computers keep up with the performance requirements of top-tier AI workloads? Right now, there is no evidence that they can match cutting-edge chips on training or inference throughput. We should not pretend otherwise.
There are other challenges too:
- Reproducibility. Ensuring that neuron cultures behave consistently across units and across time.
- Uptime and maintenance. Keeping living systems stable, sterile, and responsive in a data center environment.
- I/O bandwidth. Scaling electrodes, sensors, and signal processing to feed and read large networks.
- Programmability. Providing developers with abstractions, APIs, and tools to target wetware without bespoke tinkering.
- Benchmarking. Defining tasks and metrics that fairly compare biological and silicon approaches.
None of these are trivial. They are also not showstoppers if tackled systematically. The path to viability is clear. Standardize the hardware, automate culture management, harden the software stack, and publish transparent benchmarks.
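On the benchmarking point specifically, a fair comparison needs more axes than raw throughput. Below is a hypothetical sketch of what a published benchmark record could capture; the schema, field names, and numbers are invented placeholders of mine, not results from any real system.

```python
from dataclasses import dataclass


@dataclass
class BenchmarkResult:
    task: str                  # e.g., "sparse-feedback control task"
    substrate: str             # "wetware" or "gpu"
    accuracy: float            # success rate after a fixed training budget
    samples_to_criterion: int  # data efficiency, not just throughput
    energy_joules: float       # measured at the wall, full system


def energy_per_success(r: BenchmarkResult) -> float:
    """Joules per unit of task success: a fairer axis than peak FLOPS."""
    return r.energy_joules / max(r.accuracy, 1e-9)


# Invented placeholder values purely to show the schema, not measurements.
wetware = BenchmarkResult("sparse-feedback control task", "wetware", 0.71, 400, 1_200.0)
gpu = BenchmarkResult("sparse-feedback control task", "gpu", 0.93, 50_000, 250_000.0)
for r in (wetware, gpu):
    print(r.substrate, round(energy_per_success(r)), "J per unit of success")
```

Energy per successful task and samples needed to reach criterion are the axes where a biological substrate would have to prove itself; accuracy and throughput are where silicon will likely keep the lead.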
How The Wetware Interface Works
It helps to demystify the interface. The neurons in CL1 are derived from human blood stem cells. The system sends electrical patterns into the culture through an array. It then reads the resulting activity through embedded chips. Over time, guided by reward signals, the culture rewires and improves its responses.
This is not mind uploading or consciousness in a jar. It is a lab-grown neural network that responds to stimuli and learns. The key is precise control of stimulation and rigorous capture of outputs. The better the interface, the more predictable and useful the computation.
If that precision keeps improving, I expect more complex tasks and more repeatable results. The trajectory from Pong to Doom suggests that interface quality is a major driver of capability.
Ethics, Safety, And Sourcing
Support for this technology must go hand in hand with strong ethics. Using neurons derived from blood stem cells raises questions of informed consent, sourcing, and oversight. Biosafety protocols must be tight. Disposal and sterilization processes should be auditable.
I also believe transparency matters. Clear statements on sourcing, culture development, and welfare standards will build trust. Regulatory guidance will likely evolve as the field matures. Getting ahead of that with voluntary disclosures and third-party review is smart.
The goal is not to sensationalize. It is to treat wetware as a responsible engineering discipline with the same rigor we expect in biotech and medical labs.
Why This Matters For AI Infrastructure
AI data centers are expensive, power hungry, and increasingly controversial. Communities push back on noise and water use. Utilities struggle to provision new load. Some analysts warn of financial bubbles around AI infrastructure. All of that points to the need for fresh approaches.
Biological data centers will not replace silicon. They can complement it. A cluster of CL1 nodes that learn effectively at a sliver of the power could take on specific workloads and free up GPUs for what they do best. Hybrid architectures are standard in computing. Wetware can be another specialized component in that mix.
If that happens, the benefits are tangible. Lower power draw. Smaller cooling plants. Less water consumption. Reduced local grid stress. And potentially lower total cost of ownership for targeted tasks.
What I Will Watch Next
To gauge progress, I will be looking for:
- Published benchmarks on real tasks that compare wetware to conventional accelerators.
- Reliability metrics like mean time between failures and culture longevity under load.
- Developer tooling that abstracts culture management and provides clean APIs.
- Operational playbooks for running hundreds to thousands of nodes with consistent performance.
- Regulatory and ethical frameworks that are clear, public, and independently reviewed.
On top of that, the buildout in Melbourne and Singapore will be a practical test. Installing around 120 units at one site and up to 1,000 at another will surface every real-world challenge, from contamination control to job scheduling. I am optimistic, but I want to see the data.
Bottom Line
New data centers powered by human brain cells are more than a headline. They represent a serious attempt to rethink computation for an era where power and sustainability are limiting factors. Cortical Labs has moved the concept from clever demo to deployment plan, with partners and site counts to match.
I support pushing this frontier. The environmental upside is too large to ignore, and the potential for adaptive, low-power intelligence is real. There are big questions about performance, reliability, and ethics. Those questions should be answered in public with evidence. If the answers trend positive, wetware could become a vital complement to silicon in the AI stack.
Key Takeaways
- Biological data centers built on living human neurons are moving from lab demos to planned facilities in Melbourne and Singapore.
- CL1 units reportedly draw far less power than GPUs, potentially consuming a small fraction of the energy used by conventional AI processors.
- Progress from Pong to Doom shows improved interfaces and learning capability, though general performance remains unproven.
- Wetware will likely complement, not replace, silicon by targeting adaptive, low-power tasks.
- Success depends on reliability, reproducibility, tooling, and ethics. Transparent benchmarks and standards are essential.

Written by
Tharun P Karun
Full-Stack Engineer & AI Enthusiast. Writing tutorials, reviews, and lessons learned.