Every chip in your stack is fighting physics

Every digital chip does the same expensive trick: it keeps electrons in order while the environment pushes them toward disorder. As transistors shrink and AI workloads grow, the cost of keeping bits stable keeps rising.

Thermodynamic computing starts from a different premise. Instead of suppressing thermal noise, it uses it. Instead of treating randomness as a defect, it treats randomness as a computational resource.

How it works, in engineer terms

A traditional CPU computes by forcing a system into a precise state. Logic gates, clock edges, memory cells: all of it exists to make the machine deterministic.

A thermodynamic computer does controlled sampling. Its hardware naturally evolves through a probability distribution, and the computation is encoded in that distribution. If your workload is probabilistic, sampling is the job, not a side effect.
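The idea can be sketched with a toy model of two coupled stochastic bits (the parameters `J` and `T` here are illustrative, not any vendor's design): noisy local updates make the pair wander through a Boltzmann distribution, and the answer is read out as a statistic of the trajectory rather than a final deterministic state.

```python
import numpy as np

# Toy sketch: two coupled stochastic "p-bits" whose noisy updates sample
# the Boltzmann distribution P(s) ∝ exp(-E(s)/T) with E(s) = -J * s0 * s1.
# The computation "how correlated are the bits?" is answered by averaging
# over the trajectory, not by reading a single deterministic state.
rng = np.random.default_rng(0)
J, T = 1.0, 1.0
s = np.array([1, 1])
corr, n = 0.0, 200_000
for _ in range(n):
    i = rng.integers(2)                        # pick one bit at random
    field = J * s[1 - i]                       # local field from its neighbor
    p_up = 1.0 / (1.0 + np.exp(-2 * field / T))
    s[i] = 1 if rng.random() < p_up else -1    # noisy threshold update
    corr += s[0] * s[1]
print(corr / n)   # approaches the exact answer tanh(J/T) ≈ 0.76
```

The noise is doing the work: without it the pair would freeze in one state and the average would tell you nothing about the distribution.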

A 2025 Nature Communications paper demonstrated this with an 8-cell stochastic processing unit on a PCB. They ran Gaussian sampling, matrix inversion, uncertainty quantification, and Gaussian process regression. The paper argues future thermodynamic computers may outperform digital systems in speed or energy efficiency once the hardware scales.
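The matrix-inversion trick in that paper is worth seeing concretely. A minimal NumPy sketch (the hardware does this with physical noise; here it is simulated): run coupled Langevin dynamics whose stationary distribution is the Gaussian with precision matrix A, then read A⁻¹ off the sample covariance.

```python
import numpy as np

# Gaussian-sampling matrix inversion, simulated digitally. The dynamics
# dx = -A x dt + sqrt(2 dt) * noise has stationary distribution N(0, inv(A)),
# so the empirical covariance of the trajectory estimates inv(A).
rng = np.random.default_rng(1)
A = np.array([[2.0, 0.5],
              [0.5, 1.0]])              # symmetric positive definite
dt, x = 0.01, np.zeros(2)
samples = []
for step in range(300_000):
    x += -A @ x * dt + np.sqrt(2 * dt) * rng.standard_normal(2)
    if step > 20_000:                   # discard burn-in
        samples.append(x.copy())
cov = np.cov(np.array(samples), rowvar=False)
print(cov)   # estimates inv(A) ≈ [[0.571, -0.286], [-0.286, 1.143]]
```

On a digital machine this loop is a slow way to invert a 2x2 matrix; the point is that a physical system relaxing to equilibrium performs the same sampling for free, in parallel, driven by its own thermal noise.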

The physics in one table

| Concept | What it means |
| --- | --- |
| Landauer's Principle | Erasing information has a minimum cost: k_B T ln 2 per bit. Every logically irreversible operation costs energy |
| Classical computing | Spends energy to keep state stable |
| Quantum computing | Spends engineering effort to suppress noise and preserve coherence |
| Thermodynamic computing | Builds computation directly out of stochastic dynamics. Noise is the medium, not the enemy |
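Landauer's bound is small enough to be startling when you actually compute it:

```python
import math

# Minimum energy to erase one bit at room temperature (Landauer's bound).
k_B = 1.380649e-23      # Boltzmann constant, J/K (exact SI value)
T = 300.0               # kelvin
e_bit = k_B * T * math.log(2)
print(e_bit)            # ≈ 2.87e-21 joules per erased bit

# Rough comparison: modern digital logic spends on the order of picojoules
# per operation (order-of-magnitude assumption, not a measured figure).
typical_op = 1e-12
print(typical_op / e_bit)   # roughly 1e8x above the thermodynamic floor
```

That eight-orders-of-magnitude gap between practice and the physical limit is the headroom these architectures are chasing.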

Berkeley Lab (2026) puts it bluntly: thermodynamic computing is “noise-powered.” Training is expensive (their work required 96 GPUs on Perlmutter), but inference on the trained physical hardware can be very low energy.

The startups

| Company | Founded | Funding | What they build | Source |
| --- | --- | --- | --- | --- |
| Extropic | ~2023 | Undisclosed | Thermodynamic sampling unit (TSU). Uses thermal fluctuations for probabilistic AI workloads. Co-founded by Guillaume Verdon (ex-Google Quantum AI) | WIRED, extropic.ai |
| Normal Computing | 2022 | $35M seed + $50M (Samsung Catalyst) | CN101 chip (taped out Aug 2025), targets multimodal diffusion GenAI inference. Founded by Google Brain/X engineers | Fortune, normalcomputing.com |

Normal Computing’s CN101 tape-out moves thermodynamic computing from whiteboard theory onto a silicon schedule.

Computing paradigms compared

| Paradigm | Core resource | Best fit | Main constraint | Status |
| --- | --- | --- | --- | --- |
| Classical digital | Deterministic state, clocked logic | General-purpose, mature stacks | Energy, data movement, heat | Dominant, increasingly power-bound |
| Quantum | Coherent quantum states | Chemistry, optimization, cryptography | Decoherence, cryogenics, error correction | Narrow, operationally hard |
| Neuromorphic | Event-driven, brain-inspired | Edge inference, sensory, sparse workloads | Programming model, tooling | Useful in niches |
| Thermodynamic | Controlled stochastic dynamics | Probabilistic inference, generative AI, uncertainty | Equilibration time, scalability, training | Young, real, far from broad deployment |

What changes if inference gets 10-100x cheaper

| Effect | Why it matters |
| --- | --- |
| More models move from batch to always-on | Continuous evaluation, richer retrieval, more frequent re-ranking become economical |
| Uncertainty becomes practical | If hardware natively samples distributions, probabilistic outputs become cheap |
| Edge and local deployments improve | Lower power means less cooling, smaller hardware, more privacy |
| Model architectures may shift | Bayesian methods and generative sampling pipelines become viable again |
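The uncertainty point deserves one concrete line of code. If an accelerator hands back samples from the model's predictive distribution (a hypothetical interface, simulated here with NumPy noise), calibrated error bars become a cheap reduction instead of an expensive ensemble of extra forward passes:

```python
import numpy as np

# Stand-in for hardware-native samples from a predictive distribution.
rng = np.random.default_rng(2)
samples = 1.5 + 0.3 * rng.standard_normal(10_000)

# Uncertainty is now a reduction over samples, not extra model evaluations.
mean = samples.mean()
lo, hi = np.percentile(samples, [2.5, 97.5])   # 95% predictive interval
print(mean, lo, hi)
```

Today the samples are the expensive part; the bet is that dedicated stochastic hardware makes them nearly free.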

Realistic timeline

| Period | What to expect |
| --- | --- |
| 2025-2027 | Prototypes, developer demos, narrow pilots |
| 2027-2029 | First commercial niche deployments if silicon works and tooling improves |
| 2030+ | Broader use in probabilistic inference if economics prove out |

The Nature Communications prototype is 8 cells on a PCB. The paper acknowledges scalability limitations, notably its reliance on inductors and transformers, components that are difficult to miniaturize on chip. Startup roadmaps are more aggressive, but roadmaps are not silicon.

Honest take

Thermodynamic computing is not the next GPU. It is its own attempt to answer: what if thermal noise is not a flaw but part of the compute fabric?

That question matters because AI is increasingly a power and cost problem, not just an algorithm problem. Too early to invest. Too important to ignore.