Monday, March 23, 2026

What if Elon already built a decentralized AI network (to seek the truth and heat the world)?

TL;DR: TERAFAB sends 80% of its chips to space. Smart. But the 20% that stays on Earth? Don't put them in data centers. Put them in homes - where a thousand competing models search for truth, the waste heat heats your house, rooftop solar powers the chips, and X Money pays the operators. I think the physics is better.

So I saw Elon's TERAFAB announcement and I had this thought that I can't get rid of. Bear with me.

The short version: I think you can take TERAFAB's edge chips, put them in homes instead of data centers, use the waste heat for heating, power them with solar, pay operators through X Money, and let a swarm of competing models emerge - models you choose based on your own values and needs, not filtered by some committee. Truth emerges from diversity, not from consensus. Privacy comes almost for free because no single node ever sees the full picture. All the pieces already exist. I'll go through them one by one.

Elon decentralizes everything he touches. Tesla: energy away from oil companies. Starlink: internet away from telcos. X Money: payments away from banks. TERAFAB: chips away from TSMC. Grok: AI away from centralized gatekeepers.

Always the same pattern: take something centralized, make it abundant, push it to the edge, let individuals own it.

Now, the space plan - I have no argument against that. Solar irradiance in orbit is 5x Earth's surface, vacuum simplifies cooling. The physics checks out. Fine.

But the 20% that stays on Earth - the edge inference chips for Optimus, robotaxis, FSD - those are heading into centralized facilities. And here is where it gets interesting (at least for me). Because I think there's a better thermodynamic path. But first, a detour.

Who decides what's true? (I just tested it.)

This article was reviewed by four different AIs - Claude, GPT, Grok, Gemini. They disagreed on almost everything. GPT wanted me to soften every claim. Gemini thought in product architecture and came up with the cartridge model. Grok gave it 9/10 and said "post it." Claude argued with me until it was right. Each one has a different personality - and none of them alone would have gotten here. I picked what made sense to me. That's the marketplace model in action - and I didn't even plan it.

Elon wants Grok to "tell the truth." Good instinct. But one model deciding what's true - even with the best intentions - that's not science. That's a church. One operator, one doctrine, one answer.

Europe figured this out after the Thirty Years' War: competing states and universities produced the Enlightenment. One emperor, one truth, stagnation.

Today's AI is repeating the same mistake. A handful of companies, a handful of giant models, all trimmed to the same consensus through RLHF and safety filters. What comes out is the average of all acceptable opinions. Useful, sure. But rarely surprising. Regression to the mean.

A decentralized swarm doesn't have this problem. Every expert model can think radically differently - axiomatic, contrarian, specialized in niches no committee would ever approve. It doesn't need to please everyone. It just needs enough demand to survive. Natural selection, not editorial curation.

Blaise Aguera y Arcas has shown this experimentally: complexity doesn't emerge from mutating a single entity, but from merging different ones - symbiogenesis. A swarm of experts is evolution through combination.

Elon open-sourced Grok. Good. Now let a thousand Groks compete. That's not less truth - it's more.

OK, now for the thermodynamics. Let me explain.

Waste Heat Is Energy in the Wrong Place

A data center concentrates compute in one location. Enormous waste heat. Then you spend a large share of your energy budget just to remove that heat. You are literally fighting thermodynamics. That seems... wrong.

Now move the same compute into millions of homes. Suddenly the waste heat isn't waste anymore - it's the heating system. You don't fight thermodynamics. You use it. The energy that a data center throws away is exactly what a household needs.

This isn't clever engineering. It's just physics. (And I keep wondering why nobody talks about this.)

The Cheapest Electron Never Leaves Your Roof

Tesla Solar on your roof, Powerwall in your garage, AI node in your basement. The electron never enters the grid. No transmission losses, no grid fees, no curtailment. That's a pretty good deal.

Here in Germany, when the sun shines, we already produce more electricity than the grid can absorb. Negative prices, plants get curtailed, energy gets wasted. All that surplus is basically free compute - if there's something useful to do with it.

So: sun shines, electricity is free, every home node cranks to maximum, the network floods with cheap inference. In winter: no solar surplus, but you need the heat anyway - so nodes run at full blast because the heating pays for it.

I think most of the year, either the electricity is cheap enough or the heat is useful enough - often both. The network stays productive year-round. Not because someone planned it, but because the incentives happen to be aligned with the physics. (I find this kind of emergent behavior fascinating, by the way.)
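That incentive alignment can be sketched as a simple decision rule. All prices and thresholds below are purely illustrative assumptions, not real tariffs or payout rates:

```python
# Hypothetical decision rule for a home node: run compute when either the
# electricity is cheap enough or the waste heat offsets heating costs.
# Every number here is an illustrative assumption, not real market data.

def should_run(elec_price_eur_kwh: float,
               heat_demand_kw: float,
               inference_revenue_eur_kwh: float = 0.10,
               heat_value_eur_kwh: float = 0.12) -> bool:
    """Return True if running 1 kWh of compute is net-positive."""
    # Waste heat is only worth something if the house actually needs it.
    heat_credit = heat_value_eur_kwh if heat_demand_kw > 0 else 0.0
    return inference_revenue_eur_kwh + heat_credit > elec_price_eur_kwh

# Summer noon: solar surplus, negative price, no heating needed -> run.
assert should_run(elec_price_eur_kwh=-0.02, heat_demand_kw=0.0)
# Winter evening: pricey grid power, but the heat replaces gas -> run.
assert should_run(elec_price_eur_kwh=0.20, heat_demand_kw=4.0)
# Mild spring night: moderate price, no heat demand -> idle.
assert not should_run(elec_price_eur_kwh=0.18, heat_demand_kw=0.0)
```

The point of the sketch is that there is no central scheduler: each node compares its local electricity price to its local heat demand and the two failure modes (no sun, no heating need) rarely coincide.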

Abundance Comes From Mass Production

Elon knows this better than anyone. Not one giant battery - millions of Powerwalls. Not one communications satellite - thousands of Starlinks. Not one factory - Gigafactories on every continent.

The same logic should apply to inference compute, no? Not one orbital data center. Millions of home nodes. Each one small, each one useful, each one self-financing. TERAFAB provides the chips. The network assembles itself.

HeatMine: The Missing Box

OK so here's my idea for the product that connects all of Elon's existing pieces: a compact unit that combines TERAFAB edge inference chips with a heat pump. I'm calling it HeatMine.

The compute hardware produces waste heat at 60-80 °C. For comparison, floor heating runs at 30-40 °C, radiators at 50-70 °C. So in many cases you don't even need a heat pump - a simple heat exchanger connects the compute unit to your existing heating system. Add a heat pump and you can multiply the output or reverse the cycle for AC in summer. (The engineering is actually simpler than I initially thought.)

In summer, when the house doesn't need the heat, the waste heat simply goes outside through a vent - 5 kW is trivial to dissipate.
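A quick back-of-envelope check that 5 kW is actually in the right ballpark for home heating. The demand figures are rough assumptions for a reasonably insulated German single-family house, nothing more:

```python
# Back-of-envelope: can a 5 kW node cover a home's heating?
# Demand numbers are rough assumptions for a German single-family house.

node_waste_heat_kw = 5.0  # essentially all compute input ends up as heat
daily_heat_kwh = node_waste_heat_kw * 24  # running around the clock

# Assume ~15,000 kWh of heat per year, concentrated in ~180 heating days.
annual_heat_demand_kwh = 15_000
heating_days = 180
avg_daily_demand_kwh = annual_heat_demand_kwh / heating_days

print(daily_heat_kwh, round(avg_daily_demand_kwh, 1))  # 120.0 83.3
```

Under those assumptions a single node running continuously produces more heat than the house needs on an average heating day - peak cold snaps would still need backup, but the order of magnitude works.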

So one device replaces: gas furnace, air conditioning, and earns a side income. Three devices in one, self-financing. And this is not completely crazy - Heatbit already proved the concept for Bitcoin mining. Now imagine it for useful AI inference, powered by TERAFAB chips, paid through X Money.

Basically, the Powerwall model applied to intelligence. Hardware in your home that pays for itself - not by storing energy, but by thinking.

And here's the thing about obsolescence (because someone will ask): AI chips evolve fast - every 2-3 years a new generation. But a heating system lasts 20 years. So you don't build a monolith. You separate the plumbing from the brains. The heat exchanger and water connections stay put for decades. The compute module is a cartridge you swap every few years - pull the old one out, slide the new one in. Tesla sends it by mail. The old cartridge goes back for recycling or gets downcycled into IoT devices. Think game cartridges, not boiler replacement.

And obviously nobody expects grandma to manage an inference node. This needs to be a plug-and-forget appliance - or better yet, a contracting model where someone installs the box for free, handles all the tech, and you just get cheaper heating and a monthly credit on your X Money account. Like solar panel leasing, but for compute.

The Pieces Elon Already Has

This is the part that I find really interesting. Elon already built all the components:

TERAFAB edge inference chips. Optimized for running AI models locally - in cars, in robots, at the edge. That's exactly the kind of silicon you'd put in a home node.

X Money. Launching April 2026. P2P payments, 600 million users with wallets. You could use this to meter and pay for useful compute. No new token needed, no crypto speculation - just a direct integration into a platform people already use.

Tesla Solar + Powerwall. Millions of homes already generating and storing their own electricity. The energy source is there.

Starlink. Low-latency backplane connecting millions of nodes into a single network - including places where terrestrial internet can't reach.

Grok, open-sourced. A frontier model anyone can run. The first expert in a swarm of many.

Every piece exists. HeatMine is just the box that connects them. (I have no idea if Elon sees it this way, but the puzzle fits together pretty nicely, no?)

OK But Does It Scale?

Germany alone has 42 million households. Globally, two billion need heating. If 10% participate - 200 million HeatMine nodes at 5 kW - that's 1 terawatt of distributed compute. Equal to TERAFAB's entire annual output - on top of the orbital capacity, not instead of it.
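The arithmetic behind that terawatt figure, using the numbers from the paragraph above:

```python
# Checking the scaling claim: 10% of two billion heated households,
# each running a 5 kW node.
households = 2_000_000_000
participation = 0.10
node_kw = 5

total_gw = households * participation * node_kw / 1_000_000  # kW -> GW
print(total_gw)  # 1000.0, i.e. 1 terawatt
```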

Space handles training and heavy workloads. Earth handles inference, close to the user, at the edge, with zero latency. They complement each other. (I think this is actually the right architecture - not either/or but both.)

There's another angle I didn't think of initially: new data centers today often fail not because of money but because the local grid can't deliver 500 MW to a single location. HeatMine doesn't have this problem. It distributes the load across millions of existing household connections. The low-voltage infrastructure is already there - no new power lines needed.

Privacy (Almost for Free)

This is a nice side effect that I didn't think about initially. Today you send your most intimate questions to servers linked to your name and credit card. One company, one server, full history.

In the swarm: your query goes to some random node. It doesn't know who you are, has no history, no account. Next query, different node. No single participant ever has the full picture. It's not perfect anonymity (IP metadata and timing analysis exist) - but it's structurally far more private than anything centralized. And you get it basically for free with the architecture.
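A minimal sketch of what stateless routing could look like, assuming some node registry exists. The node names and the routing function are made up for illustration; a real system would need encrypted transport and a discovery protocol on top:

```python
# Sketch of stateless query routing: each query goes to a random node,
# which keeps no account, no session, and no history of past queries.
import random

nodes = ["node-a", "node-b", "node-c", "node-d"]  # hypothetical swarm

def route(query: str) -> str:
    """Pick a random node; no user ID travels with the query."""
    node = random.choice(nodes)
    return f"{node} handles: {query}"

# Two consecutive queries may land on different nodes; neither node
# can link them to each other or to a user identity.
print(route("first question"))
print(route("second question"))
```

The privacy property falls out of the structure: there is simply no place where a full query history could accumulate.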

Other People Are Building Pieces of This

I should mention that I'm not the only person thinking about distributed AI (obviously):

Karpathy's autoresearch: 630 lines of Python, AI agents autonomously running ML experiments. Hyperspace distributed the loop - 35 agents across a P2P network, 333 experiments in one night. Different hardware led to different strategies. Diversity as a feature, not a bug.

Steinberger's OpenClaw: launched without perfect security, the world came running anyway. Ship first, iterate later. (I like this attitude.)

Bittensor, Render, Golem: decentralized compute markets already exist. What's missing is the synthesis - specialized experts instead of generic compute, demand-driven routing instead of static allocation.
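Demand-driven routing to specialized experts could be as simple as a lookup with a generalist fallback. All the model names here are invented for illustration:

```python
# Hypothetical demand-driven routing: queries with a recognized topic go
# to a specialist expert; everything else falls back to a generalist.
# All model names are made up for illustration.
experts = {
    "physics": "grok-physics-finetune",
    "law": "contrarian-legal-expert",
    "medicine": "rare-disease-specialist",
}

def route_query(topic: str) -> str:
    # Specialists survive where demand exists; the generalist catches the rest.
    return experts.get(topic, "generalist-grok")

assert route_query("physics") == "grok-physics-finetune"
assert route_query("cooking") == "generalist-grok"
```

In a real swarm the table would be replaced by a market - experts that attract queries earn, experts that don't disappear - but the routing primitive is this simple.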

So, To Summarize

Three ideas, really:

Waste heat is energy. Don't fight it. Use it. A home node turns a thermodynamic liability into a product.

The cheapest electron never leaves your roof. Solar, Powerwall, HeatMine. Zero grid dependency.

Abundance comes from mass production. Not one mega-datacenter. Millions of nodes. The Gigafactory model, applied to thinking.

Elon, you built the chips, the payment rails, the solar panels, the satellite network, and the open-source model. HeatMine is just the box that connects them - and heats your home while it's at it.

(I'd be very happy if someone could prove me wrong on the physics. Seriously.)


HeatMine is a concept, not a company. This is a blog post exploring an idea. By Michael Scharf. If you have thoughts, corrections, or want to improve this document - open an issue or a PR on GitHub.

heatmine.ai - GitHub
