Tech
Why the economics of orbital AI are so brutal
In a sense, this whole thing was inevitable. Elon Musk and his coterie have been talking about AI in space for years — mainly in the context of Iain M. Banks’ Culture novels, a science-fiction series set in a far-future universe where sentient spaceships roam and control the galaxy.
Now Musk sees an opportunity to realize a version of this vision. His company SpaceX has requested regulatory permission to build solar-powered orbital data centers, distributed across as many as a million satellites, that could shift as much as 100 GW of compute power off the planet. He has reportedly suggested some of his AI satellites will be built on the moon.
“By far the cheapest place to put AI will be space in 36 months or less,” Musk said last week on a podcast hosted by Stripe co-founder John Collison.
He’s not alone. xAI’s head of compute has reportedly bet his counterpart at Anthropic that 1% of global compute will be in orbit by 2028. Google (which has a significant ownership stake in SpaceX) has announced a space AI effort called Project Suncatcher, which will launch prototype vehicles in 2027. Starcloud, a startup backed by Google and Andreessen Horowitz that has raised $34 million, filed its own plans for an 80,000-satellite constellation last week. Even Jeff Bezos has said this is the future.
But behind the hype, what will it actually take to get data centers into space?
At first pass, today’s terrestrial data centers remain far cheaper than their orbital counterparts. Andrew McCalip, a space engineer, has built a helpful calculator comparing the two models. His baseline results show that a 1 GW orbital data center might cost $42.4 billion — almost 3x its ground-bound equivalent, thanks to the up-front costs of building the satellites and launching them to orbit.
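For a rough sense of how that gap pencils out, here is a minimal sketch of the comparison. Only the $42.4 billion orbital estimate and the "almost 3x" multiple come from the figures above; this is back-of-envelope arithmetic, not a reimplementation of McCalip's calculator.

```python
# Back-of-envelope sketch of the 1 GW capex comparison cited above.
# Only the $42.4B orbital estimate and the ~3x multiple are from the
# calculator's published baseline; nothing here is his actual model.

ORBITAL_CAPEX_USD = 42.4e9  # 1 GW orbital data center estimate
GROUND_MULTIPLE = 3         # orbital is "almost 3x" the terrestrial cost

ground_capex = ORBITAL_CAPEX_USD / GROUND_MULTIPLE
premium = ORBITAL_CAPEX_USD - ground_capex

print(f"Implied terrestrial capex: ${ground_capex / 1e9:.1f}B")
print(f"Orbital premium to close:  ${premium / 1e9:.1f}B")
```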
Changing that equation, experts say, will require technology development across several fields, massive capital expenditure, and a lot of work on the supply chain for space-grade components. It also depends on costs on the ground rising as resources and supply chains are strained by growing demand.
TechCrunch event: Boston, MA | June 23, 2026
Designing and launching the satellites
The key driver for any space business model is how much it costs to get anything up there. Musk’s SpaceX is already pushing down on the cost of getting to orbit, but analysts looking at what it will take to make orbital data centers a reality need even lower prices to close their business case. In other words, while AI data centers may seem to be a story about a new business line ahead of the SpaceX IPO, the plan depends on completing the company’s longest-running unfinished project — Starship.
Consider that the reusable Falcon 9 delivers, today, a cost to orbit of roughly $3,600/kg. Making space data centers doable, per Project Suncatcher’s white paper, will require prices closer to $200/kg, an 18-fold improvement the paper expects to arrive in the 2030s. At that price, the paper argues, even the energy delivered by today’s Starlink satellites would be cost-competitive with a terrestrial data center’s.
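Those two price points imply the following gap. The per-kilogram figures are the ones cited above; the one-ton satellite mass is a hypothetical chosen for illustration.

```python
# Launch-cost gap between today's Falcon 9 and the price Project
# Suncatcher says orbital data centers need. The $/kg figures are the
# ones cited in the text; the satellite mass is hypothetical.

FALCON9_USD_PER_KG = 3_600
TARGET_USD_PER_KG = 200

improvement = FALCON9_USD_PER_KG / TARGET_USD_PER_KG
print(f"Required improvement: {improvement:.0f}x")  # 18x

sat_mass_kg = 1_000  # hypothetical one-ton AI satellite
for price in (FALCON9_USD_PER_KG, TARGET_USD_PER_KG):
    print(f"Launch at ${price}/kg: ${sat_mass_kg * price:,.0f} per satellite")
```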
The expectation is that SpaceX’s next-generation Starship rocket will deliver those improvements — no other vehicle in development promises equivalent savings. However, that vehicle has yet to become operational or even reach orbit; a third iteration of Starship is expected to make its maiden launch sometime in the months ahead.
Even if Starship is completely successful, however, assumptions that it will immediately deliver lower prices to customers may not pass the smell test. Economists at the consultancy Rational Futures make a compelling case that, as with the Falcon 9, SpaceX will not want to charge much less than its best competitor — otherwise the company is leaving money on the table. If Blue Origin’s New Glenn rocket, for example, retails at $70 million, SpaceX won’t take on Starship missions for external customers at much less than that, which would leave it above the numbers publicly assumed by space data center builders.
“There are not enough rockets to launch a million satellites yet, so we’re pretty far from that,” Matt Garman, the CEO of Amazon Web Services, said at a recent event. “If you think about the cost of getting a payload in space today, it’s massive. It is just not economical.”
Still, if launch is the bane of all space businesses, the second challenge is production cost.
“We always take for granted, at this point, that Starship’s cost is going to be hundreds of dollars per kilo,” McCalip told TechCrunch. “People are not taking into account the satellites are almost $1,000 a kilo right now.”
Satellite manufacturing costs are the largest chunk of that price tag, but if high-powered satellites can be made at about half the cost of current Starlink satellites, the numbers start to make sense. SpaceX has made great advances in satellite economics while building Starlink, its record-setting communications network, and the company hopes to achieve more through scale. Part of the reasoning behind a million satellites is undoubtedly the cost savings that come from mass production.
Still, the satellites that will be used for these missions must be large enough to satisfy the complex requirements for operating powerful GPUs, including large solar arrays, thermal management systems, and laser-based communications links to receive and deliver data.
A 2025 white paper from Project Suncatcher offers one way to compare terrestrial and space data centers by the cost of power, the basic input needed to run chips. On the ground, data centers spend roughly $570 to $3,000 for a kW of power over a year, depending on local power costs and the efficiency of their systems. SpaceX’s Starlink satellites get their power from on-board solar panels instead, but the cost of acquiring, launching, and maintaining those spacecraft delivers energy at $14,700 per kW over a year. Put simply, satellites and their components will have to get a lot cheaper before they’re cost-competitive with metered power.
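A figure like $14,700 per kW-year falls out of a simple amortization: spread a spacecraft's total cost over the power it delivers across its lifetime. In the sketch below, the per-satellite cost, power output, and lifetime are illustrative assumptions chosen only to show how a number in that range can arise; they are not disclosed SpaceX or Google figures.

```python
# Amortized cost of satellite power in dollars per kW per year.
# The satellite cost, power output, and lifetime are illustrative
# assumptions, tuned only to show how a ~$14,700/kW-year figure can
# arise; they are not disclosed SpaceX or Google numbers.

def kw_year_cost(total_cost_usd: float, power_kw: float, lifetime_years: float) -> float:
    """Dollars paid per kW of continuous power, per year of service."""
    return total_cost_usd / (power_kw * lifetime_years)

# Example: a $2.5M satellite (built, launched, and maintained)
# delivering 34 kW over a 5-year service life.
cost = kw_year_cost(2.5e6, power_kw=34, lifetime_years=5)
print(f"${cost:,.0f} per kW-year")  # vs. roughly $570-$3,000 on the ground
```

The same function makes the terrestrial advantage legible: halve the hardware cost or double the service life and the per-kW-year figure halves, which is why both manufacturing cost and radiation-limited lifetime matter so much.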
The space environment is not fooling around
Orbital data center proponents often say that thermal management is “free” in space, but that’s an oversimplification. With no atmosphere to carry heat away by convection, waste heat can only be shed by radiating it away, which makes dissipation harder, not easier.
“You’re relying on very large radiators to just be able to dissipate that heat into the blackness of space, and so that’s a lot of surface area and mass that you have to manage,” said Mike Safyan, an executive at Planet Labs, which is building prototype satellites for Google Suncatcher that are expected to launch in 2027. “It is recognized as one of the key challenges, especially long term.”
Besides the vacuum of space, AI satellites will need to deal with cosmic radiation. Cosmic rays degrade chips over time, and they can also cause “bit flip” errors that corrupt data. Chips can be protected with shielding, built from rad-hardened components, or run redundantly with error checking, but all of these options trade expensive mass for reliability. To quantify the risk, Google used a particle beam to test the effects of radiation on its tensor processing units (chips designed explicitly for machine learning applications), and SpaceX executives said on social media that the company has acquired a particle accelerator for just that purpose.
Another challenge comes from the solar panels themselves. The logic of the project is energy arbitrage: Solar panels in space can generate anywhere from 5x to 8x more energy than on Earth, and in the right orbit they can be in sight of the sun for 90% of the day or more, further increasing their output. Electricity is the main input for chips, so more energy means cheaper data centers. But even solar panels are more complicated in space.
Space-rated solar panels built from gallium arsenide and other III-V semiconductors are hardy, but too expensive. Panels made from silicon are cheap and increasingly prevalent in space — Starlink and Amazon Kuiper use them — but they degrade much faster under space radiation. That will limit the lifetime of AI satellites to around five years, which means they will have to generate a return on investment faster.
Still, some analysts think that’s not such a big deal, based on how quickly new generations of chips arrive on the scene. “After five or six years, the dollars per kilowatt-hour doesn’t produce a return, and that’s because they’re not state of the art,” Philip Johnston, the CEO of Starcloud, told TechCrunch.
Danny Field, an executive at Solestial, a startup building space-rated silicon solar panels, says the industry sees orbital data centers as a key driver of growth. He’s speaking with several companies about potential data center projects, and says “any player who is big enough to dream is at least thinking about it.” As a long-time spacecraft design engineer, however, he doesn’t discount the challenges in these models.
“You can always extrapolate physics out to a bigger size,” Field said. “I’m excited to see how some of these companies get to a point where the economics make sense and the business case closes.”
How do space data centers fit in?
One outstanding question about these data centers: What will we do with them? Are they general purpose, or for inference, or for training? Based on existing use cases, they may not be entirely interchangeable with data centers on the ground.
A key challenge for training new models is operating thousands of GPUs together en masse. Most model training is not distributed across sites, but done within individual data centers. The hyperscalers are working to change this in order to increase the power of their models, but it hasn’t been achieved yet. Similarly, training in space would require coherence between GPUs spread across multiple satellites.
The team at Google’s Project Suncatcher notes that the company’s terrestrial data centers connect their TPU networks with throughput in the hundreds of gigabits per second. The fastest off-the-shelf inter-satellite comms links today, which use lasers, can only get up to about 100 Gbps.
That led to an intriguing architecture for Suncatcher: It involves flying 81 satellites in formation so they are close enough to use the kind of transceivers relied on by terrestrial data centers. That, of course, presents its own challenges, chief among them the autonomy required to keep each spacecraft on station, even when maneuvers are needed to avoid orbital debris or other spacecraft.
Still, the Google study offers a caveat: The work of inference can tolerate the orbital radiation environment, but more research is needed to understand the potential impact of bit-flips and other errors on training workloads.
Inference tasks don’t have the same need for thousands of GPUs working in unison. The job can be done with dozens of GPUs, perhaps on a single satellite, an architecture that represents a kind of minimum viable product and the likely starting point for the orbital data center business.
“Training is not the ideal thing to do in space,” Johnston said. “I think almost all inference workloads will be done in space,” imagining everything from customer service voice agents to ChatGPT queries being computed in orbit. He says his company’s first AI satellite is already earning revenue performing inference in orbit.
While details are scarce even in the company’s FCC filing, SpaceX’s orbital data center constellation seems to anticipate about 100 kW of compute power per ton, roughly twice the power of current Starlink satellites. The spacecraft will operate in connection with each other and use the Starlink network to share information; the filing claims that Starlink’s laser links can achieve petabit-level throughput.
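Taking the roughly 100 kW-per-ton figure at face value, launch cost per kW of orbital compute scales directly with launch price. The sketch below uses the two launch prices discussed earlier in the article; it is back-of-envelope arithmetic, not a number from the FCC filing itself.

```python
# Launch cost per kW of compute, assuming ~100 kW per metric ton as
# described above. The two launch prices are the Falcon 9 figure and
# the Suncatcher target discussed earlier; this is a back-of-envelope
# sketch, not a figure from the FCC filing.

KW_PER_KG = 100 / 1_000  # ~100 kW of compute per 1,000 kg

for usd_per_kg in (3_600, 200):
    cost_per_kw = usd_per_kg / KW_PER_KG
    print(f"At ${usd_per_kg}/kg: ${cost_per_kw:,.0f} launch cost per kW of compute")
```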
For SpaceX, the company’s recent acquisition of xAI (which is building its own terrestrial data centers) will let the company stake out positions in both terrestrial and orbital data centers, seeing which supply chain adapts faster.
That’s the benefit of having fungible floating point operations per second — if you can make it work. “A FLOP is a FLOP, it doesn’t matter where it lives,” McCalip said. “[SpaceX] can just scale until [it] hits permitting or capex bottlenecks on the ground, and then fall back to [their] space deployments.”
Got a sensitive tip or confidential documents about SpaceX? For secure communication, you can contact Tim via Signal at tim_fernholz.21.
Blue Origin successfully re-uses a New Glenn rocket for the first time ever
Blue Origin has successfully reused one of its New Glenn rockets for the first time ever, marking a major milestone for the heavy-launch system as Jeff Bezos’ space company looks to compete with Elon Musk’s SpaceX.
But the overall mission’s success may be in question. Roughly two hours after the launch, Blue Origin revealed that the communications satellite that New Glenn carried to space for AST SpaceMobile wound up in an “off-nominal orbit,” meaning something may have gone wrong with the rocket’s upper stage. In other words, it appears the company missed the mark.
“We have confirmed payload separation. AST SpaceMobile has confirmed the satellite has powered on,” the company wrote on X. “We are currently assessing and will update when we have more detailed information.”
AST later said Blue Origin’s rocket placed its satellite into an orbit that was “lower than planned,” so the satellite will have to be de-orbited.
According to a timeline provided by Blue Origin prior to the launch, the upper stage of New Glenn should have performed a second burn roughly one hour after the rocket lifted off from Cape Canaveral, Florida. It’s unclear if that second burn ever happened, or if there were other problems with it, before the AST satellite was deployed.
The company accomplished the re-use feat Sunday on just the third-ever launch of New Glenn, and a little more than one year after the first flight of the new rocket system, which has been in development for more than a decade.
Making New Glenn reusable is crucial to its economics. SpaceX’s ability to re-fly Falcon 9 rocket boosters is one of the main reasons why it has come to dominate the global orbital launch market.
TechCrunch event: San Francisco, CA | October 13-15, 2026
While Blue Origin has already sent a commercial payload to space with New Glenn — Sunday was the second such mission — the company wants to use the rocket for NASA moon missions, and to help both it and Amazon build space-based satellite networks. Blue Origin is currently readying its first robotic moon lander for an attempted launch later this year.
The booster that Blue Origin re-flew on Sunday was the same one the company used in the second New Glenn mission in November. During that mission, the New Glenn booster helped put two robotic NASA spacecraft into space for a mission to Mars, before returning to a drone ship in the ocean. On Sunday, Blue Origin recovered the rocket booster a second time on a drone ship roughly 10 minutes after takeoff.
Any trouble deploying AST’s satellite could present a risk to Blue Origin’s near-term plans for New Glenn. Blue Origin has a deal with the communications company to send multiple satellites to orbit over the next few years as it works to build out its own space-based cellular broadband network.
This story has been updated with new information from Blue Origin and AST SpaceMobile.
Cracks are starting to form on fusion energy’s funding boom
It happens in every emerging industry: founders and investors push toward a common goal, until the money starts to roll in and that shared vision begins to diverge.
Cracks are emerging in the fusion power world, which I saw firsthand at The Economist’s Fusion Fest in London last week. It didn’t dampen the overall buoyant mood, lifted by fusion startups’ fundraising haul of $1.6 billion in the last 12 months. But people had differing opinions on two key questions: When should fusion startups go public? And are side businesses a distraction?
Going public was top of mind for everyone. In the last four months, TAE Technologies and General Fusion have announced plans to merge with publicly traded companies. Both stand to receive hundreds of millions of dollars to keep their R&D efforts alive, and investors, some of whom have kept the faith for 20 years, finally see an opportunity to cash out.
Not everyone agrees. Most of the people I spoke to worried these companies were going public far too early, before achieving key milestones that many view as vital in judging a fusion company’s progress.
First, a recap: TAE announced its merger with Trump Media & Technology Group in December. Though the deal isn’t yet completed, the fusion side of the business has already received $200 million of a potential $300 million in cash from the deal, giving it some runway to continue planning its power plant. (The remainder will reportedly land in its bank account once it files the S-4 form with the U.S. Securities and Exchange Commission.)
General Fusion said in January that it would go public via a reverse merger with a special purpose acquisition company. The deal could net the company $335 million and value the combined entity at $1 billion.
Both companies could use the cash.
Before the merger announcement, General Fusion was struggling to raise funds, and around this time last year it laid off 25% of its staff as CEO Greg Twinney posted a public letter pleading for investment. It received a brief reprieve in August when investors threw it a $22 million lifeline, but that sort of money doesn’t last long in the fusion world, where equipment, experiments, and employees don’t come cheap.
TAE’s position wasn’t quite as dire, but it still required some funds. Pre-merger, the company raised nearly $2 billion, which sounds like a lot, but keep in mind the company is nearly 30 years old. What’s more, its valuation pre-merger was $2 billion, according to PitchBook. Investors were breaking even at best.
Neither company has hit scientific breakeven, a key milestone showing a reactor design has power plant potential. Many observers doubt they’ll hit that mark before other privately held startups do. One executive told me that, in their shoes, they wouldn’t know how to fill the time on quarterly earnings calls if the companies didn’t hit scientific breakeven soon.
If TAE or General Fusion doesn’t deliver results, several people feared the public markets would sour on the entire fusion industry.
Now, not all may be lost. TAE has already started marketing other products, including power electronics and radiation therapy for cancer. That could give the company some near-term revenue to placate shareholders. General Fusion, though, hasn’t revealed any such plans.
And therein lies another divide: fusion companies remain split on whether they should pursue revenue now or wait until they have a working power plant.
Some companies are embracing the opportunity to make money along the way. Not a bad strategy! Fusion is a long game, so why not improve your odds? Both Commonwealth Fusion Systems and Tokamak Energy have said they’ll be selling magnets. TAE and Shine Technologies are both in nuclear medicine.
Other startups worry that side hustles could become a distraction. Inertia Enterprises, for example, told me that it’s laser-focused on its power plant. That jibes with what another investor told me months ago: They worried that fusion startups could get distracted by profitable but tangential businesses and lose their lead.
There wasn’t consensus on the right time to go public either. I heard a few proposed milestones. Some believe startups should first reach that scientific breakeven milestone, in which a fusion reaction generates more energy than it needs to ignite. No startup has achieved that yet. The other possibilities are facility breakeven — when the reactor makes more energy than the entire site needs to operate — and commercial viability — when a reactor makes enough electricity to sell a meaningful amount to the grid.
We may have an answer to that question sooner rather than later. Commonwealth Fusion Systems expects to hit scientific breakeven sometime next year, and some think the company might use that as an opportunity to go public.
TechCrunch Mobility: Uber enters its assetmaxxing era
Welcome back to TechCrunch Mobility, your hub for the future of transportation and now, more than ever, how AI is playing a part. To get this in your inbox, sign up here for free — just click TechCrunch Mobility!
A few weeks ago, I wrote about how Uber seemed to be everywhere, all at once, in the emerging autonomous vehicle technology sector. The Financial Times has now put a number on it. The FT calculated that Uber has committed more than $10 billion to buying autonomous vehicles and taking equity stakes in the companies developing the tech, according to public records and discussions with folks behind the scenes. About $2.5 billion of that is in direct investments, with the remaining $7.5 billion to be spent on buying robotaxis over the next few years, the outlet reported.
We’ve reported on Uber’s numerous investments and deals with autonomous vehicle companies across drones, robotaxis, and freight. Some of its investments include WeRide, Lucid and Nuro, Rivian, and Wayve.
This rather large number (and particularly that $7.5 billion) got me thinking about another transformative era in Uber’s history and how it has visited these asset-heavy shores before. Uber might have started with a plan to be asset light, but for a brief period it did quite the opposite.
Uber went on a moonshot spree between 2015 and 2018. It launched electric air taxi developer Uber Elevate and the in-house autonomous vehicle unit Uber ATG, which would be boosted by its acquisition of Otto in 2016. It also snapped up micromobility startup Jump in 2018.
And then in 2020, Uber pulled the asset-heavy rip cord, ostensibly leaving all of those moonshots behind. Uber sold Uber ATG to Aurora, Jump to Lime, and Elevate to Joby Aviation. But it didn’t completely divest; it kept equity stakes in all of them.
Uber is now entering into a new and different asset-heavy era. It’s not plunking down millions, or even billions, to develop the technology in-house, although I’m sure folks there would be quick to pipe up that there is always R&D happening over at Uber. Instead, it appears to be focused on owning (or perhaps leasing) the physical assets.
That could mean interesting line items on Uber’s balance sheet in the future.
Owning fleets of robotaxis built by other companies might not have been the original vision of Uber, or its former CEO Travis Kalanick, who has said the company made a mistake when it abandoned its AV development program. But this new approach could still get it to the same end point.
A little bird

Earlier this month, I interviewed Eclipse partner Jiten Behl about the venture firm’s new $1.3 billion fund and where that money might be headed. The firm, as I wrote, intends to incubate more startups (e.g., it was behind the Rivian spinout Also). Behl wouldn’t give me details, only stating, “We’re definitely working on a couple of really cool ideas.” He also said Eclipse is particularly interested in startups that work across enterprises.
Thanks to one little bird and some document diving by senior reporter Sean O’Kane, it looks like a seed round announcement is imminent for a San Francisco-based startup working on an autonomous hauler that I’ve been told doesn’t have a driver cab. This sounds similar to what Einride has built, but since we haven’t seen it, we’ll have to wait.
The company’s roster isn’t big, but it is chock-full of Silicon Valley tech elite, including a founder who was at Uber ATG, Pronto, and Waabi. Stay tuned for more.
Got a tip for us? Email Kirsten Korosec at kirsten.korosec@techcrunch.com or my Signal at kkorosec.07, or email Sean O’Kane at sean.okane@techcrunch.com.
Deals!

Slate is back with more capital as it prepares to put its first affordable pickup trucks into production by the end of 2026.
The electric vehicle startup, which got its start with backing from Jeff Bezos, raised another $650 million in a Series C funding round led by TWG Global. Keep your eye on TWG. This is the firm run by Guggenheim Partners chief executive (and Los Angeles Dodgers owner) Mark Walter and investor Thomas Tull.
Slate has raised about $1.4 billion to date, and its previous investors include General Catalyst, Jeff Bezos’ family office, VC firm Slauson & Co., and former Amazon executive Diego Piacentini, as TechCrunch first reported last year.
Other deals that got my attention …
Glydways, a San Francisco-based startup developing personal autonomous pods designed to operate on dedicated 2-meter-wide lanes in cities, raised $170 million in a Series C funding round co-led by Suzuki Motor Corporation, ACS Group, and Khosla Ventures. Existing investors Mitsui Chemicals and Gates Frontier and new investor Obayashi Corporation also participated. But wait, there’s more.
GM and Ford are reportedly talking to the Pentagon about whether the auto industry can help the military revamp its procurement program and find cheaper, faster ways to buy vehicles, munitions, or other hardware, the New York Times reported, citing anonymous sources.
Loop, a San Francisco-based startup, raised $95 million in a Series C funding round led by Valor Equity Partners and the Valor Atreides AI Fund, and includes investments from 8VC, Founders Fund, Index Ventures, and J.P. Morgan’s late-stage fund, Growth Equity Partners.
Monarch Tractor, the startup developing electric, autonomous tractors, has moved on to (ahem) a different pasture. After struggling to pivot to a software services business, the startup sold its assets to Caterpillar.
Uber is increasing its stake in Delivery Hero by 4.5%, the Financial Times reported. Uber agreed to buy about 270 million euros in shares from Prosus, the Dutch investment group and Delivery Hero’s largest shareholder.
Notable reads and other tidbits

Doug Field, the high-profile executive who shaped Ford’s electric vehicle and technology strategies over the past five years, is leaving. Notably, Ford is shaking up the organization as well, creating a “product creation and industrialization” team to be led by COO Kumar Galhotra. Any guesses where Field is headed next? Perhaps he’ll return to Silicon Valley.
Lightship, the all-electric RV startup, is expanding its Colorado-based factory by another 44,000 square feet, which will allow it to quadruple its manufacturing capacity.
Rivian and battery recycling and materials startup Redwood Materials partnered years ago. We’re now seeing the fruits of that relationship. Redwood is installing battery energy storage at Rivian’s factory in Illinois. The catch? Redwood is using 100 second-life Rivian battery packs, which will provide 10 megawatt-hours (MWh) of dispatchable energy to reduce cost and grid load during peak demand periods.
Tesla created a new self-driving app that makes it easier for owners to subscribe to its Full Self-Driving software and see statistics on how — and how often — they use it. This may not be huge news, but it did catch my eye because of the gamified qualities of these new stats.
Waymo, as per usual, has a few news items this week. The Alphabet-owned company started testing its autonomous vehicles on public roads in London. It also removed its waitlist in Miami and Orlando to scale its robotaxi services in the two cities.
One more thing …
This newsletter isn’t my only project that is leaning more heavily into robotics. My podcast, the Autonocast, is too, as the worlds of autonomous vehicles, AI, and robotics mash together. Check out this interview with Foxglove founder Adrian MacNeil, who previously worked at Cruise.
