A peek inside Physical Intelligence, the startup building Silicon Valley’s buzziest robot brains

From the street, the only indication that I’ve found Physical Intelligence’s headquarters in San Francisco is a pi symbol that’s a slightly different color from the rest of the door. When I walk in, I’m immediately confronted with activity. There’s no reception desk, no gleaming logo in fluorescent lights.

Inside, the space is a giant concrete box made slightly less austere by a haphazard sprawl of long blonde-wood tables. Some are clearly meant for lunch, dotted with Girl Scout cookie boxes, jars of Vegemite (someone here is Australian), and small wire baskets stuffed with one too many condiments. The rest of the tables tell a different story entirely. Many more of them are laden with monitors, spare robotics parts, tangles of black wire, and fully assembled robotic arms in various states of attempting to master the mundane.

During my visit, one arm is folding a pair of black pants, or trying to. It’s not going well. Another is attempting to turn a shirt inside out with the kind of determination that suggests it will eventually succeed, just not today. A third — this one seems to have found its calling — is quickly peeling a zucchini, after which it is supposed to deposit the shavings into a separate container. The shavings are going well, at least.

“Think of it like ChatGPT, but for robots,” Sergey Levine tells me, gesturing toward the motorized ballet unfolding across the room. Levine, an associate professor at UC Berkeley and one of Physical Intelligence’s co-founders, has the amiable, bespectacled demeanor of someone who has spent considerable time explaining complex concepts to people who don’t immediately grasp them. 

Image Credits: Connie Loizos for TechCrunch

What I’m watching, he explains, is the testing phase of a continuous loop: data gets collected on robot stations here and at other locations — warehouses, homes, wherever the team can set up shop — and that data trains general-purpose robotic foundation models. When researchers train a new model, it comes back to stations like these for evaluation. The pants-folder is someone’s experiment. So is the shirt-turner. The zucchini-peeler might be testing whether the model can generalize across different vegetables, learning the fundamental motions of peeling well enough to handle an apple or a potato it’s never encountered.

The company also operates test kitchens, in this building and elsewhere, using off-the-shelf hardware to expose the robots to different environments and challenges. There’s a sophisticated espresso machine nearby, and I assume it’s for the staff until Levine clarifies that no, it’s there for the robots to learn on. Any foamed lattes are data, not a perk for the dozens of engineers on the scene, most of whom are peering into their computers or hovering over their mechanized experiments.

The hardware itself is deliberately unglamorous. These arms sell for about $3,500, and that’s with what Levine describes as “an enormous markup” from the vendor. If they manufactured them in-house, the material cost would drop below $1,000. A few years ago, he says, a roboticist would have been shocked these things could do anything at all. But that’s the point — good intelligence compensates for bad hardware.

As Levine excuses himself, I’m approached by Lachy Groom, moving through the space with the purposefulness of someone who has half a dozen things happening at once. At 31, Groom still has the fresh-faced quality of Silicon Valley’s boy wonder, a designation he earned early, having sold his first company nine months after starting it at age 13 in his native Australia (this explains the Vegemite).

When I first approached him earlier, as he welcomed a small gaggle of sweatshirt-wearing visitors into the building, his response to my request for time with him was immediate: “Absolutely not, I’ve got meetings.” Now he has 10 minutes, maybe.

Groom found what he was looking for when he started following the academic work coming out of the labs of Levine and Chelsea Finn, a former Berkeley PhD student of Levine’s who now runs her own lab at Stanford focused on robotic learning. Their names kept appearing in everything interesting happening in robotics. When he heard rumors they might be starting something, he tracked down Karol Hausman, a Google DeepMind researcher who also taught at Stanford and who Groom had learned was involved. “It was just one of those meetings where you walk out and it’s like, This is it.”

Groom never intended to become a full-time investor, he tells me, even though some might wonder why not given his track record. After leaving Stripe, where he was an early employee, he spent roughly five years as an angel investor, making early bets on companies like Figma, Notion, Ramp, and Lattice while searching for the right company to start or join himself. His first robotics investment, Standard Bots, came in 2021 and reintroduced him to a field he’d loved as a kid building Lego Mindstorms. As he jokes, he was “on vacation much more as an investor.” But investing was just a way to stay active and meet people, not the endgame. “I was looking for five years for the company to go start post-Stripe,” he says. “Good ideas at a good time with a good team — [that’s] extremely rare. It’s all execution, but you can execute like hell on a bad idea, and it’s still a bad idea.”

Image Credits: Connie Loizos for TechCrunch

The two-year-old company has now raised over $1 billion, and when I ask about its runway, he’s quick to clarify it doesn’t actually burn that much. Most of its spending goes toward compute. A moment later, he acknowledges that under the right terms, with the right partners, he’d raise more. “There’s no limit to how much money we can really put to work,” he says. “There’s always more compute you can throw at the problem.”

What makes this arrangement particularly unusual is what Groom doesn’t give his backers: a timeline for turning Physical Intelligence into a money-making endeavor. “I don’t give investors answers on commercialization,” he says of backers that include Khosla Ventures, Sequoia Capital, and Thrive Capital among others that have valued the company at $5.6 billion. “That’s sort of a weird thing, that people tolerate that.” But tolerate it they do, and they may not always, which is why it behooves the company to be well-capitalized now.

So what’s the strategy, if not commercialization? Quan Vuong, another co-founder who came from Google DeepMind, explains that it revolves around cross-embodiment learning and diverse data sources. If someone builds a new hardware platform tomorrow, they won’t need to start data collection from scratch — they can transfer all the knowledge the model already has. “The marginal cost of onboarding autonomy to a new robot platform, whatever that platform might be, it’s just a lot lower,” he says.

The company is already working with a small number of companies in different verticals — logistics, grocery, a chocolate maker across the street — to test whether their systems are good enough for real-world automation. Vuong claims that in some cases, they already are. With their “any platform, any task” approach, the surface area for success is large enough to start checking off tasks that are ready for automation today.

Physical Intelligence isn’t alone in chasing this vision. The race to build general-purpose robotic intelligence — the foundation on which more specialized applications can be built, much like the large language models that captivated the world three years ago — is heating up. Pittsburgh-based Skild AI, founded in 2023, just this month raised $1.4 billion at a $14 billion valuation and is taking a notably different approach. While Physical Intelligence remains focused on pure research, Skild AI has already deployed its “omni-bodied” Skild Brain commercially, saying it generated $30 million in revenue in just a few months last year across security, warehouses, and manufacturing.

Image Credits: Connie Loizos for TechCrunch

Skild has even taken public shots at competitors, arguing on its blog that most “robotics foundation models” are just vision-language models “in disguise” that lack “true physical common sense” because they rely too heavily on internet-scale pretraining rather than physics-based simulation and real robotics data.

It’s a pretty sharp philosophical divide. Skild AI is betting that commercial deployment creates a data flywheel that improves the model with each real-world use case. Physical Intelligence is betting that resisting the pull of near-term commercialization will enable it to produce superior general intelligence. Who’s “more right” will take years to resolve.

In the meantime, Physical Intelligence operates with what Groom describes as unusual clarity. “It’s such a pure company. A researcher has a need, we go and collect data to support that need — or new hardware or whatever it is — and then we do it. It’s not externally driven.” The company had a 5- to 10-year roadmap of what the team thought would be possible. By month 18, they’d blown through it, he says.

The company has about 80 employees and plans to grow, though Groom says hopefully “as slowly as possible.” The most challenging part, he says, is hardware. “Hardware is just really hard. Everything we do is so much harder than a software company.” Hardware breaks. It arrives slowly, delaying tests. Safety considerations complicate everything.

As Groom springs up to rush to his next commitment, I’m left watching the robots continue their practice. The pants are still not quite folded. The shirt remains stubbornly right-side-out. The zucchini shavings are piling up nicely.

There are obvious questions, including my own, about whether anyone actually wants a robot in their kitchen peeling vegetables, about safety, about dogs going crazy at mechanical intruders in their homes, about whether all of the time and money being invested here solves big enough problems or creates new ones. Meanwhile, outsiders question the company’s progress, whether its vision is achievable, and if betting on general intelligence rather than specific applications makes sense.

If Groom has any doubts, he doesn’t show them. He’s working alongside people who’ve been at this problem for decades and who believe the timing is finally right, which is all he needs to know.

Besides, Silicon Valley has been backing people like Groom and giving them a lot of rope since the beginning of the industry, knowing there’s a good chance that even without a clear path to commercialization, even without a timeline, even without certainty about what the market will look like when they get there, they’ll figure it out. It doesn’t always work out. But when it does, it tends to justify a lot of the times it didn’t.


AWS revenue continues to soar as cloud demand remains high

Amazon Web Services ended 2025 with its strongest quarterly growth rate in more than three years.

The company reported Thursday that its cloud service business recorded $35.6 billion in revenue in the fourth quarter of 2025. This figure marks a 24% year-on-year increase and the business segment’s largest growth rate in 13 quarters. Annual revenue run rate for the business segment is $142 billion, according to Amazon. The cloud service also saw operating income rise to $12.5 billion in the fourth quarter, up from $10.6 billion in the same period in 2024.

“It’s very different having 24% year-over-year growth on $142 billion annualized run rate than to have a higher percentage growth on a meaningfully smaller base, which is the case with our competitors,” Amazon CEO Andy Jassy said during the company’s fourth-quarter earnings call. “We continue to add more incremental revenue and capacity than others, and extend our leadership position.”

That fourth-quarter growth was fueled by new agreements with Salesforce, BlackRock, Perplexity, and the U.S. Air Force, among other companies and government entities.

“More of the top 500 U.S. startups use AWS as their primary cloud provider than the next two providers combined,” Jassy said. “We’re adding significant EC2 core computing capacity each day.”

AWS also added more than a gigawatt of power to its data center network in the fourth quarter.

Jassy said AWS still sees a fair amount of its business coming from enterprises that want to move infrastructure from on-premises data centers to the cloud. AWS is, of course, also seeing a boost from the AI boom, and Jassy credited the functionality of AWS’s top-to-bottom AI stack.

“We consistently see customers wanting to run their AI workloads where the rest of their applications and data are,” Jassy said. “We’re also seeing that as customers run large AI workloads on AWS, they’re adding to their core AWS footprint as well.”

AWS made up 16.6% of Amazon’s overall $213.4 billion revenue in the fourth quarter.

AWS’s success wasn’t enough to appease Amazon investors, however. Amazon shares fell 10% in after-hours trading after the company laid out plans to boost capital expenditures and missed Wall Street’s expectations on earnings per share.

Reddit looks to AI search as its next big opportunity

Reddit suggested on Thursday that its AI-powered search engine could be the next big opportunity for its business — not just in terms of product, but also as a revenue driver impacting its bottom line. During the company’s fourth-quarter earnings call on Thursday, it offered an update on its plans to merge traditional and AI search together and hinted that although search is not yet monetized, “it’s an enormous market and opportunity.”

In particular, the company believes that generative AI search will be “better for most queries.”

“There’s a type of query we’re, I think, particularly good at — I would argue, the best on the internet — which is questions that have no answers, where the answer actually is multiple perspectives from lots of people,” said Reddit CEO Steve Huffman.

Traditional search, meanwhile, is more like navigation — it’s a way to find the right link to a topic or subreddit. But LLMs can be good at this, too, if not better, he said. “So that’s the direction we’re going.”

The exec also noted that weekly active users for search grew 30% over the past year, from 60 million to 80 million. Meanwhile, weekly active users for the AI-powered Reddit Answers grew from 1 million in the first quarter of 2025 to 15 million by the fourth quarter.

“We’re seeing a lot of growth there, and I think there’s a lot of potential too,” Huffman added.

Reddit said it’s working to modernize the AI answers interface by making its responses more media-rich, and pilots of this are already underway.

The company is also thinking about how it can position itself when it’s not just a social site, but a place people come for answers. Reddit told investors on the call that it’s doing away with the distinction between logged-in and logged-out users starting in Q3 2026, as it will aim to personalize the site — using AI and machine learning — and make it relevant to whoever shows up.

The company announced in 2025 it was planning to combine its AI search feature, Reddit Answers, with its traditional search engine to improve the experience for end users. In the fourth quarter, Reddit said it had made “significant progress” in unifying its core search and its AI feature. It also released five new languages on Reddit Answers and is piloting dynamic agents along with search results that include “media beyond text.”

Though Reddit sees value in its AI answers, it isn’t keeping its data to itself. The company’s content licensing business, which allows other companies to train their AI models on its data, is growing, too. Revenue from that business is reported as part of Reddit’s “other” revenues (i.e., its non-ad revenue). This “other” revenue increased 8% year-over-year to reach $36 million in Q4 and was up 22% to reach $140 million for 2025.

Sapiom raises $15M to help AI agents buy their own tech tools

People without coding backgrounds are discovering that they can build their own custom apps using so-called vibe coding — solutions like Lovable that turn plain-language descriptions into working code.

While these prompt-to-code tools can help create nice prototypes, launching them into full-scale production (as this reporter recently discovered) can be tricky without figuring out how to connect the application to external tech services — those that send text messages via SMS or email, for instance, or that process Stripe payments.

Ilan Zerbib, who spent five years as Shopify’s director of engineering for payments, is building a solution that could eliminate these backend infrastructure headaches for non-technical creators.

Last summer, Zerbib launched Sapiom, a startup developing the financial layer that allows AI agents to securely purchase and access software, APIs, data, and compute — essentially creating a payment system that lets AI automatically buy the services it needs.

Every time an AI agent connects to an external tool like Twilio for SMS, it requires authentication and a micro-payment. Sapiom’s goal is to make this whole process seamless, letting the AI agent decide what to buy and when without human intervention.

“In the future, apps are going to consume services which require payments. Right now, there’s no easy way for agents to actually access all of that,” said Amit Kumar, a partner at Accel.

Kumar has met with dozens of startups in the AI payments space, but he believes Zerbib’s focus on the financial layer for enterprises, rather than consumers, is what’s truly needed to make AI agents work. That’s why Accel is leading Sapiom’s $15 million seed round, with participation from Okta Ventures, Gradient Ventures, Array Ventures, Menlo Ventures, Anthropic, and Coinbase Ventures.

“If you really think about it, every API call is a payment. Every time you send a text message, it’s a payment. Every time you spin up a server for AWS, it’s a payment,” Kumar told TechCrunch.

While it’s still early days for Sapiom, the startup hopes that its infrastructure solution will be adopted by vibe-coding companies and other companies creating AI agents that will eventually be tasked with doing many things on their own.

For example, anyone who has vibe-coded an app with SMS capabilities won’t have to manually sign up for Twilio, add a credit card, and copy an API key into their code. Instead, Sapiom handles all of that in the background, and the person building the micro-app will be charged for Twilio’s services as a pass-through fee by Lovable, Bolt, or another vibe-coding platform.

While Sapiom is currently focused on B2B solutions, its technology could eventually empower personal AI agents to handle consumer transactions. The expectation is that individuals will one day trust agents to make independent financial decisions, such as ordering an Uber or shopping on Amazon. While that future is exciting, Zerbib believes that AI won’t magically make people buy more things, which is why he’s focusing on creating financial layers for businesses instead.
