Sam Altman got exceptionally testy over Claude Super Bowl ads
Anthropic’s Super Bowl commercial, one of four ads the AI lab dropped on Wednesday, begins with the word “BETRAYAL” splashed boldly across the screen. The camera pans to a man earnestly asking a chatbot (obviously intended to depict ChatGPT) for advice on how to talk to his mom.
The bot, portrayed by a blonde woman, offers some classic bits of advice. Start by listening. Try a nature walk! Then the advice twists into an ad for a fictitious (we hope!) cougar-dating site called Golden Encounters. Anthropic finishes the spot by saying that while ads are coming to AI, they won’t be coming to its own chatbot, Claude.
Another one features a slight young man looking for advice on building a six-pack. After he offers his height, age, and weight, the bot serves him an ad for height-boosting insoles.
The Anthropic commercials take clever aim at OpenAI’s users, following that company’s recent announcement that ads are coming to ChatGPT’s free tier. And they caused an immediate stir, spawning headlines that Anthropic “mocks,” “skewers,” and “dunks on” OpenAI.
They are funny enough that even Sam Altman admitted on X that he laughed at them. But he clearly didn’t really find them funny. They inspired him to write a novella-sized rant that devolved into calling his rival “dishonest” and “authoritarian.”
In that post, Altman explained that an ad-supported tier is intended to shoulder the burden of offering free ChatGPT to many of its millions of users. ChatGPT is still the most popular chatbot by a large margin.
But the OpenAI CEO insisted the ads were “dishonest” in implying that ChatGPT will twist a conversation to insert an ad (and possibly for an off-color product, to boot). “We would obviously never run ads in the way Anthropic depicts them,” Altman wrote in the social media post. “We are not stupid and we know our users would reject that.”
Indeed, OpenAI has promised that ads will be separate, labeled, and never allowed to influence a chat. But the company has also said it plans to make them conversation-specific, which is the central allegation of Anthropic’s ads. As OpenAI explained in its blog: “We plan to test ads at the bottom of answers in ChatGPT when there’s a relevant sponsored product or service based on your current conversation.”
Altman then went on to fling some equally questionable assertions at his rival. “Anthropic serves an expensive product to rich people,” he wrote. “We also feel strongly that we need to bring AI to billions of people who can’t pay for subscriptions.”
But Claude has a free chat tier, too; its subscription tiers run $0, $17, $100, and $200, while ChatGPT’s are $0, $8, $20, and $200. One could argue the subscription tiers are roughly equivalent.
Altman also alleged in his post that “Anthropic wants to control what people do with AI.” He argued that Anthropic blocks “companies they don’t like,” such as OpenAI, from using Claude Code, and said Anthropic tells people what they can and can’t use AI for.
True, Anthropic’s whole marketing deal since day one has been “responsible AI.” The company was founded by two OpenAI alums, after all, who claimed they grew alarmed about AI safety while working there.
Still, both chatbot companies have usage policies and AI guardrails, and both talk about AI safety. And while OpenAI allows ChatGPT to be used for erotica and Anthropic does not, OpenAI, too, has determined that some content should be blocked, particularly with regard to mental health.
Yet Altman took this Anthropic-tells-you-what-to-do argument to an extreme level when he accused Anthropic of being “authoritarian.”
“One authoritarian company won’t get us there on their own, to say nothing of the other obvious risks. It is a dark path,” he wrote.
Using “authoritarian” in a rant over a cheeky Super Bowl ad is misplaced, at best. It’s particularly tactless considering the current geopolitical environment, in which protesters around the world have been killed by agents of their own governments. Business rivals have been duking it out in ads since the beginning of time, but Anthropic clearly hit a nerve.
After backlash, Adobe cancels Adobe Animate shutdown and puts app on ‘maintenance mode’
Adobe is putting on hold its plan to discontinue Adobe Animate, following intense backlash from customers after the company announced it would shut down the 2D animation software amid an increased focus on its AI investments.
“We are not discontinuing or removing access to Adobe Animate. Animate will continue to be available for both current and new customers, and we will ensure you continue to have access to your content,” the company wrote in a post on Wednesday.
Adobe’s Monday announcement about discontinuing Animate was met with incredulity, disappointment, and anger, and users aired concerns about the lack of alternatives that mirror Animate’s functionality.
The company changed its tune on Wednesday, saying there would no longer be a “deadline or date by which Animate will no longer be available.”
“Adobe Animate is in maintenance mode for all customers. This applies to individual, small business, and enterprise customers. Maintenance mode means we will continue to support the application and provide ongoing security and bug fixes, but we are no longer adding new features. Animate will continue to be available for both new and existing users - we will not be discontinuing or removing access to Adobe Animate,” it said.
One customer, posting on X, had asked Adobe to at least open source the software rather than abandon it. Commenters on the thread responded with angst, saying things like, “this is legit gonna ruin my life,” and, “literally what the hell are they doing? animate is the reason a good chunk of adobe users even subscribe in the first place.”
On Monday, the company updated its support site and sent emails to existing customers announcing that Adobe Animate would be discontinued on March 1, 2026. Enterprise customers would continue to receive technical support through March 1, 2029, to ease the transition, the company said at the time. Other customers would have support through March of next year.
Adobe explained its decision to discontinue the program in an FAQ, saying, “Animate has been a product that has existed for over 25 years and has served its purpose well for creating, nurturing, and developing the animation ecosystem. As technologies evolve, new platforms and paradigms emerge that better serve the needs of the users. Acknowledging this change, we are planning to discontinue supporting Animate.”
Reading between the lines, it seemed as if Adobe was saying that Animate no longer represents the current direction of the company, which is now more focused on products that incorporate AI technologies.
What’s surprising is that Adobe couldn’t even recommend software that would fully replace what customers are losing with Animate. Instead, it said customers with a Creative Cloud Pro plan can use other Adobe apps to “replace portions of Animate functionality.”
For instance, it suggested that Adobe After Effects can support complex keyframe animation using the Puppet tool, and Adobe Express can be used for animation effects that can be applied to photos, videos, text, shapes, and other design elements.
There were hints that Adobe was headed in this direction when no mention was made of Animate at the company’s annual Adobe Max conference. Plus, no 2025 version of the software was released.
Before switching to “maintenance mode,” Adobe had intended for the software to keep working for those who had already downloaded it. Typically, Adobe charged $34.49 per month for the software, which dropped to $22.99 with a 12-month commitment. The annual prepaid plan was available for $263.88. Now, the company says the software will be available to new users as well.
Some users have been recommending other animation programs to use as a replacement, including Moho Animation and Toon Boom Harmony.
Updated, February 4, 2026, to note that Adobe reversed its decision and announced the software would be placed in maintenance mode instead of discontinued.
ElevenLabs raises $500M from Sequoia at an $11 billion valuation
Voice AI company ElevenLabs said today it raised $500 million in a new funding round led by Sequoia Capital, which previously invested in the startup through a tender offer in its last secondary round. Sequoia partner Andrew Reed is joining the company’s board.
The startup is now valued at $11 billion, more than three times its valuation in its last round in January 2025. Earlier in the year, the Financial Times reported that the startup was looking to raise at that valuation.
The company said that existing investor a16z quadrupled its investment amount, and Iconiq, which led the last round, tripled it. Some prior investors, like BroadLight, NFDG, Valor Capital, AMP Coalition, and Smash Capital, also joined the round. New investors for the funding included Lightspeed Venture Partners, Evantic Capital, and Bond.
ElevenLabs said it will disclose some additional investors, which might be strategic partners, later in February. The company has raised over $781 million to date. It said it will use the funding for research and product development, along with expansion in international markets like India, Japan, Singapore, Brazil, and Mexico.
The company’s co-founder, Mati Staniszewski, indicated that ElevenLabs might work on agents beyond voice and incorporate video. In January, the company announced a partnership with LTX to produce audio-to-video content.
“The intersection of models and products is critical – and our team has proven, time and again, how to translate research into real-world experiences. This funding helps us go beyond voice alone to transform how we interact with technology altogether. We plan to expand our Creative offering – helping creators combine our best-in-class audio with video and Agents – enabling businesses to build agents that can talk, type, and take action,” he said in a statement.
The company has seen strong growth momentum, closing the year at $330 million in ARR. In an interview with Bloomberg earlier this year, Staniszewski said that it took ElevenLabs five months to reach $200 million to $300 million in ARR.
Voice AI model providers are an attractive target for investors and big tech companies. In January, rival Deepgram raised $130 million from AVP at a $1.3 billion valuation. Meanwhile, Google hired top talent from voice model company Hume AI, including CEO Alan Cowen.
Roblox’s 4D creation feature is now available in open beta
Last year, Roblox launched an open source AI model that could generate 3D objects on the platform, helping users quickly design digital items such as furniture, vehicles, and accessories. The company claims the tool, called Cube 3D, has helped users generate over 1.8 million 3D objects since it rolled out last March.
On Wednesday, the company launched the open beta for its anticipated 4D creation feature that lets creators make not just static 3D models, but fully functional and interactive objects. The feature has been in early access since November.
Roblox says 4D creation adds an important new layer: interactivity. With this technology, users can design items that can move and react to players in the game.

At launch, there are two types of object templates (called schemas) that creators can try out.
The first is the “Car-5” schema, which is used to create a car made of five separate parts: the main body and four wheels. Previously, generated cars were a single, solid 3D object that couldn’t move. The new system breaks objects down into parts and assigns behaviors to each so that they function individually within the virtual world. The AI can therefore generate cars with spinning wheels, making them more realistic and interactive.
The second is called “Body-1,” which can generate any object made from a single piece, like a simple box or sculpture.
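To make the schema idea concrete, here is a minimal, purely hypothetical sketch of how a “Car-5”-style template might split a generated object into named parts with per-part behaviors. Roblox has not published its schema format in this detail, so the names and fields below are illustrative assumptions only, not the company’s actual API.

# Hypothetical illustration only: NOT Roblox's actual schema format or API.
# It mirrors the idea above: a schema names an object's parts and assigns each
# part a behavior, so the generated model can move rather than sit still as a
# single solid mesh.
from dataclasses import dataclass, field

@dataclass
class Part:
    name: str
    behavior: str  # how the part acts in-game, e.g. "static" or "spin"

@dataclass
class Schema:
    name: str
    parts: list = field(default_factory=list)

# "Car-5": one static body plus four wheels that can spin independently.
car_5 = Schema(
    name="Car-5",
    parts=[Part("body", "static")] + [Part(f"wheel_{i}", "spin") for i in range(4)],
)

# "Body-1": any object generated as a single piece, like a box or sculpture.
body_1 = Schema(name="Body-1", parts=[Part("body", "static")])

for schema in (car_5, body_1):
    print(schema.name, [p.name for p in schema.parts])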
The first experience with 4D generation is a game called Wish Master, where players can generate cars they can drive, planes they can fly, and even dragons.
In the future, Roblox plans to let creators make their own schemas so they’ll have more freedom to define how objects behave. The company says it is also developing new technology that could use a reference image to create a detailed 3D model that matches the image’s style.

The company says it is developing more ways to help people create games and experiences using AI, including a project it has dubbed “real-time dreaming.” Roblox CEO David Baszucki last month explained that this project would let creators build new worlds using “keyboard navigation and sharing real-time text prompts.”
The open beta comes on the heels of Roblox’s recent implementation of mandatory facial verification for users to access chat features in the game, following lawsuits and investigations related to child safety.
