Tech
IBM will hire your entry-level talent in the age of AI
While the artificial intelligence industry touts AI’s ability to replace entry-level jobs, not every company is scaling back hiring for these positions. In IBM’s case, it’s going all in.
Hardware giant IBM plans to triple entry-level hiring in the U.S. in 2026, according to reporting from Bloomberg. Nickle LaMoreaux, IBM’s chief human resources officer, announced the initiative at Charter’s Leading with AI Summit on Tuesday.
“And yes, it’s for all these jobs that we’re being told AI can do,” LaMoreaux said.
These jobs will look different from the entry-level roles IBM used to offer, she explained. LaMoreaux said she rewrote the job descriptions so they are less focused on areas AI can actually automate, like coding, and more focused on people-forward work like engaging with customers.
This strategy makes sense. Even if an enterprise like IBM doesn’t need as much entry-level talent as it did before, fostering less experienced workers helps ensure those employees have the skills needed for higher-level roles down the road.
IBM didn’t specify how many people it would be hiring through this initiative. TechCrunch reached out to IBM for more information on the hiring plans.
This year could be a pivotal one for understanding AI’s impact on the hiring market. An MIT study in 2025 estimated that 11.7% of jobs could likely already be automated by AI, and a TechCrunch survey found that multiple investors think 2026 will start to show AI’s potential impact on the labor market, even though they weren’t asked about labor specifically.
Tech
Designer Kate Barton teams up with IBM and Fiducia AI for a NYFW presentation
On Saturday, designer Kate Barton will unveil her latest collection at New York Fashion Week — with a twist, of course. Barton teamed up with Fiducia AI to create a multilingual AI agent (built with IBM watsonx on IBM Cloud) to help guests identify pieces of the collection and try them on virtually.
TechCrunch caught up with Barton and Ganesh Harinath, the founder and CEO of Fiducia AI, before the show to learn more about the presentation.
For one, Barton said technology is baked into how she thinks. She likes playing with the real and the unreal, and she approached AI like set design: “a portal into the collection’s world, rather than ‘AI for AI’s sake,’” she said.
“Today, tech is a tool for expanding the world around the clothes, how they are presented, and how people enter the story, and how we create that moment when your eyes do a double-take,” she told TechCrunch, adding that the goal for this collection was to create a sense of curiosity.
Harinath said his company used IBM watsonx, IBM Cloud, and IBM Cloud Object Storage to help pull off Barton’s presentation. The result is a production-grade activation with a Visual AI lens (built with IBM watsonx) that detects pieces from Barton’s new collection, answers questions in any language via voice and text, and offers photorealistic virtual try-ons.
“The hardest work wasn’t model tuning; it was orchestration,” he told TechCrunch. This isn’t the first time Barton has put a technological spin on her fashion: last season, she experimented with AI models, also in collaboration with Fiducia AI.
At fashion week, there was some chatter about whether brands would be using technology and artificial intelligence, and if so, which ones. Barton thinks many brands are using AI, though quietly, mainly in operations. “Maybe fewer are using it publicly because of the potential reputational risk,” she said.
It rhymes a bit with the early days when many big fashion names were nervous about starting websites. “Then it became inevitable, and eventually the question shifted from ‘should we be online’ to ‘is our online presence any good?’” she said.

Harinath added that, though many brands are experimenting with AI, much of its deployment remains at the surface level — such as chatbots, content generation, and internal productivity tools.
But Barton sees a world of better prototyping, better visualization, smarter production decisions, and more immersive ways to experience fashion, without replacing the humans who “actually make it worth wearing.” Change will only come with more clarity, she said, with “clear discourse, clear licensing, clear credit, and a shared understanding that human creativity is not an annoying overhead cost.”
“If the technology is used to erase people, I am not into it,” she said, adding that audiences are smarter than we think. “They can tell the difference between invention and avoidance.”
Despite the tension, AI is becoming more routine, and there will come a day when shows like Barton’s are just part of the norm. Harinath thinks AI in fashion will be normalized by 2028, and by 2030, he sees it becoming embedded into the operational core of retail.
“Most of this technology already exists — the differentiator now is assembling the right partners and building teams that can operationalize it responsibly,” he said.
Dee Waddell, Global Head of Consumer, Travel and Transportation Industries at IBM Consulting, agreed. “When inspiration, product intelligence, and engagement are connected in real time, AI moves from being a feature to becoming a growth engine that drives measurable competitive advantage,” Waddell told TechCrunch.
But until then, there is this show.
“The most exciting future for fashion is not automated fashion,” Barton said. “It is fashion that uses new tools to heighten craft, deepen storytelling, and bring more people into the experience, without flattening the people who make it.”
Tech
Hollywood isn’t happy about the new Seedance 2.0 video generator
Hollywood organizations are pushing back against a new AI video model called Seedance 2.0, which they say has quickly become a tool for “blatant” copyright infringement.
ByteDance, the Chinese company that recently finalized a deal to sell TikTok’s U.S. operations (it retains a stake in the new joint venture), launched Seedance 2.0 earlier this week. According to the Wall Street Journal, the updated model is currently available to Chinese users of ByteDance’s Jianying app, and the company says it will soon be available to global users of its CapCut app.
Similar to tools such as OpenAI’s Sora, Seedance allows users to create videos (currently limited to 15 seconds in length) by just entering a text prompt. And like Sora, Seedance quickly drew criticism for an apparent lack of guardrails around the ability to create videos using the likeness of real people, as well as studios’ intellectual property.
After one X user posted a brief video showing Tom Cruise fighting Brad Pitt, which they said was created by “a 2 line prompt in seedance 2,” “Deadpool” screenwriter Rhett Reese responded, “I hate to say it. It’s likely over for us.”
The Motion Picture Association soon issued a statement from CEO Charles Rivkin demanding that ByteDance “immediately cease its infringing activity.”
“In a single day, the Chinese AI service Seedance 2.0 has engaged in unauthorized use of U.S. copyrighted works on a massive scale,” Rivkin said. “By launching a service that operates without meaningful safeguards against infringement, ByteDance is disregarding well-established copyright law that protects the rights of creators and underpins millions of American jobs.”
The Human Artistry Campaign, an initiative backed by Hollywood unions and trade groups, condemned Seedance 2.0 as “an attack on every creator around the world,” while the actors’ union SAG-AFTRA said it “stands with the studios in condemning the blatant infringement enabled by Bytedance’s new AI video model Seedance 2.0.”
Seedance videos have apparently featured Disney-owned characters such as Spider-Man, Darth Vader, and Grogu, better known as Baby Yoda, prompting Disney to take legal action. Axios reports that Disney has sent a cease-and-desist letter accusing ByteDance of a “virtual smash-and-grab of Disney’s IP” and claiming the Chinese company is “hijacking Disney’s characters by reproducing, distributing, and creating derivative works featuring those characters.”
Disney isn’t necessarily opposed to working with AI companies — while it has reportedly sent a cease-and-desist letter to Google over similar issues, it’s signed a three-year licensing deal with OpenAI.
TechCrunch has reached out to ByteDance for comment.
Tech
‘Clueless’-inspired app Alta partners with brand Public School to start integrating styling tools into websites
Much has changed for Jenny Wang, the founder who’s bringing “Clueless” fashion tech to life.
Last year, her company, Alta, raised $11 million in a round led by Menlo Ventures to let users create digital closets and try on their clothes with their own virtual avatars. It’s a tech once seen only in movies, most notably in “Clueless,” where Cher styles and plans her outfits using computer technology. Alta is similar to that, allowing users to plan and style outfits using the latest AI innovations.
A slew of big names participated in Alta’s round last year, including models Jasmine Tookes and Karlie Kloss, Anthropic’s VC arm Anthology Fund, and Rent the Runway co-founder Jenny Fleiss.
TechCrunch caught up with Wang during New York Fashion Week to talk about how the company has expanded since that round.
For starters, the product is officially in the app store; Time and Vogue named it one of the best innovations of last year, and Wang said more than 100 million outfits have been generated on the platform since its launch in 2023. It has partnerships with Poshmark and the Council of Fashion Designers of America, with more partnerships to be announced soon.
“Alta’s own app also features thousands of brands that users can shop from,” Wang said.
Right now, the company is focused on building app and website integration experiences for brands, she said, where customers can try on a designer’s clothing using a personalized Alta Avatar. This week, the company unveiled its first integration collaboration, teaming up with Public School, a storied New York City brand.
“Shoppers can style looks from the new collection on their own Alta avatar,” Wang said.
She met the Public School team, Dao-Yi Chow and Maxwell Osborne, through the founder of Poshmark, who is also an angel investor in both companies.
“Public School designers Dao-Yi Chow and Maxwell Osborne had been looking for an AI partner and virtual try-on avatar solution, and Dao-Yi has been an Alta app user himself,” Wang said.
Public School actually went on hiatus for a few years, with this NYFW marking its grand return. Asked about the comeback, the brand’s founders said they rediscovered their voices and what they wanted to say.
“We have to look at tech as a partner in the business today,” Chow told TechCrunch, adding, “It’s not 2015 anymore,” so the team wants to take advantage of the latest technological developments. “We want to be thoughtful on how we use tech and AI,” he continued, “not as a design tool but as a tool to extend our storytelling and a tool to interact with the consumer and have them experience the brand even if they can’t do so in person.”

Wang said this is one of the first instances of a designer embedding personal avatar and styling technology into its own website. Near the bottom of Public School’s product page, there is a “Style by Alta” icon. Clicking it takes customers to Alta, where they can style their avatars and see how Public School clothing would look on them before they buy.
Users can also shop Public School through Alta’s standalone app. Wang said the goal is for Alta to integrate more experiences like this into other brands’ websites, so Alta users can try on clothes directly on those sites, even outside the Alta app.
“Right now, a user would have to add a potential purchase into their Alta wishlist, then style outfits and try on their avatar, versus being able to do that directly on the brand website,” Wang said. (For every site but Public School, that is.) “The goal is to bring their community on a new journey to engage with and shop the brand.”
Many major fashion brands, like Zara and Balmain, have already experimented with digital avatars. Wang said what makes Alta different here, especially compared to Zara, is that Alta avatars can put on at least eight items within seconds, whereas Zara avatars can wear only four and often take around two minutes.
Overall, demand for virtual avatars has increased. Wang considers Alta to be both the “Clueless” technology it started out as and a digital avatar business.
“The consumer Alta app is the ‘Clueless’ closet, while the enterprise Alta experience allows shoppers to style pieces and try the outfits on their pre-existing Alta avatar,” she said. Eventually, Wang said, she wants Alta to be the “personal identity layer for the future of consumer AI and shopping.”
For agentic commerce to truly work, she said, “We need a data layer that understands the shopper’s style preferences, such as their closet, past purchases, and their avatar, likeness, and body, which is Alta.”
