Coalition demands federal Grok ban over nonconsensual sexual content
A coalition of nonprofits is urging the U.S. government to immediately suspend the deployment of Grok, the chatbot developed by Elon Musk’s xAI, in federal agencies, including the Department of Defense.
The open letter, shared exclusively with TechCrunch, follows a string of concerning incidents involving the large language model over the past year, including, most recently, a trend of X users asking Grok to turn photos of real women, and in some cases children, into sexualized images without their consent. According to some reports, Grok generated thousands of nonconsensual explicit images every hour, which were then disseminated at scale on X, Musk’s social media platform that’s owned by xAI.
“It is deeply concerning that the federal government would continue to deploy an AI product with system-level failures resulting in generation of nonconsensual sexual imagery and child sexual abuse material,” the letter, signed by advocacy groups like Public Citizen, Center for AI and Digital Policy, and Consumer Federation of America, reads. “Given the administration’s executive orders, guidance, and the recently passed Take It Down Act supported by the White House, it is alarming that [Office of Management and Budget] has not yet directed federal agencies to decommission Grok.”
xAI reached an agreement last September with the General Services Administration (GSA), the government’s purchasing arm, to sell Grok to federal agencies under the executive branch. Two months before, xAI — alongside Anthropic, Google, and OpenAI — secured a contract worth up to $200 million with the Department of Defense.
Amid the scandals on X in mid-January, Defense Secretary Pete Hegseth said Grok would join Google’s Gemini in operating inside the Pentagon network, handling both classified and unclassified documents, which experts say is a national security risk.
The letter’s authors argue that Grok has proven itself incompatible with the administration’s requirements for AI systems. According to the OMB’s guidance, systems that present severe and foreseeable risks that cannot be adequately mitigated must be discontinued.
“Our primary concern is that Grok has pretty consistently shown to be an unsafe large language model,” JB Branch, a Public Citizen Big Tech accountability advocate and one of the letter’s authors, told TechCrunch. “But there’s also a deep history of Grok having a variety of meltdowns, including antisemitic rants, sexist rants, sexualized images of women and children.”
Several governments have demonstrated an unwillingness to engage with Grok following its behavior in January, which builds on a series of incidents including the generation of antisemitic posts on X and calling itself “MechaHitler.” Indonesia, Malaysia, and the Philippines all blocked access to Grok (they’ve subsequently lifted those bans), and the European Union, the U.K., South Korea, and India are actively investigating xAI and X regarding data privacy and the distribution of illegal content.
The letter also comes a week after Common Sense Media, a nonprofit that reviews media and tech for families, published a damning risk assessment that found Grok is among the most unsafe for kids and teens. One could argue that, based on the findings of the report — including Grok’s propensity to offer unsafe advice, share information about drugs, generate violent and sexual imagery, spew conspiracy theories, and generate biased outputs — Grok isn’t all that safe for adults either.
“If you know that a large language model is or has been declared unsafe by AI safety experts, why in the world would you want that handling the most sensitive data we have?” Branch said. “From a national security standpoint, that just makes absolutely no sense.”
Andrew Christianson, a former National Security Agency contractor and current founder of Gobbi AI, a no-code AI agent platform for classified environments, says that using closed-source LLMs in general is a problem, particularly for the Pentagon.
“Closed weights means you can’t see inside the model, you can’t audit how it makes decisions,” he said. “Closed code means you can’t inspect the software or control where it runs. The Pentagon is going closed on both, which is the worst possible combination for national security.”
“These AI agents aren’t just chatbots,” Christianson added. “They can take actions, access systems, move information around. You need to be able to see exactly what they’re doing and how they’re making decisions. Open source gives you that. Proprietary cloud AI doesn’t.”
The risks of using corrupted or unsafe AI systems spill out beyond national security use cases. Branch pointed out that an LLM that’s been shown to have biased and discriminatory outputs could produce disproportionate negative outcomes for people as well, especially if used in departments involving housing, labor, or justice.
While the OMB has yet to publish its consolidated 2025 federal AI use case inventory, TechCrunch has reviewed the use cases of several agencies — most of which are either not using Grok or are not disclosing their use of Grok. Aside from the DoD, the Department of Health and Human Services also appears to be actively using Grok, mainly for scheduling and managing social media posts and generating first drafts of documents, briefings, or other communication materials.
Branch pointed to what he sees as a philosophical alignment between Grok and the administration as a reason for overlooking the chatbot’s shortcomings.
“Grok’s brand is being the ‘anti-woke large language model,’ and that ascribes to this administration’s philosophy,” Branch said. “If you have an administration that has had multiple issues with folks who’ve been accused of being Neo Nazis or white supremacists, and then they’re using a large language model that has been tied to that type of behavior, I would imagine they might have a propensity to use it.”
This is the coalition’s third letter on the subject, following letters raising similar concerns in August and October last year. In August, xAI launched “spicy mode” in Grok Imagine, triggering mass creation of nonconsensual sexually explicit deepfakes. TechCrunch also reported in August that private Grok conversations had been indexed by Google Search.
Prior to the October letter, Grok was accused of providing election misinformation, including false deadlines for ballot changes and political deepfakes. xAI also launched Grokipedia, which researchers found to be legitimizing scientific racism, HIV/AIDS skepticism, and vaccine conspiracies.
Aside from immediately suspending the federal deployment of Grok, the letter demands that the OMB formally investigate Grok’s safety failures and whether the appropriate oversight processes were conducted for the chatbot. It also asks the agency to publicly clarify whether Grok has been evaluated to comply with President Trump’s executive order requiring LLMs to be truth-seeking and neutral and whether it met OMB’s risk mitigation standards.
“The administration needs to take a pause and reassess whether or not Grok meets those thresholds,” Branch said.
TechCrunch has reached out to xAI and OMB for comment.
Lucid Bots raises $20M to keep up with demand for its window-washing drones
Andrew Ashur, the founder and CEO of window cleaning robot startup Lucid Bots, likes to joke that his company is the antithesis of the robotics industry right now.
While many companies are trying to build humanoids or tout demos of their robots dancing and doing flips, Lucid Bots’ drones are out in the field making traditionally unsexy and dangerous work, like cleaning windows, safer and more efficient.
“The sad truth is most are still selling a lot of hype and headlines, and we sell performance on the job site that shows up in our customers’ profits and losses,” Ashur told TechCrunch. “We’re not just in the lab and simulators. We’ve got dirt under our fingernails, and we’re out on job sites getting work done.”
Charlotte, North Carolina-based Lucid Bots is a full-stack robotics company that sells its Sherpa drones and Lavo robot to cleaning companies to help them on their job sites. The company designs and manufactures its own robots in the U.S. and just raised a $20 million Series B round co-led by Cubit Capital and Idea Fund Partners. This brings its total funding to $34 million.
The company plans to use the money for hiring to keep up with demand, although Ashur joked that they’ve run out of parking spots at their manufacturing facility.
“We have more requests for demos than we have hours in the day, so we need to scale up capacity and head count,” Ashur said. “As a founder, when we don’t have enough hours in the day to do all the demos, it gives me a little bit of heartburn.”
Demand from customers and investors wasn’t there in the beginning, Ashur said. It took the company half a decade to ship its first 100 robots, and it took a fair amount of convincing to get VCs to back a robotics founder with a liberal arts background and no robotics experience.
Ashur got the original idea for the company while he was a junior at Davidson College studying economics and Spanish. He happened to walk by a building that was being cleaned by window washers. It was a windy day, and the workers’ swing stage started to knock around and slam into the building.
Watching the harrowing scene made Ashur think about how technology could make this safer.
“Built infrastructure is literally the largest asset class in the world, but right now, we’ve got these three compounding issues,” Ashur said. “We’ve got aging infrastructure, the new infrastructure we’re building is getting bigger and harder to maintain, and, last but not least, we have less and less people willing and able to do the work. We needed to start building drones and robots to bridge that gap.”
Lucid Bots was launched in 2018 and started out as a cleaning company that took contract jobs to learn more about the industry. After two years, and a few cleaning chemical burns, Ashur said they knew what their drone needed to be successful.
Lucid Bots’ sales have gained momentum recently. It took the startup five years to sell 100 units; now it is approaching 1,000.
The company continues to improve its bots and drones in an effort to keep sales ticking along. Data collected by the robots is fed back to the underlying software, which is used to improve both of Lucid Bots’ products. The company is also building a tool that will allow its bots to be used for adjacent categories like painting, waterproofing, and sealing, among others.
“We recently waterproofed a massive university stadium that was starting to age, still using the same brain and frame as a Sherpa,” Ashur said. “Part of why we went there is because our existing customers were pulling us there and we were getting, gosh, probably about 50 or so inbound leads a month related to painting and coating and that was before we even began marketing that option.”
After pivoting, Y Combinator grad Glimpse raises $35M led by a16z
Dispute-tracking fintech Glimpse announced Wednesday that it raised a $35 million Series A led by Andreessen Horowitz, with participation from 8VC and Y Combinator.
Founders Akash Raju, Anuj Mehta, and Kushal Negi attended Purdue together and were initially building a startup that did Airbnb product placements. That company launched in 2020, but by 2024, the founders pivoted to a wholly new idea: Glimpse, a platform that helps retailers automate financial deduction processes.
After the pivot, it raised a $10 million round last year, led by 8VC, which it called a Series A at the time. It is now calling this fresh $35 million a Series A and rebranding the previous round as a seed. The company has raised $52 million to date, including funding it raised before the pivot.
“We ultimately felt we lacked product-market fit and decided to hard pivot,” Raju said of the first, unsuccessful idea. “In this process, we had exposure to brands’ back offices and the chaos that was selling in retail, ultimately leading us to start Glimpse as it is today.”
They met their lead a16z investor through a mutual founder friend. “We built a strong relationship as we scaled the business. Really excited we can partner with them for this next stage of growth,” he continued.
Deductions are the amounts a retailer subtracts from what it owes a brand when settling an invoice. The practice is commonplace and typically works like this: A brand bills the retailer, and the retailer pays. If the retailer pays less than what was billed, it provides a reason, such as that the goods were damaged.
Some of the deductions are for valid reasons, but some aren’t — those are called invalid deductions, and they are tedious to track and manage on the back end. “These errors are surprisingly common,” Raju, the company’s CEO, said, adding that “a brand might ship inventory correctly but still be charged for a short shipment.”
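The short-shipment example above can be sketched in a few lines of code. This is a minimal, hypothetical illustration of the core check, not Glimpse’s actual system: all names, reason codes, and records here are invented. The idea is simply that a deduction claiming a short shipment can be flagged as invalid when the brand’s own records show the full billed quantity was shipped.

```python
from dataclasses import dataclass

@dataclass
class Deduction:
    invoice_id: str
    amount: float      # dollars the retailer withheld from the invoice
    reason_code: str   # retailer's stated reason, e.g. "SHORT_SHIP"

# Hypothetical internal records: units billed vs. units actually shipped.
billed_units = {"INV-1001": 500, "INV-1002": 500}
shipped_units = {"INV-1001": 500, "INV-1002": 480}

def classify(d: Deduction) -> str:
    """Flag a short-shipment deduction as invalid when internal
    records show the full billed quantity was shipped."""
    if d.reason_code == "SHORT_SHIP":
        if shipped_units[d.invoice_id] >= billed_units[d.invoice_id]:
            return "invalid"   # shipped in full, so the charge is disputable
        return "valid"         # the brand really did ship short
    return "needs_review"      # other reason codes need more context

deductions = [
    Deduction("INV-1001", 1200.0, "SHORT_SHIP"),
    Deduction("INV-1002", 950.0, "SHORT_SHIP"),
]
disputes = [d for d in deductions if classify(d) == "invalid"]
print([d.invoice_id for d in disputes])  # only INV-1001: records show it shipped in full
```

In practice, per Raju’s description below, the hard part is not this comparison but assembling the inputs: pulling documents from multiple retailer portals and reconciling them against internal supply chain and promotion records.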
“Teams log into multiple retailer systems, pull scattered documents, review line items, reconcile against internal records, and manage disputes end-to-end. The challenge is driven by fragmented, unstructured data and siloed workflows across systems and teams,” he said of how the process usually goes.
If the brand doesn’t reconcile every invalid deduction, that could lead “to consistent revenue leakage,” he said.
Glimpse says it helps with this process by reviewing deductions, flagging invalid ones, and filing disputes, helping companies recover money they may have missed or lost. The platform’s AI agents log into a retailer’s portal, find and centralize all necessary documents, then classify each deduction, Raju explained. From there, the AI agents validate each change against internal data (such as supply chain records and promotion calendars) to determine which deductions are legitimate and which are not.
The company said it works with more than 200 retail brands, including Suave and its lip balm brand Chapstick.
“When issues are identified, Glimpse automatically files disputes, follows through on the process, applies recovered cash, and syncs everything back to the brand’s ERP,” Raju said, adding that the product integrates across multiple systems. In addition to the main enterprise resource planning financial software, it integrates with promotion calendars and retail portals. It can shorten a long process down to days, he said.
Despite Glimpse’s automation, Raju said his company does have humans in the loop, “primarily around ensuring outcomes,” he said, like “following up on disputes to drive resolution and cash recovery, as well as quality assurance on critical steps like classification and data extraction.”
The system gets smarter each time a deduction is processed and continuously refines its classification, validation, and resolution. “Over time, this creates a compounding data advantage, where each new integration and customer makes the system smarter and more effective across the entire network,” he said.
Others are tackling invalid deductions with software, too, such as Revya and Confido.
“Our vision is to be the AI infrastructure for CPG and retail brands, and this capital helps continue executing toward that vision,” he said.
Harbinger’s next product will be hybrid emergency vehicles
Trucking startup Harbinger is still a relatively new company, but the flexibility of its electric vehicle platform has helped it capture another customer in a different line of business. This time, Harbinger’s chassis will be used in emergency vehicles for 70-year-old company Frazer.
The two companies announced Wednesday that Frazer will build ambulances on the hybrid version of Harbinger’s platform, as well as larger mobile healthcare vehicles. Frazer will also become a customer of Harbinger’s new energy storage business, which the startup debuted earlier this year in a partnership with Airstream.
The deal shows that companies like Harbinger are finding success with electric and hybrid vehicles despite headwinds in the passenger vehicle space in the United States. Grounded, another startup based in Detroit, revealed this week that it worked with Colgate to develop a small fleet of mobile dental care vehicles.
The key to Harbinger’s success is its flexible platform, co-founder and CEO John Harris said in an exclusive interview with TechCrunch. The simple truck chassis can be shortened or lengthened depending on a customer’s needs, and Harbinger can drop in a range-extending combustion engine if desired. Harbinger is just a few years old, but this one platform already powers RVs (built with THOR Industries), FedEx delivery vans, a smaller box truck design, and now ambulances. This has helped the company raise more than $300 million to date.
“If you look across the step van and the RV use case, we’ve got three wheel bases, four different GVWR [gross vehicle weight ratings], and sort of four different powertrain options, with four, five, [or] six battery packs, plus the hybrid across all of that stuff. We have 99.5% part commonality,” Harris said. “That’s the game changer.”
Frazer CEO Laura Griffin told TechCrunch that switching to Harbinger’s hybrid powertrain — which is predominantly electric but leverages the gas engine to top up the battery — was a no-brainer because it helps lower her customers’ total cost of ownership and increases their uptime.
“We’re constantly looking for, what innovations can elevate the experience for our end users, which are going to typically be municipalities, 911 entities, hospitals,” she said. “They’re doing it where it’s comparative to other medium-duty chassis, so it checks all of the boxes for us.”
Griffin said Frazer will buy the battery-based auxiliary power units from Harbinger and use them on both the newer hybrid emergency vehicles and the older combustion versions. These will replace standard generators and can help first responders (or users of the mobile healthcare vehicles) power medical devices in the field without pulling energy from a vehicle’s battery pack or combustion engine.
“In the back of an emergency vehicle, for instance, an ambulance, you can imagine there’s a lot of equipment, and all of the latest, greatest equipment that’s added on tends to be power based,” Griffin said. “So we are looking for abundant clean power sources that don’t necessarily tie to the chassis.”
Harris sees this becoming a great business no matter how many hybrid vehicles Frazer buys, since the auxiliary power units are useful regardless of powertrain.
“It will be a faster growth curve, because there are thousands of ambulances,” he said. And he’s looking to other industries as well — especially in Harbinger’s home state of California where there are increased restrictions on the use of gas generators.
“We’re seeing a lot of interest from people saying, like, I don’t really want a generator six feet from an operator for 12 hours a day, I’d be happy to save money with batteries. I would be happy to have less emissions,” he said.
