MarketAlert – Real-Time Market & Crypto News, Analysis & Alerts
© Market Alert News. All Rights Reserved.
Elon Musk’s AI Is Generating Sexual Images Of Women And Girls. Here’s What To Do If It Happens To You.

Last updated: January 10, 2026 12:55 am

Over the past few weeks, people on X — the Elon Musk-owned social media platform — have used the app’s chatbot, Grok, to generate sexual images of women and girls without their consent.

With a few simple instructions — “put her into a very transparent mini-bikini,” for instance — Grok will digitally strip anyone down to their bikini.

A report by the nonprofit AI Forensics found that 2% of 20,000 images generated by Grok over the holidays depicted a person who appeared to be 18 or younger, including 30 young or very young women or girls in bikinis or transparent clothing. Other images depict women and girls with black eyes, covered in liquid, and looking afraid.

Despite receiving global backlash and regulatory probes in Europe, India and Malaysia, Musk first mocked the situation by sharing an array of Grok-generated images, including one depicting himself in a bikini, alongside laughing-crying emojis.

On Jan. 3, Musk commented on a separate post: “Anyone using Grok to make illegal content will suffer the same consequences as if they upload illegal content.” (We’ll explain what constitutes illegal content later on.)

Deepfake nudes are nothing new. For years, apps like “DeepNude” have given people access to deepfake technology that allows them to digitally insert women into porn or be stripped naked without their knowledge. (Of course, men have been victims of sexualized deepfakes as well, but the research indicates that men are more likely than women to perpetrate image-based abuse.)

Still, Grok’s usage this week is different and arguably more alarming, said Carrie Goldberg, a victims’ rights attorney in New York City.

“The Grok story is unique because it’s the first time there’s a combining of the deepfake technology, Grok, with an immediate publishing platform, X,” she said. “The immediate publishing capability enables the deepfakes to spread at scale.”

“It needs to be underscored how bizarre it is that the world’s richest man not only owns the companies that create and publish deepfakes, but he is also actively promoting and goading users on X to de-clothe innocent people,” Goldberg added. “Elon Musk feels entitled to strip people of their power, dignity, and clothes.”

What’s been happening the last few weeks is unfortunate, but none of it is a surprise to Riana Pfefferkorn, a policy fellow at the Stanford Institute for Human-Centered AI. Her take: This problem will get worse before it gets better.

“Every tech service that allows user-generated content will inevitably be misused to upload, store and share CSAM (child sex abuse material), as CSAM bad actors are very persistent,” she said.

The upshot is that AI companies will have to learn how to best implement robust safeguards against illegal imagery. Some companies may have a stronger culture of “CSAM/nonconsensual deepfake porn is not OK.”

Others will try to have it both ways, establishing loose guardrails for safety while also trying to make money from permissible NSFW imagery, Pfefferkorn said.

“Unfortunately, while I don’t have any direct insight, x.AI does not seem to have that strong of a corporate culture in that respect, going off Elon Musk’s dismissive reaction to the current scandal as well as previous reporting from a few months ago,” she said.

Victims of this kind of exploitation often feel powerless and unsure of what they can do to stop the images from proliferating. Women who are vocal online worry about the same thing happening to them.

Omny Miranda Martone, the founder of the Washington-based Sexual Violence Prevention Association, had deepfake nude videos and images of themselves posted online a few years ago. As an advocate for legislation preventing digital sexual violence, Martone wasn’t exactly surprised to be a target.

“They also sent the deepfakes to my organization, in an attempt to silence me. I have seen this same tactic used on Twitter with Grok over the last week,” they said.

Martone said they’ve seen several instances of a woman sharing her opinion and men who disagree with her using Grok to create explicit images of her.

“In some cases, they are using these images to threaten the women with in-person sexual violence,” they added.

One of the most persistent beliefs about deepfakes depicting nudity is that because an image is “fake,” the harm is somehow less real. That assumption is wrong, said Rebecca A. Delfino, an associate professor of law who teaches generative AI and legal practice at Loyola Marymount University.

“These images can cause serious and lasting damage to a person’s reputation, safety, and psychological well-being,” she said. “What matters legally and morally is that a real person’s body and identity were used without consent to create a sexualized lie.”

Protections remain uneven and untested, and often come too late for victims, but Delfino said the law is slowly beginning to recognize that reality.

“Stories like what’s happening with Grok matter because public attention often drives the legal and regulatory responses that victims currently lack,” she said. “The law is finally starting to treat AI-generated nude images the same way it treats other forms of nonconsensual sexual exploitation.”

If you identify deepfake content of yourself, screen grab it and report it immediately.

“The most practical advice is to act quickly and methodically,” Delfino said. “Preserve evidence (screenshots, URLs, timestamps) before content is altered or removed. Report the image to platforms clearly as nonconsensual sexual content and continue to follow up.”

If you’re under 18 in a nude or nudified image, platforms should take that very seriously, Pfefferkorn said. Sexually explicit imagery of kids under 18 is illegal to create or share, and platforms are required to promptly remove such imagery when they learn of it and report it to the National Center for Missing & Exploited Children (NCMEC).

“Don’t be afraid to report a nude image to NCMEC that you took of yourself while you were underage: there is also a federal law saying you can’t be legally punished if you report it,” Pfefferkorn added.

And if a minor is involved, law enforcement should be contacted immediately.

“When possible, consulting with a lawyer early can help victims navigate both takedown efforts and potential civil remedies, even where the law is still evolving,” Delfino said.

The Take It Down Act, signed into law last May, is the first federal law that limits the use of AI in ways that can harm individuals. (Ironically enough, Grok gave someone insight about the Take It Down Act when asked about the legal consequences of digitally undressing someone.)

This legislation did two things, Martone said. First, it made it a criminal offense to knowingly publish AI-generated explicit videos and images without the consent of the person depicted. Second, it required social media sites, search engines, and other digital platforms to create “report and remove procedures” by May of 2026 — still a few months away.

“In other words, all digital platforms must have a way for users to report that someone has posted an explicit video or image of them, whether it was AI-generated or not,” they said. “The platform must remove reported images within 48 hours. If they fail to do so, they face penalties from the Federal Trade Commission (FTC).”

Pfefferkorn noted that the law allows the Department of Justice to prosecute only those who publish or threaten to publish NCII (non-consensual intimate images) of victims; it does not allow victims to sue.

As it’s written, the Take It Down Act only covers explicit images and videos, which must include “the uncovered genitals, pubic area, anus, or post-pubescent female nipple of an identifiable individual; or the display or transfer of bodily sexual fluids.”

“A lot of the images Grok is creating right now are suggestive, and certainly harmful, but not explicit,” Martone said. “Thus, the case could not be pursued in criminal court, nor would it be covered by the new report-and-remove procedure that will be created in May.”

There are also many state laws, which the nonprofit consumer advocacy organization Public Citizen tracks.

If this has happened to you, know it is not your fault and you are not alone, Martone said.

“I recommend immediately contacting a loved one. Ask them to come over or talk with you on the phone as you go through the process of finding the images and choosing how to take action,” they said.

Once you have a loved one helping you, reach out to your local rape crisis center, a victims’ rights attorney in your state, or an advocacy organization to help you identify your options and navigate these processes safely, Martone said.

“Because there are so many variations in state laws, a local professional will ensure you are receiving guidance that is accurate and applicable to your situation,” they said.

Read more on HuffPost

This news is powered by HuffPost.
