MarketAlert – Real-Time Market & Crypto News, Analysis & Alerts
Global Regulations

AI can detect criminal plotting but alerting police remains privacy dilemma

Last updated: February 26, 2026 5:55 pm
Published: 18 minutes ago

Canada’s Tumbler Ridge shooter reportedly used ChatGPT to plot violence, but OpenAI did not alert authorities despite flagging the danger

Artificial intelligence (AI) platforms are working to detect users who attempt to exploit them for criminal purposes, shutting down accounts and alerting local law enforcement when deemed necessary.

People use large language models (LLMs), such as ChatGPT, Google Gemini, DeepSeek, Perplexity, Grok, Microsoft Copilot, and Claude, for everything these days, ranging from general research to health inquiries.

These powerful models can be exploited to plot physical attacks or even manufacture weapons and ammunition.

Tech companies are training their models to automatically reject malicious prompts, but deciding when to notify authorities before a crime is committed remains a complex question of data privacy and internal policy.

The privacy dilemma of AI chats being used to alert law enforcement came to the forefront following a recent mass shooting in Canada.

The Wall Street Journal recently reported that an armed attack at a home and a school in the Canadian district of Tumbler Ridge in northeastern British Columbia on Feb. 10 left 10 people dead, including the attacker, and 27 injured.

Months before the attack, Jesse Van Rootselaar, the 18-year-old shooter, reportedly used ChatGPT for criminal plotting, spending several days last June describing scenarios involving armed violence to the chatbot.

While an automated review system flagged these conversations and forwarded them to OpenAI employees, leading to the closure of the account, OpenAI said the activity did not meet its criteria for an imminent threat and ultimately did not alert law enforcement.

This incident triggered a global debate over data privacy and whether AI platforms should serve as early warning systems.

Companies developing these chatbots implement multi-layered security protocols to prevent criminal use.

For instance, Google’s Gemini automatically rejects chat prompts requesting instructions to manufacture weapons, synthesize illegal substances, or plot acts of physical violence.

Human experts and independent reviewers regularly examine the risky conversations flagged by these automated systems.

According to the company’s policy, Google may share data with authorities if it identifies an imminent and serious risk of physical harm, such as bomb threats, school shootings, suicide, or kidnappings.
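The layered workflow the article describes, automatic refusal, human review of flagged chats, and escalation to authorities only for imminent threats, can be sketched in simplified form. All names and keyword triggers below are hypothetical illustrations, standing in for the proprietary classifiers that vendors such as OpenAI and Google actually train; this is not any company’s real API.

```python
# Hypothetical sketch of a multi-layered moderation pipeline.
# Keyword matching stands in for the trained risk classifiers real vendors use.
from dataclasses import dataclass
from enum import Enum


class RiskLevel(Enum):
    SAFE = 0          # respond normally
    REFUSE = 1        # automatic refusal (e.g., weapons instructions)
    HUMAN_REVIEW = 2  # flagged and forwarded to expert reviewers
    ESCALATE = 3      # imminent, serious risk: may be shared with authorities


@dataclass
class Decision:
    level: RiskLevel
    reason: str


# Illustrative trigger lists only.
REFUSAL_TERMS = ("manufacture weapons", "synthesize illegal")
IMMINENT_TERMS = ("bomb threat", "school shooting", "kidnapping")


def review_prompt(prompt: str) -> Decision:
    """Classify a prompt through the layers in order of severity."""
    text = prompt.lower()
    if any(term in text for term in IMMINENT_TERMS):
        return Decision(RiskLevel.ESCALATE, "imminent risk of physical harm")
    if any(term in text for term in REFUSAL_TERMS):
        return Decision(RiskLevel.REFUSE, "request for prohibited instructions")
    if "violence" in text:
        return Decision(RiskLevel.HUMAN_REVIEW, "flagged for expert review")
    return Decision(RiskLevel.SAFE, "no policy match")
```

The ordering matters: the most severe tier is checked first, so a prompt that both requests weapons instructions and signals an imminent attack escalates rather than merely being refused, mirroring the policy distinction the article draws between routine refusals and law-enforcement notification.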

Global regulations generally cover data sharing by tech firms only after a crime has already been committed, which leaves pre-crime notifications largely at their discretion.

Read more on Anadolu Ajansı

This news is powered by Anadolu Ajansı.

© Market Alert News. All Rights Reserved.