MarketAlert – Real-Time Market & Crypto News, Analysis & Alerts
Smart Contracts

Content Tokenization Could be the Next Biggest AI Trend – Here’s Why

Last updated: June 26, 2025 4:40 am

Blockchain and decentralized autonomous organizations (DAOs) could transform content licensing, ensuring fairness, traceability, and collective decision-making.

Leading media organizations are increasingly signing licensing agreements with AI giants. For newspapers like The New York Times, such a deal safeguards their intellectual property and provides an additional revenue stream.

Meanwhile, companies like OpenAI and Amazon can train their models on accurate information and avoid lawsuits over copyright infringement. However, experts from IoTeX Network, O.XYZ, and AR.IO told BeInCrypto that existing decentralized alternatives could more transparently and equitably achieve the same results for content creators.

In a move that drew considerable attention, The New York Times signed a deal with Amazon earlier this month, allowing Amazon to use its editorial content to train the tech company’s artificial intelligence (AI) models.

The licensing agreement between The New York Times and Amazon allows the tech company to use articles from the newspaper and its other publications. However, the newspaper’s public announcement about the deal did not reveal the financial terms.

This decision marks a public change in strategy for The New York Times, which had previously opposed large language models (LLMs) using its content without permission.

In January 2024, the newspaper sued OpenAI and Microsoft for copyright infringement, claiming the companies had used its copyrighted articles to train their LLMs without permission or compensation. That lawsuit is still ongoing.

The New York Times is not the first media organization to sue a technology company over unfair use of its intellectual property.

“In recent years, many big tech projects have encountered numerous legal challenges and fines. For example, Google has faced over €8 billion in fines from the EU in the past decade due to poor data practices,” Ahmad Shadid, CEO of O.XYZ, told BeInCrypto.

As the creators of leading LLMs seek broader access to accurate information, such deals are becoming increasingly common.

Last year, OpenAI, led by Sam Altman, signed an agreement with the European multinational media company Axel Springer SE. The deal closely mirrored the one recently made between The New York Times and Amazon.

The agreement allows OpenAI to use articles from media organizations owned by Axel Springer, including Politico, Business Insider, and Morning Brew, among other top international publications.

Altman later signed similar agreements with the Financial Times, Vogue, and the parent companies of outlets like The New Yorker, Cosmopolitan, and Le Monde, to name a few. OpenAI agreed to backlink all relevant information to the original articles as part of these deals.

As major technology companies face increasing pressure over intellectual property violations and copyright infringement, these agreements are a win-win for all parties involved.

“After lawsuits like the one The New York Times filed, AI companies are being more cautious about what they train on. Licensing deals offer peace of mind, and for publishers, it’s a chance to turn decades of archived content into steady income. At the same time, AI companies benefit from exclusive access to trusted sources, which helps improve the quality of their models,” Aaron Basi, Head of Product at IoTeX Network, explained.

But, is there a better way to achieve the same results with greater transparency?

It is becoming increasingly urgent to find a solution that broadens access to trustworthy information when interacting with AI and fairly compensates its creators. Licensing agreements offer one path to this goal.

“There is huge strategic value. These deals can include better visibility, like being featured in AI-generated answers or summaries. There’s also access to analytics showing how content is being used or interacted with,” Basi said.

Verified training data also goes a long way toward preventing misinformation in LLM outputs.

“Training AI without verified, transparent data is like flying blind. If we can’t trace what went in, we can’t trust what comes out. This is how we end up with silent failures crafted by brittle AI models that lack long-term consideration,” Phil Mataras, founder of AR.IO, told BeInCrypto.

However, these licensing agreements are often private, making it difficult for smaller content creators to secure similar deals or protect themselves from cases of unfair use. Decentralization has the potential to level the playing field here.

“Closed models win short-term sprints. Decentralized models win the marathon. Trust reigns supreme alongside transparency and auditability,” Mataras added.

Web3 offers several tools that can make this possible.

Decentralized technologies can create a more democratic and transparent system for all creators to license their content. This is especially beneficial for those often overlooked in traditional private agreements.

“Instead of cutting individual licensing deals behind closed doors, creators can upload content to a decentralized network. Smart contracts can enforce terms and automatically handle payments. This makes it easier for independent creators or smaller organizations to participate. It also creates more transparency around who’s using the data and how,” Basi explained.
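
The flow Basi describes could be sketched as a toy registry with smart-contract-style rules. This is a minimal, hypothetical illustration in plain Python, not any specific blockchain or standard; all names (`LicensingRegistry`, `price_per_use`, and so on) are made up for the example:

```python
from dataclasses import dataclass, field

@dataclass
class License:
    creator: str
    price_per_use: int   # smallest currency unit, e.g. cents
    allowed_use: str     # e.g. "ai-training"

@dataclass
class LicensingRegistry:
    licenses: dict = field(default_factory=dict)   # content_id -> License
    balances: dict = field(default_factory=dict)   # creator -> earnings
    usage_log: list = field(default_factory=list)  # public audit trail

    def register(self, content_id, creator, price_per_use, allowed_use):
        # A creator publishes their terms once, openly
        self.licenses[content_id] = License(creator, price_per_use, allowed_use)

    def use(self, user, content_id, purpose, payment):
        lic = self.licenses[content_id]
        # The "contract" enforces the terms automatically
        if purpose != lic.allowed_use:
            raise PermissionError("purpose not covered by licence terms")
        if payment < lic.price_per_use:
            raise ValueError("insufficient payment")
        # Payment is credited to the creator, and the usage is logged
        # so anyone can see who used the content and how
        self.balances[lic.creator] = self.balances.get(lic.creator, 0) + payment
        self.usage_log.append((user, content_id, purpose))
```

On a real chain the registry state and log would live on a public ledger rather than in memory, but the point survives the simplification: the terms, the payment, and the audit trail are all handled by code instead of a private contract.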

Tokenization also offers creators a method to track the active use of their content by AI models.

“Tokenizing content could give publishers more control and better tracking. For example, they could set rules around access or usage and get paid automatically through smart contracts. It’s still early, but for digital-first media companies, this kind of setup might offer new ways to earn revenue without giving up control,” Basi added.

Other blockchain-based solutions can provide tamper-proof record-keeping to strengthen these decentralized options further.

Another vital aspect of a truly equitable digital ecosystem involves ensuring authenticity, tracking usage, and protecting intellectual property. This is where blockchain-based provenance systems emerge as powerful solutions.

Blockchain-based provenance systems are designed to record the history and lineage of digital content meticulously. They leverage blockchain’s core features — its traceability, transparency, and immutability — to create trustworthy and tamper-proof records.

Every significant event in a content’s lifecycle, from its creation to any changes or transfers, can be logged on a distributed ledger, creating an unbreakable record of its history.
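
The core mechanism is simple enough to sketch: each lifecycle event carries the hash of the previous one, so changing any past record invalidates everything after it. Below is a minimal, hypothetical hash-chain in Python, not any production provenance system:

```python
import hashlib
import json

def _hash(record: dict) -> str:
    # Deterministic hash of a record (sorted keys for stable JSON)
    return hashlib.sha256(json.dumps(record, sort_keys=True).encode()).hexdigest()

class ProvenanceLog:
    """Append-only, hash-chained record of a content item's lifecycle."""

    def __init__(self):
        self.events = []

    def append(self, content_id, event, actor):
        # Each new event commits to the hash of the previous one
        prev = self.events[-1]["hash"] if self.events else "0" * 64
        record = {"content_id": content_id, "event": event,
                  "actor": actor, "prev": prev}
        record["hash"] = _hash(record)
        self.events.append(record)

    def verify(self) -> bool:
        # Re-walk the chain; any tampering breaks a link
        prev = "0" * 64
        for rec in self.events:
            body = {k: v for k, v in rec.items() if k != "hash"}
            if rec["prev"] != prev or _hash(body) != rec["hash"]:
                return False
            prev = rec["hash"]
        return True
```

A blockchain adds replication and consensus on top of this, so no single party can silently rewrite the log, but the tamper-evidence itself comes from the chained hashes.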

“Provenance systems have been very helpful in the tech industry. Having to depict, precisely, the history of a dataset being utilized or transferred. It helps to dictate the initial owner, who it was sold to, how it was sold, when, and the current holder of that dataset. Blockchain systems already have permanent storage mechanisms — they provide rigidity when it comes to data ownership,” Shadid told BeInCrypto.

Building on this foundation of verifiable history, watermarking tools complement provenance systems by embedding hidden, identifiable information directly into digital content.

“Watermarking tools play a key role in preventing copyright infringement, data theft, and wrongful claims of ownership... These techniques bring a tougher game for the data thieves and hackers in order to provide data integrity, fairness, and ethics,” Shadid added.
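
The basic idea of an invisible watermark can be shown with a classic least-significant-bit scheme: an owner ID is hidden in the lowest bit of each byte of a media buffer, where it is imperceptible but recoverable. This is a deliberately simplified sketch, not a robust production technique (real watermarks must survive compression and editing):

```python
def embed_watermark(data: bytes, mark: bytes) -> bytearray:
    """Hide `mark` in the least-significant bit of each byte of `data`."""
    # Unpack the mark into individual bits, most-significant bit first
    bits = [(byte >> i) & 1 for byte in mark for i in range(7, -1, -1)]
    if len(bits) > len(data):
        raise ValueError("carrier too small for watermark")
    out = bytearray(data)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & 0xFE) | bit   # overwrite only the lowest bit
    return out

def extract_watermark(data: bytes, n_bytes: int) -> bytes:
    """Recover an n-byte mark from the low bits of `data`."""
    bits = [b & 1 for b in data[: n_bytes * 8]]
    return bytes(
        sum(bit << (7 - i) for i, bit in enumerate(bits[k * 8:(k + 1) * 8]))
        for k in range(n_bytes)
    )
```

Because only the lowest bit of each carrier byte changes, the perceptible content is essentially unchanged, yet anyone who knows the scheme can prove who the embedded owner is.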

The principles of decentralization could also be extended to the collective governance and management of content.

Instead of individual creators or the leadership of large media organizations solely making content licensing decisions, decentralized autonomous organizations (DAOs) could empower collectives of creators, such as journalists, to take control of decision-making collaboratively.

“A group of creators could pool their work and use a DAO to manage licensing, payments, and governance. This approach gives independent voices a seat at the table when dealing with large AI firms. It also makes it easier to negotiate fair terms and ensures that decisions are made collectively. It’s like a union, but designed for the digital age,” Basi explained.
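
The governance piece Basi describes reduces to token-weighted voting. As a rough, hypothetical sketch (real DAOs add proposals, quorums, and on-chain execution), a collective decision on a licensing offer might look like this:

```python
class CreatorDAO:
    """Toy token-weighted vote on whether to accept a licensing offer."""

    def __init__(self, members: dict):
        self.members = members   # member name -> voting tokens held
        self.votes = {}          # member name -> True (accept) / False (reject)

    def vote(self, member, approve: bool):
        if member not in self.members:
            raise KeyError("not a DAO member")
        self.votes[member] = approve

    def tally(self) -> bool:
        # Votes are weighted by token holdings; simple majority decides
        yes = sum(self.members[m] for m, v in self.votes.items() if v)
        no = sum(self.members[m] for m, v in self.votes.items() if not v)
        return yes > no
```

In practice the vote outcome would trigger the licensing contract automatically, so the collective's decision and its enforcement are part of the same transparent system.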

Despite the focus on transparency, licensing agreements between AI models and information sources are still in their early stages. This raises a critical question: Will open-source models lag as AI companies secure exclusive data deals?

The unauthorized, opaque use of content by LLMs initially sparked significant discontent among creators. Licensing agreements have since improved the situation.

However, full transparency has yet to be achieved. Deals like the one struck between The New York Times and Amazon will not be enough for users who want to know where AI models get their data, or for creators who want to understand how their content is being used.

Echoing Mataras’ point that decentralized models win the marathon, Basi added:

“Transparency is a powerful advantage. People want to understand what goes into the tools they use, especially in sensitive fields like health or education. Open-source projects can adapt quickly, get help from the community, and build trust through openness. In the long run, that trust might matter more than access to a few exclusive datasets.”

Though licensing deals are a good starting point, the real transformation for content creators and AI transparency will likely stem from decentralized and open-source approaches.

This news is powered by BeInCrypto.

© Market Alert News. All Rights Reserved.