The future of influence: risk, data and the human-AI creator duo
Risk-adjusted pricing and governance are coming to the fore, and brands must clarify why they are engaging a creator. Also, as AI influencers are here to stay, collaboration with them must be disclosed
The influencer economy is no longer the Wild West it once was. As creator partnerships mature, brands are under pressure to balance innovation with governance, measuring risk while still unlocking creative value.
Three critical shifts are defining the next era of influence: adopting risk-adjusted models for partnerships, treating creators as ethical data partners and pairing human creators with AI responsibly. Together, the three point towards a more intentional, transparent and future-ready industry.
Influence has never been a flat-fee game. A creator’s ability to inspire or manipulate audiences carries real-world impact. Paying the same rate across categories, regardless of brand safety, audience volatility or reputational risk, is no longer sustainable. This is where risk-adjusted pricing comes into play. It’s a system in which fees and engagement terms reflect the true exposure a brand assumes.
Locally, the South African Content Creator Charter of the Interactive Advertising Bureau (IAB) is an important tool for setting expectations and managing risk. The charter commits creators and marketers to transparency, respect for privacy and compliance with local and global regulations. Brands can use the charter as a baseline, rewarding creators who are in alignment with it with lower risk premiums or longer-term partnerships. Those who do not adopt these standards should expect stricter controls and possibly higher rates to cover potential exposure.
Intentionality is crucial. Brands must clarify why they are engaging a creator, how that collaboration will be disclosed and what principles will guide it. If the intent is to genuinely inform or inspire audiences, risk decreases. If the intent is purely to push sales with little regard for ethics or transparency, risk increases and pricing should reflect this.
As third-party cookies disappear and privacy regulations tighten, creators are becoming critical allies for building first-party data strategies. Rather than treating creators solely as media channels, forward-thinking brands are treating them as consent gateways, helping audiences opt in to value exchanges.
This means co-creating campaigns in which the benefit to the audience is clear: exclusive drops, gated experiences, giveaways or loyalty rewards that require an informed opt-in. To do this responsibly, marketers should implement consent logs aligned with South Africa’s Protection of Personal Information Act (Popia) and the EU’s General Data Protection Regulation (GDPR), as well as other global privacy standards, ensuring that every data point collected is privacy-safe and auditable.
Creators who can communicate these exchanges clearly and ethically will become indispensable partners. They not only help brands collect compliant data but also build communities rooted in trust, where audiences willingly share information because they see value.
Recent years have seen the introduction of AI influencers, and while they are here to stay, they don’t replace humans. Instead, the most exciting opportunity lies in human-AI collaborations that blend efficiency, creativity and authenticity. For example, a human creator might use generative AI to storyboard campaign ideas, create virtual backdrops or co-write scripts while the human is still the face and voice of the content. Or a human could appear side-by-side with an AI influencer in a campaign.
This future requires robust governance. Contracts must define ownership of AI-assisted assets, likeness rights and usage permissions. Disclosure must go beyond #ad or #spon hashtags to include AI involvement, ensuring audiences know when synthetic elements are part of what they are seeing.
The IAB charter already sets a high bar for transparency and disclosure. The next step is creating a guideline for the correct and incorrect ways to disclose partnerships, paid content and AI involvement. The goal is to establish a long-term standard that protects audiences while providing confidence to brands and creators.
With so much at risk, clear governance is a legal imperative. Popia, GDPR and the EU Artificial Intelligence Act, which came into force in August 2024, are shaping how data, automation and transparency are regulated. Most importantly, AI cannot be sued or held liable under current law, which means the ultimate responsibility falls on the human creator or brand commissioning the work.
This is why disclosure, consent logs and clear contracts are essential. They create a paper trail that proves compliance and intention, two factors that regulators increasingly weigh when investigating advertising ethics or data breaches.
The future of influence will be defined by responsibility and intention. Brands, agencies and creators must collaborate to build an ecosystem in which risk is acknowledged and priced fairly, data is collected ethically and AI is used transparently. The IAB charter and forthcoming disclosure guidelines offer a strong foundation. Those who adopt these frameworks early will not only protect themselves from risk but also earn the trust of their audiences, building influence that lasts.
Casey Mantle is the chair of the IAB South Africa influencer marketing committee. To learn more about the IAB South African Content Creator Charter, visit iabsa.net
The big take-out: The IAB South African Content Creator Charter offers a strong foundation for brands, agencies and creators to build an ecosystem in which risk is acknowledged and priced fairly, data is collected ethically and AI is used transparently.

