
Child fatality task force makes recommendations to combat social media addiction
North Carolina experts tasked with lowering the number of child deaths in the state are eyeing two high-tech targets: social media and artificial intelligence.
The North Carolina Child Fatality Task Force’s Intentional Death Prevention Committee met Wednesday to discuss its concerns over the algorithms used by social media and AI companies, which members said are fueling mental health problems. In the past decade, nearly 500 North Carolina children have died by suicide.
The meeting comes on the heels of a congressional hearing where parents alleged that AI chatbots coached their children into taking their own lives. It also comes as the state seeks to regulate teens’ exposure to social media.
North Carolina Gov. Josh Stein signed a bill this summer requiring schools to ban cell phones during class and to teach lessons on social media literacy. Beginning in the 2026-27 school year, school boards across the state must create social media literacy lessons that teach students about social media addiction, misinformation and manipulation.
One of Wednesday’s presentations focused on AI chatbots and the growing number of teenagers turning to them for emotional support, only to have their secrets sold to advertisers, who can then use that information about their hopes, fears and insecurities to target ads at them.
Committee co-chair Whitney Bellich called the report “terrifying information.” But Celeste Campos-Castillo, a Michigan State University professor who studies teen use of chatbots, said the AI tools can be helpful. It’s only when teens become too reliant on or obsessed with them, something critics fear the companies’ algorithms are intentionally designed to encourage, that the picture becomes more troubling.
“Nearly three quarters of teens are using chatbots,” Campos-Castillo said Wednesday. “Chatbots are here, and teens are using them.”
Her interviews with teens, she said, show that even those who use chatbots for emotional support and validation often tell her they’d prefer to have a real person to confide in. So she suggested that schools do more to hire counselors, psychologists and nurses so that teens have more options.
North Carolina lags far behind nationally recommended staffing levels for those professionals in local schools.
Task force members said Wednesday that the data show children who die by suicide are mostly boys, most of whom have not previously been treated for mental health issues, and that the trigger typically involves something at school.
“Teens like the idea of using a chat bot for emotional support,” Campos-Castillo said. “But really, they would much rather seek support from humans who are close by, like the school counselor or school nurse — meaning someone they could see in person if they wanted to. And so to reduce the risk of harms, including suicide, I’m going to suggest in addition to regulating chat bot design, that we do what teens want, which is improve their access to local resources like humans who can help them.”
Proposed regulations on social media and AI have so far struggled to gain support in the state legislature, where Republican leaders tend to favor deregulation.
Members of the Child Fatality Task Force, as well as outside advocates, noted that state legislators have already filed multiple bills this session aimed at cracking down on social media addiction, AI chatbots and other issues they believe are harming mental health. None of those bills became law, and some never even received a committee hearing.
But Rep. Donna White, R-Johnston, said Wednesday she will keep pushing for these ideas to get more attention. White chairs the House Health Committee, which handles bills related to health issues including mental health. She specifically praised Ava Smithing, a 24-year-old activist pushing for stronger controls on social media, who gave a presentation to the task force.
“I think she’s the smartest one in the room, and anything that I can do to help move this, Ava, I would love to talk with you,” White said.
Smithing told the task force she wants North Carolina to put in place stronger regulations on social media companies to crack down on the collection and sale of people’s personal data, particularly children’s. She said she developed mental health issues after she began using social media at age 12, which she blamed on algorithms that kept pushing images of models onto her feeds and exacerbated her anxiety around body image.
“Their ultimate goal is to keep you on the app for as long as possible,” Smithing said. “And they had discovered that the best way to do this was by triggering people’s fight-or-flight. Or showing them images and photos that, in a way, triggered their ‘human negativity bias’ — which is to look at negative information and observe that information for longer.”
Smithing said the legislature should pass Senate Bill 514, a proposal she worked with members of the Republican-led state legislature to file earlier this spring. It would enact stricter privacy protections for social media users, including a ban on social media companies selling children’s data to advertisers. The bill had bipartisan sponsors, including some influential Republicans, but has gone nowhere.
Social media companies lobby heavily in state legislatures as well as in Congress, and they have opposed some efforts to regulate them. They have criticized some proposals as unconstitutional violations of the First Amendment and others as big-government overreach.
A trade group backed by Meta, Google, Amazon, Pinterest, Reddit and other big tech companies says lawmakers should invest more money in law enforcement efforts to find child predators online, but it contends that efforts to stop children from signing up for social media accounts, such as age verification requirements, are unconstitutional.
“While parents and guardians try to adapt their families to emerging digital tools, they’ve consistently expressed to both technology services and lawmakers that they’d like help,” the group, NetChoice, says on its website. “However, some policymakers across the U.S. have responded with laws that violate the core, protected right of Americans to free expression.”
The state attorney general’s office has also targeted social media companies.
In 2023, when Stein was attorney general, North Carolina was one of 33 states that sued Meta Platforms Inc., alleging that the company had contributed to a youth mental health crisis by knowingly designing features on Instagram and Facebook that addict children to its platforms.
The company said in a statement that year that it shares “the attorneys general’s commitment to providing teens with safe, positive experiences online,” and noted that it has introduced more than 30 tools to support teens and their families. It also expressed an openness to working with officials to create standards for apps used by teens.
This year, North Carolina Attorney General Jeff Jackson signed onto a letter to Instagram urging the company to make changes to its real-time location map feature, which allows users to share their location with others. Jackson and other attorneys general said the feature could increase the risk of stalking, data collection and child exploitation.
An Instagram spokesperson previously told WRAL that Instagram Map’s design addresses the concerns raised by the attorneys general, saying “it is off by default” and adding: “Everyone receives a notification explaining what the feature entails and can turn it off whenever they want, and with parental supervision, parents get a notification if their teen starts using it and can block their access to location sharing at any time.”
Jackson also got a key legal win earlier this year, WRAL reported, when a judge blocked TikTok’s effort to keep evidence confidential in a similar lawsuit North Carolina is pursuing against that company. After that ruling, one of the pieces of evidence that became public was a recording of internal video chats between TikTok employees appearing to acknowledge that the company’s app and the algorithms behind it could be harmful to young people’s mental health.
TikTok characterized the video as misleading, saying its employees were trying to discuss ways to make the app safer.

