Jill Murphy has worked for Common Sense Media for 20 years, almost since the organization first started rating movies, television shows, and other media. She has spent that time advocating for child-safe technologies to help parents and teachers protect children from what they might see on screens.
But she could not protect her 15-year-old daughter on Wednesday when a classmate sitting next to her in chemistry class pulled out his phone and watched a video of the shooting death of conservative activist Charlie Kirk.
“She just can’t get over it,” Ms. Murphy says. “She’s saying, ‘What about his family? Where were his kids? I saw it from two different angles.’ She’s like, ‘I didn’t want to see it, but then I also couldn’t look away from it because I just couldn’t believe what I was seeing.’”
Why We Wrote This
Scenes and images of real-life violence spread rapidly on social media, where many Americans, including children, encounter them. This week’s gruesome footage of the deaths of Charlie Kirk and a Ukrainian refugee has renewed debate about safeguards for online content.
Ms. Murphy sighs. “I mean, look at all those layers. How do you unpack that?”
It’s a question that Americans across the country – and across political differences – are asking today after a week of gruesomely violent acts filled social media feeds. Images of murder appeared uninvited in schools and bedrooms, breakfast tables and playgrounds – Mr. Kirk’s shooting, caught on camera as he spoke to students at Utah Valley University, and the newly released video of Ukrainian refugee Iryna Zarutska being stabbed to death in August as she sat scrolling on her phone on a train in Charlotte, North Carolina.
For years, those worried about violence on social media have argued that technology companies should build more safeguards around the content their algorithms push to people on the other side of the screen, especially when those people are children. This week is a reminder that it’s an issue faced by adults as well.
In June 2024, then-U.S. Surgeon General Vivek H. Murthy called for warning labels on social media platforms, citing research showing that those sites can dramatically increase anxiety and depression symptoms among young people. In Utah, where Mr. Kirk’s assassination took place, lawmakers in 2023 passed legislation regulating children’s access to social media – policies that were challenged, then waylaid, in court.
The episodes of violence have put a new focus on that question. Social media companies have long argued that they are not responsible for the content uploaded onto their sites. But as Johannes Thrul, an associate professor in the Department of Mental Health at the Johns Hopkins Bloomberg School of Public Health, says, they are responsible for the way their technology distributes and amplifies content to others.
“That’s really where we have to start holding companies accountable,” Dr. Thrul says. “For what their recommender systems are providing not just to kids, to teenagers, but to everyone. Because you could argue that this is not good content to recommend to anyone.”
Social media firms say they do have mechanisms to block content that violates their company standards, such as extreme gore and graphic violence. But those sorts of images are still regularly distributed and absorbed. And that matters.
Research in 2019, for instance, found that people who watched even part of a beheading video created by the Islamic State were more likely to fear future negative events – two years after the videos went viral.
Algorithms feeding violent content
Just past midnight Wednesday, Keri Rodrigues practiced taking deep breaths with her 12-year-old son.
He couldn’t sleep, she says – not after watching a social media video of Mr. Kirk’s death.
“I wouldn’t allow my child to go to a movie theater in which they were showing this kind of scene,” says the Boston mother of five sons. “Yet he comes out of basketball practice, very innocently, and the algorithm is putting it in front of him immediately.”
It wasn’t the first time she’s helped her son process horrifying images online, says Ms. Rodrigues, the founding president of the National Parents Union. This advocacy organization has been pushing for the Kids Online Safety Act, federal legislation that would implement new safeguards and a “duty of care” standard for social media platforms.
Last year, the Youth Endowment Fund in Britain released a study showing that 70% of teenagers had come across real-life violence on social media within the past year. Only 6% had searched for that content, while 25% had been fed the content through algorithmic recommendations such as newsfeeds, stories, and “for you” sections.
The study found that TikTok was the platform where teenagers were most likely to encounter real-life violence. On Thursday, the social media platform said in a statement that it was working to keep users from seeing the videos of Mr. Kirk’s murder.
“The horrific, violent acts have no place in our society,” the statement read. “We remain committed to proactively enforcing our Community Guidelines and have implemented additional safeguards to prevent people from unexpectedly viewing footage that violates our rules.”
Meta, the parent company of Facebook, also said that it would be applying warning screens to footage of the shooting and would restrict it to users 18 and older, according to spokesperson Francis Brennan.
Conversations about navigating technology
For Ms. Murphy, the rise of social media has been a game changer in her work helping parents and teachers navigate technology.
“We used to do this work around protecting and managing the world for your kids – it was like ‘turn off the TV, don’t have it on for everybody,’” she says. “That’s the kind of thing that isn’t a viable option anymore. Even for our younger kids, there isn’t an opportunity to really shut it down.”
One child might not be allowed to have a phone or social media accounts – but someone else on the playground might. And while there’s nothing new about young people sneaking looks at the taboo, the way disturbing content now appears without invitation adds a new layer.
This requires new types of conversations.
E. Alison Holman, professor of nursing and psychology at the University of California, Irvine, says the more people engage with graphic media, the more negatively it can affect them. Everyone, children and adults alike, should be “thoughtful and mindful about how they use media.”
Maria Gregori has had this conversation with her children.
Though the Los Angeles mother doesn’t spend much time on social media, her two teenage boys do. It was her son who initiated a discussion about Mr. Kirk’s shooting. It was a wake-up call, she says, that she needs to ask them uncomfortable questions about what they’re seeing online.
“We want to be aware,” she says. “What they’re looking at is what they’re thinking about, and what they’re thinking about is going to affect their wellbeing.”
She was taken aback at how normal it seemed for her son to be taking in this sort of violence.
“Yes, he was shocked and concerned,” she says. “But it’s another day in the life of an American teenager getting that information.”
What gives her hope, she says, is that people’s natural pull toward happiness will turn them away from images that sow sadness and anger.
“This is the kind of stuff that inspires people to take action,” says Ms. Murphy, from Common Sense Media. “There are places to put your effort and your energy and turn it into action, instead of just feeling like everything is messed up. This is important for all families to think about, no matter your political view.”