Age-Appropriate Design Code and Kids’ Social Media Laws: An Overview

John Lovett, Senior Manager of Media Research and Data Analytics, Pinkston

Overview

  • Increasingly, federal and state legislators are grappling with how to address teen and child use of social media.

  • Lawmakers are proposing and debating measures such as regulating social media design for children, requiring age verification, and restricting algorithmic feeds.

  • While many agree that action is necessary to protect adolescents online, questions remain over the constitutionality of certain measures and their implications for adults.

  • Agencies and companies will need to understand how their content and tactics might affect kids – even inadvertently – and make changes based on legislative activity.



On June 17, 2024, U.S. Surgeon General Vivek Murthy, writing in The New York Times, called for Congress to require warning labels on social media. Murthy argued that a warning label was necessary due to mounting data about mental health impacts on youths, noting,

“Adolescents who spend more than three hours a day on social media face double the risk of anxiety and depression symptoms, and the average daily use in this age group, as of the summer of 2023, was 4.8 hours. Additionally, nearly half of adolescents say social media makes them feel worse about their bodies.”

Murthy’s call came just days before Governor Kathy Hochul of New York signed the state’s own kids’ social media bill, which is designed to end unmonitored teen scrolling on social media apps. Meanwhile, California has considered a variety of measures for online content, from a recent age-appropriate design code law for social media to Governor Gavin Newsom’s call for restrictions on cell phone access in schools. Across the United States, both state governments and the federal government continue to focus on regulating social media, for kids in particular.

Central to all this is the effect social media has on children. The potential physical and mental health risks to adolescents include posts that can adversely affect their mental health, design features that may reinforce addictive behavior, cyberbullying from peers and anonymous individuals, access to illicit substances, and sexual exploitation by bad actors and, potentially, by algorithms. While there is broad consensus on protecting adolescents and teens, the debate centers on exactly how to address these risks from a policy perspective.

In this overview, we examine some of the major initiatives in regulating social media for kids, including California’s age-appropriate design code law, similar state-level laws, and federal legislation such as the Kids Online Safety Act (KOSA).

The State of Kids’ Online Media

Social media has technically existed since the early days of the consumer internet: users of services like Prodigy, America Online (AOL), and CompuServe could easily chat with people worldwide. While many of those early services receded over time, newer platforms have become widely adopted, especially by children, including Facebook, X (formerly known as Twitter), Snapchat, TikTok, YouTube, and Instagram.

Social media was originally designed for adults; Facebook, for example, required a college email address at its inception. Separately, Congress passed and the Clinton administration signed the Children’s Online Privacy Protection Act (COPPA) in 1998, which required parental consent before collecting personal information online from children under 13, leading social media sites generally to bar users under that age. Over time, however, social media has become a major part of young people’s lives. Whether sharing pictures on Instagram, messages on Snapchat, or videos on TikTok, teenagers have increasingly used the sites to interact and receive information. While those interactions can be positive, they can also include cyberbullying; content glamorizing eating disorders, self-harm, or suicide; and sexual exploitation through images, video, messaging, and other adult content.

Even sites explicitly designed for children have not been immune to these issues: YouTube Kids, a version of YouTube meant to feature content exclusively for kids, reportedly showed children videos about losing weight and suicide, as well as videos depicting pregnancy and death involving popular kids’ cartoon characters like the Paw Patrol pups and Elsa from "Frozen."

Given this landscape, parents want to see changes in how kids access and engage with social media. The question of how, however, has led to battles between groups who want the government to regulate access more tightly and those who worry that mandated regulation could have broader ramifications for privacy or free speech.

State-Level Regulation: California’s Age-Appropriate Design Code and Similar State Bills/Laws

States have been among the earliest actors on kids’ access to social media, with California passing the California Age-Appropriate Design Code Act in 2022. The act was modeled on the United Kingdom’s design code law and required online services to account for potential pitfalls for children under 18, with fines for violations. California’s law is currently stalled pending litigation from opposing groups.

California is not the only state to consider legislation on kids’ access to social media, however. According to the National Conference of State Legislatures, 35 states and Puerto Rico saw some form of social media legislation introduced, and 12 states enacted laws, including Utah’s age-verification law requiring social media websites to verify the ages of individuals setting up accounts and Montana’s ban on TikTok. In 2024, legislators continued to pursue bills, with Maryland passing its own version of California’s design code act. All of these laws currently face legal challenges from opposing groups.

Federal Legislation: The Kids Online Safety Act

The federal government has also considered legislation on kids and online safety. The Kids Online Safety Act (KOSA), introduced by Senators Richard Blumenthal and Marsha Blackburn in the 117th and 118th Congresses, would require social media sites to change their designs and allow users to opt out of algorithm-based features like curated scrolling, among other measures, as well as require platforms to offer tools that let parents monitor what their children are seeing.

As of this writing, the legislation has 70 cosponsors in the Senate but may not move forward due to competing priorities and opposition from individual senators.

In addition, an updated version of COPPA, the Children and Teens Online Privacy Protection Act (known as COPPA 2.0), has been introduced in the most recent Congress; it would ban targeted advertising to anyone 16 and under.

Arguments For and Against Currently Enacted and Considered Legislation

The actions in California, in other state legislatures, and at the federal level have been met with both support and opposition from a number of groups, with questions focusing on constitutionality and rights.

Supporters of legislation include a mix of groups: parents’ organizations worried about what children are seeing online, such as ParentsSOS and Fairplay, and groups more broadly critical of tech companies’ motivations, such as the Tech Oversight Project and the Tech Transparency Project. These groups argue that tech companies play an outsized role in the lives of children and that kids need protection against algorithms and other features that could harm them or lead them to suicide.

Many parents and parents’ groups have also come out in favor of legislation, especially with regard to limiting access to algorithmic feeds. For example, a recent op-ed in The Hill describes how a father lost his 12-year-old son after the boy, scrolling on TikTok, found the “Blackout Challenge” and attempted it. The parent notes the addictive nature of social media, which has been lucrative for companies to the tune of $11 billion in revenue in 2022.

Opponents of the legislation (especially KOSA) include civil rights and civil liberties groups like the ACLU and the Electronic Frontier Foundation, as well as tech-aligned groups like NetChoice and the Chamber of Progress. Their arguments vary, but one common theme is that age-appropriate design code laws violate the First Amendment and hinder free speech and expression. As NetChoice argued regarding KOSA, “The bill sets dangerous precedent that would grant the government greater power over free speech online and would grant partisan bureaucrats authority over what speech is ‘appropriate’ for American families to see and hear.” The Electronic Frontier Foundation likewise voiced concerns over First Amendment protections and access, as well as whether the bill would actually be effective in helping children.

Meanwhile, others have argued that the legislation would give too much power to state attorneys general, who could use that power to restrict information for LGBTQ people. As one coalition letter led by the Chamber of Progress notes,

“KOSA specifically targets platforms for hosting online content that allows for the “exploitation” and “abuse” of minors. Unfortunately, extremist lawmakers and media personalities have employed a smear campaign adopting this same terminology to attack supporters and members of the LGBTQ community.”

What You Should Know

Social media laws related to children will likely affect the entire social media ecosystem, from content warnings, as in the Surgeon General’s recommendation, to access restrictions under design code and age-verification laws. In light of this, companies and individuals involved in social media campaigns (organic or paid) should understand how their content could influence children and the ramifications of it reaching this audience. Corporations and agencies need to consider changes to their social media approach, as well as the implications a social media warning label would have for the content they wish to produce.

Pinkston provides research in support of a wide range of thought leadership, including media analysis, policy analysis, audience surveys, focus groups, and competitive analyses. Pinkston also offers a full suite of digital marketing services, including social media, digital advertising, website design, production, and authoritative digital content.

For more information or questions about this article, please contact John Lovett at john.lovett@pinkston.co