Don’t Let Empty Objections Stop the Kids Online Safety Act


Aug 6, 2024

By

Annie Chestnut Tutor

Policy Analyst, Tech Policy Center

Mark Meador

Visiting Fellow, Tech Policy Center

Signs in support of the Kids Online Safety Act are seen ahead of a news conference with Sens. Blumenthal (D-CT) and Blackburn (R-TN) on July 25, 2024 in Washington, D.C. Kent Nishimura / Getty Images

Key Takeaways

America’s best chance to protect children online and rein in harmful and predatory conduct by social-media companies is close to becoming law.

The argument that children have a constitutional right to access all content ignores long-standing legal precedent and societal norms.

KOSA would give parents more choice and help free kids from the shackles of social media.

America’s best chance to protect children online and rein in harmful and predatory conduct by social-media companies is close to becoming law—but it faces a final, crucial test in the coming months.

The Kids Online Safety Act (KOSA) has garnered broad, bipartisan support since its introduction in the House of Representatives in May, despite opposition from well-funded critics. No fewer than 69 senators have enlisted as co-sponsors of the legislation, giving KOSA a filibuster-proof majority and positioning it to easily pass the Senate when put to a vote.

Therein lies the challenge, however. Big Tech lobbyists and trade groups have been successful at convincing a small number of lawmakers to oppose KOSA’s passage, often relying on misleading and unsupported objections. As we head into a high-stakes election season, Congress will have very few opportunities to address important legislation before the end of the year. It is therefore imperative to dispense with the myths and empty objections surrounding KOSA and act for its swift passage by Congress.

KOSA would require social-media platforms to provide parental controls and to disable features, such as algorithmic recommendations, that can be addictive and inappropriate for accounts assigned to children 16 and younger. The bill would also hold platforms accountable when they design and operate feeds that promote to minors harmful content involving, for example, sexual exploitation, physical violence, suicidal behavior, and illegal substances.


Despite growing support for the bill and a widespread consensus that social media is damaging kids’ mental health and well-being, criticism of KOSA persists.

Some concerns, largely advanced by Big Tech–funded trade groups, simply perpetuate a mischaracterization of the duty-of-care section in the bill. Section 3(a) would direct covered platforms to take reasonable measures in the design and operation of their products used by minors to prevent and mitigate issues such as anxiety and depression, eating disorders, violence, online bullying, and sexual exploitation. Critics claim that this duty-of-care provision is overbroad. But algorithms and other design features that promote content featuring, encouraging, or celebrating harmful and addictive behaviors—especially to users under 17—are indefensible. Data show that kids today are more anxious, lonely, depressed, and suicidal because of the amount of time they spend on their phones and social media. Opponents who think the bill’s definition of duty of care is too broad should consider that perhaps the problems caused by social media are similarly extensive.

First Amendment arguments against KOSA also fall short. The argument that children have a constitutional right to access all content, for example, ignores long-standing legal precedent and societal norms that protect children from certain offline harms. Consider the age restrictions on drinking, smoking, gambling, viewing R-rated movies, and purchasing pornography. Our laws and social norms treat children differently because of their innocence, vulnerability, and underdeveloped reasoning faculties. Abandoning that practice to pander to the interests of Big Tech would be incredibly dangerous.

While the bill aims to place limits on the types of content promoted through recommendation-based algorithms, it would not place a blanket censor on content allowable or accessible on the platforms. The text of the bill states that nothing in the duty-of-care section “shall be construed to require a covered platform to prevent or preclude any minor from deliberately and independently searching for, or specifically requesting, content” or prohibit the platform from providing resources for the prevention or mitigation of the harms described. In other words, while the bill would prevent social-media platforms from promoting inappropriate content, it would not prevent kids from freely searching for content on the platform.


KOSA would provide parents and kids more autonomy over their social-media experience and allow users to opt out of a personalized-recommendation system or to limit categories of recommendations. It would require platforms to provide easy-to-use tools to allow kids to impose limits on notifications, use of their location information, who can contact them, and the amount of time they spend on an app.

KOSA would not replace but rather empower parents, who have the final say over their child’s privacy and account settings. Platforms would be required to provide an accessible process for reporting claims of abuse and to respond to such claims within an established time frame. Many users say current reporting processes are difficult to navigate and that platforms often fail to respond to or resolve the issues reported. KOSA aims to fix that.

The current structure of social media is a zero-sum game: parents must either battle to keep their kids off entirely or throw in the towel. Platforms designed around user-generated content should give those users more say in their online experience. This bill would do exactly that.

By diminishing the prominence of harmful content that is damaging to children’s psyches, and by restricting the addictive design features that platforms use to entice child users, KOSA would give parents more choice and help free kids from the shackles of social media.

This piece originally appeared in the National Review.
