Despite lagging behind counterparts in regulating for online safety, the US could soon align with other global powers in enhancing protections for children
The FTC proposes reining in targeted advertising for children under 13
On 20 December 2023, the US Federal Trade Commission (FTC) proposed changes to its rules under the Children’s Online Privacy Protection Act (COPPA) relating to the monetisation of children’s data. Under its authority granted through COPPA, the FTC already enforces rules on the collection and use of personal data for children under the age of 13. Through a notice of proposed rulemaking, the regulator is now seeking feedback on a requirement for separate parental consent to opt children into personalised advertising. The updated COPPA rules would also bar platforms from restricting their services to only those young users who consent to providing personal information. The FTC is further considering prohibiting platforms from using any personal data to nudge children to stay online, such as through push notifications. While the FTC asserts that these changes, like the rest of the COPPA rules, apply to foreign companies with a substantial US user base, the regulator has rarely enforced the rules against companies based outside the US; a notable exception is the fines levied in recent years against TikTok parent company ByteDance.
US lags in regulating for online safety, despite numerous pending proposals
The FTC’s ability to change child privacy rules under COPPA is limited, as congressional approval would be required for most substantive changes. More broadly, the US continues to lag significantly behind global counterparts in writing new legislation on child safety online. After President Biden called for a ban on targeted advertising for children in 2023, however, US policymakers at both the state and federal level began considering a number of options to strengthen laws on children’s online safety, including measures on data protection. A proposed legislative update to COPPA, which has already passed the committee stage in Congress with bipartisan support, would ban all targeted advertising to young users and raise the age threshold for protections to cover children under 16. The proposed Kids Online Safety Act would also address data privacy by empowering children and their parents to control platforms’ collection of personal data and by requiring platforms to let users opt out of personalised recommendation systems that rely on personal data. Both pieces of federal legislation are mirrored in a number of state-level laws, including the California Age-Appropriate Design Code, that are under consideration in legislatures or facing legal scrutiny.
Regulation elsewhere has yielded stronger protection
Elsewhere in the world, targeted advertising to minors has in some cases been banned outright. The EU’s Digital Services Act (DSA) prohibits the practice for children, and a number of regulated platforms have already made changes to comply. Under the GDPR, which extends privacy protections even to adult users, platforms are also limited in their ability to process personal data for marketing purposes; Meta is now facing legal scrutiny over its proposal to require users to pay a subscription fee if they wish to opt out of personalised advertising. While the UK’s Online Safety Act does not ban targeted advertising for children outright, platforms are required to allow all users to object to marketing based on their personal data. Despite its persistent lag in regulating tech broadly, the US now has perhaps its best opportunity yet to align with the EU and UK on online safety regulation, given the rare bipartisan agreement on better protecting children’s privacy online.