Minnesota lawmakers want to clamp down on the collection of children's data and limit how it is used by the world's most powerful tech companies, which are facing growing scrutiny over social media platforms' potential harm to kids.
They're also pushing platforms to turn on privacy protections for users by default and prohibiting them from prioritizing engagement by serving content that users didn't explicitly ask for. The proposals are attracting fierce opposition from trade groups representing tech giants such as TikTok, Meta, Snapchat and X, which argue the bills would force them to censor content and run afoul of the First Amendment.
"This isn't trying to regulate what's on the internet, we're trying to protect kids while they're on the internet," said Sen. Erin Maye Quade, DFL-Apple Valley, who is carrying several bills related to social media practices this year. "It's not what's on the internet that's harmful, it's what the tech companies are doing with the data that's harmful."
The strategy of targeting data collection and user settings is a shift from last year's broader attempt to ban the use of algorithms on anyone under the age of 18. Even if the bills pass this year, legislators are already anticipating lawsuits seeking to strike down the new state laws. An age-appropriate design law passed in California that's similar to one of the bills moving in Minnesota is in the midst of ongoing litigation after a district court judge blocked its implementation.
"An unconstitutional law protects no one, including kids," Amy Bos, director of state and federal affairs at NetChoice, a trade association representing tech companies, told lawmakers at a hearing earlier this year. "Under the threat of fines from misjudging what may be considered potentially harmful to children, many platforms will certainly default to taking down all content on entire subjects, which is likely to remove beneficial, constitutionally protected material along with anything genuinely harmful."
So far, Minnesota lawmakers haven’t been deterred by threats of litigation. They’re building on work started last year to criminalize certain deep fake images, video and audio content created by artificial intelligence. They’re proposing to increase those penalties this year, as well as extend the state’s child labor laws to the digital space, blocking social media users from making money off videos featuring children under the age of 14.
But tech trade groups have focused their lobbying power on the broader regulations being considered by legislators. Rep. Zack Stephenson's proposal, called the "Prohibiting Social Media Manipulation Act," is the first of its kind in the nation. He said it would require certain privacy features by default and force tech companies to prioritize users' preferences about what they want to see in their feeds over the need to keep them engaged.
User engagement on social media, fueled by algorithms, has become an important driver of revenue for tech companies.