Parents, Not Government, Should Oversee the Protection of Children's Online Privacy
The federal and state proposals have a noble goal, but they would not prevent the exploitation of data, and they contain no provisions to empower parents to help preserve their children's online privacy.
When children use social media platforms, websites, and other online services, immense amounts of data about them can be collected. Parents may understand some of what is being captured, but they likely don’t know just how much.
There are options for parents interested in protecting their children’s online privacy — the platforms themselves and some innovators offer such tools — but some legislative proposals could substitute government dictates for parental discernment.
Children's online privacy is an incredibly nuanced topic. While children may use social media to post videos, pictures, and status updates, they sometimes also use online services to reach out for help, such as when seeking shelter from an abusive parent. Unfortunately, the policy proposals sacrifice nuance in the name of "doing something."
Perhaps just as bad, the plans would short-circuit solutions the private sector is developing. Innovators understand the problems parents and children face and are creating apps and other services designed to shield children's privacy from prying eyes and ensure compliance with existing laws.
To be sure, the federal and state proposals have a noble goal: to prevent companies from exploiting children, who cannot understand how their online choices create permanent profiles. Neither the federal nor the state proposals, though, would prevent the exploitation of data, and neither contains provisions to empower parents to help preserve their children's online privacy.
Parents should maintain an active role in supervising their children's online habits. Most platforms and services that children use offer privacy tools that are not difficult to learn. Parents who feel intimidated by a platform but are concerned about their children's online privacy should take the time to learn those tools.
Innovators are creating many beneficial services geared toward children, such as educational apps and software designed for use in school, apps that teach children good money habits, and programs that teach foreign languages. If the federal and state proposals are adopted, though, they would throw out these beneficial uses along with the abuses of children's privacy.
The recently introduced federal proposal focuses heavily on children's use of "platforms." These platforms tend to gather enormous amounts of information about children: not just what they post, but also the devices they use and their locations. The proposal, though, lacks nuance, as its definition of "platform" would include any software, app, or service that children could use. This means the educational tools designed for schools would be captured, along with programs like Zoom, Skype, or FaceTime that children may use to connect with their grandparents.
The way California wants to address children's privacy is far worse. Its proposal would, among other things, establish government standards for age-appropriate content and for how platforms must present content to children, completely substituting the state's judgment for parents' judgment about what is, or is not, appropriate for children.
As it stands now, companies collect data when people, including children, use their services. Congress recognized this years ago when it passed the Children's Online Privacy Protection Act (COPPA). That law recognizes that some services are targeted at children and, as many parents know, that children are very good at getting around rules, so it requires platforms to safeguard children's data.
Companies at least pay lip service to protecting children's privacy. Most social media platforms, including Facebook (and Instagram), Snapchat, and TikTok, officially prohibit anyone under the age of 13 from creating an account. The platforms differ, though, in how they enforce the age restrictions: some use tools such as artificial intelligence to detect underage users, while others simply trust that prospective users enter their ages honestly.
Innovators are trying to solve the problem of children's privacy as well. Some focus on specific niches, such as teaching children good money habits; others have broader visions. Even the platforms children try to use are working to create age-appropriate content policies, to filter offensive content out of young users' feeds, and to give parents incentives to work with their children on building positive online habits.
Children's online privacy is a significant issue that society needs to tackle. The approach, though, needs nuance, and it must empower innovators to work with parents. Any policy must treat parents as partners, not supplant their discretion with government dictates.