Data privacy and online child protection experts at a Senate Judiciary Committee hearing today called for multi-layered approaches to online data monitoring, policies, and enforcement to safeguard children from sexual exploitation and exposure to inappropriate content online.
During the hearing, senators asked how social media and tech companies can regulate their platforms or design algorithms to protect children, and how they can give parents greater control over the content their children access.
Several witnesses argued for solutions that combine tech company accountability, regulatory enforcement, and parental responsibility.
Angela Campbell, a professor at the Georgetown Law Institute for Public Representation’s Communications and Technology Clinic, emphasized that tech companies and regulators should ensure that online platforms design their systems to protect children.
“[Tech companies] design their systems not to protect children or nurture children, but to attract a large number of users, including children, and to keep them online as long as possible so they can maximize revenue by collecting valuable data about those users and deliver targeted marketing to them,” she said.
Campbell pointed to the Children’s Online Privacy Protection Act (COPPA), enacted in 1998, but said governmental bodies like the Federal Trade Commission (FTC) have failed to enforce COPPA’s safeguards. She argued that Congress should strengthen COPPA with effective updates, press the FTC to tighten its enforcement of COPPA rules, and introduce further legislation to protect children online.
She said lawmakers also should consider action to compel data minimization, which would prevent companies from conducting unfettered data collection – particularly in the case of children’s data, which can be accessed by online predators.
“I think the ability to focus on data minimization and also data security [is important], because the data that’s available to companies can easily be available through data brokerages and other sources to predators and lots of other people,” Campbell said.
Christopher McKenna, founder and CEO of Protect Young Eyes, took online service providers to task, but also said parents bear much responsibility for what their children experience online. Parents – who often provide and monitor devices and applications that children use – should have better control over the content their children access, and tech companies should create solutions that help parents gain better control.
“Two simple solutions could change everything,” McKenna said. “First, creating [a] uniform, independent, and accountable rating system for apps, and second, enact[ing] better defaults based on the age provided during device and app setup. Let’s fix this for the kids.”
Stephen Balkam, founder and CEO of the Family Online Safety Institute (FOSI), argued in favor of empowering children to combat exploitation online. He said that while government, companies, and parents have a role in making a kid-friendly internet environment, giving children the tools to empower themselves to be “good, digital citizens” is equally important.
“At FOSI, we do not seek to diminish the existence of these risks on the internet, but rather to focus on the constructive ways that they can be managed against the benefits and opportunities of being online,” he said. “That’s why we believe that creating a culture of responsibility is the most effective framework to ensure that children are protected in the digital world.”