The recent lawsuit filed by the U.S. Department of Justice (DOJ) against TikTok and its parent company, ByteDance, highlights the critical responsibilities app developers have in safeguarding children's privacy online.
TikTok is accused of violating the Children's Online Privacy Protection Act (COPPA) by collecting data on children under 13 without parental consent. The case serves as a warning to developers to prioritize ethical data practices and compliance with legal standards.
FTC Chair Lina M. Khan said TikTok “knowingly and repeatedly” violated kids’ privacy, threatening the safety of millions of children across the country. “The FTC will continue to use the full scope of its authorities to protect children online — especially as firms deploy increasingly sophisticated digital tools to surveil kids and profit from their data,” said Khan.
Legal responsibilities and ethical standards
App developers have a legal obligation to comply with COPPA and similar regulations, which require parental consent before collecting, using, or disclosing personal information from children under 13. The lawsuit against TikTok alleges that the company continued to collect and retain data from minors, even after being fined $5.7 million in 2019 for previous violations. This underscores the importance of adhering to regulatory requirements and implementing comprehensive data protection measures.
“If there's a loophole, people—especially kids—will exploit it,” said Joshua Copeland, director of managed security services at Quadrant Security and adjunct professor at Tulane University. “App and service providers need to be exceptionally vigilant because any social platform can and has been used to harm minors.”
Beyond legal compliance, developers must also consider the ethical implications of their app designs. This includes creating robust age-verification systems, offering parental controls, and ensuring that data collection practices are transparent and limited to necessary information. Ethical app development is not just about following the law, but also about building trust with users and prioritizing the safety of minors.
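The age-gating logic described above can be sketched in a few lines. The following is a minimal, hypothetical illustration (the function names and the boolean consent flag are assumptions, not any real platform's API) of a COPPA-style check: users under 13 may only have personal data collected when verified parental consent is on record.

```python
from datetime import date

COPPA_AGE_THRESHOLD = 13  # COPPA applies to children under age 13

def years_old(birth_date: date, today: date) -> int:
    """Compute age in whole years as of `today`."""
    had_birthday = (today.month, today.day) >= (birth_date.month, birth_date.day)
    return today.year - birth_date.year - (0 if had_birthday else 1)

def may_collect_personal_data(birth_date: date,
                              has_parental_consent: bool,
                              today: date) -> bool:
    """Return True only if collection is permissible under a COPPA-style rule:
    users under the threshold require verified parental consent."""
    if years_old(birth_date, today) < COPPA_AGE_THRESHOLD:
        return has_parental_consent
    return True
```

In practice a real age-verification system is considerably harder than a birthdate check, since self-reported ages are easy to falsify; the sketch only shows where the consent gate belongs in the flow, not how to verify age reliably.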
Implementing effective safeguards
The TikTok case highlights the need for app developers to implement effective safeguards to prevent misuse of data, especially when minors are involved. Developers should design apps that automatically detect and restrict access to content unsuitable for children. For instance, TikTok's "Kids Mode" was intended to offer a safer platform for children but allegedly failed to protect their privacy, as the company continued collecting and sharing minors' data.
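One concrete safeguard for a children's mode is data minimization: strip any telemetry payload down to an explicit allowlist before it leaves the device or enters storage. The sketch below is illustrative only; the field names and the `is_child_account` flag are assumptions for the example, not drawn from TikTok's implementation.

```python
# Allowlist of telemetry fields permitted for child accounts (hypothetical
# names); any field not on the list is dropped rather than collected.
CHILD_SAFE_FIELDS = {"app_version", "crash_report", "language"}

def minimize_payload(payload: dict, is_child_account: bool) -> dict:
    """Reduce a telemetry payload to the allowlisted fields for child
    accounts; adult accounts pass through unchanged."""
    if not is_child_account:
        return payload
    return {k: v for k, v in payload.items() if k in CHILD_SAFE_FIELDS}
```

An allowlist is the safer default here: new fields added elsewhere in the app are excluded from child accounts until someone deliberately reviews and approves them, whereas a blocklist silently collects anything nobody thought to block.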
"Social media platforms are the most visible, but privacy is relevant for all software that gathers telemetry or processes data and should be considered as part of the architectural design," says Matthew Rosenquist, cybersecurity strategist and CISO at Mercury Risk and Compliance.
The DOJ's lawsuit against TikTok sets a significant precedent for the tech industry, signaling that regulators are willing to take stringent action against companies that fail to protect minors' privacy. The case emphasizes the importance of developing apps that prioritize user safety and privacy, particularly for young users.
App developers must recognize that their obligations change when minors are involved. The TikTok lawsuit illustrates that ignoring these responsibilities can lead to severe legal consequences and damage to a company's reputation.
Dustin Sachs, chief cybersecurity technologist, CyberRisk Alliance