WASHINGTON – The Federal Trade Commission announced Wednesday a $5.7 million settlement with Musical.ly, the popular video social network now known as TikTok, over accusations that its app illegally collected personal information from children.
The F.T.C. said it was a record fine for a violation of children's privacy. The agency found that a large percentage of the app's users were under the age of 13 and had disclosed sensitive personal information, including their email addresses, names and schools. The F.T.C. said the app did not ask parents for permission before collecting data on those users, and when parents asked the company to delete videos and other data, it declined.
Under the Children's Online Privacy Protection Act, online services must obtain parents' permission before collecting personal data from users under 13 years of age.
"This record penalty should serve as a reminder to all online services and websites that target children: we take enforcement of COPPA very seriously, and we will not tolerate companies that blatantly ignore the law," said Joseph J. Simons, the agency's chairman.
The agency has been under increasing pressure to step up enforcement of children's online privacy. Last week, more than a dozen children's rights groups called on the F.T.C. to investigate Facebook after reports revealed that the company had knowingly misled children into racking up charges for games on its social network.
ByteDance, a Chinese internet conglomerate, bought Musical.ly in 2017. The new owner then merged Musical.ly into TikTok, a similar app it already operated. The app lets users create short video clips and share them with other users. The Musical.ly version had 65 million users in the United States as of 2014, the F.T.C. said, and the TikTok app remains a big hit today.
The investigation was prompted by various reports and a complaint from the Bureau of Business Ethics. The F.T.C. said that even a cursory review of the app showed that a large portion of its users were under 13 and that in many cases parents had been neither notified nor asked for authorization.
User accounts were public by default, and adults could contact users regardless of their age. Until 2016, the app let users see other users within a 50-kilometer radius, the F.T.C. said.
When some parents requested the removal of their children's data, TikTok deleted the child's account but kept those users' videos and personal account information on its servers, the agency said.
In response to the F.T.C. settlement, TikTok announced Wednesday that it was creating a separate app for users under 13. The new app will not allow personal information to be shared and limits the content that can be viewed and shared, the company said.
"Although we have always considered TikTok a place for everyone, we understand the concerns that arise about younger users," the company said. "In working with the F.T.C., and as part of today's agreement, we have implemented changes designed to place younger U.S. users into a separate, limited app experience that introduces additional safety and privacy protections designed specifically for this audience."