FTC hits TikTok with a record $5.7 million fine over children's privacy




When the lip-syncing app Musical.ly first exploded in popularity nearly four years ago, it was best known as a teen sensation. But according to the Federal Trade Commission, the app also illegally collected personal information from children under 13. The agency announced Wednesday that Musical.ly, now known as TikTok, has agreed to pay a $5.7 million fine to settle the charges. TikTok must also comply with the Children's Online Privacy Protection Act, or COPPA, and remove all videos uploaded by users under 13.

"This record penalty should remind all online services and websites that target children: we take the application of COPPA very seriously and we will not tolerate companies that blatantly ignore the law," said FTC President Joe Simons in a statement. The FTC complaint alleges that Musical.ly violated COPPA by failing to require parental consent from users under 13, by failing to inform parents of how the application collected personal information about underage users and preventing parents from requesting the deletion of their children's data.

TikTok, in turn, announced Wednesday that it is launching a separate section of its app for children under 13, which "introduces additional safety and privacy protections designed specifically for this audience."

"Companies like TikTok have been very keen to take advantage of kids' apps users at every turn."

Senator Ed Markey

By essentially combining Vine with Spotify, Musical.ly attracted the attention of an estimated 100 million hard-to-reach Generation Z consumers. In November 2017, it was acquired by the Chinese company ByteDance and renamed TikTok. The app, which lets users share 15-second video clips set to music, has been installed more than a billion times, including 96 million in the United States, according to research firm Sensor Tower. After receiving $3 billion from SoftBank and other investors in October, ByteDance is now considered one of the most valuable privately held startups in the world.

The FTC alleges that TikTok knew a "significant percentage" of its users were under 13 and that it had received thousands of complaints from parents whose underage children had created accounts. Until April 2017, the app's website even warned parents, "If you have a young child on Musical.ly, be sure to monitor their activity on the app," according to the FTC's complaint. But the app only began asking users for their age later that year, the agency notes. Since then, the app has blocked anyone who reports being under 13 from creating an account, but its operators did not verify the ages of existing users.

"Children's lives are increasingly being experienced online, and companies such as TikTok have been eager to take advantage of children's application users," said Ed Markey, Democratic Senator for Massachusetts and key figure in COPPA 20 years ago, in a report. He and other legislators introduced a bill last year to update the law.

TikTok accounts are public by default: other people can see the content users post unless they change their privacy settings. But the FTC's complaint alleges that even when users set their profiles to private, others could still send them messages. For years, media outlets have reported on underage users being asked to send nude images, first on Musical.ly and later on TikTok.

TikTok says it has now created a separate app experience for users under 13 that does not allow them to share personal information. They cannot upload videos, comment on other people's videos, message other users, or maintain a profile or followers. In short, TikTok will only let younger children consume content, not share it. Starting Wednesday, new and existing TikTok users will be asked to verify their birthday; those who say they are under 13 will be directed to the children's section of the app. The company has also launched a new series of tutorials focused on privacy and safety on its platform.

Two-tiered "mixed audience" systems, such as those implemented by TikTok, were first approved by an amendment to COPPA in 2012, according to Dona Fraser, director of the Ontario Human Rights Unit. Children's advertising from the Council of Business Ethics Boards. The FTC acknowledged that the unit had drawn attention to the TikTok case on Wednesday. "It's a great way to comply," she says. "You create two products in one. [Children] get an environment that does not include all the elements that will trigger COPPA. "

This change is likely to have a significant impact on the TikTok community, where very young users have played an important role from the beginning. Some of the platform's biggest stars, like Lauren Godwin, who has 12.3 million fans, have performed "duets" with children under 13. It's not yet clear what the platform will do about these videos, which feature underage users but are not shared directly on their own profiles. A TikTok spokesperson said some details of the new system's implementation were still being finalized.

Although the FTC voted 5-0 to approve the consent decree, some commissioners say TikTok should be required to do more than pay a fine. "FTC investigations typically focus on individual accountability only in certain circumstances, which has often allowed employees at large companies to avoid scrutiny," wrote Commissioners Rohit Chopra and Rebecca Kelly Slaughter. "We should move away from this approach. Executives of big companies whose firms break the law should be held accountable."

