Mar 11, 2019

United States Federal Trade Commission strikes one for children’s privacy

By Lisa R. Lifshitz
Canadian Lawyer Online — IT Girl Column

On Feb. 27, the U.S. Federal Trade Commission announced that the operators of the video networking app Musical.ly (now known as TikTok) had agreed to pay US$5.7 million to settle allegations that they had illegally collected personal information from children in violation of the Children’s Online Privacy Protection Act (COPPA). The order marks the highest civil penalty ever obtained by the FTC in a children’s privacy case.

Originally enacted in 1998, COPPA is intended to protect the safety and privacy of children online by prohibiting the unauthorized or unnecessary collection of children’s personal information online by operators of internet websites and online services. Under the legislation, operators of commercial websites and online services directed to children that collect, use and/or disclose personal information from children must meet specific requirements prior to collecting online, using or disclosing personal information from children.

Musical.ly (a Cayman Islands corporation based in China) and Musical.ly, Inc. (a California company and a subsidiary of Musical.ly) had been charged with multiple violations of COPPA, including: failing to post a privacy policy that provided a “clear, understandable and complete” notice of its information practices (including what information the website operator collected online, how it used such information and its disclosure practices for such information); failing to provide a clear, understandable and complete notice of its information practices, including specific disclosures, directly to parents; failing to obtain verifiable parental consent prior to collecting, using and/or disclosing personal information from children; failing to delete personal information collected from children online following parental requests; and retaining personal information longer than was reasonably necessary to fulfil the purpose for which the information was collected (sounds almost Canadian, no?).

In order to register for the app, users were required to provide their email address, phone number, user name, first and last name and a short bio with a profile picture. Between December 2015 and October 2016, Musical.ly also collected geolocation information, but this practice was later scrapped. While some users had previously volunteered their ages in their bios, beginning July 2017 the app specifically asked users to provide age information to prevent those under the age of 13 from creating accounts (although existing users were not asked to confirm their age). Until October 2016, users could tap on the “my city” tab and access a list of other users within an 80-kilometre radius with whom they could connect and exchange messages. Significantly, user accounts were set to “public” by default, meaning that a user’s profile bio, username, profile picture and videos were entirely public and searchable by other users. While the option existed to set accounts to private, all user profiles, including user names, profile pictures and bios, remained public and searchable by other users at all times.

Interestingly, the app did not allow users to close their own accounts. Instead, a user (or their parent) had to send an email to Musical.ly asking to terminate the account, despite the complaints of thousands of parents who asserted that their kids had created accounts without their knowledge and, presumably, consent. However, even when accounts were officially closed, the FTC found that Musical.ly did not actually delete the users’ videos or profiles from its servers.

In its original complaint, the FTC alleged that Musical.ly knew that many children were using its platform, a significant percentage of whom were under the age of 13. Articles published in the press between 2016 and 2018 highlighted the popularity of the app among children and tweens. More damning, however, is the fact that, in February 2017, Musical.ly sent messages to 46 of its most popular users (who appeared to be children under 13 years old) advising them to edit their profiles to make it look as though their accounts were being run by adults, either their parents or talent managers (with no effort made to actually verify whether responders were adults rather than the child account holders).

Moreover, the FTC also claimed that the app was deliberately marketed in a way to appeal to young children: making available Disney songs from movies such as Toy Story and The Lion King, and allowing users to send messages filled with emojis of “cute animals and smiley faces.” The platform also featured music by celebrities appealing to children (Katy Perry, Selena Gomez), many of whom had their own Musical.ly accounts that encouraged fans to post and share videos of themselves lip-synching or dancing to their latest songs. As many users blatantly self-identified as under 13 in their profile bios or provided school information showing their age as under 13, the FTC concluded that Musical.ly had “actual knowledge” that it was collecting personal information from children under the age of 13.

The United States District Court, Central District of California approved the consent decree filed on behalf of the FTC against Musical.ly enjoining it from further violating COPPA. The order requires Musical.ly to destroy the personal information of account holders under the age of 13 (except that, upon consent, a user’s videos could be transferred to the user’s device and the user could retain their user name, so long as the user name does not function as online contact information). The personal information in the form of registration information of older users can be retained, but if the age of a particular user could not be identified within 45 days from the entry of the order, Musical.ly had to remove such user’s personal information from its websites and online services and destroy such information within 12 months after entry of the order, unless Musical.ly obtains verifiable parental consent for its collection, use and disclosure as set out in COPPA.

In addition to paying the civil penalty of US$5.7 million, Musical.ly also has extensive compliance reporting and record-keeping obligations for up to 10 years after entry of the order. These include submitting a report within 90 days regarding its deletion obligations (followed by another within 15 months of the entry of the order) and myriad additional reports confirming that it has undertaken the corrective actions promised. As part of its obligations, Musical.ly must detail any methods used to obtain verifiable parental consent prior to collecting, using and disclosing personal information, the means provided to parents to review any personal information collected from their children (and to refuse to permit its further use or maintenance) and why each type of information collected from a child is “reasonably necessary” for the provision of the related activity.

Musical.ly must also create “all records necessary” (for 10 years after the order, retaining them for another five years) to demonstrate its full compliance with the order. Such records include all copies of consumer complaint letters regarding its information practices and a copy of each materially different form, page or screen created, maintained or otherwise provided by Musical.ly through which personal information is collected from a child (in addition to the URL of the webpage where the material was posted online).

Responding to the adverse publicity surrounding the FTC settlement, TikTok announced on Feb. 27 in a blog post on its website that it was launching a limited, separate app experience that introduces additional safety and privacy protections designed specifically for children under 13. The additional app allows TikTok to “split users into age-appropriate TikTok environments, in line with FTC guidance for mixed audience apps.” The new environment for younger users does not permit the sharing of personal information and puts extensive limitations on content and user interaction. Both “current and new TikTok users will be directed to the age-appropriate app experience, beginning today.”

Following the FTC settlement, users of TikTok under 13 years old will not be permitted to share personal information. The new section of the app does not allow uploading videos, commenting on others’ videos, messaging with users or maintaining a profile or followers — young children will only be allowed to consume content. Moreover, all new and existing TikTok users will be required to verify their birthdays and will be redirected to the COPPA-compliant portion of the app if they identify as under 13. TikTok also launched a new video tutorial series (cheerily titled “You’re in Control”) emphasizing privacy and security on its platform.

While Canada does not have a statute identical to COPPA, the lessons learned from the Musical.ly case and the legal sensitivity of collecting personal information from children (especially those under the age of 13) should nonetheless resonate with Canadians.

For example, the new “Guidelines for obtaining meaningful consent” under PIPEDA, which came into force on Jan. 1, reiterated that it is unrealistic to expect children to fully appreciate the complexities and potential risks of sharing personal information. Accordingly, PIPEDA allows for consent through an authorized person, such as a parent or legal guardian.

As noted in the guidelines, the federal privacy commissioner has taken the position that there is a threshold age (which happens to be 13, the same age chosen by COPPA) under which young children are not likely to fully understand the consequences of their privacy choices. Accordingly, the OPC has determined (barring exceptional circumstances) that anyone under the age of 13 cannot give meaningful consent and that consent must instead be obtained from their parents or guardians. (It is worth mentioning that this position was not adopted by the privacy commissioners of Alberta, B.C. and Quebec, who prefer to consider whether the individual understands the nature and consequences of the exercise of the right or power in question.)

Even if a minor is able to provide meaningful consent, the onus is, and will always be, on the organization collecting the data to show that it has taken into consideration the child’s “level of maturity in developing their consent processes” and adapted those processes accordingly, in order to demonstrate that they lead to meaningful and valid consent.

While not the law in Canada, COPPA requirements dovetail with many Canadian best practices relating to children, and Canadian operators of commercial websites and online services aimed at children (and their advisors) should take notice of this decision.

This article originally appeared as Lisa's IT Girl column in Canadian Lawyer Online