TikTok Is Randomly Deleting Accounts After Its $5.7MM Compliance Violation
After getting slapped with a $5.7 million fine by the FTC for violating child privacy laws, TikTok is making sweeping changes to the platform.
People under the age of 13 now get a drastically different app experience so the platform can comply with the Children’s Online Privacy Protection Act (COPPA).
Some of those changes include being unable to share videos or have a public profile. The TikTok app now prompts people to verify their birthdays, but the algorithms appear to have gone wonky. Some people who have registered accounts as 13 and over have found their accounts deleted.
In an interview with BuzzFeed News, a 15-year-old said she entered her real birthdate, but her account was instantly deleted anyway. The account had over 17,000 fans, all wiped out by a simple age verification check.
Some users caught in the crossfire are people who had entered fake birthdays that made them appear to be under 13. Their accounts were instantly deleted with no chance to correct the birthdays to their actual ages.
TikTok said people whose accounts were deleted this way will have to verify their true age with a copy of a government ID. The instant deletion was likely designed to prevent young users from lying about their age to skirt the new age gate.
Users who have been locked out of their accounts report having trouble accessing the report tool linked by TikTok.
TikTok has support staff working to help individuals affected, but this just goes to show what a clusterfuck this whole situation has been.
Some reports indicate that child predators have targeted the popular video-sharing app as a place to meet young people. The app’s user base skews overwhelmingly toward those 18 and under, and unlike Snapchat, videos shared with fans are not deleted after a set period of time.