TikTok, the short-video social media platform popular among teens, says it deleted more than 7 million accounts belonging to children under 13 at the start of this year. With this move, the platform is taking a considerable step toward enforcing its minimum-age requirement for users.
According to a company blog post published Wednesday, more than 11.1 million accounts were removed for violating the app's guidelines, 7.26 million of which belonged to users suspected of being under the minimum age. This is the first time TikTok has published the number of underage accounts it has removed.
Under the Children's Online Privacy Protection Act, websites in the United States are required to obtain parental consent before collecting data on children under 13. However, many children misrepresent their age and create accounts anyway, across social media sites from Instagram to YouTube.
TikTok, which is owned by China's ByteDance Ltd., has drawn particular scrutiny because of the large quantity of data its sophisticated algorithm gathers and its popularity, especially among teens. The New York Times reported that last year TikTok classified more than a third of its daily US users as being 14 or younger.
In 2019, the app was ordered to pay the US Federal Trade Commission a record $5.7 million fine for illegally collecting children's data, including the names, email addresses, and locations of children who used the application.
Since then, the platform has modified many features to make it safer for its users. About two years ago, TikTok rolled out a dedicated section of the app for children 12 and under, called TikTok for Younger Users.
The space provides "a curated viewing experience with additional safeguards and privacy protections," according to the company. This walled-off portion of the app does not allow users to share personal information, restricts the content shown, and does not let users post videos or comments.
Other age-specific changes include a new default privacy setting for accounts aged 13 to 15, rolled out in January, which requires young users to approve followers before those followers can view their videos. Last August, the company also launched a Family Pairing setting that lets parents monitor and control their children's screen time on the app.
Beyond removing accounts, TikTok also continuously moderates published content. According to the report issued Wednesday, 8.54 million videos were taken down in the US alone in the first three months of the year. Of those, 36.8% were removed for "minor safety" reasons, 21.1% for depicting unlawful activities, and 15.6% for violating policies on adult nudity and sexual activities.
TikTok says it will begin publishing its community-guidelines enforcement reports quarterly.