India's Ministry of Electronics and Information Technology has asked Apple and Google to remove TikTok from their app stores, citing pornographic and other inappropriate content. The statement says the app encourages illicit content. This comes just months after TikTok was fined by America's Federal Trade Commission for improperly managing data collected from users in its young target audience.
TikTok, owned by the Chinese company ByteDance, has hundreds of millions of users and features user-generated lip-sync, livestream, and other short-form videos. The app has come under fire from many child protection agencies because of its lack of age verification and limited parental controls. It is most popular with kids under 20, especially girls. The app is also wildly popular with kids under the age of 13, but statistics are unlikely to show that demographic, since most of those users claim to be over 13 when they sign up in order to satisfy the company's terms of service.
What Parents Should Know
TikTok is another example of user-generated content getting away from the company that is supposed to moderate it. The company says it is doing all it can and boasts about how many videos and posts it has removed for violations, all the while more illicit content is uploaded by new accounts. While these companies play catch-up, our kids are the ones seeing the inappropriate videos that haven't yet been flagged and being contacted by predators whose accounts have yet to be removed. As I often say, the responsibility falls to parents to protect our kids from this content.
You have to ask what it is, beyond sheer scale, that makes it so difficult for these companies to get a handle on this. The problems keep flaring up, yet the companies keep growing, and growth is ultimately the goal. Growth means money. Companies need more users to grow, and more content earns more users. So perhaps they remove as little content as possible, because it is easier to ignore the illicit material and pay the bills than to take a stand to protect users and possibly slow the company's growth.
Parents have to be aware of what apps their kids use. We should know when they are live-streaming or posting videos of themselves. We should follow their accounts and see what everyone who follows them sees. If they won't allow you to follow them, you shouldn't allow them to use the app. It's really that simple. TikTok, Snapchat, Instagram: none of them exist to keep your kids safe. They exist to make money, and they only take steps toward security and privacy when not doing so might hurt their bottom line. Parents have to stand up, not to these companies, but to our kids, and tell them that our goal is safety and health. Then set the right example and work with them to build the right attitude about our time on social media and other tech. You can do it! We're here to help you.
