Tag: coppa

  • Instagram is Adding Useless Age Verification to Comply with Child Privacy Laws

    Users on most social media platforms are supposed to be 13 or older. Some apps have long had a form of age verification, which let them collect data on all of their users while still complying with child privacy laws, since you can’t have an account if you’re 12 or younger. They then, as much discussed on this blog and elsewhere, sell that data to advertisers or use it to sell targeted advertising on their own platform. Instagram has never had age verification. That is changing as of this week.

    You may have already seen your birthday show up on your profile in the Instagram app. Don’t worry, that information isn’t public; only you can see it. As of this week, all users will have their birthday information on their profile. If the birth year used to create your profile shows that you are under 13, your account will be suspended. When setting up a new Instagram account, the app will now ask you to enter your birth date.

    “Asking for this information will help prevent underage people from joining Instagram, help us keep young people safer and enable more age-appropriate experiences overall,” the company wrote. “In the coming months, we will use the birthday information you share with us to create more tailored experiences, such as education around account controls and recommended privacy settings for young people.”

    Asking for users’ ages has long been part of major social media apps like Snapchat, but Instagram hadn’t added it to its sign-up process until now. TikTok added age verification after being fined nearly 6 million dollars by the Federal Trade Commission. The problem with these age verification practices is that they rely entirely on users being honest about their age. All it takes is a little math to determine the latest birth date that makes you older than 12; enter that date and you are allowed into the app just like anyone else. Parents have been known to lie about their kids’ ages so they can have social media accounts, which COPPA allows because it counts as parental permission. The problem is that the developers of these apps can’t tell the difference between a parent making an account for their child and a kid making their own and lying about their age.
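    To see just how thin this kind of “verification” is, the entire check boils down to a date comparison against a self-reported birth date. Here is a minimal sketch of that logic (the function name and structure are my own illustration, not Instagram’s actual code):

    ```python
    from datetime import date
    from typing import Optional

    def is_at_least_13(birth_date: date, today: Optional[date] = None) -> bool:
        """Check whether a self-reported birth date implies an age of 13 or older."""
        today = today or date.today()
        try:
            thirteenth_birthday = birth_date.replace(year=birth_date.year + 13)
        except ValueError:
            # A Feb 29 birth date in a non-leap target year: treat Mar 1 as the birthday.
            thirteenth_birthday = date(birth_date.year + 13, 3, 1)
        return today >= thirteenth_birthday
    ```

    Nothing here can tell whether the birth date was typed in by a 35-year-old parent, a 15-year-old, or a 10-year-old who did the subtraction first, which is exactly the weakness described above.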

    What Parents Should Know

    Age verifications on social media apps are a hand-wave toward regulations, depending solely on users to follow the rules. This means these companies aren’t concerned with the safety of users as much as with their own ability to avoid fines and other penalties from the Federal Trade Commission. It is very obvious that these apps are meant to be as open and public as possible. They want as many users as they can get because they aren’t social media companies; they are advertising companies. They sell ads, plain and simple. When you sign up to use social media you are signing up to be advertised to, specifically and aggressively. When we sign our kids up and lie about their age, we are telling these companies to treat them just like any other consumer.

    If you are honest with yourself, the reason you’re allowing your young kids to use social media is pretty weak. Because their friends have it? Because a teacher says that’s how they contact students? There are ways around any of the reasons you think tie your hands. All it takes is your own knowledge of what being on these apps means for your kids and a little confidence to just say no. Stand up to your kid (you are the parent, after all), or stand up to that teacher or coach. Ask them why they want to contact your 12-year-old on social media anyway. Does that sound appropriate to you? I submit that in nearly any other context it would not be acceptable.

    You are the first line of defense. Advertising and data collection are the main issues the government leans on when it says it is trying to protect children online. There are, however, so many other issues to be concerned with. Pornography is rampant on apps like Snapchat, Instagram, and TikTok. You see report after report of young people discussing suicide, mental health problems, and eating disorders on these apps. This information is just sitting there for our children to see. When you give in and allow them to use social media at an early age simply because you think it’s no big deal, or because you trust your child, you are allowing things into their minds that cannot be unseen. You’re giving them access to a world that cannot be left behind. Once you know about or begin to contemplate these things, they are permanently a part of your psyche. We must do better. We have to be smarter about our children’s access to apps with user-generated content, whether it be games, social media, or any other software. We cannot trust software companies to do the right thing; they are looking after their bottom line first. It is up to us to protect our children. Not the government, not app developers, not the schools, not even police departments and social workers. It is up to you, mom, dad, aunt, uncle, grandma, and grandpa. Only you.

  • Youtube’s New Kids Content Policies Explained

    Starting today, all creators are required to mark their content as made for kids or not made for kids in YouTube Studio. -YouTube Creators Email

    YouTube will be limiting the data it collects from videos that target children. This is in an effort to comply with the FTC’s demand that YouTube take responsibility for the information gathered on a site that counts children among its most frequent audiences. Wording in the email suggests that YouTube is “helping” creators comply with COPPA while also meeting the demands the Federal Trade Commission placed on YouTube as a media company.

    YouTube will use an algorithm to scan for child-centric content and flag it as such if the creator of the video has not already done so. The email reminds creators to be vigilant about properly tagging their videos if they are made for children, as failure to do so could put them in violation of the FTC’s demands.

    The FTC has outlined what constitutes children’s content and YouTube has that information available on their support page. YouTube’s announcement briefly defines children’s content as:

    • It is directed to children as the primary audience (e.g. videos for preschoolers).
    • It is directed to children but children are a secondary audience (e.g. cartoon video that primarily targets teenagers but is also intended for younger kids).

    YouTube’s guidelines state that it may override content creators’ settings if their content seems to be geared toward kids but isn’t marked as such. This could result in creators being demonetized or held accountable in some other way for not properly categorizing their content.

    What Parents Should Know

    The FTC fined YouTube for its failure to comply with COPPA and told the company it must have a plan by next year to keep children’s data private on its site. Many thought YouTube Kids was the solution, but so few parents actually used the kids’ version of YouTube that children remain a major audience for YouTube’s main site and app. The information creators give YouTube about their videos and channels will help YouTube know which videos to collect data from for future advertising. Also, the advertising on videos marked as “for children” will be different, using the content of the video as an indicator of the audience rather than viewing data from the viewers themselves.

    These changes, in my opinion, are a step in the right direction for YouTube. Its collection of data from young audiences has been a point of contention for tech safety experts, security and privacy agencies, and family advocacy groups for several years now. The policies handed down by the FTC are in direct response to some of these experts and agencies asking for an investigation into YouTube’s lack of compliance with COPPA.

    As parents we rarely think about our kids’ digital footprints being collected and used against them, but it is happening every time they log on to an app or game. It is important to remember that the trail they leave behind online will follow them for the rest of their lives. The things they buy, the sites they visit, the videos they watch, and the games they play are all being compiled into a profile that will be used to market to them online for years to come. If we remember that our children’s web traffic is being collected, we can take steps to protect them from excessive data collection. Encourage them to use messenger apps that are made just for kids (Facebook Messenger Kids, not WhatsApp or regular FB Messenger). Remind them that what they share online becomes public the moment they share it. Tell them they should only use video and game apps that are intended for children and made by major developers, who are more likely to comply with COPPA. Parents are responsible for the safety of their children, as well as their privacy and security, so take the steps you can to keep their data private.

  • UPDATED: YouTube May Eliminate Targeted Ads on Kids’ Videos

    UPDATE 9-4-2019: This morning the FTC announced a 170 million dollar settlement with Google to end the investigations of YouTube’s children’s data collection practices. At the same time, YouTube announced it is rolling out funding for original children’s programming. YouTube CEO Susan Wojcicki said that the changes proposed by the FTC could be detrimental to much of the ad revenue made by content creators who make videos targeting children. She also said that the changes are rolling out slowly over four months to give creators time to adjust their content.

    Child data security advocates are not satisfied with this fine or these changes. They were hoping for more:

    “A plethora of parental concerns about YouTube – from inappropriate content and recommendations to excessive screen time – can all be traced to Google’s business model of using data to maximize watch time and ad revenue,” said Josh Golin, the Executive Director for the Campaign for a Commercial-Free Childhood (CCFC).

    Parents should be aware that the changes to YouTube’s data collection and advertising practices are rolling out slowly but will affect both YouTube and YouTube Kids. My advice, as mentioned in the video below, is that parents pay close attention to the videos their children watch on YouTube. Understand that much of the content they consume is created to advertise products, whether websites, video games, or physical products such as toys, food, and candy. Advertisements will still be geared toward kids based on the videos they choose to watch, much like seeing commercials for toys during Saturday morning cartoons.

    8-23-2019

    YouTube’s data collection policies have garnered attention from media and government agencies alike over the past several months. After some shocking reports about child pornography on the site and restrictions handed down from the FTC, Google is finally taking some real steps to comply with the Children’s Online Privacy Protection Act (COPPA). Bloomberg reported this week that YouTube will be ending targeted ads on videos intended for children.

    Obviously, ads that target viewers use data that has been collected in order to match advertisements to each user. If YouTube is targeting ads to children, it stands to reason that it is collecting information about them as viewers in order to build their advertising profiles in the first place. This data collection is blatantly against COPPA and one of the reasons the site was investigated by the FTC earlier this year.

    YouTube has already cut advertising income from videos that feature disturbing content aimed at children and eliminated comments on videos that feature children. It is estimated that YouTube makes nearly $750 million annually from advertising on children’s videos. Obviously, eliminating those targeted ads could seriously hurt Google’s bottom line, but the company says it is the least damaging option. There are other ways for YouTube to serve somewhat targeted ads to children: it can use ads chosen based on the videos they appear on, tying the kids’ interest in the video itself to the ad being served. Those who have brought complaints against YouTube about its COPPA violations aren’t expected to be satisfied with that solution either.

    The Best Way to Keep Your Kids Safe On Youtube

    What Parents Should Know

    Of course YouTube wants your children to use YouTube Kids. This is how the company protects itself from the very mess it is in now. YouTube says that YouTube Kids doesn’t collect data from viewers and only shows ads that relate to the video users are watching. Even so, my recommendation is that your kids only watch YouTube in a place where everyone can see what they are watching. If inappropriate content comes up you will want to see what it is, so you can talk to your child about what they saw and how to avoid seeing it in the first place.

    Another option is to use YouTube Premium to eliminate ads altogether. We use this so that when we build a playlist of videos for our kids, we can be sure they’ll only see what we selected and not some video ad for something we may not approve of. YouTube is trying all it can to keep its ad-based ecosystem alive while staying off dangerous-apps lists and out of tech safety experts’ blog posts. Only time will tell if it succeeds. This change could be a very tiny step in the right direction.

  • FB Messenger Kids “Error” Allowed Thousands of Kids to Talk to Unapproved Strangers

    Facebook Messenger Kids was created to give children a safe place to communicate with parent-approved friends through text, stickers, video, and GIFs. This week, however, the kids’ messenger app had to send notifications to thousands of parents about their children having access to strangers in the app.

    What happened is that a technical error allowed kids to create a group message with friends, who could then invite their own friends who, while approved for them, may not have been approved by the parents of the first child. Confusing? Yes, and that is possibly why the flaw was even possible in the first place. Facebook says it has alerted parents whose children may have had this type of interaction and disabled any chats that were created using this flaw. The story isn’t over, though, as some are calling for the FTC to look into the error since it may have resulted in a COPPA violation.

    Released Today: Facebook Messenger For Kids!

    What Parents Should Know

    The moral of this story centers on trust. While we may trust our children, we can’t always trust who our kids are in contact with, and we definitely shouldn’t blindly trust the companies who make the hardware and software our children are using. The whole point of an app like Messenger Kids is that it gives parents control. When that control is undermined, even by a “technical error,” that is a severe violation. We can, however, take action to protect our kids from the dangerous effects that could come from these errors.

    I recommend having a copy of the Messenger Kids app on your phone logged in to your child’s account. My wife and I are each logged in to one of our kids’ Messenger Kids apps and can see when they get messages and what the messages are about. We are notified when they receive a message and can see who it is from and even read it. I have, a time or two, jumped into the app to tell a friend to stop messaging since my son was past his allowed social media time for the day. I received a “yes sir,” and there were no more messages until the next day. We also use BARK to monitor their messages and alert us to any dangerous or inappropriate content.

    Parents are gatekeepers. Our job is to make sure our kids grow up with guidance in every area of life. If they aren’t taught how to manage social media and internet use safely, they will struggle to make healthy decisions when they are older. Messenger Kids is a good tool to help your kid learn the right way to use a messenger, but it won’t work if you are uninvolved, pretending that the creators of the app only have your kid’s best interest in mind. The truth is that they want to provide you a service to make a profit. We cannot overlook that. It is our responsibility, and ours alone, to teach our kids how to be safe online. We should take it seriously. We should hold companies accountable when they have errors that put our kids at risk, but ultimately we should be the ones making sure our children are protected on every app, site, and piece of software they use.

  • YouTube May Have to Stop Making Money Off Our Kids

    The US Federal Trade Commission is finishing an investigation into YouTube’s children’s data and ad policies, and at least one member of Congress is now asking YouTube to make some major changes. Massachusetts Senator Ed Markey has officially requested that the FTC enforce some major policy shifts on Google over how YouTube handles advertisements to children and the collection of kids’ data.

    The request states that:

    Personal information about a child can be leveraged to hook consumers for years to come, so it is incumbent upon the FTC to enforce federal law and act as a check for the ever increasing appetite for children’s data. – FTC YouTube COPPA 2019

    This three-page document outlines rules the FTC should enforce upon YouTube to keep it compliant with COPPA and to better regulate its child advertising practices. The rules include requiring Google to stop collecting data from users under 13, requiring YouTube to develop a way to identify users under 13 and implement COPPA-compliant policies, disallowing influencers from marketing products geared toward children under 13, and forcing Google to create a fund for developing children’s content that is ad-free and COPPA compliant.

    COPPA imposes certain requirements on operators of websites or online services directed to children under 13 years of age, and on operators of other websites or online services that have actual knowledge that they are collecting personal information online from a child under 13 years of age.

    What Parents Should Know

    Parents have to be intentional about teaching their children about online privacy. Regulations from the FTC will likely be coming in the near future. Even if these changes aren’t as strict as the ones listed in Senator Markey’s letter, they will still cause major ripples in the YouTube creator and viewer community. YouTube’s usual way of handling these kinds of problems is to “demonetize” videos containing the type of content it is taking heat for. The heat from the FTC right now, though, threatens some of the most profitable channels on any video-sharing platform ever.

    Advertising is how these companies make their money, and collecting data is their sole model for targeting that advertising. If they aren’t allowed to target children anymore, there won’t be much content on YouTube for children at all. Our approach has always been to only allow our kids to watch YouTube videos that we have selected, and they must watch them on the television in the living room. That protects them from any surprises, and we curate the types of videos they are allowed to watch. We also have YouTube Premium, which removes ads. This is helpful since the algorithm that selects which ads show up on which videos often doesn’t take the age of the target audience into account (e.g. an ad for the latest Child’s Play film on a video about kids making slime).

    As I always say, we should hold these companies accountable as much as possible, but it falls to parents to be the responsible ones when it comes to our children’s digital health and online safety. What is your approach to YouTube? Do your kids watch as much as they want? Do you limit their viewership? Do you think this news will affect how much time you allow them to use the app? Let me know your thoughts in the comments below.