WELCOME to Raising Connected Kids, the podcast that answers your questions about the connected world your kids are growing up in.
Thank you to everyone partnering with BecauseFamily and making these resources possible. Visit BecauseFamily.org/partnership to help us continue protecting your family through free content like this podcast.
QUESTION OF THE WEEK
I get multiple questions a week, sometimes through email or Facebook messages and sometimes face to face at an event or meeting. In this podcast, I'll be answering the most common questions I've had, and even your questions. Email me at BecauseFamily@gmail.com to get your question read and answered on the podcast.
Question: How do social media sites censor content?
User-Generated Content
Generally accepted standards.
Nudity/Sexual Content
Extreme Violence
Hate Speech
Harmful content disguised as kid content.
Keep in Mind:
Location of the company.
Lawsuits and bad PR
More than one reviewer.
Usually flagged by users.
Protecting viewers from dangerous misinformation, harmful visual content, and damaging messages is not censorship. It is the company's right to protect its image and intellectual property.
These companies can make decisions based on their own guidelines. They are not government entities. They CAN censor content if they want, as long as they have stated their reasons in their terms of service.
CONCLUSION
Thank you again for listening to Raising Connected Kids, the podcast that answers your questions about the connected world your kids are growing up in. Subscribe on iTunes, Stitcher, YouTube, Spotify, Google Podcasts, and anywhere else you listen to podcasts. Like/Follow us on Facebook, Twitter, and Instagram. Share the show with your friends and leave a review on your favorite podcast app to help spread the word. Remember to visit BecauseFamily.org/partnership to partner with us as we protect children and teenagers by bridging the technology gap between them and their parents.
I am an avid YouTube viewer. I get most of my entertainment from the video streaming service, watching gaming videos, D&D streams, and educational tutorials. I have noticed a trend since YouTube changed its policies for creators to be more responsible for their channel’s content as it pertains to advertising to children.
Since YouTube cannot collect viewer data from videos that are intended for children, the company has asked creators to label whether their videos are for kids or not. They are also using an algorithm to view popular videos and identify the content as meant for kids or not meant for kids. This algorithm has content creators concerned for the viability of their channel. This has caused them to be more blatant with crude content and swearing in order to make it very obvious to this algorithm that their video is not meant for children.
One YouTuber that I enjoy watching, partially because he isn’t overly crude, has been starting his videos with strings of swear words and jokingly saying “This video isn’t for kids YouTube, just be aware, not meant for children.” One of the reasons he feels the need to say this so blatantly is because he plays video games on his channel that may appeal to children. The images of the game alone could lead a person or artificial intelligent software to believe the video was made for children even though that isn’t this creator’s main target audience. Another YouTube content creator that I know has lamented on social media that his channel, which is family-friendly, has lost hundreds of dollars monthly in revenue since YouTube changed their policies.
SirWillow is a family-friendly YouTube channel with nearly 30,000 subscribers and over 4.5 million views.
Would you be willing to tell me by what percentage your ad revenue went down when YouTube changed its policies?
I’m still waiting to see how it all sorts out, but right now in my case I’m looking at about a 30% drop, but it’s in a state of flux. What will be telling will be the end of January when the full force of the new policies kicks in.
How have the changes to the ad policy changed your process for making videos?
In my case, it hasn’t changed any of my process. But I may not be the norm in that regard. I know several that do YouTube “full time” and for them, it has meant some drastic changes. I know at least one that is likely going to shut down, another is cutting back on YouTube to increase time in other projects. For me, it’s been a hobby that has brought in a part-time job income, and while the income has dropped it’s still going to fit the same role. It has meant a change in how many videos though. I am cutting back my production some from 10-12 videos a month to closer to 7.
Your videos are “family-friendly.” Do you think that YouTube is becoming a less friendly place for families in general or is it mostly up to creators?
I absolutely think YouTube is becoming less family-friendly, and these changes are going to directly impact that and make it worse. The changes are going to pretty much destroy financial benefits for anyone producing kid-focused videos, and there are a lot of family-friendly channels that are going to get caught in that backwash and cut back or stop producing. It’s also going to be harder to find kid and family-friendly videos because of all of the blocks that will remove them from the normal algorithms that recommend videos.
And there are a number of producers who have, as you mentioned, increased cursing and crude language, along with images and subjects, to make it clear that they aren't "kid-focused." It's going to make kid and family-friendly content hard to find, and hard to produce and make money on.
My thanks to SirWillow for answering these questions for me. He does videos about theme parks and what it has been like working at theme parks. Go check out his channel!
What Parents Should Know
It should be very clear by now that YouTube isn’t intended for children. It is becoming harder and harder for people who make videos for kids to sustain a profitable channel on the site. This is causing some different reactions. Some kids’ channels are switching to a subscription method where you can sign up to pay monthly for more content from them. Others are changing to Facebook or Twitch because of their less strict ad policies.
The only real way to be sure your kids are watching videos intended for their age is for you to control what they are viewing. Legally, young kids (under 13) are supposed to use only apps intended for their age group. The legal responsibility, however, doesn't fall to our kids or even to us as their parents; it falls to the company. The FTC has handed out hundreds of millions of dollars in fines to companies for illegally collecting data from children. Those companies are being investigated and forced to make changes. The changes seem like they should be good for the safety of our children, but so far they mostly protect the companies themselves from the repercussions of violating child safety laws.
When the safety measures only prevent advertising data from being collected, they may be intended to protect children, but in practice they seem to increase the volatility of the content on the service while protecting only the service itself. Parents are the only true guardians of our kids' hearts and minds. The only way to protect them from adult content and crude language in the videos they watch is to take responsibility for their screen time ourselves. Here are some tips:
Only allow screens in a public area.
Limit headphone use so you can hear what they are watching.
Build playlists on YouTube to ensure they are only watching videos meant for kids.
Use apps like PBS Kids or DisneyPlus to keep them watching family-friendly videos.
Use YouTube Kids instead of YouTube; while not foolproof, it's a far better option than basic YouTube.
Limit the amount of time watching videos; the more time spent on YouTube the more chance of coming across inappropriate content.
Parents should take the steps necessary to protect their children online. Companies should be held responsible for their advertising practices and the content on their sites and apps, but the responsibility for protecting our children falls strictly to parents. When the measures companies take to protect kids backfire, causing creators to lose money unless they swear, use violent or sexist language, or show adult images in their videos, those measures don't protect our kids; they make the app more dangerous. Parents are the gatekeepers. Protect your children.
Starting today, all creators are required to mark their content as made for kids or not made for kids in YouTube Studio. -YouTube Creators Email
YouTube will be limiting the data they collect from videos that target children. This is an effort to comply with the FTC's demand that they take responsibility for the information gathered on a site that counts children among its most frequent audience members. Wording in the email suggests that YouTube is "helping" creators comply with COPPA as well as meeting the demands the Federal Trade Commission placed on YouTube as a media company.
YouTube will use an algorithm to monitor videos for child-centric content and flag them as such if they are not flagged by the creator. The email reminds creators to be vigilant about properly tagging their videos if they are made for children, as failure to comply could put them in violation of the FTC's demands.
The FTC has outlined what constitutes children’s content and YouTube has that information available on their support page. YouTube’s announcement briefly defines children’s content as:
• It is directed to children as the primary audience (e.g. videos for preschoolers).
• It is directed to children but children are a secondary audience (e.g. cartoon video that primarily targets teenagers but is also intended for younger kids).
YouTube's guidelines state that they may override content creators' settings if their content seems to be geared toward kids but isn't marked as such. This could result in creators being demonetized or held accountable in some other way for not properly categorizing their content.
What Parents Should Know
The FTC fined YouTube for failing to comply with COPPA and told them they had to have a plan by next year to keep children's data private on their site. Many thought YouTube Kids was the solution, but so few parents actually used the kids' version that children remain a major audience for YouTube's main site and app. The information creators give YouTube about their videos and channels will help YouTube know which videos to collect advertising data from in the future. Also, the advertising on videos marked "for children" will be different, using the content of the video as an indicator of the audience rather than viewing data from the viewers themselves.
These changes, in my opinion, are a step in the right direction for YouTube. Its collection of data from young audiences has been a point of contention for tech safety experts, security and privacy agencies, and family advocacy groups for several years now. The policies handed down by the FTC are a direct response to some of these experts and agencies asking for an investigation into YouTube's lack of compliance with COPPA.
As parents, we rarely think about our kids' digital footprints being collected and used against them, but it is happening every time they log on to an app or game. It is important to remember that the trail they leave behind online will follow them for the rest of their lives. The things they buy, the sites they visit, the videos they watch, and the games they play are all being compiled into a profile that will be used to market to them online for years to come. If parents remember that our children's web traffic is being collected, we can take steps to protect them from excessive data collection. Encourage them to use messenger apps made just for kids (Facebook Messenger Kids, not WhatsApp or Facebook Messenger). Remind them that what they share online becomes public the moment they share it. Tell them to use only video and game apps that are intended for children and made by major developers, who are more likely to comply with COPPA. Parents are responsible for the safety of their children, as well as their privacy and security, so take the steps you can to keep their data private.
UPDATE 9-4-2019: This morning the FTC announced a $170 million settlement with Google to end the investigations into YouTube's children's data collection practices. At the same time, YouTube announced they are rolling out funding for original children's programming. YouTube CEO Susan Wojcicki said that the changes proposed by the FTC could be detrimental to much of the ad revenue made by content creators who make videos targeting children. She also said that the changes will roll out slowly over four months to give creators time to adjust their content.
Child data security advocates are not satisfied with this fine or these changes. They were hoping for more:
“A plethora of parental concerns about YouTube – from inappropriate content and recommendations to excessive screen time – can all be traced to Google’s business model of using data to maximize watch time and ad revenue,” said Josh Golin, the Executive Director for the Campaign for a Commercial-Free Childhood (CCFC).
Parents should be aware that the changes to YouTube's data collection and advertising practices are rolling out slowly but will affect both YouTube and YouTube Kids. My advice, as mentioned in the video below, is that parents pay close attention to the videos their children watch on YouTube. Understand that much of the content they consume is created to advertise products, whether websites, video games, or physical products such as toys, food, or candy. Advertisements will still be geared toward kids based on the videos they choose to watch, much like seeing commercials for toys during Saturday morning cartoons.
8-23-2019
YouTube's data collection policies have garnered attention from media and government agencies alike over the past several months. After some shocking reports about child exploitation on the site and restrictions handed down from the FTC, Google is finally taking some real steps to comply with the Children's Online Privacy Protection Act (COPPA). Bloomberg reported this week that YouTube will be ending targeted ads on videos intended for children.
Obviously, ads that target viewers use collected data to serve advertisements to those users. If YouTube is targeting ads to children, it stands to reason that the company is collecting information about those children as viewers in order to build their advertising profiles in the first place. This data collection is blatantly against COPPA and one of the reasons the site was investigated by the FTC earlier this year.
YouTube has already cut advertising income from videos that feature disturbing content aimed at children and eliminated comments on videos that feature children. It is estimated that YouTube makes nearly $750 million annually from advertising on children's videos. Obviously, eliminating those targeted ads could seriously hurt Google's bottom line, but the company says it is the least damaging option. There are other ways for YouTube to serve somewhat targeted ads to children: the company can use ads chosen based on the videos they appear on, tying the kids' interest in the video itself to the ad that is served. Those who have brought complaints against YouTube about its COPPA violations aren't expected to be satisfied with that solution either.
Of course YouTube wants your children to use YouTube Kids. This is how they protect themselves from the very mess they are in now. They say that YouTube Kids doesn't collect data from viewers and only shows ads related to the video being watched. Even so, my recommendation is that your kids only watch YouTube in a place where everyone can see what they are watching. If inappropriate content comes up, you will want to see what it is so you can talk to your child about what they saw and how to avoid it in the first place.
Another option is to use YouTube Premium to eliminate ads altogether. We use it so that when we build a playlist of videos for our kids, we can be sure they'll only see what we selected and not an ad for something we may not approve of. YouTube is trying everything it can to keep its ad-based ecosystem alive while staying off dangerous-apps lists and out of tech safety experts' blog posts. Only time will tell if they are able to do so. This change could be a very small step in the right direction.
The US Federal Trade Commission is finishing an investigation into YouTube's children's data and ad policies, and at least one member of Congress is now asking YouTube to make some major changes. Massachusetts Senator Ed Markey has officially requested that the FTC enforce some major policy shifts on Google for how YouTube handles advertisements to children and the collection of kids' data.
The request states that:
Personal information about a child can be leveraged to hook consumers for years to come, so it is incumbent upon the FTC to enforce federal law and act as a check for the ever increasing appetite for children's data. – FTC YouTube COPPA 2019
This three-page document outlines a plan for rules that the FTC should enforce upon YouTube in order to keep them compliant with COPPA and to better regulate their child advertising practices. The rules include requiring Google to stop collecting data from users under 13, requiring YouTube to develop a way to identify users under 13 and implement COPPA compliant policies, disallow influencers from marketing products geared towards children under 13, and forcing Google to create a fund for developing content meant for children that is ad-free and COPPA compliant.
COPPA imposes certain requirements on operators of websites or online services directed to children under 13 years of age, and on operators of other websites or online services that have actual knowledge that they are collecting personal information online from a child under 13 years of age.
What Parents Should Know
Parents have to be intentional about teaching their children about online privacy. Regulations from the FTC will likely be coming in the near future. Even if these changes aren't as strict as the ones listed in the letter from Senator Markey, they will still cause major ripples in the YouTube creator and viewer community. The way YouTube seems to handle these kinds of problems is by "demonetizing" videos that contain the type of content it is taking heat about. The heat coming from the FTC right now, though, concerns some of the most profitable channels on any video sharing platform ever.
Advertising is the way these companies make their money, and collecting data is their sole model for targeting that advertising. If they aren't allowed to target children anymore, then there won't be much content on YouTube for children at all. Our approach has always been to only allow our kids to watch YouTube videos that we have selected, and they must watch them on the television in the living room. That protects them from surprises, and we curate the types of videos they are allowed to watch. We also have YouTube Premium, which removes ads. This is helpful since the algorithm that selects which ads show up on which videos often doesn't take the age of the target audience into account (e.g., an ad for the latest Child's Play film on a video about kids making slime).
As I always say, we should hold these companies accountable as much as possible but it falls to parents to be the responsible ones when it comes to our children’s digital health and online safety. What is your approach to YouTube, do your kids watch as much as they want? Do you limit their viewership on YouTube? Do you think this news will affect how much time you allow them to use the app? Let me know your thoughts in the comments below.
WWDC was held last week at Apple's headquarters in Cupertino, California. Every year, the tech giant hosts a conference for developers and media from all over the world. The company's project managers and chief officers take their turns on stage to discuss what they've been working on over the past year, building hype around Apple's products and software. Much of what is announced at WWDC targets developers and "tech-heads" who can't wait to find out how to make apps for Apple products or what the next big thing will be. Some of Apple's new features, however, could bring some peace of mind to parents. Here is a breakdown:
Apple TV+
The Apple TV streaming device has been great for viewing other services, but Apple's own streaming offerings have been lackluster. One thing that has been missing for a while is the ability to make separate accounts or profiles for viewers, including children. Apple announced at WWDC that this is changing: you will be able to create profiles for every member of your family. Viewing history and suggestions will be sorted by account, and best of all, your recommendations won't be overloaded with the shows your children love to watch.
Apple Music/iTunes
iTunes is officially no more, as Apple is separating its offerings into multiple apps. Books, Podcasts, and Music will all be separate on macOS. When you plug in your iPhone to sync with your Mac, nothing will appear to happen; your phone will sync in the background. It has become pretty apparent that most folks don't need software to manage a music collection. Streaming has taken over, and iTunes wasn't very good at that job. Apple Music is taking over the music service, and Podcasts is mainly accessed through the mobile app, not on the desktop.
iTunes has been around since 2001 and while there are those who have become used to the software, most have been aggravated by frequent updates and overuse of computer resources. Apple is likely accurate in thinking the software won’t be missed by very many people.
Apple Arcade
Apple is also working its way into the video game streaming world with Apple Arcade, due to release this fall. Apple Arcade will consist of a series of exclusive games made just for Apple's ecosystem and will be playable on your phone, tablet, Mac, or Apple TV. Apple has a controller you can use with Apple TV but is adding support for PlayStation 4 and Xbox One controllers as well. The 100 or so available games look a bit weak, but Apple is sure to find developers willing to put out quality content before too long. They're going to have to in order to compete with Google's Stadia and the new services coming soon from Sony and Microsoft.
iOS 13
Probably the most relevant update from WWDC has to do with Apple's latest smartphone operating system, iOS 13. The software boasts a new dark mode, faster app launches and downloads, faster Face ID unlocking, and a new (to Apple, at least) "swipe"-style typing system.
Dark Mode is cool, and faster downloads and unlocking are great, but the iOS update doesn't really offer anything relevant to parents besides its focus on data security. More on that below.
Photos and Video
Photos in iOS 13 is getting an overhaul as well, with the ability to pinch to zoom in your galleries and a new sorting method that groups photos by the date they were taken. Photos will also include a new smart gallery that removes images like screenshots from view, showing only the photos you've taken with your camera.
Privacy is a Key Theme
Every update at this year's WWDC had privacy as a key theme. Directors and developers mentioned over and over what Apple does and doesn't do with your data. Apple Maps uses encrypted data to help you find your way, the Photos app does its date and location tracking locally, and they even announced a new "Sign in with Apple" feature that lets you sign in with Face ID and create accounts with individual dummy email addresses.
Data security and privacy have been in the news a lot lately, and Apple has been very vocal about its desire to keep users' information secure. Whether it is a direct jab at other tech companies that have made most of their money by collecting and selling data, or just an honest desire to maintain users' trust, the result should be a bit more confidence that your information is safe when you use Apple's products. I always advise, however, that you continue to make efforts to protect your own privacy. Be careful what you share online, turn off location access for apps that don't need that information to work properly, and most importantly, teach this approach to privacy to your children.
You can listen to this article as a podcast on Family Tech Update.
You can subscribe on Stitcher, Spotify, or Apple Podcasts using the links below the player.
The term "meme" was coined by Richard Dawkins in his book "The Selfish Gene," where it described an idea or piece of culture that spreads from person to person. Online, the word has come to mean any form of media passed around until it reaches a massive level of popularity; nowadays we would call that going viral. It is difficult to identify a single meme as the first, or even to trace how some of today's most popular memes got their start. In this article we'll look at the history of memes, how we got to where we are, and what parents need to know about them. Keep in mind that you can see some meme examples in the video above.
History of the Meme
It didn't take long once the internet became widely available for memes to become a major part of how people spent their time online. The 1990s gave us memes like the dancing baby, motivational posters, and the Hamster Dance, passed along in emails and forums. These images, videos, and GIFs traveled from person to person and inbox to inbox, shooting this silly content to meme stardom.
Then came the 2000s; some would say this was the golden age of memes due to the rise of YouTube, social media, and viral videos. We went from sharing content within a limited-access forum or an email contact list to sharing it on public social media pages, where it could be re-shared over and over to thousands or millions of people. This period blessed us with the Rickroll, Chuck Norris jokes, "Turn Down for What," cat videos, and Vine videos.
We are currently living in the age of the modern meme. Most memes now originate on Reddit before becoming popular on other social media sites, and they are going mainstream in television, radio, politics, and marketing. Memes are used to promote ideological ideas. Memes like the Harambe meme give those who are bothered by something in society an outlet to express their beliefs or concerns. Politicians even capitalize on the popularity of their own memes, sharing them on their social media accounts to gain recognition and strengthen support.
The Dank Meme
Dank usually means dark, damp, and gross, but when it comes to memes, dank is a positive term. A dank meme is usually one that can be used and reused, remixed with other memes. Sometimes a popular sound clip or song from one meme will make its way through a whole series of different videos, like the "oof" of a dying Roblox character being dubbed over videos of people falling or otherwise hurting themselves.
The memes you see gained popularity on social media through this kind of reuse. That condescending Willy Wonka image with someone's sarcastic comment typed onto it is a dank meme, having been reimagined many times and gaining popularity with each pass.
What Parents Should Know
Memes are an easy way to express yourself. They can be a fun way for kids to have a laugh or share what they think about certain issues. My problem with some memes is that they tend to oversimplify complex concepts. Something as complicated as a political belief gets packaged as a meme and expressed in a shallow, unhelpful way. The meme is a limited genre, allowing only so much space for sharing what you think. This can cause confusion and can ultimately be polarizing.
Memes also have a tendency to trap us in circular reasoning. We share memes we find funny because of the statement they make; this tells the algorithm of the social media platform we use that we want to see more memes like them. We then end up fed a steady diet of the same thoughts, repackaged as dank memes, with our views never questioned or challenged in a way that could be healthy and help shape who we are.
Finally, we have to be careful, because memes can often be very adult-oriented. Memes are an expression limited to those who understand them, and when we head down the meme rabbit hole, whether on Reddit or YouTube, we can find ourselves in some strange and even dark places. I am not squeamish, and there are a lot of memes I'm a fan of and share regularly when I see them repackaged in a way I find humorous. I did, however, run into some content while researching this article that just made me feel stupider for having seen it. See what I go through to help you out?
Thanks for reading. Share this article with a friend who needs to know what a Dank Meme is.
On March 19th, Google announced their latest product: Stadia. The promise of Stadia is to allow people to play AAA games (Assassin’s Creed, Fortnite, etc.) without having to buy a dedicated gaming console or PC. How does Google plan to deliver on this promise? With Chrome and YouTube.
Google has stated that Stadia is “the future of gaming.” I agree. Young adults are used to subscribing to services and streaming their entertainment and Stadia is the next step. Kids already watch hours of gaming content on YouTube every day, why not add the ability to play those games too?
What We Know Right Now
We don’t know a lot about Stadia right now but what we do know is pretty impressive.
A high-speed Internet connection will be required.
Up to 4K HDR at 60fps.
Play on multiple devices: PCs, laptops, tablets, and smartphones will be supported.
No need to download games or wait for updates.
You'll be able to use any USB controller connected to your computer.
There will be a dedicated wireless controller.
Stadia will be available this year.
What We Don’t Know Right Now
Despite all the excitement around this announcement, there are many things we don’t know.
The price of the service.
The price of the controller.
Games available at launch.
Supported mobile devices at launch.
Release date.
Minimum Internet connection speed.
Podcast Episode:
What Parents Need to Know
Your kids are going to want this, especially if they watch gameplay videos on YouTube. Being able to instantly play a game that one of their favorite streamers is playing and try that special move is very appealing.
If the price is right, this could be an affordable alternative to purchasing a gaming console. Being able to play hundreds of games for $50-$60 a month is more affordable than buying a $600 console and a game or two every month.
The Stadia controller has a streaming button, which means your kids could be online and streaming their game and voice instantly. In fact, they could even join in a game with another person. Parents should be aware of this feature and take measures to block it if they don't want their kids to live-stream.
Google has been improving their products with better parental controls every year. Parents should familiarize themselves with those parental controls and enable any restrictions they deem necessary. You may want to consider adding time limits, enabling ratings limits, and disabling some of the streaming and cooperative features.
A YouTuber (sensitive content warning) has found evidence of a vast community of child predators on YouTube, along with viewers who seek out content containing child exploitation. They are using comment sections and timestamps to lead each other to actual child pornography. They start with a search for a simple, popular YouTube trend, and the algorithm YouTube uses to connect viewers with similar content will eventually propose a video of kids that these viewers find appealing. They then click timestamps in the comments section to jump to parts of the video that seem innocent but are, unfortunately, exactly what these predators have been looking for.
YouTube’s response didn’t come until after advertisers began pulling their ads. YouTube started by removing some of the videos and some of the comments, as well as demonetizing (pausing ad revenue on) videos on which these comments are posted. YouTubers are concerned because some of their videos have been or could be demonetized because of a commenter’s words, something they can’t control. Whether you want to blame the site, the viewers, or the makers of the videos, the fact that the conversation goes straight to money is a serious problem.
I think the money isn’t the real issue. I understand hitting them where it hurts, and YouTube should do something, but we have to take some responsibility too. I believe we need to have a serious conversation about what types of videos we allow our children to post publicly, and we should be very concerned about the types of people who watch those videos. Watch the video above to hear more of my thoughts on this issue.
The video that initially exposed this issue is below. Be warned that it’s disturbing to watch and contains adult language.
I field messages and requests all week long from parents who want the latest tools for keeping their kids safe online. They ask about everything from YouTube to Instagram to Snapchat, and they want apps that will monitor social media use, block adult content, and limit screen time. While there are resources to help parents in each of those departments, some apps just aren’t intended for your younger child. Unfortunately, many parents have a hard time accepting that fact.
Streaming Videos
Let’s look at YouTube as our first example. The video app was created in 2005 as a place for anyone to upload short videos to share with their friends. Google purchased YouTube in 2006, and Social Media became popular soon after, rocketing YouTube to the successful streaming platform it has become. The site is loaded with videos from filmmakers, vloggers, video gamers, makeup artists, geeks, professionals, educators, ministers, animators, artists: basically any category you can think of. It has evolved into an unstoppable force, with hundreds of hours of footage uploaded every single minute. YouTube has come under fire for some of its content being too mature or sensitive, so it has employed algorithms to keep tabs on inappropriate videos. It also released an app for children called YouTube Kids. That app has seen its share of controversy too, after YouTube was unable to keep sensitive material from showing up in its videos.
YouTube obviously wasn’t intended for young viewers. It is a site populated primarily by videos uploaded by its users. Some companies that make content for kids use YouTube, but this is a choice those companies make in response to the popularity of the platform. It’s an attitude that says: “Kids are there, so we should be there too.” The goal is to reach the audience already there, not necessarily to build an audience on YouTube. There are no real parental controls (safe search is mostly useless), and videos labeled as kid-friendly get that label without any human eyes ever seeing the entire video. The only time a content reviewer sees a video is when enough users of the site have flagged it as inappropriate. Allowing your kids to watch YouTube on their own is a risk that many parents don’t even realize they are taking.
What about Social Media?
Snapchat, Instagram, and Facebook are all the same. They, like YouTube, feature content created and posted by the users of the service. This “User Generated Content” can vary from political or religious views, to silly cat videos or memes, to random personal updates that mean nothing to anyone. People also post updates about serious mental health issues, they share their plans to harm themselves or others, they post images of themselves in compromising situations, and that’s just what people post publicly. Private messaging contains content that people send when they think nobody except those they trust is watching. Private messaging is how predators groom their victims. It’s how the out-of-control teenage boy convinces the girl to send him inappropriate pictures of herself. Social Media is intended to be a place to connect with people, some you may know, some you don’t. It is meant to be a public forum, and what is meant to be private is meant to stay completely private. This is where the problems come in when parents ask for ways to monitor their kids’ social media.
Age Rating vs Terms and Agreements
I see a lot of parents giving their kids access to social media and other online activities when they reach the age of 13. This is based on the fact that the terms and agreements these sites have you approve before making an account list 13 as the minimum age to use the service. A common mistake parents make is thinking that this age is meant to protect their kids from content on the site when, in fact, it’s intended to protect the company from having data and information on kids under the age of 13. COPPA (the Children’s Online Privacy Protection Act) says that companies can’t collect and use information from kids under 13 without parental consent. If a company says you can’t use the site if you’re under 13, then it can do whatever it wants with all of that data, and if your kid is underage, it isn’t the company’s fault. You ignored the Terms and Agreements when you allowed them to use the site.
Age rating is the age recommendation you’ll see in the app store when you are downloading an app. This age restriction is based on the actual content in the app, not any legal requirements for the company. The usual standard is that apps populated by user-generated content are rated 17+, because the company can’t guarantee that what is seen on its product won’t be considered adult content. When we allow our kids to use apps that contain user-generated content, we are allowing them to be subject to the opinions, behavior, and whims of everyone else who uses that app.
Parental Involvement Before Parental Control
When I am asked to help parents protect their kids in apps that are obviously not made for children, I feel like I’m being asked to give parents a suit their kids can wear to protect them while they play in a burning building. I get it. It isn’t easy to tell your kids they can’t do something they want to do. “My friends are all on Snapchat.” Or the one that irritates me to no end: “The teacher/coach says I have to use Facebook to get the homework/practice schedule.” Sometimes we just have to say no. It is difficult to set the boundaries and limits that keep our kids safe, but if we have the right attitude about what we’re protecting them from, it becomes easier. Social Media, YouTube, video games rated M for Mature: none of these things are intended for people under the age of 17, and when we allow our kids to use these products, we open them up to a world that is meant for adults.
It is difficult for algorithms to catch nudity or violence in uploaded videos, and Social Media sites and private messaging apps go to great lengths to keep prying eyes from seeing what is being sent. This makes parental monitoring software hard to develop. Unfortunately, some burning buildings are just too dangerous, and there isn’t much that can be done to protect you if you’re inside. If you aren’t OK with your child seeing content that is meant for grown-ups, then I recommend uninstalling the app instead of trying to find software that keeps it from doing what it was intended to do.