Tag: teens

  • The Worst Thing About CES2020!

    I spent five days walking the show floor and attending conference sessions at CES2020, the largest trade show in the world. I saw all kinds of technology, from smart cars and smart homes to toys and educational products for kids. Those exhibitors are why I attend CES: I'm there to learn how their products can benefit our kids in the future. Tech is a genuinely useful tool for education, entertainment, and development. Many kids are learning in ways they couldn't before, and children are getting opportunities they didn't have before because of VR and AR classrooms. Technology is, and always will be, a part of our lives, and the world is only getting more tech-centric. The worst thing about CES2020 seems to be that parents' concerns about the amount of tech in their kids' lives are being ignored.

    The Worst Thing about CES2020

    I heard a lot of mixed messages at CES this year, especially at the Living in Digital Times "Family Tech Summit." It has become increasingly frustrating to listen to software developers and hardware engineers talk about how their new technology is going to change the world. Much of this technology is very neat and, as mentioned, can be helpful. But there is also a small percentage of people on the stages at CES warning us that our kids are becoming too dependent on this technology. Parents and teachers are concerned because they feel technology is moving far faster than they can keep up. The experts at CES don't seem to understand the anxiety caused by new, "world-changing" technology being announced every single year.

    Most technology announced at CES is a new take on the same things we've had for the past ten years. I walked into the Family Tech Summit expecting to hear about which new products will be best for our kids. Instead, I heard what will be best for these developers and companies: how to market and close sales with their new products. I did hear from a few people about ways to protect our kids on the technology we allow them to use.

    Unfortunately, they were given very little time, and they were followed by someone who got on stage simply to celebrate the latest voice-control tech. This "expert" explained how great it is for our kids. He marginalized parents' concerns by calling them misguided, then touted the fact that parents seem concerned but don't take action to protect their kids. He ignored the fact that companies advertise their products as safe while building in parental controls that are weak and hard to set up, and then wonder why they show up in the news when a kid comes across adult content on a smart speaker or is visited by a stranger through an in-room nanny cam.

    It wasn’t all bad.

    There were highlights at CES2020, though. Dr. Amanda Gummer of the Good Toy Guide spoke about using tech to encourage kids to play and learn. Sean Herman, author of "Screen Captured," shared how his own kids' attention to screens led him to start Kinzoo, a messenger app that "turns screen time into family time." I met Carrol Titus, founder of GoldenPoppy Inc., who is making augmented reality games to teach physics, programming, and positive self-awareness. I also enjoyed speaking with Ahren Hoffman and Sue Warfield from the American Specialty Toy Retailing Association (ASTRA). We talked about the lack of attention given to equipping parents to learn and use tech wisely, and about the benefits of kids playing off screens, especially young children.

    Everyone can say what they want about screen time and its benefits or risks. The truth as I see it is that technology should enhance our play and education, not replace it. Parents aren't freaking out because their kids are spending too much time watching educational videos, and they're not concerned about apps that teach them to read or do math. The concern is the unstoppable flood of entertainment flying at our children from toy stores and app stores: entertainment that has no intention of teaching anything, just using up your child's time and attention to show them ads or sell them access to more entertainment. I understand that many want to see tech become the new norm for education, recreation, entertainment, and everything else.

    The issue is that we aren't currently promoting balance: certainly not at CES2020, and definitely not in our app stores or on the shelves of our retailers. Once again, it falls to us as parents to take the first step toward a healthy attitude toward tech. Digital wellness is our responsibility, and the more I hear from app developers and toy makers, the more I am sure they won't be taking it seriously, not really, so we have to.


    If you're concerned about what your kids are doing online, be sure to check out Accountable2You.com. It is my favorite accountability software and will help you keep a close eye on the websites your kids view.

  • BecauseFamily 2019 Year in Review

    2019 was an incredible year! Thank you, our readers, for supporting our organization by reading, watching, listening to, and sharing our posts. We have been able to help thousands of families be internet safe in 2019. The infographic below is a celebration of the advancements our organization has made this year. Be sure to watch the video above to hear about all of the exciting new things in the works for 2020.


  • YouTube's Children's Privacy Policies Are Causing Content Creators to Be More Crude in Their Videos

    I am an avid YouTube viewer; I get most of my entertainment from the video streaming service, watching gaming videos, D&D streams, and educational tutorials. I have noticed a trend since YouTube changed its policies to make creators more responsible for labeling their channels' content as it pertains to advertising to children.

    Since YouTube cannot collect viewer data from videos intended for children, the company has asked creators to label whether or not their videos are for kids. It is also using an algorithm to scan popular videos and classify their content as meant for kids or not. This algorithm has content creators concerned for the viability of their channels, and it has pushed some to be more blatant with crude content and swearing in order to make it obvious to the algorithm that their videos are not meant for children.

    One YouTuber I enjoy watching, partly because he isn't overly crude, has been starting his videos with strings of swear words and jokingly saying, "This video isn't for kids, YouTube. Just be aware, not meant for children." One reason he feels the need to say this so blatantly is that he plays video games on his channel that may appeal to children. The images of the game alone could lead a person, or an AI system, to believe the video was made for children, even though children aren't this creator's main target audience. Another YouTube content creator I know has lamented on social media that his family-friendly channel has lost hundreds of dollars in monthly revenue since YouTube changed its policies.

    SirWillow is a family-friendly YouTube channel with nearly 30,000 subscribers and over 4.5 million views.

    1. Would you be willing to tell me by what percentage your ad revenue dropped when YouTube changed its policies?

    I’m still waiting to see how it all sorts out, but right now in my case I’m looking at about a 30% drop, but it’s in a state of flux. What will be telling will be the end of January when the full force of the new policies kicks in.

    2. How have the changes to the ad policy changed your process for making videos?

    In my case, it hasn’t changed any of my process.  But I may not be the norm in that regard. I know several that do YouTube “full time” and for them, it has meant some drastic changes.  I know at least one that is likely going to shut down, another is cutting back on YouTube to increase time in other projects. For me, it’s been a hobby that has brought in a part-time job income, and while the income has dropped it’s still going to fit the same role.  It has meant a change in how many videos though. I am cutting back my production some from 10-12 videos a month to closer to 7.

    3. Your videos are "family-friendly." Do you think that YouTube is becoming a less friendly place for families in general, or is it mostly up to creators?

    I absolutely think YouTube is becoming less family-friendly, and these changes are going to directly impact that and make it worse.  The changes are going to pretty much destroy financial benefits for anyone producing kid-focused videos, and there are a lot of family-friendly channels that are going to get caught in that backwash and cut back or stop producing. It’s also going to be harder to find kid and family-friendly videos because of all of the blocks that will remove them from the normal algorithms that recommend videos.

    And there are a number of producers who have, as you mentioned, increased cursing and crude language, along with images and subjects, to make it clear that they aren't "kid-focused." It's going to make kid and family-friendly content hard to find, and hard to produce and make money on.

    My thanks to SirWillow for answering these questions for me. He makes videos about theme parks and what it has been like working at them. Go check out his channel!

    What Parents Should Know

    It should be very clear by now that YouTube isn't intended for children. It is becoming harder and harder for people who make videos for kids to sustain a profitable channel on the site, and this is prompting a range of reactions. Some kids' channels are switching to a subscription model, where you pay monthly for more of their content. Others are moving to Facebook or Twitch because of those platforms' less strict ad policies.

    The only real way to be sure your kids aren't watching videos unintended for their age is for you to control what they are viewing. Legally, young kids (under 13) are supposed to use only apps intended for their age group. The legal responsibility, however, doesn't fall to our kids or even to us as their parents; it falls to the company. The FTC has handed out hundreds of millions of dollars' worth of fines to companies for illegally collecting data from children. They are being investigated and forced to make changes. The changes seem like they should be good for the safety of our children, but so far they are only truly protecting the companies from the repercussions of disobeying child safety laws.

    When the safety measures only prevent advertising data from being collected, they may be intended to protect children, but in practice they seem to be increasing the volatility of the content on the service while protecting only the service itself. Parents are the only true guardians of our kids' hearts and minds. The only way to protect them from adult content and crude language in the videos they watch is to take responsibility for their screen time ourselves. Here are some tips:

    • Only allow screens in a public area. 
    • Limit headphone use so you can hear what they are watching.
    • Build playlists on YouTube to ensure they are only watching videos meant for kids.
    • Use apps like PBS Kids or DisneyPlus to keep them watching family-friendly videos.
    • Use YouTube Kids instead of YouTube; while not foolproof, it's a far better option than basic YouTube.
    • Limit the amount of time watching videos; the more time spent on YouTube the more chance of coming across inappropriate content.

    Parents should take the steps necessary to protect their children online. Companies should be held responsible for their advertising practices and the content on their sites and apps, but the responsibility for protecting our children falls strictly to parents. When the measures companies take to protect kids backfire by causing creators to lose money unless they swear, use violent and sexist language, or show adult images in their videos, those measures don't protect our kids; they make the app more dangerous. Parents are the gatekeepers. Protect your children.

  • Instagram is Adding Useless Age Verification to Comply with Child Privacy Laws

    Users on most social media platforms are supposed to be 13 or older. Some apps have had a form of age verification for a long time; it allowed them to collect data on all of their users without violating child privacy laws, since no one 12 or younger can hold an account. They then, as much discussed on this blog and elsewhere, sell that data to advertisers or use it to sell targeted advertising on their own platforms. Instagram has never had age verification. That is changing as of this week.

    You may have already seen your birthday show up on your profile in the Instagram app. Don’t worry, that information isn’t public, only you can see it. All users will have their birthday information on their profile as of this week. If the birth year used to create your profile shows that you are under the age of 13, your account will be suspended. When setting up a new Instagram account, the app will now ask for you to put in your birth date.

    “Asking for this information will help prevent underage people from joining Instagram, help us keep young people safer and enable more age-appropriate experiences overall,” the company wrote. “In the coming months, we will use the birthday information you share with us to create more tailored experiences, such as education around account controls and recommended privacy settings for young people.”

    Asking for users' ages has long been part of major social media apps like Snapchat, but Instagram hadn't added it to its sign-up process until now. TikTok added age verification after being fined nearly 6 million dollars by the Federal Trade Commission. The problem with these age verification practices is that they rely entirely on users being honest about their age. All it takes is a little bit of math to determine the latest year you could have been born and still be over 12. Enter that fabricated birth date and you're allowed into the app just like anyone else. Parents have been known to lie about their kids' ages to let them have social media accounts; COPPA allows this, as it counts as parental permission. The problem is that the developers of these apps can't tell the difference between a parent making an account for their child and a kid making their own and lying about their age.
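    To illustrate how trivially this kind of check can be defeated, here is a minimal sketch of the "little bit of math" involved. The function name and dates are my own, purely illustrative; this is not Instagram's actual code.

    ```python
    from datetime import date

    def youngest_passing_birthdate(today):
        """Latest birth date that still makes a user appear at least 13."""
        # Same calendar day, 13 years earlier.
        return today.replace(year=today.year - 13)

    # Anyone claiming this date or earlier sails past the age gate.
    print(youngest_passing_birthdate(date(2020, 1, 15)))  # 2007-01-15
    ```

    A child only needs to type a birth year a few years earlier than their real one; nothing in the sign-up flow can tell the difference.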

    What Parents Should Know

    Age verification on social media apps is a hand wave toward regulations, one that depends entirely on users to follow the rules. This means these companies aren't concerned with the safety of users so much as with their own ability to skirt fines and other penalties from the Federal Trade Commission. It is very obvious that these apps are meant to be as open and public as possible. They want as many users as they can get because they aren't social media companies; they are advertising companies. They sell ads, plain and simple. When you sign up to use social media, you are signing up to be advertised to, specifically and aggressively. When we sign our kids up and lie about their age, we are telling these companies to treat them just like any other consumer.

    If you are honest with yourself, the reasons you're allowing your young kids to use social media are pretty weak. Because their friends have it? Because a teacher says that's how they contact students? There are ways around every reason you think ties your hands. All it takes is an understanding of what being on these social media apps means for your kids, and then a little bit of confidence to just say no. Stand up to your kid (you are the parent, after all), or stand up to that teacher or coach. Ask them why they want to contact your 12-year-old on social media anyway. Does that sound appropriate to you? I submit that in nearly any other context it would not be acceptable.

    You are the first line of defense. Advertising and data collection are the main issues the government leans on when it says it is trying to protect children online. There are, however, so many other issues to be concerned with. Pornography is rampant on apps like Snapchat, Instagram, and TikTok. You see report after report of young people discussing suicide, mental health problems, and eating disorders on these apps. This content is just sitting there for our children to see. When you give in and allow them to use social media at an early age, simply because you think it's no big deal or because you trust your child, you are allowing things into their minds that cannot be unseen. You're giving them access to a world that cannot be left behind; once you know about or begin to contemplate these things, they are permanently a part of your psyche. We must do better. We have to be smarter about our children's access to apps with user-generated content, whether it be games, social media, or any other software. We cannot trust software companies to do the right thing; they look after their bottom line first. It is up to us to protect our children. Not the government, not app developers, not the schools, not even police departments or social workers. It is up to you, mom, dad, aunt, uncle, grandma, and grandpa. Only you.

  • Dangerous Random Live Video Chatting Apps are Dominating Social Media

    A reader sent me an article from The Washington Post today, and I wanted to post my response. The article outlines the problems Apple is having keeping "unwanted sexual content" out of apps on its iOS App Store. Monkey, Yubo, and ChatLive are all apps that let you chat live with random people, often connecting you based only on the gender you say you'd like to chat with. The problem with these apps is that most of them have no way to verify your identity, gender, age, or anything else. This means that kids who use these apps are chatting with random strangers, many of whom are much older and have nefarious intentions.

    The complaints in the article center specifically on "unwanted sexual material." As you can imagine, the consequence is often that young kids see images of people in mature circumstances, whether they were seeking that kind of content or not. When you chat with someone at random, you never know who is going to show up on your screen, and when the person who shows up is in a compromising position, you've already seen it; it is impossible to unsee at that point. Our kids are being shown this nonsense while app developers monetize some of the only ways you can filter the content (e.g. Monkey making you spend its in-app currency, "bananas," to select the gender you want to chat with). Those who run the app stores (Google and Apple) often say they do their best to keep apps with inappropriate content off of their stores, especially apps that children use, but once they've labeled an app 17+ they largely shift the responsibility to the adult caring for the child.

    The Monkey App will be a Hotbed for Predators

    What Parents Should Know

    Two years ago I wrote an article about the dangers of the app Monkey and how it would become a hotbed for predators and "unwanted sexual content." Today, Monkey is mentioned in the Washington Post article as one of the main companies with this content in its app: "About 2 percent of all iOS reviews of Monkey, ranked 10th most popular in Apple's social networking category earlier this month, contained reports of unwanted sexual experiences, according to The Post's investigation." Does 2% constitute a "hotbed"? I don't know. But I will say it is cause for great concern, especially since this counts only the reviews that mentioned the problem, mostly from parents who discovered their children had been assaulted with adult content in the app. It doesn't measure those who saw it and didn't report it for one reason or another. Allen Loh, head of global expansions for the Holla Group, which operates the Monkey app, contacted me six months or so ago and assured me they were working to address some of the safety and content concerns within the app. I have reached out again for updated information about these issues but have not yet received a response.

    iOS 12's Screen Time App Changes Everything! | Video

    The only real way to ensure your child is protected from unwanted content in apps like these is to use the restriction settings built into your operating system. Apple's Screen Time has a restrictions setting in which you can set a maximum age rating for the apps your child can download. If your 15-year-old has an iPhone, you can set the restriction to 12+ to ensure that apps rated 17+ won't be available. Android users can use Family Link to set app store restrictions for their younger children; these restrictions, however, are automatically set to "adult" when the child turns 13.
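    The threshold logic behind those restriction settings can be sketched in a few lines. The app names and ratings below are illustrative assumptions, not Apple's or Google's actual data:

    ```python
    def allowed_apps(store, max_rating):
        """Return the apps whose age rating does not exceed the parent's cap.

        `store` is a list of (app_name, minimum_age) pairs; a cap of 12
        blocks anything rated 17+.
        """
        return [name for name, age in store if age <= max_rating]

    store = [("Monkey", 17), ("Yubo", 17), ("PBS Kids", 4), ("ChatLive", 17)]
    print(allowed_apps(store, 12))  # ['PBS Kids']
    ```

    The cap is only as strong as its configuration: raise it to 17 (or let it reset to "adult" at age 13) and every app above comes back.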

    As I always say, the most important thing is communication with your child. You have to make them aware of the dangers of chatting with random strangers on the internet. As obviously dangerous as that sounds, these apps are branded and marketed as a fun way to meet new people. They build an environment that feels like going to the mall or the movies back in our day, except it all happens within the anonymity of an app. Unfortunately, within an app, the weirdo who wouldn't go out in public and make advances at your kids is there waiting to find someone he can groom, send adult pictures to, or violate in some other way. Parents need to create a safe space for our kids to come to when they feel threatened or violated by someone online or in an app they use. Many of the stories of parents finding unwanted sexual content within an app came to light only because the child knew to come to them after seeing something that made them uncomfortable or feel violated. Do everything you can to protect your children's hearts, eyes, and minds, and then be sure they know they can come to you if something inappropriate comes across their screen.

  • Call of Duty Modern Warfare | A Parent’s Guide

    The rating below is based on the game content. Online interactions will always increase the risk of unwanted content.

    Violence – 1
    Language – 1
    Sexual Content – 4
    Positive Message – 1

    Total Score – 7 out of 20
    (The higher the rating, the safer the game is for kids.)
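    For clarity, the total is simply the sum of the four category scores; each category appears to be scored out of 5 points (my inference from the 20-point maximum):

    ```python
    scores = {
        "Violence": 1,
        "Language": 1,
        "Sexual Content": 4,
        "Positive Message": 1,
    }
    total = sum(scores.values())
    print(f"Total Score - {total} out of {len(scores) * 5}")  # Total Score - 7 out of 20
    ```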

    ESRB Rating – M for Mature [for Blood and Gore, Intense Violence, Strong Language, Suggestive Themes, and Use of Drugs]


    The Game

    Call of Duty has set the standard for realistic first-person shooter gaming for more than 15 years. 2019's Modern Warfare seems to be a tribute to the original games, and the story of this latest release is as good as any CoD story to date. The campaign takes you through the story through the eyes of British, American, and Middle Eastern soldiers and insurgents fighting to free a country from a Russian general and his armies. The story is rich, and the characters include soldiers you've fought with in other games, giving instant buy-in and causing you to care about them from early in the story. The game asks you to make some pretty difficult decisions, and the realism is unlike any other FPS I've ever played, mostly because of the gruesome situations you are put in during the campaign. Overall, as far as the campaign is concerned, Call of Duty Modern Warfare is one of the best games of 2019. I recommend it for those mature enough to play, as long as you have a strong constitution.

    Violence [1]

    Violence is intense in this game. Explosions blow people apart, and every bullet hit causes a spray of blood that can be seen from far away. Rag-doll physics increase the realism, causing enemies to fall limply to the ground or fly through the air when an explosion goes off nearby. Like many recent Call of Duty games, there is an option to disable gore effects, but it lives in the settings and is not password protected; if you turn the gore off, it can easily be turned back on.

    Language [1]

    CoD Modern Warfare is full of profanity. Every mature word in the book is used, in every mode of the game. Commentary from non-player characters contains extreme language, and the online multiplayer modes are likely to include adult language from other users as well. The gore/content filter will turn off language from characters in the game but, again, it isn't password protected, and online play is not affected by these settings.

    Sexual Content [4]

    There isn't any explicit sexual content in CoD Modern Warfare. Early in the campaign you interrupt a man who is abusing a woman; it is hinted that he may have been about to abuse her sexually, but you kill him before anything happens. There are some character models and outfits that could be considered revealing, particularly with cleavage in the multiplayer modes.

    Positive Message [1]

    Modern Warfare is honest about the cruelty and awful things that happen in modern war. It sets up the Russians as the enemies and the US and UK as the heroes. The campaign story is very dark in places and, while intriguing and well performed, is intended for adult audiences. The game puts players through the kinds of situations that combat veterans with PTSD often describe as the source of their condition. Kids who struggle with anxiety could be seriously harmed by the extreme situations in Call of Duty Modern Warfare.

    You could argue that the cruelty shown in this game can be a commentary on how awful war can be but the fact that you spend 99% of your time in the game participating in combat would likely overshadow any lesson the game is trying to teach.

    What Parents Should Know

    The most important information about this specific game is already covered above, but I would like to address something I see often in discussions of violent video games and first-person shooters. There are different schools of thought on their dangers. Some people think these games are bad for everyone, desensitizing players to violence and causing people to act out; there is little actual evidence to back up this opinion, but some will always feel this way. Another group feels these games are no big deal. They believe that playing games with violence, blood, and gore can help kids understand the true danger of gun violence and lower the risk that they themselves become violent. Many will compare games like Call of Duty to shooters like Fortnite, saying that Fortnite is too tongue-in-cheek and puts our kids at risk because it doesn't take combat seriously enough. As with the first opinion, there is little to no evidence supporting these ideas either.

    The only statements about violent video games that can be backed up by viable research are that they can increase anxiety and adrenaline in children, that they can exacerbate attention problems in children who already have them, and that there is far too little research to establish the true effects these games have on our children. It may be difficult for parents to accept that there is no clear good-or-bad answer for video games like Modern Warfare. The truth is that you have to know your child and their maturity level. Watch their behavior and pay attention to signs like grades, relationships, diet, and exercise to be sure your child has a healthy balance between life and screen time.

  • Family Link’s New Features are Great but Still Not Good Enough

    Android has updated its Family Link parental controls. The video above will take you through what they've done and give you some questions to ask yourself about using the service.

    Make sure your device is compatible.

    The site is very clear that Family Link is only compatible with newer Android devices. Go into the settings on your kid's device and tap ABOUT in the menu to see whether the software version is 7.0 or newer. If it isn't, your child may not be able to install Family Link, which means you can't use it to set limits and restrictions.

    Double check their privacy policies.

    COPPA regulates the collection of children's data without parental permission. You have to create an account for your child to use Family Link, and to do that you must give Google permission to collect some of their data. The video explores what information they can collect and what they do with it.

    Be aware that your kids get full control at 13.

    If you want to keep seeing what your older child is doing on their device, you'll have to adjust parental control settings on the child's phone itself, because Family Link shifts control to the child at age 13.

    Do your homework!

    As I mention in the video above and the podcast episode below, you need to familiarize yourself with the benefits and limitations of Google’s Family Link software. Visit families.google.com to see their information about it and check out our other articles and videos about Family Link as well. You can never be too informed.

  • This Digital Citizenship Curriculum May Not Be as Helpful as You Think

    I end every workshop and nearly every video and podcast by telling parents to talk to their kids about digital citizenship, screen-time balance, and internet safety. I often point them to videos or articles I have made that can help with these topics. Cornell University, in partnership with Common Sense Media, has put together a resource for schools that claims to be perfect for helping you, the parent, talk to your kids about these critical topics.

    "Social Media Test Drive" is a curriculum created to help teachers and parents guide kids through healthy internet use and digital citizenship. The lesson plans for younger children were good, featuring fun videos with cartoon characters singing about what to do if you see a bully and why you shouldn't talk to strangers online. Some of the curriculum for older kids, however, raised red flags for me.

    Minimizing Research

    The videos created for older kids and teenagers did a good job of presenting research showing how dangerous too much social media or screen time can be. Unfortunately, most of them then downplayed that research, weighing it against anecdotal impressions of how kids feel about their devices and social media. It felt as if the videos were pandering to young people, reassuring them that there aren't many dangers online as long as they know how to use the internet properly.

    “Find Your Tribe”

    One of the most dangerous things about social media and internet usage is exactly what many will call its great benefit: the ability for kids (or anyone, for that matter) to go online and find a group of people who think exactly like they do and believe exactly the same things they do. This seems like it would be a good thing. In fact, one of the Common Sense Media training videos calls it “finding your tribe.” The problem comes when you surround yourself with so many like-minded people that you are no longer encouraged, or even able, to think critically about the things you see, hear, and experience. We should sometimes hear voices that contradict each other so that we can grow in our understanding of the world. The internet can be good for that. We can learn about new ideas, new places, and new types of people. The problem is that when we dive into social media, clicking like or double-tapping every post from every person who agrees with us on everything, we are telling the algorithms to feed us more and more of the same thing. This tricks us (not just our children or teens) into thinking that everyone who is right thinks exactly like we do. That is a dangerous attitude, and if “your tribe” means people who won’t challenge you when you’re wrong, I hope my kids never find theirs.

    Relevance to the Point of Irrelevance

    Unfortunately, these videos remind me of the after-school specials we all made fun of when we were kids: the young guy standing in front of motion graphics, reading a script about how to use the internet wisely. It’s been done before. It was done with cigarettes, it was done with drinking and driving, and now it’s being done with the internet. It all reminds me of the end-credits scene of Spider-Man: Homecoming, where Captain America shows up to give a speech on patience. As I mentioned above, it seems to be pandering, and I can imagine it being laughed off by most kids in the age group it is intended for. Despite the obsolescence of the after-school-special concept, these videos attempt to be cool, especially by downplaying the dangers of phone usage and encouraging kids to just “be careful.” They try so hard not to say anything that will make students shut down that they barely say anything helpful at all.

    It Falls to the Parents

    Ultimately, these issues are our responsibility as parents. Only we know what it will take to get our children to understand the truth about their time online. We are the only ones who can set the standards of internet use in our families. We are the ones who can set the limits we feel are best, and do it in a way that helps our children feel they are partners with us as we work toward developing healthy tech habits together as a family.

    We should use the resources at our disposal: accountability software, filters, message monitoring, and built-in parental control settings can all go a long way toward helping us keep our kids safe and teaching them how to protect themselves. There are truly dangerous things on the internet. These things shouldn’t be glorified or blown out of proportion, but they shouldn’t be ignored or downplayed either. We, as the gatekeepers of our homes, must decide what level of discretion we will use in protecting our children. We cannot rely on our schools or other companies or organizations to do it for us.

  • UPDATED: YouTube May Eliminate Targeted Ads on Kids’ Videos

    UPDATED: YouTube May Eliminate Targeted Ads on Kids’ Videos

    UPDATE 9-4-2019: This morning the FTC announced a $170 million settlement with Google to end the investigation of YouTube’s children’s data collection practices. At the same time, YouTube announced it is rolling out funding for original children’s programming. YouTube CEO Susan Wojcicki said that the changes proposed by the FTC could be detrimental to much of the ad revenue made by content creators whose videos target children. She also said that the changes will roll out slowly over four months to give creators time to adjust their content.

    Child data security advocates are not satisfied with this fine or these changes. They were hoping for more:

    “A plethora of parental concerns about YouTube – from inappropriate content and recommendations to excessive screen time – can all be traced to Google’s business model of using data to maximize watch time and ad revenue,” said Josh Golin, the Executive Director for the Campaign for a Commercial-Free Childhood (CCFC).

    Parents should be aware that the changes to YouTube’s data collection and advertising practices are rolling out slowly but will affect both YouTube and YouTube Kids. My advice, as mentioned in the video below, is that parents pay close attention to the videos their children watch on YouTube. Understand that much of the content they consume is created to advertise products, whether websites, video games, or physical products such as toys, food, and candy. Advertisements will still be geared toward kids based on the videos they choose to watch, much like commercials for toys during Saturday morning cartoons.

    8-23-2019

    YouTube’s data collection policies have garnered attention from media and government agencies alike over the past several months. After some shocking reports about child pornography on the site and restrictions handed down from the FTC, Google is finally taking some real steps to comply with the Children’s Online Privacy Protection Act (COPPA). Bloomberg reported this week that YouTube will be ending targeted ads on videos intended for children.

    Obviously, ads that target viewers use collected data to serve advertisements to a given user. If YouTube is targeting ads at children, it stands to reason that the company is collecting information about them as viewers in order to build their advertising profiles in the first place. This data collection is blatantly against COPPA and one of the reasons the site was investigated by the FTC earlier this year.

    YouTube has already cut advertising income from videos that feature disturbing content aimed at children and eliminated comments on videos that feature children. It is estimated that YouTube makes nearly $750 million annually from advertising on children’s videos. Obviously, eliminating those targeted ads could seriously hurt Google’s bottom line, but the company says it is the least damaging option. There are other ways for YouTube to serve somewhat targeted ads to children. The company can use ads chosen based on the videos they appear on, tying the kids’ interest in the video itself to the ad being served. Those who have brought complaints against YouTube about its COPPA violations aren’t expected to be satisfied with that solution either.

    The Best Way to Keep Your Kids Safe on YouTube

    What Parents Should Know

    Of course YouTube wants your children to use YouTube Kids. This is how the company protects itself from the very mess it is in now. YouTube says that YouTube Kids doesn’t collect data from viewers and only shows ads related to the video users are watching. Even so, my recommendation is that your kids only watch YouTube in a place where everyone can see what they are watching. If inappropriate content comes up, you will want to see what it is. That way you can talk to your child about what they saw and how to avoid seeing it in the first place.

    Another option is to use YouTube Premium to eliminate ads altogether. We use this so that when we build a playlist of videos for our kids, we can be sure they’ll only see what we selected and not a video ad for something we may not approve of. YouTube is trying all it can to keep its ad-based ecosystem alive while staying off dangerous-apps lists and out of tech safety experts’ blog posts. Only time will tell if it succeeds. This change could be a very tiny step in the right direction.

  • How “Kids Games” Give Predators Unmonitored Access to Children

    How “Kids Games” Give Predators Unmonitored Access to Children

    I was contacted this week by a parent who was shocked to find that adults had been chatting with her young son in Disney Heroes: Battle Mode, an app rated 9+ in the Apple App Store. She sent me screenshots in which players were asking her son if he was a boy or a girl. They asked how old he was and where he was from. One of them even confessed, “I am not a kid. LOL.” Obviously, when his mother found these messages she was extremely concerned; she removed access to that game and set some limits for their whole family for a while. Then, just a few hours later, I received a link from another concerned parent about an app in which people are posing as employees of the game company and asking children to send pictures “without a shirt on” to prove their age. She asked if this was true, and my response was that yes, these things are happening every single day. Here’s why these predators can gain such easy access to our kids.

    Disney Heroes Battle Mode

    After hearing about the trouble with Disney Heroes: Battle Mode, I downloaded the app to see what it was all about. After a short cinematic and a run through the tutorial, you get a notification that the app has built-in purchases and that you shouldn’t play if you’re under 13 (the app is rated 9+ in the App Store). I simply tapped continue and moved right past the warning. No age verification, no password, no Face ID, nothing. Once in the app, I started looking through the settings. I did find controls for the chat feature, including a password-protected on/off toggle for chat access. This was good to see, especially since the issue I was researching had to do with chatting.

    The problem is that apps like Disney Heroes give parents a false sense of security. The app is made by Disney, and the company’s name on anything makes many parents think the product is made with their kids’ health in mind. This could not be further from the truth. Disney is out for exactly what every other major corporation is out for: its financial bottom line. We have to remember that data is big money, and apps made for kids collect just as much data as any other app. Data that is personalized to a user is worth more money, which means app developers need users to make an account so their data can be sorted and identified more easily. The easiest way to convince app users to create an account is to make it the only way they can chat with friends in the game.

    What Parents Should Know

    I recommend taking a look at the games your kids play on their phones or tablets. Just because a game features cartoon characters doesn’t mean there aren’t adults playing it. If the game has a social feature like chat or a friend mode, you can be sure that your kids will be contacted by strangers. Look in the settings, preferences, or options of the game to see if there is a way to turn off chat. If it doesn’t allow you to disable social features, I would uninstall the game and encourage your child to find a different one to play.

    We must remember that the companies that make these games offer them for free because their money comes from in-app purchases and advertising. In order to make money, they have to keep people playing as long as possible. Research shows that there is no better way to keep someone in your app than social engagement. People will keep coming back if they have friends in the game to play with or against. This means developers will continue to put social features in their games, and while app stores may rate these games as safe for younger children, my rule is that if a game has a social element, it should be for kids older than 13. Even then, you should ensure that your child understands what to do if they are approached online by a stranger, and encourage them to tell you if someone makes them uncomfortable in any social engagement online. We can do our best to protect them from this software, but nothing is more effective in preventing these dangerous encounters than teaching them how to recognize them and end the conversations immediately.