Tag: kids

  • Roblox Has Hit 90 Million Users

    That little game your kids like to play, Roblox, has hit 90 million users as of this week. The company is worth more than 2 billion dollars and has expanded globally to add more than twenty million users over the past six months. Roblox is a gaming “platform” more than a game, giving users the ability to create their own levels and share them with others who play the game. When you log in to Roblox you see a list of user-generated levels that you can play. You join with other users and try out the different games which include combat, stealth, mazes, puzzles, and sometimes just overall silliness. Some of the user built levels are just places to hang out and meet new people.

    Roblox has been considered social gaming from the beginning. The entire platform is designed around allowing users to share their content within the game. This requires you to meet new people and possibly even chat with them in order to fully enjoy the game. When you first log in you see multiple games listed that you can choose from. Most of the starting games are curated by the developers but once you start meeting new people you can be invited to their creations and who knows where you’ll end up at that point. This is the concern for many parents when it comes to Roblox.

    What Parents Should Know

    I played a bit of Roblox recently and could definitely see the appeal. The user-made levels are pretty neat and very elaborate. I played a silly banana game, tried a “Wipeout”-style obstacle course, and played around in a world of puzzles. Really, there is no limit to the fun you can have in the app; the only limit is imagination. This is a great thing for kids as long as the community rules are followed as intended.

    There are parental controls available, and they’re password protected, but they could hinder some of the options in the game. It is intended to be played in a public and shareable social setting. There have been instances when people have abused some of the settings in the app to behave inappropriately in the game. This will always be a possibility when a game is meant to be so extensively social. There is also no age verification, which means you can claim to be whatever age you want when you create your account. I created an account as a 13-year-old girl, just to prove the point.

    Roblox is on my uninstall list because of the social nature of the game. There aren’t really any alternatives that work exactly like Roblox, but there are games that will let you craft and build and get creative while staying offline. Games like Toca Builders (Android/iOS) can give you the building features, while Toca Life World (Android/iOS) is a game made for older kids that provides a safer environment to explore and play in a world they create on their own. With 90 million users, Roblox has a lot of people creating games and communicating with the kids that are playing. Making sure your kids can’t just talk to random strangers online is one of the most important things we can do to protect them.

  • How Video Game Developers can Help Parents

    I think video games can be fun and good for my kids if kept in the right context. We have very strict rules about gaming in our home and do our best to limit our kids’ access, screen time, and exposure to some of the gaming content available. Unfortunately, many developers build their games (even kids’ games) in ways that make screen time and other restrictions hard for parents to enforce. If I could speak to a room full of game devs, here are a few of the things I would say.

    1. Let me save the game whenever I want.

    My children have a strict 30-minutes-per-day rule on our Xbox. They understand when they sit down to play that they have a limited amount of time. My kids know that they’ll be “kicked off” the Xbox after a half hour, so they save often. They save their Minecraft worlds because they can’t build the crazy epic structures they’ve planned in just 30 minutes.

    The problem arises when we play games, like the Lego games, that don’t allow you to save whenever you want. You have to reach certain milestones or the end of a level to save. When the Xbox kicks you out of the game, it resets, causing you to lose your progress. This means mom or dad either have to continually add time to the day’s limit until the kids can save, or we have to deal with the kids’ frustration at wanting to see the next levels but not being able to because of our time limits. We, as parents, don’t mind being the bad guy, but a simple save mechanic built into the pause menu sure would make life easier.

    2. Password protect your content controls.

    The most popular comment on my review of last year’s Call of Duty game is “hey man, you can turn off the graphic violence.” I’ve replied to most of those comments with, “Cool, but it isn’t password protected, so it may as well not be there.” Can we please put content restriction settings behind some sort of pin code? It isn’t that difficult to do. I don’t want my kids playing games that are meant for adults, but some families are ok with their fourteen-year-old playing a Rated M game if the gore is turned off. Unfortunately, most warm-blooded 14-year-old boys are into, or at least interested in, that sort of violent content in film and video games. That means they’ll often turn the restrictions off when mom and dad aren’t looking.

    Maybe that’s a bit too restrictive as your kids get older, but isn’t that the parent’s decision to make? Game developers make their games with over-the-top graphic violence and pretend that their target audience is adults. The reality is that at least half of those who play these games are below the recommended age. This is why they add a content restriction to the game; however, that restriction isn’t helpful if it is buried a couple of levels deep in the settings menu and doesn’t require any sort of passcode to change.

    3. Don’t force me to make an account to play your game.

    It is already frustrating to need an account for everything I do online. Then I have to create separate accounts for each of my kids so they can play games or use apps with parental control settings turned on. If I want each of my kids to have their own settings or their own way through the game, I have to have an account for them on our gaming system. When I turn on a game and see that the developer of that title wants me to create yet another profile, on their site this time, it is infuriating. I don’t want to give you my email address. I paid to play your game; isn’t that enough? I get having an online account so I can play multiplayer, but a game that requires me to have a profile with your company even to play the local offline campaign is simply data mining. I don’t need it, especially with my kids’ information.

    What Can Parents Do?

    This post may be a bit ranty, but I’m not the only parent I know who has complaints about these issues. It’s hard enough protecting our kids from cyberbullies, adult content, and predators. We have enough drama from our kids alone when we simply want to limit their screen time. The last thing we need is some setting, or lack thereof, in a video game making it even harder. The truth, however, is that it’s unlikely a game developer will see this article or video. We have to take responsibility as parents. Either we take the role of gatekeeper and keep our kids from games that pose these problems, or we accept the conflict when it arrives because it’s worth it. It’s worth it to have kids who know how to function when screens are turned off. It’s worth it to have kids who are safe from violent thoughts, nightmares, and attention problems. It’s worth it to protect our kids’ private information and data from collection by gaming companies and who knows who else.

    Talk to your kids about the limits you’ve set. Take a stand when they try to bypass your settings. Don’t let them play games that cause their behavior to change or keep them interested to the point of obsession. Protect their information by only creating accounts for them on sites that absolutely require it, and when you do, use an alias. We live in a new world, a world where data is a form of currency and your kids’ gaming behavior can be used in so many ways that it is invaluable to the companies that create these games. We have to be responsible for our own family’s Internet safety and healthy tech habits. We can ask developers to make it easier and hope for the best, but when it all comes down to it, it is up to you and me.


  • Does Your Kid Need a Fitness Tracking Smartwatch?

    We all want our kids to be healthy. Parents are always telling me they’re concerned that their kids play video games too much and just need to play outside for a bit. I agree. Couldn’t agree more! The fitness wearable industry (think Fitbit and Apple Watch) has made some huge promises about giving us motivation and inspiration to get out and get moving. The wearable trend is making its way to children now too. Garmin and Fitbit have both put out new products made for kids. These wearables serve as a watch, a step tracker, and a sleep habit monitor, and even reward your kids for meeting goals with achievements and celebrations. My eleven-year-old son likes wearing a watch. He doesn’t necessarily care about tracking his steps or heart rate, but I’m sure he would love a Fitbit. Should I get him one? I have to ask a few questions first.

    Do Fitness Wearables Work?

    There have been multiple studies since the invention of the Fitbit that have tested the effectiveness of these health tracking watches. Of course, the earliest studies featured products that could only track your steps. These “one trick” smartwatches weren’t very smart, but they promised to get you out and moving so you’d be healthier. The studies showed that those who were originally committed to fitness stayed pretty committed and were a little more effective at working out, since they could monitor what they had done. People who were given an incentive to work out using their Fitbit tracker did exercise more, but no more than those without a Fitbit who received the same incentives, and they stopped exercising as much when the incentives ended. Finally, the extra activity that was logged didn’t result in improved health outcomes. Basically, you are going to be as committed to fitness with a fitness wearable as you would be without one, and the same is true of your kids.

    Does Your Kid Need a Fitbit or Garmin?

    These products can help those who use them keep track of the amount of activity they are getting. They can use this information to make better decisions about what they do through their day. As mentioned above, however, awareness doesn’t always equal action, especially when it comes to fitness. Nobody is going to tell you not to do something that might keep your kids healthy. You know your child. You know if they will be inspired or intimidated by activity tracking and goal setting. You know if they will use their watch for ten days and then set it down, never to pick it up again. Finally, you are the only one who knows for sure if your child will just lose the smartwatch within ten minutes of putting it on their wrist.

    You have to take all of these factors into account when deciding if a fitness tracker is right for your child. As for which ones work best, I don’t have any data to provide you with a conclusion on that. I do, however, have a few family tech safety tips to think about while you decide on a wearable for your kids.

    1. Data Security
      It is pretty obvious that the companies that sell fitness wearables use your data quite liberally. They have to use it to effectively communicate your health information to you and to keep records for you to access later. Fitbit requires parents to make accounts for their children in order for their kids to use their products. By creating this account, parents are giving Fitbit permission to access their children’s data and use it according to their Privacy Policy for Children.
    2. Smartphone Sync
      Most (basically all) of these devices require you to sync with a smartphone of some kind. While it is possible for you to sync the device with your own phone, your child will see another opportunity to try to convince you that they need a smartphone of their own. Let’s be honest, none of us need our kids to have more points to support the argument that they need a smartphone. Maybe they already have one, great; maybe they have a device they are only allowed to use at home, that’s good too. Be sure you’re allowing them time to sync and use those apps in conjunction with the smartwatch, or you kind of defeat the purpose.
    3. Location Sharing
      The security policies for Fitbit and Garmin both state that they do not automatically collect location data from Fitbit accounts created for children. However, they do collect IP addresses which often contain location data, and you are able to share your location manually which kids could do without realizing it. It is especially important, if you are concerned about leaked or sold location data, that you don’t allow your kids to use a fitness wearable that is connected to an adult’s account. These accounts do share location information by default.

    Be Fit, With or Without a Fitbit

    I’m not going to tell you what to do. As I said above, you know your child and their habits. You know if they are active or not. Some of these wearables can save lives, for kids with diabetes for example, but those are specific situations and, in my opinion, the absolute best and intended use of these products. Most of us have discipline and motivation problems, and a fitness tracker can only bring our lack of a healthy lifestyle to our attention; we still have to do something about it. I speak as one who loves pizza and begrudgingly runs about six miles every two weeks. I am “preaching to the choir,” as they say, and while I think an Apple Watch or one of the latest Fitbit smartwatches would be cool to have, the truth is there are data security issues to discuss, and the promised health benefits aren’t guaranteed. Let’s just get our kids to a playground more often, and maybe even get out there and play tag with them.

  • It’s Being Called the Ultimate Unsend Button, Does it Encourage False Anonymity?

    Telegram is an end-to-end encrypted messenger that touts speed, privacy, and security. They have featured private messaging and self-destructing messages for a while, but their new feature takes privacy to a new level. You can now delete a message you’ve sent from both your account and the account you sent it to, no matter how long ago it was sent. Telegram is, again, standing up for privacy, and users are buying in. Millions have flocked to Telegram after Facebook’s data leak news from the past several months. It looks like Telegram is doubling down on privacy as their claim to fame. They’ve also added the ability to remove your information from a message when it is forwarded to other users. Some accessibility and ease-of-use features have also been added.

    What Parents Should Know

    Security and privacy are often overlooked when we allow our kids to use internet-connected devices. Privacy is becoming a major concern for family tech safety experts and activists. Messengers that allow data to be collected and used for advertising shouldn’t be used by children, or even teenagers, due to the risk of such data being released or revealed. When an app features privacy as its distinguishing feature, you have to ask who the data is being kept private from. Obviously, we want data kept from third-party companies who would use it to advertise. Sometimes data is even kept private from the company that developed the messenger app you are using. Telegram has a “secret messages” setting that must be enabled to keep your information encrypted from end to end. (End-to-end encryption means that even the company cannot see or collect what is being sent.)

    Anytime the ability to delete sent messages is added, I see red flags. While I think privacy is critical, there is a risk of kids thinking they are safe from inappropriate or incriminating photos or messages being saved and used for nefarious purposes. It only takes half a second to screenshot a message or image on your screen. Most phones also let you record your screen to a video very easily. This means that you are not always anonymous online. If you are sending messages to someone thinking you have complete privacy, you are trusting that the person you’re sending them to has your privacy in mind as well. Telegram is an easy way for predators, cyberbullies, and those interested in sexting to send and receive messages that do their damage and are then removed as evidence.

    I have spoken to parents who have taken their kids to the police with complaints about people trying to groom them online, but the police had no evidence because the messages had all been deleted. This is why a messenger makes the FamilyTechBlog uninstall list as soon as it adds disappearing messages. It isn’t safe for your kids to chat with a feeling of anonymity, or to chat with people who can send what they want and make the message go away after it’s been viewed. Telegram is rated 17+, and I fully agree with that rating. Private messengers that allow you to chat with anyone, anywhere, shouldn’t be used by children and young teenagers, especially when the messages can be removed at will.

  • What Parents Need to Know About Stadia by Google

    On March 19th, Google announced their latest product: Stadia. The promise of Stadia is to allow people to play AAA games (Assassin’s Creed, Fortnite, etc.) without having to buy a dedicated gaming console or PC. How does Google plan to deliver on this promise? With Chrome and YouTube.

    Google has stated that Stadia is “the future of gaming.” I agree. Young adults are used to subscribing to services and streaming their entertainment and Stadia is the next step. Kids already watch hours of gaming content on YouTube every day, why not add the ability to play those games too?

    What We Know Right Now

    We don’t know a lot about Stadia right now but what we do know is pretty impressive.

    • A high-speed Internet connection will be required.
    • Up to 4K HDR at 60fps.
    • Play on multiple devices: PCs, laptops, tablets, and smartphones will be supported.
    • No need to download games or wait for updates.
    • You’ll be able to use any USB controller connected to your computer.
    • There will be a dedicated wireless controller.
    • Stadia will be available this year.

    What We Don’t Know Right Now

    Despite all the excitement around this announcement, there are many things we don’t know.

    • The price of the service.
    • The price of the controller.
    • Games available at launch.
    • Supported mobile devices at launch.
    • Release date.
    • Minimum Internet connection speed.

    What Parents Need to Know

    Your kids are going to want this, especially if they watch gameplay videos on YouTube. Being able to instantly play a game that one of their favorite streamers is playing and try that special move is very appealing.

    If the price is right, this could be an affordable alternative to purchasing a gaming console. Being able to play hundreds of games for $50-$60 a month is more affordable than buying a $600 console and a game or two every month.

    The Stadia controller has a streaming button which means your kids could be online and streaming their game and voice instantly. In fact, they could even join in a game with another person. Parents should be aware of this feature and take measures to block it if they don’t want their kids to live-stream.

    Google has been improving their products with better parental controls every year. Parents should familiarize themselves with those parental controls and enable any restrictions they deem necessary. You may want to consider adding time limits, enabling ratings limits, and disabling some of the streaming and cooperative features.

  • Creators of Fortnite in Court for “Predatory” Advertising

    Imagine you go shopping and instead of clothes, toys, or other products you just see boxes. You can’t purchase items on their own; that’s not how this works. Instead, you have to buy a box and hope that what you want is in it. I don’t think that store would be popular for very long. Maybe for a while, but once the novelty wore off, the place would likely go out of business. People want to know that when they pay for something, they are getting what they want or need. In-game “loot boxes” work basically like the fictional store I described above. You pay a dollar amount small enough to feel meaningless and unlock a box. When it opens on your screen, you see what you purchased and can only hope it’s something you wanted or needed for your character.

    Epic Games no longer has these types of loot boxes in Fortnite, but they did, and that’s what this lawsuit is all about. The boxes advertised the best items you could get, but the family of the young player at the center of this lawsuit says the chances of actually obtaining those items were very low. This is being interpreted as “predatory,” especially since many of the loot boxes are cute little llama piñatas. Freemium games have been around for a long time, but Fortnite is the first game of its kind to have such a large and young player base. Children as young as six or seven are playing Fortnite and purchasing these items to make their characters and weapons look more interesting.

    What Parents Should Know

    If you are inclined to allow your child to play games like Fortnite, you need to be aware of a few things. First of all, free is never truly free. There is a reason they don’t charge for the game: it is easier to get a ton of players and have a bunch of them pay for arbitrary avatar and weapon skins than to convince people your game is worth sixty dollars. Many of the top earners in every app store are free-to-play games. These games are popular because they cost nothing to start and the price of each in-app purchase seems very low. The trick is how easily the amount you spend just to keep playing racks up. Whether it is a game where you’re building a farm and want your crops to grow faster, or one in which you are fighting and want better weapons, many of these games let you pay to progress further.

    The question, I guess, isn’t whether or not this practice is legal. (Spoiler alert: it is completely legal.) The question is whether it should be legal to create in-app purchases that appeal especially to young gamers. Games made for kids that ask you to pay to continue, or educational apps that make you pay to unlock more characters, have found a way past the parent gatekeeper by making the app free. Then the child just has to tap “purchase” when the ad pops up in the app, and the purchase is made. There are ways for parents to set up controls to keep that from happening, but many aren’t aware of how, or just don’t think to set them up until their credit card has already been charged hundreds or even thousands of dollars.

    The creators of Fortnite may never be held accountable for the way they market products in their games. Whether or not they should be held accountable is up to the courts to decide. As far as parents go, you do have a responsibility to protect your kids in the digital world they live in. Talk to your children about in-app purchases. Help them understand that the money has to come from somewhere. If you are ok with them spending some money in-game then use gift cards instead of credit cards so that when they run out of money, they’re out. Set up controls so they have to ask you to approve in-app purchases. Whatever method you choose, you can keep your kids from being preyed upon by the advertising in these games. You just have to do the research and take the steps.

  • Tik-Tok Under Fire as Study Finds 25% of Kids Talk to Strangers Online

    Tik-Tok is at the center of a controversy surrounding exposure to predators and child pornographers through live streaming on the app. One in twenty children who use live-streaming apps have been asked to take off their clothes, according to a study by the UK children’s charity NSPCC. Originally called Musical.ly, Tik-Tok claims to “empower everyone to be a creator directly from their smartphones, and is committed to building a community by encouraging users to share their passion and creative expression through their videos.” Their mission statement sounds like they are building a place for our kids to stretch their creative muscles and build a supportive audience, but in reality it is exposing them to potential danger.

    Sexual exploitation is only part of the issue; there are popular hashtags on the app that highlight self-harm and eating disorders. Tags like #thinspo (thinspiration) feature videos of children as young as eight showing their rib cages through their skin and proclaiming that they are inspiring to others who desire to be thin. Suicide and self-harm are also featured on the app, complete with encouragement to hurt yourself and instructions on how to do so. Tik-Tok says you have to be 13 to use the app, but as we have shared multiple times on this site, that age exists to protect the company from legal action concerning the collection of children’s data, not to protect your children from content on the app.

    While the app is rated 12+ in app stores in the U.S., the reasons listed for the rating prove to be, in fact, very mature. The issue, again, as I’ve mentioned, is user-generated content. Anyone with a smartphone and a wifi connection can make videos and now livestream in Tik-Tok, and they can also watch you perform on the app. This makes for an open, dangerous atmosphere filled with predators, adult content, scams, and violence.

    What Parents Should Know

    Tik-Tok says they have filters and parental controls in the app that allow you to set the account to private, but all of these measures have proven to be less than effective. Kids who use the app on their own can easily come across content that isn’t age appropriate. The content restriction and time management settings in the app are password protected; they can be useful and should be set up if you allow your child to use Tik-Tok. Also be sure to turn off the ability for non-friends to comment on, share, and download your child’s videos (this is on by default, creepy right?).

    We don’t want our kids talking to strangers online. All parents understand the dangers associated with live-streaming and posting public videos to the internet. Unfortunately, many parents feel that their hands are tied when it comes to keeping their kids safe on these apps and websites. That isn’t the truth, however; there are tools (some in the app and some third party) which you can use to keep them from accessing things that are dangerous. An algorithmic filter is never going to be enough, though, so it is important that we have open communication with our kids about what they are posting and seeing on apps like Tik-Tok. Also, if your child doesn’t meet the age restriction, they shouldn’t use the app.

    Twenty-five percent of kids talking to strangers online is a horrifyingly high statistic. It shows that while there are privacy settings and parental controls available, either parents aren’t using them or their kids are getting around them. I know that the privacy settings in Tik-Tok aren’t password protected, so if your children want to talk to strangers on the app and have time alone with it, there are ways for them to make that happen. It is important that parents take responsibility for protecting our kids online. Many media outlets are blasting these companies for putting our kids in danger, but I have to be honest: you don’t blame the slide when your kid falls off and busts their face; you think of precautions that YOU can take to keep it from happening in the future.

  • There is a Child Pornography Ring on YouTube and Everyone’s Making it About the Money

    A YouTuber (sensitive content warning) has found evidence that there is a vast community on YouTube of child predators and of those who watch content containing child exploitation. They are using comment sections and timestamps to lead each other to actual child pornography. They start with a search for a simple, popular YouTube trend, and then the algorithm YouTube uses to connect viewers with like content will eventually propose a video of kids that can be considered appealing to these viewers. They then click through, in the comments section, to parts of the video that seem innocent but are, unfortunately, what these predators have been looking for.

    YouTube’s response didn’t come until after advertisers began pulling their ads. They began by removing some of the videos and some of the comments, as well as demonetizing (pausing ad revenue on) videos on which these comments are posted. YouTubers are concerned because some of their videos have been or could be demonetized because of a commenter’s words, not something they control. Whether you want to blame the site, the viewers, or the makers of the videos, the fact that the conversation goes straight to money is a serious problem.

    I think the money isn’t the issue. I understand hitting them where it hurts, and that YouTube should do something, but we have to take some responsibility. I believe we need to have a serious conversation about what types of videos we allow our children to post publicly, and we should be very concerned about the types of people who watch these videos. Watch the video above to hear more of my thoughts on this issue.

    The video that initially exposed this issue is below; be warned that it is disturbing to watch and contains adult language.


    SENSITIVE CONTENT WARNING!


  • I Can’t Help You Protect Your Kids on Apps Meant for Adults

    I field messages and requests all week long from parents who want the latest tools for keeping their kids safe online. They ask about everything from YouTube to Instagram to Snapchat and want apps that will monitor social media use, block adult content, and limit screen time. While there are resources to help parents in each of those areas, some apps just aren’t intended for younger children. Unfortunately, many parents have a real problem accepting that fact.

    Streaming Videos

    Let’s look at YouTube as our first example. The video app was created in 2005 as a place for anyone to upload short videos to share with their friends. Google purchased YouTube in 2006, and social media became popular soon after, rocketing YouTube to the successful streaming platform it has become. The site is loaded with videos from filmmakers, vloggers, video gamers, makeup artists, geeks, professionals, educators, ministers, animators, artists: basically any category you can think of. It has evolved into an immovable force, with hundreds of hours of footage uploaded every single minute. YouTube has come under fire for some of its content being too mature or sensitive, so they’ve employed algorithms to keep tabs on inappropriate videos. They also released an app for children called YouTube Kids. This app has seen its own share of controversy, as YouTube has been unable to keep sensitive material from showing up in videos on the app.

    YouTube obviously wasn’t intended for young viewers. It is a site populated primarily by videos uploaded by its users. Some companies that make content for kids use YouTube, but that is a choice those companies make in response to the popularity of the platform. It’s an attitude that says: “Kids are there, so we should be there too.” The goal is to reach the audience already there, not necessarily to build an audience on YouTube. There are no real parental controls (safe search is mostly useless), and videos labeled as kid friendly are labeled without any human eyes ever seeing the entire video. The only time a content reviewer sees a video is when enough users of the site have flagged it as inappropriate. Allowing your kids to watch YouTube on their own is a risk that many parents don’t even realize they are taking.

    What about Social Media?

    Snapchat, Instagram, and Facebook are all the same. Like YouTube, they feature content created and posted by the users of the service. This “user generated content” varies from political or religious views, to silly cat videos and memes, to random personal updates that mean nothing to anyone. People also post updates on their serious mental health issues, they share about their plans to harm themselves or others, they post images of themselves in compromising situations, and that’s just what people post publicly. Private messaging contains content that people send when they think nobody except those they trust is watching. Private messaging is how predators groom their victims. It’s how the out-of-control teenage boy convinces a girl to send him inappropriate pictures of herself. Social media is intended to be a place to connect with people, some you may know, some you don’t. It is meant to be a public forum, and what is meant to be private is meant to be completely private. This is where the problems come in when parents ask for ways to monitor their kids’ social media.

    Age Rating vs Terms and Agreements

    I see a lot of parents giving their kids access to social media and other online activities when they reach the age of 13. This is based on the fact that the terms and agreements these sites have you approve before creating an account list 13 as the minimum age to use the service. A common mistake parents make is thinking this age is meant to protect their kids from content on the site when, in fact, it’s intended to protect the company from collecting data and information on kids under the age of 13. COPPA says companies can’t collect and use the information of children under 13 without parental consent. If a company says you can’t use the site if you’re under 13, then it can do whatever it wants with all of that data, and if your kid is underage, it isn’t the company’s fault. You ignored the terms and agreements when you allowed them to use the site.

    Age rating is the age recommendation you’ll see in the app store when you are downloading an app. This restriction is based on the actual content in the app, not any legal requirement for the company. The usual standard is that apps populated by user generated content are rated 17+, because the company can’t guarantee that what is seen on its product won’t be considered adult content. When we allow our kids to use apps that contain user generated content, we are allowing them to be subject to the opinions, behavior, and whims of everyone else who uses that app.

    Parental Involvement Before Parental Control

    When I am asked to help parents protect their kids in apps that are obviously not made for children, I feel like I’m being asked to give parents a suit their kids can wear to protect them while they play in a burning building. I get it. It isn’t easy to tell your kids they can’t do something they want to do. “My friends are all on Snapchat.” Or the one that irritates me to no end: “The teacher/coach says I have to use Facebook to get the homework/practice schedule.” Sometimes we just have to say no. It is difficult to set the boundaries and limits that keep our kids safe, but if we have the right attitude about what we’re protecting them from, it becomes easier. Social media, YouTube, video games rated M for Mature: none of these things are intended for people under the age of 17, and when we allow our kids to use these products, we open them up to a world that is meant for adults.

    It is difficult for algorithms to catch nudity or violence in uploaded videos. Social media sites and private messaging apps go to great lengths to keep prying eyes from seeing what is being sent, which makes parental monitoring software hard to develop. Unfortunately, some burning buildings are just too dangerous, and there isn’t much that can be done to protect you if you’re inside. If you aren’t okay with your child seeing content that is meant for grown-ups, then I recommend uninstalling the app instead of trying to find software that keeps it from doing what it was intended to do.

  • Tinder, Grindr, and Predators. Social Media and Suicide. Who do we blame?

    Tinder, Grindr, and Predators. Social Media and Suicide. Who do we blame?


    There have been more than 30 instances of abuse of children linked to the Tinder and Grindr apps since 2015. That number may seem small, but when you consider the fact that kids have easily skirted around the age requirements of these dating/hookup apps and made contact with people who wish to harm them, any number is too high. While these companies say they’re doing all they can to keep kids from using their software, all they really say in response to these horrible occurrences is that the predators and the kids violated their terms of service. Since the terms say you shouldn’t contact minors and that minors shouldn’t be using the software, they claim the responsibility isn’t theirs, because the child was put in danger by using the app in a way it wasn’t intended to be used.

    Officials are saying that isn’t good enough, with lawmakers in the UK trying to create legislation that will require age verification on apps like Tinder and even some social media apps like Instagram. Recent suicides have been linked to images of self-harm viewed on Instagram. Again, officials at the social media company say that the most violent of the images violate their terms of service. They have recently, however, banned images of self-harm and suicide and removed the categories from search results.

    Here is the question: When these horrible things happen, do we blame the companies who make these online products? Is it enough to write a terms and conditions page and say that those who break the rules are at fault themselves and not the company? So far, legally, that’s all it takes. It seems that the responsibility of the company ends with the terms and conditions page. If the user doesn’t follow the terms, then how is the company supposed to protect users? Some officials are asking for age verification, which means keeping more records, something many companies don’t want to do because of recent privacy and data breach concerns. There is only one thing I know for sure: if families get serious about monitoring their kids’ screen time and online activity, the number of these occurrences will dramatically decrease.

    Let me describe a scenario for you. Your 12-year-old child wants to meet new people online. Maybe they heard some friends talking about a dating or hookup app; maybe they just don’t have a lot of friends in real life. Whatever the reason, they’re looking for a way to meet people. While they’re looking through the app store, they see dating apps right in the search results.


    They tap download, create a profile, and start swiping, eventually meeting new people on the app. Conversations move to WhatsApp, Facebook Messenger, or Signal, and they schedule a meetup. Your imagination can take over from there, and if you’ve read some of the news stories, it can get pretty awful.

    Imagine, now, that you have parental controls set so that your child has to request permission to download apps. Maybe you even have their controls set to keep them from downloading apps rated for users over 12 years of age. Either approach would keep this scenario from playing out behind your back. Instead of learning too late about your child’s new friendship, or worse, romantic relationship with a stranger online, you’ll see that they’re trying to download an app designed to connect people for romantic relationships and be able to discuss it with them. You can share the dangers of building relationships with strangers and help them understand the importance of privacy, security, and parental supervision.

    There are built-in ways to protect your child on both iOS and Android devices. The key is to set them up. Use the built-in protections and features, and don’t rely on these companies to protect your children. They don’t exist to keep your family safe or even to help people build healthy relationships. These companies develop their products to make money. It is foolish to expect Instagram to protect your kids from suicide. Should these companies have some responsibility for what is on their apps? Yes. Should you blame them if your kid harms themselves because of something they see on the app? Not entirely. You have to take some of the blame onto yourself. There are ways to keep your kids safe from that kind of content. If you don’t know about them or don’t use them, it isn’t the fault of the company. It’s yours. Be involved, pay attention, and do the work to keep them safe.