Tag: predators

  • A Parents’ Guide to Among Us

This guide is intended to help parents make informed decisions for their families. The rating is based on my own experience playing Among Us as well as watching others play the game.

    The rating below is based on the game content. Online interactions will always increase the risk of unwanted content.

    Violence – 3
    Language – 4
    Sexual Content – 5
    Positive Message – 2
    Monetization – 2

    Total Score – 16 out of 25
    (The higher the rating, the safer the game is for kids.)

    ESRB Rating – Among Us has an ESRB rating of 10+. It is rated 9+ in the app stores and Common Sense Media gives it a rating of 10+.

    About the Game

    Among Us is an online multiplayer game of social deduction, teamwork, and betrayal. You play as crewmates on a spaceship or space station who are trying to prepare the ship for takeoff. You all have tasks you must complete to win the game. The catch is that there is an imposter among you. This imposter (or imposters) can sabotage your efforts to prepare the ship and can also kill you or your crewmates. When a dead body is found, a meeting is called. The entire crew discusses what has happened and what they’ve seen that could hint at who the imposter is. Everyone then votes, and if someone receives a majority of votes, they are ejected from the ship. If that person was an imposter, the crew wins; otherwise, it’s back to the ship to complete your tasks and hope the imposter doesn’t get to you first.

    This game has a little bit of everything. There are simple puzzles, social interactions, mystery, and even some opportunity to be a little dark by killing your friends in-game. The graphics are simple and a bit silly, but the gameplay is so fun that it doesn’t matter. This is truly a social game and cannot be played on your own. There is a “freeplay” mode in which you can explore the map and get familiar with puzzles but it is really just for preparing to play online multiplayer.

    Violence

    One of the key themes in Among Us is murder. The imposter is trying to sabotage the ship by whatever means necessary, which usually includes killing crew members. You kill by simply tapping or clicking an icon when you’re close enough to a crewmate. A short animation of the murder then plays. Sometimes you slice the victim in half, sometimes your small companion (an in-game purchase) will shoot them, and sometimes a spear-like tongue shoots from you and pierces them in the face. While the animations are a bit graphic, they aren’t really bloody or gory, and they are very cartoonish and silly. The characters don’t look like humans; they are better described as colorful walking spacesuits, so when they are killed there isn’t much realism.

    Language

    There is no dialog or narration in Among Us, which means there is no adult language in the game itself. This is a game, however, that is meant to be played with other people over the internet, and playing any multiplayer game online opens you up to unsavory language. In Among Us, this happens in the chat, which is used to discuss murders and vote out crewmates. There is a censor mode that is on by default and uses symbols to block out adult language and other inappropriate comments. This doesn’t mean that players don’t use these words. You’ll often see sentences with words asterisked out, and most of us can tell from the number of symbols and the context of the sentence what words were meant. While it is nice that a censor is included and on by default, it is simple to deactivate with one click or tap and is not password protected.

    Sexual Content

    Again, there is no sexual content in Among Us; the style of the game doesn’t lend itself to that kind of material. This is another area, however, that is greatly impacted by online play. While the censor mentioned above will block some sexual comments, most make it through. While playing the game I saw many players with suggestive usernames. Nothing obvious, but definitely innuendo. When these names were commented on in chat, however, they were mostly met with annoyance by other players who just wanted to play the game and were not amused.

    In other words, there will always be people who think their immature sexual jokes and comments are funny but in such a social game you’ll also find a majority of players who aren’t interested in that kind of humor. These players usually kick out or shut down the inappropriate players pretty quickly.

    Positive Message

    We could talk about teamwork and trust here, but in reality this game is just about having fun. There is no real moral to Among Us; it is intended to be a clone of the classic party game Mafia, but set in space. Playing with friends is easy through the local or private game settings, which lets kids have fun with friends even though we can’t be around each other all of the time these days. I think this is what made Among Us the breakout game of 2020, even though it had already been out for two years.

    Monetization

    Among Us does have in-game purchases, but they aren’t game-changing. You can buy packs of costumes, skins, and even pets. The prices are between $1 and $3 per pack, and the game is definitely playable without spending more than the $4.99 it costs on PC. The mobile version (free on Apple and Android) has ads that can be removed for $1.99. I recommend removing these ads because some of the games advertised should, in my opinion, be rated for adults only.

    What Parents Should Know

    Among Us is a game that I have been playing quite often lately. It is easy to pop in, play a ten- or fifteen-minute round, and then log off. I have played in public rooms with friends as well; that was quite fun, as we were able to work together (trying not to cheat) to complete tasks and win. It can be a time drainer because you always want to play another round. I find myself saying “one more round” a few times before I actually quit the game. Like Fortnite and other online multiplayer games, kids aren’t going to want to drop out in the middle of a game, so giving them a warning about getting off their screen will work better than saying, “Put it away, now!” Trust me, you’ll have less conflict if you say, “Be finished after this round, alright?” and then hold them to that.

    The only real danger in this game is from strangers online. While that is always a concern with online multiplayer games, rounds are so short and fast-paced in Among Us that there isn’t much time for “grooming” or bullying especially since there is no private or direct messaging. You can stay in the same “Lobby” to play with the same people but it is so easy to back out and go into another game if you need to that I wouldn’t expect too much trouble from people in chat in Among Us.

    As with most games, my recommendation is that parents understand Among Us, how it works, and what their kids like about it. Know who they are playing with online and if they are playing with strangers, be sure they feel comfortable coming to you if they see something that makes them feel strange. This game is simple enough and quick enough that many parents should be able to play along with their kids some as well. Do this. It would be really fun for you to get into their world a little bit, plus you may just enjoy the game yourself.

  • Dangerous Random Live Video Chatting Apps are Dominating Social Media

    A reader sent me this article from the Washington Post today, and I wanted to post my response. The article outlines the problems Apple is having keeping “unwanted sexual content” out of apps on its iOS App Store. Monkey, Yubo, and ChatLive are all apps that let you chat live with random people, often connecting you based only on the gender you say you’d like to chat with. The problem with these apps is that most of them have no way to verify your identity, gender, age, or anything else. This means that kids who use these apps are chatting with random strangers, many of whom are much older than them and have nefarious intentions.

    The complaints in the article center specifically on “unwanted sexual material.” As you can imagine, the consequence of this content is often our young kids seeing images of people in mature circumstances, whether they were seeking that kind of content or not. When you can chat with someone randomly, you never know who is going to show up on your screen, and when the person who shows up is in a compromising position, you’ve already seen it; it is impossible to unsee it at that point. Our kids are being shown this nonsense, and the developers of these apps are monetizing some of the only ways you can filter the content (i.e., Monkey making you spend its in-app currency, “bananas,” to select what gender you want to chat with). Those who run the app stores (Google and Apple) often say they do their best to keep apps with inappropriate content off of their stores, especially when it comes to apps that children use, but once they’ve labeled an app 17+ they pretty much shift the responsibility to the adult who is caring for the child.

    What Parents Should Know

    Two years ago I wrote an article about the dangers of the app Monkey and how it would become a hotbed for predators and “unwanted sexual content.” Today, Monkey is mentioned in the Washington Post article as one of the main companies with this content in its app: “About 2 percent of all iOS reviews of Monkey, ranked 10th most popular in Apple’s social networking category earlier this month, contained reports of unwanted sexual experiences, according to The Post’s investigation.” Does 2% constitute a “hotbed”? I don’t know. But I will say it is cause for great concern, especially since this is only the percentage of reviews that mentioned the problem, mostly from parents who saw that their children had been assaulted with adult content in the app. It doesn’t measure those who saw it and didn’t report it for one reason or another. I was contacted by Allen Loh, head of global expansions for the Holla Group, operators of the Monkey app, six months ago or so, and he assured me that they were working to address some of the safety and content concerns within the app. I have reached out again to get updated information about these issues but have not, as of yet, received a response.

    The only real way to ensure your child is protected from the unwanted content in apps like these is to use the restriction settings built into your operating system. Apple’s Screen Time has a restrictions setting where you can set a maximum age rating for apps your child can download. If your 15-year-old has an iPhone, you can set the restriction to 12+ to ensure that apps rated 17+ won’t be available. Android users can use Family Link to set app store restrictions for their younger children. These restrictions, however, will automatically be set to “adult” when your child turns 13.

    As I always say, the most important thing is communication with your child. You have to make them aware of the dangers of chatting with random strangers on the internet. As obviously dangerous as that sounds, these apps are branded and marketed as a fun way to meet new people. They build an environment that is like going to the mall or the movies back in our day, but it all happens within the anonymity of an app. Unfortunately, within an app, the weirdo who would never go out in public and make advances at your kids is there waiting to find someone he can groom, send adult pictures to, or violate in some other way. Parents need to create a safe space their kids can come to when they feel threatened or violated by someone online or in an app they use. Many of the stories of parents finding unwanted sexual content within an app came to light only because their child knew to come to them when they saw something that made them uncomfortable or feel violated. Do everything you can to protect your children’s hearts, eyes, and minds, and then be sure they know they can come to you if something inappropriate comes across their screen.

  • How “Kids Games” Give Predators Unmonitored Access to Children

    I was contacted this week by a parent who was shocked to find that adults had been chatting with her young son in Disney Heroes: Battle Mode, an app rated 9+ in the Apple App Store. She sent me screenshots in which players were asking her son if he was a boy or a girl. They asked how old he was and where he was from. One of them even confessed, “I am not a kid. LOL.” When his mother found these messages she was, obviously, extremely concerned; she removed access to that game and set some limits for their whole family for a while. Then, just a few hours later, I received a link from another concerned parent about an app in which people are posing as employees of the game company and asking children to send pictures “without a shirt on” to prove their age. She asked if this was true, and my response was that yes, these things are happening every single day. Here’s why these predators can gain such easy access to our kids.

    Disney Heroes Battle Mode

    After hearing about the trouble with Disney Heroes: Battle Mode, I downloaded the app to see what it was all about. After a short cinematic and a playthrough of the tutorial, you get a notification that the app has purchases built in and that you shouldn’t play if you are under 13 (even though the app is rated 9+ in the App Store). I simply tapped continue and moved right past the warning. No age verification, no password, no Face ID, nothing. Once in the app I started looking through the settings. I did find controls for the chat feature, including a password-protected on/off toggle for chat access. This was good to see, especially since the issue I was researching had to do with chatting.

    The problem is that apps like Disney Heroes give parents a false sense of security. The app is made by Disney, and the company’s name on anything makes many parents assume the product is made with their kids’ health in mind. This could not be further from the truth. Disney is out for exactly what every other major corporation is out for: its financial bottom line. We have to remember that data is big money, and apps made for kids collect just as much data as any other app. Data that is personalized to a user is worth more money, which means app developers need users to make an account so they can sort and identify their data more easily. The easiest way to convince app users to create an account is to make it the only way they can chat with friends in the game.

    What Parents Should Know

    I recommend taking a look at the games your kids play on their phones or tablets. Just because a game features cartoon characters doesn’t mean there aren’t adults playing it. If the game has a social feature like chat or a friend mode, you can be sure that your kids will be contacted by strangers. Look in the settings, preferences, or options of the game to see if there is a way to turn off chat. If it doesn’t allow you to disable social features, I would uninstall the game and encourage your child to find a different one to play.

    We must remember that the companies that make these games offer them for free because their money comes from in-app purchases and advertising. To make money they have to keep people playing as long as possible, and research shows that there is no better way to keep someone in your app than social engagement. People will keep coming back if they have friends in the game to play with or against. This means developers will continue to put these social features in their games, and while app stores may rate these games as safe for younger children, my rule is that if a game has a social element it should be for kids older than 13. Even then, you should ensure that your child understands what to do if they are approached online by a stranger, and encourage them to tell you if someone makes them uncomfortable in any social engagement online. We can do our best to protect them from this software, but nothing is more effective in preventing these dangerous encounters than teaching them how to recognize them and end the conversations immediately.

  • Facebook is Making a Dating Feature while Instagram Works to Curb Bullying

    Social Media News from Facebook’s F8 Conference

    The F8 Developers Conference is Facebook’s annual event to showcase what it is working on across its numerous social media and messaging platforms. Tuesday’s announcements included new Facebook features to connect people for romance and friendship, while Instagram is looking to stop bullying before it happens.

    Facebook Dating

    The dating feature for Facebook has been tested in several countries, including Mexico, Thailand, and Canada. It will be rolled out to more countries soon and finally released in the United States “by the end of this year.” The latest update to Facebook Dating allows you to build a secret crush list. This list of eight people will be saved and compared to the lists of your friends who also use Facebook Dating. If any of your crushes add you to their crush list, you will both be notified so you can make a connection.

    I suppose, if you’re going to try to make romantic connections on social media, it is better to start with people you’re already friends with. Facebook says it will help you with connections based on your groups, likes, and comments in its apps. The goal is to connect you with people with whom you share interests, thus increasing the chance of a match. Facebook has actually said it is not trying to make connections for one-time hookups but to help you find someone you’d be interested in having a real relationship with.

    Facebook is also testing features that will recommend new friends based on your interests, location, work, and even what college you went to. Again, being tested in just a few countries, the Meet New Friends feature will allow users to opt-in and then customize their profile to tell the system what interests to prioritize while connecting them with new friends. You can even list what activities you’d like to do with new friends and then be prompted to send a private message to someone and make plans to do that activity.

    What Parents Should Know

    Fewer of our kids use Facebook now, but there are those who still spend time there. Dating and friend-finding features can be problematic for parents who are concerned about their kids making unwanted connections on social media. My advice is not to allow your child on social media until around the age of 16 (based on their maturity), and even then to keep an open conversation with them about the kinds of people they befriend online. My rule will be to only allow my kids to communicate online with people they already know really well in real life.

    Instagram Fights Bullying

    While Facebook is trying to connect you with more people, Instagram is working to protect you from the people you’re already connected to. Developers have announced a tool that will nudge users to think twice before posting a negative comment on an Instagram photo. They can choose to ignore the advice and post it anyway, but Instagram is hoping that causing them to give pause will curb some of the negativity that Instagram is becoming known for. There are also tools in development that will allow users to block comments from certain users without blocking their accounts altogether.

    Just in case blocking comments isn’t enough of a break from the negativity, another Instagram feature will let you take a break by going into “away mode.” This is a way to sign off of Instagram for a while, no longer receiving messages, comments, or notifications or being prompted to post, without having to delete your account. Also, in an attempt to make Instagram “less pressurized,” they are testing the ability to hide like counts.

    What Parents Should Know

    We have all heard stories of young people deleting or archiving photos because they didn’t get enough likes. We’ve read the horrible news stories about kids who harmed themselves, or worse, as a result of being bullied on Instagram or Snapchat. These efforts by Instagram to curb some of the negativity are a great idea. In my opinion, however, there is no better line of defense than parents. Our job is to create a safe space our kids can come to when they have a question or concern about social media. We should be the ones determining how old they should be before they sign up for that Instagram profile. We should be the ones they come to when some stranger reaches out to them on Snapchat. That can’t happen if we aren’t aware, or if we are too timid about the time they spend on social media. If we take our role seriously, we can raise kids who are healthy and whole.

    Listen to this post as a podcast below:

  • Roblox Has Hit 90 Million Users

    That little game your kids like to play, Roblox, hit 90 million users this week. The company is worth more than $2 billion and has expanded globally, adding more than twenty million users over the past six months. Roblox is more a gaming “platform” than a game, giving users the ability to create their own levels and share them with others who play. When you log in to Roblox you see a list of user-generated levels you can play. You join other users and try out the different games, which include combat, stealth, mazes, puzzles, and sometimes just overall silliness. Some of the user-built levels are just places to hang out and meet new people.

    Roblox has been considered social gaming from the beginning. The entire platform is designed around allowing users to share their content within the game. This requires you to meet new people and possibly even chat with them in order to fully enjoy the game. When you first log in you see multiple games listed that you can choose from. Most of the starting games are curated by the developers but once you start meeting new people you can be invited to their creations and who knows where you’ll end up at that point. This is the concern for many parents when it comes to Roblox.

    What Parents Should Know

    I played a bit of Roblox recently and could definitely see the appeal. The user-made levels are pretty neat and very elaborate. I played a silly banana game, tried a “Wipeout”-style obstacle course, and played around in a world of puzzles. There is really no limit to the fun you can have in the app; the only limit is imagination. This is a great thing for kids, as long as the community rules are followed the way they are supposed to be.

    There are parental controls available, and they’re password protected, but they can hinder some of the options in a game that is intended to be played in a public, shareable social setting. There have been instances of people abusing some of the settings in the app to behave inappropriately in the game. This will always be a possibility when a game is meant to be so extensively social. There is also no age verification, which means you can claim to be whatever age you want when you create your account. I created a 13-year-old girl, just to prove the point.

    Roblox is on my uninstall list because of the social nature of the game. There aren’t really any alternatives that work exactly like Roblox, but there are games that will let you craft, build, and get creative while staying offline. Toca Builders (Android/iOS) can give you the building features, while Toca Life World (Android/iOS) is a game made for older kids that provides a safer environment to explore and play in a world they create on their own. With 90 million users, Roblox has a lot of people creating games and communicating with the kids who are playing. Making sure your kids can’t just talk to random strangers online is one of the most important things we can do to protect them.

    Listen to this article as a podcast below:

  • Tik-Tok Under Fire as Study Finds 25% of Kids Talk to Strangers Online

    Tik-Tok is at the center of a controversy surrounding exposure to predators and child pornographers through live streaming on the app. One in twenty children who use live-streaming apps have been asked to take off their clothes, according to a study by the UK children’s charity NSPCC. Originally called Musical.ly, Tik-Tok claims to “empower everyone to be a creator directly from their smartphones, and is committed to building a community by encouraging users to share their passion and creative expression through their videos.” Their mission statement sounds like they are building a place for our kids to stretch their creative muscles and build a supportive audience, but in reality it is exposing them to potential danger.

    Sexual exploitation is only part of the issue; there are popular hashtags on the app that highlight self-harm and eating disorders. Tags like #thinspo (thinspiration) feature videos of children as young as eight showing their rib cages through their skin and proclaiming that they are inspiring others who desire to be thin. Suicide and self-harm are also featured on the app, complete with encouragement to hurt yourself and instructions on how to do so. Tik-Tok says you have to be 13 to use the app, but as we have shared multiple times on this site, that age exists to protect the company from legal action concerning the collection of children’s data, not to protect your children from content on the app.

    While the app is rated 12+ in app stores in the U.S., the reasons listed for the rating prove to be, in fact, very mature. The issue, again, as I’ve mentioned, is user-generated content. Anyone with a smartphone and a wifi connection can make videos and now livestream in Tik-Tok, and they can also watch you perform on the app. This makes for an open, dangerous atmosphere filled with predators, adult content, scams, and violence.

    What Parents Should Know

    Tik-Tok says it has filters and parental controls that allow you to set the app to private, but all of these measures have proven to be less than effective. Kids who use the app on their own can easily come across content that isn’t age appropriate. The content restriction and time management settings in the app are password protected; they can be useful and should be set up if you allow your child to use Tik-Tok. Also be sure to turn off the ability for non-friends to comment on, share, and download your child’s videos (this is on by default; creepy, right?).

    We don’t want our kids talking to strangers online, and all parents understand the dangers associated with live streaming and posting public videos to the internet. Unfortunately, many parents feel their hands are tied when it comes to keeping their kids safe on these apps and websites. That isn’t the truth, however; there are tools (some in the app and some third party) you can use to keep them from accessing things that are dangerous. An algorithmic filter is never going to be enough, though, so it is important that we have open communication with our kids about what they are posting and seeing on apps like Tik-Tok. Also, if your child doesn’t meet the age restriction, they shouldn’t use the app.

    Twenty-five percent of kids talking to strangers online is a horrifyingly high statistic. It shows that while there are privacy settings and parental controls out there for parents to use, either parents aren’t using them or their kids are getting around them. I know the privacy settings in Tik-Tok aren’t password protected, so if your children want to talk to strangers on the app and have unsupervised time in it, there are ways for them to make that happen. It is important that parents take responsibility for protecting our kids online. Many media outlets are blasting these companies for putting our kids in danger, but I have to be honest: you don’t blame the slide when your kid falls off and busts their face; you think of precautions that YOU can take to keep that from happening in the future.

  • There is a Child Pornography Ring on YouTube and Everyone’s Making it About the Money

    A YouTuber (sensitive content warning) has found evidence of a vast community on YouTube of child predators and people who watch content containing child exploitation. They are using comment sections and timestamps to lead each other to actual child pornography. They start with a search for a simple, popular YouTube trend, and the algorithm YouTube uses to connect viewers with similar content will eventually propose a video of kids that these viewers find appealing. They then click through timestamps in the comments section to parts of the video that seem innocent but are, unfortunately, exactly what these predators have been looking for.

    YouTube’s response didn’t come until after advertisers began pulling their ads. The company started by removing some of the videos and some of the comments, as well as demonetizing (pausing ad revenue on) videos on which these comments are posted. YouTubers are concerned because some of their videos have been or could be demonetized because of a commenter’s words, something they don’t control. Whether you want to blame the site, the viewers, or the makers of the videos, the fact that the conversation goes straight to money is a serious problem.

    I think the money isn’t the issue. I understand hitting them where it hurts, and YouTube should do something, but we have to take some responsibility too. I believe we need to have a serious conversation about what types of videos we allow our children to post publicly, and we should be very concerned about the types of people who watch these videos. Watch the video above to hear more of my thoughts on this issue.

    The video that initially exposed this issue is below; be warned that it is disturbing to watch and contains adult language.


    SENSITIVE CONTENT WARNING!

  • I Can’t Help You Protect Your Kids on Apps Meant for Adults

    I field messages and requests all week long from parents who want the latest tools for keeping their kids safe online. They ask about everything from YouTube to Instagram to Snapchat and want apps that will monitor social media use, block adult content, and limit screen time. While there are resources to help parents in each of those departments, some apps just aren’t intended for your younger child. Unfortunately, many parents have a real problem accepting that fact.

    Streaming Videos

    Let’s look at YouTube as our first example. The video site was created in 2005 as a place for anyone to upload short videos to share with their friends. Google purchased YouTube in 2006, and social media became popular soon after, rocketing YouTube to the successful streaming platform it has become. The site is loaded with videos from filmmakers, vloggers, video gamers, makeup artists, geeks, professionals, educators, ministers, animators, and artists: basically any category you can think of. It has evolved into an unstoppable force, with roughly 300 hours of footage uploaded every single minute. YouTube has come under fire for some of its content being too mature or sensitive, so it has employed algorithms to keep tabs on inappropriate videos. It also released an app for children called YouTube Kids. That app has seen its share of controversy as well, after YouTube was unable to keep sensitive material from showing up in its videos.

    YouTube obviously wasn’t intended for young viewers. It is a site populated primarily by videos uploaded by its users. Some companies that make content for kids use YouTube, but this is a choice those companies make in response to the popularity of the platform. It’s an attitude that says: “Kids are there, so we should be there too.” The goal is to reach the audience already there, not necessarily to build an audience on YouTube. There are no real parental controls (safe search is mostly useless), and videos that are labeled as kid friendly are labeled without any human eyes ever seeing the entire video. The only time a content reviewer sees a video is when enough users of the site have flagged it as inappropriate. Allowing your kids to watch YouTube on their own is a risk that many parents don’t even realize they are taking.

    What about Social Media?

    Snapchat, Instagram, and Facebook are all the same. They, like YouTube, feature content created and posted by the users of the service. This “User Generated Content” can vary from political or religious views, to silly cat videos or memes, to random personal updates that mean nothing to anyone. People also post updates about their serious mental health issues, they share their plans to harm themselves or others, they post images of themselves in compromising situations, and that’s just what people post publicly. Private messaging contains content that people post when they think nobody except those they trust is watching. Private messaging is how predators groom their victims. It’s how the out-of-control teenage boy convinces a girl to send him inappropriate pictures of herself. Social Media is intended to be a place to connect with people, some you may know, some you don’t. It is meant to be a public forum, and whatever is private is meant to stay completely private. This is where the problems come in when parents ask for ways to monitor their kids’ social media.

    Age Rating vs Terms and Agreements

    I see a lot of parents giving their kids access to social media and other online activities when they reach the age of 13. This is based on the fact that the terms and agreements these sites have you approve before making an account list 13 as the minimum age to use their service. A common mistake parents make is thinking that this age is meant to protect their kids from content on the site when, in fact, it’s intended to protect the company from holding data and information on kids under the age of 13. COPPA law says that companies can’t collect and use the information of kids under 13 without parental consent. If a company says you can’t use the site when you’re under 13, then it can do whatever it wants with all of that data, and if your kid is underage, it isn’t the company’s fault. You ignored the Terms and Agreements when you allowed them to use the site.

    Age rating is the age recommendation you’ll see in the app store when you are downloading an app. This restriction is based on the actual content in the app, not on any legal requirements for the company. The usual standard is that apps populated by user generated content are rated 17+. This is because the company can’t guarantee that what is seen on its product won’t be considered adult content. When we allow our kids to use apps that contain user generated content, we are allowing them to be subject to the opinions, behavior, and whims of everyone else who uses that app.

    Parental Involvement Before Parental Control

    When I am asked to help parents protect their kids in apps that are obviously not made for children, I feel like I’m being asked to give parents a suit their kids can wear to protect them while they play in a burning building. I get it. It isn’t easy to tell your kids they can’t do something they want to do. “My friends are all on Snapchat.” Or the one that irritates me to no end: “The teacher/coach says I have to use Facebook to get the homework/practice schedule.” Sometimes we just have to say no. It is difficult to set the boundaries and limits that keep our kids safe, but if we have the right attitude about what we’re protecting them from, it becomes easier. Social Media, YouTube, video games rated M for Mature: none of these things are intended for people under the age of 17, and when we allow our kids to use these products, we open them up to a world that is meant for adults.

    It is difficult for algorithms to catch nudity or violence in uploaded videos. Social Media sites and private messaging apps go to great lengths to keep prying eyes from seeing what is being sent, which makes parental monitoring software hard to develop. Unfortunately, some burning buildings are just too dangerous, and there isn’t much that can be done to protect you if you’re inside. If you aren’t OK with your child seeing content that is meant for grown-ups, then I recommend uninstalling that app instead of trying to find software that keeps it from doing what it was intended to do.

  • Tinder, Grindr, and Predators. Social Media and Suicide. Who do we blame?

    Tinder, Grindr, and Predators. Social Media and Suicide. Who do we blame?


    There have been more than 30 reported instances of abuse of children through the Tinder and Grindr apps since 2015. That number may seem small, but when you consider the fact that kids have easily skirted the age requirements of these dating/hookup apps and made contact with people who wish to harm them, any number is too high. While these companies say they’re doing all they can to keep kids from using their software, all they really say in response to these horrible occurrences is that the predators and kids violated their terms of service. Since the terms say you shouldn’t contact minors and that minors shouldn’t be using the software, the companies claim the responsibility isn’t theirs because the child was put in danger by using the app in a way it wasn’t intended to be used.

    Officials are saying that isn’t good enough, with lawmakers in the UK trying to create legislation that would require age verification on apps like Tinder and even on social media apps like Instagram. Recent suicides have been linked to images of self-harm that were viewed on Instagram. Again, officials at the social media company say that the most violent of the images violate their terms of service. They have recently, however, banned images of self-harm and suicide and removed those categories from search results.

    Here is the question: when these horrible things happen, do we blame the companies who make these online products? Is it enough to write a terms and agreements page and say that those who break the rules do so through their own fault and no fault of the company? So far, legally, that’s all it takes. It seems that the responsibility of the company ends with the terms and conditions page. If the user doesn’t follow the terms, then how is the company supposed to protect users? Some officials are asking for age verification, which means keeping more records, something many companies don’t want to do because of recent privacy and data breach concerns. There is only one thing I know for sure: if families get serious about monitoring their kids’ screen time and online activity, the number of these occurrences will dramatically decrease.

    Let me describe a scenario for you. Your 12-year-old child wants to meet new people online. Maybe they heard some friends talking about a dating or hookup app, maybe they just don’t have a lot of friends in real life. Whatever the reason, they’re looking for a way to meet people, and while they’re looking through the app store, they see dating apps right there in the search results.
    They tap download, create a profile, and start swiping, eventually meeting new people on the app. Conversations move to WhatsApp, Facebook Messenger, or Signal, and they schedule a meetup. Your imagination can take over from there, and if you’ve read some of the news stories, it can get pretty awful.

    Imagine, now, that you have parental controls set so that your child has to request permission to download apps. Maybe you even have their controls set to keep them from downloading apps rated for users over 12 years of age. Either of these approaches would keep that new friendship, or worse, romantic relationship with an online stranger, from forming without your knowledge. Instead, you’ll see that they’re trying to download an app that is designed to connect people for romantic relationships, and you’ll be able to discuss it with them. You can share the dangers of building relationships with strangers and help them understand the importance of privacy, security, and parental supervision.

    There are built-in ways to protect your child on both iOS and Android devices. The key is to set them up. Use the built-in protections and features, and don’t rely on these companies to protect your children. They don’t exist to keep your family safe or even to help people build healthy relationships; these companies develop their products to make money. It is foolish to expect Instagram to protect your kids from suicide. Should these companies take responsibility for what is on their apps? Yes. Should you blame them if your kid harms themselves because of something they saw on the app? Not entirely. You have to take some of the blame onto yourself. There are ways to keep your kids safe from that kind of content, and if you don’t know about them or don’t use them, it isn’t the fault of the company. It’s yours. Be involved, pay attention, and do the work to keep them safe.

  • How to Turn Off Hidden Location Access in iOS

    How to Turn Off Hidden Location Access in iOS

    I recently noticed that some of my photos and videos were still being tagged with a location. One of the most common pieces of advice I give to parents is to turn off location access for their cameras, so I was a bit annoyed: I had never seen location information in my Photos app before, but now I did. Well, I did a bit of digging and found the culprit. It’s about five taps deep into your privacy settings and, therefore, easy to miss. Below is a short video tutorial to help you make sure the location info on your phone stays as private as possible.

    Why Turn This Off

    Your location information is easy to track and very easy to gather from the data in the videos and pictures that you upload to social media. There have been instances of kids being harassed by predators who learned where they were through images their parents had shared online. Common sense tells us never to post pictures or videos that show an address number, school name, or sign of a place you frequently go with your family. The problem is that some of the apps on our phones tag our locations by default. I recommend you look at every app’s location request and ask yourself, “Does this app HAVE TO know where I am to function properly?” If it does not, then turn off access to your location.
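    To see how little effort it takes someone to pull a location out of a shared photo, here is a rough sketch in Python. Everything in it is illustrative: the helper names are made up, and reading the photo requires the third-party Pillow library. The point is simply that the GPS coordinates your camera tucks into a photo’s metadata are sitting there in a standard, machine-readable format.

    ```python
    # Illustrative sketch: pulling GPS coordinates from a photo's EXIF
    # metadata. Helper names are hypothetical; Pillow is a third-party
    # library (pip install Pillow) needed only for the file-reading part.

    def dms_to_decimal(dms, ref):
        """Convert EXIF degrees/minutes/seconds into a decimal coordinate."""
        degrees, minutes, seconds = (float(v) for v in dms)
        decimal = degrees + minutes / 60 + seconds / 3600
        # Southern and western coordinates are negative in decimal form.
        return -decimal if ref in ("S", "W") else decimal

    def read_gps(path):
        """Return (latitude, longitude) from a photo's EXIF data, or None."""
        from PIL import Image            # third-party: Pillow
        from PIL.ExifTags import GPSTAGS
        exif = Image.open(path).getexif()
        gps_ifd = exif.get_ifd(0x8825)   # 0x8825 is the EXIF GPS block
        if not gps_ifd:
            return None                  # no location stored in this photo
        gps = {GPSTAGS.get(tag, tag): value for tag, value in gps_ifd.items()}
        lat = dms_to_decimal(gps["GPSLatitude"], gps["GPSLatitudeRef"])
        lon = dms_to_decimal(gps["GPSLongitude"], gps["GPSLongitudeRef"])
        return lat, lon

    # Example conversion: 40 degrees, 26 minutes, 46.8 seconds North
    # works out to roughly latitude 40.446.
    print(dms_to_decimal((40, 26, 46.8), "N"))
    ```

    A handful of lines is all it takes, which is exactly why turning location access off for the camera matters.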

    Hopefully, this short video helps you make the changes you need to feel like your privacy is even more secure. I know I feel better.