Tag: texting

  • I Can’t Help You Protect Your Kids on Apps Meant for Adults


    I field messages and requests all week long from parents who want the latest tools for keeping their kids safe online. They ask about everything from YouTube to Instagram to Snapchat and want apps that will monitor social media use, block adult content, and limit screen time. While there are resources to help parents in each of those departments, some apps simply aren’t intended for younger children. Unfortunately, many parents have a hard time accepting that fact.

    Streaming Videos

    Let’s look at YouTube as our first example. The video app was created in 2005 as a place for anyone to upload short videos to share with their friends. Google purchased YouTube in 2006, and as Social Media took off soon after, YouTube rocketed to the successful streaming platform it has become. The site is loaded with videos from filmmakers, vloggers, video gamers, makeup artists, geeks, professionals, educators, ministers, animators, artists: basically any category you can think of. It has evolved into a juggernaut with roughly 300 hours of footage uploaded every single minute. YouTube has come under fire for some of its content being too mature or sensitive, so it employs algorithms to keep tabs on inappropriate videos. It also released an app for children called YouTube Kids. That app has seen its share of controversy, too, after YouTube proved unable to keep sensitive material from showing up in its videos.

    YouTube obviously wasn’t intended for young viewers. It is a site populated primarily by videos uploaded by its users. Some companies that make content for kids use YouTube, but that is a choice those companies make in response to the popularity of the platform. It’s an attitude that says: “Kids are there, so we should be there too.” The goal is to reach the audience already there, not necessarily to build an audience on YouTube. There are no real parental controls (safe search is mostly useless), and videos are labeled as kid-friendly without any human eyes ever seeing the entire video. The only time a content reviewer watches a video is when enough users of the site have flagged it as inappropriate. Allowing your kids to watch YouTube on their own is a risk that many parents don’t even realize they are taking.

    What about Social Media?

    Snapchat, Instagram, and Facebook are all the same. They, like YouTube, feature content created and posted by the users of the service. This “User Generated Content” can vary from political or religious views, to silly cat videos or memes, to random personal updates that mean nothing to anyone. People also post updates on their serious mental health issues, share their plans to harm themselves or others, and post images of themselves in compromising situations, and that’s just what people post publicly. Private messaging contains content that people share when they think nobody except those they trust is watching. Private messaging is how predators groom their victims. It’s how the out-of-control teenage boy convinces the girl to send him inappropriate pictures of herself. Social Media is intended to be a place to connect with people, some you may know, some you don’t. It is meant to be a public forum, and what is meant to be private is meant to be completely private. This is where the problems come in when parents ask for ways to monitor their kids’ social media.

    Age Rating vs Terms and Agreements

    I see a lot of parents giving their kids access to social media and other online activities when they reach the age of 13. This is based on the fact that the terms and agreements these sites have you approve before making an account list 13 as the minimum age to use their service. A common mistake parents make is thinking that this age is meant to protect their kids from content on the site when, in fact, it’s intended to protect the company from collecting data and information on kids under the age of 13. COPPA says that companies can’t collect and use the information of kids under 13 without parental consent. If a company says you can’t use the site when you’re under 13, then it can do whatever it wants with all of that data, and if your kid is underage, that isn’t the company’s fault: you ignored the Terms and Agreements when you allowed them to use the site.

    Age rating is the age recommendation you’ll see in the app store when you are downloading an app. This restriction is based on the actual content in the app, not any legal requirements for the company. The usual standard is that apps populated by user generated content are rated 17+, because the company can’t guarantee that what is seen on its product won’t be considered adult content. When we allow our kids to use apps that contain user generated content, we are allowing them to be subject to the opinions, behavior, and whims of everyone else who uses that app.

    Parental Involvement Before Parental Control

    When I am asked to help parents protect their kids in apps that are obviously not made for children, I feel like I’m being asked to give parents a suit their kids can wear to protect them while they play in a burning building. I get it. It isn’t easy to tell your kids they can’t do something they want to do. “My friends are all on Snapchat,” or the one that irritates me to no end, “The teacher/coach says I have to use Facebook to get the homework/practice schedule.” Sometimes we just have to say no. It is difficult to set the boundaries and limits that keep our kids safe, but if we have the right attitude about what we’re protecting them from, it becomes easier. Social Media, YouTube, video games rated M for Mature: none of these things are intended for people under the age of 17, and when we allow our kids to use these products, we open them up to a world that is meant for adults.

    It is difficult for algorithms to catch nudity or violence in uploaded videos, and Social Media sites and private messaging apps go to great lengths to keep prying eyes from seeing what is being sent. This makes parental monitoring software hard to develop. Unfortunately, some burning buildings are just too dangerous, and there isn’t much that can be done to protect you if you’re inside. If you aren’t OK with your child seeing content that is meant for grown-ups, then I recommend uninstalling the app instead of trying to find software that keeps it from doing what it was intended to do.

  • How Your Kids Can Hide Texts


    One of the major issues facing our teens these days is sexting. Statistics say that one in every ten teenagers admits to having sent naked pictures of themselves to someone. Sometimes our kids use social media to do this, sending photos and inappropriate messages through the private messaging features of these platforms. Sometimes, though, your kids just use text messaging to do it. There are several tools out there that allow you to monitor what texts your kids are sending, but there are a few ways they can hide their texts, even from the security you’ve set up. Here’s what they’re doing.

    Deleting text history.

    This seems pretty obvious, but you’d be surprised how oblivious some parents can be. If there aren’t any text messages in your kids’ messaging app, or if it looks like they’ve only chatted with you, they’ve probably deleted their messages. This doesn’t always mean they’re up to something naughty, but it does mean you should be having a conversation with them. Deleting their messages is a bad habit to let them get into for a number of reasons. First, it looks like they’re hiding something. They don’t want you to be suspicious of them any more than you want to creep around and spy on them. Second, they could be deleting conversations that may matter in the future. It’s not a bad thing to have written (or typed) evidence of these conversations, especially if they’re ever contacted by someone they don’t know.

    I advise you to encourage your kids not to delete texts. If you’re using monitoring software that relies on the iCloud backup (TeenSafe, mSpy) to monitor their texts, you could miss what they’ve texted if they deleted it while away from wifi and before a backup to the cloud. If you suspect that text messages are being deleted, you should disallow texting on their phone. If you can’t disable texting, don’t be afraid to take the phone away for a while. Most of all, talk to your kids about the risks of keeping their conversations hidden. You should be there to help them, and they need to understand that.

    Using Dummy Phone Numbers and Private Texting Apps

    Apps like TextBurner, Anonymous Texting, Buffalo Private Texting, and Smiley Private Texting can easily be used by kids and teens to hide the conversations they’ve been having. Not only do these apps require a PIN to access the text messages, but many of them also let you set up a new phone number so that you can send and receive texts or calls anonymously. The apps’ descriptions mention job searches, Craigslist, and dating as some of the main uses for these private texting and dummy phone numbers. They do, however, advise against certain uses of the app and even warn of some of their policies for dealing with those who don’t follow guidelines:

     

    Screenshot from the Anonymous Texting App

     

    Check out this list of private texting apps for iOS, or this list of private texting apps for Android.

    Notice the warning about the age requirement? It says you must be 13 years old. If that’s so, then why is the app rated for ages 4+? If you scan all of the apps like this in iOS, you’ll find that they’re all rated 4+. If your app store settings allow your kid to download apps rated 9+ or lower, or even 4+, they’d be able to download one of these apps, create a private, secret phone number you don’t even know about, and begin texting whomever they’d like. This is why I recommend using Family Link (for Android phones running OS 7 or higher) and Apple’s Family Sharing to require your child to ask permission before downloading new apps. If you see any kind of app with “secret,” “private,” or “anonymous” in the description, I’d think twice about allowing them to download it. We have a major issue on our hands of kids sending images and texts that are very adult oriented. You honestly can’t keep an eye on every message they send on every app, which is why it’s important to limit which apps they’re allowed to use. It may cause that knock-down, drag-out fight you’ve been trying to avoid, but it’s better than filing a police report about some stranger who has been sending nasty pictures to your child.

     

  • The Hooked App is as Addictive as it Sounds


    We all want our kids to read more, so an app that claims its users have spent over 500,000 hours reading sounds like a godsend. It is, in fact, a great idea and a pretty original way to get its users to read. The Hooked app isn’t new, having launched in 2014, but it’s ranked number 6 in the iOS app store today, and its popularity is growing fast.

    Hooked is an app that tells stories in the form of text-message conversations. The story topics include comedy, horror, fantasy, and sci-fi. Each story has a title-page photo, many of which feature a pretty girl or a couple in a romantic or suggestive pose. When you’ve made your choice, the story unfolds one text message at a time, usually as a conversation between a couple of people. You tap the screen to reveal the next message, and you may find yourself in a tapping frenzy to reach the next plot twist. Then, however, you’ll suddenly be halted by the Hooked Owl asking you to pay for more “Hoots.” A “Hoot” is a click/tap, and you only get a certain number of them every hour. Once you’ve used them up, you’ll be prompted to buy more or sign up for a weekly or monthly subscription to get unlimited “Hoots.” If you’re like me and not ready to pay to find out the next line of the story, you can just wait until the next hour begins.

    The Hooked app cashes in on the obvious popularity of texting by using it as a storytelling venue. This makes it very appealing to young adults and teenagers. The target audience for the Hooked app is ages 13-24, but I wouldn’t just overlook the app if I saw it on my teen’s phone.

    What Parents Should Know

    The Hooked app consists of some very mature-themed stories but targets younger teens. The stories are delivered in a way that your teens and tweens will definitely find appealing. I found myself anxious to read the next message as the story progressed. Naturally, the story really started to climax right before I ran out of “Hoots,” so I had to wait a while to continue reading. While this did get me to close the app for a while, $7.99 per month for unlimited “Hoots” is a low price tag for being able to sit all day long and click through these stories.

    My issue isn’t really with the way the stories are delivered; it’s actually a creative way to tell these tales. The “one post at a time” method lends itself to a lot of suspense and a pretty entertaining read. The problem I have is the addictive nature of this app: if you could tap an unlimited number of times and go from story to story, you’d very easily find yourself reading through a hundred of these stories in just a few days. Also, these stories can be pretty mature, dramatic, and suspenseful. The categories feature love and thriller options, and the stories get quite intense at times. The texting storytelling method also makes them a bit more eerie. Users can also write stories, which creates a completely new potential problem. Any time you’re dealing with User Generated Content, it’s hard to be sure what your kids may read.

    My advice is to know your child and their maturity level well before you let them use Hooked. I wouldn’t go by the 9+ rating it has in the app store; I would assume your child should be a bit older. Keep in mind that many of these stories are written by users of the app. Since the content can’t be guaranteed to be safe for any age group, you should help your teen or tween by involving yourself in their decision to use Hooked. If they do read on the app, I recommend asking them what the stories are about and what they like about them. Keep yourself in the loop and informed as much as possible.

  • Who is To Blame for the Dangers of Technology?


    Someone is Suing Apple…Again.

    Who’s to blame when the dangers that technology can present become a reality? A series of lawsuits filed in California against Apple claims that we can blame the developers of that tech. California resident Julio Ceja is suing Apple to force the company to implement a feature that will lock an iPhone once it reaches a certain speed. He says Apple has already filed a patent for technology that uses GPS speed to lock a phone. Ceja isn’t suing for any money beyond legal costs and court fees. Apple claims that the responsibility for safety lies with the users of its phones to turn off notifications or use “Airplane Mode” while driving.

    Texting and driving is frowned upon everywhere and even illegal in many states, but the responsibility has traditionally rested with the driver, not the company that made the phone. Developers of smartphones will say that they can’t ensure their product will be used as they recommend and therefore can’t be blamed for any dangers that come from the use of their phone or tablet. These cases are important because they will set a precedent for what safety concerns companies will have to consider as they design, produce, and update their products. We will also learn how much responsibility the law considers to be personal.

    Parents Should be the First Layer

    …Ceja alleges that Apple willfully did not implement a lock-out mechanism out of a choice to emphasize its business over customer safety, a choice that he believes is an example of “unfair business acts and practices” under California’s Unfair Competition Law. – TechCrunch

    The courts will decide whether or not Apple is guilty of unfair business acts, but as parents, we have to look closely at the question of responsibility with tech. Yes, there is a level of safety a company should reasonably consider when producing a product; however, the first layer of responsibility should lie with parents. No, your kids shouldn’t text and drive, and they are hearing that from all over. The question is: are they hearing it from you? Are they seeing something different from you? If you are texting and driving while your kids are hearing the message that it’s wrong and dangerous, then you are removing a layer of education that can be critical to your child or teen’s safety. Our example is very important.

    Texting and driving isn’t the only issue. Frustration with video game addiction or exposure to adult content online is understandable, but if mom and/or dad aren’t setting any boundaries to help their kids learn healthy behavior, the company that made the video game can’t be blamed. Neither can the pornography industry. The message we send our kids about healthy media and technology practices will shape their own behavior in the future. Pay close attention not only to what you say but, even more importantly, to what you do.