Tag: suicide

  • TikTok Under Fire as Study Finds 25% of Kids Talk to Strangers Online

    TikTok is at the center of a controversy over children being exposed to predators and child pornographers through live streaming on the app. One in twenty children who use live-streaming apps has been asked to take off their clothes, according to a study by the UK children’s charity NSPCC. Originally called Musical.ly, TikTok claims to “empower everyone to be a creator directly from their smartphones, and is committed to building a community by encouraging users to share their passion and creative expression through their videos.” That mission statement sounds like they are building a place for our kids to stretch their creative muscles and find a supportive audience, but in reality the app is exposing them to potential danger.

    Sexual exploitation is only part of the issue; there are popular hashtags on the app that highlight self-harm and eating disorders. Tags like #thinspo (thinspiration) feature videos of children as young as eight showing their rib cages through their skin and proclaiming that they are an inspiration to others who want to be thin. Suicide and self-harm are also featured on the app, complete with encouragement to hurt yourself and instructions on how to do so. TikTok says you have to be 13 to use the app, but as we have shared multiple times on this site, that age exists to protect the company from legal action over the collection of children’s data, not to protect your children from content on the app.

    While the app is rated 12+ in app stores in the U.S., the reasons listed for that rating are, in fact, very mature. The issue, again, as I’ve mentioned, is user-generated content. Anyone with a smartphone and a Wi-Fi connection can make videos and now livestream in TikTok, and they can also watch you perform on the app. This makes for an open, dangerous atmosphere filled with predators, adult content, scams, and violence.

    What Parents Should Know

    TikTok says it has filters and parental controls that allow you to set the app to private, but all of these measures have proven to be less than effective. Kids who use the app on their own can easily come across content that isn’t age appropriate. The content restriction and time management settings in the app are password protected; they can be useful and should be set up if you allow your child to use TikTok. Also be sure to turn off the ability for non-friends to comment on, share, and download your child’s videos (these options are on by default; creepy, right?).

    We don’t want our kids talking to strangers online, and all parents understand the dangers associated with live-streaming and posting public videos to the internet. Unfortunately, many parents feel that their hands are tied when it comes to keeping their kids safe on these apps and websites. That isn’t the truth, however; there are tools (some in the app and some third party) you can use to keep them from accessing things that are dangerous. An algorithmic filter is never going to be enough, though, so it is important that we have open communication with our kids about what they are posting and seeing on apps like TikTok. And if your child doesn’t meet the age restriction, they shouldn’t use the app.

    Twenty-five percent of kids talking to strangers online is a horrifyingly high statistic. It shows that while there are privacy settings and parental controls out there for parents to use, either parents aren’t using them or their kids are getting around them. The privacy settings in TikTok aren’t password protected, so if your children want to talk to strangers on the app and they spend time using it unsupervised, there are ways for them to make that happen. It is important that we as parents take responsibility for protecting our kids online. Many media outlets are blasting these companies for putting our kids in danger, but I have to be honest: you don’t blame the slide when your kid falls off and busts their face; you think of precautions that YOU can take to keep that from happening in the future.

  • I Can’t Help You Protect Your Kids on Apps Meant for Adults

    I field messages and requests all week long from parents who want the latest tools for keeping their kids safe online. They ask about everything from YouTube to Instagram to Snapchat and want apps that will monitor social media use, block adult content, and limit screen time. While there are resources to help parents in each of those departments, some apps just aren’t intended for your younger child. Unfortunately many parents have a real problem giving in to that fact.

    Streaming Videos

    Let’s look at YouTube as our first example. The video site was created in 2005 as a place for anyone to upload short videos to share with their friends. Google purchased YouTube in 2006, and social media became popular soon after, rocketing YouTube to the successful streaming platform it has become. The site is loaded with videos from filmmakers, vloggers, video gamers, makeup artists, geeks, professionals, educators, ministers, animators, artists: basically any category you can think of. It has evolved into a juggernaut, with around 300 hours of footage uploaded every single minute. YouTube has come under fire for some of its content being too mature or sensitive, so it has employed algorithms to keep tabs on inappropriate videos. It also released an app for children called YouTube Kids. This app has seen its share of controversy as well, because YouTube has been unable to keep sensitive material from showing up in videos on the app.

    YouTube obviously wasn’t intended for young viewers. It is a site populated primarily by videos uploaded by its users. Some companies that make content for kids use YouTube, but this is a choice made in response to the popularity of the platform. It’s an attitude that says: “Kids are there, so we should be there too.” The goal is to reach the audience that’s already there, not necessarily to build an audience on YouTube. There are no real parental controls (safe search is mostly useless), and videos are labeled as kid friendly without any human eyes ever seeing the entire video. The only time a content reviewer sees a video is when enough users of the site have flagged it as inappropriate. Allowing your kids to watch YouTube on their own is a risk that many parents don’t even realize they are taking.

    What about Social Media?

    Snapchat, Instagram, and Facebook are all the same. They, like YouTube, feature content created and posted by the users of the service. This “user-generated content” can range from political or religious views, to silly cat videos and memes, to random personal updates that mean nothing to anyone. People also post updates about serious mental health issues, share their plans to harm themselves or others, and post images of themselves in compromising situations, and that’s just what people post publicly. Private messages contain what people send when they think nobody except those they trust is watching. Private messaging is how predators groom their victims. It’s how the out-of-control teenage boy convinces the girl to send him inappropriate pictures of herself. Social media is intended to be a place to connect with people, some you may know, some you don’t. It is meant to be a public forum, and what is private on it is meant to be completely private. This is where the problems come in when parents ask for ways to monitor their kids’ social media.

    Age Rating vs Terms and Agreements

    I see a lot of parents giving their kids access to social media and other online activities when they reach the age of 13. This is based on the fact that the terms and conditions these sites have you approve before making an account list 13 as the minimum age to use the service. A common mistake parents make is thinking that this age is meant to protect their kids from content on the site when, in fact, it’s intended to protect the company from holding data and information on kids under the age of 13. COPPA says that companies can’t collect and use the information of kids under 13 without parental consent. If a company says you can’t use the site when you’re under 13, then it can do whatever it wants with all of that data, and if your kid is underage, it isn’t the company’s fault. You ignored the terms and conditions when you allowed them to use the site.

    The age rating is the recommendation you’ll see in the app store when you are downloading an app. This rating is based on the actual content in the app, not on any legal requirements for the company. The usual standard is that apps populated by user-generated content are rated 17+, because the company can’t guarantee that what is seen on its product won’t be considered adult content. When we allow our kids to use apps that contain user-generated content, we are allowing them to be subject to the opinions, behavior, and whims of everyone else who uses that app.

    Parental Involvement Before Parental Control

    When I am asked to help parents protect their kids in apps that are obviously not made for children, I feel like I’m being asked to give parents a suit their kids can wear to protect them while they play in a burning building. I get it. It isn’t easy to tell your kids they can’t do something they want to do. “My friends are all on Snapchat.” Or the one that irritates me to no end: “The teacher/coach says I have to use Facebook to get the homework/practice schedule.” Sometimes we just have to say no. It is difficult to set the boundaries and limits that keep our kids safe, but if we have the right attitude about what we’re protecting them from, it becomes easier. Social media, YouTube, video games rated M for Mature: none of these things are intended for people under the age of 17, and when we allow our kids to use these products, we open them up to a world that is meant for adults.

    It is difficult for algorithms to catch nudity or violence in uploaded videos, and social media sites and private messaging apps go to great lengths to keep prying eyes from seeing what is being sent. This makes parental monitoring software hard to develop. Unfortunately, some burning buildings are just too dangerous, and there isn’t much that can be done to protect you if you’re inside. If you aren’t OK with your child seeing content that is meant for grown-ups, then I recommend uninstalling the app instead of trying to find software that keeps it from doing what it was intended to do.

  • Tinder, Grindr, and Predators. Social Media and Suicide. Who do we blame?

    There have been more than 30 instances of abuse of children through the Tinder and Grindr apps since 2015. That number may seem small, but when you consider the fact that kids have easily skirted the age requirements of these dating/hookup apps and made contact with people who wish to harm them, any number is too high. While these companies say they’re doing all they can to keep kids from using their software, all they really say in response to these horrible occurrences is that the predators and the kids violated their terms of service. Since the terms say adults shouldn’t contact minors and that minors shouldn’t be using the software at all, the companies claim the responsibility isn’t theirs, because the child was put in danger by using the app in a way it wasn’t intended to be used.

    Officials are saying that isn’t good enough, with lawmakers in the UK trying to create legislation that would require age verification on apps like Tinder and even on some social media apps like Instagram. Recent suicides have been linked to images of self-harm viewed on Instagram. Again, officials at the social media company say that the most violent of the images violate their terms of service. They have recently, however, banned images of self-harm and suicide and removed those categories from search results.

    Here is the question: when these horrible things happen, do we blame the companies who make these online products? Is it enough to write a terms and conditions page and say that those who break the rules do so at their own fault and no fault of the company? So far, legally, that’s all it takes. It seems that the responsibility of the company ends with the terms and conditions page. If the user doesn’t follow the terms, then how is the company supposed to protect users? Some officials are asking for age verification, which means keeping more records, something many companies don’t want to do because of recent privacy and data breach concerns. There is only one thing I know for sure: if families get serious about monitoring their kids’ screen time and online activity, the number of these occurrences will dramatically decrease.

    Let me describe a scenario for you. Your 12-year-old child wants to meet new people online. Maybe they heard some friends talking about a dating or hookup app; maybe they just don’t have a lot of friends in real life. Whatever the reason, they’re looking for a way to meet people. While they’re looking through the app store, they see this in the search results:

     

    They tap download, create a profile, and start swiping, eventually meeting new people on the app. Conversations move to WhatsApp, Facebook Messenger, or Signal, and they schedule a meetup. Your imagination can take over from there, and if you’ve read some of the news stories, it can get pretty awful.

    Imagine, now, that you have parental controls set so that your child has to request permission to download apps. Maybe you even have their controls set to keep them from downloading apps rated for users over 12 years of age. Either approach would keep that new friendship, or worse, romantic relationship with an online stranger, from ever starting behind your back. Instead, you’ll see that they’re trying to download an app designed to connect people for romantic relationships, and you’ll be able to discuss it with them. You can share the dangers of building relationships with strangers and help them understand the importance of privacy, security, and parental supervision.

    There are built-in ways to protect your child on both iOS and Android devices. The key is to set them up. Use the built-in protections and features, and don’t rely on these companies to protect your children. They don’t exist to keep your family safe or even to help people build healthy relationships; these companies develop their products to make money. It is foolish to expect Instagram to protect your kids from suicide. Should these companies have responsibility for what is on their apps? Yes. Should you blame them if your kid harms themselves because of something they saw on the app? Not entirely. You have to take some of the blame onto yourself. There are ways to keep your kids safe from that kind of content, and if you don’t know about them or don’t use them, it isn’t the fault of the company. It’s yours. Be involved, pay attention, and do the work to keep them safe.

  • Instagram Updates Give You More Control Over Your Feed

    So many people migrated their photo sharing to Instagram several years ago because of its chronological timeline. As Facebook became more algorithm based, people felt they had no control over what they were seeing on their timelines, so they opened Instagram accounts. A few years ago, Instagram went the way of the algorithm as well, trying to give the posts you are most interested in a higher priority in your feed. This was met with mixed reactions, and now Instagram is working to give you more control.

    The first addition, effective now, is the “Mute” feature. This allows you to hide someone’s posts from your feed without completely unfollowing them. It can be useful for hiding bullies without letting them see that you’ve unfollowed them (which could encourage even more bullying), and it also helps the algorithm learn your preferences. To mute someone, simply press the three dots to open the menu on one of their posts, then select Mute. That’s it; you shouldn’t see posts from that user anymore.

    The second announcement covers updates that haven’t made their way to users yet. Soon, Instagram will include an insights feature that lets you see how often, and how, you use the service, as well as a notification when you’ve seen all the posts from the previous 48 hours. Instagram and parent company Facebook are hoping these features will improve the user experience by helping people develop better usage habits. There has been more of a focus recently on improving the “overall well-being” of their users; some of the recent updates on YouTube, Facebook, Snapchat, and Instagram have been attempts to encourage more engagement within their communities without requiring people to spend as much time using their services.

    What Parents Should Know

    Any change to social media sites that encourages breaks from screens is a good change, but nothing will replace a parent setting limits for their children. Remember, also, that there is no better lesson than your own example. Monitor your own screen time and make healthy choices so that you can advise your kids from a place of leadership. Instagram’s new mute feature will help moms and dads keep unwanted posts away from their kids, and it will help older kids silence those they no longer want to hear from.

    My hope is that as parents we can guide our children into proper use of social media. Statistics show that the chance of depression increases the more our children use social media, and more depression increases the rate of suicide among young people. In fact, suicide is now the second leading cause of death among teenagers. With these facts in mind, even the social media services themselves are taking notice and making changes. Parents, do not allow yourselves to be caught in the dark when it comes to your kids and the safe use of their technology. While Facebook, Instagram, and YouTube have been making changes, the responsibility does not fall to these companies to protect our children. It falls to us, their parents.

  • Instagram’s Comment Controls Can Help Parents Breathe a Little Easier

    Cyberbullying and suicide are two of the most dangerous symptoms of our digital culture. Suicide is now the third leading cause of death among teens and half of all teens admit to having been bullied online. One of the major ways these bullies find their foothold is through comments on social media. My advice to parents is always to keep their kids away from online comments as much as possible. Instagram’s new comment control feature will help moms and dads be more effective. Their new live video reporting feature could even save lives.

    Comment Controls

    Instagram wants you to have control over who sees what you post. They’re also giving you control over what people say about your selfies and food pics. Well, not so much what they say as who is saying it. If you are concerned about random strangers contacting your kids on Instagram through the comments section, you can have them restrict who can comment on their posts. There are four settings: Everyone, Your Followers, People You Follow, or People You Follow and Your Followers. These settings let you limit comments on your kids’ posts to a smaller group of people. This can be very helpful, but only if your kid agrees with your motives; the settings aren’t password protected, so they can be changed at any time. The best way to ensure your kids are keeping the settings as you’d want them is to check who’s commenting on their posts. If you don’t recognize a commenter from your child’s account, you should ask about it.

    Live Video Reporting

    Instagram has also added a reporting feature for live videos, to highlight when someone may be considering self-harm or harming others while filming. If a video is reported, the person filming will be shown a message encouraging them to reach out for help. Instagram has trained staff available 24/7 to respond to people who reach out via the Instagram Live reporting feature. The hope is to give friends a way to help friends stay safe, and maybe even choose to stay alive. What a great way to be reminded that people care about you. This feature is also available on Facebook Live.

    What Parents Should Know

    Some social media sites are leaning toward the most public and open atmosphere possible (here’s looking at you, Snapchat), which can be dangerous for our kids. Instagram and Facebook seem to be taking notice of our desire to keep some things private, or within our chosen circle of friends. Understanding and using these features is very important for parents as we work to keep our kids safe online. Be sure to keep yourself informed.

    If you’re looking for even more info about how to protect your children online, you can contact me (Michael) about hosting a workshop to train you and your friends on family internet safety. Home workshops are free and are available all over the country through Skype. Email me at BecauseFamily@gmail.com to learn more.

  • Four Blue Whale Challenge FACTS

    There are a lot of misconceptions about the Blue Whale Challenge. Some are calling it the latest threat to our kids while others are saying it’s being blown out of proportion. The short video below unpacks some of the facts that I’ve discovered as I look deeper into this trending topic.

     


    The Facts:

    1. Kids ARE hurting themselves because of the Blue Whale Challenge.
    2. There have been zero (0) confirmed suicides resulting from the Blue Whale Challenge.
    3. Authorities are concerned that the attention we’re giving this trend will turn it into something bigger.
    4. There are steps you can take, and you should take them now.

    What Parents Should Know

    There are always reasons to take a trend like this seriously. We, as parents, should be vigilant in keeping an eye on what our kids are doing online, while not allowing the trendiness of a topic like the Blue Whale Challenge to cause us to inflate it into even more of an issue. Be wise and take the steps you know to take to keep your kids safe online.

    My Advice:

    1. Know who they talk to online.
    2. Follow them on your social media accounts.
    3. Have their login information for EVERY social media account they have.
    4. Keep an eye out for the signs of depression or other struggles.
    5. Talk to your children openly about the dangers of buying into these trends and communicating with people you don’t know.

  • Social Media Live Video Causes Public Mental Health Concerns

    Mark Zuckerberg is on a mission to make social media safer for our minds. No, he isn’t trying to protect us from adult images; he’s more concerned with fake news and potentially damaging live video. Over the past several weeks, some live Facebook videos have garnered much attention because of the graphic and horrific nature of their content. Videos of murder or suicide have been passed around social media and shown up in many of our Facebook feeds. Once these videos are filmed live, they are saved to the account of the person who filmed them and spread across the timelines of their followers. This often leads to more shares and potentially a viral spread of the video. The sudden popularity of these gruesome videos can then lead to thousands or even millions of people seeing them before Facebook can take them down. This is where the concern for public mental health comes in.

    Facebook’s response to this issue is the hiring of 3,000 new employees whose job it is to screen live videos for any content deemed a danger to the mental health of Facebook’s users. This team of reviewers is an addition to the nearly 4,500 people already screening content. The issue is that live video adds to the challenge of keeping Facebook free from graphic images and videos; just responding to reports that a post may be harmful isn’t enough anymore. Facebook is trying to screen some videos and images before they’re posted. Hopefully, this will make for fewer viral videos that give us nightmares. It will also set a precedent for other social media, including the platforms your kids use.

    What Parents Should Know

    If you haven’t had a reason to talk to your kids about what they see online yet, this one should do it for you. With constant opinions and worldviews being tossed around social media, we have to have an active, ongoing conversation with our kids about what they’re seeing on their timelines. Videos are posted and shared long before any of us can see them and remove them, and long before we can step in and keep our kids from seeing them. News articles are taken as fact even if they are in the “opinion” category on the news site. This is why my advice is to be a safe place for your kids to come to when they see something troubling or have questions about what they’ve seen.

    Whether it’s violence, bullying, or sexual content, what we see can’t be unseen. In a world where technology is changing faster than we can keep up, it’s critical to be the one your kids come to when they’ve seen something that will stick with them. If the companies who develop these social media platforms are concerned enough to hire more employees to help solve the problem, then those of us whose families use their services should be on top of setting up safeguards, learning more about these tech topics, and keeping the lines of communication open.



  • Why Your Teen Posts Her Feelings On Social Media

    I’ve seen them called “vaguebook posts”: the status on social media that hints at some sort of trouble or drama but doesn’t go into detail. It can be annoying to see these things in our Instagram or Facebook feeds, but sometimes they’re not just cries for attention; they’re cries for help. Teens are being taught (rightfully so) that it’s better to express their feelings than to hold them inside. Studies show that more and more young people admit to having bouts of depression, anxiety, and even suicidal thoughts. Social media has become a safe place for them to express how they feel. The problem is that it can often open them up to criticism and unwanted attention.

    The 12-month prevalence of MDEs (major depressive episodes) increased from 8.7% in 2005 to 11.3% in 2014 in adolescents and from 8.8% to 9.6% in young adults (both P < .001). – Pediatrics Journal Study

    The reasons for these increasing numbers could be related to increased awareness of the symptoms of depression, but regardless of the reason, these are statistics parents should pay attention to. Mental health professionals are championing more awareness and openness about depression and anxiety. They agree that being outspoken about how you feel can lead to prevention and early detection, and can even increase the likelihood of sufferers getting professional help. The problem, however, may be that this transparency is happening in a public forum like social media. This is where parents come in.

    What Parents Should Know

    I am not a psychologist or licensed counselor, and I don’t have a professional opinion on the mental health of your teenager. I do, however, have a professional opinion about their activity on social media. As a family internet safety expert, I see parents struggle to open the lines of communication with their teens. An emotional post on social media should be seen as an open door. Nothing is more important in the life of your teenager or young adult than the ability to be transparent with you about how they feel. If they are posting on social media about depression, anxiety, or especially suicide, it’s time to bring that conversation into a face-to-face meeting. Posting such transparent updates (even vague ones) on social media opens your child up to more bullying and harassment, which could be what’s contributing to the problem in the first place. They should be advised to express those thoughts out loud to someone they trust.

    My advice is to start the conversation with your child even before you see any sign of depression. Ask them how they are doing and feeling. Ask them if they feel overwhelmed and if there is anything happening in their lives that you should know about. If you are seeing true signs of depressive episodes, don’t hesitate to seek out a licensed counselor who can mediate your discussions with your teen. The world they are growing up in is very different from the one of even twenty years ago. The standards kids are asked to live up to are higher, criticism is more public, and the media teaches messages that conflict with what many of us feel is healthy for our kids to believe. This problem is real, and it isn’t going to solve itself. We, as parents, have to step up and help our teens make quality decisions about their feelings and mental health. Take the first step today. Have a conversation.

