Category: Social Media

  • I Can’t Help You Protect Your Kids on Apps Meant for Adults


    I field messages and requests all week long from parents who want the latest tools for keeping their kids safe online. They ask about everything from YouTube to Instagram to Snapchat and want apps that will monitor social media use, block adult content, and limit screen time. While there are resources to help parents in each of those departments, some apps simply aren’t intended for younger children. Unfortunately, many parents have a hard time accepting that fact.

    Streaming Videos

    Let’s look at YouTube as our first example. The video site was created in 2005 as a place for anyone to upload short videos to share with their friends. Google purchased YouTube in 2006, and as social media took off soon after, YouTube rocketed to become the successful streaming platform it is today. The site is loaded with videos from filmmakers, vloggers, video gamers, makeup artists, geeks, professionals, educators, ministers, animators, artists, basically any category you can think of. It has grown into a juggernaut with hundreds of hours of footage uploaded every single minute. YouTube has come under fire for some of its content being too mature or sensitive, so it employs algorithms to keep tabs on inappropriate videos. It also released an app for children called YouTube Kids. That app has seen its own share of controversy after YouTube was unable to keep sensitive material from showing up in videos on the app.

    YouTube obviously wasn’t intended for young viewers. It is a site populated primarily by videos uploaded by its users. Some companies that make content for kids use YouTube, but that is a choice those companies make in response to the popularity of the platform. It’s an attitude that says: “Kids are there, so we should be there too.” The goal is to reach the audience already there, not necessarily to build an audience on YouTube. There are no real parental controls (safe search is mostly useless), and videos labeled as kid friendly earn that label without any human ever watching the entire video. The only time a content reviewer sees a video is when enough users of the site have flagged it as inappropriate. Allowing your kids to watch YouTube on their own is a risk that many parents don’t even realize they are taking.

    What about Social Media?

    Snapchat, Instagram, and Facebook are all the same. They, like YouTube, feature content created and posted by the users of the service. This “user generated content” can vary from political or religious views, to silly cat videos or memes, to random personal updates that mean nothing to anyone. People also post updates about serious mental health issues, they share plans to harm themselves or others, they post images of themselves in compromising situations, and that’s just what people post publicly. Private messaging contains content that people send when they think nobody except those they trust is watching. Private messaging is how predators groom their victims. It’s how the out-of-control teenage boy convinces a girl to send him inappropriate pictures of herself. Social media is intended to be a place to connect with people, some you may know, some you don’t. It is meant to be a public forum, and what is meant to be private is meant to be completely private. This is where the problems come in when parents ask for ways to monitor their kids’ social media.

    Age Rating vs Terms and Agreements

    I see a lot of parents giving their kids access to social media and other online activities when they reach the age of 13. This is based on the fact that the terms and agreements these sites have you approve before making an account list 13 as the minimum age to use the service. A common mistake parents make is thinking that this age is meant to protect their kids from content on the site when, in fact, it’s intended to protect the company from holding data and information on kids under the age of 13. COPPA regulations say that companies can’t collect and use the information of kids under 13 without parental consent. If a company says you can’t use the site when you’re under 13, then it can do whatever it wants with all of that data, and if your kid is underage, that isn’t the company’s fault. You ignored the terms and agreements when you allowed them to use the site.

    The age rating is the age recommendation you’ll see in the app store when you are downloading an app. This rating is based on the actual content in the app, not on any legal requirements for the company. The usual standard is that apps populated by user generated content are rated 17+. This is because the company can’t guarantee that what is seen on its product won’t be considered adult content. When we allow our kids to use apps that contain user generated content, we are allowing them to be subject to the opinions, behavior, and whims of everyone else who uses that app.

    Parental Involvement Before Parental Control

    When I am asked to help parents protect their kids in apps that are obviously not made for children, I feel like I’m being asked to give parents a suit their kids can wear to protect them while they play in a burning building. I get it. It isn’t easy to tell your kids they can’t do something they want to do. “My friends are all on Snapchat.” Or the one that irritates me to no end: “The teacher/coach says I have to use Facebook to get the homework/practice schedule.” Sometimes we just have to say no. It is difficult to set the boundaries and limits that keep our kids safe, but if we have the right attitude about what we’re protecting them from, it becomes easier. Social media, YouTube, video games rated M for mature: none of these things are intended for people under the age of 17, and when we allow our kids to use these products, we open them up to a world that is meant for adults.

    It is difficult for algorithms to catch nudity or violence in uploaded videos. Social media sites and private messaging apps go to great lengths to keep prying eyes from seeing what is being sent. This makes parental monitoring software hard to develop. Unfortunately, some burning buildings are just too dangerous, and there isn’t much that can be done to protect you if you’re inside. If you aren’t okay with your child seeing content that is meant for grown-ups, then I recommend uninstalling the app instead of trying to find software that keeps it from doing what it was designed to do.

  • Tinder, Grindr, and Predators. Social Media and Suicide. Who do we blame?



    There have been more than 30 reported instances of children being abused by people they met through the Tinder and Grindr apps since 2015. That number may seem small, but when you consider the fact that kids have easily skirted the age requirements of these dating/hookup apps and made contact with people who wish to harm them, any number is too high. While these companies say they’re doing all they can to keep kids from using their software, all they really say in response to these horrible occurrences is that the predators and kids violated their terms of service. Since the terms say that adults shouldn’t contact minors and that minors shouldn’t be using the software at all, the companies claim the responsibility isn’t theirs because the child was put in danger by using the app in a way it wasn’t intended to be used.

    Officials are saying that isn’t good enough, with lawmakers in the UK trying to create legislation that would require age verification on apps like Tinder and even on some social media apps like Instagram. Recent suicides have been linked to images of self harm that were viewed on Instagram. Again, officials at the social media company say that the most violent of the images violate their terms of service. They have recently, however, banned images of self harm and suicide and removed those categories from search results.

    Here is the question: When these horrible things happen, do we blame the companies who make these online products? Is it enough to write a terms of service and say that those who break the rules do so at their own fault and no fault of the company? So far, legally, that’s all it takes. It seems that the responsibility of the company ends with the terms and conditions page. If the user doesn’t follow the terms, then how is the company supposed to protect users? Some officials are asking for age verification, which means keeping more records. This is something many companies don’t want to do because of recent privacy and data breach concerns. There is only one thing I know for sure: if families get serious about monitoring their kids’ screen time and online activity, the number of these occurrences will dramatically decrease.

    Let me describe a scenario for you. Your 12 year old child wants to meet new people online. Maybe they heard some friends talking about a dating or hookup app; maybe they just don’t have a lot of friends in real life. Whatever the reason, they’re looking for a way to meet people. While browsing the app store, they find a dating app in the search results.


    They tap download, create a profile, and start swiping, eventually meeting new people on the app. Conversations move to WhatsApp, Facebook Messenger, or Signal, and they schedule a meetup. Your imagination can take over from there, and if you’ve read some of the news stories, it can get pretty awful.

    Imagine, now, that you have parental controls set so that your child has to request permission to download apps. Maybe you even have their controls set to keep them from downloading apps rated for users over 12 years of age. Either of these approaches would keep that scenario from ever starting. Instead, you’ll see that your child is trying to download an app designed to connect people for romantic relationships, and you’ll be able to discuss it with them. You can share the dangers of building relationships with strangers and help them understand the importance of privacy, security, and parental supervision.

    There are built-in ways to protect your child on both iOS and Android devices. The key is to set them up. Use the built-in protections and features and don’t rely on these companies to protect your children. They don’t exist to keep your family safe or even to help people build healthy relationships. These companies develop their products to make money. It is foolish to expect Instagram to protect your kids from suicide. Should Instagram take responsibility for what is on its app? Yes. Should you blame the company if your kid harms themselves because of something they saw on the app? Not entirely. You have to take some of the blame onto yourself. There are ways to keep your kids safe from that kind of content. If you don’t know about them or don’t use them, that isn’t the fault of the company. It’s yours. Be involved, pay attention, and do the work to keep them safe.

  • WhatsApp Update Brings Thumbprint and FaceID Lock To Private Messenger


    The private messenger WhatsApp recently updated to allow users to lock the app from prying eyes using Touch ID or Face ID. Private messaging is becoming more important to users these days since the spotlight has been on Facebook and Google for their data mining and sales. WhatsApp has been a mainstay of private messaging for some time now, and this new update takes privacy from an algorithmic, software level to a more obvious, tangible place. You can now use Face ID or Touch ID, depending on the generation of your iPhone, to lock people out of the WhatsApp software entirely. This will keep people from opening the app and looking through your messages. Currently this feature is available for iOS only, but it is rumored to roll out to Android soon.

    What Parents Should Know

    It’s important to know that there are options that allow you to keep an eye on your kids’ messaging without having to physically take their phone from them. However, if the physical approach is your style, then this update from WhatsApp could become a problem for you. Messages being locked in this way needn’t deter you from checking up on your child’s messaging activity, though. You can store your thumbprint on your child’s device so you can unlock it, or just have them unlock the app for you when it comes time to inspect their messages.

    I recommend allowing your children a feeling of privacy by using software to monitor their messaging apps instead of taking the device from them every now and then. Not only does that plan give them a feeling of privacy, it is also a far better monitor than your weekly check-up. If a message monitoring service like Bark is active, it will look at every single message your child sends or receives in real time, notifying you if any of those messages cross the line into dangerous or inappropriate content. Taking the phone from them to monitor it yourself allows messages to be deleted before you get around to looking at them.

    I never advise spying on your children without their knowledge. They should know that you are keeping an eye on their messages and how the software works. They should also know what the consequences are if they send messages they shouldn’t be sending. Finally, you should have an open conversation so they feel like they can come to you if they receive a message they are not comfortable with. No matter what you do to monitor your kids’ messaging, having a culture of transparency and openness in your home is critical.

  • The Bird Box Challenge and the Decline of Self Responsibility


    What’s in a Meme?

    It seems that every time something gets popular, someone finds a way to turn it into a dangerous internet meme. The Netflix movie Bird Box stars Sandra Bullock and follows her adventure with two children through a dystopian wasteland five years after a mysterious force caused mass suicide all over the world. Because this force causes you to kill yourself when you look at it, most of the main characters spend the entire movie wearing blindfolds. Enter said internet meme.


    The Bird Box Challenge is a video meme that asks its participants to do mundane, everyday tasks while blindfolded. People have filmed themselves cooking, walking through their homes, spending 24 hours blindfolded, and even driving while blindfolded. Some of the earliest challenge videos received millions of views in a matter of days. Obviously, content creators felt the need to outdo themselves and others. This led to some dumb and even dangerous stunts that eventually prompted a warning from Netflix and the banning of all Bird Box Challenge videos from YouTube.

    People have filmed themselves walking through traffic and driving while blindfolded, leading to a couple of car crashes, including one by a teen in Layton, Utah. These challenges can be dangerous, and unfortunately their popularity travels so fast that our kids are the first to learn of them and try them for themselves. Always looking for something of theirs to go viral, our kids will try to copy and even outdo the other videos they’ve seen online. The Tide Pod challenge is another example of escalation causing a silly meme to get out of hand and even hurt people. The Tide Pod challenge was also banned by YouTube, and Tide even ran a series of commercials to discourage people from participating.


    The Decline of Personal Responsibility

    Whenever anyone, especially our kids, gets harmed by something as ridiculous as an internet challenge, there is an outcry for someone to take responsibility. We may speak out against law enforcement for not cracking down, or against the production company for making the show the memes are based on. Maybe we’ll want the streaming platform or social media service these memes are shared on to take responsibility. Wherever we place the blame, we are understandably longing for someone to answer for these stupid and dangerous occurrences.

    Gaming, social media, entertainment, and education have all come under fire from time to time for the influence they have over our kids. Learning about dangerous challenges can happen naturally from friends, but the internet spreads information at an unprecedented speed. The inspiration for some of these challenges comes from the media our kids consume. There are many factors that cause the spread of all this craziness, but the one constant is the lack of moderation and responsibility.

    As parents, we should see these occurrences as warnings that it is time for us to be more involved in what our kids are doing, whether online or offline. We have no excuse since there is so much hardware and software available to help us monitor what is happening on our kids’ screens. We are hearing all of the time that we need to be involved and that we should take responsibility for the things our children are seeing. It also falls to us to teach our kids to take personal responsibility for their actions.

    Teaching Responsibility

    When our kids see Netflix putting out warnings or YouTube banning content, they see a major corporation taking responsibility for its user base’s stupid choices. In reality, though, these companies aren’t taking responsibility; they are covering their own backsides before something truly horrible and reputation-ruining happens on their service or platform. We live in a world that wants to skirt responsibility and find someone else to blame. Our kids aren’t going to learn how to take the fall for their own actions unless we teach them to do so. Here are some ways my wife and I teach that to our children:

    1. They must ask forgiveness and they must give forgiveness.

    We don’t just let our kids say they’re sorry. When they hurt each other’s feelings, they must ask to be forgiven, and then we expect the other child to say more than “It’s ok.” We want them to say “I forgive you.” This causes the offender to understand that their actions caused someone harm, and the offended to realize that they have a responsibility to honor the request for forgiveness.

    2. They have chores.

    Our children are responsible for things they do around the house, and they don’t get paid for it. We consider keeping your room clean and your laundry in the hamper a basic requirement for living in our home. They have other chores that they cycle through, and they don’t get to bargain or trade; no matter which ones they like or dislike, they have to do what the chart says for that day. This way they’re learning to do junk they don’t enjoy simply because they are required to do it. They don’t get paid for this either, unless you count the fact that their chores have to be done before they can even ask for screen time.

    3. They buy their own stuff.

    Our oldest two children have made some money through performing, and sometimes they all get a chance to do odd jobs for friends and family to earn some cash. When they have money, they like to spend it, and when it’s gone, it’s gone. We don’t usually agree to foot the bill for stuff they want. They can wait for birthdays or Christmas, or they can buy it themselves. That’s how life works.

    4. They are told no a lot.

    We have learned the power of saying no to your children. As our oldest two have grown up, they’ve heard no so much that they know when not to even ask. Sometimes I’ll have what I call a “Yes Day,” a day when I say yes to pretty much anything they ask me (within reason). I don’t tell them it’s a Yes Day, but they tend to figure it out pretty quickly, and we all really enjoy doing things together that I would normally say no to without thinking. (Think playing four hours of Risk with your 11- and 9-year-olds.)

    These things aren’t world changing, but they can be life changing, and they can go a long way toward helping you instill a sense of responsibility in your kids. Making them take responsibility for how they feel and how they make others feel addresses something I believe is truly lacking in our society, and our kids get a head start at becoming excellent human beings from that step alone. Learning how to earn and spend money wisely teaches them that they are responsible for how they spend their time as well as what they earn and what they have. Finally, hearing no is critical for kids from a very young age. Life isn’t fair. There are things in life that just happen when you don’t want them to, or things that you wish would happen to you that happen for others instead. This. Is. Life. Get used to it.


  • Family Tech Blog’s Top Five Posts of 2018


    Thank You for Everything!

    I can’t believe the year is over. During 2018 the Family Tech Blog more than doubled its monthly reach, and many articles have been read thousands of times each. I am so grateful to all of you who read and share our content, and especially to those who have chosen to support BecauseFamily financially so that this blog can exist. Looking back on this past year, it is crazy to think of everything that has happened in the tech and family tech safety world. I wanted to write one last post for 2018 highlighting some of the most read articles from this past year. Here are the most read posts from a busy and fascinating 2018.

    Number Five

    Three Ways to Identify a Dangerous YouTube Video Before Your Kids See It

    YouTube is a popular topic for parents and educators. The video streaming site provides some of the most helpful and easiest to access free resources on the internet. Unfortunately, when anything is as easy to use and as popular as YouTube, there will be content that isn’t appropriate. I think this article was so popular because in it, I lay out some steps parents can take to identify dangerous or misleading videos on YouTube just by looking for a few signs. I’ve had parents, youth workers, and teachers tell me this article helped them make better choices about what their child was able to watch. Remember that YouTube is the wild, wild west. Nearly anything goes. Parental supervision is HIGHLY recommended.

    Number Four

    unGlue is a Great Way to Teach Your Older Kids Screen Time Management

    There comes a time as parents when we should transition from control to guidance. unGlue (a BecauseFamily affiliate) is a great software option for parents who want to add guidance to their internet safety plan without giving up all control at once. This article came out before Apple rolled out Screen Time, so unGlue was one of the first software options to provide the kind of limits parents were looking for. If you are trying to protect Android devices or even some older hand-me-down iPhones, unGlue is still a great option.

    Number Three

    Do Violent Video Games Create Killers?

    Tragedy at a gaming competition in Florida spurred this article that explored some of the opinions that float around about gaming and violence every time a young man commits a violent crime. This article unpacks actual research that has been done to try and answer the question: Do violent video games create killers?

    Number Two

    Tools to Monitor Your Own Screen Time in 2018

    As parents, it is critical that we live out the lessons we try to teach our kids. They retain more of what they see you do than what they hear you teach. Monitoring your own screen time, even if just to increase your awareness, can be a very helpful practice in setting a healthy example for our kids and teens. This article was released right at the beginning of 2018 and continued to grow in popularity all year long. It is obvious that people realize they spend a lot of time on their phones; here’s hoping they used some of these resources to keep track and make some healthy choices.

    Number One

    Parent Guide: Call of Duty Black Ops 4

    Finally, we are back to gaming. Call of Duty Black Ops 4 released on the back of a ton of hype based around their Battle Royale mode titled “Blackout.” The game released to positive reviews but had a lot of kids asking their parents if they could play it. This Parent Guide is a great way for moms and dads to see if this game would be appropriate for their child.

    Final Thoughts and Trends

    Those are the top five posts from 2018. Obviously, video games and screen time were major trends, with YouTube maintaining a presence as one of the topics parents ask about most. I was surprised that there were no articles about Fortnite on the list, as that game has taken the world by storm. You can’t look anywhere without seeing the dances, costumes, and merchandise. Voice control is another major trend from 2018 that I am surprised didn’t draw as many readers as some other topics. I imagine 2019 will be all about gaming, internet privacy, voice control, and of course…YouTube.

    Thank you for reading the Family Tech Blog this year. We appreciate your support and sharing. Keep checking in through the next year; we have even more awesome plans, including more tutorial content, such as Xbox and Android tutorials, and a lot of news from CES 2019, starting next week. Thank you again, Happy New Year, and we will see you in 2019!

  • Looking Forward to CES 2019


    As CES 2019 approaches (my flight leaves in 17 days), I find myself more and more interested in the topics that will be discussed at the Kids@Play Family Tech Summit. The summit features leaders in the industries of tech, toys, education, psychology, software, and entertainment. Sessions last all day long, and the topics discussed are exactly the kind of information we parents need as we raise our kids in this digital age. The problem is, those in attendance are all industry people who are making apps, toys, and technology for our kids and families. There is little to no representation of those who work to educate parents themselves about the connected age we live in. That’s where I come in.

    To my knowledge, BecauseFamily’s FamilyTechBlog is the only publication in attendance at CES that offers its news and stories exclusively from the viewpoint of helping parents protect their children. While I sit, take notes, and record footage of the summit, my mind will be processing how this information can help parents make quality decisions to keep their kids safe on their tech devices. I am glad that this event exists and happy that leaders in this industry are having serious discussions about how to be responsible while developing their products for children. I am also glad that our donors and readers have made it possible for me to be there, as the only exclusive family tech safety website in attendance, and report back to you.

    Here is some of what I’m looking forward to seeing, learning, and reporting on at CES 2019:

    • Jobs of the Future
    • Coding Without Screens
    • Gaming and Creativity
    • Tech Addiction
    • Data and Privacy for Connected Kids’ Products
    • Augmented and Virtual Reality to Help Kids Get More Active

    There is a ton more that I’m excited to see and learn, but these will be the highlights for sure. Parents are always asking about things like gaming and tech addiction, and the jobs available to our children now will be completely different in ten years. Having some insight on these questions will be pivotal to making decisions as parents. Many of us have trouble keeping our kids active, as they’d rather play with tech than with each other at times. Can tech increase their activity without impacting them in other negative ways? Finally, coding will soon be a skill that is not optional if you want your pick of the jobs of the future. How can we introduce coding logic and principles to our children without exacerbating the screen addiction problems we already see in our kids? I am looking forward to finding answers, or at least more insight, on these topics and questions at CES 2019.

    You Can Help!

    Very briefly, allow me to ask for your help for this trip to Las Vegas for CES 2019. The costs associated with this event are covered solely by donations from our non-profit partners and donors. If you would like to sponsor a meal, an Uber or Lyft ride, or something like that, please visit BecauseFamily.org/partnership to see how you can donate to BecauseFamily and send your family tech safety representative to CES on your behalf. Thank you.


  • Tumblr to FINALLY Ban Adult Content



    *WARNING: this post uses quotes with direct language about pornography and graphic content.

    While most social media sites that allow user generated content have been working to protect their users from unwanted adult images and videos, Tumblr has been happy to be known as “porn GIF central.” Last month, however, its app was pulled from the iOS App Store after child pornography was found on the platform, and that seems to have caused the developers to reconsider their policies. Earlier this week, Tumblr announced that it is changing its sensitive content guidelines and will be blocking such posts in the future.

    Tumblr defines sensitive content as:

    photos, videos, or GIFs that show real-life human genitals or female-presenting nipples, and any content—including photos, videos, GIFs and illustrations—that depicts sex acts. – Tumblr help center.

    The guidelines also mention what types of posts will not cross the line into “sensitive” territory:

    Examples of exceptions that are still permitted are exposed female-presenting nipples in connection with breastfeeding, birth or after-birth moments, and health-related situations, such as post-mastectomy or gender confirmation surgery. Written content such as erotica, nudity related to political or newsworthy speech, and nudity found in art, such as sculptures and illustrations, are also stuff that can be freely posted on Tumblr. – Tumblr help center.

    Tumblr’s terms now state that content considered sensitive will not be allowed and that any sensitive posts that were posted previously and not marked as explicit will be flagged and removed. Accounts that have been treated as explicit in the past (you can tag your own account as explicit) will maintain their explicit status and be allowed to continue posting; however, posts, both past and future, that are considered explicit under the new guidelines will be treated as such and removed.

    What Parents Should Know

    Very simply put, Tumblr is still going to allow some forms of sexual content and nudity in its app, as long as it can be labeled as political, newsworthy, or health and social justice related. Many other social media outlets already have similar guidelines. So while Tumblr will no longer allow “hardcore” sexual content, there are still going to be images, videos, and GIFs that you don’t want your children to see. My advice is, as always, to keep an eye on what your children use social media for. If they are sending messages to friends, you want to be sure the communication is wholesome and healthy and that they are only talking to people they know. If they are using the app for artistic inspiration, then you should know they could come across content you may consider sensitive, even if Tumblr does not.

    Bark is a good way to keep an eye on what your children are sending in social media messages. It uses artificial intelligence to watch for dangerous conversations for you and sends you an alert if something about suicide, self harm, sexting, or bullying is sent or received. As I always say, the most important thing you can do is talk to your child about what they do online and what they use their social media for. You may hear from them that Tumblr is all safe now and that they should be allowed to download it, but let this article be your warning that what Tumblr considers safe may not be the same as what you consider safe.

  • Are You on Your Kids’ Instagram “Close Friends List?”

    Are You on Your Kids’ Instagram “Close Friends List?”

    Instagram is rolling out another update today, and this one gives users the ability to build a “Close Friends List.” You pick a group of people, and any story you designate for that group will be visible only to them. This lets you post more private or personal content and trust that it will only be seen by a pre-approved group of friends. The feature should roll out today and will be available through the settings menu on your profile page in the Instagram app.

    You set up your list and then, when posting a story, choose to share it with your Close Friends only. Viewers on your list will see a green badge notifying them that the story was shared with Close Friends, along with a green circle around your “Stories” icon.

    What Parents Should Know

    This update can be a really good thing. It is important to know who is seeing your posts and to keep your audience in mind, and giving users a built-in way to limit who sees certain things could help eliminate the “finsta” or “spam” Instagram account. My advice, though, is to make sure your child has you on their Close Friends list. If they’ve been posting Close Friends stories and you aren’t seeing a green circle around their story icon, you aren’t on their list, and you should have a conversation with them about why you don’t want them hiding posts from you.

    Remember that you should be a safe place for your kids to come if they have serious issues to discuss. They shouldn’t be afraid that you won’t understand their depression or that you won’t believe them if they are having problems with people at school or work. You should be THE place that they know they’ll be heard, believed, and understood. I truly believe that if you create that culture in your family your children will automatically think to add you to their Close Friends list because you actually belong there.

  • Tumblr Removed from Apple App Store for Child Pornography

    Tumblr Removed from Apple App Store for Child Pornography

    Photo blogging app Tumblr has been removed from the iOS App Store because of child pornography. The app was pulled from Apple’s store unexpectedly earlier this month. The reason wasn’t announced at the time, but it has since become clear that scans found child pornography making it through Tumblr’s content filters. A statement from Yahoo (owners of Tumblr) confirmed that child pornography was the reason for the app’s removal and said they are working hard to fix the flaws in their scanning algorithm and get the app back on the App Store.

    Tumblr has been criticized for its lax approach to adult and inappropriate content on its app; some even call it “porn GIF central.” The company added an on/off switch for adult content when Apple made one a requirement, but didn’t password protect it, and Tumblr has a reputation for leaning into the fact that pornography is part of what makes the app so popular. The app is still available on Android’s Google Play Store.

    What Parents Should Know

    It didn’t take much research for me to add Tumblr to my uninstall list a couple of years ago. It is still there, and this latest news only solidifies the fact that it belongs there. There is content on Tumblr that many people legitimately want to see: geek stuff, memes, humor, art, and photography are all featured prominently. But a simple search, or a click on the wrong related image, can lead to hardcore adult images and animations. Your children shouldn’t be allowed to use Tumblr, and your teens should be advised against it.

  • You Can Soon Delete Those Facebook Messages You Sent by Accident

    You Can Soon Delete Those Facebook Messages You Sent by Accident

    Facebook is testing a new feature that will allow you to unsend messages after you’ve sent them. As long as you decide to take the message back within 10 minutes you can undo your typos, unintentional rants, or inappropriate messages. This feature currently isn’t available in all markets but will be very soon.

    What Parents Should Know

    Anytime a messaging app has the ability to delete messages that have already been sent, I see red flags. One of the problems with young people using messaging apps is the false idea that they are anonymous or that they can hide what they said. The ability to send disappearing messages on Instagram and Snapchat already put those apps on my uninstall list, and this new feature might make Facebook Messenger a dealbreaker as well.

    Always discuss with your kids the idea that anything posted online should be considered there forever. Just because a message can be deleted doesn’t mean the recipient didn’t save it before you removed it. The rise in cyberbullying and sexting can also be partly attributed to the ability to take back messages you’ve already sent: young people may be more inclined to send a sensitive message if they think they can just delete it later. Predators also use disappearing messages to groom their prey without any evidence being compiled.

    Keep communication about social media open with your kids, use something like Bark to monitor what they are sending and receiving, and model healthy habits in how you use messaging apps yourself.