Category: Security

  • We Bought Four Amazon Echo Dots!


    Well, it is Prime Day and as usual, there are some deeply discounted items available on Amazon. My family usually looks but doesn’t buy on Prime Day, hoping to be able to predict the discounts we may see on Cyber Monday or Black Friday in a few weeks. We especially avoid any smart speaker or digital assistant hardware since we have always had (well-informed) privacy issues and concerns. This year was different. We caved and bought Amazon Echo Dots for the whole family. Here’s why.

    They’ll Be Perfect for Our New Home

    Our forever family home is being built and we are planning a move-in just a few months from now. We are going to have more space for the six of us than we have ever had, especially in the kids’ rooms, the master suite, and the kitchen/dining great room. We’ll be a bit more spread out than we’ve ever been, and the Echo has some great options for communicating throughout your home without having to scream up the stairs or down the hallway. The intercom feature was a deal-sealer for both my wife and me. The kids are pretty excited too.

    Digital Homeschool Help

    More of us are homeschooling than ever now, and with four kids, all doing school work nearly every day, we need help sometimes. YouTube can be great for presenting complicated concepts in helpful ways (7th-grade math, anyone?), but my kids looking at screens and using a Google search for spelling or calculator answers isn’t the safest proposition. Alexa (the virtual assistant on the Amazon Echo) will answer your spelling, language arts, science, and math questions with no risky search results or screen use at all. It is more important for my kids to know how to get information than it is that they know the info when they pass a grade. Alexa and other virtual assistants are the new wave of information access, and they aren’t going away. They’re only getting smarter and faster.

    Less Screen Time

    My kids, like all kids, love to sit around and look at a phone or tablet. We are constantly having to get onto them about their obsessive behavior. We try to set better examples, and we don’t always succeed, but giving them alternatives is very helpful. The Echo Dot is a smart speaker without a screen. At night, when the kids want to listen to a podcast or music for bedtime, they can ask Alexa to play it for them instead of having their screens in their faces right up to when they fall asleep. Studies have shown this isn’t good for their sleep and can actually be very detrimental to their development. With parental controls on the subscription services we use and on Alexa itself, we can ensure that our kids aren’t looking at their screens and are only listening to music and podcasts we’ve approved.

    Safety and Security Upgrades

    All of this is great, but digital safety and data security are always an issue, especially with artificial intelligence that is designed to learn about you in order to be more useful to you. There is an obvious trade-off. You’re giving it information in exchange for convenience. I believe most of us consider that an acceptable exchange, considering Alexa and Google Home have been some of the fastest tech products to be integrated into people’s homes. The truth is that we have been making this exchange for a long time without really thinking about it. Every post on Facebook, Twitter, or Instagram, every search on Google, and every purchase or browsing session on Amazon has been used to build a database of advertising information about you. This can be scary to many, but in all honesty, that ship has sailed and you raised the sails for it to do so.

    When you use these sites, you allow them access to your information. Alexa is no different, and my family has considered the risks and decided it’s worth it. First of all, we already get targeted ads because we do so much of our shopping on Amazon and searching on Google. Secondly, the latest models of the Amazon Echo Dot have added features like a hardware button to turn off the microphone that make us feel like we can avoid being listened to when we don’t want to be.

    Risk/Reward

    When you narrow it down, it is a consideration of opportunity cost. You have an opportunity for convenience, but it will cost some of your info. At a $19.99 price point, the Echo Dot is a great deal right now on Prime Day, so we bought four of them. They’ll be here in a couple of days and I’ll set one up and let you know how it all goes. Stay tuned for my (late but in-depth) review of the Amazon Echo Dot as a tool for controlling kids’ screen time.

    If you shop Amazon Prime Day today, consider using http://smile.amazon.com and signing up to support our non-profit, Four Point Families. You’ll have to search for Four Point Families and select it as the organization you’d like to partner with. Then Amazon will send 0.5% of your purchase our way to help us continue to protect families. Thanks.


  • These Apps Aren’t as Harmless as they Seem


    Our kids use all kinds of different apps for many different reasons. Some for socializing, some for fun, and some for school and productivity. We don’t think twice about letting our kids use Google Documents or even the Bible app. Yet many of these apps aren’t as harmless as they seem. I often receive messages from parents asking if I have seen the latest awful thing people have done online. The answer is usually yes, and I am not surprised. For the last five years I’ve been learning about the digital/connected world our kids are growing up in and how it impacts our children and our families.

    Something I’ve learned is that if there is a system or an app that can be exploited to do harm, those who wish to cause harm will use it to do so. You see it yourself in your Facebook comments as some friends think it is the perfect forum for their disruptive thoughts. Worse still is the story from Bark’s project that put a 37-year-old mom on Instagram posing as a 13-year-old girl. The response was shocking, with inappropriate pictures and requests filling her direct messages just minutes after she posted her first picture. The social features in the YouVersion Bible App have been used to groom potential victims. Google Docs has been used by young people for bullying, secret messaging, and sexting.

    It is shocking but I’m not surprised.

    What is our response to this tendency for people to take something meant for good and use it with the worst intentions? We can’t hide our heads in the sand and keep our kids from using technology at all. This just isn’t realistic. We won’t be going back to writing paper letters or saying no to laptops for school projects. The only reasonable response is to take responsibility for our children’s safety ourselves. We can no longer blindly trust the apps they use, imagining that no harm can come to them simply because the app wasn’t meant for harm.

    We have to help our kids remember that the same stranger danger that is true when you’re six and at the playground is just as real when you’re fifteen and contacted in direct messages by people you don’t know. I am not surprised by the nonsense that is happening on these apps. I just know that we, parents, are the only answer. People always find a way to ruin things that were meant for productivity or good. My advice is to talk to your kids. Help them know that. Tell them that if they are contacted by a stranger, even in an app like the Bible App, they should take caution. Remind them that they should say something if they see bullying online, even in a class Google Document.

    Our children are surrounded by voices telling them all kinds of truths. If you aren’t creating a safe place for them to come and be open with you about their concerns then you’re making it hard for them to live in this connected world. Do your best to be who they need you to be. I’m here to help.

  • YouTube’s New Kids Content Policies Explained


    “Starting today, all creators are required to mark their content as made for kids or not made for kids in YouTube Studio.” (YouTube Creators email)

    YouTube will be limiting the data they collect from videos that target children. This is an effort to comply with the FTC’s demand that they take responsibility for the information gathered on their site, which counts children among its most frequent audiences. Wording in the email suggests that YouTube is “helping” creators comply with COPPA as well as meeting the demands the Federal Trade Commission has placed on YouTube as a media company.

    YouTube will use an algorithm to detect child-centric content and flag it as such if it is not flagged by the creator of the video. The email reminds creators to be vigilant about properly tagging their videos if they are made for children, as failure to comply could put them in violation of the FTC’s demands.

    The FTC has outlined what constitutes children’s content, and YouTube has that information available on their support page. YouTube’s announcement briefly defines children’s content as content where:

    • It is directed to children as the primary audience (e.g. videos for preschoolers).
    • It is directed to children but children are a secondary audience (e.g. a cartoon video that primarily targets teenagers but is also intended for younger kids).

    YouTube’s guidelines state that they may override creators’ settings if their content seems to be geared toward kids but isn’t marked as such. This could result in content creators being demonetized or held accountable in some other way for not properly categorizing their content.

    What Parents Should Know

    The FTC fined YouTube for failing to comply with COPPA and told them they must have a plan by next year to keep children’s data private on their site. Many thought YouTube Kids was the solution, but so few parents actually used the kids’ version of YouTube that children remain a major audience for YouTube’s main site and app. The information creators give YouTube about their videos and channels will help YouTube know which videos it can collect advertising data from in the future. Also, the advertising on videos marked as “for children” will be different, based on the content of the video as an indicator of the audience rather than on viewing data from the viewers themselves.

    These changes, in my opinion, are a step in the right direction for YouTube. Their collection of data from young audiences has been a point of contention for tech safety experts, security and privacy agencies, and family advocacy groups for several years now. The policies handed down by the FTC are in direct response to some of these experts and agencies asking for an investigation into YouTube for its lack of compliance with COPPA.

    As parents we rarely think about our kids’ digital footprints being collected and used against them, but it is happening every time they log on to an app or game. It is important to remember that the trail they leave behind online will follow them for the rest of their lives. The things they buy, the sites they visit, the videos they watch, and the games they play are all being compiled to create a profile on them that will be used to market to them online for years to come. If parents remember that our children’s web traffic is being collected, we can take steps to protect them from excessive data collection. Encourage them to use messenger apps that are made just for kids. [Facebook Messenger Kids, not WhatsApp or FB Messenger.] Remind them that what they share online becomes public the moment they share it. Tell them they should only use video and game apps that are intended for children and made by major developers who are more likely to comply with COPPA. Parents are responsible for the safety of their children, as well as their privacy and security, so take the steps you can to keep their data private.

  • Family Link’s New Features are Great but Still Not Good Enough


    Android has updated their Family Link parental controls feature. The above video will take you through what they’ve done and give you some questions to ask yourself about using the service.

    Make sure your device is compatible.

    The site is very clear that Family Link is only compatible with newer Android devices. Go into the settings on your kid’s device and tap ABOUT in the menu to see if the software version is 7.0 or newer. If it isn’t, your child may not be able to install Family Link, which means you can’t use the software to set limits and restrictions.
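    Since the cutoff described above is a whole-number Android version, the check amounts to comparing the first number of the version string against 7. Here is a minimal sketch in Python; the `supports_family_link` helper name is my own for illustration, not part of any Google API:

```python
def supports_family_link(version_string: str) -> bool:
    """Return True if an Android version string (e.g. '7.1.2')
    meets Family Link's minimum requirement of Android 7.0."""
    major = int(version_string.split(".")[0])
    return major >= 7

# A device on Android 7.1.2 qualifies; one on 6.0.1 does not.
print(supports_family_link("7.1.2"))  # → True
print(supports_family_link("6.0.1"))  # → False
```

    The same comparison is what you are doing by eye when you read the software version off the ABOUT screen.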

    Double check their privacy policies.

    COPPA regulates the collection of children’s data without parent permission. You have to create an account for your child to use Family Link and to do that you must give permission for Google to collect some of their data. The video explores a bit more of what information they can collect and what they do with that data.

    Be aware that your kids get full control at 13.

    If you want to be able to see what your older child is doing on their device, you’ll have to use the child’s phone to adjust parental control settings with Family Link, as control shifts to the child at age 13.

    Do your homework!

    As I mention in the video above and the podcast episode below, you need to familiarize yourself with the benefits and limitations of Google’s Family Link software. Visit families.google.com to see their information about it and check out our other articles and videos about Family Link as well. You can never be too informed.


  • FB Messenger Kids “Error” Allowed Thousands of Kids to Talk to Unapproved Strangers


    Facebook Messenger Kids was created to give children a safe place to communicate through text, stickers, video, and GIFs with friends who are pre-approved by their parents or guardians. This week, however, the kids’ messenger app has had to send notifications to thousands of parents about their children having access to strangers in the app.

    What happened is that a technical error allowed kids to create a group message with friends who could then invite their own friends who, while approved for them, may not have been approved by the parents of the first child. Confusing? Yeah, and that is possibly why the flaw was even possible in the first place. Facebook says they have alerted parents whose children may have had this type of interaction and that they’ve disabled any chats that were created using this flaw. The story isn’t over, though, as some are calling for the FTC to look into the error since it may have resulted in a COPPA violation.


    What Parents Should Know

    The moral of this story centers around trust. It is important that, while we may trust our children, we can’t always trust who our kids are in contact with. We definitely shouldn’t blindly trust the companies who make the hardware and software that our children are using. When our kids use an app like Messenger Kids, the whole point of the app is that it gives parents control. When the control is hindered, even by a “technical error,” that is a severe violation. We can, however, take actions to protect our kids from dangerous effects that could come from these errors.

    I recommend having a copy of the Messenger Kids app on your phone logged in to your child’s account. My wife and I are each logged in to one of our kids’ Messenger Kids apps and can see when they get messages and what the messages are about. We are notified when they receive a message and can look to see who it is from and even read it. I have, a time or two, jumped into the app to tell a friend to stop messaging since my son was past his allowed time for social media that day. I received a “yes sir,” and there were no more messages until the next day. We also use Bark to monitor their messages and alert us to any dangerous or inappropriate content.

    Parents are gatekeepers. Our job is to be sure our kids are growing up with guidance through every area of life. If they aren’t being taught how to manage social media and internet use safely, then they will struggle to make healthy decisions when they are older. Messenger Kids is a good tool to help your kid learn the right way to use a messenger, but it won’t work if you are uninvolved, pretending that the creators of the app have only your kid’s best interest in mind. The truth is that they want to provide you a service to make a profit. We cannot overlook that. It is our responsibility, and ours alone, to teach our kids how to be safe online. We should take it seriously. We should hold companies accountable when they have errors that put our kids at risk, but ultimately we should be the ones making sure our children are protected on every app, site, and piece of software they use.

  • Is FaceApp Sending all of Your Private Data to Russia?


    Last week everyone was posting pictures of themselves looking older or younger. They were all using FaceApp, an Android and iPhone app that uses AI to alter your face: making you look older or younger, changing your gender, and all kinds of other things. Then, suddenly, everyone who had been posting pictures of themselves began sharing articles about the privacy dangers of FaceApp. What is true? What does FaceApp do with your pictures? Should we use apps like this? Here are the answers I found.

    Your Pictures Aren’t in Russia

    One of the major concerns, given the political news lately, is that all of these pictures are being stored by the Russians, since the company that makes FaceApp is based in Russia. The truth is that these pictures are stored on servers owned by Google and Amazon. Many of the photo apps you use, including some of the social media apps you frequent, use the same server companies to store your pictures and posts. There is no evidence to suggest that your images are being collected by the Russian government or even by companies in Russia.

    Your Photos are Deleted after 48 Hours

    FaceApp’s privacy policy states that photos uploaded to their servers are usually deleted after 48 hours. They do state that some photos may be kept for analytical purposes, but that these are not sent on to FaceApp itself. These photos are used by the artificial intelligence to make it smarter and help it do a better job of editing photos for people.

    FaceApp Terms Mention Affiliate Companies and Governments

    FaceApp’s policies do allow them to share your photos with other companies “in their network.” Again, they say that this is for analysis purposes and not data tracking. They also say that they’ll give your photos to law enforcement if requested through legal means.

    You Can Use FaceApp Without Giving Personal Information

    The company that makes FaceApp says that 99% of their users don’t log in to the app, which means there is no way for them to have your personal or identifying information. The only thing they collect in those cases is your photos. If you have location settings turned off for your camera, then there isn’t much personal data that can be gained from the images. All they actually have is a picture of an unidentified person’s face. Also, FaceApp only uploads the photos you tell it to, not your whole camera roll.

    “…please note that we may transfer information, including personal information, to a country and jurisdiction that does not have the same data protection laws as your jurisdiction.” FaceApp Privacy Terms

    FaceApp Doesn’t Handle Data Differently than any Other Social Media Service

    The only major difference between FaceApp’s privacy policies and those of Facebook and Instagram is how much terminology they use to describe them. Personal data and photos are handled basically the same way by all of these companies. You may consider it more of a fair trade-off for Facebook and Instagram to collect your data in exchange for the services they provide. You also may be less worried because Facebook and Instagram are based in the United States. Either way, your data is being used in the same way by all of these companies.


    Just Share Smart

    These instances of public outcry about the privacy policies of an app or a company are a great time to be reminded of the importance of thinking before you share. The truth is that everything, once shared on the internet, is effectively public. It belongs to every citizen of the web and not to you anymore. This should govern every choice you make on every site you visit and every app you use. If you wouldn’t want the whole world seeing that photo of you, your child, or your spouse, then you shouldn’t share it. If what you are about to post as a status would put your security in jeopardy, then you shouldn’t post it. If you aren’t sure about a company or an app that is asking for your personal information, then you shouldn’t give them your personal info. It is very simple. Just think before you fill out an online form. Think before you share a photo. Think before you post your thoughts about anything and everything.

    The issue isn’t where your information is stored. It is the fact that you share photos, phone numbers, credit card numbers, and even your social security number like it is no big deal. You don’t have to be an internet security expert, you just have to pause and think.

     

  • Family Tech News From Apple’s Developer Conference


    WWDC was held last week at Apple’s headquarters in Cupertino, California. Every year, the tech giant hosts a conference for developers and media from all over the world. The company’s project managers and chief officers all take their turns on stage to discuss what they’ve been working on over the past year in order to increase the hype around Apple’s products and software. Much of what is announced at WWDC targets developers and “tech-heads” who can’t wait to find out how to make apps for Apple products or what the next big thing is going to be. Some of Apple’s new features, however, could bring some peace of mind to parents. Here is a breakdown:

    Apple TV+

    Apple’s streaming video device has been great for viewing other services but Apple’s streaming service itself has been lackluster. One thing that has been missing for a while is the ability to make separate accounts or profiles for viewers, including children. Apple announced at WWDC that this is changing. They are making it possible to create profiles for every member of your family. Your viewing history and suggestions will be sorted according to your accounts and best of all, your recommendations won’t be overloaded with shows that your children love to watch.

    Apple Music/iTunes

    iTunes is officially no more, as Apple will be separating iTunes offerings into multiple apps. Books, Podcasts, and Music will each be separate apps on macOS. When you plug in your iPhone to sync with your Mac, nothing visible will happen; your phone will sync in the background. It has become pretty apparent that most folks don’t need software to manage their music collection. Streaming music has taken over and iTunes wasn’t very good at that job. Apple Music is taking over the music service, and Podcasts is mainly accessed through the mobile app, not on desktop.

    iTunes has been around since 2001 and while there are those who have become used to the software, most have been aggravated by frequent updates and overuse of computer resources. Apple is likely accurate in thinking the software won’t be missed by very many people.

    Apple Arcade

    Apple is also working its way into the video game subscription world with Apple Arcade, due to release this fall. Apple Arcade will consist of a series of exclusive games made just for Apple’s ecosystem, playable on your phone, tablet, Mac, or Apple TV. They have a controller that you can use with Apple TV but are adding support for PlayStation 4 and Xbox One controllers as well. The 100 or so available games look a bit weak, but they are sure to find developers willing to put out quality content for Apple before too long. They’re going to have to in order to compete with Google’s Stadia and the new services coming from Sony and Microsoft.

    iOS 13

    Probably the most relevant of the updates from WWDC has to do with Apple’s latest smartphone operating system, iOS 13. The software boasts a new dark mode, faster app launches and downloads, faster Face ID unlock, and a new (to Apple at least) “swipe” style typing system.

    Dark Mode is cool, and faster downloads and unlocking are great, but the iOS update doesn’t really have anything going on that is relevant to parents besides its focus on data security. More on that below.

    Photos and Video

    Photos in iOS 13 is getting an overhaul as well, with the ability to pinch to zoom in your galleries and a new sorting method that groups photos by the date they were taken. Photos will also include a new smart gallery that removes images like screenshots from your view, showing only the photos you’ve taken with your camera.

    Privacy is a Key Theme

    Every update at this year’s WWDC had privacy as a key theme. Directors and developers mentioned over and over again what Apple does and doesn’t do with your data. Apple Maps uses encrypted data to help you find your way, the Photos app does its date and location tracking locally, and they even announced a new “Sign in with Apple” that lets you sign in with Face ID and create accounts with individual dummy email addresses.

    Data security and privacy have been in the news a lot lately, and Apple has been very vocal about their desire to keep their users’ information secure. Whether it is a direct attack against other tech companies who have made most of their money by collecting and selling data, or just an honest desire to maintain their users’ trust, the result should be a bit more confidence that your information is safe if you are using their products. I always advise, however, that you continue to make efforts to protect your own privacy. Be careful what you share online. Turn off location access for apps that don’t require that information to work properly, and most importantly, teach this approach to privacy to your children.

    You can listen to this article as a podcast on Family Tech Update.
    You can subscribe on Stitcher, Spotify, or Apple Podcasts using the links below the player.

  • It’s Being Called the Ultimate Unsend Button, Does it Encourage False Anonymity?


    Telegram is an end-to-end encrypted messenger that touts speed, privacy, and security. They have featured private messaging and self-destructing messages for a while, but their new feature takes privacy to a new level. You can now delete a message you’ve sent from both your account and the account you sent it to, no matter how long ago it was sent. Telegram is, again, standing up for privacy, and users are buying in. Millions have flocked to Telegram after the news of Facebook’s data leaks over the past several months. It looks like Telegram is doubling down on privacy as its claim to fame. They’ve also added the ability to remove your information from a message when the message is forwarded to other users. Some accessibility and ease-of-use features have also been added.


    What Parents Should Know

    Security and privacy are often overlooked when we allow our kids to use internet-connected devices. Privacy is becoming a major concern for family tech safety experts and activists. Messengers that allow data to be collected and used for advertising shouldn’t be used by children or even teenagers, due to the risk of such data being released or revealed without the app developer’s consent. When an app features privacy as its distinguishing feature, you have to ask who the data is being kept private from. Obviously, we want data to be kept from third-party companies who would use that data to advertise. Sometimes data is even kept private from the company that developed the messenger app you are using. Telegram has a “secret messages” setting that must be enabled to keep your information encrypted from end to end. (End-to-end encryption means that only the sender and recipient can read what is being sent; not even the company can see or collect it.)
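    To make that parenthetical concrete, here is a toy sketch of the end-to-end principle using a one-time pad. Real messengers like Telegram use far more elaborate protocols, and every name here is illustrative, not Telegram’s actual API; the point is simply that a relay server holding only the ciphertext learns nothing without the key, which only the two ends share.

```python
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """One-time-pad XOR; applying it twice with the same key restores the data."""
    return bytes(d ^ k for d, k in zip(data, key))

message = b"meet at the park at 6"
key = secrets.token_bytes(len(message))  # shared only between the two ends

ciphertext = xor_cipher(message, key)    # this is all the relay server sees
recovered = xor_cipher(ciphertext, key)  # only a key-holder can do this

assert recovered == message
```

    Deleting a message from the server removes the relay’s ciphertext copy, but nothing in the math can remove a screenshot the recipient has already taken.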

    Anytime the ability to delete messages you’ve sent is added, I see red flags. While I think privacy is critical, there is also a risk of kids thinking they are safe from inappropriate or incriminating photos or messages being saved and used for nefarious purposes. It only takes half a second to screenshot a message or image on your screen, and most phones let you record your screen to a video very easily. This means that you are not always anonymous online. If you are sending messages to someone thinking you have complete privacy, you are trusting that the person you’re sending them to has your privacy in mind as well. Telegram is an easy way for predators, cyberbullies, and those interested in sexting to send and receive messages that do their damage and are then removed as evidence.

    I have spoken to parents who have taken their kids to the police with complaints about people trying to groom them online, but the police had no evidence because the messages had all been deleted. This is why a messenger makes the FamilyTechBlog uninstall list as soon as it adds disappearing messages. It isn’t safe for your kids to chat with a feeling of anonymity, or to chat with people who can send whatever they want and make the message go away after it’s been viewed. Telegram is rated 17+ and I fully agree with that rating. Private messengers that allow you to chat with anyone, anywhere shouldn’t be used by children and young teenagers, especially when the messages can be removed at will.

  • Tinder, Grindr, and Predators. Social Media and Suicide. Who do we blame?


    There have been more than 30 instances of abuse of children stemming from the Tinder and Grindr apps since 2015. That number may seem small, but when you consider the fact that kids have easily skirted the age requirements of these dating/hookup apps and made contact with people who wish to harm them, any number is too high. While these companies say they’re doing all they can to keep kids from using their software, all they really say in response to these horrible occurrences is that the predators and kids violated their terms of service. Since the terms say you shouldn’t contact minors and that minors shouldn’t be using the software, they claim the responsibility isn’t theirs because the child was put in danger by using the app in a way it wasn’t intended to be used.

    Officials are saying that isn’t good enough, with lawmakers in the UK trying to create legislation that would require age verification on apps like Tinder and even some social media apps like Instagram. Recent suicides have been linked to images of self-harm that were viewed on Instagram. Again, officials at the social media company say that the most violent of the images violate their terms of service. They have recently, however, banned images of self-harm and suicide and removed the categories from search results.

Here is the question: When these horrible things happen, do we blame the companies who make these online products? Is it enough to write a terms and conditions page and say that those who break the rules do so at their own fault and no fault of the company? So far, legally, that’s all it takes. It seems that the responsibility of the company ends with the terms and conditions page. If the user doesn’t follow the terms, then how is the company supposed to protect users? Some officials are asking for age verification, which means keeping more records. This is something many companies don’t want to do because of recent privacy and data breach concerns. There is only one thing I know for sure: if families get serious about monitoring their kids’ screen time and online activity, the number of these occurrences will dramatically decrease.

Let me describe a scenario for you. Your 12-year-old child wants to meet new people online. Maybe they heard some friends talking about a dating or hookup app, or maybe they just don’t have a lot of friends in real life. Whatever the reason, they’re looking for a way to meet people, and while they’re browsing the app store, dating apps appear right in the search results.


They tap download, create a profile, and start swiping, eventually meeting new people on the app. Conversations move to WhatsApp, Facebook Messenger, or Signal, and they schedule a meetup. Your imagination can take over from there, and if you’ve read some of the news stories, it can get pretty awful.

Imagine, now, that you have parental controls set so that your child has to request permission to download apps. Maybe you even have their controls set to keep them from downloading apps rated for users over 12 years of age. Either of these approaches would keep that friendship, or worse, romantic relationship with a stranger, from forming behind your back. Instead, you’ll see that they’re trying to download an app designed to connect people for romantic relationships and be able to discuss it with them. You can share the dangers of building relationships with strangers and help them understand the importance of privacy, security, and parental supervision.

There are built-in ways to protect your child on both iOS and Android devices. The key is to set them up. Use the built-in protections and features and don’t rely on these companies to protect your children. They don’t exist to keep your family safe or even to help people build healthy relationships. These companies develop their products to make money. It is foolish to expect Instagram to protect your kids from suicide. Should they have a responsibility for what is on their app? Yes. Should you blame them if your kid harms themselves because of something they saw on the app? Not entirely. You have to take some of the blame onto yourself. There are ways to keep your kids safe from that kind of content. If you don’t know about them or don’t use them, it isn’t the fault of the company. It’s yours. Be involved, pay attention, and do the work to keep them safe.

  • WhatsApp Update Brings Thumbprint and FaceID Lock To Private Messenger

    WhatsApp Update Brings Thumbprint and FaceID Lock To Private Messenger

The private messenger WhatsApp has recently updated to allow users to lock the app from prying eyes using Touch ID or Face ID. Private messaging is becoming more important to users these days, with the spotlight on Facebook and Google for their data mining and sales. WhatsApp has been a mainstay of private messaging for some time now, and this new update takes privacy from an algorithmic, software level to a more obvious, tangible one. You can now use Face ID or Touch ID, depending on the generation of your iPhone, to lock people out of the WhatsApp software entirely. This will keep people from opening the app and looking through your messages. Currently this feature is available for iOS only, but it is rumored to roll out to Android soon.

    What Parents Should Know

It’s important to know that there are options that allow you to keep an eye on your kids’ messaging without having to physically take their phone from them. However, if the physical approach is your style, then this update from WhatsApp could become a problem for you. Messages being locked in this way needn’t deter you from checking up on your child’s messaging activity, though. You can store your thumbprint on your child’s device so you can unlock it, or just have them unlock the app for you when it comes time to inspect their messages.

I recommend allowing your children a feeling of privacy by using some sort of software to monitor their messaging apps instead of taking the device from them every now and then. Not only does that plan give them a feeling of privacy, it is also a far better monitor than your weekly check-up. A message monitoring service like Bark looks at every single message your child sends or receives in real time, notifying you if any of those messages cross the line into dangerous or inappropriate content. Taking the phone from them to monitor it yourself allows messages to be removed before you get around to looking at them.

I never advise spying on your children without their knowledge. They should know that you are keeping an eye on their messages and how the software works. They should also know what the consequences are if they send messages they shouldn’t be sending. Finally, you should have an open conversation so they feel like they can come to you if they receive a message they are not comfortable with. No matter what you do to monitor your kids’ messaging, having a culture of transparency and openness in your home is critical.