Tag: pornography

  • These Apps Aren’t as Harmless as They Seem


    Our kids use all kinds of apps for many different reasons: some for socializing, some for fun, and some for school and productivity. We don’t think twice about letting our kids use Google Docs or even the Bible app. Yet many of these apps aren’t as harmless as they seem. Parents often message me asking if I’ve seen the latest awful thing people have done online. The answer is usually yes, and I’m not surprised. For the last five years I’ve been learning about the digital, connected world our kids are growing up in and how it impacts our children and our families.

    Something I’ve learned is that if a system or an app can be exploited to do harm, those who wish to cause harm will use it to do so. You see it yourself in your Facebook comments when some friends decide it’s the perfect forum for their disruptive thoughts. Worse still is the story of Bark’s project that put a 37-year-old mom on Instagram posing as a 13-year-old girl. The response was shocking: inappropriate pictures and requests filled her direct messages just minutes after she posted her first picture. The social features in the YouVersion Bible App have been used to groom potential victims. Google Docs has been used by young people for bullying, secret messaging, and sexting.

    It is shocking but I’m not surprised.

    What is our response to this tendency for people to take something meant for good and use it for the worst intentions? We can’t bury our heads in the sand and keep our kids from using technology at all; that just isn’t realistic. We won’t be going back to paper letters or saying no to laptops for school projects. The only reasonable response is to take responsibility for our children’s safety ourselves. We can no longer blindly trust the apps they use, imagining that no harm can come to them simply because the app wasn’t meant for harm.

    We have to help our kids remember that the stranger danger that is real when you’re six and at the playground is just as real when you’re fifteen and being contacted in direct messages by people you don’t know. I am not surprised by the nonsense happening on these apps. I just know that we, parents, are the only answer. People always find a way to ruin things that were meant for good. My advice is to talk to your kids. Tell them that if they are contacted by a stranger, even in an app like the Bible App, they should use caution. Remind them to say something if they see bullying online, even in a class Google Doc.

    Our children are surrounded by voices telling them all kinds of “truths.” If you aren’t creating a safe place for them to come and be open with you about their concerns, then you’re making it hard for them to live in this connected world. Do your best to be who they need you to be. I’m here to help.

  • ALERT! Kids Can Get to Pornhub from Snapchat in 5 Clicks


    We don’t like Snapchat. It has a history of allowing content that isn’t appropriate for kids, even though the app is rated 12+ in the app store. The app is built around disappearing messages, which are a nightmare when you’re trying to prevent predatory communication and sexting. Now, Protect Young Eyes has written an article highlighting the fact that, in just a few taps on your screen, you can get from Snapchat’s home page to Pornhub, the most popular pornography site on the internet. They even included a video that shows how easy it is to navigate to the adult site without ever leaving Snapchat’s app.


    What Does This Mean for Parents?

    Most filters for iPhone don’t monitor browsers inside apps like Snapchat and Facebook. This is why the ability to connect to adult websites within these apps is so dangerous. It only takes a few taps on your screen to go from a Snapchat story to the “Premium” page, where you can click links to all of their other accounts. These links don’t open other apps that would be blocked by Screen Time or other parental control software. Instead, they open in a browser within Snapchat, allowing access without being blocked by your filter or, sometimes, even being reported by your accountability software. The only real way to keep your kids off of those sites is to block the app altogether.

  • The Three Worst Tech Parents


    I had the opportunity to speak at a conference last week that was full of educators and school administrators. They were extremely excited about the things I had to share, they loved learning about ways to protect their students online, and they were genuinely interested in the statistics and facts about online dangers. They all, however, had one major complaint: “Parents just don’t seem to care.”

    That’s right! Teachers, administrators, afterschool program leaders, and even librarians want to help kids learn the best way to use their tech devices. Everyone is concerned about overuse and too much screen time. Nobody wants kids ending up on the wrong websites or being contacted by the wrong people. They all want kids to be protected while they’re on school property, but they know that is only a small slice of time compared to the time kids spend online at home.

    This all falls on parents. No one has as much influence over children as the parents who raise them. Teachers, coaches, pastors, and mentors all do what they can and have a real heart to protect your kids, but if mom and dad aren’t taking part, then it is an uphill battle.

    Here are three kinds of technology parents and where they mess up.

    1. The “Do as I say, not as I do” parent.

    I’ll never forget my neighbor’s grandfather when I was a child. He smoked like a chimney, several packs of cigarettes every day. When we were outside playing with our friends, it never failed: he would come out, light up a cigarette, and immediately tell us all, “Never start smoking, it’s really bad for you.”

    I get addiction. I understand that there are things people can’t just give up. But this “do as I say, not as I do” attitude can be very harmful to our kids. When it comes to technology, most of us pick up our phones dozens of times a day. We spend 4 to 6 hours per day creeping Facebook, watching YouTube, and posting to Instagram.

    Even as a 10-year-old kid I realized how weird it was that this man was standing there, chain-smoking cigarettes, and telling us not to do the same thing. Our kids get confused when we tell them they can’t have any more screen time while we stare at our own phones, just as we have all day long. Put it down, look up, and set a good example for your children.

    2. The “I’m super busy” parent.

    I remember being told to go play outside because my mom needed a few minutes to herself. We would play at friends’ houses, and every now and then a friend would say his mom wouldn’t let us play there that day because she needed the house to herself. Parents have always needed time without kids running around, asking for things, and getting on their nerves. The difference is that when I was a child, I was going to the homes of people my parents knew. Now we sit our kids down in front of devices on which they can communicate with the entire world.

    Using Netflix or YouTube as a babysitter is simply a bad idea. It can be workable if you know how to set it up properly, but most of the time parents know less about these sites and apps than their kids do. I get that you’re busy. I understand you have things you have to get done. It’s just very easy to let your kid be on a screen for 4 to 6 hours before you realize how long it has been. Use some sort of app that sets a time limit for your kids’ screen time. That way it doesn’t fall to you to remember when their time is up; it automatically kicks them off, and you can tell them to get outside and have some fun in the sun.

    3. The “I have great kids” parent.

    Of course you have great kids. I know they don’t want to do anything wrong online. They will not bully people, they won’t send inappropriate photos, and they are definitely not visiting adult websites. The problem with this logic is that they don’t have to seek these things out; these things come to them. Two out of every three kids who see adult content for the first time find it by accident, and the average age at which a young man first sees pornography is eight! These are, most likely, children whose parents would consider them “good kids.”

    I sat and watched a seven-year-old girl create videos of herself and post them publicly on an app called Likee. I looked the app up in the App Store and saw that it is rated 17+ because of the ability to post videos publicly online. I guarantee her mom didn’t know the app was posting videos that strangers could see. Moms and dads trust their kids because they believe they’re going to do the right thing. But the issue usually isn’t what your kid does online. Most of the time, the problem is the strangers on the other side of the screen.

    Your kids need you to care!

    The worst thing we do, as parents, is decide that we can’t learn any more about the tech our kids are using. We cannot be fooled into thinking that the digital world is moving too fast for us to keep up. It does move fast, I understand, but there are resources we can and should use to help us better wrap our minds around our children’s time on tech. Use FamilyTechBlog.com, our YouTube channel, and our podcast to help you stay informed. Knowledge is definitely power, and you need that power to keep your kids safe and help them develop healthy habits.

    Secondly, we often get too focused on ourselves and what we need. While our homes shouldn’t be fully centered on our children, we have to set boundaries and standards to protect our kids from the nonsense the online world can serve up. We should pay attention to what they do on social media and not let them use those apps until they are old enough to use them responsibly. We need to be knowledgeable about the video games they play, the sites they visit, and the people they communicate with online. We should learn all we can, every chance we get, to keep our kids safe.

    It is easy to get discouraged. We hear about worst-case scenarios on the news almost daily: kids going missing, kids hurting themselves because of something they’ve seen online, and studies showing how damaging excessive screen time can be for our children’s brains. My advice is not to get discouraged but to get inspired. Let this information drive you to learn more to protect your kids. Learn all you can and share what you learn with all of the parents you know. That’s the best way to protect our kids and help them build healthy habits.

     

    Listen to the podcast here:

     

  • TikTok Under Fire as Study Finds 25% of Kids Talk to Strangers Online


    TikTok is at the center of a controversy over the exposure of children to predators and child pornographers through live streaming on its app. One in twenty children who use live-streaming apps has been asked to take off their clothes, according to a study by the UK children’s charity NSPCC. Originally called Musical.ly, TikTok claims to “empower everyone to be a creator directly from their smartphones, and is committed to building a community by encouraging users to share their passion and creative expression through their videos.” Their mission statement sounds like they are building a place for our kids to stretch their creative muscles and build a supportive audience, but in reality it is exposing them to potential danger.

    Sexual exploitation is only part of the issue; there are popular hashtags on the app that highlight self-harm and eating disorders. Tags like #thinspo (thinspiration) feature videos of children as young as eight showing their rib cages through their skin and proclaiming that they are inspiration to others who desire to be thin. Suicide and self-harm are also featured on the app, complete with encouragement to hurt yourself and instructions on how to do so. TikTok says you have to be 13 to use the app, but as we have shared multiple times on this site, that age exists to protect the company from legal action concerning the collection of children’s data, not to protect your children from content on the app.

    While the app is rated 12+ in app stores in the U.S., the reasons listed for the rating are, in fact, very mature. The issue, again, as I’ve mentioned, is user-generated content. Anyone with a smartphone and a wifi connection can make videos and now livestream in TikTok, and anyone can watch your child perform on the app. This makes for an open, dangerous atmosphere filled with predators, adult content, scams, and violence.

    What Parents Should Know

    TikTok says it has filters and parental controls that allow you to set the app to private, but all of these measures have proven less than effective. Kids who use the app on their own can easily come across content that isn’t age appropriate. The content restriction and time management settings in the app are password protected; they can be useful and should be set up if you allow your child to use TikTok. Also be sure to turn off the ability for non-friends to comment on, share, and download your child’s videos (these options are on by default, creepy right?).

    We don’t want our kids talking to strangers online. All parents understand the dangers associated with live streaming and posting public videos to the internet. Unfortunately, many parents feel their hands are tied when it comes to keeping their kids safe on these apps and websites. That isn’t the truth, however; there are tools (some in the app and some third party) you can use to keep them from accessing things that are dangerous. An algorithmic filter is never going to be enough, though, so it is important that we have open communication with our kids about what they are posting and seeing on apps like TikTok. And if your child doesn’t meet the age restriction, they shouldn’t use the app.

    Twenty-five percent of kids talking to strangers online is a horrifyingly high statistic. It shows that while there are privacy settings and parental controls for parents to use, either parents aren’t using them or their kids are getting around them. I know that the privacy settings in TikTok aren’t password protected, so if your children want to talk to strangers on the app and they have time alone with it, there are ways for them to make that happen. It is important that parents take responsibility for protecting our kids online. Many media outlets are blasting these companies for putting our kids in danger, but I have to be honest: you don’t blame the slide when your kid falls off and busts their face; you think of precautions that YOU can take to keep it from happening again.

  • I Can’t Help You Protect Your Kids on Apps Meant for Adults


    I field messages and requests all week long from parents who want the latest tools for keeping their kids safe online. They ask about everything from YouTube to Instagram to Snapchat and want apps that will monitor social media use, block adult content, and limit screen time. While there are resources to help parents in each of those departments, some apps just aren’t intended for younger children. Unfortunately, many parents have a real problem accepting that fact.

    Streaming Videos

    Let’s look at YouTube as our first example. The video site was created in 2005 as a place for anyone to upload short videos to share with their friends. Google purchased YouTube in 2006, and social media became popular soon after, rocketing YouTube into the streaming giant it has become. The site is loaded with videos from filmmakers, vloggers, video gamers, makeup artists, geeks, professionals, educators, ministers, animators, and artists: basically any category you can think of. It has evolved into an unstoppable force, with 300 hours of footage uploaded every single minute. YouTube has come under fire for some of its content being too mature or sensitive, so it has employed algorithms to keep tabs on inappropriate videos. It also released an app for children called YouTube Kids. That app has seen its share of controversy too, after YouTube proved unable to keep sensitive material from showing up in its videos.

    YouTube obviously wasn’t intended for young viewers. It is a site populated primarily by videos uploaded by its users. Some companies that make content for kids use YouTube, but that is a choice these companies make in response to the popularity of the platform. It’s an attitude that says, “Kids are there, so we should be there too.” The goal is to reach the audience already there, not necessarily to build an audience on YouTube. There are no real parental controls (safe search is mostly useless), and videos are labeled as kid friendly without any human eyes ever seeing the entire video. The only time a content reviewer watches a video is when enough users of the site have flagged it as inappropriate. Letting your kids watch YouTube on their own is a risk many parents don’t even realize they are taking.

    What about Social Media?

    Snapchat, Instagram, and Facebook are all the same. Like YouTube, they feature content created and posted by the users of the service. This “user-generated content” can vary from political or religious views, to silly cat videos and memes, to random personal updates that mean nothing to anyone. People also post about their serious mental health issues, they share their plans to harm themselves or others, and they post images of themselves in compromising situations, and that’s just what people post publicly. Private messages contain what people send when they think nobody except those they trust is watching. Private messaging is how predators groom their victims. It’s how the out-of-control teenage boy convinces a girl to send him inappropriate pictures of herself. Social media is intended to be a place to connect with people, some you may know, some you don’t. It is meant to be a public forum, and what is private is meant to stay completely private. This is where the problems come in when parents ask for ways to monitor their kids’ social media.

    Age Rating vs Terms and Agreements

    I see a lot of parents giving their kids access to social media and other online activities when they reach the age of 13. This is based on the fact that the terms and agreements you approve before making an account list 13 as the minimum age to use the service. A common mistake parents make is thinking this age is meant to protect their kids from content on the site when, in fact, it’s intended to protect the company from holding data and information on kids under the age of 13. COPPA says that companies can’t collect and use the information of kids under 13 without parental consent. If a company says you can’t use the site when you’re under 13, then it can do whatever it wants with all of that data, and if your kid is underage, it isn’t the company’s fault: you ignored the terms and agreements when you allowed them to use the site.

    The age rating is the age recommendation you’ll see in the app store when you are downloading an app. This rating is based on the actual content in the app, not on any legal requirements for the company. The usual standard is that apps populated by user-generated content are rated 17+, because the company can’t guarantee that what is seen on its product won’t be considered adult content. When we allow our kids to use apps that contain user-generated content, we subject them to the opinions, behavior, and whims of everyone else who uses that app.

    Parental Involvement Before Parental Control

    When I am asked to help parents protect their kids in apps that are obviously not made for children, I feel like I’m being asked to hand parents a suit their kids can wear to protect them while they play in a burning building. I get it. It isn’t easy to tell your kids they can’t do something they want to do. “My friends are all on Snapchat.” Or the one that irritates me to no end: “The teacher/coach says I have to use Facebook to get the homework/practice schedule.” Sometimes we just have to say no. It is difficult to set the boundaries and limits that keep our kids safe, but if we have the right attitude about what we’re protecting them from, it becomes easier. Social media, YouTube, video games rated M for Mature: none of these things are intended for people under the age of 17, and when we allow our kids to use these products, we open them up to a world that is meant for adults.

    It is difficult for algorithms to catch nudity or violence in uploaded videos, and social media sites and private messaging apps go to great lengths to keep prying eyes from seeing what is being sent. This makes parental monitoring software hard to develop. Unfortunately, some burning buildings are just too dangerous, and there isn’t much that can be done to protect you if you’re inside. If you aren’t ok with your child seeing content that is meant for grown-ups, then I recommend uninstalling the app instead of trying to find software that keeps it from doing what it was designed to do.

  • How Can Artificial Intelligence Protect My Family?


    How AI Works

    When you think of artificial intelligence, it’s natural to imagine Skynet or some similar software that runs everything for us. While that may be the long-term goal someday, right now AI is nowhere near that smart. Current artificial intelligence isn’t really intelligent at all. While it does learn from the input that is fed to it, there is currently no way for AI to decide what it needs to learn on its own. There is a very large gap between software algorithms that can learn and intelligent software that makes its own decisions.

    At CES in 2018 I watched a robot named Aeolus glide across a room cleaning up. It took a solid three minutes to move from one side of the makeshift living room, reach down and pick up a Wii remote, and roll to the table to set it down. It was nothing like what television and movies have promised us, but I guess it was still cool. What parents should understand is that while the developers of an AI can promise that their algorithms learn and behave as if they have intelligence, that is not the same as actually being intelligent. Humans still have to do the thinking.

    While it isn’t foolproof and is definitely not sentient, artificial intelligence is a good tool. There are many ways AI is useful, and much of the latest hardware and software uses AI for even its most minor functions. Here are some of the interesting ways AI can make your parental control and accountability tools even better.

    Filters

    There was a day when an internet filter depended solely on the web address or IP address of the site you were visiting to tell whether it would contain inappropriate content. A master list had to be updated continually with new websites and keywords. An AI-based filter is different: the algorithm is “fed” images and other content over and over again until it can detect actual images, text, and videos on web pages instead of just matching the address of the site you are visiting. This can be helpful when a website doesn’t typically contain adult content but a certain article or comment section features material that crosses the line. A traditional filter couldn’t catch that, but one that uses AI can.

    Circle (meetcircle.com) and Net Nanny (netnanny.com) are examples of filters that use smart algorithms to block web content.

    Accountability

    Accountability software works much like a filter, except that when it sees something inappropriate it doesn’t block it; it alerts whoever is on the alert list. AI has revolutionized this sort of software because it allows parents to receive only lists of unwanted sites instead of having to sort through everything viewed by the person they are keeping accountable. The software I recommend, Accountable2You (accountable2you.com, promo code BecauseFamily), is updated constantly so its algorithm can properly and effectively scan for adult content. It works very well. You may get occasional alerts for content that shouldn’t be considered adult, but it doesn’t happen often, and it’s worth it for the peace of mind.

    Privacy and Security

    Finally, when we discuss AI and algorithms, we must talk about privacy and security. Algorithms may have been the beginning of many of our privacy problems, but they may also provide some solutions. Tools like Bitdefender can be used to protect your home network, and their AI can tell the difference between a forgotten password and a malicious login attempt. Our home networks are becoming increasingly attractive targets for hackers, and AI-assisted security tools can help protect you from that kind of attack.

    I hear a few different reactions when I talk about artificial intelligence. Most people roll their eyes or glaze over because they aren’t interested; it’s some tech term they don’t think they can fully understand, so they’d rather not talk about it. Another group is super interested, always wanting to learn more and understand it better. These are my nerd friends. I love them. Finally, there’s the group that just freaks out. They immediately think of the movies and TV shows and want to move into the woods and unplug. Which person are you? Are you willing to let AI work to your family’s benefit? Is it all too much for you? Let me know in the comments below.

  • Tumblr to FINALLY Ban Adult Content



    *WARNING: this post uses quotes with direct language about pornography and graphic content.

    While most social media sites that allow user-generated content have been working to protect their users from unwanted adult images and videos, Tumblr has been happy to be known as “porn GIF central.” Last month, however, the app was pulled from the iOS App Store over child pornography, and that seems to have caused the developers to reconsider their policies. Earlier this week, Tumblr announced that it is changing its sensitive content guidelines and will be blocking such posts in the future.

    Tumblr defines sensitive content as:

    photos, videos, or GIFs that show real-life human genitals or female-presenting nipples, and any content—including photos, videos, GIFs and illustrations—that depicts sex acts. – Tumblr help center.

    Their guidelines also mention what types of posts will not cross the line into being considered “sensitive”:

    Examples of exceptions that are still permitted are exposed female-presenting nipples in connection with breastfeeding, birth or after-birth moments, and health-related situations, such as post-mastectomy or gender confirmation surgery. Written content such as erotica, nudity related to political or newsworthy speech, and nudity found in art, such as sculptures and illustrations, are also stuff that can be freely posted on Tumblr. – Tumblr help center.

    Their terms now state that sensitive content will not be allowed and that any previously posted sensitive content not marked as explicit will be flagged and removed. Accounts already flagged as explicit (you can tag your own account as explicit) will keep their explicit status and be allowed to continue posting; however, posts, both past and future, that are considered explicit under the new guidelines will be treated as such and removed.

    What Parents Should Know

    Very simply put, Tumblr is still going to allow some forms of sexual content and nudity in its app, as long as it can be labeled as political, newsworthy, artistic, or health related. Many other social media outlets already have similar guidelines, so while Tumblr will no longer allow “hardcore” sexual content, there are still going to be images, videos, and GIFs that you don’t want your children to see. My advice is, as always, to keep an eye on what your children use social media for. If they are sending messages to friends, make sure the communication is wholesome and healthy and that they are only talking to people they know. If they are using it for artistic inspiration, know that they could come across content you consider sensitive, even if Tumblr does not.

    Bark is a good way to keep an eye on what your children are sending in social media messages. It uses artificial intelligence to watch for dangerous conversations and sends you an alert if something about suicide, self-harm, sexting, or bullying is sent or received. As I always say, the most important thing you can do is talk with your child about what they do online and what they use their social media for. You may hear from them that Tumblr is all safe now and that they should be allowed to download it, but let this article be your warning: what Tumblr considers safe may not be the same as what you consider safe.

  • Tumblr Removed from Apple App Store for Child Pornography


    Photo blogging app Tumblr has been removed from the iOS App Store because of child pornography. Earlier this month the App Store pulled Tumblr from its market unexpectedly. The reason wasn’t announced at the time, but it has recently become clear that scans showed child pornography was making it through Tumblr’s content filters. A statement from Tumblr confirmed that child pornography was the reason for the app’s removal and said the company is working hard to fix the flaws in its scanning algorithm and get the app back on the App Store.

    Tumblr has been criticized for its lack of concern about adult and inappropriate content on its app. Some even call it “porn GIF central.” It added an on/off switch for adult content when Apple made it a requirement but didn’t password protect it. Tumblr has a reputation for doubling down on the fact that pornography is what makes its app so popular. The app is still available on Android’s Google Play Store.

    What Parents Should Know

    It didn’t take much research for me to add Tumblr to my uninstall list a couple of years ago. It is still there, and this latest news only solidifies the fact that it belongs there. There is content on Tumblr that many people want to see: geek stuff, memes, humor, art, and photography are all featured prominently on the app. But a simple search or a click on the wrong related image can lead to hardcore adult images and GIFs. Your children shouldn’t be allowed to use Tumblr, and your teens should be advised against it.

  • NetNanny | Let’s Review Video


    Net Nanny features one of the strongest filters available, plus custom settings, time management, alerts, and much more. In this video, I walk you through the Net Nanny website and we discuss its pros and cons. I share some of the setup woes I experienced with Net Nanny and why I think those issues have since been addressed.

    You can learn more at NetNanny.com 

  • Instagram Has Added 4-Way Group Chat


    Available today, Instagram has added a way to chat with your friends while simultaneously creeping the app. Instagram now allows users to video chat with up to four friends and to multitask within the app by minimizing the chat screen. You can call friends directly, and they will be notified of a call they can then accept; or, if they go into your group chat feed and see the icon is blue, it means you’re chatting with someone and they can simply join.

    Instagram hit 1 billion users this month, and the company is doing its best to make it the app people spend most of their time in. This update meets the video chatting need that so many young people have, then doubles down with the ability to explore the app while you chat. This makes using Instagram an even more social experience.

    Instagram has also added new camera effects and channels that you can explore that highlight different topics.

    What Parents Should Know

    Video chatting is available in many apps, including Snapchat, Facebook, Houseparty, and FaceTime. It is becoming the most common way for our young people to spend time with each other. Retail stores and malls are closing, and movie theaters are adding features to attract an older audience, all because our kids don’t have to go anywhere to spend time with each other.

    Whether this is good or bad is up to you to decide. Studies have shown that even video chatting does not meet the same social needs as being in the same room with somebody. So my advice is simply to monitor the amount of time your kids spend on their social media apps, whether they are chatting with friends, just scrolling through images, or posting their own information. There’s a lot that needs to happen to keep them secure, but all experts agree we have to be careful with how much time they spend on their devices.

