
Apps offer teens some one-and-done settings to stay safer online. Here’s a crash course.

Cyd Harrell is annoyed by the texts she gets from Amazon every time her 17-year-old daughter buys anything.

The account came along with her daughter’s first bank account, and the teen doesn’t need to ask permission for every single movie rental and minor purchase, Harrell reasoned. But the account also came with a collection of controls and monitoring tools for any parent anxious about releasing teens into the hairy world of online shopping unaccompanied.


Harrell says she understands the risks that come with having a child online. She’s known people who received anonymous messages containing images of their children, presumably pulled from the Internet and doctored to appear pornographic. On the other hand, keeping tabs on a teen’s every online move is draining, intrusive and, for many parents, not doable. Her daughter has friends who get a phone call if they wander a block away from where they’re supposed to be, she says, and some parents have continued monitoring their kids after they leave for college.

Parents like Harrell face tough decisions as concerns over teen safety in online spaces reach a fever pitch. Last week, Google unveiled a tool that lets minors and their guardians request the removal of photos from the search engine’s image results. Google’s move came after documents leaked by a former Facebook employee revealed that the company and its app Instagram hadn’t disclosed research suggesting Instagram negatively affects the mental health of young women and girls. Instagram announced a special version of its app for kids, then rolled it back when critics said it was a bad idea. In August, Apple announced it would start scanning phones for explicit images of children, then rolled that back amid concerns about privacy. Big tech companies are trying to improve kids’ safety and privacy online — but much of the work still falls to parents.

“I’m a little worried about it becoming a standard that parents monitor every single thing their kid does online. And that it will become some upper-middle-class standard of parenting that other people get denigrated for not meeting,” said Harrell, a civic design consultant in San Francisco.

When it comes to online safety, experts recommend less talking and more listening by parents, and the baked-in safety settings in apps teens love best can be a good jumping-off point.

Apps popular with teens, including TikTok, Instagram, YouTube and Snapchat, ranked in the top five most downloaded iPhone apps in the United States as of Tuesday, according to analytics company Sensor Tower, and all of them come with one-and-done safety settings that help limit unwanted interactions or protect privacy.

These settings aren’t fail-safe — much of the content Harrell believes to be most harmful to her daughter gets automatically recommended by the social apps themselves, she pointed out. And teens can undo many of these settings if they choose.

Here, we walk you through some important safety settings and tools every parent and teen should review. Safety settings are also an opportunity to talk openly about online risks: Ask your teen what they need to feel safe, and talk about what digital boundaries and settings would help get them there.

Google Images

Under Google’s new policy, people under 18 — as well as their guardians or an authorized representative — can fill out an online form asking the company to remove photos of them from search results. Keep in mind that this only removes an image from Google’s search results, not from the website where it’s hosted.

To see if there are unwanted images of your child in Google search results, type their full name into the search engine and click on the “Images” tab. If you and your child would like an image removed, you’ll need the URL of the image, the URL of the search results page and a screenshot. (Google’s help pages explain how to track down each of those and answer some frequently asked questions about the removal process.)
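
If you’d rather spot-check programmatically instead of searching by hand, below is a minimal sketch that lists image-result links for a name so you can review them yourself. It assumes you have set up Google’s Custom Search JSON API, meaning an API key and a Programmable Search Engine ID; those placeholder values and the function name are illustrative, and none of this is part of Google’s removal process itself.

```python
# Minimal sketch: list Google image-result links for a name for manual review.
# Assumes a Custom Search JSON API key and a Programmable Search Engine ID (cx)
# configured to search the whole web; results may differ slightly from google.com.
import requests

API_KEY = "YOUR_API_KEY"         # placeholder: from the Google Cloud console
SEARCH_ENGINE_ID = "YOUR_CX_ID"  # placeholder: from programmablesearchengine.google.com


def image_results_for(name: str, count: int = 10) -> list[str]:
    """Return URLs of the first `count` image results for a name."""
    response = requests.get(
        "https://www.googleapis.com/customsearch/v1",
        params={
            "key": API_KEY,
            "cx": SEARCH_ENGINE_ID,
            "q": name,
            "searchType": "image",  # restrict results to images
            "num": count,           # the API returns at most 10 per request
        },
        timeout=10,
    )
    response.raise_for_status()
    return [item["link"] for item in response.json().get("items", [])]


if __name__ == "__main__":
    for url in image_results_for("Full Name"):  # replace with your child's name
        print(url)
```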

TikTok

Video-sharing app TikTok lets parents link their accounts with their teen’s account and control settings including daily time limits for the app, whether teens can send and receive direct messages, whether accounts are public or private, who can comment on teens’ videos and whether teens can use the app’s search bar.

But if you’d rather give your teen more autonomy on the app, talk through the settings together — they can control many of them through their own accounts.

TikTok’s default safety settings depend on age — if your kid reports his or her age honestly, that is. Children under 13 are supposed to be funneled into a special “TikTok for Younger Users” experience with curated content and no sharing, messaging or commenting. Thirteen-to-15-year-olds also get different default settings than 16- and 17-year-olds, the company says. But age is self-reported and easy to fake.

To see the available privacy and safety settings, tap the profile icon in the bottom right corner of the app, then the menu in the top right that looks like three little lines. Go to “Privacy,” and you can toggle on the “Private account” slider, which means videos you share will only be available to people you’ve approved as followers. Account privacy should be on by default for people younger than 16.

Below that, you can control who will see your account as a suggested one to follow. For example, if you don’t want your account suggested to friends of your friends, slide that toggle to the off position.

Scroll down to the “Safety” heading to control whether other people can download videos you share, who can leave comments on your videos, who can see the people you’re following on TikTok, who can repurpose the videos you share for their own video creations and who can see which videos you’ve liked. You can also disable direct messages.

If you hit the backward arrow at the top left, you’ll find yourself back on the “Settings and privacy” page. Under the “Content & Activity” section, there are tools including the parent-child account pairing described above and “Digital Wellbeing,” where you can turn on the app’s restricted mode to see less “content that may not be appropriate for all audiences.”

Instagram

Facebook-owned photo-sharing app Instagram comes with some built-in safety features for people who list their age as under 18.

Strangers can’t send teens direct messages on Instagram unless the teen already follows them. For people younger than 16, accounts are set to private automatically, so strangers can’t add them as a friend without their approval.

To change an account from public to private, open the Instagram app and tap the tiny person icon in the bottom right corner. Then, open the menu in the top right corner, which looks like three little lines. Go to “Settings” then “Privacy” and turn on the “Private Account” slider.

The app also introduced restrictions for adults who use their Instagram accounts to send messages or friend requests to minors, or who have recently been blocked or reported by minors. These accounts won’t be able to find and follow kids’ accounts through the search bar or recommended-content sections. If a teen connects with a suspicious adult and that adult tries to send a message, the app pops up a safety notice and asks whether the teen wants to block or report the sender. These suspicious accounts aren’t removed because they haven’t explicitly violated the app’s rules, according to an Instagram spokeswoman.

Last, if your teen wants to see less “sensitive” content — which includes bare bodies, violence and drugs — in the Explore tab and recommended-content slots in the main photo feed, they can go back to the menu and select “Settings,” “Account,” “Sensitive Content Control” and “Limit Even More.” (The “Allow” option for sensitive content should not show up for people under 18.)

Be aware: Limiting sensitive content may also filter out content some teens want to see, like body-positive photos or work by LGBTQ artists. Instagram has said the filter is based on its notoriously fuzzy recommendation guidelines.

Snapchat

Snapchat accounts for people under 18 are private by default, which means teens must approve any new friends, and strangers can’t find them unless they know the teen’s Snapchat username or are already connected through a mutual friend. The teen’s list of friends also remains invisible to other users.

Everyone on Snapchat can receive direct messages only from people they’ve accepted as a friend. This year, the app started serving notifications that prompt users to review their friend lists and remove any unwanted people.

YouTube

Google’s video-sharing app YouTube changed its policies in August so that videos uploaded by people ages 13 to 17 can be viewed only by people the video has been shared with directly — they don’t show up on the teen’s profile or in search results.

The app also turned off autoplay, which queues up a new video as soon as one ends, for accounts belonging to teens. Both the default video privacy and the autoplay setting can easily be changed within a teen’s account.
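
For parents comfortable with a little scripting, here is a minimal sketch, assuming the teen signs in themselves and you have created an OAuth client file (the client_secret.json name is a placeholder) with read-only access to the YouTube Data API; it lists any of the channel’s uploads that aren’t set to private so you can review them together.

```python
# Minimal sketch: list a channel's uploads that are not private, using the
# YouTube Data API v3. Assumes an OAuth client file with the youtube.readonly
# scope; the teen authorizes the script in their own browser.
from google_auth_oauthlib.flow import InstalledAppFlow
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/youtube.readonly"]


def audit_upload_privacy() -> None:
    creds = InstalledAppFlow.from_client_secrets_file(
        "client_secret.json", SCOPES  # placeholder file name
    ).run_local_server(port=0)
    youtube = build("youtube", "v3", credentials=creds)

    # The channel's uploads live in a special "uploads" playlist.
    channel = youtube.channels().list(part="contentDetails", mine=True).execute()
    uploads_id = channel["items"][0]["contentDetails"]["relatedPlaylists"]["uploads"]

    items = youtube.playlistItems().list(
        part="contentDetails", playlistId=uploads_id, maxResults=50
    ).execute()
    video_ids = [i["contentDetails"]["videoId"] for i in items.get("items", [])]
    if not video_ids:
        return

    videos = youtube.videos().list(part="status,snippet", id=",".join(video_ids)).execute()
    for video in videos["items"]:
        status = video["status"]["privacyStatus"]  # "public", "unlisted" or "private"
        if status != "private":
            print(f'{video["snippet"]["title"]}: {status}')


if __name__ == "__main__":
    audit_upload_privacy()
```

Because the teen has to authorize the script in their own browser, the check stays a shared step rather than silent monitoring.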

As with other apps, the company says people under 13 aren’t allowed on YouTube unless a parent or guardian enables a supervised account. Google launched YouTube Kids in 2015 and, earlier this year, introduced supervised accounts for tweens and teens with additional content restrictions.

Twitch

Twitch, an app for sharing live streams and other videos, is intended for people 13 or older, the company says. Much of its content is video-game-related.

Because video on Twitch is often streamed in real time, the company can’t always control what creators say or do. But there are settings your teen can enable to filter what kind of language shows up in the comment streams alongside live videos — including profanity, hostility, sexually explicit language and discrimination — although automated filters for harmful language have historically had a tough time getting things right.
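
To see why such filters stumble, here is a toy sketch (not Twitch’s actual system) of the simplest approach, plain keyword matching; the blocked-word list and chat messages are invented for illustration. It flags a harmless message that happens to contain a blocked word and misses one that breaks the word apart with punctuation.

```python
# Toy sketch of keyword-based chat filtering and its failure modes.
import re

BLOCKED = {"dumb", "trash"}  # invented blocklist for illustration


def is_flagged(message: str) -> bool:
    """Flag a chat message if any blocked word appears as a whole word."""
    words = re.findall(r"[a-z']+", message.lower())
    return any(word in BLOCKED for word in words)


examples = [
    "that play was trash",                    # flagged: contains a blocked word
    "you're du:mb lol",                       # missed: punctuation splits the word
    "take out the trash before raid night",   # flagged even though it's harmless
]
for msg in examples:
    print(f"{is_flagged(msg)!s:>5}  {msg}")
```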

To turn on language filters, tap on the three-dots icon to the side of the comment box on any video stream. Choose “Hide Offensive Language,” and you’ll see different filters you can toggle on and off.

You can also block direct messages from strangers on Twitch. Tap the profile icon in the top left corner of the app, then go to “Account Settings” then “Security & Privacy” and turn on the “Block Whispers from Strangers” slider.

Discord

Discord is an instant-messaging app popular with gamers and other online communities that comes with safety settings your teen can switch on. People on Discord must be 13 or over, according to a company spokesman, but age is self-reported.

These settings live in the account tab that looks like a game controller in the bottom right corner of the app. Tap the icon, then go to “Privacy & Safety.” The “Keep me safe” setting under “Safe direct messaging” means the app will scan all direct messages for explicit photos.

Below that, teens have the option to toggle off the setting that allows direct messages from people in chat communities, or servers, they join. And under “Who can add you as a friend,” they’ll see options for “everyone,” “friends of friends” and “server members.”
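
Those menu options are personal to each account, but if your teen spends most of their time on a server that you or another trusted adult runs, a rough server-side counterpart is Discord’s explicit-media filter, which a server owner can apply to every member. Here is a minimal sketch using the discord.py library; the bot token and server ID are placeholders, and the bot account needs the Manage Server permission.

```python
# Minimal sketch: turn on Discord's explicit-media filter for all members of
# a server you control, via a bot with the Manage Server permission.
import discord

TOKEN = "YOUR_BOT_TOKEN"        # placeholder bot token
GUILD_ID = 123456789012345678   # placeholder server ID


class SafetyBot(discord.Client):
    async def on_ready(self) -> None:
        guild = self.get_guild(GUILD_ID)
        if guild is not None:
            # Scan media posted by every member, not just members without roles.
            await guild.edit(explicit_content_filter=discord.ContentFilter.all_members)
            print(f"Explicit-media filter enabled for {guild.name}")
        await self.close()


SafetyBot(intents=discord.Intents.default()).run(TOKEN)
```

Applying the filter to all members is the closest server-wide counterpart to the “Keep me safe” setting, though it only covers that server’s channels, not direct messages.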
