Instagram started the week in a big way, announcing that all of its current and future account users under 18 will automatically be placed into Teen Accounts.
Teen Accounts, according to Instagram’s blog and the Teen Accounts About page, are automatically private. Content is filtered through the app’s most restrictive sensitive content setting. A sleep mode runs from 10 PM to 7 AM every day, and a reminder to close Instagram is activated after 60 minutes of continuous use.
These new Teen Account policies are effective immediately. All new account users found to be within the age restrictions will automatically be given Teen Accounts. Current account holders will be notified by Instagram that their accounts will be converted in the coming week. Within 60 days, all underage accounts in the US, Canada and the UK are slated to become Teen Accounts.
Sneaky teens have driven Meta to add further documentation requirements to prove account users are providing accurate, age-affirming information—no fake birthdays will cut it anymore—and to implement a new A.I. program to ensure account users are honest about their age.
Parents also play a crucial part in the new initiative. For anyone under the age of 16, modifications to a Teen Account can only be made through a parent who has made their own Instagram account. The child needs to send the parent an invite for supervision, and once it is accepted, parents have access to parental controls. They can see what content topics their teen is viewing on the app, and though unable to read their child’s messages, they will be able to see who they’ve messaged in the last seven days.
Meta, the company behind Instagram, claims this is in response to the harm social media has inflicted on younger internet users.
Head of Instagram, Adam Mosseri, told the New York Times, “We decided to focus on what parents think because they know better what’s appropriate for their children than any tech company, any private company, any senator or policymaker or staffer or regulator.”
With all of the talk of protection and concern, it’s important to note that these changes appear less than a year after Meta was sued by 42 state attorneys general in the U.S. on the grounds that its platforms’ data collection from underage users violated federal protections for minors, specifically the Children’s Online Privacy Protection Act (COPPA).
It raises the question of whether all of this is for “the good of the children” or just an attempt to save face.
Regardless of intent, all that matters is that kids are safe online, right? But when does safety begin to encroach on privacy, or even turn into censorship? Should parents be able to observe who their teen is messaging online? With the rise of predatory behavior online, parental protection makes sense.
But what about LGBT youth who aren’t out to their parents yet, or teens in abusive households? What about emancipated teenagers who, for one reason or another, don’t have a parental figure to turn to?
Teen Accounts will be placed on Instagram’s most heavily censored setting. In its guidelines, Instagram specifies the content being restricted:
“Content that may depict violence, such as people fighting.”
“Content that may be sexually explicit or suggestive.”
“Content that promotes the use of certain regulated products, such as tobacco or vaping products, adult products and services, or pharmaceutical drugs.”
“Content that may promote or depict cosmetic procedures.”
“Content that may be attempting to sell products or services based on health-related claims, such as promoting a supplement to help a person lose weight.”
On paper, this sounds reasonable and safe, but what about news stories, which often cover these subjects for the sake of education and warning? Where does art fit into this narrative, when violence and even nudity frequently appear in widely regarded works of high art?
Where do we draw the line? Is this protection, or an invasion of privacy?
The U.S. Surgeon General released an advisory in 2023 detailing the concerning effects social media can have on underage users. In the studies it cites, 46% of users between 13 and 17 years old said social media made them feel worse, and 75% of adolescents stated that websites did a fair or poor job of addressing complaints of cyberbullying and harassment.
Predators go after underage people every day through their phones and computer screens. Plenty of teenagers are smart enough not to fall for scams and to avoid talking to strangers online, but plenty more will be exploited, manipulated and harassed before they even consider reaching out for help.
At the end of the day, a parent or guardian’s job is to do what Instagram is enabling them to do: Protect their kid.
Teen Accounts are only the beginning of stronger limits on both what teens can do on their own and what can be done to them without parental interference or protection.
Is it perfect? No. Will kids and teens still try, and sometimes succeed, in circumventing parental controls? Obviously. But are stronger barriers needed to protect underage users? Absolutely.