Instagram just announced a significant update to the way it safeguards and supports teen users on its platform, and I believe this development is timely, well-constructed, and likely to lead to measurable benefits. These changes will initially be applied to all accounts of users under 18 in the US, UK, Canada, and Australia within the next two months, and then rolled out to other countries in 2025.

I was invited to the launch event in New York City to speak on a panel where we covered parent-focused strategies involving communication, rule-setting, belongingness, connectedness, resilience, and more. We were led by Nicole Lopez, Global Director of Youth Safety Policy, and joined by Dr. Hina Talib (@teenhealthdoc), Director of Adolescent Medicine and Co-Director of Pediatrics at Atria New York, Yvonne Johnson, President of the National PTA, and Lanie and Madison, two amazing teenagers who serve on the American Academy of Pediatrics Youth Council.

Here are the new features for Instagram Teen Accounts:

Accounts Will Be Set to Private by Default. Teen accounts are automatically set to private for all users under 18. This means that only approved followers can see their content and interact with them. Users who are 16 and 17 can go in and adjust this setting, but users who are 13 to 15 can only do so with parent permission through Instagram’s Parental Supervision tool.

Messaging and Tagging Will Be Restricted. Teens under 18 can only receive DMs from, or be tagged or mentioned by, people they already follow.

Improved Content Filtering for Young People. Teens under 18 will be placed under Instagram’s most rigorous content control measures, which means that they won’t be exposed to [what Meta has determined to be] “sensitive” or mature content on their Explore page or in Reels. In addition, they will be under the most restrictive version of the “Hidden Words” anti-bullying feature, so that offensive words and phrases will be filtered out of the Instagram comments and DM requests they receive.

Time Management Features. Instagram will send reminders to teens under 18 after one hour on the platform, in an effort to encourage them to go do something else. Additionally, a “sleep mode” will silence notifications and send automated replies to direct messages between 10 p.m. and 7 a.m.

Augmented Parental Controls. I mentioned earlier that 13- through 15-year-olds who want to adjust their privacy settings on Instagram will need parental permission. Instagram has had Parental Supervision tools available in the app since March 2022, which have allowed parents to set time limits, see when their teen reports or blocks another user, restrict their child’s ability to see “sensitive” content, control who can send their child DMs, and decide whether their child’s profile is set to Private or Public. With Instagram Teen Accounts, parents now can see which accounts their child has DMed in the last seven days (without seeing the contents of any DMs, for privacy reasons), restrict access to the app during certain periods of the day or night, and view the topics their teen has chosen to follow. Previously, Parental Supervision required that both the child and the parent opt in to the feature. Now, 13- to 15-year-olds will have to opt in to this level of supervision before they can make any privacy changes (i.e., their parent will receive a notification that their teen is requesting a change, and will have to approve or deny it). This should prompt more conversation and collaboration between parents and children, given that teens will need to connect with their parent’s Instagram account and be granted formal permission to use less protective settings, while parents will simply need to be more involved in what their teen does on the platform.

Topics to Choose From. Teens will be given the option to choose from a variety of fun, positive content areas to help train Meta’s personalized recommendation algorithm and give them more of what they want to see.

Other Considerations

When considering all of these changes, some possibilities come to mind that may compromise the efficacy of this initiative. First, parents must have an Instagram account (and set up Parental Supervision) if they want to support their 13- to 15-year-old child in changing any privacy settings. It is very possible that many are already on Instagram, but this may cause some frustration for those who simply don’t want to be on the platform. Second, a teenager may attempt to change their age to reduce the restrictions placed on them, or set up a separate account with an older age. To prevent this, Instagram is using machine learning and AI to identify users who may have misrepresented their birthdate when creating an account, or who attempt to change their age in-app to bypass these restrictions. This will be done by analyzing the user’s social graph (i.e., who they are connected with), their behaviors on the platform, and other signals that often betray one’s true age.
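To make the idea of age inference from such signals more concrete, here is a minimal, purely illustrative sketch. Meta has not published its model, so every feature name, weight, and threshold below is hypothetical; a real system would learn its parameters from labeled data and use far richer signals. The sketch only shows the general shape of the approach: combine social-graph and behavioral features into a score, then flag high-scoring accounts for review.

```python
# Illustrative sketch only: Meta's actual model is not public. All features,
# weights, and thresholds here are hypothetical, chosen to show how
# social-graph and behavioral signals could be combined into a score.
from dataclasses import dataclass


@dataclass
class AccountSignals:
    stated_age: int              # the birthdate-derived age the user claims
    median_follower_age: float   # from the account's social graph
    share_teen_followers: float  # fraction of followers aged 13-17
    school_hours_activity: float # fraction of activity during school hours


def underage_likelihood(s: AccountSignals) -> float:
    """Combine signals into a 0-1 score that the stated age is inflated.

    A real system would learn these weights from labeled examples; the
    hand-set values below are purely for illustration.
    """
    score = 0.0
    if s.stated_age >= 18 and s.median_follower_age < 16:
        score += 0.4  # adults rarely have mostly-younger social graphs
    score += 0.4 * s.share_teen_followers
    score += 0.2 * s.school_hours_activity
    return min(score, 1.0)


def flag_for_review(s: AccountSignals, threshold: float = 0.6) -> bool:
    """Flag self-reported adults whose signals look like a teen's."""
    return s.stated_age >= 18 and underage_likelihood(s) >= threshold


# A self-reported 19-year-old whose graph and habits resemble a 14-year-old's:
suspicious = AccountSignals(19, 14.5, 0.9, 0.8)
print(flag_for_review(suspicious))  # True with these toy weights
```

The point is not the specific numbers but the design: no single signal is decisive, so many weak indicators are aggregated, and a flag triggers review (e.g., an age-verification prompt) rather than an automatic penalty.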

Third, many teens have unique living situations. Separated or divorced parents will have to decide whose Instagram account will be formally connected to their child’s, and who will take the lead on working with their child to adjust the privacy and safety controls. It is not clear what teens in foster care, group homes, or other environments must do, given that it may not be possible to reach the person who holds their legal rights and ask them to connect on Instagram. I’m working to find out what options exist to support teens in such settings.

Finally, I continue to think about the compromised ability of marginalized or minority youth to explore their identity or beliefs if their parent or guardian is always alerted to who they follow and who they are DMing. In the European Union, and under the United Nations (UN) Convention on the Rights of the Child, the human rights of children are held to apply in the digital realm as well: children have the right to access information, express themselves, participate and play, be heard when laws and products that affect them are developed, and benefit from appropriate privacy protections. The United States is the only UN member state that has yet to ratify the Convention, and as such it does not fully apply here – but it does apply across many other countries. It is not clear how the tensions between governmental regulation and platform actions ultimately will be resolved, but this is something I am watching closely. There are many agendas, angles, and moving parts.

Why Does this Matter?

Parents, guardians, politicians, lawmakers, and other stakeholders have repeatedly asked for additional safety measures and mechanisms from social media platforms (Office of the Surgeon General, 2021, 2023), and have regularly asserted that tech companies – which profit from the participation and engagement of tens of millions of teens – aren’t doing enough to proactively prevent exposure to certain risks and harms (Ortutay & Hadero, 2024). Platforms, of course, acknowledge those concerns, but struggle to find the proper balance between constraining teens’ online experience and providing them the ability to access information, pursue their interests, make connections, and find their communities in relatively frictionless ways.

Research shows that youth are concerned about online privacy (Balleys & Coll, 2017; De Wolf, 2020). They are not oblivious. And they do implement various restrictions and controls as they see fit. However, research also finds that youth between the ages of 13 and 17 are much more likely to connect with unknown friends/followers, disclose personal information, and engage in other risky behaviors online as compared to 18- to 24-year-olds (White et al., 2015). In addition, there may be a disconnect between their awareness of privacy concerns and their actions to mitigate privacy risks (also known as the privacy paradox) (Barnes, 2006; Taddicken, 2014). It also has been shown that they may struggle with complex or granular privacy settings (Johnson et al., 2012) and, as a consequence, may simply use the default settings they are presented with.

If we have a population that is fundamentally more vulnerable (because they are minors!), and that naturally leans toward candid information disclosure (Hodkinson, 2017), broad social networks built by meeting a diverse group of people to expand their social circle (Anderson et al., 2022), and risk-taking behavioral choices (Peluchette et al., 2015; Vannucci et al., 2020), platforms would do well to intentionally place young users in more of a “walled garden” environment for their mental health, well-being, and safety. In addition, research has shown that parental monitoring can facilitate a shared learning experience (Andrews et al., 2020) rather than serving as a blunt tool of control and punishment for youth. The aforementioned concerns remain relevant. However, what Instagram has done here – even as an unwelcome imposition upon tens of millions of teen users around the world – moves us meaningfully forward when it comes to youth online safety.

My understanding is that Meta will carefully study their metrics to see what modifications need to be made, and determine how best to resolve the unexpected complications that will likely arise. Empirical research is also essential to study how this change affects teens on an emotional, psychological, and behavioral level (when it comes to what they do online), how many parents and guardians set up Parental Supervision and use it the way it is intended, whether the quantity and impact of risks and harms decrease, and how the quality of parent/teen relationships evolves. I’ll keep you updated as we learn of new developments, so stay tuned.

References

Anderson, M., Vogels, E. A., Perrin, A., & Rainie, L. (2022). Connection, creativity and drama: Teen life on social media in 2022. Pew Research Center.

Andrews, J. C., Walker, K. L., & Kees, J. (2020). Children and online privacy protection: Empowerment from cognitive defense strategies. Journal of Public Policy & Marketing, 39(2), 205-219.

Balleys, C., & Coll, S. (2017). Being publicly intimate: Teenagers managing online privacy. Media, Culture & Society, 39(6), 885-901.

Barnes, S. B. (2006). A privacy paradox: Social networking in the United States. First Monday.

De Wolf, R. (2020). Contextualizing how teens manage personal and interpersonal privacy on social media. New Media & Society, 22(6), 1058-1075.

Hodkinson, P. (2017). Bedrooms and beyond: Youth, identity and privacy on social network sites. New Media & Society, 19(2), 272-288.

Johnson, M., Egelman, S., & Bellovin, S. M. (2012). Facebook and privacy: It’s complicated. Proceedings of the Eighth Symposium on Usable Privacy and Security.

Office of the Surgeon General. (2021). Protecting youth mental health: The US Surgeon General’s advisory. Retrieved December 6, 2021, from https://www.hhs.gov/sites/default/files/surgeon-general-youth-mental-health-advisory.pdf

Office of the Surgeon General. (2023). Social media and youth mental health: The US Surgeon General’s advisory. Retrieved May 23, 2023, from https://pubmed.ncbi.nlm.nih.gov/37721985/

Ortutay, B., & Hadero, H. (2024). Meta, TikTok and other social media CEOs testify in heated Senate hearing on child exploitation. AP News. https://apnews.com/article/meta-tiktok-snap-discord-zuckerberg-testify-senate-00754a6bea92aaad62585ed55f219932

Peluchette, J. V., Karl, K., Wood, C., & Williams, J. (2015). Cyberbullying victimization: Do victims’ personality and risky social network behaviors contribute to the problem? Computers in Human Behavior, 52, 424-435.

Taddicken, M. (2014). The ‘privacy paradox’ in the social web: The impact of privacy concerns, individual characteristics, and the perceived social relevance on different forms of self-disclosure. Journal of Computer-Mediated Communication, 19(2), 248-273.

Vannucci, A., Simpson, E. G., Gagnon, S., & Ohannessian, C. M. (2020). Social media use and risky behaviors in adolescents: A meta-analysis. Journal of Adolescence, 79, 258-274.

White, C. M., Gummerum, M., & Hanoch, Y. (2015). Adolescents’ and young adults’ online risk taking: The role of gist and verbatim representations. Risk Analysis, 35(8), 1407-1422.
The post Instagram Teen Accounts – A Win for Parents and Legislators appeared first on Cyberbullying Research Center.