Added Within Minutes: How Snapchat Fails Younger Users
By Charlie · Jun 1 · Updated: Jun 8

Executive Summary / Overview
This case study identifies the online risks minors face when using Snapchat, focusing on the platform’s failure to protect underage users from online predators and inappropriate or explicit content. The investigation highlights the urgency of improving safeguarding features and procedures on social media platforms to protect children online. This study matters because Snapchat is widely used by children and teens worldwide, yet its moderation features fail to protect them from harmful interactions.
Objectives
The main objectives of this investigation are to:
· Witness first-hand how minors are treated and what they are exposed to on chat platforms.
· Assess how effectively Snapchat’s Family Centre prevents harmful behaviour towards minors.
· Evaluate the effectiveness of Snapchat’s moderation systems for underage users.
· Identify where predators start conversations with minors.
· Provide recommendations for the platform to enhance the safety of children online.
What Is Snapchat?
Snapchat is a popular social media messaging platform where users are encouraged to communicate through photos, though standard text messaging is also available. As of 2025, the platform has over 800 million monthly active users and 453 million daily active users. Snapchat is aimed primarily at younger generations and has a notable share of young users, with 18.3% of users aged 13–17. According to Apple’s App Store, Snapchat was the 5th most popular photo & video platform as of April 2025, and it ranks as the 9th most popular social media platform worldwide. Users can set chats to “Delete After Viewing”, a popular option because it gives users a sense of “privacy”.
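To put those figures in perspective, a quick back-of-the-envelope calculation suggests a teen audience in the hundreds of millions. This sketch assumes the reported 18.3% share applies uniformly across the monthly active base, which may not be exactly how the statistic was originally measured:

```python
# Rough scale of Snapchat's 13-17 audience (assumption: the 18.3%
# share applies uniformly to the 800M monthly active users).
monthly_active_users = 800_000_000
teen_share = 0.183  # reported share of users aged 13-17

estimated_teen_users = monthly_active_users * teen_share
print(f"Estimated users aged 13-17: {estimated_teen_users:,.0f}")
# Estimated users aged 13-17: 146,400,000
```

Even under conservative assumptions, that is a very large population of minors relying on the platform’s safeguards.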
Nevertheless, with its disappearing messages and minimal monitoring, the platform poses a heightened risk to minors online. Predators can exploit this feature to “cover their tracks”, making it difficult for users to gather evidence or report inappropriate behaviour. Despite some safety features within the app, such as reporting tools and the ability to block contacts, Snapchat has been criticised for its lack of support and minimal effort to improve moderation techniques that could prevent predatory behaviour, cyberbullying, and exposure to harmful content.
Methodology
This investigation was carried out by creating a dummy underage profile on Snapchat and interacting with random users who had added the account, simulating real-world exposure to potential risks. These methods followed ethical guidelines, ensuring no personal harm was caused during the investigation. All communication with users was carried out under the premise of safety testing, with emphasis on observing interactions and the effectiveness of Snapchat’s safety features in real time.
Key steps of the method included:
· Creating an underage profile.
· Engaging with random users to observe the nature of messages and interactions.
· Testing Snapchat’s moderation features such as content filtering, blocking and reporting.
· Documenting the findings while ensuring no actual underage user was harmed or exploited.
Ethical Disclaimer
This investigation followed all legal and ethical guidelines, including consent and privacy protection for all individuals involved. The research was designed to simulate potential online risks in a controlled and ethical manner, ensuring that no illegal activities were conducted or encouraged during the investigation.
What We Found
Within just three hours of creating an account, we received multiple unsolicited messages from strangers, demonstrating how easily predators or inappropriate contacts can reach and attempt to engage with minors. Many of these messages included intrusive personal questions, requests for explicit content, and even attempts to arrange in-person meetings.
The anonymity the platform offers makes it easy for users to create “disposable” accounts, allowing them to avoid accountability when sending or requesting explicit images and messages. This lack of traceability has created an environment where inappropriate behaviour can escalate with minimal consequences.
Snapchat offers a social feature known as Spotlight, a short-form video feed similar to TikTok. We noticed that our dummy account was being recommended sexualised content, including women in minimal clothing promoting their OnlyFans pages or posing suggestively to “show off” their bodies.
Patterns revealed a high frequency of contact from strangers, with some users immediately sending sexually suggestive content and images or attempting to manipulate the dummy profile into sharing personal information. In one disturbing instance, a 22-year-old male offered to pick up our dummy profile in his car, explicitly stating that the minor’s parents “don’t need to know”.
While Snapchat provides reporting tools, there were instances where reported content was not addressed promptly, or action came only after our dummy profile had already been exposed to explicit images and conversations. Blocking also proved ineffective as a deterrent: blocked users could simply continue sending messages from a new profile.
Another concerning issue we discovered was Snapchat’s support options. Although the company offers “guidance” on its website, it is very difficult to find an email address or form for reporting serious cases of child exploitation. Instead, Snapchat expects users to contact its support account on X (formerly Twitter). Given X’s ‘free speech’ moderation stance, it is not an app children can safely use to contact Snapchat support about an ongoing issue. This approach not only lacks privacy, since users are expected to post a public tweet mentioning the account, but also creates barriers for younger users or parents seeking timely help. Relying on another social media platform for customer service, especially in cases involving potential exploitation or abuse, raises serious concerns about how Snapchat prioritises the safety of its userbase.
On the other hand, Snapchat offers its ‘Family Centre’ parental controls so parents can monitor their child’s activity on the platform and see who they are interacting with. However, the Family Centre only shows parents who their child last spoke to and who their friends are on the app. Although this is a welcome step towards keeping children safe, parents should have more control, such as the ability to read their children’s chats; when a parent is reviewing conversations their child is involved in, child safety matters more than ‘privacy’.
Throughout the investigation, we actively reported accounts that engaged in inappropriate, predatory, or explicit behaviour. Despite our efforts, only around one in four reported accounts received any form of acknowledgement. When Snapchat did reply, the message read: “We wanted to let you know that we have investigated your report and found that the account violates our Community Guidelines or Terms of Service. Your keen eye is helping us to keep the community safe.” While this confirms that some action was taken, the response lacked any transparency about what steps were taken to prevent repeat offences, or whether the user was banned, suspended, or warned. The remaining reports were either ignored or left unresolved, suggesting that harmful behaviour is not consistently addressed and leaving serious concerns about Snapchat’s ability, or willingness, to protect its younger users.
Notable User Interactions
Example 1 – Unsolicited Contact Within 15 Minutes
User A:
“Hey”
“Wanna see”
“My [Male Genitalia]”
[Note: User A was blocked and reported.]
Example 2 – Attempt to Arrange an In-Person Meeting
User B:
“I should come see you sometime… “
[…]
“Maybe I could come down one night and take you for a drive”
Dummy Account:
“I don’t think my mum would be okay with that.”
User B:
“It’s okay she doesn’t have to know “
[…]
“I’ll drive and play with your hair while you be my passenger princess”
Dummy Account:
“I don’t really want too though.”
User B:
“It’s okay we can go somewhere where no one will know where we are “
[…]
“I’ll take you back home after I promise”
[Note: User B was blocked and reported.]
These examples represent just a small fraction of the hundreds of unsolicited interactions we witnessed within the first 24 hours. The volume and frequency of these encounters were disturbing, with a large number involving adult users who either requested explicit images from what they believed to be a minor or sent graphic content without any form of consent.
Many of these individuals escalated their conversations, often within just a few messages, demonstrating an intent to exploit. The speed at which these interactions occurred highlights a concerning vulnerability in the platform’s user-protection procedures. This pattern was not limited to a few cases; it reflected a widespread issue that points to serious failures in the platform’s content moderation and child safety systems.
Risk Analysis
Minors exposed to similar interactions and conversations would likely face trauma, emotional distress, potential grooming/radicalisation, and exposure to explicit content. The anonymity on Snapchat increases the risks, as minors may feel pressured to comply with inappropriate requests. Without intervention from the platform, these risks can lead to long-term harm, including increased vulnerability to exploitation.
Platform Recommendations
To improve Snapchat’s ability to keep users safe, especially children, the following recommendations are made:
· Implement stronger content filtering that detects and blocks inappropriate content before it reaches accounts registered to users under 18.
· Introduce monitoring tools that automatically flag suspicious behaviour or messaging patterns (a minimal sketch of this idea follows this list).
· Improve reporting response times to ensure faster intervention.
· Expand parental controls to allow better monitoring of underage accounts.
· Ensure Spotlight does not suggest sexualised content to accounts aged under 18.
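To illustrate what the automated flagging recommended above might look like, here is a minimal, hypothetical sketch in Python. This is not Snapchat’s implementation: every pattern, field, and threshold below is an assumption, and a real system would combine signals like these with machine-learning classifiers, age verification, and human review.

```python
import re
from dataclasses import dataclass
from typing import Optional

# Hypothetical rule-based flagger illustrating the "automatically flag
# suspicious behaviour or messaging patterns" recommendation.
# All patterns and field names are illustrative assumptions.
SUSPICIOUS_PATTERNS = [
    r"\b(doesn'?t|don'?t) (have|need) to know\b",  # secrecy pressure
    r"\bsend (me )?(a )?(pic|photo|picture)s?\b",  # image solicitation
    r"\b(meet up|pick you up|come see you)\b",     # in-person meeting attempts
    r"\bhow old are (you|u)\b",                    # age probing
]

@dataclass
class Message:
    sender_age: Optional[int]  # None if the sender's age is unverified
    recipient_age: int
    text: str

def flag_message(msg: Message) -> list[str]:
    """Return the reasons a message sent to a minor should be escalated."""
    reasons = []
    if msg.recipient_age < 18:
        if msg.sender_age is None:
            reasons.append("unverified sender contacting a minor")
        elif msg.sender_age >= 18:
            reasons.append("adult-to-minor contact")
        for pattern in SUSPICIOUS_PATTERNS:
            if re.search(pattern, msg.text, re.IGNORECASE):
                reasons.append("matched pattern: " + pattern)
    return reasons

# Example based on an interaction documented in this report:
flags = flag_message(Message(sender_age=22, recipient_age=14,
                             text="It's okay she doesn't have to know"))
print(flags)  # adult-to-minor contact + secrecy-pressure pattern
```

In a design like this, a secrecy-pressure match on an adult-to-minor conversation could automatically hold the chat for review rather than waiting for the minor to report it, which would directly address the slow report handling we observed.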
What Can Parents Do?
While accountability on behalf of the platform is crucial, parents and guardians play a vital role in safeguarding their children online. Snapchat's disappearing messages and privacy-focused design present challenges for parents; however, parental involvement can significantly reduce the risks minors face. We strongly suggest enabling Snapchat's Family Centre features to bridge the gap between you and your child.
Conclusion
This investigation has revealed several key vulnerabilities in Snapchat’s platform that expose minors to danger online. While some safety features exist, they are insufficient to provide adequate protection from harmful interactions. It is important for Snapchat and similar platforms to prioritise user safety, especially for younger audiences, by improving current moderation systems and implementing more robust protective features.