Safety Center

Three helps people find and develop new friendships. We have a zero-tolerance policy for cyberbullying and threats, and we take extraordinary steps to keep our community safe.

24/7 Moderation Team

A team of human moderators works 24/7 to review reports submitted by users. Any user who acts inappropriately or violates our Community Guidelines is immediately banned from the app.

Machine Learning

We use state-of-the-art machine learning technologies to identify users who violate our policies or engage in inappropriate behaviour on Three.

People are given the power

Three is self-governing. If a user sees something inappropriate on Three, they can tap the report button and our moderation team will review it within minutes.

Community Guidelines

Our community guidelines outline what content is deemed inappropriate, including content that is explicitly sexual, violent, or intended to promote physical or emotional harm to an individual or group.

Bullying and harassment

Bullying can be really hurtful, whether it happens online or in the real world. On social media it can take many forms, including nasty comments, embarrassing photos, exclusion, and low ratings or ‘dislikes’. Online bullying can be particularly upsetting because it can happen 24/7 and often takes place in front of large peer audiences.

It’s never OK to bully someone on Three. In our community guidelines, we ask our users to respect one another and we make it clear that they shouldn’t intimidate, threaten or harass anyone.

If we see bullying or trolling taking place, we warn the perpetrator about their behaviour and might take further action, such as suspending or deleting their account if necessary.

Inappropriate language

Young people often test boundaries with the language they use on the internet, whether it’s swearing or using emojis that have hidden sexual meaning. In fact, one of the challenges our Three moderators face is that the way teenagers communicate changes all the time.

As well as using swear words and other explicit language, young people often use certain phrases, acronyms and emojis as online code – something that looks harmless could actually be inappropriate for under-18s.

We monitor the language of our users and take action where appropriate. For example, we’ve banned the use of the aubergine emoji in profile names (it is widely used as code for the word ‘penis’). If inappropriate language or emojis are used in the title of a live stream (called a Live), we message the user and give them one minute to change it before we close their live stream down.

Grooming & Sexual Exploitation

Unfortunately, some people go online to target children and teenagers for the wrong reasons, such as to groom them for sexual abuse.

Three has a zero-tolerance policy for users promoting and/or distributing pornographic content. Saving, requesting, or threatening to post explicit content is strictly prohibited. We also enforce an uncompromising policy on content involving minors: any indication of child sexual exploitation is promptly reported and turned over to the authorities.

Underage users (must be 17+)

The age rating of Three is 17+. Users under the age of 17 are not allowed on the app, and minors who attempt to sign up on our platform will be blocked. Help keep Three safe by reporting users you suspect are minors, tapping "underage" as the reason for reporting. All accounts found violating this guideline will be terminated.

Nudity & Sexting

Flirting, exploring sexual feelings and having relationships – a natural part of growing up – increasingly happens online.

For teenagers surrounded by sexual images and behaviour in the media, it can be difficult knowing where to draw the line when sharing their own photos and videos. For example, if they see images of celebrities posing in their underwear, they might believe it's OK to do the same.

Some young people think that sharing nudes (often called sexting) will get people's attention or make them more popular. In some cases, they might be pressured into it by boyfriends, girlfriends, friends or even complete strangers, and not realise that sharing such personal images could put them at risk of sexual exploitation. They might find that their photos and videos are passed on to other people without their consent. When this happens after a relationship ends and a former partner wants to hurt them, it is known as revenge porn; when the images are used to blackmail the victim, it is known as sextortion.

Furthermore, young people could be breaking the law by taking, sharing or possessing these images.

To help protect our younger users, nude and sexual images are not allowed on Three and we take steps to remove them as soon as we are aware of them.

Fake profiles & Scams

On social networks, people sometimes pretend to be someone they are not. They set up fake profiles and try to dupe other users into chatting with them.

Catfishing and other online scams can be disconcerting and upsetting for young people and could even put them at risk of identity theft and sexual exploitation.

Any attempt to impersonate others is strictly prohibited. This includes any type of deception about who you are, whether it's lying about your age or name, using someone else's photos, or any other form of misrepresentation. Help keep Three safe by reporting users you suspect are pretending to be someone they're not: tap report and select "spam" or "underage" as the reason for reporting. All accounts found violating this guideline will be terminated.

Self-harm & Suicide

Sadly, the pressures of growing up can sometimes get too much and some young people experience depression, low self-esteem, questions about their sexual identity and other issues.

Being part of an online community can be positive in many ways, enabling teens to share their experiences and feel less alone. In some cases, however, young people try to encourage others towards harmful behaviour such as disordered eating, self-injury and suicide.

For the safety of our users on Three, we remove any posts on these topics and provide details of helplines so that young people can get the support they need.

Drugs and illegal activity

Violence, drugs, racism, homophobia, violent extremism, pornography, gambling, criminal activity... your child or students might come across things online that they should not see. It could be unsuitable for their age or maturity, inaccurate, offensive or even unlawful.

Our 17+ age limit helps to protect younger children. Furthermore, we don’t tolerate harmful and illegal content on Three and we take it down as soon as we become aware of it. If the content could be breaking the law, we refer it to the authorities.


Body shaming & Self-esteem

Some teenagers experience body shaming, bullying and other forms of humiliation online.

In a world of Photoshopped images of celebrities and large social media audiences, there’s even more pressure for teenagers to look and behave a certain way and be part of the right crowd.

It’s our aim for Three to be a community in which young people feel confident, happy and secure so there’s no place for any hurtful comments, bullying or any other behaviour that could affect someone’s self-esteem. If we find out this is happening, we contact the perpetrator and might decide to suspend their account.

Peer Pressure

From just-for-fun pranks to more serious dares, teens often find themselves under pressure from others online.

It’s not always easy to say no but it’s important that young people understand the consequences of their actions. What might seem like a bit of fun could upset or embarrass them or someone else and could break the Three rules or even the law.

Our community guidelines make it clear that certain things are not OK on Three. For example, if someone pressures another user into sharing nude images or into harassing someone else, we can remove the content and suspend the perpetrator's account.


Spam & Bots

Spam, bot activity, and deceptive practices are not allowed on Three. To enforce this policy, we do not allow images or links of any kind to be sent in Direct Messages. Spam and bot-like messages are also flagged by our system and blocked before they reach the end user.

Help keep Three safe by reporting users and content that you suspect are spam by tapping report and selecting "spam" as the reason for reporting. All accounts found violating this guideline will be terminated.


Contact & Support

If you are a user or parent and need help using Three, or help with an issue you are experiencing on the app, you can use our contact page to reach out to us. If you believe you have witnessed something illegal on Three, please use our contact page to let us know. We work with local and federal law enforcement agencies to resolve situations expeditiously.

If you are experiencing bullying on Three, we encourage you to report the user. If you are in the United States and you are in crisis or emotional distress, you can also text HELLO to 741741 to chat with a live, trained crisis counselor at Crisis Text Line. This service is free and available 24/7.

Useful websites

There is lots of information available online to help young people navigate their digital world. We’ve provided links to third-party articles throughout the Advice section of our Safety Center, and you can also find details of some of the most useful English-language websites about online safety below.


Australia

Bullying. No Way!
Kids Helpline
Office of the eSafety Commissioner
The Line

Canada

Canadian Centre for Child Protection
Canada Safety Council
Get Cyber Safe
Protect Kids Online

Europe

Better Internet for Kids

Ireland

Belong To

United Kingdom

Get Safe Online
Internet Matters
Internet Watch Foundation
Parent Info
Revenge Porn Helpline
The Diana Award
UK Safer Internet Centre
Young Minds

United States

Be Internet Awesome
Common Sense Media
Crisis Text Line
Family Online Safety Institute
National Center for Missing and Exploited Children (NCMEC)
National Suicide Prevention Lifeline
The Trevor Project