Supporting a safe online experience for young people

March 18, 2022


Everyone has a right to feel safe and respected, and all of us, including companies like Meta, have a role to play in stopping bullying both online and offline. The National Day of Action Against Bullying and Violence is an opportunity for schools, students, caregivers, community groups, companies, and the government to come together to discuss effective solutions to bullying.

As one of the world’s largest technology companies, we know we have a responsibility to keep young people safe. Bullying and harassment are forms of abuse that have unfortunately always existed when people interact with one another. While this challenge isn’t unique to social media, we want to ensure we’re taking steps to remove this type of content from our platforms and to protect people from harmful content wherever possible.

Our work to address bullying and harassment 

To that end, we’ve created rules that define what we will and won’t allow on our platforms. We are continually increasing our investment in new technologies and tools to prevent people from being exposed to bullying and harassing content in the first place, and to give them the ability to easily block and report it if they are.

When it comes to bullying and harassment, context and intent matter. Bullying and harassment are often very personal and can show up in different ways for different people, from making repeated and unwanted contact to making public threats or threatening public figures. This is why we are constantly looking at how we can update our policies to respond to new issues or challenges that people may experience, and we don’t make these policy and product decisions alone. We use insights and feedback from more than 800 safety partners globally, including Australian safety organisations, experts, and academics.

In recent years, there has been an increase in abuse directed at public figures, particularly women. In 2017, we began consulting on what more we could do to better protect public figures from abusive commentary that isn’t connected to public debate. Listening to that feedback, we have made changes to our bullying and harassment policy to help protect people from mass harassment and intimidation, and we now offer more protections to public figures, particularly women, people of colour and members of the LGBTQI community, who we know can face increased scrutiny. We also developed new tools to turn off comments on a post, limit who can comment, and prevent hateful messages, whether text or emojis, from being sent.

Holding ourselves accountable 

How we spot, act on, and remove this content is core to keeping it off our platforms, and investing in safety and security helps us achieve this.

Last year, we spent US$5 billion (AU$7 billion) on safety and security. This allows us to use a combination of reports from our community, human review, and artificial intelligence technology to enforce our policies. We acknowledge it can sometimes be difficult for our systems to distinguish between a bullying comment and a light-hearted joke without knowing the people involved or the nuance of the situation, and using technology to proactively detect this content can be more challenging than for other types of violations. However, we are making progress.

Between October and December last year, we estimate that out of every 10,000 views of content on Facebook, 10 to 11 views were of bullying and harassment content. This is lower than in the previous reporting period, but we know there is more to do.

We are working hard to reduce this type of content on our platforms, but we also want to equip our community with tools to protect themselves from potentially offensive content in ways that work best for them. To support people’s experience on our platforms, we’ve built tools into our apps to prevent, stop, and report bullying and harassment online.

Early indications show these tools are working. Last year we introduced a new warning tool on Instagram that detects when someone is about to post a comment that could include hateful or harmful language. Since the launch of this tool, 50% of people have either changed what they were going to post or not posted anything at all.

We can't tackle bullying alone 

Bullying, harassment, and violence are long-standing social issues that predate the internet and social media, so it’s important that we continue to work with others to address these challenges as a whole of society. To that end, we have partnered for many years with safety organisations like PROJECT ROCKIT, which serves on our Global Safety Advisory Board, and together we have trained more than 25,000 young Australian students in anti-bullying workshops to date.

We know there is more to do and we’ll keep making improvements, but this is a challenge that goes beyond any individual company. It requires multi-stakeholder, cross-industry collaboration and consistent rules for the internet that apply to all platforms.

Australia has been actively developing new regulations focused on digital platforms such as Meta. The issue is no longer that digital platforms are unregulated in Australia, or that the risks of the internet are unexamined, particularly in the area of safety. Instead, the main priority moving forward must be ensuring that existing regulations are effective in improving the safety of Australians online.

We believe that a collaborative, industry-led process could assist with developing metrics to better understand whether regulations and industry efforts, including our own, have been genuinely effective. At Meta, we will continue to work with the Government and our industry peers to help make the new framework as effective as possible and to bring additional oversight and confidence to the industry’s work to keep users safe.
