Meta’s annual transparency report on Australia’s disinformation and misinformation industry code

Josh Machin, Head of Public Policy, Meta Australia - 30 May 2022

 

Meta is proud to be a founding member and signatory to the Australian industry code on disinformation and misinformation. In 2021, Meta opted into every commitment under the code, and our 2021 transparency report outlined 43 specific commitments to meet our obligations across both Facebook and Instagram. 

 

Today, Meta has published our 2022 transparency report, detailing how we met our 2021 commitments and further enhancing transparency. The report also includes Australia-specific data outlining the measures that Meta has taken to combat disinformation and misinformation, including:

 

  • In 2021, we removed over 11 million pieces of content from Facebook and Instagram globally for violating our Community Standards in relation to harmful health misinformation. Over 180,000 of these pieces of content were from Pages or accounts specific to Australia.

  • We have made a COVID-19 Information Centre available around the world to promote authoritative information to Facebook users. Over 350 million people globally visited the Information Centre in Q4 2021; over 3.5 million of them were Australians.

  • From the beginning of the pandemic to June 2021, globally, we have displayed warnings on more than 190 million pieces of content on Facebook that our third-party fact-checking partners have rated as false, partly false, altered or missing context.

 

These results follow a number of steps we took in Australia to combat disinformation and misinformation, including:

 

  • Securing an additional third-party fact-checking partner, RMIT FactLab, to join Australian Associated Press and Agence France Presse as our fact-checking partners.

  • Expanding our ad transparency requirements to cover social issue advertising since June 2021, well before the Australian federal election.

  • Providing millions of dollars worth of advertising credits to federal, state and territory governments and NGOs, to increase access to COVID-19 vaccine and health information.

  • Launching two national media literacy campaigns: the consumer-focussed campaign with Australian Associated Press called ‘Check the Facts’, and a creator-focussed awareness campaign with First Draft called ‘Don’t be a Misinfluencer’.

  • Launching the Climate Science Information Centre on Facebook to provide Australians with authoritative information on climate change.

  • Supporting civic participation in the federal election campaign via a package of measures to combat misinformation and promote authoritative information about voting.

 

Meta continues to build on our approach to disinformation and misinformation. In our 2022 transparency report, we confirm 45 specific commitments that we will put in place over the next annual reporting period, including the following new commitments:

 

  • Meta will continue to provide greater transparency of our content ranking algorithms and give users more control over the content they see.

  • Meta will continue to add new functionality to the Ad Library to encourage scrutiny and transparency of political and social issue advertising.

  • Meta will fund training for Australian journalists on how to identify and avoid amplifying mis- and disinformation.

  • Meta will focus on new areas of research relating to disinformation and misinformation in 2022, including media literacy of First Nations peoples.

 

The industry code provides an effective framework to increase transparency of companies’ efforts to combat misinformation and disinformation. We look forward to continuing to work with Australian policymakers, civil society, academics and experts on how we can evolve our work moving forward. 

 

Read Meta’s response to the Australian disinformation and misinformation industry code in further detail here.

Have Your Say with New Federal Election Stickers on Instagram

Josh Machin, Head of Public Policy - Australia – May 9, 2022

  • To coincide with the launch of early voting for the election, Instagram is launching three election-themed Stickers today to commemorate the voting process and people’s participation
  • The Stickers will also link to the Australian Electoral Commission (AEC) website containing up-to-date election and voting information
  • This announcement is part of our ongoing work to support our community and celebrate Australia’s democratic spirit

With early voting beginning today in Australia, we’ll be adding a splash of fun and colour with a new collection of election-themed Stickers on Instagram, celebrating Australian democracy and amplifying Australians’ democratic voices. The Stickers will also link to the Australian Electoral Commission website for more information on the election and voting process.

Australians share over one million Instagram Stories every day. In the lead-up to the election, we’re encouraging our Instagram community to use the stickers to promote civic participation and share authoritative information about voting.

The Stickers were designed by iconic Australian artist and designer Stavroula Adameitis (aka Frida Las Vegas), who took inspiration from Australian iconography. We’re excited to continue our support of local communities, creators and artists like Frida, who created the artwork for this year’s election Stickers.

Australian artist and designer Stavroula Adameitis said, “I’m stoked to collaborate with Instagram on this sticker series celebrating Australia’s right to vote in the 2022 Federal Election. It’s important that Australians have our voices heard whilst playing with these bright, bold and fun graphics to share the message that every single vote counts!”

This announcement comes in addition to all the work we’ve done during the election campaign to date to help promote election integrity in Australia. You can read more about our comprehensive approach here.

One of our key priorities has been ensuring a close working relationship with the AEC, to support their work. The Stickers will link people to the AEC website for more authoritative information on the election and voting process.

Tom Rogers, Australian Electoral Commissioner said, “We’ve been working closely with Meta in the lead up to, and during, the 2022 federal election. We’re very excited to see Australians using election-themed Instagram stickers and other Meta initiatives as a way to express positivity and pride about participating in Australian democracy.”

Say G’day to the 2022 Instagram election-themed Stickers

  • Iconic Australian snacks, featuring the Pie and Vovo
  • Democracy sausages
  • A ‘Have Your Say’ shaka

 

 

How to add an Election Sticker to your story

  1. Open the Stories camera and take a photo
  2. Tap on the Sticker Tray icon
  3. Tap on the desired Sticker
  4. Once the Sticker is on your photo, tap it to cycle through the three iconic Aussie designs, including the Democracy Sausage.

All three Stickers will be visible at the top of the Instagram Stories Sticker Tray from 9 May.

Update on Meta’s work to support the 2022 Australian election

Josh Machin, Head of Public Policy - Australia – May 6, 2022

 

Meta has had a comprehensive strategy in place over the course of the Australian election campaign to combat misinformation, voter interference and potentially harmful content on our platforms. We shared details about our plans in a March 2022 blog post and in numerous roundtables with journalists, government agencies, law enforcement and security agencies (including via the Australian Government’s election integrity assurance taskforce), and academics.

In particular, we appreciate the close working relationship with the Australian Electoral Commission, which has been referring content to us that it believes may violate Australian electoral law or represent voter interference. We are also continuing our collaboration with civil society organisations such as the Australian Strategic Policy Institute (ASPI) and First Draft.

This week, we and other tech companies received further questions from Australian academics – coordinated by a technology-focussed lobby group – regarding the steps we are taking in the months before the Australian election. We agree that it is important for digital platforms to be transparent and accountable to the Australian public. In that vein, we are publishing responses to the questions below.

We will continue to provide transparency about our efforts to combat misinformation in our next annual transparency report, required under the voluntary industry code on misinformation and disinformation and due in May 2022. Although the reporting period covers 2021, we recognise the level of interest in the steps we are taking to promote the integrity of the Australian election, so we will voluntarily include further information in that report. You can find our previous report here.

We remain committed to engaging with Australian academics and experts on these important policy issues.

 

 

1. How many dedicated human content moderators will be bolstering your AI enabled system specifically for the Australian election?

In addition to the significant investments Meta has made in technology to proactively identify harmful content, we have more than 40,000 people who work on safety and security at Meta (about 15,000 of whom are dedicated content reviewers). As issues arise during the Australian election, we have the benefit of being able to draw not just on the cross-functional team dedicated to the Australian election, but also on any members of our global safety and security teams, depending on the expertise and skills required. Another benefit of investing so heavily in safety and security is that we have 24/7 support; even while Australia sleeps, our safety and security teams in other timezones are able to review content that could be harmful and impact the Australian election.

2. What languages do these content moderators speak? (for instance, the top five spoken languages in Australian homes other than English are Cantonese, Mandarin, Italian, Arabic and Greek)

The benefit of drawing from our global investment in safety and security means that Meta supports content moderation in over 70 languages. We also have a global network of more than 80 third-party fact-checkers who cover more than 60 different languages to combat misinformation. We have run a consumer awareness campaign prior to the Australian election to help Australians spot misinformation and raise awareness of fact-checking. As well as running this campaign in English, we have translated campaign assets into Chinese, Vietnamese and Arabic (the top languages other than English spoken at home in Australia according to the Australian Bureau of Statistics).

In the lead-up to the election, we have received feedback from non-government experts that many non-English speaking diaspora communities in Australia may also use digital platforms from their home countries (see here for example). We continue to encourage policymakers to pay regard to the risks of misinformation and disinformation that can occur in Australian communities on non-English platforms.

3. Where are the content moderators dedicated to the Australian election based? If not in Australia, can their security and integrity be ensured during this time of geo-political instability?

Our safety and security teams are located around the world, with centres of excellence at 20 sites, such as Singapore, Dublin and the United States, ensuring around-the-clock coverage.

4. How has your content moderation system taken into account the nuances of Australian English slang? (Using Australian slang is a common strategy for those seeking to evade detection by content moderation systems on social media.)

This is important not just in the context of the Australian election. We have dedicated teams with deep knowledge and expertise in Australia, including of how language is used in the Australian context. These teams are trained to be across slang words and other colloquialisms.

We always welcome input from local experts and civil society organisations about trends in possible abuse of our platform, and academics are welcome to raise examples with us if they believe there is new or emerging slang that is not properly being accounted for.

5. Who has been consulted in the development of election-related content moderation policies? How will you ensure these policies are adaptive and responsive to the events of the election?

Our election-related policies are global, and we have developed them through our involvement in over 200 elections around the world since 2017. We have engaged extensively with Australian stakeholders about these policies, especially those that have been developed since the 2019 Australian election like our voter interference policies, in the months leading up to the Australian election campaign – including providing transparency in our last report under the voluntary industry code on misinformation and disinformation. No stakeholder has raised concerns with us to date that these policies appear to be inadequate in the Australian context.

6. What AI enabled content moderation system will be deployed during the election, including image recognition technology? What are the error rates?

We provide transparency about our content enforcement approach in our global transparency centre.

7. What provisions have been made to protect communities from foreign interference?

Foreign interference in Australia can occur via a variety of means. In relation to our apps, we take extensive steps to identify and take action against threats to the election, including signs of coordinated inauthentic behaviour, and we block millions of fake accounts every day so they can’t spread misinformation. These are outlined in detail in our last transparency report.

We also require all advertisers looking to run political, social issue and election-related ads to complete an authorisation process, confirming their identity and that they are located in Australia.

In particular, we continue to engage closely with Australian security agencies regarding the overall threat environment for foreign interference in Australia. We continue to monitor this closely over the final weeks of the election campaign.

8. What avenues are in place to enable civil society organisations to flag harmful and false content (beyond the reporting mechanism of the eSafety Commissioner)?

The community has a wide array of mechanisms available to report potentially harmful or false content to us. We have simple in-app reporting for every piece of content. We also have close working relationships with Australian regulators, like the eSafety Commissioner and the Australian Electoral Commission, who are able to quickly and easily refer content to us. We take a ‘no closed door’ approach to content review: we will review any and all content sent to us by an Australian regulator.

We also have built direct relationships with expert civil society organisations who are able to refer content to us directly.

The industry association DIGI maintains a complaints mechanism for any instance of possible breaches of the industry code on misinformation and disinformation.

Finally, any member of the community is able to raise content on our services directly with our third-party fact-checkers for possible fact-checking.

9. In the United States you have shared data about removed coordinated inauthentic behaviour networks with independent researchers. Why have you not implemented this in Australia? (ideally all content and accounts removed under election-related policies should be stored and shared for post-election scrutiny)

The assertion in this question is not correct. In late 2020, we launched a pilot, CrowdTangle-enabled research archive where we’ve shared over 100 recent coordinated inauthentic behaviour (CIB) takedowns with a small group of researchers who study and counter influence operations. The Australian Strategic Policy Institute (ASPI) is one of our initial five key partners for this archive globally, and can study and analyse the networks we’ve removed. We are pleased to see Australian representation in the small number of research organisations selected for the pilot.

10. During the last United States election you labelled posts that were believed to be state-controlled media outlets. Why have you not implemented this in Australia?

The assertion in this question is not correct. We label state-controlled media outlets in many countries around the world, including in Australia.

11. Given that over 20% of Australians speak a language other than English at home, what languages will the third-party fact checks be translated into?

We have a global network of more than 80 third-party fact-checkers who cover more than 60 different languages to combat misinformation.

Fact-checks are available in the language of the original content. A piece of content in a non-English language that arises in Australia is eligible to be fact-checked by other global partners who operate in that language.

12. What non-English language publications will third-party fact checks be provided to?

It’s not clear what is meant by this question. More information about the scope and eligibility of third-party fact-checking on our services is available at our Help Centre.

13. How will the speed of fact checking be measured during the election?

It’s not possible to give a timeframe for how long it takes a fact-checker to verify content after it is posted on Facebook. This is because content is flagged to fact-checkers in a variety of ways, and it is at the discretion of the independent fact-checkers which pieces of content they review. The amount of time it takes to verify a claim and complete a fact-check can also vary, depending on the complexity of the claim being reviewed.

Content is flagged to our third-party fact-checkers to review in several ways:

  • our third-party fact-checkers proactively identify the content themselves
  • our technology identifies potential false stories for third-party fact-checkers to review. For example, when people on Facebook submit feedback about a story being false or comment on an article expressing disbelief, these are signals that a story should be reviewed
  • we also have a similarity detection system that helps us identify duplicates of debunked content beyond what our fact-checkers review directly.

Importantly, once a story is debunked by our third-party fact-checkers, we make it less visible on Facebook and Instagram.

Artificial intelligence plays an important role in helping scale the efforts of our third-party fact-checkers. Once a fact-checker marks a piece of content as false, the warning overlay appears almost immediately. A single fact-check on one piece of content kicks off similarity detection, which helps us identify duplicates of the debunked story and reduce their distribution.

These new posts are then fed back into the machine learning model which helps improve its accuracy and speed.
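The matching step described above can be illustrated with a toy sketch. This is not Meta’s actual system (which operates at a vastly different scale and, per the text, uses machine learning models rather than string comparison); it is a minimal, hypothetical example of similarity-based duplicate detection using only Python’s standard library:

```python
from difflib import SequenceMatcher

def find_debunked_match(post_text, debunked_claims, threshold=0.8):
    """Return the closest previously debunked claim if the new post is
    sufficiently similar to it, otherwise None.

    This stands in for a real similarity-detection system: each new post
    is compared against known debunked claims, and near-duplicates are
    flagged so the same warning label can be applied to them.
    """
    best_claim, best_score = None, 0.0
    for claim in debunked_claims:
        # Character-level similarity ratio in [0, 1]; a production system
        # would use learned text/image embeddings instead.
        score = SequenceMatcher(None, post_text.lower(), claim.lower()).ratio()
        if score > best_score:
            best_claim, best_score = claim, score
    return best_claim if best_score >= threshold else None

debunked = ["5G towers cause illness"]
print(find_debunked_match("5g towers cause illness!", debunked))  # near-duplicate: matched
print(find_debunked_match("totally unrelated post", debunked))    # no match: None
```

The threshold and the comparison function are arbitrary choices for illustration; the point is simply that one human fact-check can fan out automatically to many copies of the same claim.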

14. How will the reach of fact checking be measured during the election (i.e. how many people have viewed the content and their demographics)?

Once a piece of content is found to be false, we apply a warning label to it so it is not possible to see the content without clicking past the label. Once a warning label is applied, 95% of people on Facebook choose not to click through. This dramatically reduces the number of people who see the content.

15. How will your response to fact checking be measured during the election? (i.e. content take down, or reporting to the appropriate authority)

Our approach to combatting misinformation is comprehensive, and broader than simply content takedowns or third-party fact-checking. Meta has led the industry in transparency under the voluntary industry code on disinformation and misinformation, and we continue to explore what integrity data we can make available after the election in the interests of transparency.

We have expanded our third-party fact-checking program in Australia to include RMIT FactLab, who are joining our existing partners Agence France Presse and Australian Associated Press. We have also provided one-off grants to all our fact-checkers to increase their capacity in the lead-up to the election.

16. During the last United States election, Facebook’s algorithm was adapted to reduce the distribution of sensational and misleading material, prioritising content from authoritative sources. Why has this measure not been implemented in Australia?

The assertion in the question is not correct. We provide transparency around how our ranking and recommendation algorithms work via our Content Distribution Guidelines and Recommendation Guidelines. As these policies outline, we reduce the distribution of sensational and misleading material at all times, not just in the lead-up to election campaigns.

We are also taking steps to promote authoritative information about the election, by providing prompts in the Feed of every Australian to direct them to the Australian Electoral Commission’s website.

17. During the last United States election, the distribution of live videos related to the election was limited. Why has this measure not been implemented in Australia?

While we learn lessons from each prior election, no two elections are the same. Working closely with elections authorities and trusted partners in each country, and evaluating the specific risks ahead of each election, we make determinations about which defenses are most appropriate. In the lead-up to each election, we monitor the threats on our platform and respond accordingly.

We have strong integrity measures to prevent abuse of products like Facebook Live at all times, not just during the Australian election. Our Community Standards and third-party fact-checking initiatives apply to livestreaming just as they do to other types of content on our services.

18. Google has restricted the targeting for election ads in Australia. Has Meta considered this? If so, why has it not been implemented?

Digital platforms provide a range of different services, which may lead to different assessments about the best approach to integrity measures. At Meta, our approach is grounded in industry-leading transparency for political and social issue ads in Australia. We require these advertisers to go through an authorisation process, to add a disclaimer, and to agree to their ads appearing in the Ad Library for seven years after they run. We are committed to providing transparency of these ads that appear on our services.

19. During the last United States election, Meta implemented changes to ensure fewer people saw social issue, electoral and political ads that had a “paid for by” disclaimer. Why has this measure not been implemented in Australia?

The assertion in this question is not correct. In response to community feedback, we have been running tests in a number of countries to show political content lower in people’s Feeds. This applies to organic, non-paid content and was announced in the US after the last election.

In 2021, we announced a new feature that gives people more control over the ads they see on Facebook, including the choice to see fewer social issue, electoral and political ads with “Paid for by” disclaimers in Australia.

20. During the last United States election, the creation of new ads about social issues, elections or politics in the last few days of the election was blocked. Why has this measure not been implemented in Australia?

While we learn lessons from each prior election, no two elections are the same. Working closely with elections authorities and trusted partners in each country, and evaluating the specific risks ahead of each election, we make determinations about which defenses are most appropriate. In the lead-up to each election, we monitor the threats on our platform and respond accordingly.

The decision on whether Australia’s blackout period for electoral ads should be extended to digital platforms is a choice for policymakers. We have consistently said over many years we support extending this requirement to digital platforms.

21. What measures do you have in place to screen the placement of ads to ensure all political ads are properly identified and labelled by the advertiser?

We take a number of steps to detect ads that should be classified as political or social issue ads but have not been correctly categorised by the advertiser. We do not make information about these detection steps available, to avoid providing information to bad actors on how to evade our policies. This review process may include the specific components of an ad, such as images, video, text and targeting information, as well as an ad’s associated landing page or other destinations, among other information.

Australian regulators and civil society are welcome to refer ads to us that they believe should be categorised as political or social issues ads and do not have this disclaimer.

22. Beyond the Ad Library, will Meta be making available a comprehensive public archive of all sponsored political content, including targeting data and aggregated engagement statistics by target audiences (accessible by API)?

Meta already makes available a comprehensive public archive of all political and social issue ads on our services, via the Ad Library. The Ad Library provides industry-leading transparency, including metrics such as estimated audience size, impressions, and statistics of the target audience (such as location and gender of the audience). This is also already available via an API to approved academics and experts.
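For context on how programmatic access works, the Ad Library API is queried through a standard Graph API endpoint. The sketch below shows, under the assumption of an approved researcher access token, roughly how a query URL might be constructed; the API version and field names are illustrative and should be checked against the current Ad Library API documentation:

```python
import urllib.parse

# Graph API endpoint for the public Ad Library archive (version is illustrative).
AD_LIBRARY_ENDPOINT = "https://graph.facebook.com/v13.0/ads_archive"

def build_ad_library_query(search_terms, access_token, countries=("AU",),
                           fields=("page_name", "ad_creative_bodies",
                                   "ad_delivery_start_time", "impressions")):
    """Construct a GET request URL for political/social issue ads.

    `access_token` must come from an approved Ad Library API account;
    the field list here is an assumption for illustration.
    """
    params = {
        "search_terms": search_terms,
        # The API expects a list-style country filter, e.g. ['AU'].
        "ad_reached_countries": "['" + "','".join(countries) + "']",
        "ad_type": "POLITICAL_AND_ISSUE_ADS",
        "fields": ",".join(fields),
        "access_token": access_token,
    }
    return AD_LIBRARY_ENDPOINT + "?" + urllib.parse.urlencode(params)

url = build_ad_library_query("election", "YOUR_ACCESS_TOKEN")
print(url)  # fetch with any HTTP client to receive a JSON page of matching ads
```

Actually issuing the request (and paginating through results) requires the approval process described above; this sketch only shows the shape of a query.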

23. What ‘break glass’ measures are on standby during the election?

We already maintain a range of systems to help protect the integrity of elections on our platforms. We continue to monitor threats on our platform and respond accordingly. At this stage, we have not been advised by law enforcement or intelligence agencies that the risk of real-world harm is high.

24. What type of event (in terms of reach and impact) would trigger the implementation of ‘break glass’ measures?

See above.

How Meta is Preparing for the 2022 Australian Election

Josh Machin, Head of Policy (Australia) – March 15, 2022

 

  • Meta has been preparing for this year’s Australian election for a long time and will be using a comprehensive strategy to combat misinformation, election interference, and other forms of abuse on our platforms.
  • We’re expanding our third-party fact-checking program in Australia to include RMIT FactLab. They’ll join Agence France Presse and Australian Associated Press to review and rate content. We’ll also be providing one-off grants to all our fact-checkers to increase their capacity in the lead up to the election.
  • We’re also working with the Australian Associated Press to re-run the “Check the Facts” media literacy campaign in three other languages – Vietnamese, Simplified Chinese and Arabic – to extend the benefits of the campaign even further.

With the Australian election set to take place in the coming months, Meta has been preparing for it for a long time. We’ve been involved in more than 200 elections around the world since 2017, and we’ve learned key lessons from each one about where to focus our teams, technologies and investments so they will have the greatest impact. Here are some of the ways we’re already promoting safety and integrity across our platforms ahead of this year’s Australian election.

 

Combating Abuse, Misinformation, and Election Interference Using a Comprehensive Approach  

We’ve made significant investments in safety and security, including approximately US$5 billion (A$7 billion) in 2021 alone, and we now have more than 40,000 people around the world working on safety and security. These investments have built our ability to reduce the likelihood of abuses – including election interference, misinformation and online harms – occurring in the first place, rather than just addressing them after they occur. For this year’s Australian election, this work will include:

 

  • Announcing a new Australian Fact-Checker.
    We know the importance of ensuring Australians have access to reliable information about the election in Australia. Therefore, we are pleased to announce the expansion of our third-party fact-checking program in Australia today. From March 21, RMIT FactLab will join Meta’s global independent fact-checking program in reviewing and rating the accuracy of content in the lead up to the 2022 Australian Election, alongside Agence France Presse and the Australian Associated Press. We’ll also be providing one-off grants to all our fact-checkers to increase their capacity in the lead up to the election. Our fact-checkers work to reduce the spread of misinformation across Meta’s services. When they rate something as false, we significantly reduce its distribution so fewer people see it. We also notify people who try to share something rated as false and add a warning label with a link to a debunking article.

    • Russell Skelton, Director of RMIT FactLab said, “We see this as a really important public service. If we can play a role in preventing the dissemination of misinformation on social media that has the potential to mislead or harm, then we see that as providing a really critical service.”
    • We are also working with expert organisations such as First Draft to increase monitoring for misinformation in the lead-up to the election and to publish related analyses and reporting on their website. First Draft will also be providing pre-election training for Australian journalists on how to identify and avoid amplifying mis- and disinformation.

 

  • Empowering People to Identify False News.
    Since we know it’s not enough to just limit or remove harmful or misleading electoral misinformation that people see, we’re giving people more context about a post so they can make an informed decision on what to read, trust and share. We have invested in dedicated Australian initiatives including:

    • Working with the Australian Associated Press to launch an education and awareness campaign comprising a series of short videos modelling how people can recognise and avoid misinformation by taking simple steps to ‘Check the Facts’. The campaign will run again this April and will include, for the first time, content translated into three other languages – Vietnamese, Simplified Chinese and Arabic – to extend the benefits of the campaign even further.
    • Partnering with First Draft to develop the “Don’t be a Misinfluencer” campaign on Facebook and Instagram to help creators and influencers to share tips for spotting false news.

Security and election interference 

  • Combatting Influence Operations.
    We have specialised global teams to identify and take action against threats to the election, including signs of coordinated inauthentic behaviour across our apps. We are also coordinating with the Government’s election integrity assurance taskforce and security agencies in the lead up to the election. We’ve also improved our AI so that we can more effectively detect and block fake accounts, which are often behind this activity.

    • Since 2017, we have removed over 150 networks around the world including ahead of major democratic elections. We’re working to find and remove any networks of Coordinated Inauthentic Behaviour. We’ll also share publicly what we find.

 

  • Protecting candidates and leaders running in the Australian election.
    In December 2021, we extended our Facebook Protect security program to Australia. The program has been rolled out to those who might be at a higher risk of being targeted online, such as candidates and public officials, encouraging them to adopt stronger account security protections. We will also show a security prompt on Facebook reminding candidates and leaders to turn on two-factor authentication. We are working with the Australian Electoral Commission and political parties to run training sessions for all candidates on our policies and tools and how to keep safe in the lead-up to the election.

 

Encouraging transparency around political, election and social issues ads 

  • Mandatory transparency for political ads.
    Since the 2019 Australian Federal Election, we’ve made a series of changes for advertisers in Australia that want to run electoral, political and social issue ads. Advertisers are now required to go through an authorisation process using government-issued photo ID, and place “Paid for by” disclaimers on their ads. This includes any person creating, modifying, publishing or pausing ads that reference political figures, political parties or elections. It also includes social issue ads that seek to influence public opinion through discussion, debate or advocacy for or against important topics, such as civil and social rights, crime, environmental politics, education or immigration. We launched these requirements for social issue ads last year, before the first possible election date, to ensure they were in place for the Australian federal election. Any political, electoral or social issue ads on Facebook and Instagram that do not have the correct authorisation or disclaimers will be removed from the platform and archived in a public Ad Library for seven years.
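The public Ad Library described above can also be queried programmatically through Meta’s Ad Library API. As a rough sketch only (the Graph API version, field names and exact parameter values below are assumptions, and a valid access token is required in practice), one way to build a query for archived Australian political and issue ads:

```python
# Hypothetical sketch of querying Meta's public Ad Library API ("ads_archive"
# Graph API edge) for political and issue ads that reached Australia.
# The API version, fields and parameter values here are assumptions.
from urllib.parse import urlencode

GRAPH_URL = "https://graph.facebook.com/v14.0/ads_archive"

def build_ad_library_query(search_terms: str, access_token: str) -> str:
    """Build a request URL for archived political/issue ads shown in Australia."""
    params = {
        "search_terms": search_terms,
        "ad_type": "POLITICAL_AND_ISSUE_ADS",  # only these ad types are archived
        "ad_reached_countries": "['AU']",      # ads that reached Australian audiences
        "fields": "page_name,ad_creative_bodies,bylines,ad_delivery_start_time",
        "access_token": access_token,          # hypothetical placeholder token
    }
    return GRAPH_URL + "?" + urlencode(params)

url = build_ad_library_query("election", "YOUR_TOKEN")
# An actual request would require a valid token, e.g.:
#   import requests; data = requests.get(url).json()
```

The `bylines` field corresponds to the “Paid for by” disclaimer that authorised advertisers must attach to their ads.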

 

  • Allowing Australians to control their experience.
    While political and social issue ads play an important role in every election, people have told us they want the option to see fewer of them in their Facebook and Instagram feeds. Last year, we announced a new feature that gives people more control by giving them the choice to see fewer social issues, electoral, and political ads. 

 

Encouraging civic engagement and empowering voters

We want to do more than just prevent abuse on our platforms during elections. We also want to empower people to participate, so for this year’s Australian elections, we’re supporting civic engagement in a number of ways.

 

  • Developing products on the platform to encourage people to vote.
    Closer to election day we will begin running notifications at the top of Facebook’s Feed for people over 18, reminding them to vote, while also connecting them with reliable information about the voting process. We will also be launching Instagram Stories election stickers to celebrate and encourage voter engagement.

Election Day Reminder (example only)


 

  • Informing people about the latest election communications from political parties, journalists and the Australian Electoral Commission.
    We will release a public Live Display on CrowdTangle dedicated to the Australian election. The Live Display will provide real-time coverage of the most recent communications about the election on Facebook and Instagram.

As we get closer to the Australian election, we’ll stay vigilant to emerging threats and take additional steps if necessary to prevent abuse on our platform while also empowering people in Australia to use their voice by voting.

About RMIT 

RMIT FactLab is a research hub at RMIT University dedicated to debunking misinformation online and building critical awareness of its origins and spread. It focuses on social media verification, research and education, and on combating the viral spread of misinformation on social media platforms. RMIT FactLab brings together quality journalism and academic excellence to teach and build awareness of the damaging impact of bad information. It also conducts original research into the digital news ecosystem.

54 Newsrooms and Independent Journalists Receive Funds From AU$15M Facebook Australian News Fund

Andrew Hunter, Head of News Partnerships for Meta Australia – March 1, 2022

 

Meta Australia, in partnership with the Walkley Foundation, is pleased to announce the 54 news publishers and independent journalists who will be awarded AU$5 million from the Facebook Australian News Fund, an AU$15 million three-year investment. The successful recipients represent local and regional newsrooms and independent journalists from across Australia, and serve diverse communities such as First Nations peoples, LGBTQI+ people, regional audiences, young adults and women. The funds will support investment in public-interest journalism, such as rural affairs, local civic journalism and women’s issues, and will also support regional and digital newsroom innovation and economic sustainability.

“As part of our investment in Australian news, we wanted to ensure that smaller, regional, rural and digital newsrooms were supported,” said Andrew Hunter, News Partnerships Lead for Meta Australia. “Funds are going directly towards public interest journalism as well as projects to help newsrooms grow and diversify revenue streams.  We think it’s essential that these newsrooms are economically sustainable so they can continue to tell the stories of their communities.”

The Facebook Australian News Fund is provided in the form of two funds, the Newsroom Sustainability Fund and the Public Interest Journalism Fund. Both funds are administered through our ongoing partnership with the Walkley Foundation, which nominated an independent external judging committee to review applications against the funds’ eligibility criteria, including whether the project contributed to the news organisation’s long-term viability or provided a public benefit. News organisations that already had a content licensing agreement in place with Meta were not eligible to apply for the funds.

“Money is going directly to the projects across Australia that our judges considered the most worthy during a rigorous arms-length assessment process,” said Shona Martyn, CEO of the Walkley Foundation. “We received a total of 169 applications in this round. During the stringent judging process, budgets proposed by the applicants were analysed line-by-line. By offering partial funding to some projects, we ended up funding more applicants than originally envisaged. The two judging panels were looking for real need, creativity and detailed plans that could be guaranteed to deliver. The diversity of the successful applicants and the spread of awards across Australia is both encouraging and inspiring.”

“This week’s devastating floods in Queensland and northern New South Wales illustrate the importance of having a strong on-the-ground media presence in regional locations who have an intimate understanding of the area and its people,” Martyn added.

The Newsroom Sustainability Fund will allocate AU$2.5 million in grants of up to AU$250,000 to fund regional newsrooms and digital-first publications. Funding will go towards innovation and revenue-generating projects, such as subscription paywalls and membership program development.

The Public Interest Journalism Fund will also allocate AU$2.5 million, providing grants of up to AU$120,000 to small, regional publishers and independent journalists to fund news projects of public-interest value and to encourage media diversity. We are committed to providing support for Australia’s underserved communities.

Newsroom Sustainability Fund recipients

  • 2SER: Content publishing and fundraising platform restructure
  • Australian Associated Press: Direct-to-business and government newswire
  • Australian Jewish News: News app, article archive and content partnership
  • Bundaberg Today: Women in Agriculture lift-out
  • Cairns Local News: Cairns Local News digital transition
  • Central Queensland Media: Re-launch of the Longreach Leader
  • Galah Press: Membership model launch
  • Indian Link Media Group: Website rebrand and expansion of digital content
  • IndigenousX: 10-year anniversary re-brand, studio and office equipment
  • National Indigenous Times: First Nations Images Library
  • Newcastle Weekly: Newcastle Weekly website overhaul
  • North West Farmer: North West Farmer publication and website
  • Papers & Publications Pty Ltd: Launch of Fleurieu Newsmedia
  • Pro Bono Australia: Pro Bono News paid subscriber model
  • Region Group Pty Ltd: “MoJo” mobile journalism equipment
  • Shepparton Adviser: The Shepparton Adviser digital audience strategy
  • Shepparton Newspapers Pty Ltd: Launch of a custom news app
  • South Burnett Today: South West Queensland Leader launch
  • Star News Group: Pilot project for hyperlocal e-newsletter service
  • TBW Today Pty Ltd: Re-launch of the South Eastern Times
  • The Daily Aus: Reader polling technology
  • The Illawarra Flame: Merger of three print services into a single digital edition
  • Wimmera Mallee News Pty Ltd: Virtual Editorial Portal with automatic article uploading

 

Public Interest Journalism Fund recipients

  • 2SER: Re-launch of Razor’s Edge, a Sydney-based current affairs show
  • Australian Associated Press: Week-long news-gathering missions in remote locations
  • Blank Gold Coast: Boosting online output, social media and coverage of women
  • Burdekin Local News: Newsroom expansion with two part-time journalists
  • Cairns Local News Editorial: Funding a cadet journalist placement
  • Catherine Scott: Trafficking Jam podcast series on the impact of sex trafficking laws
  • Coonabarabran Times: Hire of a part-time Indigenous affairs reporter
  • Diamond Valley Enterprises: Journalists – High Country Herald & Western Downs Town and Country
  • Ethnic Broadcasting Association Queensland t/a Radio 4EB: Creation of radio, podcast and in-language explainers for non-English speaking audiences
  • First Nations Media Australia: Indigenous stringer journalist network
  • Harry Clarke – Country Caller: Expanding Toowoomba news service to cover regional Queensland
  • Ian Kenins: Small towns at the post-Covid crossroads and other long-form projects
  • IndigenousX: Debunking video series aimed at mythbusting commonly-held misconceptions about Indigenous people
  • John van Tiggelen: Various long-form journalism projects
  • Murray Bridge News: Murraylands’ stories on poverty and the housing crisis
  • National Ethnic and Multicultural Broadcasters Council (NEMBC): National Multilingual News Service
  • National Indigenous Times: National Indigenous News remote stringer journalist network
  • Navin Sam Regi: Royal Commission into Aged Care report
  • Ninti Media: Kangaroo Island timber plantation controversy
  • Pakenham Gazette: Family violence news project – a monthly liftout and web article series
  • PRIMER: Hiring a violence-against-women reporter
  • Roundbox Media: Connecting Rural Communities
  • Rural Review: Expanding news coverage to all of southern Queensland
  • Saltgrass Podcast: Develop a marketing and subscription plan
  • Susanna Freymark: Expansion of the hyperlocal digital service IndyNR
  • The Express: Part-time hire of a cadet journalist
  • Truepenny Media: The Point news site for reporting on environmental, social and governance issues
  • Western Plains App: Combined app for several local newspapers and radio stations

 

Facebook and AAP launch ‘Check the Facts’ media literacy campaign

Josh Machin, Head of Policy (Australia) – October 26, 2021

 

Facebook has invested significantly in order to detect and combat misinformation on our services, especially harmful health misinformation. But we also know that misinformation spreads in a variety of ways, including offline.

That’s why it’s so important to make sure that Australians are equipped to spot misinformation and be critical consumers of media so that they can challenge misinformation – wherever they might encounter it.

As part of 2021 Global Media and Information Literacy Week, Facebook has supported AAP, one of our third-party fact-checking partners, to develop ‘Check the Facts’ – a nationwide media literacy campaign. Launching this week across Australia, the campaign aims to inform and empower Australians to understand, identify and prevent the spread of misinformation, and to help those close to them do the same.

The ‘Check the Facts’ campaign plays to Australians’ love of sport and shows how misinformation can spread in everyday life, whether you’re at a BBQ with friends, at work with your colleagues, or having a night in at home.

The campaign shares three simple tips that draw on the basics of professional fact-checking to help people to identify reliable information:

  1. Who made the claim?
  2. What’s the evidence?
  3. What do trusted sources say?

In addition to the campaign, we have supported AAP to develop a range of free resources for Australians to proactively identify and avoid misinformation. These resources cover:

  1. Defining misinformation and media literacy: Explains what misinformation is and its common characteristics, and defines media literacy and why it is important.
  2. How to ‘Check the Facts’: AAP’s professional fact-checkers break down how anyone can identify misinformation through three key questions.
  3. What is a trusted source?: How to identify what sources you can rely on in the information age.
  4. How to spot visual misinformation: A beginner’s guide on how to spot a fake image.

‘Check the Facts’ is one of the many ways we’re helping to tackle misinformation in Australia. Recently, we launched the COVID-19 Information Centre, helping over 6.2 million Australians to access authoritative information on the pandemic. We’ve also offered support to the Australian Government to amplify authoritative information on the vaccine rollout, and expanded our fact-checking partner capability in Australia.

Last month, we also partnered with a misinformation prevention NGO, First Draft, to launch a local campaign and educational toolkit to address the spread of misinformation by digital creators.

For more information on the ‘Check the Facts’ campaign, please visit AAP’s FactCheck website: factcheck.aap.com.au.

Applications open for the AU$15 million Facebook Australian News Fund

Andrew Hunter, News Partnerships Lead for Facebook Australia & New Zealand – October 14, 2021

 

We are excited to announce that applications are now open for the AU$15 million Facebook Australian News Fund. Interested applicants may apply to the fund at the Walkley Foundation website from today.

Apply here

The Facebook Australian News Fund will invest in public-interest journalism and support small, regional, and digital newsroom innovation and economic sustainability. The AU$15 million investment will be divided over three years between two funds: the Newsroom Sustainability Fund and the Public Interest Journalism Fund.

The Newsroom Sustainability Fund will allocate AU$2.5 million per year in grants of up to AU$250,000 to fund newsroom products and revenue-related projects. Priority will be given to entrants based in regional areas, serving local communities, or publishing digital-only.

The Public Interest Journalism Fund will also allocate AU$2.5 million per year, providing grants of up to AU$120,000, to invest in public interest journalism and encourage media diversity. It will support small, regional newsrooms and independent journalists, as well as newsrooms covering under-served communities.

Eligible Australian-based newsrooms and journalists can apply to either or both funds under the Facebook Australian News Fund. The Walkley Foundation will nominate an independent external judging committee to review applications against the funds’ eligibility criteria, which include whether the project contributes to the news organisation’s long-term viability or provides a public benefit.

Applications for the two funds close on November 26, 2021. Winners will be announced before the end of March 2022.

Over the past two years, joint initiatives between the Walkley Foundation and Facebook have supported more than 30 Australian newsrooms through programs such as the reader revenue Accelerator, the AU$1 million COVID-19 relief fund, and the most recent Accelerator alumni grants, awarded in August this year.

Publishers and journalists who have applied for the above funding programs in the past three years are still eligible to apply for the Newsroom Sustainability Fund and the Public Interest Journalism Fund. However, organisations that already have a content licensing agreement in place with Facebook are not eligible to apply for these funds.

Facebook announces AU$15 million news fund and begins the phased launch of Facebook News in Australia

Andrew Hunter, Head of News Partnerships for Facebook Australia – August 4, 2021

 

Today, we are pleased to announce the launch of an AU$15 million news fund and the expansion of Facebook News to Australia.

The Facebook Australian News Fund will invest in public-interest journalism and support regional and digital newsroom innovation and economic sustainability. The support will be provided in the form of two funds, the Newsroom Sustainability Fund and the Public Interest Journalism Fund.

The Newsroom Sustainability Fund will allocate AU$2.5 million per year over three years, in grants of up to AU$250,000 to fund regional newsrooms and digital-first publications. Funding will go towards innovation and revenue-generating projects, such as subscription paywalls and membership program development.

The Public Interest Journalism Fund will also allocate AU$2.5 million per year over three years, providing grants of up to AU$120,000, to small, regional publishers and independent journalists to fund news projects of public interest value and to encourage media diversity. We are particularly focused on how the fund could provide support for underserved communities – such as Indigenous Australians, LGBTQI+ community, youth and women’s issues, rural affairs, and local civic journalism.

Expressions of Interest are now open for both funds; newsrooms, journalists, and publishers can register their interest to be notified when applications open.

This investment will help publishers and independent journalists produce news of public interest value across Australia. It will also support smaller, regional, rural, and digital newsrooms as they develop new products and strategies to expand reach and revenue.

The new funds build on Facebook’s commitment to the Australian news community. For the past three years, Facebook has partnered with the news industry to support sustainable business solutions. This includes its AU$1.5 million investment in the 2019-20 reader revenue accelerator supporting 11 ANZ news publishers; the AU$1 million COVID-19 relief fund that went to 17 rural, regional, and community newsrooms across Australia, and the newly awarded Accelerator Alumni Grants.

These programs and investments are in addition to the value from referral traffic and revenue programs we provide to Australian publishers. In 2020, Facebook generated approximately 5.1 billion free referrals to Australian publishers worth an estimated AU$407 million. And from January to November 2020, Australian publishers generated AU$5.4 million from our revenue share programs, such as In-Stream Video advertising.

Today, we are also pleased to announce Facebook News will begin to roll out to a small number of Australians. This phased launch will ensure the product works in a way that provides benefits to Australian audiences and publishers. We’re planning to expand Facebook News to more Australians in the coming months.

In addition to our investments in Facebook News and the Australian News Fund, Facebook will continue to support the news industry through products and programs to promote premium journalism and drive referral traffic and revenue for publishers.

Expanding Transparency Around Social Issue Ads in Australia

Josh Machin, Head of Public Policy (Australia) – June 18, 2021

 

Today, we’re announcing the expansion of initiatives focused on increasing transparency and controls around social issues, electoral, and political ads in Australia.

While political ads play an important role in every election, people have told us they want the option to see fewer of these on their Facebook and Instagram feeds. Earlier this year, we announced a new feature that gives people more control over the ads they see on Facebook: the choice to see fewer social issues, electoral, and political ads with “Paid for by” disclaimers in Australia.

To enable the social issue, electoral, and political ads control feature, people can adjust their ad topic preferences:

  1. Visit Ad Preferences then click ‘Ad Topics’.
  2. Under the list of ‘Ad Topics’, you’ll see a list of topics including ‘Social Issues, Elections or Politics’.
  3. Across from ‘Social Issues, Elections or Politics’, click ‘See Fewer’.

People can also turn on this control by clicking at the top of these ads in their feed, and we will then stop showing them ads about social issues, elections, or politics that include a disclaimer.

We’re also expanding the proactive enforcement of social issue ads to Australia. Since last August, advertisers in Australia that want to run electoral and political ads have been required to go through an authorisation process using government-issued photo ID, and place “Paid for by” disclaimers on their ads. This includes any person creating, modifying, publishing or pausing ads that reference political figures, political parties or elections. Ads are also entered into our Ad Library for seven years.

From June 29, these requirements will also apply to advertisers that choose to run social issue ads that seek to influence public opinion through discussion, debate or advocacy for or against important topics, such as civil and social rights, crime, environmental politics, education or immigration. Any political, electoral or social issue ads on Facebook that do not have the correct authorisation or disclaimers will be removed from the platform and archived in a public Ad Library for seven years.

These additional transparency features for people on Facebook and Instagram and mandatory requirements for advertisers will help promote safe and healthy debate on influential topics, so that people can better understand who’s trying to influence them with ads. We do this because we believe the discussion of ads about social issues, elections or politics deserves transparency.

Extending our industry-leading transparency to now cover social issue ads is part of our commitments as a founding member and signatory to the voluntary Australian Code of Practice on Disinformation and Misinformation. The Code, led by DIGI, the industry association for the digital industry in Australia, is an industry effort to help reduce misinformation online, and was prepared at the request of the Australian Government.

Other Australia-specific commitments we have made include:

  • offering additional support to the Australian Government to direct people to authoritative information about the vaccine rollout, including a substantial provision of ad credits and the offer to build a vaccine finder service in Australia
  • expanding our third-party fact-checking partner capability within Australia in 2021
  • funding Australia-specific research and program partnerships by independent experts and academics on media literacy, misinformation and disinformation in 2021

More details on how to get authorised to run social issue, electoral, and political ads are available here.

Learn more about social issue ads and which do or don’t require disclaimers here.

Facebook’s response to Australia’s disinformation and misinformation industry code

Josh Machin, Head of Public Policy, Facebook Australia – May 21, 2021

 

Today, Facebook has released its first response to the voluntary industry code on disinformation and misinformation.

At the end of 2019, the Australian Government asked the technology industry to develop a voluntary code of conduct to help reduce the risk of online misinformation.

The industry association for the digital industry in Australia, DIGI, led the industry’s response effort and launched the Australian Code of Practice on Disinformation and Misinformation earlier this year.

Facebook is proud to be a founding member and signatory to the Code and has made 43 specific commitments to meet the obligations outlined in the voluntary code. Since the beginning of the pandemic, Facebook has taken aggressive steps to combat harmful COVID-19 misinformation and will continue to work with health experts to make sure that our approach and our policies are in the right place as the pandemic evolves.

Importantly, for the first time, we have prepared selected Australia-specific statistics about content on our platforms, to encourage a sophisticated public policy debate about misinformation in Australia.

  • From the beginning of the pandemic in March 2020 to the end of December 2020, globally we removed over 14 million pieces of content that constituted misinformation related to COVID-19 that could lead to harm, such as content relating to fake preventative measures or exaggerated cures. 110,000 pieces of content were from Pages or accounts specific to Australia (noting that Australians benefitted from the content we removed from other countries as well).
  • We have made a COVID-19 Information Centre available around the world to promote authoritative information to Facebook users. Over 2 billion people globally have visited the Information Centre; over 6.2 million distinct Australians have visited the Information Centre at some point over the course of the pandemic.

Other Australia-specific commitments we have made include:

  • offering additional support to the Australian Government to support authoritative information about the vaccine rollout, including a substantial provision of ad credits and the offer to build a vaccine finder service in Australia
  • expanding our fact-checking partner capability within Australia in 2021
  • funding Australia-specific research and program partnerships by independent experts and academics on media literacy, misinformation, and disinformation in 2021
  • extending the current industry-leading transparency we provide to political advertising to also cover social issues advertising.

The new Australia-specific commitments contained in this report are in addition to the significant global efforts that Facebook already undertakes to combat disinformation and misinformation.

The Australian Government has indicated it will spend the first half of 2021 assessing whether the DIGI industry code meets its expectations. If not, the Government will institute mandatory regulation.

Facebook believes the voluntary code is a credible, world-leading first step in the collaboration between the technology industry and governments to combat misinformation. Crafting new frameworks that balance freedom of speech with preventing the spread of harmful misinformation is challenging. In Australia, DIGI followed a best-practice process in developing this industry code: commissioning independent expert research, undertaking a round of public consultation and deeper engagement with experts, publishing a draft version, and genuinely incorporating the feedback received into the final version.

The Code takes account of the lessons of other international efforts (in particular, the EU Code of Practice on Disinformation) to incorporate best practices and address the areas where stakeholders have raised concerns.

Many public policy challenges around misinformation remain, including definitions and measurement.

Finally, we need to understand online misinformation as part of a broad ecosystem of information-sharing. Misinformation can occur offline, online, on TV, on radio or podcasts, or in face-to-face conversations between family and friends. Often, sharing of misinformation may be inadvertent – but it can also be deliberately shared by political groups or bad actors. Pushing back on misinformation is a constant task, part of the essential process of open debate in a democratic society.

Facebook looks forward to contributing to the public policy debate in Australia on those questions over the next year, through a very significant work program (detailed in our response) and support for research.

Read Facebook’s response to the Australian disinformation and misinformation industry code in further detail here.
