fact-checking
Meta to test crowd-sourced fact-checking using X's model
Meta will begin testing its crowd-sourced fact-checking initiative, Community Notes, on March 18, following the model used by Elon Musk's X, the company announced on Thursday.
Meta had previously discontinued its fact-checking programme in January, with CEO Mark Zuckerberg stating that fact-checkers had become “politically biased,” echoing criticisms long voiced by conservatives. However, media experts and social media researchers expressed deep concern over the policy change.
“The decision not only eliminates a valuable resource for users but also lends credibility to the widespread disinformation narrative that fact-checking is politically biased. Fact-checkers play a crucial role by providing essential context to viral claims that mislead millions on Meta,” said Dan Evon, lead writer for RumorGuard, the News Literacy Project’s digital tool that curates fact checks and educates people on identifying misinformation.
Meta first introduced fact-checking in December 2016 following Donald Trump’s election, responding to concerns about the spread of “fake news” on its platforms. For years, the company partnered with over 100 organisations across more than 60 languages to combat misinformation. The Associated Press withdrew from Meta’s fact-checking programme more than a year ago.
Community Notes will eventually replace fact checks, though not immediately. Meta stated that potential contributors in the U.S. can begin signing up for the programme, but their notes will not be visible right away.
“We will start by gradually and randomly admitting people from the waitlist and will take time to test the writing and rating system before any notes are published publicly,” Meta explained.
Meta emphasised that it would not determine what content gets rated or written, and notes will only be published if contributors with diverse viewpoints reach a broad consensus. Unlike the previous fact-checking system, where flagged misinformation saw reduced distribution, posts with Community Notes will not face penalties.
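Meta has not published the algorithm behind that consensus requirement. As a purely illustrative sketch of the idea, loosely inspired by the bridging approach X describes for its own Community Notes, the check might resemble the following; the viewpoint clusters, thresholds and field names here are assumptions, not Meta's implementation:

```python
# Illustrative sketch only: Meta has not disclosed how Community Notes
# will score consensus. Cluster labels and thresholds are assumptions.
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class Rating:
    contributor_id: str
    viewpoint_cluster: str  # hypothetical label inferred from past rating behaviour
    helpful: bool

def should_publish(ratings: list[Rating],
                   min_ratings: int = 5,
                   min_helpful_share: float = 0.7) -> bool:
    """Publish a note only when raters from at least two different
    viewpoint clusters each find it broadly helpful."""
    if len(ratings) < min_ratings:
        return False  # not enough signal yet
    votes_by_cluster = defaultdict(list)
    for r in ratings:
        votes_by_cluster[r.viewpoint_cluster].append(r.helpful)
    if len(votes_by_cluster) < 2:
        return False  # agreement within a single viewpoint is not enough
    return all(sum(votes) / len(votes) >= min_helpful_share
               for votes in votes_by_cluster.values())

# A note rated helpful across both clusters would be shown; one rated
# helpful by only one side would not.
ratings = [Rating("a", "cluster_1", True), Rating("b", "cluster_1", True),
           Rating("c", "cluster_2", True), Rating("d", "cluster_2", True),
           Rating("e", "cluster_2", True)]
print(should_publish(ratings))  # True
```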
Fact checks will remain in place outside the U.S. for now, though Meta intends to expand Community Notes globally in the future.
8 days ago
How fact-checking can counter misinformation in Bangladesh
The battle against misinformation is intensifying across Asia, including Bangladesh, as nations grapple with its damaging effects on societal harmony and public trust.
From fabricated claims of communal violence to manipulated narratives on international conflicts, the region faces a growing crisis that demands robust fact-checking initiatives and greater public awareness.
One alarming example emerged in Bangladesh on the night of 5 August last year, following the Awami League’s fall from power.
Journalist Kamrul Islam received reports alleging that Indian intelligence forces were trapped in Bangladeshi police stations and were shooting civilians. These claims, fuelled by falsified videos, spread rapidly on Facebook, inciting widespread panic. By morning, Kamrul’s verification efforts exposed the narrative as entirely baseless.
This incident underscores the pressing issue of misinformation in Bangladesh.
According to a 2024 survey conducted by Fact Watch, a Bangladeshi fact-checking organisation, social media platforms were responsible for 638 instances of misinformation in the country last year.
August: A Peak in Misinformation
The Fact Watch study identified August as the peak month for disinformation, with over 91 cases recorded. Many of these falsehoods centred around fabricated accounts of violence against minorities in Bangladesh.
On the international front, misinformation during this period included false reports of Iranian President Ebrahim Raisi’s death, alleged Iranian attacks on Israel, and fabricated updates on the Israel-Palestine conflict.
Platforms such as Facebook, Instagram, Twitter, and WhatsApp remain the primary conduits for spreading misinformation. Driven by emotional reactions, users often share unverified claims, amplifying their reach and impact.
“Misinformation is crafted to provoke emotional responses,” explained Professor Suman Rahman, Guidance Editor at Fact Watch. “People seldom verify such claims, especially when they align with their existing beliefs, leading to a cascade of unverified information.”
Kamrul’s experience highlights the dangers of this phenomenon. “That night, I shared the unverified reports with colleagues, who then posted them on Facebook. The panic and disinformation spread like wildfire, dominating the night,” he recalled.
Meta’s Decision and Its Implications
In a controversial move, Meta, the parent company of Facebook, Instagram, and WhatsApp, recently announced plans to cease fact-checking operations in the United States.
Meta cited concerns about bias and excessive censorship as its rationale, but the decision has drawn sharp criticism.
The International Fact-Checking Network (IFCN), a former partner of Meta, accused the tech giant of being politically influenced and lenient towards misinformation propagated by political leaders.
Analysts warn that if Meta implements similar policies in Asia, it could worsen the spread of disinformation in countries like Bangladesh.
“Discontinuing fact-checking in sensitive regions like South Asia could have devastating consequences,” said Qadaruddin Shishir, Fact-Check Editor for AFP in Bangladesh. “In Bangladesh, where misinformation has previously triggered communal violence, robust fact-checking is indispensable.”
Shishir urged the Bangladeshi government to pressure Meta to expand its fact-checking initiatives.
He also highlighted the lack of fact-checking mechanisms on platforms such as YouTube, which leaves them fertile ground for disinformation.
Challenges in Traditional Media
Misinformation is not confined to social media. A Pew Research study revealed that 73% of Americans believe mainstream media frequently spreads false information. Similar challenges afflict Bangladeshi media, where fact-checking is often neglected.
“While leading international outlets have dedicated fact-checking desks, Bangladeshi newsrooms largely lack such resources,” Shishir noted. “Without proper training, journalists struggle to differentiate between verified and unverified information, undermining their credibility.”
The Need for an Independent Fact-Checking Commission
Experts argue that reliance on corporate entities like Meta for fact-checking is unsustainable. Professor Mofizur Rahman of Dhaka University proposed the establishment of an independent fact-checking commission in Bangladesh.
“If fact-checking organisations are influenced by corporate or government interests, impartiality is compromised. An independent commission would ensure unbiased verification,” he asserted.
Professor Suman echoed this view, emphasising the importance of media literacy among the public. “Educating people to critically evaluate information is just as important as fact-checking itself,” he said.
The Way Forward
As misinformation continues to evolve, so too must efforts to combat it. Strengthening fact-checking initiatives, fostering public awareness, and ensuring the accountability of media platforms are critical steps in safeguarding the truth. For Asia and Bangladesh, where misinformation has had tangible and often dangerous consequences, the fight against falsehoods is more urgent than ever.
2 months ago
Meta ending fact-checking program, Zuckerberg announces
Meta CEO Mark Zuckerberg unveiled significant changes to the company’s moderation policies on Tuesday, citing a shifting political and social climate and the need to restore free expression across its platforms. The updates will impact Facebook, Instagram, and Threads, which together serve billions of users globally.
Meta will discontinue its existing fact-checking program, which relied on partnerships with third-party organizations, and implement a community-driven system similar to X’s Community Notes, according to a report by NBC News.
“We’re going to get back to our roots and focus on reducing mistakes, simplifying our policies, and restoring free expression on our platforms,” Zuckerberg said in a video message. “First, we’re going to get rid of fact-checkers and replace them with community notes similar to X, starting in the U.S.”
In addition, Meta will modify its content moderation policies, particularly around political topics. Changes that previously reduced political content in user feeds will be undone. Zuckerberg highlighted the U.S. election as a pivotal factor influencing these decisions, criticizing what he described as pressure from “governments and legacy media” to increase censorship.
“The recent elections also feel like a cultural tipping point towards, once again, prioritizing speech,” he said.
Zuckerberg acknowledged that the complex systems Meta had developed to moderate content were prone to errors, impacting millions of users.
“We built a lot of complex systems to moderate content, but the problem with complex systems is they make mistakes,” he said. “Even if they accidentally censor just 1% of posts, that’s millions of people, and we’ve reached a point where it’s just too many mistakes and too much censorship.”
While Meta will continue to strictly moderate content related to drugs, terrorism, and child exploitation, the company plans to ease some policies surrounding sensitive topics like immigration and gender. Automated moderation systems will now focus on "high severity violations," relying more on user reports for less severe issues.
“We’re also going to tune our content filters to require much higher confidence before taking down content,” Zuckerberg explained. “The reality is that this is a trade-off. It means we’re going to catch less bad stuff, but we’ll also reduce the number of innocent people’s posts and accounts that we accidentally take down.”
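As a rough illustration of that trade-off, a confidence-gated filter can be sketched as below; the severity categories follow the announcement, but the scores, thresholds and function names are invented for the example, since Meta's actual models and cut-offs are not public:

```python
# Rough sketch of the precision/recall trade-off described above.
# Scores and thresholds are made up for illustration; Meta's actual
# classifiers, categories and thresholds are not public.

def moderation_action(violation_score: float,
                      severity: str,
                      takedown_threshold: float = 0.95,
                      review_threshold: float = 0.80) -> str:
    """Decide what an automated filter does with a post.

    High-severity categories (e.g. drugs, terrorism, child exploitation)
    keep aggressive automated enforcement; lower-severity categories
    require much higher confidence before removal and otherwise rely on
    user reports or human review.
    """
    if severity == "high":
        return "remove" if violation_score >= review_threshold else "human_review"
    # Lower-severity content: only auto-remove when the model is very sure.
    if violation_score >= takedown_threshold:
        return "remove"
    if violation_score >= review_threshold:
        return "queue_for_user_reports"
    return "leave_up"

print(moderation_action(0.90, "high"))  # remove
print(moderation_action(0.90, "low"))   # queue_for_user_reports: fewer wrongful
                                        # takedowns, but more missed violations
```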
The decision to end the fact-checking program marks a departure from Meta’s earlier efforts, launched in 2016, which involved third-party fact-checkers certified by organizations like the International Fact-Checking Network (IFCN). Over 90 organizations participated, fact-checking content in more than 60 languages.
Meta’s shift mirrors broader trends in the social media industry, where companies have increasingly scaled back on moderation efforts amid criticism of bias and politicization. Conservatives, in particular, have long accused Meta’s fact-checking system of favoring liberal viewpoints—a claim that has been disputed.
X’s Community Notes, the model for Meta’s new system, has gained popularity among conservative users for its mix of fact-checking and community-driven contributions.
Zuckerberg’s announcement comes as social media companies navigate a politically charged environment. The NBC News report notes that Meta, like other tech giants, has sought to align with incoming political leadership. The company donated $1 million to President-elect Donald Trump’s inaugural fund, and Zuckerberg praised Trump in an interview before the election, though he stopped short of an endorsement.
2 months ago
ARTICLE 19, DW Akademie launch online course on misinformation, fact-checking
ARTICLE 19, the UK-based human rights organization, and DW Akademie, Germany's leading media development organization, have jointly launched an online course titled ‘Misinformation and Fact-checking: Media Information Literacy’.
Registration for the course is now open. The five-week course, offered in Bangla, will begin on August 1, said a media release on Monday.
Anyone with an internet connection can take the course at their convenience on a desktop, laptop, tablet or smartphone.
To register, visit https://banglatutorial-media.org/. Upon successful completion of the course, participants will receive a certificate of accomplishment from DW Akademie and ARTICLE 19.
2 years ago
Facebook expands its fact-checking programme in Bangladesh
Facebook on Monday announced the expansion of its third-party fact-checking programme with the addition of international partner AFP and Bangladesh-based organization Fact Watch, as part of its ongoing efforts to reduce the spread of misinformation in Bangladesh.
AFP and Fact Watch join Boom, which has partnered with Facebook since 2020, when the programme was first introduced in Bangladesh.
AFP and Fact Watch, which have been certified by the Poynter Institute's non-partisan International Fact Checking Network (IFCN), will review and rate the accuracy of Bangla and English stories on Facebook, including photos and videos in Bangladesh, said a media release.
When third-party fact-checkers rate a story as false, altered or partly false, it will appear lower in News Feed, significantly reducing its distribution on Facebook. Instagram will also make it harder to find by filtering it from Explore and hashtag pages and downranking it in Feed.
Pages and domains that repeatedly share false news will also see their distribution reduced and their ability to monetize and advertise removed.
In addition, content across Facebook and Instagram that has been rated false or altered is prominently labeled so people can better decide for themselves what to read, trust, and share.
These labels are shown on top of false and altered photos and videos, including on top of Stories content on Instagram and link out to the assessment from the fact-checker.
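The release describes the outcomes of a fact-check rating rather than the mechanics behind them. A minimal sketch of those outcomes is shown below; the rating names come from the release, while the demotion factor, field names and function are hypothetical:

```python
# Illustrative only: the release describes the outcomes (lower News Feed
# ranking, Explore/hashtag filtering, warning labels) but not how Facebook
# implements them; the multiplier and field names below are hypothetical.

PENALISED_RATINGS = {"false", "altered", "partly false"}

def apply_fact_check(post: dict, rating: str) -> dict:
    """Attach a fact-check outcome to a post record."""
    rating = rating.lower()
    if rating in PENALISED_RATINGS:
        post["feed_rank_multiplier"] = 0.2      # hypothetical demotion factor
        post["filtered_from_explore"] = True    # harder to find on Instagram
        post["label"] = f"Rated {rating} by an independent fact-checker"
    return post

post = {"id": "12345", "feed_rank_multiplier": 1.0}
print(apply_fact_check(post, "Partly False")["label"])
```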
Facebook's fact-checking programme started in December 2016. Today, the social media platform has over 80 partners fact-checking content in over 60 languages.
“Expanding our fact-checking program with new partners from AFP and Fact Watch, is an important step in our effort to reduce false news which requires the support of the broader community. As part of our effort to build more informed communities, the University of Liberal Arts Bangladesh, the IFCN and our fact-checking partners will host a discussion on the importance of fact-checking during the pandemic. We welcome the efforts and hope to work together to help build a more informed community in Bangladesh,” said Anjali Kapoor, Director of News Partnership, APAC at Facebook.
Last year, Facebook worked with the Ministry of Education, ICT Division and civil society partners in Bangladesh to launch We Think Digital, the company’s flagship program to empower the next generation of digital-first citizens to become more responsible and create respectful communities online.
Last week, the Facebook Journalism Project and Reuters launched the Reuters Digital Journalism Course in Bangladesh to help journalists build a strong foundation in digital reporting and editing.
Facebook has been working with Bangladesh’s Ministry of Health and Family Welfare and ICT Division to help people access information about COVID-19, hygiene practices and vaccines.
In April, the social media platform also launched a media literacy campaign in Bangladesh to tackle COVID-related misinformation.
3 years ago