In 2018, the United Nations Independent International Fact-Finding Mission on Myanmar issued a finding that should have ended any debate about the real-world consequences of Facebook's business decisions: the platform had played a 'determining role' in the genocide of Rohingya Muslims. In a country where Facebook was synonymous with the internet — where 'Facebook' and 'internet' were literally the same word for millions of users — the platform had been weaponized to spread dehumanizing propaganda, coordinate violence, and incite ethnic cleansing. Nearly a decade later, the structural conditions that enabled this catastrophe remain largely unchanged across Meta's global operations.
The Myanmar Catastrophe
Myanmar's military and ultra-nationalist Buddhist groups used Facebook systematically to dehumanize the Rohingya minority, spreading fabricated stories of sexual violence, sharing doctored images, and openly calling for murder. These posts were not hidden in dark corners of the platform — they were widely shared, algorithmically amplified, and seen by millions. Civil society organizations warned Facebook repeatedly between 2013 and 2017 that the platform was being used to incite genocide. Facebook's response was negligible. At the height of the crisis, the company employed fewer than five Burmese-language content moderators for a country of 20 million Facebook users. Reports of hate speech and incitement sat unreviewed for weeks. By the time Facebook began taking action in 2018, over 700,000 Rohingya had been driven from their homes, and thousands had been killed in a campaign the UN characterized as bearing 'the hallmarks of genocide.'
A Pattern, Not an Anomaly
Myanmar was not an isolated failure. The same pattern — rapid user growth, minimal moderation investment, algorithmic amplification of divisive content — has repeated across the developing world. In Ethiopia, Facebook was found to have amplified hate speech and incitement during the Tigray conflict between 2020 and 2022. In India, the platform has been repeatedly documented as a vector for anti-Muslim violence. In the Philippines, Facebook was the primary distribution channel for the Duterte government's propaganda campaigns. In each case, Meta's investment in local language moderation was a fraction of what the user base required. The company's moderation spending for non-English languages represents less than 15% of its total moderation budget, despite non-English users comprising over 85% of its global user base.
The Economics of Selective Moderation
The explanation is grimly simple: Meta allocates moderation resources based on advertising revenue, not user population. A user in the United States generates roughly 10 times the ad revenue of a user in Southeast Asia and 50 times that of a user in sub-Saharan Africa. When moderation budgets follow revenue rather than risk, the result is predictable: robust (though still imperfect) moderation in wealthy English-speaking countries, and dangerous negligence everywhere else. This is not a technical limitation; it is a business decision. Meta could invest proportionally in every market where it operates. It chooses not to because the financial return doesn't justify the expense.
The Rohingya community has filed a $150 billion lawsuit against Meta, and the case is ongoing. But no amount of compensation can undo genocide. The question is whether Meta will be compelled — by courts, regulators, or public pressure — to invest in content moderation proportional to the risk its platform creates, rather than proportional to the revenue it extracts. So far, the evidence suggests the answer is no. The next Myanmar is not a matter of if, but where.