Human Rights Impact Assessment of Facebook in India 

Tuesday, February 22, 2022

In January 2022, we joined 24 civil society organisations in sending a public letter to Facebook's Human Rights Director, Miranda Sissons, requesting the release of the Human Rights Impact Assessment of Facebook in India. This demand was made in light of Facebook's (now Meta's) stated commitment to human rights, contrasted with growing concerns that the lack of content moderation on Facebook's platform in India is fuelling dangerous levels of hate speech against Indian minorities, particularly Indian Muslims. Our concerns stem from the recent revelations made by Frances Haugen, which underscore our own research on the moderation failures allowing hate speech and disinformation to fester on Meta's platforms. Over the last two years, Foundation The London Story's team of researchers and volunteers has spent thousands of hours flagging hateful and otherwise toxic content on Facebook in non-English languages, ranging from Hindi to Dutch. This content included dog whistling, misogyny, COVID mis- and disinformation, and the dehumanisation of, and direct calls for violence against, Indian Muslims.

For the purpose of the independent Human Rights Impact Assessment, The London Story has been an active stakeholder, providing training and information. In Hindi-language content, we identified over 607 public Facebook pages regularly engaging in vitriol against Indian Muslims, of which the 27 most toxic pages had a combined video view count of around 1 billion. We identified videos with over 40 million combined views across several Facebook pages in which the speaker calls for the extermination of Indian Muslims, most notably those by Yati Narsinghanand, of which the Indian police have taken only reluctant cognisance. Overall, the evidence our team gathered included over 200 posts that were flagged to Facebook for violating its own content policy but remain online, including vigilante actors doxing couples by posting their information online, and using Facebook Live to assemble mobs with the intention of creating violent disruption.

Not only has Facebook delayed the Human Rights Impact Assessment, commissioning a law firm to conduct it only after years of pressure, but it has also blatantly disregarded the concerns shared by civil society organisations and concerned individuals by continuing to neglect content moderation in India. The process of flagging toxic content on Facebook, which typically leads to an automated response claiming that the content "does not go against our Community Standards", has created more barriers than solutions. It has internalised apathy in Facebook's system.

At Foundation The London Story, we are concerned that Meta's inaction on toxic content is normalising the dehumanisation of Indian Muslims and contributing to the growing risk of genocide in India. As Meta has not yet responded to our calls to finalise and release the Human Rights Impact Assessment, we – along with diaspora partners such as India Civil Watch International, the Indian American Muslim Council, Article 19, the Internet Freedom Foundation, and several other stakeholders who were directly involved in narrating, through experience and evidence, the impact of Facebook on human rights in India – invite everyone to sign this petition calling on Facebook to release the report.
