
Meta Platforms In India Found Exposed To Human Rights Risks Due To Third Party Action



Meta platforms include Facebook and WhatsApp.

Meta platforms, which include Facebook and WhatsApp, were found exposed to human rights risks such as "restrictions of freedom of expression and information" and "hatred that incites hostility" due to the actions of third parties, the social media giant's first human rights report has said.

The report is based on an independent human rights impact assessment (HRIA) that Meta commissioned in 2019 to examine potential human rights risks related to its platforms in India and other countries.

The assessment was undertaken by the law firm Foley Hoag LLP.

"The HRIA noted the potential for Meta's platforms to be connected to salient human rights risks caused by third parties, including: restrictions of freedom of expression and information; third party advocacy of hatred that incites hostility, discrimination, or violence; rights to non-discrimination; as well as violations of rights to privacy and security of person," the report said.

The HRIA involved interviews with 40 civil society stakeholders, academics, and journalists.


The report found that Meta faced criticism and potential reputational risks related to hateful or discriminatory speech by end users.

The assessment also noted a difference between company and external stakeholder understandings of content policies.


"It noted persistent challenges relating to user education; difficulties of reporting and reviewing content; and challenges of enforcing content policies across different languages. In addition, the assessors noted that civil society stakeholders raised several allegations of bias in content moderation. The assessors did not assess or reach conclusions about whether such bias existed," the report said.

According to the report, the project was launched in March 2020 and experienced limitations caused by Covid-19, with a research and content end date of June 30, 2021.


The assessment was conducted independently of Meta, the report said.

The HRIA developed recommendations for Meta around implementation and oversight, content moderation, and product interventions, which Meta is studying and will consider as a baseline to identify and guide related actions, the report said.
