
Reorienting Rules for Rights: A Summary of the Report on Online Content Regulation by the Special Rapporteur on the Promotion and Protection of the Right to Freedom of Opinion and Expression

Report of the UN Special Rapporteur on Freedom of Opinion & Expression to the UN Human Rights Council, 2018 – attached (20 pages).

Summary of the report on online content regulation by the Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression (A/HRC/38/35).

The internet is the greatest tool in history for global access to information and expression. Internet companies have become central platforms for discussion and debate, information access, commerce and human development. Companies running platforms are enigmatic regulators, establishing a kind of “platform law” in which clarity, consistency, accountability and remedy are elusive.

States have a significant impact on how companies deal with online content regulation. Companies face increasing pressure from states to comply with requests (both legal and extralegal) to moderate or remove content, and they are also taking pre-emptive measures through, for example, adaptations to their terms of service agreements (ToS). In response to these worrying trends, the UN Special Rapporteur on freedom of opinion and expression proposes a series of measures that states and companies can undertake to put human rights at the very centre of online content moderation.

Do human rights principles and standards apply to online content regulation?

Freedom of expression (FoE) is protected by Article 19 of the Universal Declaration of Human Rights and of the International Covenant on Civil and Political Rights. The United Nations, regional organisations and treaty bodies have affirmed that offline rights apply equally online. Any restriction on the exercise of the right to FoE must be provided by law, necessary, proportionate and in pursuit of a legitimate aim. More importantly, restrictions must not undermine or jeopardise the essence of the right. Though states are the primary duty bearers in enforcing and protecting these rights, non-state actors like internet companies cannot shy away from playing their part in the realisation of “the right to hold opinions without interference” and “the right to seek, receive and impart information and ideas of all kinds, regardless of frontiers” and through any medium. This includes the internet.

The activities of companies in the information and communications technology (ICT) sector implicate the rights to privacy, freedom of religion and belief, opinion and expression, assembly and association, and participation in public life, among others. The Guiding Principles on Business and Human Rights, adopted by the Human Rights Council in 2011, place a duty on states to ensure environments that enable businesses to respect human rights, while businesses themselves must strive to ensure that their policies and practices adhere to the Principles in letter and spirit. Applying human rights law in their work would not restrict companies; on the contrary, it would offer a globally recognised framework for designing tools and a common vocabulary for explaining their nature, purpose and application to users and states. Human rights law also gives companies the tools to articulate and develop policies and processes that respect democratic norms and counter authoritarian demands.

What are the problems emerging from online content regulation?

Standards not rooted in human rights

Most companies do not recognise human rights law as governing their operations and, as a result, do not explicitly base content standards on any particular body of law that might govern expression, such as national law or international human rights law. Few companies apply human rights principles in their operations, and most that do see them as limited to their responses to government threats and demands.

Government pressure and vague laws

While states require companies to restrict illegal content, they also often rely on censorship and criminalisation to shape the online regulatory environment. States use broadly worded restrictive laws, vague or complex legal criteria without prior judicial review, and the threat of harsh penalties to compel companies to restrict content and suppress legitimate expression. The commitment to legal compliance can be complicated when relevant state law is vague, subject to varying interpretations, or inconsistent with human rights law.

Extraterritorial requests

Some states are demanding extraterritorial removal of links, websites and other content alleged to violate local law, which would allow censorship across borders, to the benefit of the most restrictive censors.

Extralegal requests

State authorities increasingly seek content removals outside of legal process or even through ToS requests and have established specialised government units to refer content to companies for removal. States also place pressure on companies to accelerate content removals through non-binding efforts, most of which have limited transparency, exacerbating concerns that companies perform public functions without the oversight of courts and other accountability mechanisms.

Users kept in the dark

Company disclosure about removal decisions, whether in aggregate or in specific cases resulting from human evaluation, remains limited and is rarely reported adequately. Users who post reported content, and persons complaining of abuse, often receive no notification of removal or other action and have no avenue to challenge removals. Even where an appeal exists, the remedies available to users appear limited or untimely to the point of non-existence and, in any event, opaque to most users and even to civil society experts.

Automation and overblocking of content

Demands for quick automated flagging, removal and pre-publication filtering sometimes result in overblocking and disproportionate censorship. Devoid of context, this approach has led to removals of depictions of nudity with historical, cultural or educational value; historical and documentary accounts of conflict; evidence of war crimes; counter speech against hate groups; and efforts to challenge or reclaim racist, homophobic or xenophobic language.

Hate speech and marginalisation of vulnerable groups

Company policies on hate speech, harassment and abuse do not clearly indicate what constitutes an offence. The vagueness of hate speech and harassment policies has triggered complaints of inconsistent policy enforcement that penalises minorities while reinforcing the status of dominant or powerful groups. Steps taken by platforms have resulted in the suppression of lesbian, gay, bisexual, transgender and queer expression, as well as advocacy against repressive governments, reporting on ethnic cleansing, and critiques of racist phenomena and power structures. Misogynist or homophobic harassment designed to silence women and sexual minorities and incitement to violence of all kinds continue to thrive in online spaces, which has a significant impact on the offline realities of the people targeted.

Disinformation

Companies face increasing pressure to address disinformation spread through links to bogus third-party news articles or websites, fake accounts, deceptive advertisements and the manipulation of search rankings, even though reliably identifying such content may not always be feasible.

Real name policy

Strict insistence on real names not only exposes bloggers and activists who use pseudonyms to grave physical danger but has also led to the blocking of accounts of vulnerable users and activists, drag performers and users with non-English or unconventional names.

What should states and companies be doing to address these problems?

Recommendations for states

Recommendations for ICT companies:

- Human rights principles, policies and assessments
- Transparency
- User education and awareness
- Development of policy on misinformation and media
- Industry-wide accountability
- Responding to state requests
- Context, consultation, diversity and groups at risk
- Automated content moderation
- Content curation
- Notice and appeal

https://www.apc.org/en/pubs/reorienting-rules-rights-summary-report-online-content-regulation-special-rapporteur-promotion

Report of the Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression

Source: WUNRN – 05.07.2018