EXPLORING THE NITDA CODE OF PRACTICE AND ITS POTENTIAL IMPACT ON SOCIAL MEDIA AND ONLINE PLATFORMS 

June 2022 - AGBOOLA DOSUNMU

On 13th June 2022, the National Information Technology Development Agency (NITDA) issued the draft Code of Practice for Interactive Computer Service Platforms/Internet Intermediaries (“the Code”).[1]

The objectives of the Code include setting out best practices for Platforms and making the digital ecosystem safer for Nigerians and non-Nigerians in Nigeria. The Code is also expected to set out measures to combat harmful online information and adopt a co-regulatory approach toward implementation and compliance. The Code thereafter sets out provisions across six parts to achieve these objectives.

According to NITDA, the Code was developed in collaboration with the Nigerian Communications Commission (NCC) and the National Broadcasting Commission (NBC), with input from “interactive computer platforms” such as Twitter, Facebook, WhatsApp, Instagram, Google, and TikTok. NITDA further stated that the Code is aimed at “protecting the fundamental human rights of Nigerians and non-Nigerians living in the country, as well as defining guidelines for interacting in the digital ecosystem”.[2]

As expected, Nigerians have been distrustful of the Code, with many concluding that it is an attempt by the Nigerian Government to regulate social media and quash freedom of expression.[3] This is understandable considering the Nigerian Government’s antecedents in its posturing towards social media Platforms.

For instance, in 2019, the Social Media Bill was introduced before the National Assembly, through which the Government explored ways of curbing the perceived excesses of social media users.[4] That ill-fated Bill was closely followed by the Prohibition of Hate Speeches Bill (“Hate Speech Bill”).[5] When the public outcry against both Bills became resounding, they were stepped down.

However, in 2021, following Twitter’s deletion of a Tweet posted by the President of the Federal Republic of Nigeria, Twitter was banned for several months, with users in Nigeria unable to directly access the Platform and many users resorting to using Virtual Private Networks (VPNs) to access the microblogging site.[6]

In that same year, there was an attempt by the Nigerian Government to amend the National Broadcasting Commission Act to empower the NBC to regulate social media Platforms.[7]

It is against this background that we examine the provisions of the Code and determine whether it is indeed a tool designed to restrict free speech in Nigeria.

WHAT ENTITIES ARE AFFECTED BY THE CODE?

It is pertinent to determine, at the outset, which entities are affected by the Code.

The following entities are expected to comply with the Code:

  • Interactive Computer Service Platforms – the Code defines these as “any electronic medium or site where services are provided by means of a computer resource and on-demand and where users create, upload, share, disseminate, modify, or access information, including websites that provide reviews, gaming Platforms, and online sites for conducting commercial transactions”.

The inference drawn from this definition is that it would cover Platforms such as companies’ websites, fintechs, gaming companies, edtechs, healthtechs, e-commerce Platforms, social media Platforms, and other service providers that offer goods and services through their Platforms.

  • Internet Intermediary – defined in the Code as including, but not limited to, “social media operators, websites, blogs, media sharing websites, online discussion forums, streaming Platforms, and other similar oriented intermediaries where services are either enabled or provided and transactions are conducted and where users can create, read, engage, upload, share, disseminate, modify, or access information”.

This definition captures a number of companies already covered under the definition of Interactive Computer Service Platforms. It includes streaming Platforms (like Netflix, YouTube, etc.), social media Platforms, internet service providers, e-commerce intermediaries, fintechs, etc. Indeed, both Interactive Computer Service Platforms and Internet Intermediaries are referred to as a “Platform” under the Code.

  • Large Service Platforms (Large Platforms) – defined as “an Interactive Computer Service Platform/Internet Intermediary whose users are more than one hundred thousand (100,000)”.

This simple definition indicates that Platforms and Intermediaries (collectively referred to in this article as “Platforms”) with more than one hundred thousand users are classified as Large Platforms.

COMMENDABLE PROVISIONS

The Code contains some commendable provisions, such as the provision mandating the removal of non-consensual sensual content,[8] the provisions addressing content harmful to a child,[9] the provisions introducing a notice-and-take-down regime, and the provisions concerning Platform rules.[10]

Item 1 of Part II also promotes equal distribution of information for Nigerian users.

CONCERNS WITH THE CODE

The Code contains certain provisions that may be used by an abusive government to curtail or infringe free speech. Indeed, the three major areas of concern under the Code with respect to restricting free speech are the provisions allowing the Government to order the removal of content, the provisions mandating Platforms to proactively remove “false information likely to cause public disorder”, and the provisions requiring the local incorporation of Platforms. These and other notable areas of concern are examined below.

Mandatory incorporation of foreign Platforms

The Code imposes additional obligations on Large Platforms,[11] which include obligations to be incorporated in Nigeria, to have a physical contact address in Nigeria, and to appoint a liaison officer for communications with the Government.

It is likely that Large Platforms that fail to comply with the obligations set out above would be prevented from operating in Nigeria.

The first problem with this position is the sheer number of Platforms that will be classified as Large Platforms. This is because the 100,000-user threshold is extremely low; it can be contrasted with the threshold of 45 million active monthly users within the jurisdiction for a Very Large Online Platform under the proposed EU framework.[12]

Similarly, under the DSA it is not compulsory for Platforms to be locally incorporated or to have local addresses, as the appointment of a Legal Representative typically suffices.[13] It should also be noted that, in certain situations, NITDA may require a Platform with fewer than one hundred thousand users to comply with the obligations of a Large Platform.

Takedown of content

A Platform is required to take down content within 24 hours of receiving notice from an Authorised Government Agency[14] (“Agency”) of the presence of unlawful content.[15] Unlawful content is defined under the Code to mean “any content that violates an existing law in Nigeria”.

However, the problem is that the Agency is not required to specify how or why the content is unlawful, and the Platform is given no time or avenue to verify the unlawfulness of the content, particularly where it is unclear whether the content is in fact unlawful.

The position under the Code can be contrasted with the position under the German Network Enforcement Act, where content must be “manifestly unlawful”, and the position under the French “Lutte contre la haine sur internet” (“Fighting hate on the internet”) law, where content must be “patently illegal”, before a takedown within 24 hours is required.

In addition, under the DSA, the relevant Agency or Court ordering the takedown of content is required to, among other things, provide a statement of reasons explaining why the information is illegal, by reference to the specific provision of the law infringed.[16]

Removal of false information likely to cause public disorder[17]

In addition to creating a general obligation to monitor, this provision of the Code creates multiple sources of vagueness that can easily be exploited by an abusive Government, as the Code defines neither “false information” nor “public disorder”. Consequently, for example, Platforms like Twitter and Instagram may be sanctioned if they fail to proactively take down posts relating to the shootings at Lekki and other similar incidents.

This position can be contrasted with other frameworks like the EU E-Commerce Directive[18] and the DSA,[19] under which States are prevented from imposing a general obligation to monitor, as well as a long line of case law requiring that any legislation attempting to restrict free speech must be sufficiently “precise to enable the citizen to regulate his conduct: he must be able – if need be with appropriate advice – to foresee, to a degree that is reasonable in the circumstances, the consequences which a given action may entail”.[20]

Indeed, there are other areas of concern under the Code. For example, the Code does not contain provisions for reviewing content that has been taken down. In addition to taking down unlawful content, the Code also requires all Platforms to take all reasonable steps to ensure that such content “stays down”.[21] Consequently, it is conceivable that an Agency can claim that content is unlawful, and the Platform will be forced not only to take down the content but also to take steps to prevent it from resurfacing. The only option available to affected Nigerians will be to approach the Court for a declaratory judgement that the content was not in fact unlawful.

Similarly, the definition of prohibited material to mean content or information “objectionable on the grounds of public interest, morality, order, security, peace, or is otherwise prohibited by applicable Nigerian laws” is another potential window for abuse. For the avoidance of doubt, the mere fact that content is objectionable (without being unlawful) should not be sufficient ground to remove such content, especially in light of how loosely the term “objectionable” can be interpreted.[22]

In addition, the absence of an internal complaint-handling system under the Code can be contrasted with the framework under the DSA, where users can lodge a complaint against a decision to remove content or to suspend a user.

Finally, it appears the Government is trying to regulate five different categories of information, namely misinformation,[23] disinformation,[24] harmful content,[25] unlawful content,[26] and prohibited material.[27] It is therefore conceivable that an abusive Government would be able to fit unfavourable content into at least one of these five categories and consequently take steps to remove it.

RECOMMENDATIONS

Amongst many other recommendations, we suggest that the problematic provisions identified above be amended to introduce safeguards, or removed outright, to obviate the possibility of abuse by an overreaching government and to introduce a measure of protection for the general public and the affected Platforms.

We further recommend the following:

  1. all provisions imposing general monitoring obligations be removed;
  2. the introduction of a notice-and-action framework; and
  3. the introduction of a suspension mechanism for persons/Agencies who are shown to frequently submit notices or complaints that are manifestly unfounded.

We also recommend that time be taken to study how other jurisdictions are able to effectively manage social media and hate speech so that we do not have a Code that does more harm than good to the digital ecosystem.

[1] NITDA, “Code of Practice for Interactive Computer Service Platforms/Internet Intermediaries”, available at https://nitda.gov.ng/wp-content/uploads/2022/06/Code-of-Practice.pdf (accessed 21 June 2022)

[2] NITDA, “Press Release”, available at https://twitter.com/NITDANigeria/status/1536392359977664512?s=20&t=O76Ofq9GPDtcFgkpJ5Y0gg (accessed 21 June 2022)

[3] Timi Odueso, “Nigeria seeks to regulate social media with new NITDA Code of Practice”, available at https://techcabal.com/2022/06/14/nigeria-seeks-to-regulate-social-media-with-new-nitda-code-of-practice/ (accessed 21 June 2022)

[4] Social Media Bill, available at https://guardian.ng/wp-content/uploads/2019/11/Protection-from-Internet-Falsehood-and-Manipulation-Bill-2019.pdf (accessed 21 June 2022)

[5] Hate Speech Bill, available at https://www.movedemocracy.org/wp-content/uploads/2020/10/Hate-Speech-Bill.pdf (accessed 21 June 2022)

[6] Al Jazeera, “Nigeria ends its Twitter ban after seven months”, available at https://www.aljazeera.com/economy/2022/1/12/nigeria-ends-its-twitter-ban-after-seven-months (accessed 21 June 2022)

[7] PLAC, “The National Broadcasting Cooperation (NBC) Act (Amendment) Bill”, available at https://placng.org/i/tag/the-national-broadcasting-cooperation-nbc-act-amendment-bill/ (accessed 21 June 2022)

[8] Part I(4) of the Code

[9] Part I(5), Part II(2 & 3) and Part V(7) of the Code

[10] Part II(1, 3 & 8) of the Code

[11] Part III of the Code

[12] Article 25(1) of the Proposal for a Regulation of the European Parliament and of the Council on a Single Market for Digital Services (Digital Services Act) and amending Directive 2000/31/EC (“DSA”)

[13] Article 11 DSA

[14] Classified under the Code as NITDA, NBC, NCC or any agency authorised by its enabling law.

[15] Part I(3) of the Code provides that: “All Interactive Computer Service Platforms/Internet Intermediaries (Platform) shall act expeditiously upon receiving a notice from a user, or an authorised government agency of the presence of an unlawful content on its Platform. A Platform must acknowledge the receipt of the complaint and take down the content within 24 hours.”

[16] Article 8(2) DSA

 




[17] Part V(7) of the Code provides that: “Where a false information is likely to cause violence, public disorder, or exploitation of a child, the Platform shall caution the publisher and remove the content as soon as reasonably practicable.”

[18] Recital 47 and Article 15, Directive 2000/31/EC (“E-Commerce Directive”)

[19] Article 7 DSA

[20] See generally The Sunday Times v. United Kingdom (Application no. 6538/74); Wingrove v. The United Kingdom (Application no. 17419/90)

[21] Part I(6) of the Code provides that: “All Interactive Computer Service Platforms/Internet Intermediaries (Platform) shall: Exercise due diligence to ensure that no unlawful content is uploaded to their Platform. Where a Platform receives a notice from a user or any authorised government agency that an unlawful content has been uploaded, such Platform is required to take it down and ensure it stays down. No liability shall be incurred by a Platform where such Platform has taken all reasonable steps to ensure that an unlawful content is taken or stays down.”

[22] Part IV of the Code provides that: “A Platform shall not continue to keep prohibited materials or make them available for access when they are informed of such materials.” Prohibited material is that which is objectionable on the grounds of public interest, morality, order, security, peace, or is otherwise prohibited by applicable Nigerian laws.

[23] Unintentional dissemination of false information

[24] Verifiably false or misleading information that, cumulatively, is created, presented, and disseminated for economic gain or to deceive the public intentionally and that may cause public harm

[25] Content which is not unlawful but harmful

[26] Any content that violates an existing law in Nigeria

[27] Content or information objectionable on the grounds of public interest, morality, order, security, peace, or is otherwise prohibited by applicable Nigerian laws


