Pioneering responsible innovation across industry: Web 3.0 and metaverse environments

September 2023 - Shoosmiths LLP

Shoosmiths explore the need for regulation in the evolving landscape of Web 3.0 and the metaverse, focusing on recent approaches to regulating emerging technologies and the importance of striking a balance between flexibility and effective safeguards.

At a Web 3.0 Symposium hosted by the Digital Regulation Cooperation Forum (“DRCF”) in October 2022, one of the questions posed for UK regulators was “how can we best encourage responsible innovation in relation to decentralisation and / or distributed ledger technology applications?” Encouraging “responsible innovation” in Web 3.0 and metaverse technologies is a thornier topic for regulators than it might at first appear. It requires stakeholders and the general public to trust regulators, and a prerequisite for building this trust is an effective and transparent regulatory framework. Such regulation should help protect individuals while fostering innovation in a safe and sustainable manner, thereby establishing trust among both the public and industry. But getting to this point poses many challenges.

This article explores in more detail why regulation of the Web 3.0 and metaverse spaces is required and how it might best be implemented, and looks at the risks of a regulatory regime that is either too prescriptive (potentially inhibiting innovation) or too flexible (potentially failing to address the risks).

The Role of the Regulator

Regulators are supposed to be makers of participant-agnostic policy, responsible for setting, monitoring and enforcing standards designed to achieve objectives in a particular area. The DRCF notes that the role of the regulator is changing amid developments in new technologies: “our roles as regulators are becoming increasingly important to ensure that consumers’ and citizens’ interests are at the heart of digital innovation”. In the context of Web 3.0 and metaverse technologies, regulators will need to strike a balance between preserving the flexibility for innovation – by allowing companies and individuals to harness the benefits of decentralisation, new markets and technological advancements – and minimising the potential risks, including data breaches, fraudulent activity and market volatility. To do so, regulators must keep pace with rapidly emerging technologies and create regulatory frameworks that support this balance.

Navigating Regulatory Challenges

Regulation is required

Regulators of emerging technologies face a constant dilemma: how to balance encouraging innovation with minimising harm. While a flexible approach is often welcomed by developers, the success of emerging technologies can be predicated on there being sufficient trust in regulators among consumers and stakeholders. In recent times we have seen how this trust can be eroded. For example, the FTX collapse (described by US Attorney Damian Williams as “one of the biggest financial frauds in American history”) resulted in a significant loss of confidence in emerging digital markets and a reduced appetite for investment. One of the key issues this scandal highlighted was the need for stakeholders and consumers to consult with regulatory bodies to develop effective regulatory frameworks. Without such consultation, regulators risk creating frameworks which are not fit for purpose, or which negatively impact confidence in these emerging technologies. Conversely, if, following any such consultation, consumers or stakeholders prefer that these markets remain unregulated, they must accept the risk that, without effective regulatory frameworks, they will be afforded no protection when the next FTX-style collapse occurs. The more frequent such scandals become, the more likely it is that trust in these emerging technologies will be eroded.

Although the collapse of FTX negatively impacted trust in digital markets, it did accelerate calls for greater regulatory oversight of these types of emerging technologies. In respect of cryptocurrencies, Sir Jon Cunliffe of the Bank of England commented that “we should not wait until [the crypto world] is large and connected to develop the regulatory frameworks necessary to prevent a crypto shock that could have a…destabilising impact”.

In the same speech, Cunliffe noted that crypto technologies have the potential to improve efficiency and functionality and to reduce risk in the financial system. Despite these potential benefits, Cunliffe suggested that emerging crypto technologies such as tokenisation and encryption “will only be developed and adopted at scale when they sit within regulatory frameworks that can effectively manage their risks”. The need for regulation is true of the metaverse as well as crypto – in a 2023 survey by BCS, the UK’s Chartered Institute for IT, 81% of respondents believed that the metaverse would create new regulatory challenges, and over two-thirds said they were concerned about safety issues.

To build trust and confidence in these technologies, then, regulators must ensure that adaptable frameworks are implemented and maintained. Below, we discuss which regulatory approach may be most appropriate.

Can regulation be too prescriptive? The response to the European Union’s Artificial Intelligence Act (AI Act)

Historically, regulation has struggled to keep pace with emerging technologies. The EU’s AI Act, proposed in April 2021, aims to establish a comprehensive framework for governing artificial intelligence (AI), with one of its stated goals being to strike a balance between safeguarding fundamental rights and fostering innovation. However, the initial draft of the AI Act failed to anticipate the rapid rise of large language models (LLMs), such as the one underpinning OpenAI’s ChatGPT. The AI Act was subsequently revised to address the rise of LLMs and generative AI, imposing additional obligations on LLM developers, such as requiring them to register their products with the EU, undergo risk assessments, and meet transparency requirements around copyright.

In an open letter to the EU, over 150 executives from some of Europe’s largest companies expressed concern about the disproportionate compliance costs and liability risks placed on LLM developers, and the AI Act’s potential to “jeopardise Europe’s competitiveness and technological sovereignty”. According to the letter’s signatories, the AI Act will “almost certainly lead to highly innovative companies relocating their activities to non-European countries as well as investors withdrawing their capital from the development of European foundation models and European AI in general.”  

This situation highlights how well-intentioned efforts to regulate transformative technologies can face resistance from the very industry players they seek to reassure. The uncertainty created by what was intended to be an all-encompassing Act risks creating friction between regulators and technology companies in the short term and, if not resolved, even more damaging effects on innovation in the long term. It potentially serves as a lesson for regulators dealing with other emerging technologies like Web 3.0 and the metaverse, and emphasises the need for a balanced approach that encourages innovation without stifling it. Of course, achieving that is easier said than done.

Flexibility is key. The positive response to the Digital Securities Sandbox (DSS)

The United Kingdom’s consultation for its DSS (the first financial market infrastructure sandbox delivered under the Financial Services and Markets Act 2023) takes a different regulatory approach to the EU’s AI Act. A notable feature of the DSS is that it is designed to temporarily disapply and/or modify existing legislative frameworks, with the prospect that such modifications may become permanent if considered appropriate, without requiring the traditional (and lengthy) legislative process. Participating entities can test and scale their technologies, while regulators have the flexibility to set requirements that can be adapted as the activities within the sandbox evolve. So, instead of providing an all-encompassing legislative framework at the outset, the DSS provides a controlled environment in which to test emerging technologies and modify the legislation as and when appropriate.

This flexibility is available to UK regulators due to a fundamental difference between the UK’s and the EU’s legislative frameworks: the Financial Services and Markets Act 2023 (indirectly) gives the FCA the power to codify its statutory agenda without amendments to primary legislation, which would take years to agree at a political level and for the UK courts to interpret; the EU institutions do not have this luxury.

The flexibility afforded by the DSS has been welcomed by stakeholders, who have cited it as a “significant potential strength”. In its response to the DSS consultation paper, UK Finance called this flexibility a key differentiator from the EU’s own distributed ledger technology pilot regime, writing that the EU’s prescriptive approach has “significantly constrained [the DLT pilot regime’s] attractiveness to institutional investors, as well as its ability to truly support innovation”.

The DSS highlights the importance of regulators remaining flexible when implementing frameworks to encourage and facilitate innovation in emerging technologies. This does not, however, mean adopting a laissez-faire approach. In the absence of a robust regulatory framework, there is a risk that the digital markets created by Web 3.0 and metaverse technologies become a “Wild West” frontier where criminal activities such as fraud (see the FTX example above), Ponzi schemes and social-engineering scams flourish, and the severity of already rampant issues like hate speech and misinformation escalates.

Conclusion

Regulating emerging technologies like those in Web 3.0 and the metaverse is crucial, but doing so effectively presents a tricky balancing act. As the FTX collapse shows, rules are needed to deter “bad actors” from exploiting new and innovative technologies for illicit or criminal activities that leave innocent users to deal with the consequences (financial or otherwise) without any support. However, overly strict regulations, as may prove to be the case with the EU’s AI Act, can spook developers and investors and potentially restrict innovation.

More flexible approaches, like the one taken by the UK’s DSS, seem to strike a better balance. The DSS’s approach allows regulations to be tested and adapted as the technology evolves, and it has been praised for its potential to support innovation without stifling it. However, it remains to be seen whether the actual implementation and use of the financial market infrastructure sandboxes under the DSS garner the same positive support. We are hopeful that they will.

Ultimately, for Web 3.0 and the metaverse to take off as viable technologies at scale, developers, regulators and other stakeholders must be able to build a sufficient level of trust in them. This will require the implementation of effective, adaptable regulation that adequately protects the public while continuing to encourage innovation. Finding the right balance is challenging, but vital for the safe and sustainable growth of Web 3.0 and metaverse technologies in the UK and beyond.
