
2024: Regulation and Cyber-Surveillance, a Pivotal Year for the Cloud Market

While 2024 is undoubtedly the year of generative AI, it will also be decisive for cloud providers, both through their direct contribution to the rise of AI and through the creation or modification of certain regulations.

In both Europe and the United States, several texts are currently under discussion that will have major impacts on the global cloud landscape, with practical implications for the technological choices of European organizations. Now more than ever, decision-makers need to adopt a global approach to risk management in order to anticipate change.

The European Union is trying to obtain its Member States’ agreement on a European certification scheme and to harmonize the various national “trusted cloud” certifications, such as SecNumCloud, supported by ANSSI in France. The aim is twofold: a clear, harmonized level of trust for all European organizations that use cloud services, and less complexity for operators currently subject to fragmented requirements that differ from one Member State to another.

However, the issues at stake in this scheme are far more political than technical, particularly regarding immunity from extraterritorial laws. Whether or not such a clause is included, as it is in the French SecNumCloud (v3.2), will play a pivotal role in the development of the cloud market in Europe.

In the United States, the extension of Section 702 of the Foreign Intelligence Surveillance Act (FISA) until April 2024, and the uncertainties surrounding future changes – with two texts under discussion as potential replacements – add another layer of complexity and uncertainty. As a reminder, FISA governs procedures for the physical and electronic surveillance of foreign natural and legal persons. Amended in 2008, it allows the US government, via Section 702, to monitor electronic communications abroad with the mandatory assistance of service providers.

Section 702 owes its fame to the Edward Snowden affair, which revealed the existence of several mass surveillance programs, such as PRISM, which enabled the NSA to access the communications of foreign Internet users located outside the United States – all through companies like Microsoft, Apple, Google, Facebook and Skype.

Section 702 has a direct impact on cloud players, as it allows the US government to require these “electronic communication service providers” (ECSPs) to disclose the information they host. The information that US authorities can request is not limited to servers located in the United States; it extends to all servers operated by service providers domiciled in the United States. Hosting location therefore affords no protection.

As for encryption, there are serious doubts about its ability to provide effective protection, as illustrated by the claim that US agencies have the capacity and computing power to break any encryption solution. To quote the French Member of Parliament Philippe Latombe: “Encryption is a bit like putting an armored door on your apartment. It’s harder to break down, but you can still get in.” And beyond encryption, what about managing the risks of technological dependence? The Americans have already demonstrated their ability to react forcefully to trade and political disputes by denying China access to the semiconductor market. Is a scenario in which the United States taxes, or even shuts down, US services to Europeans really inconceivable? And what about after the next US presidential election?

2024 is the year of an increasingly shared realization that we can no longer neglect these regulatory issues. Cloud projects, along with data, AI and cyber projects, can no longer be seen as purely technological issues. They are also regulatory and strategic.

Current and future regulations, and the toing and froing between the United States and Europe – as evidenced by the third version of the agreement on data transfers, which is already back before the European Court of Justice – are creating a gray area, rendered all the more complex and opaque by the fact that technological innovations are likely to make the regulatory environment obsolete.

Take, for example, the right to be forgotten, enshrined in European law, and set it against generative AI, which is trained by ingesting massive volumes of data. What happens to your right to be forgotten once your data has been absorbed by a generative AI? By its very nature, an artificial intelligence is incapable of forgetting.

All these questions remind us that in addition to major technological upheavals, we are also confronted with crucial political and strategic choices. With the European elections just around the corner, it is high time these issues were given their rightful place in the campaign. The greatest danger of all would be to do nothing.
