When we think of cybersecurity, we think first about protecting our data rather than about managing it. Being hacked is one risk, but not knowing what your data contains, where it is stored or what access rights are attached to it is a risk that is potentially just as serious. All these questions can be condensed into one: should a given piece of data be destroyed, and if so, when and how?
The more information you store, the more you increase your attack surface, your legal liability in the event of a data leak and the likelihood of being in breach of personal data protection laws, not to mention rising management and storage costs. So why take risks with data that no longer serves any purpose?
These issues are becoming increasingly complex as businesses migrate their data to the cloud, as Blancco points out in its study published in March 2023 entitled “Data at a Distance: How Cloud Migration Affects Data Classification, Minimization and Disposal”. The company, which specialises in end-of-life data management, examined the impact of cloud computing on this issue.
It focused on two sectors where data constraints are particularly stringent: healthcare and finance. Given the highly confidential nature of the information processed and the high stakes involved (especially financial ones), the legal and regulatory framework governing data in both these sectors is particularly strict.
The cloud, a solution, but also a cause of data inflation
For this reason, Blancco believes that these sectors represent the state of the art in end-of-life data management. In all, no fewer than 1,800 decision-makers working in this area were surveyed across six countries: the United States, Canada, the United Kingdom, France, Germany and Japan. They were evenly split between the healthcare and finance sectors.
Cloud storage is easy to access, not particularly expensive and has become very popular: 51% of the organisations surveyed have already migrated entirely to the cloud, 11% are considering doing so, and 37% host part of their data in the cloud. One of the side-effects of this craze is that companies have increased the volume of data they store by 69%, as Blancco reveals in its study.
This data inflation often makes data management more difficult and clashes head-on with the goal of reducing data volumes so they can be properly managed. The digitalisation of processes, which simplifies the collection and processing of data, pushes in the same direction: “65% of respondents said that the switch from analogue to digital had increased the amount of redundant, obsolete or trivial (ROT) data collected, processed and stored”. As a result, almost two-thirds of respondents said they wanted to rethink how they assess which data to remove from their databases.
There is considerable room for improvement in this area, says Blancco, which believes that every company should have an effective data classification system and be capable of reducing its data volume and permanently erasing end-of-life data.
The burden of legal and regulatory constraints
However, only just over half of these companies have a mature classification system, and many settle for basic deletion of end-of-life data rather than certified destruction with traceability, which would give them full technical and legal guarantees. Furthermore, 28% of those questioned simply assign each piece of data a lifespan that defines when it should be destroyed.
The problem is that this seemingly simple solution lacks finesse and does not generally take into account the numerous laws that apply to these sectors, such as the GDPR and, more recently, the California Privacy Rights Act, to name but two. Each country has its own legislation, and there are often sector-specific regulations, too. This is why effective classification is so important.
Cloud computing apparently does not help with classification, since 65% of the companies surveyed believe it is easier to do on premises. And contrary to what 24% of respondents believe, migrating to the cloud does not actually streamline end-of-life data management.
Quite the opposite in fact, as Blancco points out: “It also brings with it certain data governance challenges, in particular obstacles affecting the way in which cloud users’ data can be destroyed in an effective and compliant way when necessary, while generating proof that this has been done”.
Destruction in the cloud requires finesse
Remote storage providers do not necessarily offer data destruction services. Yet 65% of the managers Blancco surveyed felt they could “trust [their] public cloud provider to manage end-of-life data appropriately” on their behalf, leaving 35% sceptical. These doubts mainly reflect customers’ lack of knowledge of the steps their providers are actually taking to manage the issue.
And while cloud providers generally mention data destruction services in their contracts, these processes do not necessarily meet the highest standards. The US NIST 800-88 guidelines are the de facto benchmark in this area: they call for data to be overwritten with random data, and for the erasure to be verified and certified. They also recommend verifying and certifying cryptographic erasure.
These methods are well suited to secure erasure in a cloud environment. Whether in a private cloud, where servers may be difficult to access, or in a public cloud with even more constraints, physical destruction is not an option. In the cloud’s complex physical environment, cryptographic erasure offers one of the strongest guarantees: files are encrypted and their decryption key is then destroyed, making them impossible to recover, with the entire process closely monitored and documented. Destroying both the encryption key and the encrypted data provides the strongest assurance of all.
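The principle of cryptographic erasure can be illustrated with a minimal sketch. The cipher below is a toy stream cipher built from SHA-256 purely for illustration (real systems use a vetted cipher such as AES); the point it demonstrates is that once the key is destroyed, the ciphertext alone is unrecoverable:

```python
import hashlib
import secrets

def keystream(key: bytes, length: int) -> bytes:
    # Derive a pseudo-random keystream from the key (counter mode over SHA-256).
    # Illustrative only: production systems use a vetted cipher such as AES.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    # XOR with the keystream; applying it twice with the same key decrypts.
    return bytes(a ^ b for a, b in zip(plaintext, keystream(key, len(plaintext))))

# Data is always stored encrypted; the key is the only secret.
key = secrets.token_bytes(32)
record = b"account-id=4711; balance=..."
stored = encrypt(key, record)

# Normal access: decrypt with the key.
assert encrypt(key, stored) == record

# Cryptographic erasure: destroy the key; the ciphertext may remain on disk.
key = None
# Without the key, `stored` is indistinguishable from random noise,
# so the record is effectively unrecoverable -- equivalent to secure deletion.
```

Because no physical media needs to be touched, this approach works identically whether the ciphertext sits on an on-premises disk or is scattered across a public cloud provider’s storage layer.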
Healthcare and finance could do better
“The resulting traceability is important for ensuring accountability when information is destroyed. It proves that data has been completely erased for compliance purposes, provides a legal defence in the event of a breach, establishes that the chain of custody has been preserved, and more,” stresses Blancco.
Blancco’s survey provides mixed results on this point, with 63% of respondents stating that they use software data erasure with traceability to ensure their end-of-life data is destroyed. This encouraging result would be excellent if the survey had not focused on finance and healthcare, two sectors that should be beyond reproach in this area given their regulatory obligations.
Another area for improvement highlighted by the study is that 22% of participants have an audit log, but it is not certified, “which is close to best practice, but nevertheless leaves organisations exposed to misrepresentation or a lack of confidence in the log data”. What’s more, 59% of managers admit to using simple data deletion at least some of the time.
Deleting is not erasing
The study naturally concludes by stressing the need to implement and adhere to Blancco’s recommended best practices:
- Classify existing data and data generated by the company as part of an ongoing process so you know the data inventory and lifespan associated with each type of data in real time. You then know which data to destroy and when, in compliance with the laws and regulations relating to each type of data.
- Destroy end-of-life data using the best methods for each type of hosting environment. In the cloud, Blancco favours cryptographic erasure.
- Document the erasure of data so that you can always prove that you have taken the necessary steps to comply with legal obligations.
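The documentation step could be sketched as a tamper-evident audit log: each erasure certificate is chained to the previous one by hash, so removing or altering an entry is detectable. The field names and the `log_erasure` helper below are illustrative assumptions, not any particular product’s API:

```python
import hashlib
import json
from datetime import datetime, timezone

def log_erasure(log: list, asset_id: str, method: str) -> dict:
    # Chain each certificate to the previous one so tampering is detectable.
    prev_hash = log[-1]["entry_hash"] if log else "0" * 64
    entry = {
        "asset_id": asset_id,
        "method": method,  # e.g. "cryptographic erasure"
        "erased_at": datetime.now(timezone.utc).isoformat(),
        "prev_hash": prev_hash,
    }
    entry["entry_hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()
    ).hexdigest()
    log.append(entry)
    return entry

audit_log: list = []
log_erasure(audit_log, "vm-disk-042", "cryptographic erasure")
log_erasure(audit_log, "archive-bucket-2019", "cryptographic erasure")

# Verifying the chain proves no certificate was altered or removed.
for i, e in enumerate(audit_log[1:], start=1):
    assert e["prev_hash"] == audit_log[i - 1]["entry_hash"]
```

A log of this kind is what turns “we deleted it” into evidence: each entry records what was destroyed, how and when, and the hash chain preserves the chain of custody the study refers to.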
To achieve these objectives, you generally need to audit and review your internal processes. But you should also check with your cloud providers to find out whether they can comply with your new requirements.