Because the price of public cloud storage is so low, there's a tendency in IT never to perform a data purge. This is a mistake: data sprawl can quickly become unmanageable, leading to serious operational errors, complicating legal compliance and eroding the cloud's agility. It also drives up your cloud storage costs.
Older files are often the most difficult to properly purge from public cloud storage. But aside from any potential compliance issues, it's good to delete these files once they no longer provide value. The challenge is to determine the point at which you could, and should, remove them.
Build a data purge policy for cloud
The answer is to purge by policy. If an organization classifies all its cloud data into, say, 30 categories, it can apply a different time-to-deletion policy to each. This automated, category-aware approach is more discriminating than simply deleting every file after a fixed age.
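A category-to-retention mapping like this is simple to express in code. The sketch below is illustrative: the category names and retention periods are hypothetical, and a `None` retention period marks a never-purge category such as regulated records.

```python
from datetime import datetime, timedelta, timezone
from typing import Optional

# Hypothetical retention periods per data category, in days.
# None means "never purge" (e.g., records covered by regulation).
RETENTION_DAYS = {
    "app-logs": 90,
    "temp-exports": 30,
    "marketing-assets": 730,
    "regulated-records": None,
}

def should_purge(category: str, last_modified: datetime,
                 now: Optional[datetime] = None) -> bool:
    """Return True if an object in `category` has outlived its policy."""
    days = RETENTION_DAYS.get(category)
    if days is None:  # unknown or never-purge category: keep it
        return False
    now = now or datetime.now(timezone.utc)
    return now - last_modified > timedelta(days=days)
```

Note that an unknown category falls through to "keep," which is the safe default: a purge process should never delete data it can't classify.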
A data purge policy for cloud should first classify the files that IT teams must never remove. Some of these fall under the Health Insurance Portability and Accountability Act, the Sarbanes-Oxley Act or other national laws. Other examples include engineering computer-aided design files, and the media files and raw footage held by media companies.
Beyond that, decisions around a cloud data purge should hinge on business relevance. Most data is valueless after some period of time, so set policies for data types or user groups based on value. Create policies for outgoing employees, for apps that are no longer available and for files that are subject to daily, weekly or monthly updating -- chances are, you don't really need seven years of log files.
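Major providers let you encode this kind of rule directly in the storage service. As one example, an Amazon S3 lifecycle rule can expire frequently refreshed data such as logs after a set number of days; the bucket name, prefix and 90-day period below are illustrative, not a recommendation.

```python
# A lifecycle rule expiring objects under a "logs/" prefix after 90 days.
# Prefix, rule ID and retention period are illustrative.
lifecycle_rules = {
    "Rules": [
        {
            "ID": "expire-old-logs",
            "Filter": {"Prefix": "logs/"},
            "Status": "Enabled",
            "Expiration": {"Days": 90},
        }
    ]
}

# Applying it with boto3 would look like this (requires AWS credentials,
# so it is left commented out here):
#   import boto3
#   s3 = boto3.client("s3")
#   s3.put_bucket_lifecycle_configuration(
#       Bucket="my-app-data",                     # hypothetical bucket
#       LifecycleConfiguration=lifecycle_rules,
#   )
```

Once the rule is attached to the bucket, the provider enforces the deletion schedule for you, so the policy keeps working without a recurring manual cleanup job.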
Even with policies in place, be careful with the purging process. One common issue is distinguishing between backup data and archives. Public cloud storage often comes in three tiers: primary networked storage at the top, followed by backup and then archiving for cold storage. It might make sense to move data down a tier, but these lower tiers tend to come with extended contract periods or early deletion fees -- all of which factor into the purging times you set for files. One option is to create multiple backup/archive storage buckets, each with an extended contract and its own unique purge time, rather than just tier and delete continuously. While this might complicate management, it should assure minimum cost.
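The early-deletion math above is worth making concrete. Many archive tiers bill a prorated charge for the unfinished portion of a minimum storage period; the helper below sketches that common model, with the rates and the 90-day minimum as illustrative values rather than any specific provider's price list.

```python
def early_deletion_cost(size_gb: float, monthly_rate: float,
                        min_days: int, days_stored: int) -> float:
    """Prorated charge for deleting before the minimum storage duration.

    Mirrors the common cloud billing model: you are billed for the
    remaining days of the minimum period. Rates are illustrative.
    """
    remaining = max(0, min_days - days_stored)
    return size_gb * monthly_rate * (remaining / 30)

# e.g., 500 GB in an archive tier at $0.004/GB-month with a 90-day
# minimum, deleted after 30 days: billed for the remaining 60 days.
fee = early_deletion_cost(500, 0.004, 90, 30)  # → 4.0 (dollars)
```

Running this kind of calculation when you set purge times shows whether it's cheaper to let an object age past the minimum period before deleting it, which is exactly the trade-off behind the multiple-bucket approach.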
Many backup services include one or more size reduction techniques, such as compression, deduplication or global data reduction. These cut storage and transfer costs, so apply them to your stored data whether it lives in the cloud or in-house. Smaller data sets also mean smaller charges if your cloud service levies early deletion fees.