Digital transformation has accelerated significantly over the past year as organizations move to capitalize on changes brought about by the pandemic. However, looking too far toward the horizon can be costly if the price is neglecting the technology that powers day-to-day operations. In some industries, if those daily operations fail, downtime can cost up to $50,000 per minute, or $3 million per hour.
The rise in cyberattacks, storage platform bugs, and the inevitability of natural disasters have made disaster recovery and backup plans essential for businesses. Typically, enterprises run a two-layered backup system: a primary recovery source, usually located on-premises, and a secondary system, created either on-premises at a different data center or in the cloud. If a disruption knocks out the primary storage system, replication to the secondary system usually keeps the business running.
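The two-layer flow described above can be sketched at the file level. This is a minimal, illustrative example, not any particular vendor's replication mechanism; the directory layout and the checksum verification step are assumptions added for the sketch.

```python
import hashlib
import shutil
from pathlib import Path

def file_digest(path: Path) -> str:
    """Return the SHA-256 digest of a file's contents."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def replicate(primary: Path, secondary: Path) -> list[Path]:
    """Copy every file from the primary store to the secondary store,
    verifying each copy against its SHA-256 checksum."""
    copied = []
    secondary.mkdir(parents=True, exist_ok=True)
    for src in primary.rglob("*"):
        if not src.is_file():
            continue
        dst = secondary / src.relative_to(primary)
        dst.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(src, dst)  # preserves timestamps along with contents
        if file_digest(src) != file_digest(dst):
            raise IOError(f"checksum mismatch replicating {src}")
        copied.append(dst)
    return copied
```

In a real deployment this loop would run continuously or on a tight schedule, so the secondary system stays close enough to current to take over when the primary is knocked out.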
However, if disaster strikes and both systems fail, as can happen in a ransomware attack or through accidental or malicious deletion, the organization faces serious risk unless it had the foresight to create a third copy as part of a more comprehensive, risk-averse approach. Creating this third copy is more than a matter of simple duplication. There are typically two approaches to third-copy data protection.
The first approach is to build a third, additional system, often on a different technology so that it can replicate on a different schedule and remain separate from the primary and secondary systems. The second, and arguably more effective, approach is to keep a “golden” copy of critical data in the cloud, be that public, private, or on-premises cloud.
The Advantages of Creating a Golden Copy
Creating a golden copy is a cost-effective and straightforward way to maintain business continuity in the event of a disaster or data breach. Identifying and singling out business-critical data allows organizations to keep storage costs down. Once this data is identified, it can be isolated in a “bunker” in the cloud, completely separate from the primary and secondary storage systems, creating an air gap that prevents ransomware or other corruption from reaching this third copy. In the event of an interruption, the data can be restored at a granular level to any network-attached storage (NAS) or object system. Alternatively, businesses can point their applications directly at the golden copy while they work to bring their primary systems back online.
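The selection and granular-restore steps above can be sketched as follows. The tag-based classification (`CRITICAL_TAGS` and the `tags` mapping) is a hypothetical scheme invented for illustration; real deployments would rely on a data-classification tool to decide what counts as business-critical.

```python
import shutil
from pathlib import Path

# Hypothetical labels that mark data as business-critical.
CRITICAL_TAGS = {"finance", "customer"}

def build_golden_copy(source: Path, bunker: Path,
                      tags: dict[str, set[str]]) -> list[str]:
    """Copy only files tagged business-critical into the isolated bunker.
    `tags` maps each file's relative path to its set of labels."""
    stored = []
    bunker.mkdir(parents=True, exist_ok=True)
    for rel, labels in tags.items():
        if labels & CRITICAL_TAGS:  # keep the bunker small and cheap
            dst = bunker / rel
            dst.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(source / rel, dst)
            stored.append(rel)
    return stored

def restore_file(bunker: Path, target: Path, rel: str) -> Path:
    """Granular restore: pull a single file back from the bunker."""
    dst = target / rel
    dst.parent.mkdir(parents=True, exist_ok=True)
    shutil.copy2(bunker / rel, dst)
    return dst
```

Copying only the tagged subset is what keeps the storage cost of the third copy down, and the per-file restore path is what makes recovery granular rather than all-or-nothing.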
Another advantage of the golden copy approach is that only a limited number of people have access to the bunker. Beyond technical threats such as bugs and malware, there is always the danger of user error; employees themselves cause problems, albeit usually unintentionally. One of the most significant benefits of the approach is that nothing in the primary data centers or cloud indicates the bunker exists, so its data cannot be identified from there. By keeping the location secret and allowing only a few personnel to know of its existence, organizations protect themselves against both internal and external threats.
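The restricted-access principle amounts to allowlist gating on every bunker operation. A tiny sketch, assuming a hypothetical operator list; real systems would tie this to an identity provider rather than a hard-coded set:

```python
class BunkerAccessError(PermissionError):
    """Raised for any request from outside the operator allowlist."""

# Hypothetical: the few personnel who know the bunker exists.
BUNKER_OPERATORS = frozenset({"alice", "bob"})

def bunker_operation(user: str, action: str) -> str:
    """Gate every bunker action behind the operator allowlist. The error
    message deliberately reveals nothing about the bunker itself."""
    if user not in BUNKER_OPERATORS:
        raise BunkerAccessError("operation not permitted")
    return f"{action} executed by {user}"
```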
In all, protecting business-critical data from the dangers that threaten it today requires multiple layers of strategic protection. A golden copy makes it possible to keep a business running in the face of the myriad threats to its data. Organizations should always be prepared for the worst, and with the simple step of implementing a golden copy, they can be.