Ah yes, a hint of spring is in the air. The days are getting longer and warmer. Soon I will be able to indulge in one of my favorite pastimes, fishing. Fishing, as a sport, is always a challenge. Fish respond to a multitude of factors like water temperature, type of forage, sunlight and ever-changing stream conditions. One must weigh a variety of factors in order to be successful in actually catching a fish. Simply put, one must think like a fish to catch a fish. You need to assess the physical environment, the water temperature, amount of sunlight, stream conditions, the aquatic insects that serve as food and the depth of the water.
So why is protecting your data any different? Data protection, like fishing, is a challenge. The amount of data is never static or unchanging. Those whose job it is to architect a coherent data protection strategy, primarily CIOs, must take into consideration a wide range of factors to cost-effectively manage and protect their data as well as ensure a consistent, repeatable recovery strategy. Are they managing physical, virtual or cloud infrastructure? Is the cloud infrastructure public or private? What are the recovery time objectives (RTO) and recovery point objectives (RPO)? On top of these considerations is the application environment they must support.
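To make the RPO/RTO distinction concrete: the RPO bounds how much data you can afford to lose (the gap between your last good backup and the outage), while the RTO bounds how long you can afford to be down. Here is a minimal sketch, in Python with hypothetical timestamps, of checking both after an incident:

```python
from datetime import datetime, timedelta

def meets_objectives(last_backup, outage_start, service_restored,
                     rpo, rto):
    """Check whether an outage stayed within the agreed RPO and RTO.

    RPO bounds acceptable data loss: time between the last good
    backup and the start of the outage.
    RTO bounds acceptable downtime: time between the outage and
    the moment service is restored.
    """
    data_loss_window = outage_start - last_backup
    downtime = service_restored - outage_start
    return data_loss_window <= rpo, downtime <= rto

# Hypothetical incident: nightly backups (24-hour RPO),
# 4-hour recovery target (RTO).
rpo_ok, rto_ok = meets_objectives(
    last_backup=datetime(2014, 3, 1, 2, 0),
    outage_start=datetime(2014, 3, 1, 14, 0),
    service_restored=datetime(2014, 3, 1, 17, 0),
    rpo=timedelta(hours=24),
    rto=timedelta(hours=4),
)
print(rpo_ok, rto_ok)  # True True: 12h loss window, 3h downtime
```

Tightening either objective, say moving to hourly backups or a one-hour recovery target, immediately raises the cost and complexity of the protection architecture, which is exactly the trade-off a CIO must weigh.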
In today’s dynamic IT environment, CIOs must weigh and assess a variety of considerations, particularly when data growth in volume, velocity and variety continues unabated. The proliferation of virtual servers and applications makes the task of architecting a coherent protection and recovery strategy extremely challenging. Protecting data is a must; ensuring timely and consistent recovery is no longer a luxury. Firms that do not have clearly defined protection and recovery strategies in place run the risk of being at a competitive disadvantage. You see, data is the new currency. Any data loss is unacceptable and potentially detrimental to a firm’s bottom line, credibility or, if it is a public company, market value. Exposure to data loss is a risky proposition for any IT administrator, but particularly for a CIO, whose job it is to safeguard the firm’s critical assets.
In this new era of highly virtualized infrastructure, CIOs must have architectures in place that are flexible enough to span physical, virtual and cloud environments. Data protection and recovery become a significant challenge, especially when protecting critical applications such as Exchange or SQL Server. The application environment is ever-changing and now highly virtualized. Moreover, application owners want the shortest possible RTOs and RPOs. They also want self-service recovery, to ensure they can restore their data rapidly in the event of an unforeseen outage. As a result, greater emphasis is placed on rapid, repeatable recovery of applications and data. More importantly, a flexible and consistent disaster recovery plan needs to be in place. This places an enormous burden on CIOs to bridge legacy infrastructure, applications and processes so that recovery is consistent and repeatable and scales to cover a greater number of applications.
In essence, a CIO needs new tools and processes to ensure recovery in this new era of rapidly virtualizing, cloud-connected environments with expansive and unabated data growth. Today, the process of protection, recovery and disaster recovery makes catching a fish look easy indeed. CIOs must consider solutions that align more closely with their business objectives and allow them to reduce costs, save administrative time and reduce their exposure to data loss and unnecessary risk. The same old tools and processes will simply not protect many organizations’ critical data assets or ensure rapid recovery in the event of a disaster. CIOs are now in the unenviable position of safeguarding their organization’s data. If they fail, they may be asked to go on an extended fishing vacation or jump off a short pier.