Backups matter. CTOs and CIOs know basic data protection is not as mundane as it seems at first glance. Data is central to the survival and success of digital business. Losing it—even losing access for a few hours or days—affects productivity and profitability.
Cost-effective and reliable backup solutions that can scale as business grows are critical to the success of today’s digital enterprise. Content is growing exponentially and often unpredictably due to emerging technologies, platforms, and services—from video streaming to ecommerce to social media to the IoT. Ensuring that data is always available, no matter what disaster occurs, requires a backup solution that understands the specific scale, integrity, and speed requirements of your business model and industry.
Data as Currency and Fuel
Data storage must be approached as a foundational IT strategy. It impacts the core components, capabilities, and competitiveness of every digital business. Data storage and protection strategies should encompass backup, cyber security, privacy compliance, business continuity, disaster recovery, and operations (data management and availability).
Data is an asset, and increasingly, a form of currency. It requires formal, integrated security and access infrastructure, just as financial assets require banks and investment accounts. We wouldn’t accept partial security, restoration, or access to our monetary funds, so why accept it when it comes to data, which likewise carries current and potential value?
Data is also fuel for digital enterprises. The quality, availability, and pliability of data directly impact everything from the brand to the bottom line. If data drives operations, innovation, and revenue, then data loss, restricted availability, and poor performance are unacceptable. You wouldn’t, after all, let your delivery fleet run out of gas. The pressure is on, from the top down, to mature and optimize data protection and management.
The digital transformation of industry and commerce, public services, social networks and personal lifestyles means that massive amounts of data of many different types are being generated and leveraged: video, audio, high-resolution images, IoT sensor data, security logs, customer data, social media communications, documents, and more. All of it has to be organized, stored, protected, and analyzed. Where not so long ago we were dealing in gigabytes, many businesses are solidly in the petabyte era and rapidly approaching exabyte territory. Ensuring that such large (and growing) collections of data are always available, no matter what disaster occurs, requires innovative backup solutions and strategies that scale and flex as new applications, infrastructure technologies, and business models emerge.
The larger data backups become, the more difficult they are to manage and configure within acceptable time limits, so the efficiency of the backup process has become paramount. Traditional storage durability is limited, and recovery from drive failure is often too slow. The cost of dedicated backup appliances or multiple NAS filers is unsustainable at petabyte scale, and hardware lock-in is painful and debilitating. Tape degrades over time and creates inefficiencies in loading, searching, transporting, and storing. NAS is poorly utilized, with no way to avoid dead-space waste, and growth requires separately managed systems, resulting in silos after scale-up. RAID is especially impractical for large datasets: double parity affords only limited protection, and at that scale the chance of data loss during lengthy rebuilds rises to an unacceptable degree.
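To make the tradeoff concrete, here is a back-of-the-envelope comparison of usable capacity versus tolerated failures; the drive counts are illustrative assumptions, not figures from any particular product:

```python
def protection_profile(data_drives: int, parity_drives: int) -> dict:
    """Usable-capacity fraction and tolerated concurrent failures
    for a parity-protected group of drives.

    The same arithmetic applies to a RAID 6 group (data + 2 parity)
    and to a k+m erasure-coded stripe: any `parity_drives` concurrent
    failures can be survived.
    """
    total = data_drives + parity_drives
    return {
        "usable_fraction": data_drives / total,
        "tolerated_failures": parity_drives,
    }

# RAID 6 group: 8 data + 2 parity drives -> survives any 2 failures
raid6 = protection_profile(8, 2)

# A wide erasure-coded stripe: 14 data + 4 parity fragments
ec = protection_profile(14, 4)

print(raid6)  # {'usable_fraction': 0.8, 'tolerated_failures': 2}
print(ec)
```

The point of the comparison: an erasure-coded stripe can tolerate twice as many concurrent failures as double parity at roughly the same capacity overhead, and because fragments are spread across many nodes, rebuilds are parallelized instead of hammering a single spare drive.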
The explosive growth of data, the expectation of anytime/anywhere accessibility, and the ever-present need to drive down cost and complexity are combining to shift focus and investment to new infrastructure strategies. Digital businesses demand high confidence and reliability in the face of myriad challenges to the integrity, security, and usability of their data. Natural disasters, data center disasters (fire, electrical, theft, sabotage), geopolitical instability and terrorism, cybercrime and hacktivism, and energy regulations and expenses top the list of external and unpredictable threats to data and the businesses that depend on it.
Big Data and its supporting technologies present yet another set of difficulties: data becomes too big to manage, unstructured data presents unique issues, data produced by consumers and IoT scales unpredictably, and keeping a handle on vendors and costs hampers agility. Legacy systems create silos and complexity, as a hodgepodge of IT services is layered on top in an effort to modernize. Complying with an ever-shifting landscape of data privacy and security regulations at industry, state, federal, and international levels drains resources away from efforts to extract value and insight from the data.
Backup Moves Forward
Backup and recovery requirements are so critical and dynamic that Gartner has predicted significant shifts in the data center backup and recovery market over the next few years, as enterprise customers consider new vendors and approaches, as well as innovative ways to leverage backup capabilities beyond operational recovery (testing, DevOps, etc.). For example, by 2018, more than half of organizations will replace (or significantly enhance) the backup solutions they had deployed as of the beginning of 2015.
Digital enterprises in the midst of deep transformation need data storage and backup solutions that are cost-effective, reliable, and scalable. These solutions must integrate and flex with established systems as well as emerging infrastructure and use cases. They must meet increasing demands for performance, availability, protection, recovery, cost, and ease of use, and be adaptable to each organization’s unique mix of priorities and critical requirements. Doesn’t seem so mundane anymore, does it?
As demands for performance at petabyte scale intensify, storage features like hardware independence, scale-out operation, storage efficiency, and erasure coding become paramount. Integrating all these features, and more, into a unified platform that can manage huge amounts of unstructured data has compelled storage vendors to develop innovative new architectures and powerful software controllers, allowing enterprises to move away from proprietary hardware-centric solutions that don’t scale quickly or cost-effectively.
Streamlining with SDS
The need for better backup and restore solutions is a primary driver of cloud adoption. Object storage solutions have emerged as the prevailing approach for dealing with petabytes of unstructured data in private, public, and hybrid cloud platforms. Because object storage carries metadata on each file, it eases the pain of organizing and managing the huge amounts of unstructured data created by IoT, ecommerce, social media platforms, and more—while enhancing accessibility, speed, and security.
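The organizing power of per-object metadata can be illustrated with a toy in-memory store; the class and field names below are invented for illustration (real object stores expose the same idea through APIs such as S3, where each object carries user-defined metadata):

```python
from dataclasses import dataclass, field

@dataclass
class StoredObject:
    key: str
    data: bytes
    metadata: dict = field(default_factory=dict)  # travels with the object

class ObjectStore:
    """Minimal stand-in for an object store: a flat namespace of
    objects, each carrying its own descriptive metadata."""

    def __init__(self):
        self._objects = {}

    def put(self, key, data, **metadata):
        self._objects[key] = StoredObject(key, data, metadata)

    def find(self, **criteria):
        """Select objects by metadata instead of by directory path."""
        return [o for o in self._objects.values()
                if all(o.metadata.get(k) == v for k, v in criteria.items())]

store = ObjectStore()
store.put("cam7/frame-0001.jpg", b"...", source="iot-camera", site="plant-3")
store.put("logs/2017-01-01.gz", b"...", source="security-log", site="plant-3")

# One metadata query replaces walking a directory tree:
hits = store.find(site="plant-3", source="iot-camera")
print([o.key for o in hits])  # ['cam7/frame-0001.jpg']
```

Because the metadata is attached to the object itself rather than encoded in a folder hierarchy, the same unstructured data can be sliced by source, site, retention policy, or any other attribute without reorganizing storage.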
When combined with commodity hardware infrastructure, powerful software controllers using the platform-agnostic SDS approach can enable scalable object storage at significantly lower TCO than traditional backup solutions. When data management and protection are software-defined and decoupled from hardware appliances, capacity expansion, upgrades, and swap-outs are streamlined and automated. Maintenance can be performed on a rolling basis without impacting availability, and hardware can be replaced or added with no downtime.
SDS solutions not only enable hardware freedom but also provide advanced data durability mechanisms such as erasure codes designed for very large-scale data, multi-data-center deployments for assured availability and site disaster protection, and automated management efficiency at petabyte scale. Leading SDS solutions have also been integrated with popular backup, data management, and archive solutions, and can fully leverage enterprise security services such as Active Directory with Single Sign-On (SSO) and support multi-tenancy requirements. SDS provides the scale-out power and durability of object storage along with the ability to support object-enabled applications (via the de facto AWS S3 protocol), legacy file protocols including NFS and SMB, and challenging mixed workloads with both small and large data sizes.
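The durability mechanism behind erasure coding can be sketched with its simplest instance, XOR parity, which lets any one lost fragment be rebuilt from the survivors. Production systems use Reed-Solomon codes with several parity fragments to survive multiple simultaneous failures, but the reconstruction principle is the same:

```python
def encode(fragments):
    """Append one XOR parity fragment to equal-length data fragments."""
    parity = bytes(len(fragments[0]))
    for f in fragments:
        parity = bytes(a ^ b for a, b in zip(parity, f))
    return fragments + [parity]

def reconstruct(stripe, lost_index):
    """Rebuild the fragment at lost_index from the surviving fragments.

    Works because XOR-ing all fragments (data + parity) except the
    lost one yields exactly the lost fragment.
    """
    survivors = [f for i, f in enumerate(stripe) if i != lost_index]
    rebuilt = bytes(len(survivors[0]))
    for f in survivors:
        rebuilt = bytes(a ^ b for a, b in zip(rebuilt, f))
    return rebuilt

data = [b"backup-a", b"backup-b", b"backup-c"]
stripe = encode(data)                    # 3 data fragments + 1 parity
print(reconstruct(stripe, 1))            # recovers b"backup-b"
```

In an SDS cluster the fragments are spread across independent drives, nodes, or even data centers, so a failure anywhere triggers a distributed rebuild rather than a restore from a separate backup copy.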
It’s easy to see how this capacity to integrate and scale will reduce complexity, paving the way for multiple uses related to backup: disaster recovery, operational recovery, archiving, test/development, DevOps, and regulatory compliance (especially for financial services, healthcare, and public sector enterprises). It will be exciting to see what disruptive and unforeseen capabilities arise from creative combinations and deployments of SDS, object storage, and cloud technologies. Meanwhile, those still in the early stages of data storage optimization will find that backup is an essential first use case when piloting SDS and private cloud technology.
Data Makes the Difference
Data management is set to become a more prominent differentiator in the year ahead. The rapid proliferation of cloud platforms and services, the phenomenon of digital entertainment, the goldmine of advanced data analytics, the burgeoning of the IoT, the global rise of smart manufacturing—all are set to explode into mind-bending manifestations of creativity, innovation, power, and automation. That is, assuming we don’t trip over our own data: it has to be secured, accessible, and more efficiently managed if we want to maximize its potential.
As enterprises emerge from Big Data chaos with a clear strategy and more advanced tools, those who can trade on and be fueled by their valuable data will leap ahead. Those weighed down by mountains of data they can’t readily access and manipulate will fall further behind. And for those enterprises that can leverage optimized SDS architecture to face the onslaught of threats and opportunities without worrying about data loss and downtime, the sky’s the limit.