
Executive Viewpoint 2019 Prediction: DataCore Software

Hyperconverged Becomes Hybrid-Converged

In a recent State of Software-Defined Storage, Hyperconverged and Cloud Storage market survey, respondents said they are ruling out hyperconverged because it does not integrate with existing systems (creating silos), cannot scale compute and storage independently, and is too expensive.
The hyperconverged market is maturing and users are now demanding hyperconverged systems that better meet their needs. As a result, vendors are creating optimized solutions that effectively match those requirements, and many of the traditional hardware vendors in the hyperconverged infrastructure (HCI) market are shifting to a software offering.

Additionally, there has been an overall evolution from the earlier vision of "hyperconverged," which primarily consisted of converging compute, storage and networking in a single hardware unit, into a software-driven, software-defined model called "hybrid-converged." Hybrid-converged infrastructure provides the same advantages as HCI with additional functionality that allows it to connect to external hosts and to present external storage to the unit. Essentially, this breaks the silo approach and allows compute and storage to scale independently.
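
To make the distinction concrete, here is a minimal, purely illustrative Python sketch; the class and method names are hypothetical and do not represent any vendor's actual API:

```python
# Purely illustrative: how a hybrid-converged node differs from a classic
# HCI node. All names are hypothetical, not any product's API.

class HybridConvergedNode:
    def __init__(self, local_disks):
        self.pool = list(local_disks)   # internal storage, as in classic HCI
        self.consumers = []             # local VMs plus, crucially, external hosts

    def add_external_array(self, array):
        """Pool storage from an existing external array instead of siloing it."""
        self.pool.append(array)

    def present_volume(self, host, size_gb):
        """Serve a virtual volume to any host, local or external."""
        self.consumers.append((host, size_gb))
        return f"{size_gb} GB volume presented to {host}"

node = HybridConvergedNode(local_disks=["nvme0", "nvme1"])
node.add_external_array("legacy-san-01")             # breaks the silo
print(node.present_volume("external-esxi-07", 500))  # storage scales independently
```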

As a result, users no longer have to choose whether to buy into the HCI model; they can have the benefits without the limitations. HCI becomes a building block for a modern data center, but not necessarily a model that requires discrete HCI systems.

Software-Defined Storage Will Play a Pivotal Role in the Modern Data Center

The line between 'traditional' data centers and the cloud has been blurring, as the former now consist mostly of virtualized resources in co-location environments. The evolution toward software-defined infrastructure, whether that encompasses virtualized resources or public cloud, is what will give IT flexibility and freedom.

The first step to modernizing the data center is to break silos, achieve vendor independence, and remove vendor-imposed refresh cycles. This is why software-defined technology is the foundation of the modern data center. The first wave, software-defined compute (virtualization), is well established. Software-defined networking (SDN) and security are emerging but still immature. Software-defined storage is mature and established, and is growing quickly in adoption.

Hardware refresh cycles in particular represent one of the most challenging aspects for IT, especially for storage hardware. Typically, an IT department goes through a storage refresh cycle every three to five years, though in some cases the hardware can be used longer. Software-defined storage is flexible enough that new storage and technologies, whether all-flash arrays (AFAs), NVMe, containers or cloud storage, can be added non-disruptively.

With software-defined storage, the latest and greatest hardware can easily be integrated into the environment as soon as it comes out, helping to modernize the data center and increase infrastructure agility; there is no need to rip and replace when new technologies arrive. Software-defined storage also makes it easier to adopt different types of storage (direct-attached, arrays, JBODs, NVMe, etc.) and new configurations like hyperconverged or hybrid-converged.
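
As an illustration of what a non-disruptive refresh can look like under the hood, here is a hedged conceptual sketch in Python; the pool structure and function names are invented for this example, not a product API:

```python
# Illustrative sketch of a non-disruptive refresh: the virtual volume keeps
# its identity while data moves from an aging device to a new one beneath it.

pool = {"old-array": ["blk0", "blk1", "blk2"]}

def add_device(device):
    pool[device] = []                   # joins the pool; hosts see no change

def migrate(src, dst):
    """Move data in the background; hosts keep addressing the virtual volume."""
    pool[dst].extend(pool.pop(src))

add_device("new-nvme-shelf")
migrate("old-array", "new-nvme-shelf")  # old hardware can now be retired
print(pool)  # {'new-nvme-shelf': ['blk0', 'blk1', 'blk2']}
```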

Additionally, the modern data center will need to incorporate storage technologies that support synchronous mirroring in local and metro clusters, asynchronous replication for disaster recovery, and continuous data protection (CDP), which acts like a time machine to undo damage from ransomware attacks. The most important aspect of ensuring availability is recovery, and that recovery should be instantaneous and automatic, with zero-touch failback and rebuild.
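
To illustrate the "time machine" idea, here is a minimal conceptual sketch of a CDP journal in Python; everything here is invented for illustration rather than taken from any product:

```python
# Minimal sketch of continuous data protection (CDP): every write is
# journaled, so a volume can be rebuilt as of any instant before, say,
# a ransomware attack. Purely conceptual.

journal = []          # (timestamp, block, data) for every write, in order

def write(ts, block, data):
    journal.append((ts, block, data))

def restore_as_of(ts):
    """Replay journaled writes up to ts to recover a pre-attack image."""
    volume = {}
    for t, block, data in journal:
        if t <= ts:
            volume[block] = data
    return volume

write(100, "blk0", "payroll.xlsx")
write(200, "blk0", "ENCRYPTED-BY-RANSOMWARE")  # the damaging write
print(restore_as_of(150))                      # {'blk0': 'payroll.xlsx'}
```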

As IT departments look to reap the benefits of the software-defined data center, the advantages of software-defined storage will be quickly realized in terms of performance, uptime and flexibility. This will help them spend less time on repetitive tasks and expand the technology to cover more of their IT footprint, including additional workloads or data centers.

As Container Adoption Continues to Grow, New Storage Challenges Will Emerge

Containers bring unprecedented mobility, ease and efficiency for rapidly deploying and updating applications. They are here to stay, and as adoption continues to grow, containers will play a big role in IT in the coming years, following an adoption path that is much faster than that of virtualization or the cloud.

As the technology matures, containers face a few challenges, mainly around security and storage. The State of Software-Defined Storage, Hyperconverged and Cloud Storage market survey reported that users encountered the following surprises after implementing containers: 1) a lack of data management and storage tools; 2) application performance slowdowns, especially for databases and other tier-1 applications; and 3) a lack of ways to deal with applications, such as databases, that need persistent storage.

As deployments move from evaluation and testing to production, IT organizations need the ability to deliver the same data storage services currently provided to monolithic application architectures. More importantly, a solution has to be capable of providing shared storage to existing virtualized and bare-metal application infrastructures. It must also allow DevOps engineers to consume storage on demand, ensure that stateful application data is persistent, and provide the same levels of availability and performance as traditional application infrastructures receive today.

Software-defined storage can enable administrators to present persistent storage to container hosts deployed as VMs on virtual hosts, with the ability to provide persistent storage to container hosts deployed on bare metal as a next step. The presentation of persistent storage should be done through the native controls of orchestration solutions such as Kubernetes, and should leverage advanced storage capabilities like CDP, auto-tiering and synchronous mirroring.
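
As a concrete, hedged example of consuming SDS-backed storage through Kubernetes-native controls, the sketch below uses the official kubernetes Python client to request a PersistentVolumeClaim; the "sds-mirrored" StorageClass name is an assumption, standing in for whatever class an SDS CSI driver would register:

```python
# Hedged sketch: a DevOps engineer consumes SDS-backed persistent storage
# on demand through native Kubernetes controls. The "sds-mirrored"
# StorageClass is hypothetical (e.g., one backed by synchronous mirroring).
from kubernetes import client, config

config.load_kube_config()

pvc = client.V1PersistentVolumeClaim(
    metadata=client.V1ObjectMeta(name="orders-db-data"),
    spec=client.V1PersistentVolumeClaimSpec(
        access_modes=["ReadWriteOnce"],
        storage_class_name="sds-mirrored",  # hypothetical SDS-backed class
        resources=client.V1ResourceRequirements(requests={"storage": "20Gi"}),
    ),
)

# The claim is all the engineer writes; the storage layer provisions a
# volume on demand and keeps stateful data persistent across container
# restarts and rescheduling.
client.CoreV1Api().create_namespaced_persistent_volume_claim("default", pvc)
```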

As a result, users will be able to manage the provisioning of storage to container deployments with the same platform as the rest of their application workloads, and provide the level of enterprise storage services required for all critical production environments. This will help to further advance the ongoing adoption of containers.

Software-Defined Storage Will Accelerate NVMe Deployments

NVMe (Non-Volatile Memory Express) is one of the hottest industry topics right now. As a protocol for accessing high-speed storage, it promises to provide many advantages over legacy protocols such as SAS and SATA. In today's world, which is driven by the need for always-on, real-time data, this becomes a particularly attractive value proposition.

A recent market survey found that only about 7% of respondents report that more than half of their storage is NVMe. While adoption is still low, enthusiasm for the technology appears strong, and the future for NVMe looks bright. IDC expects that by 2021, NVMe-based arrays using NVMe over Fabric (NVMe-oF) host connections will be driving more than 50% of all external primary storage revenue.

Part of the reason for the slow adoption is that while NVMe and NVMe-oF promise new levels of performance for flash-based storage systems, the problem of deploying, managing, and migrating data and applications remains for large, distributed systems. There is a lack of software and data services to provide businesses with a simple transition path, free of the costs and disruptions otherwise required to benefit from the technology. Customers are usually forced by their storage vendors into a 'rip-and-replace' abandonment of current investments.

NVMe and NVMe-oF deployments will need proven software to accelerate customer adoption. For example, software-defined storage can act as a bridge that unifies and abstracts legacy and new storage, allowing users to seamlessly integrate new technologies such as NVMe-oF and gain the benefits without having to sacrifice past investments.

Software-defined storage can provide a basis for managing all types of storage at the speed required to realize the benefits of NVMe. Effective software-defined storage can eliminate changes to hosts, provide quality of service, automate data migration, support NVMe over the existing fabric network, and provide a wide range of enterprise-class data services such as CDP, load balancing, HA mirroring, auto-tiering, and data migration. It also allows for the adoption of varied implementations of NVMe, including local SSDs, NVMe-oF using standard HBAs, and end-to-end NVMe for workloads demanding minimum latency.
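
The sketch below illustrates the idea of one pool spanning NVMe and legacy tiers with heat-based placement; the tier names and threshold are purely illustrative assumptions:

```python
# Conceptual sketch: an SDS layer unifies NVMe and legacy devices in one
# pool and auto-tiers by access heat. Thresholds and names are illustrative.

TIERS = {"nvme": [], "sas": []}   # fastest tier first
access_counts = {}                # block -> recent access count

def record_access(block):
    access_counts[block] = access_counts.get(block, 0) + 1

def place(block, hot_threshold=10):
    """Hot blocks land on NVMe for minimum latency; cold blocks on SAS."""
    tier = "nvme" if access_counts.get(block, 0) >= hot_threshold else "sas"
    TIERS[tier].append(block)
    return tier

for _ in range(12):
    record_access("blk-hot")
print(place("blk-hot"))    # 'nvme'
print(place("blk-cold"))   # 'sas'
```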

Performance, simplicity of deployment, and the ability to leverage existing storage are all critical factors in easing the adoption of NVMe. As we move toward the next evolution of performance and lower latency with NVMe/NVMe-oF, software-defined storage will help dramatically improve performance and utilization, reduce downtime, and minimize cost and management complexity.

Organizations Will Increasingly Use Data Analytics as a Strategic Asset to Enhance Business Efficiency and Effectiveness

By 2020, it is estimated that 1.7 MB of data will be generated every second for every person on Earth. However, without the right tools and mindset, all of this data will simply be noise. That's why business leaders around the world are transforming their organizations to become data-driven, leveraging data as a strategic asset to enhance business efficiency and effectiveness.

When turned into insights, data can become the cornerstone of positive transformation. Insights provide information that is actionable and relevant at all levels of the organization. They drive positive business outcomes, influence behaviors and enable more informed decisions, ultimately making a business more effective, efficient and intelligent. How effectively companies turn data into insights will be a key differentiator over the next decade.

Data is transformed into insights with the collection, synthesis, analytics and visualization models now available. Additionally, as the data landscape continues to mature, algorithms, machine learning and artificial intelligence (AI) will increasingly be used to derive insights. These "intelligent analytics" will be able to process large amounts of data in order to deliver intelligent recommendations.

As businesses look toward data for future opportunities, and that data continues to grow, IT departments will need to embrace technologies such as software-defined storage to help them manage it all.

DataCore Software

Gerardo A. Dada
Gerardo A. Dada is chief marketing officer (CMO) at DataCore Software. Dada is an experienced technology marketer who has been at the center of the web, social, mobile and cloud revolutions at some of the world's leading companies. Prior to DataCore, he most recently served as vice president of product marketing and strategy at SolarWinds. Earlier, Dada was head of product and solutions marketing at Rackspace, where he established the company as the leader in hybrid cloud. He has also held senior marketing roles at Bazaarvoice, Motorola and Microsoft. Dada received a five-year business degree from UAEM University in Mexico and a general management certificate from the University of Texas at Austin.
