Executive Viewpoint 2017 Prediction: DataCore Software

Parallel Processing Software will be This Year’s Big ‘Productivity Disrupter’

2017 will be the year that parallel processing software goes mainstream and unleashes the processing power of today's multicore systems, reshaping the economics and productivity of what computing can do and where it can be applied.

Parallel processing software will go beyond the realm of specialized uses such as HPC and areas like genomics that have focused primarily on computation, and impact the broader world of applications that require real-time responses and interactions. This includes mainstream applications and storage that drive business transactions, cloud computing, databases, data analytics, as well as the interactive worlds of machine learning and the Internet of Things (IoT).

The key is that the software must become simple to use and non-disruptive to applications so it can move from these specialized use cases to general application usage. The impact will be massive: application performance, enterprise workloads, and consolidation densities on virtual platforms and in cloud computing have been stifled by the growing gap between compute and I/O, and that constraint will fall away. New parallel I/O software technologies now available are easy to use, require no changes to applications, and fully leverage the power of multicore processors to dramatically increase productivity and overcome the I/O bottleneck.
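
To make the principle concrete, here is a minimal sketch, written in Python purely for illustration, of what parallel I/O changes: instead of waiting on one storage request at a time, many independent reads are kept in flight at once so a multicore host is not left idle waiting on I/O. The file list and worker count are invented for the example; this is not DataCore's implementation.

    # Illustrative sketch only: issue many independent reads in parallel
    # rather than serially, keeping the storage path busy. Paths and worker
    # counts are placeholders, not DataCore's parallel I/O technology.
    import os
    from concurrent.futures import ThreadPoolExecutor

    def read_block(path):
        """Read one file and return the number of bytes retrieved."""
        with open(path, "rb") as f:
            return len(f.read())

    def read_all_parallel(paths):
        # Threads suit this I/O-bound work: each waits on storage, not the CPU.
        with ThreadPoolExecutor(max_workers=min(32, len(paths) or 1)) as pool:
            return sum(pool.map(read_block, paths))

    if __name__ == "__main__":
        sample = [p for p in os.listdir(".") if os.path.isfile(p)]
        print(f"read {read_all_parallel(sample)} bytes from {len(sample)} files")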

The real driver of change is the economic and productivity disruption. Today, many new applications such as analytics are not practical because they require hundreds, if not thousands, of servers to get the job done. Yet each server is becoming capable of supporting hundreds of multi-threaded computing cores, and all of these engines are available to drive workloads that until now have sat idle, waiting for work to do. 2017 will usher in an era where one server does the work of 10, or even 100, servers of the past. Parallel processing software that unlocks the full utilization of multicore processors will lead to a revolution in productivity and put a new world of applications within the reach of mainstream IT in 2017.

The Impact on Real-time Analytics and Big Data Performance

The combination of faster response times and the multiplying effect of parallelization on productivity will fuel the next step forward in 'real-time' analytics, big data and database performance. In a world where interactions and transactions must happen at a far greater rate and with much faster response times, the ability to do more work by doing it in parallel, and to react quickly, is the key.
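
As a simple, hypothetical illustration of the divide-and-combine pattern behind parallel analytics, the Python sketch below splits a large set of transaction amounts into chunks, computes partial sums on separate cores, and combines the results. The data and function names are invented for the example.

    # Illustrative sketch: parallel aggregation of transaction amounts.
    # Each chunk is summed in its own process, then the partial sums are combined.
    import os
    import random
    from concurrent.futures import ProcessPoolExecutor

    def partial_total(chunk):
        """Aggregate one slice of the data (runs in a separate process)."""
        return sum(chunk)

    def parallel_total(amounts):
        workers = os.cpu_count() or 1
        size = max(1, len(amounts) // workers)
        chunks = [amounts[i:i + size] for i in range(0, len(amounts), size)]
        with ProcessPoolExecutor(max_workers=workers) as pool:
            return sum(pool.map(partial_total, chunks))

    if __name__ == "__main__":
        amounts = [random.uniform(1, 500) for _ in range(1_000_000)]
        print(f"total transaction volume: {parallel_total(amounts):,.2f}")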

The disruptive productivity force of parallel processing and the ability to leverage all the computing power of multicore processors will be a game-changer that propels real-time analytics and big data performance into the forefront by making them practical and affordable. The implications for productivity and for business decisions based on insights from data in areas such as finance, banking, retail, fraud detection, healthcare, and genomics, as well as machine learning and Internet of Things applications, will be profound.

The Microsoft Impact Grows: Azure Stack, Hybrid Cloud and SQL Server 2016

The success of Microsoft's Azure Cloud is already evident; however, the real impact will come from the larger strategy of how Microsoft has worked to reconcile the worlds of on-premises and cloud computing. Microsoft was one of the first to recognize that the landscape will continue to be a mix of on-premises and cloud. Microsoft's Azure Stack makes it seamless to get the benefits of cloud-like computing whether in the public cloud or within a private cloud, and it has become the model for hybrid cloud computing. Likewise, Microsoft continues to integrate its Windows and server solutions to work more seamlessly with cloud capabilities.

Additionally, one of the most dramatic changes at Microsoft has been how it has reinvented and transformed its database offerings into a true big data and analytics platform for the future. SQL Server 2016 has become far more powerful and capable, and it now deals with all types of data. As a platform, it is primed to work with Microsoft's large ecosystem of marketplace partners, including DataCore with its parallel processing innovations, to redefine what is possible in the enterprise, in the cloud, and in big data performance and real-time analytics use cases for traditional business applications, as well as in developing use cases in machine learning, cognitive computing and the Internet of Things.

Beyond Hyper-Convergence: Hyper-Productivity

Hyper-converged has become the buzzword of the day, but let's remember that the real objective is to achieve the most productivity at the lowest cost. As 2017 progresses, hyper-converged software will continue to grow, but to cement its success, users need to be able to take full advantage of its promised productivity.

Hyper-converged systems are in essence a server plus a software-defined infrastructure, but they are often severely restricted in terms of performance, and too often lack the needed flexibility and a path for integration within the larger IT environment (for instance, not supporting Fibre Channel, which is often key to enterprise and database connectivity). Better utilization of one's storage and servers to drive applications is the key. Parallel processing software will enable users to take full advantage of what their hardware and software can do (see this video from ESG as an example).

For example, powerful software-defined storage technologies that can do parallel I/O provide a higher level of flexibility and leverage the power of multicore servers so fewer nodes are needed, making them more cost-effective. Likewise, the software can incorporate existing flash and disk storage without creating additional silos; migrate and manage data across the entire storage infrastructure; and effectively utilize data stored in the cloud.

Data infrastructures, including hyper-converged systems, can benefit from advanced parallel I/O software technologies that dramatically increase productivity by tapping the power that lies within standard multicore servers.

Final Thoughts

The world continues to go 'software-defined' in order to cost-effectively utilize off-the-shelf computing, and parallel processing is the game-changer needed to disrupt the productivity curve and unleash the power of the unused computing cores all around us. These two forces are coming together in 2017, and they will redefine what is possible while opening up many new applications and use cases by enabling faster performance, real-time responsiveness, massive consolidation, and smaller footprints. The result will be greater cost efficiencies and disruptive economics that reshape the world of IT in 2017.

DataCore Software
