Friday, May 3, 2024


Transforming Your Data into a Business Asset

As the volume, velocity, variety and value of data grow, the possibilities of what can be done with it expand exponentially.  Organizations globally are gaining valuable insights and further intelligence from the massive amounts of data they’ve collected over time, and many are realizing that archiving their data forever may lead to new and unexpected business opportunities down the road.  For data to transcend mere information and become a valued asset, organizations will need to rethink how it is captured, preserved, accessed and transformed, not only to increase business competitiveness, but also to create infrastructures that enable it to live forever.

To get to the data forever finish line, it is important to first understand the value of data itself.  Data that was never saved before is now being kept, and data that was frozen in time is being brought back online for analytics or to enable the possibilities of monetization.  For example, old oil fields and gas drilling sites around the world have volumes of seismic data already extracted and captured.  Years of captured data can save millions of dollars in seismic analysis, which in turn can help determine whether to turn an old abandoned oil field into a productive one.

Once data is captured, its value may not be evident at first, or even in the near future, but it may become priceless years from now once big data analysis is performed and new insights are realized.  In a recent survey of two hundred enterprise professionals and service providers about their unstructured data storage requirements, more than twenty percent indicated that their policy is to keep their data forever.

So what are the sources of data value?

Sources of Data Value

At the very high end of the value curve is transactional data, which is directly tied to critical daily activities and operations that link to business revenue and financial performance.  Losing transactions could be disastrous, costing not only revenue, but opportunities, relationships and even your company’s reputation.  As such, transactional data is typically considered hot, or Tier 0, data that is usually kept separate from other data within the storage infrastructure or data center to ensure immediate access, protection and coherency.

Operational data has tremendous value as well.  When pulled from factory or production equipment and run through trend analysis, for example, it can improve manufacturing yields or provide alerts regarding upcoming maintenance or service requirements.  In these instances, the results are tactical and used to correct inefficiencies.  In the future, results from operational data can be strategic, used potentially to improve the infrastructure itself.

Another valued data source is market data, which is collected from many different outlets over time and, through analysis, can determine how demographics impact purchase patterns, specific purchasing trends, the effectiveness of ad campaigns and social media, or sales forecasts based on past seasonal successes.  Captured competitive data is also valuable, as it can reveal how competitors react to changes in the market so you can anticipate their next moves and take appropriate actions to counter them.

Rounding out the list of valued data sources is security data.  With new regulatory requirements, laws and mandates challenging your data strategy, IT managers must rethink how to scale data for long-term growth and value.  They must consider and deploy technologies, as well as secure processes (such as comprehensive threat detection and intrusion prevention, strict user authentication and access controls, data encryption and immutability, and secure erase features), in order to build a comprehensive data secure environment.

Regardless of the source, analyzing data from the past can deliver strategic value.  The tactical day-to-day value of data is obviously very important as it keeps businesses running.  And, though the long-term strategic value of data, and saving it forever, is more difficult to defend, its results can provide many business efficiencies and competitive advantages for years to come.

Adding Further Value to Data

Almost every data center globally is undergoing some form of digital transformation, and in many cases, analyzing large volumes of captured data is part of that transformation.  Most data has hidden value that can be mined, transformed and analyzed; however, users cannot extract value if they don’t understand what the data means, what impact it has, or the value it can bring to the business.  Chief data officers (CDOs), data scientists and analysts are playing a more prominent role, helping to map the statistical significance of key problems and quickly translate the analysis into business action.  Organizations that place value on data science are gaining valuable insights from the volumes of data they’ve captured over time.

As data value is drawn from analysis, it must be protected so it can be accessed and used whenever required.  The traditional methods of protecting data have involved RAID models, erasure coding (data encoded with redundancy so it can be reconstructed after a loss), or systems designed for high availability (HA).  However, traditional NAS and SAN (file and block) environments cannot cost-effectively scale to keep up with the rapid growth in data, nor are they architected to manage unstructured data at that scale, rendering them ineffective.
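The core idea behind erasure coding can be illustrated with a minimal single-parity sketch, the same principle RAID-5 relies on: one parity block allows any one lost block to be rebuilt from the survivors.  This is only a toy illustration; production object storage systems use stronger codes (such as Reed-Solomon) that tolerate multiple simultaneous failures.

```python
# Toy single-parity erasure coding sketch (illustrative only).
# One XOR parity block lets us rebuild any ONE lost data block.

def xor_blocks(blocks):
    """XOR equal-length byte strings together, byte by byte."""
    result = bytearray(len(blocks[0]))
    for block in blocks:
        for i, b in enumerate(block):
            result[i] ^= b
    return bytes(result)

data = [b"AAAA", b"BBBB", b"CCCC"]   # three data blocks
parity = xor_blocks(data)            # one parity block

# Simulate losing block 1, then reconstruct it from the
# surviving data blocks plus the parity block.
survivors = [data[0], data[2], parity]
recovered = xor_blocks(survivors)
assert recovered == data[1]          # the lost block is rebuilt
```

The same XOR trick works regardless of which single block is lost, which is why parity costs only one extra block rather than a full mirror copy.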

In the data forever architecture, an object-based storage (OBS) system can be ideal as it can easily scale, reduce total cost of ownership (TCO), and offer more immediate self-healing background and data integrity checks.  Some OBS systems are also designed to enable non-disruptive upgrades and seamless data durability across multiple hardware generations, so as your system grows and changes, it can support these capabilities.

Does Data have a Lifecycle?

Determining whether data should be kept forever or discarded requires an understanding of the data lifecycle.  For example, the lifecycle of transactional data can include inventory checks, credit card verifications, shipping validations and order completion content.  By keeping a history of customer purchases, inquiries or even web browsing, the results can be extremely beneficial when these customer patterns are analyzed, providing better profiling of customers or market changes.

To preserve this value, the right balance of storage and performance is needed so that the characteristics of the data match the storage on which it resides.  For example, transactional data captured today is worth placing on high-performance NVMe™-based storage solutions to gain the benefit of immediate accessibility while delivering heightened user experiences.

A week, month or year later, that same data has less and less immediate value, warranting less expensive storage media.  And since the data is being archived for future analysis, real-time high performance takes a back seat in favor of affordable, petabyte-scale storage media with more modest performance characteristics.
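A lifecycle policy like this can be sketched as a simple age-based tiering rule.  The tier names and age thresholds below are purely illustrative assumptions, not drawn from any specific product:

```python
# Hypothetical age-based tiering policy: as data ages and its immediate
# value declines, it migrates to cheaper, higher-capacity media.
# Thresholds and tier names are illustrative assumptions.

def choose_tier(age_days: int) -> str:
    if age_days <= 7:
        return "tier0-nvme"     # hot: high-performance flash
    elif age_days <= 90:
        return "tier2-capacity" # warm: capacity disk
    else:
        return "tier3-object"   # cold: petabyte-scale object archive

print(choose_tier(1))    # freshly captured transactional data
print(choose_tier(365))  # year-old data headed for the archive
```

A real implementation would also weigh access frequency and business value, not age alone, but the shape of the policy is the same.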

Data protection becomes an important part of the data lifecycle.  If you plan to analyze your data in the future, it must be stored on reliable media, not only for immediate access, but with the highest integrity and accuracy possible.  Traditional tape storage methods have become ineffective, requiring multiple copies of the data to be placed on tape in case data restoration or tape degradation issues occur.

Data access also plays an important role in the lifecycle discussion, as data that is available and online provides high value versus data tucked away in a vault.  In some cases, such as public clouds, there are performance dependencies on network connectivity when accessing data online, so a data access strategy should address not only the value of your data, but how fast you need to access it.  For example, hot transactional data requires minimal latencies on the order of microseconds, whereas secondary data (Tier 2) may only require millisecond latencies, and colder, archived data (Tier 3) could tolerate seconds.
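These rough latency classes can be captured as a simple lookup used to check whether a tier can meet an application's access-time requirement.  The specific figures below are illustrative order-of-magnitude assumptions, not measured values:

```python
# Illustrative latency targets per tier (order-of-magnitude assumptions
# matching the microsecond/millisecond/second classes discussed above).
TIER_LATENCY_SECONDS = {
    "tier0-hot":       100e-6,  # transactional data: ~microseconds
    "tier2-secondary":  10e-3,  # secondary data: ~milliseconds
    "tier3-archive":     5.0,   # cold archive: ~seconds
}

def meets_sla(tier: str, sla_seconds: float) -> bool:
    """True if the tier's typical latency fits within the access SLA."""
    return TIER_LATENCY_SECONDS[tier] <= sla_seconds

print(meets_sla("tier0-hot", 1e-3))      # hot tier easily meets a 1 ms SLA
print(meets_sla("tier3-archive", 1e-3))  # archive tier cannot
```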

The choices and trade-offs that organizations face include where and how they deploy their applications and infrastructures using traditional IT, public cloud and private cloud models, whether on- or off-premises, in co-location facilities or in distributed cloud data centers.  As a result, many organizations are choosing a hybrid approach that combines traditional IT resources with private and public cloud usage.  This approach provides fast and flexible deployment and development options while also retaining the security features, management control and predictable economics associated with a private infrastructure that can be used for certain workloads, applications and use cases.

Next Steps

The partnership between data, storage and performance requires a data strategy driven by business needs.  This provides an opportunity to take a fresh look at the issues at hand, with key priorities focused on the ‘right’ duration of data retention, where more or less control over data is required, and cost and scalability requirements, all rounded out by a data forever plan.  Because data underpins business and operational processes, best practices and competitive advantages, identifying clear data objectives will differentiate the business and add tremendous value.

In support of this data strategy is the ‘right’ storage plan, one that enables data rather than limits it.  Immediate access to transactional or Tier 0/1 data, as well as support for fast data analysis, requires high-performance, flash-based storage.  Colder, archived data that will be analyzed in the future is, by contrast, well suited for less expensive, more scalable object storage systems that migrate data to a private cloud.  The result is a data forever architecture.

Final Thoughts

To ensure the partnership between data, storage and performance, a data strategy driven by business needs is required, because data of modest value today may be priceless years from now, and without a plan you will wish you had kept it.  The end result will enable the data to get the respect it deserves, be treated like the asset it is, and deliver the competitive advantages that make the business even stronger.

Once the data forever architecture is in place, you then have to manage it in a way consistent with your business processes so that your storage processes support them.  Data sitting on tape in a vault has potential value, but only when the tape is brought back online.  Once again, data captured today could reap high value in the future as long as the data strategy in place supports the business objectives.

Western Digital

Forward-Looking Statements:
This article may contain forward-looking statements, including statements relating to expectations for Western Digital’s Data Center Systems (DCS) products, the market for these products, product development efforts, and the capacities, capabilities and applications of its products using hybrid cloud strategies. These forward-looking statements are subject to risks and uncertainties that could cause actual results to differ materially from those expressed in the forward-looking statements, including development challenges or delays, supply chain and logistics issues, changes in markets, demand, global economic conditions and other risks and uncertainties listed in Western Digital Corporation’s most recent quarterly and annual reports filed with the Securities and Exchange Commission, to which your attention is directed. Readers are cautioned not to place undue reliance on these forward-looking statements and we undertake no obligation to update these forward-looking statements to reflect subsequent events or circumstances.
©2018 Western Digital Corporation or its affiliates. All rights reserved.  Western Digital is a registered trademark or trademark of Western Digital Corporation or its affiliates in the US and/or other countries.  All other marks are the property of their respective owners.
Erik Ottem
Erik Ottem is the Director of Marketing within Western Digital’s Data Center Systems group and responsible for go-to-market execution, collateral development, product messaging and positioning, sales and channel training, and press and analyst briefings. He has over twenty-five years of experience in high-tech storage sales and marketing that cover systems, semiconductors, devices and software for such companies as IBM, Seagate, Agilent, Gadzoox Networks, Violin Memory and Western Digital. Erik earned his Bachelors of Science degree in Plant Science from the University of California, Davis and his Masters of Business Administration degree from Washington University in St. Louis.
