Thursday, Jul 27, 2017

Explosion in Data Demands a New Approach in Order to Generate True Commercial Advantage

For as long as we can remember, the financial industry has relied on data – from the earliest bankers mentally calculating interest rates on their loans, to today’s traders using complex algorithms to move large sums around electronically. But we are at a tipping point: demand for financial data is now growing far faster than humans can reasonably interact with it. Machines 1, humans 0?

It’s safe to assume the production of data around the world isn’t going to slow down or reverse any time soon. What steps can be taken to ensure that we manage the vast output of data appropriately and enjoy the commercial benefits that result?

The challenge is that consumer and corporate financial transactions not only happen globally, but across complex, interconnected networks that data must pass through, governed by defined rules and processes. Supporting this are the fundamental capabilities of modern banking technology, which include:

  • Real-time competencies
  • Transactional throughput capabilities
  • Deep analytics intelligence
  • Effective data workload management control
  • An intuitive user interface and dashboard presentation layer

However, these aren’t always effective. Conventional approaches to architecting even the most contemporary banking frameworks fail to engineer in the need for analytics velocity from the outset. In particular, where data lives and how it is operated on have failed to provide a platform for analytics at the speed and accuracy needed.

When it comes to big data analytics in the financial industry, real-time analysis is subject to complex constraints: global exchange rates, privacy requirements and the increasing use of time-series data. If we can manage these complexities in place, working directly where the data is stored, we can save time and cost like never before, making previously impossible tasks possible. The complexities themselves are the product of year after year of silo-centric IT programs pushed forward with little thought for the future, or for the possibility of real-time processing and analytics.

Now that we can build our approach to data with more customized and controlled mechanisms, banks and financial institutions of every kind will be able to make informed commercial decisions more quickly than their competitors. As a result, they will be able to seize market opportunities and meet their customers’ demands faster. These same institutions will also be able to tune their analytics to help identify and reduce theft, corruption, security breaches and other forms of malicious activity.

The key is not to approach data analytics from a new perspective, but from a different point of applied data processing and application logic. The answer is in-database analytics: the technology inflexion point that lets us leverage analytical insight on demand, the second the data is available. This is possible because in-database analytics runs directly inside your database, using the full power of the platform. Traditional analytics products require you to move data from the data warehouse to another environment for processing; in-database analytics lets you process the data without moving it. Some of its benefits:

  • As data gets bigger, so does the cost to move it. Typically, up to 80 percent of an analytics solution’s processing time is consumed just by moving data.
  • Modern data warehouses provide powerful engines that, if optimized and coded to take advantage of data and process parallelism, can result in models that run 10X to 100X faster than non-optimized models.
  • They’re easy to use. It takes about an hour to install more than 700 data mining, machine learning and financial models into your database. No additional hardware or storage is needed, and the models inherit the security measures already in place, so there’s no user setup to manage.
  • Finally, once installed, the models become part of the database, appear as native functions, and run on the most popular data language: SQL (see the sketch after this list).
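
To make that last point concrete, here is a minimal sketch of “moving the analytics to the data”, using Python and SQLite’s user-defined aggregate mechanism purely for illustration. The table, data and function names are hypothetical, and a real in-database analytics platform would ship optimized, parallelized native functions rather than a Python class; the point is only that the model executes inside the database engine and is invoked as ordinary SQL:

    # Illustrative sketch: a "model" registered inside the database engine,
    # invoked through SQL, so rows never leave the database process.
    import math
    import sqlite3

    class Volatility:
        """Aggregate computing a running standard deviation (Welford's algorithm)."""
        def __init__(self):
            self.n, self.mean, self.m2 = 0, 0.0, 0.0

        def step(self, value):
            self.n += 1
            delta = value - self.mean
            self.mean += delta / self.n
            self.m2 += delta * (value - self.mean)

        def finalize(self):
            return math.sqrt(self.m2 / (self.n - 1)) if self.n > 1 else None

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE returns (symbol TEXT, daily_return REAL)")
    conn.executemany("INSERT INTO returns VALUES (?, ?)",
                     [("ACME", r) for r in (0.01, -0.02, 0.015, -0.005, 0.02)])

    # Register the model so it appears as a native SQL function.
    conn.create_aggregate("volatility", 1, Volatility)
    print(conn.execute(
        "SELECT symbol, volatility(daily_return) FROM returns GROUP BY symbol"
    ).fetchone())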

Let’s take the example of calculating VaR (Value at Risk). Traditionally, users move data from their database to another analytics environment and run all the calculations involved in VaR modelling there, which typically takes two to six hours, depending on the environment. Using in-database analytics, we performed 10,000 simulations for a portfolio of 500 stocks over 252 days, producing 1.26 billion simulated values. We then calculated P&L for 30,000 positions at discrete intervals (1..5 days, 1..4 weeks, 1..3 months, etc.) across the 10,000 simulations, which involved 1.5 billion P&L calculations. Finally, we performed aggregation and VaR calculations for each discrete interval. In total, 12.6 billion simulations were performed in less than two minutes, and the entire VaR and P&L process can be completed in less than five minutes.
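
As a rough illustration of the arithmetic above (a sketch, not the actual in-database implementation), the following Python/NumPy snippet runs the same shape of Monte Carlo VaR calculation at toy scale. The return distribution, portfolio weights and horizon are invented for the example; at full scale the dimensions would be 10,000 simulations × 500 stocks × 252 days, i.e. 1.26 billion simulated values:

    # Scaled-down Monte Carlo VaR with the same structure as the example above.
    import numpy as np

    rng = np.random.default_rng(42)
    n_sims, n_stocks, n_days = 1_000, 50, 252   # full scale: 10_000 x 500 x 252
    horizon_days = 5                            # one of the discrete intervals
    confidence = 0.99

    # Assumed i.i.d. normal daily returns; a real model is calibrated per stock.
    daily_returns = rng.normal(0.0005, 0.02, size=(n_sims, n_stocks, n_days))

    # Hypothetical equal-weight portfolio; P&L per simulation over the horizon.
    weights = np.full(n_stocks, 1.0 / n_stocks)
    pnl = daily_returns[:, :, :horizon_days].sum(axis=2) @ weights

    # 99% VaR: the loss exceeded in only 1 percent of simulations.
    var_99 = -np.percentile(pnl, (1 - confidence) * 100)
    print(f"{horizon_days}-day 99% VaR (fraction of portfolio value): {var_99:.4f}")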

There are endless other use cases for in-database analytics – from combating money laundering and driving better customer service, to minimizing ALLL (the Allowance for Loan and Lease Losses) and reducing the burden of CCAR (Comprehensive Capital Analysis & Review). These use cases have directly driven the evolution and rise of in-database analytics. For ultimate scalability and performance, why move the data to the analytics when you can move the analytics to the data?

It’s clear that the opportunity for banks and financial institutions to bring in-database analytics – and an entirely new approach to data mining – into their operational strategies is not a minor one. Moving to in-database analytics is an evolutionary step, and the level of competitive advantage gained is directly related to the level of adoption. So consider the ways in which in-database analytics can help you take back real control of your data, and start reaping the benefits immediately.

Fuzzy Logix
