Tuesday, Jan 16, 2018

A Nightmare on Wall Street

There’s a nightmare on Wall Street. Data that was traditionally kept inside companies’ four walls is now everywhere: on premises, in the cloud, accessed by employees and contractors logging in from laptops, iPhones, watches and tablets. Companies can no longer control their data, devices or users. Employees working remotely don’t need to log into the company’s VPN to access what they need; oftentimes they don’t even need to authenticate to access sensitive corporate data. There are no boundaries. Most companies don’t know where their data lives, how it moves, who touches it, or where it is going.

A memorable example of this data nightmare is Equifax. Reports said the hackers got in by exploiting an unpatched vulnerability on a legacy system. But once they were in, then what? According to the FTC, the breach lasted from mid-May through July. The hackers accessed people’s names, Social Security numbers, birth dates, addresses and, in some cases, driver’s license numbers. They also stole credit card numbers for more than 200,000 people and dispute documents with personal identifying information for more than 180,000 people. The details of how the hackers accessed so much sensitive information have not yet been divulged. However, it’s safe to assume the data itself was not being tracked, and therein lies the problem for many large enterprises.

When companies do not track their data, they don’t really know whether it’s being protected. For example, if an employee sent a list of customer names and contact information outside the company, no one would know the data left, because it isn’t being tracked. If that data ended up in the wrong hands, the impact on the company would be significant, and yet the company would most likely not know until the compromise happened and the information was made public.

The old cyber security strategy of building up the perimeter to protect the data is no longer effective. In today’s world, companies need to see their data, including how people access and interact with it, outside the four walls. That entails switching their focus from protecting the network to protecting the data. After all, the data itself is the only perimeter they can control.

It takes a combination of tools to achieve this goal. The starting point for most companies is a Data Loss Prevention (DLP) solution to track where data is going, including, most recently, into the cloud. Tagging tools are also important: they classify data, such as an email or a PDF file, so that security tools like DLP can detect when certain tagged data leaves the company. Data-at-rest technologies, on the other hand, track data as it moves within the confines of the organization. CASB tools provide authentication and visibility into interactions with cloud data, while encryption tools make the data unreadable to unauthorized users. Combined with these solutions, behavior analysis identifies and analyzes how users interact with the data, and whether their behavior is unusual for themselves, their peers, and their overall team.
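To make the interplay between tagging and DLP concrete, here is a minimal, hypothetical sketch (not any particular vendor’s product): a classification catalog assigns tags to files, and an egress check flags any outbound transfer of data carrying a sensitive tag. The tag names, file names and policy are illustrative assumptions.

```python
# Hypothetical sketch: tagging feeds a DLP-style egress check.
# Tag names, files and the policy below are illustrative only.

SENSITIVE_TAGS = {"classified", "pii"}

# Classification catalog; in practice a tagging tool builds and
# maintains this as data is created and modified.
FILE_TAGS = {
    "customer_list.xlsx": {"pii"},
    "press_release.pdf": {"public"},
}

def check_egress(filename, destination):
    """Return an alert record if the file carries a sensitive tag,
    or None if the transfer is allowed."""
    hits = FILE_TAGS.get(filename, set()) & SENSITIVE_TAGS
    if hits:
        return {
            "file": filename,
            "destination": destination,
            "tags": sorted(hits),
            "action": "alert",
        }
    return None

# A tagged customer list leaving the company triggers an alert;
# a public press release does not.
alert = check_egress("customer_list.xlsx", "personal@example.com")
```

The point of the sketch is the division of labor the paragraph describes: the tagging step supplies the sensitivity context that the egress check would otherwise lack.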

Finally, to leverage the power of all these solutions, companies need to correlate the data coming from those various tools to get a complete picture of their greatest risks from a data standpoint. For example, Jane logs into Office 365 and sends information tagged as “classified” to her private email account, which violates policy, and her behavior is unusual not only for herself, but also compared to her peers and overall team. The result is a highly critical event that needs immediate investigation.
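The Jane scenario can be sketched as a simple correlation rule that combines signals from the separate tools (DLP, tagging, behavior analytics) into one priority score. The weights and field names are illustrative assumptions, not a standard; real correlation engines are far richer, but the shape is the same.

```python
# Hypothetical sketch of correlating signals from separate tools
# into a single prioritized event. Weights are illustrative only.

def correlate(event):
    """Score an event by combining DLP, tagging and behavior signals."""
    score = 0
    if event.get("policy_violation"):          # DLP: data left the company
        score += 50
    if "classified" in event.get("tags", []):  # tagging: data sensitivity
        score += 30
    # Behavior analytics: each baseline the user deviates from
    # (self, peers, team) raises the priority further.
    score += 10 * len(event.get("anomalies", []))
    return score

jane = {
    "user": "jane",
    "policy_violation": True,            # sent data to a private account
    "tags": ["classified"],              # the data was tagged classified
    "anomalies": ["self", "peers", "team"],
}

priority = correlate(jane)
```

Because every contributing signal fires at once, Jane’s event scores far above an event where only one tool raised a flag, which is exactly the prioritization the correlation step is meant to produce.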

While this all sounds like it requires even more resources to manage the different tools, something most companies can’t afford given the shortage of skilled manpower, it’s really the opposite. By implementing an approach that combines the tools mentioned above, companies can make their limited set of analysts more effective by prioritizing the list of events that need investigation.

We have heard a lot recently about treating cyber security as a risk management issue. That means getting out of the mentality of “How many vulnerabilities do I have today?” or “How many events did I remediate today?” and moving to a more comprehensive mentality of “Which vulnerabilities, if exploited, would impact my business the most?” New regulations such as the New York State Department of Financial Services (NYS DFS) Cyber Security Framework and the President’s Cybersecurity Executive Order include mandates that focus on a risk-based approach. One of the key requirements of the NYS DFS Cyber Security Framework is that organizations must perform risk assessments and build their cyber security programs based on the results of those assessments. The framework does not prescribe equal treatment for all assets, meaning organizations should focus on the assets that matter most to the company based on the data in those assets. The Cybersecurity Executive Order requires all federal agencies to submit a Framework Implementation Action Plan as well as a set of metrics that show how they are protecting their most valuable information assets from cyber-attacks and breaches, again based on the data.

If public and private sector organizations adopted a data-centric protection approach like the one described here, complying with new risk-based regulations like these would be, for the most part, inherent. They would understand their “crown jewels” and the most significant threats and vulnerabilities that put those jewels at risk. What these regulations reflect is that although it is important to know where data lives and how it moves, it is not possible to fully control either. Therefore, today more than ever, companies must understand how valuable their data is, and the threats that could compromise the most valuable data must be prioritized and mitigated first.

It’s time to wake up.

Bay Dynamics