One of the key challenges with Big Data is performing analysis across multiple data sets.
Producing metrics that are useful and meaningful, and that tie together various analytical points, is difficult. Instead of relying on siloed analytics in narrow contexts, these contexts must be bridged through secondary analysis.
From an enterprise risk management perspective, there is a big picture that must be considered. The real challenge with Big Data is going from individual silos of data analytics to a bigger picture that meaningfully places those analytics in the full-enterprise context. How we map these analytical islands to one another ultimately provides the support we need for higher-quality decisions.
Data, data everywhere
There are dozens of data sets throughout security and operations, ranging from standard key performance indicators (KPIs) to newer metrics around security incidents (data breaches, lost devices) and security performance – from firewalls, anti-virus, intrusion detection systems (IDS), data leakage prevention (DLP) and identity and access management (IAM) to vulnerability scans and penetration testing. Traditional IT operational metrics – like “mean time between failures” (MTBF), “mean time to repair” (MTTR), and uptime and availability statistics – round out the overall picture, at least for operations.
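To make those operational measures concrete, here is a minimal Python sketch that computes MTBF, MTTR and availability from an outage log. The timestamps and outage counts are hypothetical, invented purely for illustration:

```python
from datetime import datetime

# Hypothetical outage records for one service: (failure start, service restored).
outages = [
    (datetime(2013, 3, 2, 4, 15), datetime(2013, 3, 2, 6, 45)),
    (datetime(2013, 6, 18, 22, 0), datetime(2013, 6, 19, 1, 30)),
    (datetime(2013, 11, 7, 13, 5), datetime(2013, 11, 7, 13, 50)),
]
period_start = datetime(2013, 1, 1)
period_end = datetime(2014, 1, 1)

total_hours = (period_end - period_start).total_seconds() / 3600
downtime_hours = sum((end - start).total_seconds() / 3600 for start, end in outages)
uptime_hours = total_hours - downtime_hours

mtbf = uptime_hours / len(outages)         # mean time between failures, in hours
mttr = downtime_hours / len(outages)       # mean time to repair, in hours
availability = uptime_hours / total_hours  # fraction of the period the service was up

print(f"MTBF: {mtbf:.1f} h, MTTR: {mttr:.1f} h, availability: {availability:.4%}")
```

These three numbers alone say little about business impact; the sections that follow are about attaching them to the processes they actually support.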
Analyses are frequently performed within each of these disparate contexts, but they are rarely rolled up into a more comprehensive viewpoint. Where analyses typically break down is around value alignment: how the business views daily operations generally differs from how IT views them. Business metrics tend to be more concerned with financial reporting, assets and performance relative to external benchmarks, such as the market, competitors or customers. Aligning business and operational metrics is essential to improving decisions and helping ensure business survivability.
Gaining value alignment
One of the first questions to tackle when aligning business and operational metrics is, “How does the business function on a daily basis?” Specifically, being able to describe daily business functions – or processes – is a great way to understand which assets and attributes are key to ongoing business survival. For example, if a business relies on a short turnaround for accounts payable and receivable in order to carry a degree of float (e.g., as a value-added reseller), then understanding the maximum acceptable degree of disruption to those processes is key.
Note that there are several implicit considerations in answering this question. First, one must understand what the business does and how it does it. Second, one needs to understand the underlying asset picture (people, processes and technology). Finally, those assets can be traced down to specific operational practices and metrics, as sketched below. This top-to-bottom value chain provides the linkages the business needs to understand its exposure, and that operational teams need to understand the relative importance of a given resource.
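As a rough sketch of what that value chain might look like in data form, consider the following. Every process name, asset, threshold and metric value here is a hypothetical placeholder, not a prescription:

```python
# Hypothetical top-to-bottom value chain: business process -> supporting assets,
# plus the maximum disruption the business says each process can tolerate.
value_chain = {
    "accounts_receivable": {
        "assets": ["billing_app", "erp_database", "ar_clerk_team"],
        "max_tolerable_outage_hours": 4,
    },
    "order_fulfillment": {
        "assets": ["warehouse_wms", "shipping_gateway"],
        "max_tolerable_outage_hours": 24,
    },
}

# Operational metrics tracked per asset (values are illustrative).
asset_metrics = {
    "billing_app": {"mttr_hours": 2.5},
    "erp_database": {"mttr_hours": 6.0},
    "warehouse_wms": {"mttr_hours": 12.0},
    "shipping_gateway": {"mttr_hours": 1.0},
}

def processes_at_risk(chain, metrics):
    """Flag processes whose worst-case asset repair time exceeds
    the process's maximum tolerable outage."""
    flagged = []
    for process, info in chain.items():
        worst_mttr = max(
            metrics.get(asset, {}).get("mttr_hours", 0) for asset in info["assets"]
        )
        if worst_mttr > info["max_tolerable_outage_hours"]:
            flagged.append((process, worst_mttr))
    return flagged

print(processes_at_risk(value_chain, asset_metrics))
# -> [('accounts_receivable', 6.0)]  # the ERP database cannot be repaired fast enough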
In practice, many of these metrics and measurements already exist, but within analytical silos. Security and operations teams tend to track their own systems fairly closely. Similarly, most businesses closely track business metrics to ensure that financial performance is within expected ranges (and, hopefully, net-positive). The challenge Big Data presents today is aligning these disparate analytical islands. Improving decision quality and the ability to effectively manage risk means bridging that gap, typically through an additional analytical tier.
Improving decision quality
It is generally accepted that business leaders want to make the best decisions possible in a given context using the best data available. It is also generally accepted that there is no such thing as “perfect data,” and that there is a fine line between waiting too long for more data and acting before there is enough of it to make a good decision. Complicating matters further, decisions about information security and risk management are increasingly scrutinized from a legal perspective, especially in the case of a data breach. How can business leaders improve decision quality and achieve a more legally defensible position?
“Today, more often than not, analytical siloes are not being effectively bridged to provide the best data possible to decision-makers.” – Chris Goodwin, CTO, LockPath
The answer comes in pulling all the pieces together. There is no question that a lot of useful data is available throughout the average enterprise. The concern is whether that data is fully and adequately leveraged in making decisions. Today, more often than not, analytical silos are not being effectively bridged to provide the best data possible to decision-makers.
Adding a second tier of analytics is critical in the world of Big Data. A reasonable solution aggregates operational analyses and ties them to business metrics and concerns. A useful second-tier analysis describes key business functions, processes and assets, and then correlates operational reports and metrics to them. For example, linking accounts receivable to its supporting technology assets, and those assets to operational practices and metrics, can expose significant enterprise risk, as in the sketch below.
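Continuing the hypothetical value-chain mapping above, a second-tier rollup might weight and aggregate findings from several operational feeds against the assets behind a single business process. The feed names, counts and weights here are all invented for illustration; a real implementation would draw them from the organization's own tooling:

```python
# Sketch of a second analytical tier: roll disparate operational feeds up
# into one per-process view. All names and numbers are hypothetical.
operational_feeds = {
    "vuln_scans": {"billing_app": 7, "erp_database": 2},   # open findings
    "ids_alerts": {"billing_app": 14, "erp_database": 3},  # alerts, last 30 days
    "dlp_events": {"erp_database": 5},                     # policy violations
}
weights = {"vuln_scans": 3.0, "ids_alerts": 1.0, "dlp_events": 2.0}

# Which assets support which business process (from the value-chain mapping).
process_assets = {"accounts_receivable": ["billing_app", "erp_database"]}

def process_risk_score(process):
    """Weighted sum of all operational findings across a process's assets."""
    return sum(
        weights[feed] * counts.get(asset, 0)
        for feed, counts in operational_feeds.items()
        for asset in process_assets[process]
    )

print(process_risk_score("accounts_receivable"))
# -> 54.0, i.e. 3*(7+2) + 1*(14+3) + 2*5
```

The point of the sketch is not the particular score, but the shape of the tier: operational findings stop being free-floating counts and become attributes of a business process a decision-maker actually cares about.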
At the end of the day, decision-making processes must integrate disparate data sources while leveraging analysis aggregation to make those individual analyses accessible and useful. Fully aligning the value chain, from business function down through operational practices, allows business leaders to make better-quality decisions while reducing enterprise risk.