Better risk management through data analytics


“It’s a capital mistake to theorize before you have data,” according to the famous fictional detective, Sherlock Holmes. But what you do with the data once you have it is equally important. Today, data analysis goes far beyond an Excel spreadsheet.

The discipline of data analytics has traditionally been dominated by manual, unstructured processes. Just as insurance has tended to lag behind sectors such as banking in its adoption of technology, risk management has tended to follow other disciplines such as finance in its adoption of analytics.

As a result, risk assessment has largely been retrospective, trying to learn from past events and failures: providing hindsight rather than insight.

However, that is starting to change.

Requirements and Capabilities

Three factors are behind this transformation:

1) The first is simply the increasing availability of data from both inside and outside organizations. The increased sophistication of enterprise resource planning (ERP) systems makes it easier to capture and extract data from source systems. The rise of big data sources – public and private databases – and a mass of data coming from the Internet of Things (IoT) means that the volume of accessible data has never been greater. Addressing risks within an organization with limited data coverage will almost certainly lead to risks being overlooked.

2) The second is the increased availability of technological tools to store, structure, analyze and visualize data.

Cloud technology, for example, has eliminated capacity constraints and internal hardware requirements for storing large datasets. Meanwhile, tools such as Alteryx allow users to incorporate different types of information, including unstructured data such as text. And data visualization and analysis tools help organizations make sense of this information, not only to understand the present and the past, but also to predict the future.

One example is a proactive fraud detection system we used on a client project to predict fraudulent journal entries based on historical input data. This AI-based system was also used to flag outliers in the data, facilitating reactive fraud detection. It not only helped our client deal with the risk of fraud, but also highlighted other overlooked controls that needed to be implemented.
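The outlier-flagging step can be illustrated in a much-simplified form. The sketch below uses a median-based (MAD) outlier test on journal-entry amounts; the data, threshold and single-feature approach are purely illustrative, and a production system like the one described would use far richer, AI-based features.

```python
from statistics import median

def flag_outliers(amounts, threshold=3.5):
    """Return indices of journal-entry amounts whose modified
    z-score (median/MAD based, robust to extreme values) exceeds
    the threshold. Illustrative stand-in for a real detector."""
    med = median(amounts)
    mad = median(abs(a - med) for a in amounts)
    if mad == 0:          # all values identical: nothing to flag
        return []
    return [i for i, a in enumerate(amounts)
            if 0.6745 * abs(a - med) / mad > threshold]

# Typical postings around 100, plus one suspicious 10,000 entry
entries = [98, 102, 95, 101, 99, 103, 97, 100, 10_000]
print(flag_outliers(entries))  # → [8]
```

A median-based score is used rather than a plain mean/standard-deviation z-score because a single large fraud amount inflates the standard deviation and can mask itself.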

These tools have proven invaluable in managing supply chain disruptions during the pandemic and its aftermath, providing greater flexibility, resilience and visibility across suppliers. Screening requirements for potential new vendors (and their vendors), for instance, create a massive manual workload. However, using technology to connect to external databases, such as sanctions lists and company records, can automate screening processes to manage risk while quickly onboarding vendors.
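The automated screening idea can be sketched as a simple name match against an external list. The sanctions list and vendor names below are hypothetical; real screening would query official sources (such as consolidated sanctions databases) and use fuzzy matching rather than exact comparison.

```python
import re

# Hypothetical entries; a real system would pull from official
# sanctions databases and company registers.
SANCTIONS_LIST = {"Acme Trading Ltd.", "Global Widgets GmbH"}

def normalize(name: str) -> str:
    """Lower-case and strip punctuation so formatting differences
    don't hide a match."""
    return re.sub(r"[^a-z0-9 ]+", "", name.lower()).strip()

def screen_vendor(name: str) -> bool:
    """Return True if the vendor name matches a sanctions entry."""
    return normalize(name) in {normalize(n) for n in SANCTIONS_LIST}

print(screen_vendor("ACME Trading Ltd"))    # matches despite casing
print(screen_vendor("Safe Supplies Inc."))  # no match
```

Even this trivial normalization step shows why automation pays off: checking every new vendor (and its vendors) against several lists by hand would be exactly the manual workload described above.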

3) Finally, regulation has contributed to reinforcing the need for better risk analysis. Beginning with banks and other financial institutions, requirements for aggregated risk data providing a single measure of total risk exposure have encouraged the use of more sophisticated analytical tools.

Another good example of upcoming regulatory requirements is the new German (and soon European) supply chain law. In this law, companies are required to monitor human rights and environmental risks in their supply chain. As you can imagine, supply chain processes are complex and contain large amounts of data. Without analyzing this data using the appropriate tools, businesses are unlikely to comply with the new law. In short, the risk of not adequately analyzing the risk has increased.

However, while the drivers for improving data analytics in risk management are compelling, the hurdles, especially for small and medium-sized businesses, can still seem significant.

A clean break

Obviously, the cost of more advanced tools, such as predictive analytics, can be substantial. Being at the cutting edge of technology requires significant investment. But less ambitious adoption of better data management, analytics and visualization can still bring returns through more efficient automation and improved visibility and understanding of historical risks: business intelligence, even without advanced analytics.

Second, many companies lack in-house expertise, and given current skills shortages, that gap is difficult to fill. At the end of 2022, a study by training company Skillsoft found that around three-quarters of IT decision makers globally faced critical skills gaps in their technology departments. The survey of more than 9,000 global IT specialists found that 76% reported skills gaps in their departments, a 145% increase since 2016.

Additionally, recruiters Harvey Nash warned that the growth of the global tech sector is at risk due to a massive skills shortage. A recent survey found that more than two-thirds (67%) of digital leaders globally are now unable to keep up with the pace of change as they struggle to attract the right talent.

Again, however, the problem is not insurmountable. Managed and subscription services are likely to be more efficient solutions for many companies anyway, and there is a steady stream of new providers.

Break down barriers

Nevertheless, the main barriers to greater adoption of data analytics for risk management are internal. The first is often companies' failure to use the internal capabilities they already have. All companies collect at least a few key performance indicators and business performance metrics, but responsibility for these often lies with management.

Passing this responsibility to the analytics function not only makes reporting faster and more accurate, but also frees management to plan and define an analytics strategy.

The second key barrier is data quality. The lack of data is rarely the key problem: the challenge is to capture, process and store it correctly. Without accepted standards for formats, content and accuracy, most of an organization’s effort and expense will be spent cleaning data rather than analyzing it.
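What "accepted standards for formats, content and accuracy" mean in practice can be sketched as simple validation rules applied at the point of capture. The field names and rules below are illustrative assumptions, not an actual standard; real data-governance rules are organization-specific.

```python
import datetime

def validate_record(rec: dict) -> list[str]:
    """Check one risk-data record against simple format and
    content rules; returns a list of violations (empty = clean).
    Rules are illustrative only."""
    errors = []
    if not rec.get("risk_id"):
        errors.append("missing risk_id")
    try:
        # Enforce one agreed date format (ISO 8601, YYYY-MM-DD)
        datetime.date.fromisoformat(rec.get("date", ""))
    except ValueError:
        errors.append("date not in ISO format (YYYY-MM-DD)")
    amount = rec.get("amount")
    if not isinstance(amount, (int, float)) or amount < 0:
        errors.append("amount missing or negative")
    return errors

print(validate_record(
    {"risk_id": "R-17", "date": "2023-05-04", "amount": 1200.0}))  # → []
print(validate_record(
    {"risk_id": "", "date": "04/05/2023", "amount": -5}))  # three violations
```

Checks like these catch format drift (such as ambiguous local date formats) before the data reaches the analytics layer, which is far cheaper than cleaning it afterwards.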

Ensuring acceptable data quality requires strong data governance and management strategies. To be effective, however, it must start with education. When information is collected manually, managers must not only learn how it should be captured, but also why – and the implications for business analytics if something goes wrong. In addition, the effectiveness of the governance strategy should be monitored and reviewed regularly.

Without sufficient quality, no amount of data will be useful. Or, as another famous author, Mark Twain, put it: “Data is like garbage. You better know what you’re going to do with it before you pick it up.”

