Sunday, May 18, 2025

Bad data can kill your organisation

By Matt Young, Senior Vice President, APAC, Nutanix

With the rise of the digital economy, data has become an organisation’s most valuable asset. Business operations and processes have become data-dependent and data-driven.

A recent study by Digital Realty shows that data adds more than US$1.7 trillion to the world’s seven richest economies (G7). On a standalone basis, the value of this data would represent the world’s 10th largest economy, leapfrogging South Korea, Russia and Canada.

Across industries, companies are busy developing strategies to identify, capture and optimise the use of data in business decision making. The hidden problem many companies face is that while good quality data is a true business enabler, bad data can set back research, reduce or destroy competitiveness and hinder innovation.

Bad data refers to data that is incorrect, incomplete, incomprehensible, irrelevant, out of date or stored in the wrong place. Practically speaking, poor data wastes sales time, distracts data scientists and consumes IT time synchronising systems that cannot communicate with each other. All of this erodes trust in the “numbers” and stalls decision making by executives.

Many enterprises struggle with the accuracy of the data they use for day-to-day activities. No industry or organisation is immune, and if the problem is not remedied quickly, it can result in serious financial and reputational loss.

And as customer experience begins to define brands, bad data can have a significant impact on the bottom line. In the US alone, 89 percent of executives believe inaccurate data is undermining a good customer experience. To put this into perspective, 93 percent of Singapore organisations use data for critical, automated decision-making. Analysis built on bad data carries major business cost, with inaccurate insights sabotaging expansion plans or business purchases, some worth millions of dollars.

Gartner research has found that organisations around the world believe poor data quality to be responsible for an average of US$15 million per year in losses. As companies digitalise, their data and information environments become more complex and the losses are likely to increase unless the bad data problem is tackled in a timely manner.

The lack of awareness about the need to nurture data is widespread. While data-savvy companies like Amazon, Google and Airbnb use their data to map and model customer behaviour in order to serve customers better, most companies have no clear view of their data.

Nearly 60 percent of organisations do not measure the annual financial cost of poor-quality data.[5] Gartner highlights that leading information-driven organisations proactively measure the value of their information assets, as well as the cost of poor-quality data and the value of good-quality data. This gives them an advantage in the marketplace.

The Monetary Authority of Singapore took a necessary first step with the implementation of a Suspicious Transaction Reporting form in August last year, emphasising the importance of good, clean and usable data within the financial services industry.[6] If we are to instil a data-conscious mindset across critical industry sectors, more government agencies need to take note and push similar agendas.

So, what steps can today’s CIOs take in order to clean up their data?

  1. Centralise: Cleaning bad data out of the stack is not a simple one-off event; think long term. Rather than chasing the individual channels through which data enters the company, concentrate on a centralised strategy for data management, and evolve it to detect bad data at the source.
  2. Consolidate: Large organisations have multiple databases run by different departments, as well as other data sources they are unaware of, a problem compounded for enterprises with multiple branch locations. Identifying and consolidating databases and information repositories minimises the creation of bad data and aids standardisation of company data.
  3. Standardise: Analyse your data to understand it better. The most common reason companies end up with bad data is a lack of standardisation in the collection process. Using a standardised set of parameters, not only within the company but also with suppliers and partners, helps maximise the clean data coming into the enterprise (see the sketch after this list).
  4. Investigate: Understand the nature of the corruption in the data. Look for corroborating data to establish a baseline; this provides an opportunity to fix anomalies and restore the data to pristine quality.
  5. Eliminate: Duplicate data is a major cause of data inaccuracy and arises from the multiple repositories mentioned earlier, compounded by human error in the process. Use the consolidation process as an opportunity to eliminate duplicates and arrive at the standardised baseline (the sketch below shows one approach). Getting there may take time but is a must for quick access to customer information and improved business intelligence.
  6. Sanitise: Cloud platforms, particularly hybrid cloud, provide an ideal environment to clean and sanitise data, with numerous data-cleaning tools available.
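To make the standardise, investigate and eliminate steps concrete, here is a minimal sketch in Python using pandas. It is illustrative only: the column names (name, email, phone, updated_at) and the choice of email address as the deduplication key are assumptions made for this example, not a prescription for any particular system.

```python
import pandas as pd

def clean_customers(df: pd.DataFrame) -> pd.DataFrame:
    """Standardise, flag and deduplicate customer records.

    Columns (name, email, phone, updated_at) are hypothetical
    placeholders for whatever fields an enterprise actually holds.
    """
    out = df.copy()

    # Standardise: apply one set of formatting rules to every record.
    out["name"] = out["name"].str.strip().str.title()
    out["email"] = out["email"].str.strip().str.lower()
    out["phone"] = out["phone"].str.replace(r"[^\d+]", "", regex=True)

    # Investigate: flag records that fail a basic validity check
    # instead of silently dropping them, so anomalies can be fixed.
    valid = out["email"].str.contains(
        r"^[^@\s]+@[^@\s]+\.[^@\s]+$", regex=True, na=False
    )
    out["suspect"] = ~valid

    # Eliminate: keep the most recently updated record per email,
    # treating the (now standardised) address as the natural key.
    out = (
        out.sort_values("updated_at", ascending=False)
           .drop_duplicates(subset="email", keep="first")
    )
    return out

if __name__ == "__main__":
    records = pd.DataFrame({
        "name": ["  alice tan ", "Alice Tan", "Bob Lim"],
        "email": ["Alice@example.com ", "alice@example.com", "bob@example.com"],
        "phone": ["+65 8123 4567", "81234567", "9876-5432"],
        "updated_at": ["2024-01-02", "2023-06-30", "2024-03-15"],
    })
    print(clean_customers(records))
```

Note that deduplication only becomes reliable after standardisation: “Alice@example.com ” and “alice@example.com” look distinct until formatting rules make them comparable. In practice, rules like these would run as close to the point of entry as possible, in line with the centralisation step above, rather than as a periodic batch clean-up.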

There is broad consensus that, used correctly, data can help fuel the enterprise, add true value and greatly benefit the business. Far less appreciated is the harm bad data can do to a company’s reputation, efficiency and profitability.

As we increase our dependence on data, the value of that data will increase. Taken care of and managed correctly, data is an asset with limitless possibilities for enterprises. Mismanaged and mishandled, it can drive dramatic and costly decline.

In a data-driven future, knowing the cost of bad data could become a matter of survival for every enterprise.
