Enterprises can have all the data they can imagine, but if they don’t trust their data, it won’t do them much good. In fact, when business leaders don’t trust their data, that data becomes little more than a cost to store, manage and secure. Unfortunately, according to a recent survey, most enterprises don’t trust their data.

According to survey data commissioned by decision intelligence provider Quantexa, only 42% of data and IT leaders believe that all business units trust the accuracy of the data available to them. Meanwhile, 27% do not think their organization is maximizing the value of its data.

These findings align remarkably well with a survey conducted by Deloitte, published last year. That survey found that 67% of senior managers or higher said they’re not comfortable accessing or using data from their tools and resources. And even at enterprises that consider themselves “data-driven,” 37% of respondents to Deloitte’s survey “express discomfort” with their data.

When discussing enterprise data quality, we’re talking about the accuracy, completeness and consistency of data throughout the enterprise. The goal is data quality high enough that the data can be trusted for analysis by humans or machines. Effective data quality management requires continuous monitoring and improvement efforts that tackle the challenges associated with duplication, missing values, inaccuracies and outdated information. Maintaining high-quality data is essential for effective decision-making, cybersecurity risk management and maintaining competitiveness.
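To make those dimensions concrete, here is a minimal sketch of the kind of automated checks such continuous monitoring typically involves. It is written in Python with pandas; the table, column names and staleness threshold are invented for illustration, not drawn from the surveys discussed here.

```python
import pandas as pd

# Hypothetical customer table; names and values are invented for illustration.
df = pd.DataFrame({
    "customer_id": [101, 102, 102, 103, 104],
    "email": ["a@example.com", "b@example.com", "b@example.com", None, "d@example.com"],
    "last_updated": pd.to_datetime(
        ["2023-01-05", "2022-03-10", "2022-03-10", "2023-01-20", "2020-06-01"]
    ),
})

# Completeness: share of missing values in each column.
missing_rate = df.isna().mean()

# Uniqueness/consistency: fully duplicated rows and repeated business keys.
duplicate_rows = df.duplicated().sum()
duplicate_keys = df["customer_id"].duplicated().sum()

# Timeliness: records untouched for more than a year (an arbitrary threshold).
stale = (pd.Timestamp("2023-02-01") - df["last_updated"]).dt.days > 365

print(missing_rate, duplicate_rows, duplicate_keys, stale.sum(), sep="\n")
```

In practice, checks like these run on a schedule against production tables, with the results tracked over time so teams can see whether quality is improving or degrading.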

The Quantexa survey also showed the drag that a lack of trust in enterprise data puts on automation. According to the survey, only 38% of those surveyed have automated operational decision-making in place and believe they can trust its outcomes. Another 23% have some automated decision-making in place, but accuracy needs improvement.

The survey revealed the most pressing data quality challenges enterprises face today: scalability limitations (41%), security (38%), onboarding data to enterprise platforms (34%) and the availability of the necessary skill sets (33%).

Duplication issues remain a challenge for 12% of respondents. According to the survey, duplicated data resides within data lakes, warehouses, and databases. Such duplicates prevent data/IT leaders from getting maximum value from their data. The survey respondents said duplication issues compound into hassles with data reconciliation and remediation (46%), increased exposure to risk (42%), and an inability to make timely and accurate decisions that will positively impact their customers (31%).
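Much of that reconciliation and remediation work starts with finding the duplicates in the first place. The sketch below shows one common first pass, normalizing a matching key before flagging records that appear more than once; it uses Python with pandas, and the source systems, fields and values are all invented for the example.

```python
import pandas as pd

# Hypothetical records landed from two source systems; all names are invented.
records = pd.DataFrame({
    "source": ["crm", "billing", "crm"],
    "name": ["Acme Corp.", "ACME Corp", "Globex Inc."],
    "email": ["info@acme.com ", "INFO@ACME.COM", "hello@globex.com"],
})

# Standardize the matching key first so trivial formatting differences
# (case, stray whitespace) don't hide duplicates.
records["email_norm"] = records["email"].str.strip().str.lower()

# keep=False flags every member of a duplicate cluster, not just the extras,
# which is what a reconciliation team needs to review.
dupes = records[records.duplicated("email_norm", keep=False)]
print(dupes)
```

Production entity-resolution pipelines go well beyond exact key matching, scoring fuzzy similarity across many attributes, but the principle holds: standardize first, then compare.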

More enterprises are finding the key to data success: a data quality management (DQM) program. By implementing a strategy for managing data – from conception to dissolution – organizations can ensure that their critical data resources are reliable, accurate, complete and consistent with the company’s objectives. This process involves profiling, standardizing, cleaning and governing the quality of one’s data. The ultimate aim is to enhance the value of one’s datasets by ensuring top-notch performance at every phase of their lifecycle.
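As a simple illustration of the profiling and standardizing steps in that lifecycle, the sketch below builds a per-column profile and then canonicalizes one field ahead of cleaning. It is a minimal example in Python with pandas; the `profile` helper and the orders table are hypothetical, not part of any particular DQM product.

```python
import pandas as pd

def profile(df: pd.DataFrame) -> pd.DataFrame:
    """Per-column profile: type, completeness, cardinality and a sample value."""
    return pd.DataFrame({
        "dtype": df.dtypes.astype(str),
        "null_pct": (df.isna().mean() * 100).round(1),
        "distinct": df.nunique(),
        "example": df.apply(
            lambda col: col.dropna().iloc[0] if col.notna().any() else None
        ),
    })

# Invented orders table used only to demonstrate the profiling step.
orders = pd.DataFrame({
    "order_id": [1, 2, 3, 4],
    "amount": [19.99, None, 42.50, 42.50],
    "country": ["US", "us", "CA", None],
})

print(profile(orders))

# Standardizing: canonicalize values (here, country codes) before cleaning,
# so downstream rules and joins see one representation per entity.
orders["country"] = orders["country"].str.upper()
```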

The data quality tools market is expected to grow nearly 18% annually from 2023 through 2028. These tools help enterprises manage data cleansing, integration, master data management, and metadata management, according to Mordor Intelligence. “As data quality is a significant stake for large organizations, software companies propose increasing numbers of tools focusing on these issues. The focus of these tools is changing from specific applications to a more global view that includes all aspects of data quality,” the research firm stated.

The Quantexa survey was completed in partnership with Censuswide and is based on a study of more than 365 IT and data decision-makers in the UK, USA and Canada. The survey took place in January 2023.