
A survey of 500 data and IT leaders who manage data workloads of 150 terabytes or more, published today, finds 95% of respondents are investing in data analytics infrastructure, with 59% looking to switch data warehouse providers.


Conducted by Ocient, a provider of data analytics services, the survey finds that 35% of those switching data warehouse platforms are already doing so, while 55% are still considering their options with an eye toward deciding which platform to adopt in the next six to 12 months. A total of 44% also want to streamline the number of platforms they need to manage, the survey finds.

A full 94% said it’s somewhat or very important to increase the amount of data their organization analyzes in the next one to three years, with 55% reporting they expect the amount of data they need to store and analyze to grow quickly in that same period.

That growth in data volume suggests legacy data warehouse platforms are not up to the challenge of managing what is rapidly becoming petabytes of data that organizations need to analyze, says Ocient CEO Chris Gladwin.

There’s also a lot more concern about the quality of the data being collected as organizations prepare to invest in artificial intelligence (AI), he added. Nearly half of respondents (47%) cited data quality as one of their top priorities. A slightly lower percentage (44%) are prioritizing the addition of AI and machine learning (ML) capabilities, the survey also finds.

Overall, the survey also suggests there is less of a tendency to view the cloud as the default option for deploying data warehouses, says Gladwin. A total of 41% are looking to adopt flexible hybrid or multi-cloud data analytics solutions, the survey finds.

More organizations are now moving toward storing and analyzing data where it makes the most financial and technical sense, rather than automatically trying to store every dataset in the cloud, notes Gladwin. Over time, the cost of processing large volumes of data in the cloud can exceed that of an on-premises IT environment. “It’s no longer a one size fits all approach,” says Gladwin.

Most organizations have, to varying degrees, been looking to make more fact-based decisions faster as part of the digital business transformation initiatives they have launched. The challenge most of them invariably encounter is that the practices historically used to manage data have been inconsistent. The end result is a lot of low-quality data that often conflicts with other data sources. Most organizations would be well advised to define their long-term goals for addressing data quality before launching a major AI initiative, notes Gladwin.

Regardless of the approach, data management is inexorably moving toward being a top-of-mind IT issue as it becomes clear that an organization’s ability to compete going forward will be tied to the quality of the data it collects, stores and analyzes. The trouble is the foundation upon which much of that data is managed today is not nearly as stable as most business leaders tend to assume.