One of the issues that has made investments in IT frustrating over the decades is that much of the data being collected remains largely inaccessible. There are countless applications that house data but invariably organizations have needed to hire specialists that have mastered SQL and various application programming interfaces (APIs) to make any sense of it all.
Rocket Software announced this week that those days are coming to an end via an update to its data management platform that adds a set of visualization tools that promise to make data accessible to all users, regardless of their background in data science.
Chris Wey, president of data modernization for Rocket Software, said Rocket Data Intelligence automates the management of data from the moment it's created. The platform scans the metadata any platform exposes to create a repository of that metadata. That repository then makes it possible to classify data and identify dependencies.
Armed with those lineage insights, it also becomes possible to define workflows to update data regardless of where it is stored. Organizations, for example, can identify superfluous data they no longer need to store or, as part of a modernization effort, consolidate redundant management platforms.
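The scan-classify-trace flow described above can be sketched in a few lines of Python. Every class, method, and rule here is an illustrative assumption about how such a metadata repository might work, not an actual Rocket Data Intelligence API:

```python
# Hypothetical sketch of a metadata repository: scan the metadata a
# source exposes, classify columns, and trace lineage dependencies.
# Names and classification rules are illustrative assumptions only.
from dataclasses import dataclass, field


@dataclass(frozen=True)
class Column:
    source: str   # system the metadata was scanned from
    table: str
    name: str


@dataclass
class MetadataRepository:
    columns: list = field(default_factory=list)
    lineage: dict = field(default_factory=dict)  # column -> set of upstream columns

    def scan(self, source, tables):
        """Record the table/column metadata a source exposes."""
        for table, cols in tables.items():
            for c in cols:
                self.columns.append(Column(source, table, c))

    def classify(self, column):
        """Tag columns by naming convention -- a stand-in for real classifiers."""
        if "ssn" in column.name or "email" in column.name:
            return "sensitive"
        return "general"

    def add_lineage(self, downstream, upstream):
        self.lineage.setdefault(downstream, set()).add(upstream)

    def upstream_of(self, column):
        """Walk the dependency graph to collect every upstream column."""
        seen, stack = set(), [column]
        while stack:
            for up in self.lineage.get(stack.pop(), ()):
                if up not in seen:
                    seen.add(up)
                    stack.append(up)
        return seen


repo = MetadataRepository()
repo.scan("mainframe_db2", {"CUSTOMER": ["cust_id", "email"]})
repo.scan("cloud_warehouse", {"customers_dim": ["cust_id", "email"]})

raw = Column("mainframe_db2", "CUSTOMER", "email")
curated = Column("cloud_warehouse", "customers_dim", "email")
repo.add_lineage(curated, raw)

print(repo.classify(raw))                 # sensitive
print(raw in repo.upstream_of(curated))   # True
```

Once metadata from both the mainframe and the cloud warehouse lives in one repository, the same lineage walk that answers "where did this field come from?" can also flag the redundant copy as a consolidation candidate.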
Very few organizations excel at data management, which is becoming a major issue given how dependent digital business transformation initiatives are on being able to access reliable sources of pristine data to automate processes. Organizations today require a data intelligence platform to make sense of massive amounts of data that continue to increase exponentially month over month. “It’s an unsolved problem,” says Wey.
The Rocket Data Intelligence platform traces its lineage back to ASG Technologies, which Rocket Software acquired in 2021. The company is now also attempting to acquire Software AG. However, in the face of rival bids from venture capital firms, it’s not clear whether Rocket Software will prevail. Regardless of that outcome, Rocket Software can already aggregate data residing everywhere from the mainframe to the cloud.
The challenge organizations face today can be traced back to data management sins of the past. Most organizations did not focus much on best practices for managing data. Those issues are now coming home to roost as various automation initiatives get bogged down by the need to resolve conflicting data sets strewn across the enterprise. Those issues are only going to be compounded further as organizations look to build artificial intelligence (AI) models that require accurate sources of data to ensure optimal outcomes.
It might be a while before many organizations straighten out their data management strategies, but a lot of progress is being made. Some organizations have even gone so far as to appoint chief data officers who are specifically tasked with turning data into an asset that can be more easily exploited.
In fact, at this juncture it’s not so much a question of whether best practices will be applied to data management as it is how much pain will be experienced along the way.