There are many ways to launch a digital business transformation initiative, but the one thing almost all successful efforts have in common is that data is processed and analyzed in near real-time, at the point where it is created and consumed.

Digital CxOs, unfortunately, often overlook how complex an endeavor achieving that goal is from an IT perspective. The bulk of IT systems employed today process data in batch mode. As a result, 24 hours or more may pass between when an event occurs and when backend applications are updated to reflect it. This explains why, for example, a retailer's website will report that an item is in stock, but when a customer arrives to purchase it, that item is no longer available.

All things being equal, most customers would prefer that vendors not list items as available when the odds are they are not. That has become especially problematic in the COVID era, when supply shortages have made it exceedingly difficult for many organizations to produce anything approaching an accurate forecast.

The only way to resolve that issue is to process and analyze data in near real-time. The minute an item is sold, the sale should be reflected in all the backend applications that track inventory across an extended supply chain. As simple as that might seem, it typically requires an IT team to build out applications and infrastructure capable of processing data at the edge and then streaming the results back to any number of backend applications, rather than waiting for the data to be shipped back in batch mode to either a local data center or a cloud service.
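
To make the contrast with batch processing concrete, here is a minimal sketch of the producing side of such a pipeline, written with the open source Apache Pulsar client (one of the frameworks discussed below). The broker address, topic name and event fields are illustrative assumptions for this sketch, not details of any particular retailer's or vendor's system.

```python
import json

import pulsar  # pip install pulsar-client

# Broker address and topic name are illustrative assumptions.
client = pulsar.Client('pulsar://localhost:6650')
producer = client.create_producer('inventory-events')

def record_sale(sku: str, store_id: str, quantity: int) -> None:
    """Publish an 'item sold' event the moment the sale completes,
    instead of writing it to a file for an overnight batch load."""
    event = {'type': 'item_sold', 'sku': sku,
             'store': store_id, 'qty': quantity}
    producer.send(json.dumps(event).encode('utf-8'))

record_sale('SKU-12345', 'store-042', 1)
client.close()
```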

In fact, that requirement is why there is so much interest these days in event-driven architecture (EDA) data streaming platforms such as Apache Kafka and Apache Pulsar. The challenge is that implementing those frameworks requires a significant amount of IT expertise that has not always been readily available. Heading into 2022, however, that's about to change. Cloud service providers that specialize in integration, such as Boomi, are gearing up to launch services that promise to make it simpler for just about any organization to process data in near real-time. Boomi Event Streams, for example, is an EDA service that will be embedded within the Boomi data integration service. It will be based on the Apache Pulsar framework, said Ed Macosky, senior vice president and head of products for Boomi.
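
The backend half of the pipeline is similarly compact. The sketch below shows a Pulsar consumer that subscribes to the same stream and applies each sale as it arrives; again, the broker address, topic and subscription names are assumptions made for illustration, and the sketch stands in for whatever inventory update a real backend would perform.

```python
import json

import pulsar  # pip install pulsar-client

client = pulsar.Client('pulsar://localhost:6650')  # broker address assumed

# A 'Shared' subscription lets several backend workers split the stream;
# the topic and subscription names are illustrative assumptions.
consumer = client.subscribe(
    'inventory-events',
    subscription_name='warehouse-inventory',
    consumer_type=pulsar.ConsumerType.Shared,
)

while True:
    msg = consumer.receive()          # blocks until the next event arrives
    event = json.loads(msg.data())
    # Apply the update within moments of the sale, rather than a day
    # later via a batch load.
    print(f"decrement {event['sku']} at {event['store']} by {event['qty']}")
    consumer.acknowledge(msg)         # mark the event as processed
```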

The goal is to significantly reduce the level of IT acumen required to take advantage of these technologies by embedding them in a cloud service that can be invoked via a graphical user interface (GUI) or a set of application programming interfaces (APIs) that Boomi exposes.

Regardless of approach, the ability to process data in near real-time is about to become more widely accessible. The implications for digital business transformation initiatives will be nothing less than profound as a lot of the heavy IT lifting that was previously required becomes just another automated cloud service.