The path that application delivery has taken over the last 20 years is a story of convergent evolution. More specifically, this era has seen the development of multiple delivery technologies that, in hindsight, have revealed themselves to be the initial, nascent evolutionary steps toward what is becoming the application edge.
This is the first in a series of blogs that looks at application-related technologies whose tide will come in during the next few years, especially as we evolve toward a more fully dispersed application delivery fabric and the emergent role of the edge. We'll begin by examining where we are today and the paths taken so far.
The payoffs of a well-constructed data architecture go beyond operational processes. The benefits extend into strategic operational efficiencies, deeper business insights, and the ability to pursue adjacent business opportunities, all executed in a more agile manner.
A thoughtful and deliberate data strategy is fundamental to enabling the quality and cost-effectiveness of the most important business workflows. Further, when the workflows are instrumented to transmit their observed data exhaust to a collection and analysis infrastructure, the workflows themselves can be continuously analyzed and improved, resulting in constantly adaptive and optimized business workflows.
The first step in a discussion about data architecture is to define what the concept of "data architecture" encompasses. Unsurprisingly, the answer turns out to be nuanced—it is layered and multifaceted. To help ground the discussion, it is useful to start by thinking about data architecture in terms of the journey of collected telemetry data.
"Data is the new oil" or "Data is the grease of the digital economy." If you're like me, you've probably heard these phrases, or perhaps even the more business school-esque phrase "data exhaust monetization," to the point of being clichés. But like all good clichés, they are grounded in a fundamental truth, or in this case, in a complementary pair of truths.