Centralized data warehousing and analytics architectures are under attack by the very trends that made them a household name. A massive proliferation of sensing and computing platforms is hitting the enterprise as cost-effective, mobile, and highly connected devices drive the build-out of the much-discussed “Internet of Things”. Without a rethinking of the processing paradigms of the past, today’s enterprise will be ill-prepared to deal with, let alone thrive in, this new (IT) normal. Consider the building blocks of this new reality.
In this new normal, the exponential expansion of edge-based computing and data generation will far outstrip the linear expansion of centralized capabilities. Simply put, data has traditionally been moved to the analysis. With the proliferation of data sources, endpoints, and platforms, this model grows increasingly untenable. The mismatch creates a highly disruptive force and exposes the enterprise to unacceptable risks.
Analysis, applications, and even complete solutions are generally small, controllable, and mobile. So why not take the analysis to the data? Reversing the paradigm that is under so much pressure yields agility, speed, and cost advantages that work with, rather than against, the explosive growth of technology, endpoints, and data.
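The reversal can be sketched concretely. In the hypothetical example below (node names and readings are invented for illustration), the traditional paradigm ships every raw reading to a central analyzer, while the reversed paradigm pushes a small computation to each edge node and moves only a tiny summary:

```python
# Each "edge node" holds raw sensor readings that, ideally, never leave it.
edge_nodes = {
    "plant-a": [21.0, 22.5, 19.8, 20.1],
    "plant-b": [30.2, 29.9, 31.0],
    "plant-c": [15.5, 16.1],
}

def centralized_average(nodes):
    """Old paradigm: move every raw reading to the analysis."""
    all_readings = [r for readings in nodes.values() for r in readings]
    return sum(all_readings) / len(all_readings), len(all_readings)

def edge_average(nodes):
    """New paradigm: move the analysis to the data; ship only (sum, count)."""
    summaries = [(sum(r), len(r)) for r in nodes.values()]  # computed at each edge
    total, count = map(sum, zip(*summaries))
    return total / count, len(summaries)  # one tiny tuple per node crosses the wire

central_avg, raw_values_moved = centralized_average(edge_nodes)
edge_avg, summaries_moved = edge_average(edge_nodes)
print(central_avg, raw_values_moved)  # 9 raw values moved centrally
print(edge_avg, summaries_moved)      # same answer, 3 small summaries moved
```

Both paths produce the identical answer, but the edge path moves one small payload per node instead of every raw value, and the gap widens as endpoints and data volumes grow.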
An agile, cloud-enabled, and intelligent data and analytics fabric is needed for enterprises to successfully address this new normal.
The first challenge for this fabric is to maximize the use of, and value derived from, edge-based computing and data-management capabilities. This is a core requirement for dealing with the explosion of endpoints described above. A second challenge is selecting, often dynamically, which edge-originated content should be aggregated (at higher levels of the hierarchy) with content from other edge “zones” for superior returns. In a highly distributed system, the “economics” of data movement and processing must be continually evaluated to ensure optimal use of, and return on, one’s investments. Once the selection is made, the third challenge presents itself: how to forward the right data securely, cost-effectively, and quickly to the right aggregation point(s) for secondary processing. In short, such a fabric must meet a challenging set of requirements:
Most importantly, this fabric must simultaneously support both hierarchical and horizontal deployment of embedded “intelligence” across the processing space, to maximize flexibility for widely varying environments.
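The second and third challenges can be reduced to a simple sketch. Below, the “economics” of data movement is modeled (purely hypothetically; every name, size, and cost figure is invented) as a single value-versus-transfer-cost comparison per candidate dataset, with the survivors routed to their designated aggregation points:

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    zone: str               # edge zone that produced the content
    size_mb: float          # how much data would have to move
    est_value: float        # estimated value of cross-zone aggregation (arbitrary units)
    aggregation_point: str  # where secondary processing would occur

COST_PER_MB = 0.05          # hypothetical per-MB transfer-plus-processing cost

def select_for_aggregation(candidates):
    """Forward a candidate only when its estimated value exceeds the cost of moving it."""
    plan = {}
    for c in candidates:
        if c.est_value > c.size_mb * COST_PER_MB:
            plan.setdefault(c.aggregation_point, []).append(c.zone)
    return plan

candidates = [
    Candidate("zone-east", size_mb=500.0, est_value=30.0, aggregation_point="regional-hub"),
    Candidate("zone-west", size_mb=4000.0, est_value=50.0, aggregation_point="regional-hub"),
    Candidate("zone-south", size_mb=50.0, est_value=40.0, aggregation_point="central-core"),
]

print(select_for_aggregation(candidates))
# zone-west stays at the edge: moving 4000 MB costs 200, more than its estimated value of 50
```

A real fabric would replace the static cost constant with continuously re-evaluated network, storage, and compute prices, which is precisely why the selection must be dynamic rather than fixed at design time.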
For such a fabric to succeed, providers must address the traditional challenges: the degree of “openness”, feature breadth versus depth, and the many barriers to enterprise adoption. The industry has excelled at producing platforms, generating data, and providing connectivity.
Now comes the hard part: realizing business value by accelerating innovation, improving the speed and quality of business decisions, and orchestrating optimized business processes.
Article written by Tom Fountain and Phil Moyer