Analytics on the Edge

The proliferation of computing platforms, data, and monetization opportunities drives the need for a new “data and analytics fabric”

Centralized data warehousing and analytics architectures are under attack by the very trends that made them a household name. A massive proliferation of sensing and computing platforms is hitting the enterprise, as cost-effective, mobile, and highly connected devices drive the build-out of the much-discussed “Internet of Things”. Without a rethinking of the processing paradigms of the past, today’s enterprise will be ill-prepared to deal with, let alone thrive in, this new (IT) normal. Consider the building blocks of this new reality:

  • Massive data flows are generated “at the edge” by the pervasive use of mobile devices and sensors that spin off volumes of raw and intermediate data;
  • There is an acceleration of highly distributed application environments comprised of legacy infrastructures, cloud-based resources, and mobile computing platforms; and
  • There is pervasive connectivity between all elements of the broader ecosystem with extreme aggregate bandwidth.

In this new normal, the exponential expansion of edge-based computing and data generation will far outstrip the linear expansion of centralized capabilities. Simply put, data has traditionally needed to be moved to the analysis. With the proliferation of data sources, endpoints, and platforms, this model will grow increasingly untenable. This mismatch creates a highly disruptive force, leading to unacceptable risks for the enterprise:

  • Security risk – With each movement and duplication of stored data, the surface area for potential security attacks increases
  • Cost risk – Because data is stored and transferred multiple times, staff, storage, and compute costs accelerate exponentially faster than the actual growth of the underlying data
  • Business risk – As computing moves to the edge, the freshness of data in centralized warehouses will decline. Businesses risk incorrect answers and lost opportunities because of the time and complexity of extracting, transforming, loading, and processing data from all endpoints simultaneously

Analysis, applications, and even solutions are generally small, controllable, and mobile. So why not take the analysis to the data? By reversing the paradigm that is under so much pressure, one can realize agility, speed, and cost advantages that work with, rather than against, the explosive growth of technology, endpoints, and data.
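To make this inversion concrete, the sketch below ships a small analysis function to the nodes that hold the data and moves only the results back. It is a minimal illustration under assumed names, not a product design: the node names, their readings, and the summarize function are all hypothetical.

```python
# Sketch: move the analysis to the data instead of moving the data to the analysis.
# Everything here is illustrative; "EdgeNode", its data, and the helper names are hypothetical.

from statistics import mean

class EdgeNode:
    """A stand-in for a device or site that holds raw data locally."""
    def __init__(self, name, readings):
        self.name = name
        self.readings = readings  # raw data that never leaves the node

    def run(self, analysis):
        """Execute a small analysis function locally and return only its result."""
        return analysis(self.readings)

def summarize(readings):
    """A compact analysis: summarize readings instead of shipping them."""
    return {"count": len(readings), "mean": mean(readings), "max": max(readings)}

nodes = [
    EdgeNode("plant-a", [21.0, 22.5, 23.1]),
    EdgeNode("truck-7", [18.2, 19.9]),
]

# Only small summaries cross the network; the raw readings stay at the edge.
results = {node.name: node.run(summarize) for node in nodes}
print(results)
```

The point is the shape of the exchange: a few lines of code and a handful of summary values cross the network, while the bulk of the raw data stays where it was generated.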

An agile, cloud-enabled, and intelligent data and analytics fabric is needed for enterprises to successfully address this new normal.

The first challenge for this fabric is to maximize the use of, and the value derived from, edge-based computing and data management capabilities. This is a core requirement for dealing with the explosion of endpoints described above. A second challenge is the ability to select, often dynamically, which edge-originated content has the potential to be aggregated (at higher levels of the hierarchy) with content from other edge “zones” for superior returns. In a highly distributed system, the “economics” of data movement and processing must be continually evaluated to ensure optimal use of, and return on, one’s investments. Once the selection decision is made, the third challenge presents itself: how to securely, cost-effectively, and quickly forward the right data to the right aggregation point(s) for secondary processing and achievement of results (a simple sketch of this select-and-forward pattern follows the requirements below). In short, such a fabric must meet a challenging set of requirements:

  • Compatible – The fabric must function across legacy and new technologies
  • Deployment agnostic – On-premise, off-premise, and hybrid environments must all work and be managed together seamlessly
  • Lightweight – Minimal computing resources should be required, allowing the fabric to scale from legacy hardware to virtualized cloud platforms to mobile and new micro-compute devices
  • Secure – The fabric must secure connections inside and outside the enterprise, minimize data movement, and be centrally managed and auditable
  • Atomically elastic and scalable – The ability to scale up and down at the workload level rather than the machine level allows efficient utilization of resources and components across all platforms
  • Intelligent – The fabric must be imbued with the capability to sense workload, resources, capabilities, and even stressors within a given environment, and respond with programmatic actions to maintain performance, security, and integrity

Most importantly, this fabric must simultaneously support both a hierarchical and horizontal deployment of embedded “intelligence” across the processing space to maximize flexibility for widely varying environments.
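To illustrate the select-and-forward pattern referenced above, the sketch below has each edge “zone” summarize its own raw windows, apply a local policy to decide which summaries justify the cost of movement, and push only those up to a regional aggregation point. The zone names, the deviation-based policy, and the in-memory aggregator are assumptions made purely for illustration, not a prescription for how such a fabric must be built.

```python
# Sketch of the select-and-forward pattern: each edge zone scores its own data locally
# and forwards only the summaries deemed worth aggregating one level up the hierarchy.
# All names, thresholds, and the in-memory "aggregator" are hypothetical.

from statistics import mean, pstdev

FORWARD_THRESHOLD = 2.0  # assumed policy: forward windows that deviate strongly from normal

def summarize_window(zone, window):
    """Reduce a raw window of readings to a small summary record."""
    return {"zone": zone, "count": len(window), "mean": mean(window), "stdev": pstdev(window)}

def worth_forwarding(summary, baseline_mean):
    """A stand-in for the 'economics' check: only unusual windows justify the move."""
    if summary["stdev"] == 0:
        return False
    deviation = abs(summary["mean"] - baseline_mean) / summary["stdev"]
    return deviation >= FORWARD_THRESHOLD

def edge_zone(zone, windows, baseline_mean, aggregator):
    """Process raw windows locally; push only selected summaries to the aggregation point."""
    for window in windows:
        summary = summarize_window(zone, window)
        if worth_forwarding(summary, baseline_mean):
            aggregator.append(summary)  # a small record moves up the hierarchy
        # otherwise the raw data stays (and can be processed or discarded) at the edge

regional_aggregator = []  # stands in for a higher-level zone in the hierarchy
edge_zone("zone-east", [[10, 11, 10], [30, 31, 29]], baseline_mean=10.5, aggregator=regional_aggregator)
edge_zone("zone-west", [[10, 10, 11]], baseline_mean=10.5, aggregator=regional_aggregator)
print(regional_aggregator)  # only the anomalous window from zone-east is forwarded
```

The same pattern can repeat at each level of the hierarchy, and the forwarding policy itself can be managed and updated centrally, which is where the “intelligent” and “centrally managed” requirements above come into play.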

Looking forward, we can expect rapid innovation in every corner of the Internet of Things.

For such a fabric to be successful, it will be incumbent upon providers to address traditional challenges: the degree of “openness”, the trade-off between feature breadth and depth, and the many barriers to enterprise adoption. The industry has excelled at producing platforms, generating data, and providing connectivity.

Now comes the hard part: realizing business value by accelerating innovation, improving the speed and quality of business decisions, and orchestrating optimized business processes.

Article written by Tom Fountain and Phil Moyer