The Future of Cloud and IoT is . . . also at the Edge

It seems to have become a received wisdom that, for M2M and IoT applications in the future, all the processing of remote machine data will take place in the cloud. But is that really the case? As far as we can see, intelligent devices at the edge are getting more intelligent – not less. So what’s really going on?

This issue was explored in detail in a recent white paper by Beecham Research for Oracle. Follow this link to download.

Firstly, this is not just an esoteric argument. It does matter where the intelligence for applications actually sits. It matters for security, for power requirements, for data flows, for speed of response and for robustness of the solution. It has a big impact on the architecture of the solution, and therefore on how it is supported.

The thinking goes that M2M – and in particular IoT – is all about lots of dumb sensors out there with their data being sent to the cloud for processing. So where are all these dumb sensors? Most of them are currently attached to machines. In the future, even more will be attached to those machines, but in addition they will be in the environment immediately surrounding those machines as well. Why? At the moment, to monitor what is happening, but in the future we will need more sensor points in order to optimise performance. Therein lies the true opportunity – moving from remote monitoring to performance optimisation. In other words, automated control.

Sensors will in future be everywhere. No doubt about that. Good times for sensor manufacturers. So where will the data from these be processed for performance optimisation, or for traffic control or a myriad other real time applications? At the centre? What if the network goes down – does everything stop? Does it really make sense to send all data to the centre for processing, then send it back to remotely control a machine or device? How fast is the response time for that sort of solution?

Clearly a more robust solution is to have processing at the edge and at the centre, in a hierarchy. At the edge for speed of response and robustness, while at the centre perhaps for support, maintenance and Big Data analysis. One of the key new opportunities that IoT is aiming to address is sharing data between applications to create cross-sector service opportunities. Depending on the application and the requirements, that might mean intelligent devices at the edges of networks connected not just to the Internet but to other devices as well. Business value will come from using the large amounts of resulting data and acting on it quickly, as close to the source and as automatically as possible, to create new services.
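To make the hierarchy concrete, here is a minimal sketch of the pattern described above: an edge node makes control decisions locally, with no network round trip, and forwards summaries to the centre on a best-effort basis. Everything here is illustrative and hypothetical — the class, the `cloud_send` callable and the threshold are assumptions for the sketch, not any particular vendor's API.

```python
from collections import deque

class EdgeController:
    """Illustrative edge node: acts on sensor readings locally and
    forwards summaries to the cloud when the link is available.
    All names here are hypothetical, invented for this sketch."""

    def __init__(self, cloud_send, threshold=75.0):
        self.cloud_send = cloud_send  # callable; may raise ConnectionError
        self.threshold = threshold
        self.backlog = deque()        # summaries awaiting upload

    def on_reading(self, sensor_id, value):
        # Local control decision: no network round trip needed.
        action = "throttle" if value > self.threshold else "ok"
        self._report({"sensor": sensor_id, "value": value, "action": action})
        return action

    def _report(self, summary):
        self.backlog.append(summary)
        # Best-effort upload; the device keeps working if the cloud is down.
        try:
            while self.backlog:
                self.cloud_send(self.backlog[0])
                self.backlog.popleft()
        except ConnectionError:
            pass  # keep the backlog and retry on the next reading

# Usage: simulate a cloud link that fails, then recovers.
received = []
link_up = False

def cloud_send(summary):
    if not link_up:
        raise ConnectionError("uplink down")
    received.append(summary)

edge = EdgeController(cloud_send)
print(edge.on_reading("temp-1", 82.0))   # "throttle" — acts despite the outage
link_up = True
print(edge.on_reading("temp-1", 70.0))   # "ok" — backlog flushes with the link back
print(len(received))                     # 2 — both summaries reached the centre
```

The design choice this illustrates is exactly the one argued for in the text: the control loop stays at the edge, so a network failure degrades reporting, not operation, while the centre still eventually receives the full data stream for analysis.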

In other words, the future for efficient and effective cloud processing for IoT is to have huge amounts of processing at the edge as well. That’s where it gets a bit more complex . . .


1 comment
  • David

    Finally, a thoughtful statement involving cloud computing (formerly remote measurement and control ) that addresses significant issues like what happens if the network fails do cloud connected resources, machinery, toasters, printing presses, electricity generation etc just go off line? Of course not, so what’s so good about remoting the control away from the point of use? Significant computing power is available in very small packages and environmentally compatible packaging. Control systems locally, measure and report performance as needed to other locations, but assume that the system output must continue even when remote systems and services fail. Good system design addresses safe operations in sub-optimal conditions. In closing generate reports, but first make sure that internet-connected devices soldier-on safely when their connection to the “collective” is lost or impaired.