Cutting-edge technology

Cloud computing / fog computing / edge computing / mist computing

Introduction

From the perspective of IoT practitioners, there is often a need for more usable and more distributed computing. When it comes to integrating the Internet of Things with OT and IT systems, the first problem is the sheer volume of data that devices send to the server. In a factory automation scenario, there may be hundreds of integrated sensors, each sending three data points every second. Most of that sensor data is completely useless after five seconds, yet hundreds of sensors, multiple gateways, multiple processes, and multiple systems need to process it almost instantaneously.

Most data-processing approaches favor the cloud model, in which everything is always sent to the cloud. This is also the first IoT computing model.

1. IoT Cloud Computing

In the IoT cloud computing model, you basically push your sensor data to the cloud and process it there. An ingest module receives the data and stores it in a data lake (a very large data store), the data is then processed in parallel (with Spark, Azure HDInsight, Hive, etc.), and the resulting information is used to make decisions at a fast rhythm.
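To make the ingest side concrete, here is a minimal sketch of a device pushing its readings to a cloud ingestion endpoint over plain HTTP; the endpoint URL, payload fields, and one-second cadence are illustrative assumptions, not tied to any particular provider.

```python
# Minimal sketch (standard library only): a device posting sensor readings to a
# cloud ingest endpoint. The URL and payload layout are illustrative assumptions.
import json
import time
import urllib.request

INGEST_URL = "https://ingest.example-cloud.com/telemetry"  # hypothetical endpoint

def read_sensor():
    """Stand-in for a real sensor driver; returns three data points."""
    return {"temperature": 21.7, "vibration": 0.03, "pressure": 101.2}

while True:
    payload = {"device_id": "sensor-42", "ts": time.time(), **read_sensor()}
    req = urllib.request.Request(
        INGEST_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)  # the cloud-side ingest module lands this in the data lake
    time.sleep(1)                # three data points every second, as in the factory scenario
```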

Since people first started building IoT solutions, many new products and services have emerged that make this much easier:

You can use AWS Kinesis and its big data and Lambda services (see the sketch after this list)


You can take advantage of Azure's ecosystem, which makes building big data capabilities extremely easy

Or you can use Google Cloud products such as Cloud IoT Core
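As one concrete (and hedged) example of the AWS route above, the sketch below writes a single sensor reading into a Kinesis data stream with boto3; the stream name and record fields are assumptions, and credentials and region are expected to come from the usual AWS configuration.

```python
# Sketch: pushing one sensor reading into an AWS Kinesis data stream via boto3.
# Stream name and record fields are illustrative; AWS credentials and region
# come from the normal environment/config.
import json
import time

import boto3  # pip install boto3

kinesis = boto3.client("kinesis")

record = {"device_id": "sensor-42", "ts": time.time(), "temperature": 21.7}

kinesis.put_record(
    StreamName="iot-ingest-stream",           # hypothetical stream name
    Data=json.dumps(record).encode("utf-8"),
    PartitionKey=record["device_id"],         # keeps one device's records ordered on one shard
)
```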

Some of the challenges this approach faces in the Internet of Things are:

Users and businesses that run private platforms are uncomfortable handing data to Google, Microsoft, Amazon, and the like

Delays and network outages

Increased storage costs, data security, and persistence

In general, big data frameworks alone are not sufficient to build an ingest module that can keep up with the data

2. IoT Fog Computing

IoT solutions can be made more capable through fog computing. Fog computing uses a local processing unit or computer instead of sending data all the way to the cloud and waiting for a server to process it and respond.
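To contrast this with the cloud model, here is a minimal sketch of what a local fog unit might do: reduce a window of raw readings to a compact summary and only forward that (plus any alerts) upstream. The window contents, threshold, and forward_upstream() stub are assumptions for illustration.

```python
# Sketch of fog-style processing: a local unit aggregates raw readings and only
# forwards a compact summary, instead of streaming every data point to the cloud.
from statistics import mean

def forward_upstream(summary):
    """Stub for sending the summary to the cloud (or a local dashboard)."""
    print("forwarding:", summary)

def process_window(readings):
    """Reduce a window of raw readings (dicts) to the summary the cloud cares about."""
    temps = [r["temperature"] for r in readings]
    summary = {
        "count": len(readings),
        "temp_avg": mean(temps),
        "temp_max": max(temps),
        "alerts": [r for r in readings if r["temperature"] > 80.0],  # only anomalies travel upstream
    }
    forward_upstream(summary)

# Example: a one-second window of readings from local sensors
process_window([
    {"sensor": "s1", "temperature": 21.5},
    {"sensor": "s2", "temperature": 22.1},
    {"sensor": "s3", "temperature": 85.3},  # would be flagged as an alert
])
```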

Four or five years ago there were no wireless solutions like Sigfox and LoRaWAN, and BLE had no mesh or long-range capabilities. A more expensive networking solution therefore had to be used to ensure a secure, persistent connection to the data processing unit. This central unit is at the heart of the solution, and there were very few specialized solution providers.

From implementing a fog network, you come to understand the following:

It is not simple; you need to know and understand a great deal. Building the software side of the Internet of Things is more straightforward and open, but the network becomes a barrier that slows everything down.

Such an implementation requires a very large team and multiple vendors, and it frequently runs into vendor lock-in.

OpenFog is an open fog computing consortium formed by well-known industry players to define a fog computing architecture. It provides use cases, test beds, technical specifications, and a reference architecture.

3. IoT Edge Computing

The Internet of Things is about capturing tiny interactions and reacting to them as quickly as possible. Edge computing sits closest to the data source and enables machine learning right where the sensors are. If you get drawn into the edge-versus-fog debate, keep in mind that edge computing is all about smart sensor nodes, while fog computing is still about a local area network that provides computing power for heavier data operations.
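A rough way to picture that distinction in code: an edge node reacts on the spot with whatever small, preloaded model fits on the device. In the sketch below the "model" is just a learned threshold, and the threshold value and actuate() stub are assumptions for illustration.

```python
# Sketch of edge-style intelligence: the sensor node itself decides and reacts,
# without asking a gateway or the cloud. The "model" here is a pre-learned
# threshold, standing in for whatever tiny model actually fits on the node.
LEARNED_VIBRATION_LIMIT = 0.12   # assumed value, learned offline and flashed to the node

def actuate(action):
    """Stub for a local reaction, e.g. cutting power to a motor."""
    print("edge action:", action)

def on_new_reading(vibration):
    # React immediately, at the data source, as described above.
    if vibration > LEARNED_VIBRATION_LIMIT:
        actuate("stop-motor")
    # Optionally report a compact event upstream afterwards, not the raw stream.

for v in (0.03, 0.05, 0.21):     # simulated readings
    on_new_reading(v)
```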

Industry giants such as Microsoft and Amazon have released Azure IoT Edge and AWS Greengrass to bring machine intelligence to IoT gateways and sensor nodes that have decent computing power. These are very good solutions and they make the work much simpler, but they significantly change the meaning of edge computing as practitioners have known and used it.

Edge computing should not require machine learning algorithms to run on a gateway in order to build intelligence. In 2015, Alex spoke at the ECI conference about work on embedded artificial intelligence on neural memory processors:

True edge computing will happen on such neuron-like devices, which can be preloaded with machine learning algorithms and serve a single purpose and responsibility. Wouldn't that be great? Imagine that a warehouse end node could perform local NLP on a few key phrases that form a password, such as "Open Sesame!"
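Purely to illustrate the idea (not the neuron hardware itself), such a preloaded, single-purpose task could amount to matching a handful of key phrases locally, so nothing ever needs to leave the device; the phrase set below is an assumption.

```python
# Toy illustration of a single-purpose edge task: local matching of a few key
# phrases, so the audio never has to leave the device. A real neuron-style
# device would do this with a burned-in model, not string comparison.
KNOWN_PHRASES = {"open sesame", "close sesame"}   # assumed pass-phrases

def recognise(transcribed_text):
    phrase = transcribed_text.strip().lower().rstrip("!")
    return phrase in KNOWN_PHRASES

print(recognise("Open Sesame!"))   # True  -> unlock locally
print(recognise("Hello world"))    # False -> do nothing, send nothing upstream
```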

Such edge devices usually have a neural-network-like structure, so when a machine learning algorithm is loaded, a neural network is essentially burned into the device. That burning is permanent and cannot be reversed.

There is now a new embedded device space that enables embedded edge intelligence on low-power sensor nodes.

4. IoT Mist Computing

So far, we can use the following to advance the data processing and intelligence of the Internet of Things:

The cloud-based computing model

The fog-based computing model

The edge computing model

Mist computing is a type of computing that complements fog and edge computing and makes them better, without having to wait years for it to arrive. It simply uses the networking capabilities of the IoT devices themselves to distribute workloads, without needing a fog layer or the intelligence models that edge computing provides.

This model can bring high-speed data processing and intelligence extraction to devices with as little as 256 KB of memory and data rates of around 100 kb/s. With mesh networks, you will certainly see this kind of computing model being enabled, and someone will come up with a better, easier-to-use mist-based model.
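As a rough sketch of what that might look like, the snippet below lets a constrained node accept a tiny named workload from a mesh neighbour over UDP and return the result; the message format, port, and task set are all illustrative assumptions, and a real deployment would ride on an actual mesh stack.

```python
# Sketch of mist-style computing: a tiny node exposes a couple of named workloads
# that mesh neighbours can hand it over UDP. Message format, port, and task set
# are illustrative assumptions.
import json
import socket

TASKS = {
    "avg": lambda values: sum(values) / len(values),
    "max": lambda values: max(values),
}

def serve(port=9999):
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", port))
    while True:
        data, addr = sock.recvfrom(512)                  # small frames suit ~100 kb/s links
        request = json.loads(data)                       # e.g. {"task": "avg", "values": [1, 2, 3]}
        result = TASKS[request["task"]](request["values"])
        sock.sendto(json.dumps({"result": result}).encode(), addr)

if __name__ == "__main__":
    serve()
```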