30 Jan 2023
4 min read
Edge computing has emerged as a compelling use case demonstrating why hybrid cloud architectures outperform purely centralized cloud approaches. There are many reasons to use edge computing, but most involve bringing computing closer to where data is created or consumed. When latency matters, as when a process on a factory floor strays outside of acceptable bounds, it's advantageous to respond to an event without first sending data across a network.
World of many edges
There are numerous edges implemented for various purposes. Edge servers and gateways can connect multiple servers and devices in a distributed environment, such as a manufacturing plant. An end-user premises edge may resemble a traditional remote/branch office (ROBO) configuration with a rack of blade servers.
Telecommunications companies divide their architectures into three parts: the far provider edge, the provider access edge, and the provider aggregation edge. Terminology for the classes of equipment telcos use varies; the basic point is that this range of requirements cannot be met by a single tier of edge device. Many organizations will use regional data centers to supplement a core data center, to better handle redundancy, large data transfers, and data sovereignty requirements.
Need for automation
The edge has a large footprint, especially as we move out to the network's edges. There are numerous devices and frequently limited IT staff.
While automation is important for streamlining operations at any scale, it is not hyperbole to say that an edge deployment without automation is impossible. Even if it were economically feasible to make updates and other changes manually and locally, ensuring consistency (a key theme in any edge architecture discussion) across an edge architecture necessitates automation.
Fundamentally, automation simplifies edge architectures, which is beneficial for various reasons, particularly given the limited IT staff available at edge locations. New sites can be deployed more quickly, and there is less downtime due to misconfigurations and other errors that automation eliminates.
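The consistency argument above rests on a simple idea: push one declarative desired state to every site and let each site reconcile itself, so the operation is safe to repeat and misconfigurations surface as diffs. A minimal sketch of that idea follows; the function and configuration keys are illustrative, not taken from any specific automation tool.

```python
# Minimal sketch of declarative, idempotent configuration apply,
# the core pattern behind automating many edge sites at once.
# All names and keys here are illustrative assumptions.

def reconcile(desired: dict, actual: dict) -> dict:
    """Return only the changes needed to move a site from its
    actual configuration to the desired one. An empty diff means
    the site is already compliant, so repeating the operation
    is a no-op."""
    return {key: value
            for key, value in desired.items()
            if actual.get(key) != value}

desired_state = {"ntp_server": "10.0.0.1", "log_level": "warn"}

# The same desired state is pushed to every site; each site
# computes its own (possibly empty) diff.
site_a = {"ntp_server": "10.0.0.1", "log_level": "debug"}
site_b = dict(desired_state)  # already compliant

print(reconcile(desired_state, site_a))  # {'log_level': 'warn'}
print(reconcile(desired_state, site_b))  # {} -- nothing to do
```

Because applying an empty diff changes nothing, the same playbook-style run can be scheduled repeatedly across hundreds of sites without drift.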
Kubernetes
Kubernetes is commonly associated with large container clusters in data centers. However, it is increasingly relevant at the edge as well. There are several reasons for this.
The first is that an edge does not always imply a small size. Telcos, for example, are increasingly substituting software-defined infrastructures for dedicated hardware, such as the radio access network (RAN) that connects user equipment to the core network. Kubernetes can help applications like this coordinate the many components of the architecture.
Another popular edge application is machine learning. Machine learning models are still typically trained in a core data center. However, once trained, the model can be deployed to the edge, where operational data is collected. This reduces the amount of data transmitted over the network and allows any actions required to respond to that data to occur locally and quickly.
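The train-centrally, infer-locally pattern described above can be sketched in a few lines: the edge node scores every reading on the spot and forwards only the anomalies upstream. This is a hedged illustration, not a real deployment; the "model" here is a stand-in threshold rule, since the source does not name a specific framework.

```python
# Sketch of edge inference: score every reading locally and
# forward only anomalies to the core data center, reducing
# the data transmitted over the network. The threshold "model"
# is a placeholder assumption for a trained model.

def infer(reading: float, threshold: float = 75.0) -> bool:
    """Stand-in for model inference: flag readings above a threshold."""
    return reading > threshold

def process(readings, threshold=75.0):
    """Act on every reading locally; return only what would
    be forwarded to the core data center."""
    forwarded = []
    for r in readings:
        if infer(r, threshold):
            forwarded.append(r)  # would be sent upstream
    return forwarded

sensor_readings = [62.1, 80.4, 70.0, 91.3]
print(process(sensor_readings))  # [80.4, 91.3]
```

Only two of the four readings cross the network; the local decision also happens immediately, which is the latency benefit the article opens with.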
Considering the previously discussed consistency and simplicity, if an enterprise runs Kubernetes in its data center, shouldn't it want to run the same software wherever possible? Less work dealing with unique platforms on different tiers of the architecture means more standardization. It also positions the organization to expand edge computing capacity in the future.
While it is true that some edge nodes will be too constrained to run Kubernetes, the community is putting much effort into Kubernetes variants that reduce the amount of code that needs to run while providing a consistent management view. This remains an active area of development. The best approach for a given deployment will be determined by factors such as resource constraints, network reliability, and availability requirements.
Established patterns
Edge deployments are tailored to various degrees because they combine IT with operational technology (OT), which is likely to be industry-specific, with installed industrial equipment, and with many other aspects of a company's established operations. Even if you can't buy a single edge platform that meets your requirements off the shelf, many existing technologies can be linked in patterns that recur from deployment to deployment. The specifics will vary, but the architectures can be strikingly similar.
Portfolio architectures, such as the one described above for RANs, are an example of documenting patterns observed across multiple successful customer deployments. They are open-source and can be customized to meet the needs of a specific organization. They include an overview, logical and schematic diagrams for each technical component, and other reference material.
Industrial edge, which is applicable across several vertical industries, including manufacturing, is another portfolio architecture directly relevant to edge computing. It depicts sensor data routing for model development in the core data center and live inference in factory data centers.
Another architecture demonstrates the use of edge computing to enable medical imaging diagnostics. This AI/ML example shows how to improve a medical facility's efficiency by reducing the time spent on routine diagnoses, giving medical experts more time to focus on difficult cases.
While integrating IT and OT systems is a difficult task, edge computing is already in widespread and productive use. There are applications in everything from telco 5G to factory preventative maintenance, in-vehicle fleet operations, asset monitoring, and more, in addition to the portfolio architectures already discussed. Edge computing has the potential to accelerate data-driven outcomes, improve end-user experiences, and increase application and process resiliency.