System on a chip takes IoT processing to the edge

via TechRepublic

These smart systems can help edge computing move to the next level.

A system on a chip (SoC) combines hardware and software, including components like a graphics processing unit (GPU), a central processing unit (CPU), and system memory (RAM), all on a single chip.

If the chip is endowed with neural network technology, meaning algorithms designed to recognize underlying data relationships the way a human brain does, and is further “smartened” with artificial intelligence (AI) built for the consumer, automotive, and industrial markets, it adds even more edge intelligence that enhances operational and IT performance and can be distributed throughout an enterprise.

This is the vision on which Brainchip, a neural network processing provider, and Socionext, a developer of system-on-a-chip solutions, have been collaborating.

“The goal is to bring more edge AI to the marketplace,” said Brainchip COO Roger Levinson. “The traditional IT architecture that is data center-centric doesn’t support edge computing that well. Consequently, many organizations find themselves with a dearth of IT resources at the edge of their enterprises, where the Internet of Things (IoT) and AI are now working. These organizations end up with limited power, compute, and memory at the edge.”

In these circumstances, it’s difficult to scale out the many edge IoT applications that are entering the marketplace and that can drive operational improvements.

Because it uses standard interfaces, a solution like the Brainchip-Socionext chip can plug into an edge device through USB or PCIe ports, and it can provide up to 24 core processors in a box. Edge computing packages like this offer real promise for offloading processing and data loads from a central data center. They also reduce bandwidth and throughput requirements and minimize latency, because processing can occur at the edge.
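To make that offloading pattern concrete, here is a minimal sketch in Python, with made-up sensor values and thresholds that are not part of the Brainchip-Socionext product: an edge node classifies readings locally and forwards only summarized anomaly events upstream, which is where the bandwidth and latency savings come from.

```python
import json
import statistics
from typing import Iterable

# Hypothetical anomaly threshold; a real deployment would tune this
# per sensor or learn it with an on-chip neural network.
ANOMALY_THRESHOLD = 3.0

def classify_locally(window: list[float]) -> dict:
    """Run a lightweight check on the edge device instead of
    shipping every raw reading to the data center."""
    mean = statistics.mean(window)
    stdev = statistics.pstdev(window) or 1.0
    latest = window[-1]
    z_score = (latest - mean) / stdev
    return {"mean": mean, "latest": latest, "anomaly": abs(z_score) > ANOMALY_THRESHOLD}

def summarize_and_forward(readings: Iterable[float], window_size: int = 50) -> list[str]:
    """Only anomalous windows are serialized and sent upstream;
    normal traffic never leaves the edge."""
    window: list[float] = []
    outbound: list[str] = []
    for value in readings:
        window.append(value)
        if len(window) >= window_size:
            result = classify_locally(window)
            if result["anomaly"]:
                outbound.append(json.dumps(result))  # stand-in for an MQTT/HTTP publish
            window.clear()
    return outbound

if __name__ == "__main__":
    import random
    random.seed(0)
    stream = [random.gauss(20.0, 0.5) for _ in range(499)] + [40.0]
    print(f"{len(summarize_and_forward(stream))} summarized event(s) forwarded upstream")
```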

Among the use cases that benefit from edge-empowering technology are smart cities that need to manage remote cameras and traffic signals, and in-cabin automotive applications that authenticate driver IDs and monitor driver behavior.

Here are some takeaways for IT managers who must now think about IoT deployment and management.

1. Rethink your corporate IT architecture

Centralized computing isn’t going away, but strictly data center-centric IT architectures won’t be sufficiently flexible or economical if your plan is to deploy IoT, AI, machine learning, and other applications at the edge of your enterprise.

Architecturally, IT needs to redeploy storage and processing resources to the edge. Additionally, data lines and networks will need to be reconfigured so they can support the edge and its new data flows.
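As a rough illustration of that redeployment, the sketch below uses hypothetical latency and payload cut-offs, not figures from any vendor, to decide which workloads stay at the edge and which travel back over the reconfigured data lines to the data center.

```python
from dataclasses import dataclass

@dataclass
class Workload:
    name: str
    max_latency_ms: int   # tightest response time the application tolerates
    payload_mb: float     # data that would have to cross the network

# Hypothetical cut-offs; real values depend on link speed and edge capacity.
EDGE_LATENCY_CUTOFF_MS = 100
EDGE_PAYLOAD_LIMIT_MB = 5.0

def place(workload: Workload) -> str:
    """Keep latency-sensitive or bandwidth-heavy work at the edge;
    send everything else back to the core data center."""
    if workload.max_latency_ms <= EDGE_LATENCY_CUTOFF_MS:
        return "edge"
    if workload.payload_mb > EDGE_PAYLOAD_LIMIT_MB:
        return "edge"  # cheaper to process locally than to move the data
    return "data-center"

if __name__ == "__main__":
    for w in (Workload("driver-monitoring", 30, 2.0),
              Workload("nightly-report", 60_000, 0.5),
              Workload("camera-archive", 5_000, 800.0)):
        print(f"{w.name:>18} -> {place(w)}")
```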

2. Distribute processing

More compute and storage need to be deployed at the edge so edge applications can run. These edge deployments can be scaled upward or downward as edge demands warrant.
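A minimal sketch of that elasticity, assuming a simple queue-backlog rule rather than any particular orchestration product, might look like this:

```python
import queue
import threading
import time

# Hypothetical scaling bounds; a real edge box is limited by its
# core count and memory (e.g. the 24-core package mentioned above).
MIN_WORKERS, MAX_WORKERS = 1, 8
BACKLOG_PER_WORKER = 10

def worker(tasks: "queue.Queue[float]", stop: threading.Event) -> None:
    """Drain tasks from the local queue until told to stop."""
    while not stop.is_set():
        try:
            tasks.get(timeout=0.1)
        except queue.Empty:
            continue
        time.sleep(0.01)  # stand-in for the real edge processing step
        tasks.task_done()

def desired_workers(backlog: int) -> int:
    """Scale the worker count up or down with the queue backlog."""
    return max(MIN_WORKERS, min(MAX_WORKERS, backlog // BACKLOG_PER_WORKER + 1))

if __name__ == "__main__":
    tasks: "queue.Queue[float]" = queue.Queue()
    stop = threading.Event()
    workers: list[threading.Thread] = []
    for _ in range(60):
        tasks.put(time.time())
    while (target := desired_workers(tasks.qsize())) > len(workers):
        t = threading.Thread(target=worker, args=(tasks, stop), daemon=True)
        t.start()
        workers.append(t)
    print(f"backlog={tasks.qsize()} -> running {len(workers)} worker thread(s)")
    tasks.join()
    stop.set()
```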

3. Institute sound edge security practices

Edge resources must be secured, and those running applications at the edge must be trained in security best practices. Many edge users will not have IT backgrounds, which makes security training especially important.

On the network side, a zero-trust network patrolling the edge can do a lot for security and for monitoring user activity there.
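In the zero-trust spirit of verifying every request rather than trusting the local network, here is a small sketch using a hypothetical device registry and shared keys; a real deployment would provision credentials from a secrets manager or device hardware, never from source code.

```python
import hashlib
import hmac
import json

# Placeholder device registry; keys shown here are illustrative only.
DEVICE_KEYS = {"camera-017": b"replace-with-provisioned-secret"}

def sign(device_id: str, payload: dict) -> str:
    """Edge device signs each message with its provisioned key."""
    body = json.dumps(payload, sort_keys=True).encode()
    return hmac.new(DEVICE_KEYS[device_id], body, hashlib.sha256).hexdigest()

def verify(device_id: str, payload: dict, signature: str) -> bool:
    """Zero-trust style check: no message is acted on just because it
    arrived on the local network; every one must prove its origin."""
    key = DEVICE_KEYS.get(device_id)
    if key is None:
        return False  # unknown device: deny by default
    body = json.dumps(payload, sort_keys=True).encode()
    expected = hmac.new(key, body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature)

if __name__ == "__main__":
    msg = {"event": "signal_change", "intersection": 42}
    sig = sign("camera-017", msg)
    print("valid message accepted: ", verify("camera-017", msg, sig))
    print("tampered message rejected:", verify("camera-017", {**msg, "intersection": 7}, sig))
```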

4. Revise disaster recovery and failover plans

Disaster recovery and failover plans are often left until last in IT project work. However, with more of the IT workload likely headed to the edge, these plans should be revised and tested as edge deployments occur. Edge monitoring, robotics, and process automation are becoming mission-critical applications in many organizations, and a comprehensive disaster recovery and business continuity plan should include them.
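As one hedged example of building the edge into a failover plan, the sketch below (with a stubbed connectivity check and a hypothetical local buffer file) shows an edge node that spills readings to local storage during an outage and replays them once the data center link returns.

```python
import json
import pathlib
import time

BUFFER_FILE = pathlib.Path("edge_failover_buffer.jsonl")  # hypothetical local spill file

def uplink_available() -> bool:
    """Stand-in for a real health check against the central data center."""
    return False  # simulate an outage so the buffering path runs

def send_upstream(record: dict) -> None:
    print("sent to data center:", record)  # stand-in for the real publish call

def record_reading(record: dict) -> None:
    """Failover path: if the core is unreachable, spill to local storage
    so the edge application keeps running through the outage."""
    if uplink_available():
        send_upstream(record)
    else:
        with BUFFER_FILE.open("a") as f:
            f.write(json.dumps(record) + "\n")

def replay_buffer() -> int:
    """Run on recovery: drain the local buffer back to the data center."""
    if not BUFFER_FILE.exists():
        return 0
    lines = BUFFER_FILE.read_text().splitlines()
    for line in lines:
        send_upstream(json.loads(line))
    BUFFER_FILE.unlink()
    return len(lines)

if __name__ == "__main__":
    record_reading({"sensor": "line-3", "value": 7.2, "ts": time.time()})
    print("records replayed after recovery:", replay_buffer())
```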