BrainChip Identifies Advantages for Radar and Lidar Systems Leveraging Event-Based AI
Laguna Hills, Calif. – September 9, 2024 – Developers of radar and lidar systems have a new option for addressing key detection and tracking challenges: event-based AI computing architectures that deliver performance improvements over conventional signal processing algorithms, say researchers at BrainChip Holdings Ltd (ASX: BRN, OTCQX: BRCHF, ADR: BCHPY), the world’s first commercial producer of ultra-low power, fully digital, event-based, brain-inspired AI.
Radar systems are used in industries beyond aviation and the military and are prevalent in automotive, robotics, drones, and anything with autonomous mobility. Similarly, lidar (light detection and ranging) applications span settings such as engineering, energy, agriculture, and transportation, among others. Demand for efficient, responsive, smaller, lower-power, and adaptable radar and lidar technology is high as these industries increasingly rely on AI/ML.
Event-based AI/ML represents an advancement in AI/ML technology, capable of working efficiently with the sequential or continuous data streams that radar and lidar systems produce. Event-based computing is well suited to processing point cloud data directly, rather than preprocessing it into 2D images for traditional neural processing with convolutional neural networks (CNNs) or recurrent neural networks (RNNs).
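A minimal sketch of that contrast, using an invented grid size and point count rather than any specific sensor or BrainChip API: the same set of lidar returns handled first as a dense 2D raster (the usual preprocessing step for a CNN) and then as a sparse list of events consumed directly.

```python
import numpy as np

# Hypothetical illustration only: grid size and point count are assumptions.
rng = np.random.default_rng(0)
points = rng.uniform(0, 100, size=(500, 2))      # 500 (x, y) lidar returns

# Conventional route: rasterize into a 512 x 512 occupancy image for a CNN.
image = np.zeros((512, 512), dtype=np.float32)
ix = (points[:, 0] / 100 * 511).astype(int)
iy = (points[:, 1] / 100 * 511).astype(int)
image[iy, ix] = 1.0
print(f"raster cells: {image.size}, occupied: {int(image.sum())}")  # 262144 cells, <=500 occupied

# Event-based route: each return is an event; only these ~500 entries are processed.
events = list(zip(ix, iy))
print(f"events to process: {len(events)}")
```

The dense raster is overwhelmingly empty, yet a frame-based network still processes every cell; the event list carries the same information in a few hundred entries.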
Event-based computing takes advantage of sparsity in networks and data, performing only the computations that affect the final inference result, which yields more efficient network execution and better utilization of compute resources. Using new neural architectures that combine spatial and temporal computations further reduces the number of computations needed compared with convolutional neural networks. Most importantly, event-based computation can improve the detection and tracking characteristics of the radar.
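To make the sparsity argument concrete, the hypothetical sketch below evaluates a single dense layer two ways: a conventional pass that multiplies every input by every weight, and an event-based pass that only touches inputs carrying an event (non-zero values). The layer sizes and sparsity level are illustrative assumptions, not Akida specifics.

```python
import numpy as np

rng = np.random.default_rng(0)
weights = rng.standard_normal((64, 256))          # 64 outputs, 256 inputs (assumed sizes)

# A sparse activation vector: most radar/lidar range-Doppler bins are empty.
x = np.zeros(256)
x[rng.choice(256, size=8, replace=False)] = rng.standard_normal(8)

# Frame/dense style: all 64 * 256 multiply-accumulates, regardless of content.
dense_out = weights @ x

# Event-based style: accumulate only the columns where an event occurred.
event_idx = np.nonzero(x)[0]                      # 8 active inputs
event_out = weights[:, event_idx] @ x[event_idx]  # 64 * 8 multiply-accumulates

assert np.allclose(dense_out, event_out)          # same result, far less work
print(f"MACs skipped: {1 - len(event_idx) / x.size:.1%}")   # ~96.9% fewer operations
```

The two passes produce identical outputs; the event-based pass simply never spends operations on inputs that carry no information.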
Event-based computing, when widely adopted, promises to improve speed, accuracy, and resource efficiency in radar/lidar systems, with advantages including:
– Rapid, accurate response – Systems that use event-based computing can rapidly detect and respond to signals and changes in the environment, such as a moving object. More traditional systems use sampling frames to collect and process data at regular intervals, regardless of whether there is a signal, a change, or any activity, resulting in unnecessary computations and added latency until the next frame is processed (see the sketch following this list).
– Power efficiency – Traditional compute systems consume power continuously, even when no significant events are occurring. Event-based AI processing computes only when an event occurs, so it requires far less energy to operate. Event-based computing’s lower power consumption can extend the operational life of radar/lidar systems in field deployments where sustained use is required.
– Better data management – Radar and lidar systems are data-intensive, and much of the data they produce is redundant or not relevant to the operation. Traditional AI neural networks quickly get bogged down processing this unnecessary data, causing latency and delayed responses. Event-based computing focuses only on relevant data, which vastly reduces data overload and eases storage requirements.
– Size and scope – Applications in this category range from very small-scale systems, like gesture recognition and robotics, to automotive radar that detects the sudden appearance of an object, classifies it as a pedestrian, another vehicle, or a road obstacle, and tracks it to estimate whether it is on a collision path with the vehicle. Traditional large-scale systems, like global weather monitoring that tracks storms or air traffic control, often have multi-channel antenna systems, which can also benefit from event-based processing. Event-based processing is highly adaptable and capable of improving outcomes in both small and large environments, or anywhere it is important to allocate resources efficiently.
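The latency advantage called out in the first bullet can be sketched with a simple simulation. The frame period, event-processing time, and target timings below are invented for illustration; the point is only that a frame-based pipeline waits for the next frame boundary, while an event-driven pipeline reacts when the change itself arrives.

```python
import numpy as np

rng = np.random.default_rng(1)
FRAME_PERIOD_MS = 50.0                                # assumed frame-based processing interval
EVENT_PROCESSING_MS = 2.0                             # assumed per-event processing cost
appearance_ms = rng.uniform(0, 1000, size=10_000)     # when a new object appears

# Frame-based: the detection is only seen when the next frame is processed.
frame_latency = FRAME_PERIOD_MS - (appearance_ms % FRAME_PERIOD_MS)

# Event-based: the change itself triggers processing, plus a small fixed cost.
event_latency = np.full_like(appearance_ms, EVENT_PROCESSING_MS)

print(f"mean frame-based latency: {frame_latency.mean():.1f} ms")   # ~25 ms
print(f"mean event-based latency: {event_latency.mean():.1f} ms")   # 2 ms
```

Under these assumed numbers, the frame-based system averages roughly half a frame period of waiting before it can even begin processing a new target, while the event-driven system pays only its fixed per-event cost.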
“Event-based AI processes only critical information, which enables faster decision making and improved safety,” said Tony Lewis, BrainChip CTO. “This temporal-enabled, neural-networking model delivers improvements in detection accuracy, safety and efficiency in radar/lidar systems.”
BrainChip’s Akida™ is an event-based compute platform ideal for early-detection, low-latency solutions that do not demand massive compute resources, spanning robotics, drones, automotive, and traditional sense-detect-classify-track applications. BrainChip provides a range of software, hardware, and IP products that can be integrated into existing and future designs, with a roadmap for customers to deploy multi-modal AI models at the edge.
To learn more: https://bit.ly/3ZjrExo
About BrainChip Holdings Ltd (ASX: BRN, OTCQX: BRCHF, ADR: BCHPY)
BrainChip is the worldwide leader in Edge AI on-chip processing and learning. The company’s first-to-market, fully digital, event-based AI processor, Akida™, uses principles that mimic the human brain, analyzing only essential sensor inputs at the point of acquisition and processing data with unparalleled efficiency, precision, and economy of energy. Akida uniquely enables Edge learning local to the chip, independent of the cloud, dramatically reducing latency while improving privacy and data security. Akida Neural processor IP, which can be integrated into SoCs on any process technology, has shown substantial benefits on today’s workloads and networks, and offers a platform for developers to create, tune, and run their models using standard AI workflows like TensorFlow/Keras. By enabling effective Edge compute to be universally deployable across real-world applications such as connected cars, consumer electronics, and industrial IoT, BrainChip is proving that on-chip AI, close to the sensor, is the future for its customers’ products as well as the planet. Explore the benefits of Akida at www.brainchip.com.
Follow BrainChip on Twitter: https://www.twitter.com/BrainChip_inc
Follow BrainChip on LinkedIn: https://www.linkedin.com/company/7792006
###
Media Contact:
Mark Smith
JPR Communications
818-398-1424
Investor Relations:
Tony Dawe
Director, Global Investor Relations
tdawe@brainchip.com