Alif Semiconductor will join Edge Impulse at the Embedded Vision Summit, taking place May 16–19, 2022, to demonstrate the dramatic increases in AI/ML workload performance and efficiency that its Ensemble microcontrollers (MCUs) and fusion processors deliver over traditional systems.
Machine learning in embedded applications is rapidly gaining traction: it improves existing solutions and lets developers build elegant new solutions to problems that traditional embedded methods cannot solve. Until now, vision-based use cases have been particularly difficult, as available processors have either lacked the necessary processing performance or been too power-hungry, especially for battery-powered systems. In addition, traditional development workflows for embedded designs are not well suited to the additional steps required to select, configure, train, and deploy a machine learning model on an embedded MCU.
Alif’s highly scalable lineup of Ensemble MCUs and fusion processors is powered by Arm’s Ethos-U55 microNPU and latest-generation Cortex-M55 CPU. The Ensemble family combines dedicated high-performance and high-efficiency processing systems built around the concept of always-available, battery-friendly, AI-accelerated environmental sensing. This significantly improves AI/ML performance compared to current CPU-bound approaches, while consuming only a fraction of the power of such solutions. The Ensemble E7’s M55 + U55 High-Performance core is benchmarked as running convolutional neural network workloads roughly 100x faster (780 µs compared to 74 ms) than a Cortex-M7 at a similar clock speed.
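Taking the cited benchmark figures at face value, the speedup works out as follows (a quick sanity check using the latencies quoted above; these are the press-release numbers, not a new measurement):

```python
# Latencies cited in the benchmark above for a convolutional
# neural network inference workload:
m7_latency_s = 74e-3        # Cortex-M7 baseline: 74 ms
ensemble_latency_s = 780e-6  # Ensemble E7 M55 + U55 core: 780 us

speedup = m7_latency_s / ensemble_latency_s
print(f"Speedup: {speedup:.0f}x")  # prints "Speedup: 95x"
```

At similar clock speeds this works out to roughly 95x, i.e. on the order of the 100x figure quoted.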
“The improvements to neural network inference times we have seen on our Ethos-U55-based system while working with Edge Impulse on this integration have been very impressive,” said Steve Pancoast, VP of Software and System Design at Alif Semiconductor. “The ease of use that the Edge Impulse platform brings, combined with the performance improvements resulting from their EON Tuner and Compiler, will make a big difference for our customers’ design cycles.”
Edge Impulse has created the leading development platform for machine learning on edge devices, quickly and easily guiding developers through the entire process of collecting and structuring datasets, designing ML algorithms with ready-made building blocks, validating models against real-time data, and deploying the fully optimized, production-ready result to an embedded target. The Edge Impulse development platform, in use by firms such as Polycom and Advantech, has been extended to take full advantage of the AI acceleration built into the Ensemble devices, delivering more performance while keeping power consumption in check.
“Edge Impulse’s mission has always been to enable our users to create the next generation of intelligent devices, and our partnership with Alif really lets us take this to the next level,” said Zach Shelby, Co-Founder & CEO at Edge Impulse. “We are very excited to be able to highlight the combined capabilities of our development platform and the Ensemble devices, and are looking forward to seeing the kinds of products they will be able to create.”
Alif Semiconductor and Edge Impulse will showcase the performance of the Ensemble MCU family combined with the Edge Impulse development platform during the Embedded Vision Summit’s Technology Exhibit, held May 17–18 at the Santa Clara Convention Center. Please join us at the event to learn how we can help your next ML-enabled embedded project perform better, at a fraction of the power and development effort.