AI at the Edge of the Edge - Thoughts from the Zephyr Developer Summit 2024
ZDS is always a great networking event with loads of interesting talks (Seattle, WA)
This was my second year at the Zephyr Developer Summit (ZDS). Once again it didn't disappoint, delivering great talks, great networking and a snapshot of the progress and future plans for The Zephyr Project embedded RTOS.
I was a little surprised, however, that AI/ML didn't make a stronger showing at ZDS. I was fortunate enough to get to CES earlier this year, where everyone was talking about "pushing AI/ML to the edge". Perhaps having spent so much of my career around the "embedded intelligence" area, I just expected to see ML running on many Zephyr devices by now, but it was clear that ML for this "embedded edge" (i.e. untethered IoT) is still in its very early days with Zephyr developers. I guess it's reasonable: the required combination of data scientists with knowledge of deep ML for constrained devices and experienced embedded (low-power) Zephyr developers is not that common yet.
Nonetheless, @Embeint's @jordan presented "Beefy ML" on the work done at @CSIRO, which adds great capability and value to the @ceres ear tag devices for classifying animal behaviour.
The other embedded ML talk I enjoyed was from @Benjamin Cabé of The Linux Foundation, on his "weekend hack" getting TinyML running for an "electronic nose".
The magic that really enables the ML for Ben's demo was done by @EdgeImpulse (and I was lucky enough to meet a fair few of the crew during the event). This is certainly a game changer for embedded AI practitioners (aka data scientists), and it in fact opens up ML creation for embedded devices to a broader (less AI-savvy) developer audience. However, despite Ben's rapid outcome on the electronic nose, creating ML for ultra-low-power objects in commercial applications is still quite the feat: it requires logging and annotating data, plus deep knowledge of the MCU (and its features), in order to create an ML model that can run inference within all the constraints of a low-power, low-cost microcontroller.
At Embeint, we believe in the vision of a world where Ambient Intelligence transforms the way people live and work. To reach that, however, we need to continue to simplify the creation of ultra-low-power, ML/AI-enabled LPWAN IoT devices, so that intelligence can be embedded into everyday objects - and this is what Embeint are going "all in" on (love that All-In podcast btw).
That's not to say that the cloud or powered edge won't have some great genAI that performs inference on the 'big picture' to solve problems and provide unprecedented context, adding massive value and efficiencies - it's just not the problem we're focussing on initially.
Outside of ML (though potentially still related to our vision), I found the talks on ZBUS, MicroPython and LLEXT very informative:
ZBUS
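zbus is Zephyr's message bus for decoupled, many-to-many communication between threads. The real API is C (e.g. `ZBUS_CHAN_DEFINE`, `ZBUS_LISTENER_DEFINE` and `zbus_chan_pub` in `zephyr/zbus/zbus.h`); the toy Python sketch below only illustrates the underlying publish/subscribe-over-channels pattern, and every name in it is made up for illustration:

```python
# Toy illustration of the publish/subscribe pattern that Zephyr's zbus
# provides. This is NOT the zbus API (which is C macros and functions);
# it just shows the concept: named channels notify registered observers.

class Channel:
    """A named channel that delivers published messages to its observers."""

    def __init__(self, name):
        self.name = name
        self._observers = []

    def add_observer(self, callback):
        # In zbus terms: attach a listener/subscriber to the channel
        self._observers.append(callback)

    def publish(self, message):
        # In zbus terms: zbus_chan_pub() notifies every observer
        for callback in self._observers:
            callback(self.name, message)


# Example: a battery channel with one listener recording what it sees
battery = Channel("battery")
readings = []
battery.add_observer(lambda chan, msg: readings.append((chan, msg)))
battery.publish({"millivolts": 3700})
print(readings)  # [('battery', {'millivolts': 3700})]
```

The appeal on a constrained device is the same as in this sketch: producers and consumers never reference each other directly, only the channel, which keeps threads loosely coupled.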
MicroPython
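One nice property of MicroPython is that it implements a subset of Python 3, so pure-Python logic can be developed and tested on a desktop and then run unchanged on a MicroPython-on-Zephyr board (only the board/peripheral access, e.g. the `machine` module, is port-specific). A small sketch of the kind of portable logic this enables, with made-up values:

```python
# A cheap smoothing filter written in the Python 3 subset MicroPython
# supports: no decorators, dataclasses or stdlib imports needed, so the
# same file runs under CPython on a laptop and MicroPython on a device.

def ema(samples, alpha=0.3):
    """Exponential moving average - a common low-cost filter for noisy sensors."""
    smoothed = None
    out = []
    for s in samples:
        if smoothed is None:
            smoothed = s
        else:
            smoothed = alpha * s + (1 - alpha) * smoothed
        out.append(smoothed)
    return out


# Simulated noisy temperature readings (on-device these would come from
# a sensor driver via the port-specific `machine` module)
print(ema([20.0, 24.0, 21.0, 22.0]))
```

Developing the algorithm host-side and deploying the identical file to the board is a big part of MicroPython's productivity story on embedded targets.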
LLEXT (Linkable Loadable Extensions)
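LLEXT lets a running Zephyr image load relocatable, compiled extension objects and resolve their symbols at runtime (in C, via the `llext` subsystem, e.g. `llext_load` and `llext_find_sym`), rather than baking every feature into the firmware at build time. As a desktop-side analogy of that idea - code arriving after the program has started - this hedged Python sketch loads a module from a file at runtime; all names in it are illustrative:

```python
# Desktop analogy for Zephyr's LLEXT: load code at runtime, then look up
# and call a symbol from it. On Zephyr the artifact would be a compiled
# .llext object, not Python source.
import importlib.util
import os
import tempfile

# Pretend this source arrived over the network, like an .llext blob would
EXT_SOURCE = (
    "def behaviour_class(x):\n"
    "    return 'grazing' if x < 5 else 'walking'\n"
)

with tempfile.TemporaryDirectory() as tmp:
    path = os.path.join(tmp, "ext_mod.py")
    with open(path, "w") as f:
        f.write(EXT_SOURCE)

    # Load and execute the "extension", then resolve a symbol by name
    spec = importlib.util.spec_from_file_location("ext_mod", path)
    ext = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(ext)

print(ext.behaviour_class(2))  # grazing
```

For IoT fleets this pattern is attractive because a small extension can be shipped over a constrained link instead of a full firmware image.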
In Conclusion
There were lots of other talks on new and future features of Zephyr which I don’t have time to detail in this blog - but to summarise, it was great to see the vibrant Zephyr community continuing to gain speed in improving and adding to the Zephyr Project.
Hope to see you at the next ZDS!