EDGE AI POD
Discover the cutting-edge world of energy-efficient machine learning, edge AI, hardware accelerators, software algorithms, and real-world use cases with this podcast feed covering all things from the world's largest EDGE AI community.
It features shows like EDGE AI Talks and EDGE AI Blueprints, as well as EDGE AI FOUNDATION event talks on a range of research, product, and business topics.
Join us to stay informed and inspired!
How Innatera is Revolutionizing Low-Power AI with Neuromorphic Chips
What happens when we redesign computing hardware to work more like the human brain? The results are transformative for edge AI.
Sumit Kumar from Innatera takes us inside the world of neuromorphic computing – a revolutionary approach that's bringing brain-like intelligence directly to sensors. Born from research at Delft University of Technology, Innatera is tackling one of the most significant challenges in modern technology: how to perform complex AI tasks on battery-powered devices without draining power.
The key lies in spiking neural networks that are fundamentally different from conventional AI approaches. These event-driven networks operate with computational dynamics that mimic brain function, resulting in models 100 times smaller than traditional AI while consuming just a fraction of the power. For applications like video doorbells, acoustic scene classification, and wearable healthcare, this means continuous monitoring with millisecond latency at just a few milliwatts – outperforming traditional microcontrollers by at least 10x.
Beyond current applications, neuromorphic computing opens entirely new possibilities. The technology excels not just with conventional vision but with radar sensors and other modalities, particularly in privacy-sensitive situations. Robotics represents another frontier, where neuromorphic systems can enhance environmental perception, process complex sensor fusion, and enable low-latency control. Through academic partnerships and industry collaboration via the Edge AI Foundation, Innatera is helping build the ecosystem that will make neuromorphic computing as ubiquitous as neural networks are today. The future of edge AI may indeed be neuromorphic.
Learn more about the EDGE AI FOUNDATION - edgeaifoundation.org
Speaker 1:Well, yeah, so welcome to our partner segment here. You know, the purpose of this is to introduce partners to the community, and we have lots of different partners in the foundation, probably about 50 plus, some more famous than others. You guys are based in Delft, which is a little south of Amsterdam, and Innatera is pretty well known in the neuromorphic world, but I thought it would be good to just touch base with you, see what's new with Innatera, and give people a little exposure to what you're working on. So thank you for joining us, and thank you for your patience in our recording process.
Speaker 2:Thanks for having me.
Speaker 1:Okay, so where were we? I think it was: what's the origin story here at Innatera? Where did you guys come from, and what's your modus operandi?
Speaker 2:Yeah, sure. So Innatera is a Dutch semiconductor company, and we spun out of the Delft University of Technology back in 2018. We were really born with the objective of bringing brain-like intelligence directly to sensors through energy-efficient computing. Back at the university, we'd been working in this area of building electronic platforms that could recreate brain function, basically to allow neuroscientists to do their experiments better. Alongside that, we had this activity on energy-efficient computing, which is ongoing, and both of these kind of came together in 2018, basically in response to what we were seeing in the industry: battery-powered devices were getting packed with complex sensors, and all of the data that these sensors would generate would require very powerful AI to process, but within an extremely narrow power envelope. That's where we really saw that, you know, neuromorphic computing could actually make a difference, and that's what led us to setting the company up in 2018.
Speaker 1:Yeah, and so for folks that are not familiar with neuromorphic, which is kind of becoming a hot thing, I think, as I understand it, the idea is that in traditional AI, you know, we've tried to sort of mimic brain functions and brain activity, but forced that into a traditional computing architecture that maybe wasn't designed for it. And what you're doing is really designing the underlying hardware architecture to be a better map to how the brain actually works.
Speaker 2:Indeed, that's right. So the way we do AI in the industry today is really by abstracting the kind of mechanisms we see in the brain, and that abstraction exists simply because the hardware has been limited. What neuromorphic computing aims to do is bring in a new paradigm of information processing mechanisms that leverage the capabilities of hardware that has actually been built with the objective of running these sorts of algorithms in mind. So in the corner of the world that we come from, we rely on something called a spiking neural network. It's basically a neural network which is event-driven, but the underlying basis of that neural network is a very rich set of computational dynamics that basically requires a completely new sort of processing architecture to run. The end result of doing things with a spiking neural network is that you end up doing very powerful AI with models that tend to be about 100 times smaller than conventional models that you find in the industry today, and with radically improved power consumption, so much, much lower power consumption than typical AI approaches used today.
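To make the "event-driven" idea above concrete, here is a minimal, hypothetical sketch of a single leaky integrate-and-fire (LIF) neuron, the basic building block of a spiking neural network. This is a toy illustration in plain Python, not Innatera's hardware, toolchain, or model; the time constant, threshold, weight, and input spike train are arbitrary assumptions made for the example.

```python
# Toy leaky integrate-and-fire (LIF) neuron, for illustration only.
# All parameters below (tau, threshold, weight) are arbitrary; they do not
# reflect Innatera's chips or any real product.
import math

def lif_neuron(input_spikes, dt=1.0, tau=10.0, threshold=1.0, weight=0.4):
    """Run one LIF neuron over a binary input spike train (list of 0/1).

    The membrane potential leaks toward zero each time step and jumps by
    `weight` whenever an input spike arrives; crossing `threshold` emits an
    output spike and resets the potential.
    """
    decay = math.exp(-dt / tau)      # per-step leak factor
    v = 0.0                          # membrane potential
    output_spikes = []
    for s in input_spikes:
        v = v * decay + weight * s   # leak, then integrate the event (if any)
        if v >= threshold:           # threshold crossed: fire and reset
            output_spikes.append(1)
            v = 0.0
        else:
            output_spikes.append(0)
    return output_spikes

if __name__ == "__main__":
    # A sparse, event-driven input: the neuron only does meaningful work on
    # the few time steps where a spike (1) actually occurs.
    spikes_in = [0, 1, 0, 0, 1, 1, 0, 0, 0, 1, 1, 1, 0, 0]
    print(lif_neuron(spikes_in))
```

The point of the sketch is the sparsity: state only changes meaningfully on the time steps where an event arrives, which is what lets event-driven hardware stay largely idle, and therefore low-power, between spikes.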
Speaker 1:Right, right, yeah. And so what are the big markets, the big verticals, for you with neuromorphic and with Innatera these days?
Speaker 2:We think that we're only starting to scratch the surface in terms of what neuromorphic computing can do. Today we focus a lot on pattern recognition and signal processing, and this is where the meat of the problem tends to be.
Speaker 2:We see a lot of applications in the consumer electronics, IoT, as well as, you know, wearable healthcare sorts of spaces. So think of applications where you've got sensors that operate on a continuous basis and produce a rich stream of data which today is processed in the cloud.
Speaker 2:I think those are the applications that we go at first, trying to reduce the processing latency to the order of milliseconds and reduce the power consumption to a few milliwatts in the worst case. That tends to be in applications like video doorbells, where we have the most efficient solution that uses a far-infrared sensor together with our neuromorphic chips, and then quite a bit in the audio space for things like acoustic scene classification, which is part of the noise cancellation process, where again we outperform traditional microcontrollers by at least a factor of 10 in terms of power consumption.
Speaker 2:So these have been a couple of areas where we've seen a lot of thrust and a lot of pull from customers. But there are, you know, completely new areas where neuromorphic computing actually brings to the table capabilities that have just not been there within the AI space up until today, and the kind of applications that this will end up unlocking will truly be groundbreaking. Which is why a very old saying of mine within the Edge AI group, back when it was called TinyML, was that the future of TinyML is neuromorphic, because if you're really looking to go and sit next to the sensor and do intelligence, it's got to be neuromorphic.
Speaker 1:Yeah, no, it makes sense. I mean, we're seeing that some of the most radical and innovative solutions out there are really battery-powered or self-powered, frankly, because not only do you get that kind of deployment flexibility, but also the ability to peel and stick, to deploy AI-driven sensors and AI-driven equipment without requiring, you know, a power supply.
Speaker 1:Yeah, so what I was going to say was that this idea of low-power AI, which you're familiar with, obviously, from our TinyML roots, really also provides, like, real flexible deployment options.
Speaker 1:So the ability to peel and stick AI sensors anywhere radically reduces the cost of deployment and also enables lots of new scenarios that just weren't possible in the past. So from a kind of commercial perspective, the idea of low-power AI is not just interesting for battery life, but also for deployment cost and the overall return on investment that you're going to get out of the deployment. So I imagine that you're getting a lot of interest. You mentioned video and AI vision sensors. I mean, sometimes we see, you know, NVIDIA Jetson-based AI vision platforms, which obviously are on the far extreme of power consumption. But what are you seeing? I was curious, with the discussion around physical AI and robotic systems, how do you see neuromorphic and Innatera playing in the robotics space? Are there things happening there that would be of interest?
Speaker 2:So we do get a lot of interest on the vision side, but we tend to not focus on conventional vision applications, simply because what we see is that in most applications where you're doing vision, the sensor is already consuming so much power that the processor sitting behind it isn't always the bottleneck anymore. Where we do focus on vision, it tends to be on really, really power-constrained vision using low-resolution image sensors or event-driven cameras. But what we've seen is a larger market even outside of vision. So consider radar sensors, which are increasingly used in devices like doorbells as a mechanism to wake up the more power-hungry camera, or in applications where you possibly don't want to have a camera at all for privacy reasons; for instance, with fall sensors inside of your house, you cannot actually have an imaging component. All of these are really applications that truly have that, you know, continuous monitoring that requires an energy-efficient AI sitting on board, basically also preventing this data from going into the cloud. A large chunk of applications has been around these battery-powered devices, but we do see robotics as being a very interesting area to go into in the future.
Speaker 2:So I see two or three different verticals or, you know, paths that we could go down.
Speaker 2:One is around how you can sense and perceive the environment better and how one can really adapt to the different scenarios in which robots might need to operate, and I think the strength of neuromorphic here is the ability of these solutions to learn autonomously in the field.
Speaker 2:That's really going to be a core strength of these sort of devices in the future. The second one is how one can actually deal with the large volume and large complexity in sensor data that you would typically have inside one of these future robots. And I think here is the fact that neuromorphic is not just a solution for doing inference. One can do a lot more than inference. If you look at the brain, for instance, you see a lot of signal processing functionality that is carried out inherently by the brain's neurons and synapses, and I think this is really an opportunity in the future where you'll see these neuromorphic systems doing a lot more than just that pattern recognition or inference alone. The third part is really within this realm of control and dynamics, where I think the advantage that neuromorphic brings to the table is that of low latency, computation and actuation, and I think that this is an area which is being explored quite a bit in academia, but eventually we'll see a lot more traction in the industry in the years to come.
Speaker 1:Yeah, we had Professor Dhireesha Kudithipudi from UT San Antonio at our keynote in Austin recently, and I know she's a leading voice in neuromorphic computing. We also have a neuromorphic working group in the foundation that I know you're working with, and Charlotte Frenkel, Professor Frenkel from Delft, is part of that as well, so it's a really interesting crossover. I mean, I guess you're leveraging a lot of academia for the research that's going into your space. How does Innatera interact with academia? What's the operational model that you currently have for working with academia?
Speaker 2:We're actually very much engaged in research in everything that we do. The field of neuromorphics is really emerging and we're building products that actually incorporate neuromorphic technology and take them out to market, and that involves a lot of research activities on a day-to-day basis. We've got a large network of partners that we collaborate with in funded projects where we've got specific objectives that we're trying to achieve, and all of these are constantly feeding into the kind of products that we build and application concepts that we also realize. I think a very important part of everything that we do is this aspect of developing an ecosystem for neuromorphic computing, and in our minds, it's around having capabilities to benchmark solutions, having standard hardware implementations that one can actually port application software between, and having a diverse set of application software and actually general tooling available to target this wide range of hardware that companies are coming up with, and that actually requires us to sit down together, talk with our competitors, talk with researchers from universities and come to a common understanding of what we need in order to make this field more vibrant.
Speaker 2:I remember what the world looked like back in 2010 or 2012, long before we had these established frameworks for training neural networks. I don't know if you remember what the neural network world really looked like then, but today what we've got in the industry is a far cry from that. There's so much standardization. We treat neural networks as if they've existed forever, and everyone's kind of, you know, just supposed to know how to train a model from the time they're born. And I think it's really great that the industry is here, and we have similar ingredients around for neuromorphic computing as well: great researchers, great companies, great applications that can make use of this tech. And I think the role of, you know, these working groups and foundations like the Edge AI Foundation is really to bring everything together and to make it possible for neuromorphic to succeed. And finally, as we go down this path, we also engage, you know, with universities on a more one-on-one basis to solve specific problems: bring together groups of people, let them loose on a problem which is really troubling us, and see what comes out of it.
Speaker 1:Right, yeah, no, it sounds like neuromorphic is definitely becoming much more of a commercialized, deployed type of solution, and there's a lot of conversation around that. I also agree it's really a collaborative effort, a team sport, when it comes to a lot of these edge AI technologies, and I know we had talked in the past about the NeuroBench effort and some other things. Like you said, how do we bring together the kind of toolchains and comparative processes so that people can develop and deploy neuromorphic-based solutions faster? It sounds like Innatera is definitely on the cutting edge of all that stuff, so that's awesome. And so how long has Innatera been part of the EDGE AI FOUNDATION? Has it been since the beginning, or many years?
Speaker 2:I think so. In spirit, we've been with the foundation from the beginning. Practically, I think we joined towards the end of 2022, and this was also the time that we were just coming out of stealth. Really, everything that we were doing was pretty hush-hush right at the beginning, and then we joined forces somewhere towards the end of 2022. And I think that even the foundation, and all the impact that it has, has come a really long way since then.
Speaker 1:Yeah, no, I appreciate that. And, like you said, this industry is moving so quickly, you know, it's almost like a week-to-week jump in knowledge and understanding. So, yeah, it's great to see what you're doing. Hopefully, folks that are watching this will dig into neuromorphic computing a little more and see what's going on there, because I really do think it's a fascinating new development that, like you said, is going to enable all these new scenarios to happen, and maybe enable some of the existing scenarios to happen a lot better and cheaper and faster. So that's a good thing. But, Sumit, yeah, I really appreciate the time. Thank you so much, and I hope all is well over in the Netherlands. I'm sure we'll see each other soon.
Speaker 2:Absolutely, sounds good. All right, thanks for having me.