EDGE AI POD
Discover the cutting-edge world of energy-efficient machine learning, edge AI, hardware accelerators, software algorithms, and real-world use cases with this podcast feed covering all things from the world's largest EDGE AI community.
It features shows like EDGE AI Talks and EDGE AI Blueprints, as well as EDGE AI FOUNDATION event talks on a range of research, product, and business topics.
Join us to stay informed and inspired!
TinyML Implementation for a Textile-Integrated Breath Rate Sensor
Clothes that quietly listen to your breath might be the missing link between hospital‑grade vigilance and everyday comfort. We walk through how our team built a textile‑integrated breath sensor that actually works in the wild—embroidered interconnects, 3D‑printed dielectric islands, and a carbonized‑silicon yarn strain gauge stitched into a belt—then taught it to estimate breathing at the edge with TinyML.
We dig into the engineering choices that matter: why flexible interconnects are the “holy grail” for wearables, how a simple peak detector falls apart with drift and burn‑in, and what it takes to turn raw strain signals into reliable features. After screening public datasets that didn’t match our sensor, we built our own: band‑pass filtering in the 0.1–1 Hz range, three‑second windows, normalization, and event‑button labeling for clean ground truth. From there, we used Edge Impulse’s EON Tuner to search architectures and landed on two contenders—a CNN on time‑domain windows and a compact DNN with wavelet features—then deployed both on an STM32L4 with DMA, timers, and CMSIS‑DSP preprocessing.
The results are candid and practical. The CNN was slower but consistently more accurate and robust; the DNN was snappier with lower power but less reliable under offset and noise. Models trained on a different sensor’s data struggled to generalize to our belt, reinforcing a core lesson for smart textiles: sensor‑specific datasets and fine‑tuning are essential. We close by mapping next steps—expanding our dataset, improving transfer across garments and users, exploring hydration prediction, and tightening on‑device optimization—so remote patient monitoring can be seamless, private, and wearable all day.
If you enjoy deep dives into edge AI, embedded systems, and human‑centric health tech, follow the show, share it with a colleague, and leave a quick review to help others find it.
Learn more about the EDGE AI FOUNDATION - edgeaifoundation.org
Why E‑Textiles Matter For Care
Building The TexHype Belt
From Peaks To Edge AI
Models, Deployment, And Next Steps
SPEAKER_00: I'm a senior scientist there. My name is Georgios Kokkinis, and today, on a bit of a different note from most of the inputs from the industry, I will be presenting our work on a TinyML implementation for a textile-integrated breath rate sensor, from a more academic point of view. Some words about Silicon Austria Labs: it is a publicly funded research institute based in three cities in Austria, focusing its research on all kinds of electronics-based systems, from power electronics to embedded systems, system integration technologies, RF systems, and sensor systems. My team is electronic sensors, and we conduct research on sensor elements but also readout systems for sensor nodes. Thus, naturally, we get into the realm of edge AI and recently started working in the field. Since I'm going to be discussing textile-integrated electronics, some driving factors behind this niche of electronics. One is miniaturization: from traditional transistors, we have come a long way to grain-sized MCUs. On the other hand, we have the other main driving factor, which is additive manufacturing, also incorporated in our project via multi-material 3D printing. Of course, we cannot forget about market potential and societal benefits. In the last years there have been all kinds of new usages of e-textiles in diverse applications, from safety features, heating, health monitoring, rehab, and fitness, and every now and then a lot of new applications pop up. Of course, there are already many highly advanced products on the market that you can purchase, but academia is still spearheading the field. In the context of textile-integrated electronics, flexible interconnects are the holy grail.
Most of the solutions we find in academia and in industry basically consist of dielectric islands where the circuits are assembled, followed by a search for interconnect technologies that could make these systems viable and robust. But what is the need for textile-integrated electronics and sensors in healthcare? You are, of course, aware of the aging of the population, especially in Europe, and the substantial burden it poses on social healthcare systems. So is there a way, through the adoption of such systems, to reduce hospitalization days? Our project TexHype (Textile-Integrated Hybrid Printed Electronics) comes to bring some solutions to these burdening issues: on the one hand with remote patient monitoring systems, where alerts can be provided to healthcare professionals about a deterioration in the health of a patient, and on the other hand by providing seamless integration of electronics and sensors into garments. Our approach is similar to other, mainly academic, systems: a 3D-printed multi-layer dielectric island where the assembly of the circuit comes later on, then, in our case, stitched embroidered interconnects made with a sewing machine, but also embroidered breath-rate sensing elements. So we came to develop the TexHype Belt. It has all these kinds of modules: the power module and a 3D-printed LED, a heart rate sensor based on a PPG sensor module, an MCU and accelerometer, and finally a breath rate sensor; here you see an early prototype of the system. I'm focusing here on the breath rate sensor. We have the different layers of the dielectric island, which was already 3D printed and upon which conductive ink was dispensed. And we used an embroidered carbonized-silicon yarn to form a strain gauge that measures the expansion of the chest.
So we developed this sensor module with the MCU, and then we had to decide what we were going to do with it. The first obvious solution is to use some peak detection algorithm, but then we hit really big hurdles, because such sensors have a burn-in effect, and there are also a lot of offsets appearing at random points during operation that the peak detection algorithm cannot tackle. The obvious next step was to look into an edge AI implementation. First of all, we screened for public datasets. Unfortunately, most of them used ECG and PPG data that were then labeled using impedance pneumography or capnography, and we only found one dataset that used a sensor similar to our system. Unfortunately, that one as well had several limitations, which I will touch on in the next slides. So we opted to create our own custom dataset. Volunteers were recording data in a seated or standing position, and of course we took care of anonymity in the labeling. Going back to the public dataset, the main hurdle was that it was actually missing labels. So, ironically, we used the peak detection algorithm to label the data ourselves. Here you see a nice example of how this worked perfectly well, but then we had data that looked like this, which was not very easily usable. Here we also have an image of the raw signal and the spectrum. So we gathered data and went through a data pre-processing pipeline where we cleaned it with a band-pass filter of 0.1 to 1 Hz, which is the region of interest. We segmented it into three-second splits, normalized them, and labeled them via an event push button. That was also a pretty good strategy, as it gave us a very well-defined breath rate in our data. Moving on to the model architecture, here we depended heavily on Edge Impulse and the neural architecture search feature via the EON Tuner.
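The pipeline described above (band-pass filtering in the 0.1–1 Hz breathing band, three-second windows, per-window normalization) can be sketched on the host side roughly as follows. The 50 Hz sampling rate and the Butterworth filter order are assumptions, as the talk does not state them:

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 50  # Hz; assumed sampling rate, not stated in the talk

def preprocess(raw, fs=FS, window_s=3.0):
    """Band-pass 0.1-1 Hz, split into 3-s windows, min-max normalize each."""
    # 2nd-order Butterworth band-pass over the breathing band (assumed order)
    b, a = butter(2, [0.1, 1.0], btype="bandpass", fs=fs)
    filtered = filtfilt(b, a, raw)
    # Segment into non-overlapping three-second windows
    n = int(window_s * fs)
    windows = [filtered[i:i + n] for i in range(0, len(filtered) - n + 1, n)]
    # Min-max normalize each window to [0, 1]
    out = []
    for w in windows:
        rng = w.max() - w.min()
        out.append((w - w.min()) / rng if rng > 0 else np.zeros_like(w))
    return np.stack(out)
```

On real recordings one would then attach the push-button labels to each window before training.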
This was an automated process to find the optimal neural network architecture design. We narrowed down our search to two distinct models. One used a convolutional neural network where the input was the raw signal with some pre-processing steps on the MCU, and the second was a deep neural network with a wavelet transformation. Naturally, the CNN model had many more parameters than the DNN. We proceeded to train both on both datasets and had really good convergence; in the metrics it was obvious that the TexHype dataset was performing slightly better, and the CNN model was always performing better than the DNN. Then followed the model deployment. We used an STM32L4 microcontroller, with some initial buffering and pre-processing steps to provide the data to the CNN and DNN networks. We acquired the data through an analog-to-digital converter via DMA, and we used two timers: one to precisely control the sampling intervals, and the second simply to measure the inference time. The pre-processing on the MCU consists of an FIR filter from the CMSIS-DSP library, some buffering and an event flag, and finally min-max normalization. The post-training optimization significantly reduced the size of the models, by threefold, and here I have included an image of the filtered data from the sensor. Finally, we ran the inference at the edge. For the convolutional network we had a relatively large inference time, close to six seconds, but, as we have seen before, the results were much better, and we had a power consumption between 3.3 and 6.6 milliwatts. The DNN sat at the maximum of that power consumption but, on the other hand, was much faster in inference time.
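The MCU chain described here (a DMA-filled sample buffer, an event flag, and blockwise FIR filtering via CMSIS-DSP) has to carry filter state across buffer boundaries. A hypothetical host-side mock of that streaming behavior, loosely mirroring how CMSIS-DSP's `arm_fir_f32` keeps state between calls (this is a sketch, not the actual firmware):

```python
import numpy as np

class BlockFIR:
    """Streaming FIR filter that processes fixed-size blocks, loosely
    mirroring CMSIS-DSP's arm_fir_f32 state handling (host-side mock)."""
    def __init__(self, taps, block_size):
        self.taps = np.asarray(taps, dtype=np.float32)
        self.block_size = block_size
        # Carry the last (numTaps - 1) input samples between blocks
        self.state = np.zeros(len(self.taps) - 1, dtype=np.float32)

    def process(self, block):
        """Filter one DMA-sized block; returns block_size output samples."""
        assert len(block) == self.block_size
        x = np.concatenate([self.state, np.asarray(block, dtype=np.float32)])
        # 'valid' convolution yields exactly block_size outputs
        y = np.convolve(x, self.taps, mode="valid")
        self.state = x[-(len(self.taps) - 1):]
        return y
```

Because the state buffer carries the last numTaps − 1 samples, block boundaries introduce no discontinuity in the filtered signal, which is the same reason the CMSIS-DSP API requires a persistent state array per filter instance.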
Now we had two datasets, and we wanted to see how well the model trained on the public dataset, that is, on data from a different sensor, generalizes to the sensor data from our system. Unfortunately, we see that the model does not generalize very well: we have a mean absolute error of 8.3 for the CNN and 10 for the DNN. So the next step would be to fine-tune that model with data from the TexHype dataset and see if we can improve the performance. On the other hand, the model that was trained on input from the same sensor of course performed much better. So the next step is to further collect data and train AI models towards the remote patient monitoring system that I mentioned in the beginning. Currently we are focusing on hydration prediction, and finally, of course, on embedded AI integration of such a system. So that brings me to the end of my presentation. Thank you very much for
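The errors quoted here are presumably mean absolute error between the reference and predicted breath rates, in breaths per minute. As a minimal sketch (the rate values below are illustrative, not from either dataset):

```python
import numpy as np

def mean_absolute_error(y_true, y_pred):
    """MAE between reference and predicted breath rates (breaths/min)."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    return float(np.mean(np.abs(y_true - y_pred)))

# Illustrative values only, not from the TexHype dataset
mean_absolute_error([12, 15, 18], [14, 15, 17])  # -> 1.0
```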