EDGE AI POD

Low Code No Code Platform for Developing AI algorithms

EDGE AI FOUNDATION

Revolutionizing edge computing just got easier. This eye-opening exploration of STMicroelectronics' ST AIoT Craft platform reveals how everyday developers can now harness the power of edge artificial intelligence without writing a single line of code.

The modern IoT landscape presents a paradox: billions of devices generate zettabytes of valuable data, yet transforming that raw information into intelligent systems remains frustratingly complex. ST's innovative low-code/no-code platform elegantly solves this problem by distributing intelligence across three key components: smart sensors with embedded AI algorithms, intelligent gateways that filter data transmission, and cloud services that handle model training and adaptation.

At the heart of this revolution is truly remarkable in-sensor AI technology. Imagine sensors that don't just collect data but actually think – detecting whether a laptop is on a desk or in a bag, whether an industrial asset is stationary or being handled, or whether a person is walking or running. These decisions happen directly on the sensor itself, dramatically reducing power consumption and network traffic while enabling real-time responses. The platform offers 31 different features including mean, variance, energy in bands, peak-to-peak values, and zero crossing that can be automatically selected and applied to your data.

What makes ST AIoT Craft truly accessible is its browser-based interface with six pre-built examples spanning industrial and consumer applications. Users can visualize sensor data in real time, train models with a single button click, and deploy finished solutions directly to hardware – all without diving into complex code. The platform even handles the intricate details of filter selection, feature extraction, window length optimization, and decision tree generation automatically.

Ready to transform your IoT projects with embedded intelligence? Visit st.com, search for ST AIoT Craft, and discover how you can teach your sensors to think – no coding required.


Learn more about the EDGE AI FOUNDATION - edgeaifoundation.org

Speaker 1:

Good morning everybody. So I'll talk about a low-code/no-code platform that we have built for deploying AI solutions and algorithms. My co-authors are Swapnil Sayan Saha and David Ali Priyandi. So, as we heard in many talks this morning, there are billions of IoT devices now that are transmitting zettabytes of data to the cloud. The very first talk mentioned about 50 billion devices. Who knows? The number might be much bigger or smaller, depending on how you define the IoT devices. However, high-quality data is essential, as we all know, for building good AI/ML algorithms. One of the important aspects here is: how do you bring in the right kind of data? So I'll talk a little bit about that. And the second thing is that, if you consider building a sensor-to-gateway-to-cloud solution, it's something that is fairly intensive and complex. So one of the approaches that we are taking is to put together a low-code/no-code solution which will reduce this complexity and solve this problem. You would have seen a demo of this in the workshop yesterday morning that was held by my colleague, Davide Sergi. So I'm talking more about this particular solution that we have at ST. So what exactly is ST AIoT Craft? It's an online, scalable platform. In this platform you start out with the SensorTile.box PRO that we have, which has a variety of sensors in it: accelerometers, gyroscopes, a temperature sensor, a pressure sensor, a magnetometer, a microphone, all of these different sensors. It has Bluetooth connectivity and, essentially, some storage on it. This sensor node can then talk to the cloud solution through a gateway, and this gateway can be your mobile phone, your PC, whatever else it may be.

Speaker 1:

The ST AIoT Craft solution has a variety of examples available, so you can start with an example that might suit your use-case scenario. To become familiar, you have a bunch of example data sets that you can work with in order to try out the whole system. There are project APIs, data set APIs, even job control APIs available, so you can configure different jobs to put the solution together. You can also eventually build your own cloud app, starting from a cloud app that we provide. So if you consider all the components that are used to build this, you would need a piece of hardware to start with. We have a number of example hardware kits available, fully functional, that you can use to build your solution. One of them is the SensorTile.box PRO. I'll talk about a few others that you can utilize.
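As a purely illustrative sketch of what driving such a project/dataset/job workflow over REST-style APIs could look like, the snippet below strings together the three API families mentioned above. The endpoint paths, field names, and authentication scheme are assumptions for illustration, not the documented ST AIoT Craft API.

```python
# Hypothetical sketch of a project -> dataset -> training-job workflow over REST.
# Endpoint paths, JSON fields, and auth are illustrative assumptions, not ST's real API.
import requests

BASE = "https://example-aiot-platform/api/v1"   # placeholder base URL
HEADERS = {"Authorization": "Bearer <token>"}   # placeholder credentials

# 1. Create a project
project = requests.post(f"{BASE}/projects",
                        json={"name": "smart-asset-tracking"},
                        headers=HEADERS).json()

# 2. Register a dataset and upload a labeled sensor log
dataset = requests.post(f"{BASE}/datasets",
                        json={"project_id": project["id"], "name": "asset-logs"},
                        headers=HEADERS).json()
with open("asset_logs.csv", "rb") as f:
    requests.post(f"{BASE}/datasets/{dataset['id']}/files",
                  files={"file": f}, headers=HEADERS)

# 3. Launch a training job and poll its status
job = requests.post(f"{BASE}/jobs",
                    json={"project_id": project["id"], "type": "train"},
                    headers=HEADERS).json()
status = requests.get(f"{BASE}/jobs/{job['id']}", headers=HEADERS).json()["status"]
print("job status:", status)
```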

Speaker 1:

Then the next piece of this whole solution is in-sensor AI technology. We have smart sensors that can run machine learning algorithms in them. Some of these sensors have been available and used in commercial products, such as your laptops, for four or five years now, for detecting things like: is the laptop on the lap or on the desk, is the laptop in the bag, or is the person walking with it? All of those solutions are implemented inside the sensor. So in-sensor AI technology is one of those pieces. And then there are ST software solutions: a variety of software tools that let you analyze data, crop data, and prepare the data for the machine learning training process. And finally there is this ST AIoT Craft solution that I'm talking about. It does take care of data set and project privacy, which is a very serious aspect and which the previous talk touched on. The solution handles that using components such as KubeArmor, Grafana, and similar components that you utilize on the cloud side.

Speaker 1:

So let me talk a little bit more about the in-sensor compute part. As you can imagine, the fundamental thing it needs is a sensor which interacts with the physical world. You would essentially use a sensor to measure acceleration, angular rate, magnetic field strength, pressure, temperature, humidity, whatever it may be; a microphone would measure sound. So this sensing data is the first piece. Then there is a computational block. If you can apply filters to this sensor data, your machine learning algorithms will perform better, so we have a method of automatically selecting the filters, which I'll talk more about in the next few slides. And then you have predefined feature blocks available inside the sensor to put together an algorithm. There are a variety of feature blocks, such as mean, variance, energy in a band, peak-to-peak value, zero crossing. All of those features can be utilized, and then, finally, you can run your decision tree inside the sensor and you can even apply a meta-classifier on that decision tree's output.
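To make that pipeline concrete, here is a minimal offline sketch in Python of the same idea: filter a window of accelerometer samples, compute a few of the feature types named above (mean, variance, energy, peak-to-peak, zero crossings), and feed them to a small decision tree. The filter settings, window size, and toy data are assumptions for illustration; the real machine learning core runs fixed feature blocks inside the sensor rather than this floating-point code.

```python
# Minimal offline sketch of the in-sensor pipeline: filter -> features -> decision tree.
# Filter parameters and toy data are illustrative, not the sensor's fixed-point blocks.
import numpy as np
from scipy.signal import butter, lfilter
from sklearn.tree import DecisionTreeClassifier

def window_features(window, fs=25.0):
    """A few window features similar in spirit to the sensor's feature blocks."""
    b, a = butter(2, 5.0 / (fs / 2), btype="low")            # example 5 Hz low-pass filter
    x = lfilter(b, a, window)
    zero_crossings = int(np.sum(np.signbit(x[:-1]) != np.signbit(x[1:])))
    return [x.mean(), x.var(), float(np.sum(x ** 2)),        # mean, variance, energy
            float(x.max() - x.min()), zero_crossings]        # peak-to-peak, zero crossings

# Toy data: 25-sample windows of an accelerometer norm, 0 = stationary, 1 = shaken
rng = np.random.default_rng(0)
X = [window_features(rng.normal(0, s, 25)) for s in (0.01,) * 50 + (0.5,) * 50]
y = [0] * 50 + [1] * 50

tree = DecisionTreeClassifier(max_depth=4).fit(X, y)         # a tree like this would later be
print(tree.predict([window_features(rng.normal(0, 0.5, 25))]))  # mapped onto the sensor's MLC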

Speaker 1:

So now, to implement this using ST AIoT Craft, I'll talk about the pieces for that. The very first is the supported hardware. There are a variety of hardware kits available. One of them is the STWIN.box, which contains industrial-grade sensors: accelerometers, gyroscopes, a temperature sensor. It has a Bluetooth Low Energy connection, so you can extract the stored data or you can extract the data in real time. The SensorTile.box PRO has consumer-grade sensors: accelerometers, gyroscopes, a pressure sensor, a humidity sensor, a microphone. So there are two flavors of hardware: the STWIN.box with industrial-grade sensors and the SensorTile.box PRO with consumer-grade sensors.

Speaker 1:

On ST AIoT Craft, you have a variety of examples available. One of the features is called flash-and-run. There are six examples available on ST AIoT Craft, which you can access through a browser: you can go to st.com, search for ST AIoT Craft, and you'll land on a page where you will see these six examples. For industrial, the examples are smart asset tracking, fan coil monitoring, and a smart power tool, where, using the sensor on a power tool, you can determine the state of the power tool, like a drilling machine. For consumer applications, there are three more examples: human activity recognition and gesture recognition, with two flavors of gesture recognition. These examples allow you to work with a set of data that's already available on the cloud, and you can actually flash it down to the SensorTile all the way from your browser.

Speaker 1:

So the main idea here is that you have a complete working solution that you can try out without having to write any code. Here is an example of what you will be able to see in the browser. The top trace that you see here is a real-time trace (of course, a screenshot) of accelerometer data; the three axes of accelerometer data are shown. In the case of the smart asset tracking example, there are four different classes being shown, so the output of the algorithm running on the edge, in the SensorTile, is visible to you in the browser in real time. The first class is that the asset is stationary but upright. The second one is that the asset is not upright. The third one is that the asset is in motion. The fourth one is that the asset has been shaken. All of this can be seen in real time using your browser.

Speaker 1:

So how do you create a new project? The very first thing you would do is define a data set. If you are collecting data for this smart asset tracking example, you would be logging data for when the box is stationary, when the box is in motion, when somebody is shaking it, and when it is sitting upright or not upright. So you collect all of this data and define your data sets. Then you import your data into the project. This could be done using something like the SensorTile, or you could bring in your own data: if you have logged data previously, you can import it. You define your new project, then you define the new model that you want to use, you train the model, and you can test this model in real time.

Speaker 1:

For data management, there are a few different aspects. Number one is data acquisition. The data acquisition can be done using the ST AIoT Craft web portal. The web portal will allow you to define your sensor and will essentially establish the path all the way from the cloud, through the gateway, to the sensor node, and then you define which sensors you are collecting data from. You can collect this data from multiple such sensors, or you can use the mobile app that's available, which allows you to do the same thing. You can also upload previously acquired data.

Speaker 1:

We also have a CSV wizard. This CSV wizard allows you to transform your data into the format that is expected by ST AIoT Craft. You could have different sampling rates for the sensor data, or different scales for the sampled data; all of those can be transformed. Essentially, it is able to deal with heterogeneous data sets: some files might contain sensor data at, let's say, a 50 hertz sampling rate, others at 200 hertz. You can downsample the sensor data to the chosen sample rate, and the CSV wizard will guide you through that process. You can also download those data sets onto your laptop or desktop device and do your own analysis. ST AIoT Craft provides you that capability.
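As a rough illustration of what that kind of rate harmonization amounts to (the file names, column layout, and the 50 Hz / 200 Hz rates below are assumptions taken from the spoken example, not the wizard's actual implementation), downsampling heterogeneous CSV logs to one common rate could look like this:

```python
# Rough sketch of harmonizing CSV logs recorded at different sampling rates.
# File names, columns, and rates are illustrative assumptions, not the CSV wizard itself.
import pandas as pd

TARGET_HZ = 50  # chosen common sampling rate

def load_and_resample(path, source_hz):
    df = pd.read_csv(path)                              # e.g. columns acc_x, acc_y, acc_z
    step = int(source_hz // TARGET_HZ)                  # 200 Hz -> keep every 4th row
    return df.iloc[::step].reset_index(drop=True) if step > 1 else df

frames = [
    load_and_resample("log_50hz.csv", 50),              # already at the target rate
    load_and_resample("log_200hz.csv", 200),            # downsampled to 50 Hz
]
merged = pd.concat(frames, ignore_index=True)
merged.to_csv("dataset_50hz.csv", index=False)
```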

Speaker 1:

The other aspect is a visualizer that's available to you, so you can visualize the data, zoom into it, crop it, and label it. The cleaning of all of this data can be done using this particular tool. Now the next step is the model training. For model training, it's a one-button solution that we have. The very first piece that you see here is the training data that's available. In the case of the asset tracking example, which is what I'm describing here, there are four classes: the asset is stationary, it is in motion, it is upright or not upright, or it has been shaken. The tool lets you visualize whether all of the data you have is roughly equally distributed, so that you don't have too much of, let's say, stationary data, where 90% of your data is stationary. That is something you want to avoid, and you would be able to see it here. The next step is that you can select the sensors.

Speaker 1:

Then you can select the filters and features and train your solution. It can all be done using a single button, without writing any line of code. What you see here are all the default features that have been selected. For filters, there are two different methods that we have developed that you can utilize: a simple selection method or an exhaustive selection method. For features, it's the same thing: the tool will automatically select the features for you based on the labeled data that you have provided. So how does it work? The very first step is the hyperparameter tuning, where you select basic or exhaustive, one of those two methods of selecting filters. These filters will be applied to the X, Y, Z axes of the sensor data, or to the norm of the sensor data, or to the norm squared; you can choose that.
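For intuition, here is a small sketch of what it means to apply a candidate filter to the individual axes, the norm, or the norm squared of accelerometer data. The filter parameters are arbitrary placeholders; which candidates the basic versus exhaustive modes actually search is internal to the tool and not shown here.

```python
# Sketch of the candidate input channels a filter can be applied to:
# individual axes, the vector norm, or the squared norm. Filter settings are arbitrary.
import numpy as np
from scipy.signal import butter, lfilter

def candidate_channels(acc_xyz):
    """acc_xyz: array of shape (N, 3) with accelerometer samples."""
    norm = np.linalg.norm(acc_xyz, axis=1)
    return {
        "x": acc_xyz[:, 0], "y": acc_xyz[:, 1], "z": acc_xyz[:, 2],
        "norm": norm,
        "norm_sq": norm ** 2,
    }

def apply_filter(channel, fs=25.0, cutoff_hz=2.0):
    b, a = butter(2, cutoff_hz / (fs / 2), btype="high")   # example high-pass candidate
    return lfilter(b, a, channel)

acc = np.random.default_rng(1).normal(0, 1, (100, 3))       # toy accelerometer log
filtered = {name: apply_filter(ch) for name, ch in candidate_channels(acc).items()}
```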

Speaker 1:

The next step is automatic window length selection. As you can imagine, when you are applying these features, you are applying them to a window of data. By looking at the spectral content, the algorithm is able to pick the right window length. Imagine the example of human activity recognition: if you want to detect whether a person is walking, running, or jogging, a one-second update would be good enough, so if your sensor data is acquired at, let's say, 25 hertz, your window length in this case would be 25 samples. Then there is automatic feature selection. There are five different methods that compose this automatic feature selection: random forest, ANOVA, AdaBoost, sequential feature addition, and recursive feature elimination. Using any of these five methods, or a combination of them, you arrive at a set of features that will be utilized to train your model.
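As a hedged illustration with scikit-learn (the platform's own selection logic is not public, so these are standard stand-ins for a few of the five methods named), the window-length arithmetic and two feature-selection approaches look like this:

```python
# Illustrative stand-ins for the feature-selection methods mentioned
# (ANOVA ranking, random-forest importance, recursive feature elimination).
# Toy data and parameters; the platform's own implementation is not reproduced here.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectKBest, f_classif, RFE

fs_hz, update_s = 25, 1.0
window_len = int(fs_hz * update_s)          # 25 samples, as in the talk's example

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 31))              # 200 windows x 31 candidate features (toy data)
y = rng.integers(0, 4, size=200)            # 4 activity classes (toy labels)

anova = SelectKBest(f_classif, k=8).fit(X, y)                       # ANOVA F-test ranking
rfe = RFE(RandomForestClassifier(n_estimators=50),
          n_features_to_select=8).fit(X, y)                         # recursive elimination

print("window length:", window_len)
print("ANOVA picks:", np.flatnonzero(anova.get_support()))
print("RFE picks:  ", np.flatnonzero(rfe.get_support()))
```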

Speaker 1:

Now, the smart sensor that we talked about offers you 31 such features. Such features include mean, variance, energy in a band, peak-to-peak value, zero crossing, all of those. So essentially you will go through that process. The next step is generating a decision tree, and this can all be done automatically. Once the decision tree is generated, you transform it into a file that will ultimately be downloaded into the sensor. From the cloud, through the gateway, this new decision tree is downloaded onto the SensorTile or the STWIN.box, one of those two devices, or your own platform. So there's a bidirectional communication that you need to establish, which is what I'll talk about next. The bottom line here is that this whole process of selecting filters and features, applying them to the sensor data you have made available, generating a decision tree, and then preparing it in the right format to be downloaded into the sensor is all done using one button: the Try MLC button will do all of this, preparing the solution for the machine learning core in the sensor. Of course, when you provide labeled data, it will also give you the statistics on the decision tree that you arrive at through this process.
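To show what the underlying decision-tree step amounts to in plain Python, here is a small sketch of training a tree on extracted features, checking its statistics, and dumping it as text. The toy data is invented, and the actual artifact the tool generates is a device-specific configuration for the sensor's machine learning core, not the text dump produced below.

```python
# Sketch of the decision-tree step: train a small tree on selected window features,
# report simple statistics, and inspect its structure. The real tool converts the tree
# into a device-specific configuration for the sensor's MLC; that format is not shown.
import numpy as np
from sklearn.metrics import accuracy_score
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(0)
X = rng.normal(size=(400, 6))                   # 6 selected window features (toy data)
y = rng.integers(0, 4, size=400)                # stationary / not upright / in motion / shaken

tree = DecisionTreeClassifier(max_depth=4).fit(X, y)
print("training accuracy:", accuracy_score(y, tree.predict(X)))
print(export_text(tree, feature_names=[f"feat_{i}" for i in range(6)]))
```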

Speaker 1:

Now let's briefly talk about the IoT system itself. The IoT system essentially has three components. First, the sensor nodes: you can use the use cases that are shown on the portal, the six examples I talked about, as a starting point, or you can bring in your own data, using the CSV wizard that I talked about, and upload it for processing.

Speaker 1:

The second piece is the gateway. The gateway runs logic called data sufficiency logic, which means the data transferred from the SensorTile is processed on the gateway to conserve communication bandwidth. You're not shipping stationary data, because if you consider the asset, most of the time it will be stationary, and you don't want to keep shipping stationary data to the cloud all the time; you want to ship relevant data for all the selected classes. That is done on the gateway. The other thing the gateway does is protocol translation. You can connect your SensorTile to the gateway using either a BLE (Bluetooth Low Energy) connection or USB, and the protocol translation takes this BLE data and prepares it in a format that can be processed by the data sufficiency logic. The next piece is identity translation. You can imagine hundreds of thousands of those SensorTiles, or your own sensors, deployed all over, so you need to set up bidirectional communication from the smart sensor, through the gateway, to the cloud. This identity translation is done for each sensor, pairing it with its digital twin in the cloud.
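As a simple sketch of the idea behind that gateway-side filtering (the threshold, class names, and forwarding callback below are placeholders, not ST's data sufficiency implementation), a gateway could forward only the interesting windows and a small sample of the stationary ones:

```python
# Sketch of gateway-side "data sufficiency" filtering: forward windows whose predicted
# class is something other than the dominant stationary state, plus a small sampled
# fraction of stationary windows. Names and thresholds are placeholder assumptions.
import random

KEEP_STATIONARY_FRACTION = 0.01   # forward ~1% of stationary windows for monitoring

def should_forward(window_class: str) -> bool:
    if window_class != "stationary_upright":
        return True                               # always forward interesting events
    return random.random() < KEEP_STATIONARY_FRACTION

def on_sensor_window(window_class: str, payload: bytes, publish) -> None:
    """Called for each classified window received over BLE or USB from the sensor node."""
    if should_forward(window_class):
        publish(payload)                          # e.g. hand off to the cloud client app
```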

Speaker 1:

There are multiple paths to building a solution. The very first one is that you connect your SensorTile to your laptop over USB and try things out: you have the ST AIoT Craft solution running in the cloud and a web browser to access it. The next one is that you bring in a Raspberry Pi kind of gateway; this gateway will run your data sufficiency logic, do some storage for you, and then communicate through an ST AIoT Craft client app to the cloud solution. The next implementation is that you build your own client app, if you do not want to utilize the one that is provided with ST AIoT Craft. And finally, if you are happy with the solution, you can take all the logic that you have developed and tested using the SensorTile.box PRO onto your own device. Your device could be your smartwatch, your IoT device, an industrial sensor node, whatever it may be.

Speaker 1:

And finally, just to describe the different components we have and give you a full idea of what this IoT system is: I talked about the protocol translator, the identity translator, the client app, all of those pieces. There is also Cosmos DB, which is used to store all of this data globally, so you can have this solution deployed worldwide using a Cosmos DB kind of storage mechanism.

Speaker 1:

So, in summary, we talked about a platform that uses a distributed computing paradigm. You run your AI logic inside the sensor, which gives you a solution that is very effective at reducing power consumption for a battery-powered device. You run logic such as the data sufficiency logic on the gateway, and the third chunk of logic runs in the cloud, where you generate the best AI algorithm for the selected sensor data; adaptation occurs there as well, in the cloud. So we have a platform that can be used to put together a solution that is power efficient, fully deployable worldwide, and scalable. That's all I have. I can take any questions that you may have.