|Product Performed to Expectations:||7|
|Specifications were sufficient to design with:||8|
|Demo Software was of good quality:||9|
|Product was easy to use:||8|
|Support materials were available:||9|
|The price to performance ratio was good:||7|
|Total Score:||48 / 60|
This is my review of the /SMARTEDGE AGILE sensor and the Brainium IoT platform. Together, the platform and the sensor are designed to form an end-to-end solution providing both artificial intelligence and security.
As background: I’m a lifelong (home) automation hobbyist, I have an MSc in electrical engineering, and I’m currently employed as a software architect, designing and implementing IoT solutions for monitoring and controlling solar power plants.
When I first received my /SMARTEDGE AGILE, the first thing I noted was how little information about the device is actually given on the packaging. This is understandable, of course, as people buying this kind of device generally know what they’re buying, rather than browsing through the shelves of the local tech store.
Of course, there is the obligatory information on how to get started, which sensors are included, et cetera, so in a sense I could see this device on store shelves as well. The second thing to note about the device is just how small it is. The dimensions are just 32 x 17 x 64 millimeters, and the whole thing weighs in at 27 grams. This is even more surprising once we get to the second stage, which of course is to tear the whole thing open!
The teardown itself is very easy: just a couple of hex screws at the end of the device. However, I wouldn’t recommend tearing the device open, as there are a couple of rubber seals which are fairly easy to break. The seals are nice to see, even if the device doesn’t advertise any IP rating.
On the inside, there’s an STM32L486-series microcontroller from STMicroelectronics, which should provide plenty of processing power for this kind of application. BLE 5.0 is handled by an nRF52840 from Nordic Semiconductor, which is to be expected, as Nordic seems to be the go-to BLE IC provider nowadays. There is also some kind of expansion header, which at the time of writing is undocumented. Hopefully we’ll see documentation and programming examples for adding custom sensors in the future…
At this point I’d like to note that I did the commissioning the way I’d expect anyone to do it: without reading any documentation.
Once the hardware was back in one piece, it was time to proceed with commissioning the SMARTEDGE AGILE by registering on the Brainium platform. The registration itself is done in two steps: first, only an email address is requested. You’ll then receive an email with a “registration token”, which is used in the next step together with a new password. My guess is that this is done to prevent registration with a non-existent email account, a fairly common practice nowadays.
After logging in for the first time, you’re greeted by a short step-by-step tutorial for connecting the AGILE devices, as shown below.
As the device has no means of communication besides Bluetooth, it requires a gateway to access the internet and any server-side software it depends on. Installing the Brainium gateway application on an Android phone goes through the usual route of going to the Play Store, installing the application and running it. After logging in, your phone is set up as a gateway, ready to bridge communication between the SMARTEDGE AGILE and the Brainium cloud service.
At this point I first felt confused, then frustrated and finally stupid. There I was, staring at “Step 1: Connect Agile devices” on my PC, phone connected as a gateway, and nothing happened. There was no way of searching for devices in the Android application, which left me a bit baffled. After a while, I realised that the step-by-step tutorial is not interactive at all. To proceed, you need to click the faintly visible purple arrow at the bottom-right corner of the screenshot above (note: the image is cropped for viewing pleasure; the page itself was centered on a fairly large screen).
Once I had figured out my stupidity, I was so frustrated that I don’t think I really followed the “recommended” steps of setting up the system. Instead of creating a project, I started by browsing through the web interface, and finally found my way to the Gateways page. At least my phone showed up there, and I could rename it to something reasonable.
After this, it was time to connect the SMARTEDGE AGILE to my gateway. As a first step, I had to agree to the terms of the “discovery period”, which grants 180 days of portal usage, 15 GB of cloud storage, and 2 GB of “meta sensing traffic”. The limits apply on a “whichever comes first” basis, so if you stream real-time data constantly, you’re bound to run out of traffic.
Connecting the SMARTEDGE AGILE to the gateway was, in the end, fairly simple: the device search is started via the web page, and the device appears in the UI fairly quickly. One negative thing about the listing is that (as far as I understand) there’s no identifier on the device itself that corresponds to the one shown in the web interface. This means you’re quite likely to pick up a co-worker’s device if you’re setting up simultaneously.
After connecting the SMARTEDGE AGILE to the Brainium platform, it was time to set up a project to utilise it. Setting up the project was also fairly simple, maybe because I was already somewhat familiar with the user interface. It was just a matter of selecting a name and adding the device to the project. After setting up a few widgets to show values, it was time to start streaming data from the sensors for the first time. This can be useful for checking on a device from time to time, but my guess is that it’s also the best way to burn through the remaining traffic quota fairly quickly.
My very first reaction was “wow”. The data streaming is incredibly fast and responsive, considering the steps in the transmission chain (it should be noted that I had of course cranked up the speed settings). The delay between moving the device next to the light on my test bench and the values updating on screen was almost non-existent. After playing around with the different sensors on the device, it’s fairly obvious that the sensors are pretty much the best you can get; they seemed both very accurate and responsive in my tests.
Once the initial tests were done, it was time to get to know the artificial intelligence side of the device. The AI functionality currently seems to support only one sensor, the accelerometer. I would have hoped for a bit broader sensor support from the start; for example, light and sound would be useful for my own use cases. The good news is that both the firmware and the web portal seem to receive constant updates, so I have high hopes for future improvements. At the time of writing there are two functionalities in the AI system: motion recognition and predictive maintenance.
I first started with the motion recognition, as my plan was to utilise it in my test project. The UI is quite good, and setting up a motion recognition model is fairly simple. It’s also possible to set up multiple motions for a single model. Below is a screenshot of my first test training, which consisted of just moving the device in a manner similar to a door opening and closing.
There really isn’t a lot to configure, but I guess that's the whole point of using artificial intelligence: trying to get rid of human involvement.
The predictive maintenance functionality is even simpler than the motion recognition. The idea is to just attach the AGILE to a device you want to monitor, and the AI learns its typical operation. Any changes to this behaviour are detected as anomalies. Below are a couple of screenshots of the UI for this, with the AGILE sitting on my desk as I’m writing this review.
There’s even less information about how this functions than for the motion recognition, as you don’t even get to see the data the AGILE has reacted to. All you see is the anomaly type, the timestamp and the deviation from the typical pattern. The learning does seem to work, as no constant anomalies were detected from my typing, and all five times I moved the device were detected.
Unfortunately I didn’t have the time to do this part properly, because of the busiest time of year at work, so I’m just going to write up what I aimed to do and how I’m planning to do it.
My home automation currently consists of multiple measurements, including temperature, humidity, and light sensors. Also included are motion detection in all main areas and magnetic switches on the doors and windows. The one thing I haven’t been able to reliably detect so far is whether I’m going in or out of the door. The goal here would be to set the home/away state of the automation.
This is where the SMARTEDGE AGILE will step in. My goal is to teach it the different patterns of the door opening and closing, depending on the direction I’m going. From the tests I’ve done so far, it seems perfectly possible to detect the differences in movement, timing, et cetera.
When first looking into the device, I only used my phone as the gateway. This of course won’t work if it’s going to be a more permanent solution. Luckily, gateway software exists for the Raspberry Pi. Installing and setting up the software was exactly as simple as the instructions suggest. The gateway has worked fairly well, but it should be noted that I haven’t tested it together with other Bluetooth devices connected to the same RPi.
Once the Brainium platform is set up, the next logical step is to try and get the information out of it, for consumption on other platforms. To achieve this, we first need to set up rules to actually get alerts from anomalies. It’s also possible to just monitor changing sensor values and receive alerts when they go over/under threshold values. Alerts are set up through the AI Rules section in the device management, shown in the screenshots below.
After saving the rule and its alert, the alerts appear in a list in the Brainium platform, as shown in the screenshot below.
For getting the alerts out of the platform, Brainium offers three ways: portal only, e-mail, and IFTTT integration. Luckily there’s a fourth option they don’t yet advertise that much: the APIs.
The Brainium platform provides both REST and MQTT APIs for receiving alerts from the devices. I opted for MQTT, as it provides real-time alerts without having to constantly poll for new ones. Setting up the MQTT client was a breeze, as the platform has quite excellent documentation.
After setting up the MQTT client, it’s just a matter of subscribing to the alerts topic. This can be done either for each device separately or for all devices at once. After subscribing to the topic and setting up a callback function to handle the messages, the rest of the integration should be a breeze.
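To give an idea of what such a subscriber can look like, here is a minimal sketch using the paho-mqtt Python library. The broker host, topic name, credentials, and payload field names are hypothetical placeholders of my own, not Brainium’s actual values; check the platform’s API documentation for the real ones.

```python
import json

BROKER_HOST = "mqtt.brainium.example"  # hypothetical host, not the real endpoint
ALERTS_TOPIC = "devices/+/alerts"      # hypothetical topic; '+' matches any device ID

def parse_alert(payload: bytes) -> str:
    """Turn a raw JSON alert payload into a one-line summary.

    The field names (timestamp, type, deviation) are assumptions based on
    what the portal shows for an anomaly, not a documented schema.
    """
    alert = json.loads(payload)
    return f"{alert['timestamp']}: {alert['type']} (deviation {alert['deviation']})"

def on_message(client, userdata, msg):
    # Called for every message arriving on a subscribed topic.
    print(parse_alert(msg.payload))

def on_connect(client, userdata, flags, rc):
    # Subscribe inside on_connect so the subscription is re-established
    # automatically after a reconnect.
    client.subscribe(ALERTS_TOPIC)

def run():
    # Imported here so the parsing helper above works even without paho-mqtt installed.
    import paho.mqtt.client as mqtt
    try:  # paho-mqtt 2.x requires an explicit callback API version
        client = mqtt.Client(mqtt.CallbackAPIVersion.VERSION1)
    except AttributeError:  # paho-mqtt 1.x
        client = mqtt.Client()
    client.on_connect = on_connect
    client.on_message = on_message
    # client.username_pw_set("app-id", "api-key")  # credentials from the portal
    client.connect(BROKER_HOST, 1883)
    client.loop_forever()  # blocks, dispatching messages to on_message
```

From here, replacing the `print` in `on_message` with a call into your own automation (in my case, toggling the home/away state) is all that’s left.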
The Brainium platform seems like a very promising and easy way to start doing things based on artificial intelligence. The platform is fairly simple to use, although tooltips to better guide you through the UI wouldn’t hurt. The UI should also be redesigned a bit to work better on phones and tablets. I would think that the most obvious choice for maintenance personnel walking through a factory would be a phone or a tablet, instead of lugging around a laptop.
As for the SMARTEDGE AGILE itself, my thoughts are a bit conflicted. The device seems fairly robust and fit for initial testing of the platform. But for actual deployment the battery life doesn’t seem long enough, and I would also like to see at least one (preferably more) voltage (0-10 V) and/or current (4-20 mA) input. Both are very commonly used in existing systems, and would therefore allow very easy integration with existing measurements. Of course, this would also mean expanding the pattern recognition and predictive maintenance functions to these signals as well.
It seems like the SMARTEDGE AGILE was created to give a very easy way of testing out the platform, but without documentation on how to integrate the functionality into your own device, I get the feeling that, at least for me, it would be hard to find a use case for it. On the other hand, if there was a clear solution for either integrating the functionality into my own hardware, or even streaming data from my own platform to Brainium for the AI side, I’d see numerous use cases already.