Rabbits can detect danger because they are constantly on the lookout for predators such as foxes. They have a good sense of hearing and sight, and an excellent sense of smell.

According to this article from One Kind Planet, "They have nearly 360° panoramic vision, allowing them to detect predators from all directions; they can see everything behind them and only have a small blind-spot in front of their nose. Their eyesight is geared towards detecting movement at great distances, however, so their up-close vision is limited. Whilst their vision is good, their sense of smell is better, and they will likely be able to smell a predator before they see them.

Rabbits hear in a similar range to humans, but they can detect higher frequency sounds than we can. Their hearing is highly developed, and they can detect sounds from far away. Rabbit ears can move independently of each other, a feature which is used by rabbits to help work out where a sound is coming from."
I was inspired by this video when I was thinking about security in Azure Sphere. Security is about avoiding danger and reducing risks. Always be alert!

That's one of the reasons the Azure Sphere architecture is built with so many security features in mind. From the application environment to authenticated connections to software updates, the Azure Sphere platform provides high-value security at a low cost.

Just like a rabbit, when we deploy an IoT project in the wild, it has to be aware of its surroundings and detect the predators around it, reducing the risk of being "pwned".

One way to secure data is to not send it to the internet for processing. If it can be processed locally on the device, or somewhere on the local network, that's better for security: less chance of leaking data to hackers, less data loss due to network traffic, and less processing cost since you aren't sending data up to the cloud.
That's the promise of Edge Computing. Data processing happens closer to the user. The world of computing is becoming user-centric again instead of living up in a server or cloud somewhere.


Container support in Azure Cognitive Services


Azure Cognitive Services are rich APIs that let developers build intelligent applications without direct AI or data science skills or knowledge. They help us developers create applications that can see, hear, speak, understand, and even begin to reason.


I was surprised by how easy it is to use these pre-built AI models in my projects. Then I saw container support, and it blew my mind.

Yeah, that's how it felt. It means I can deploy these APIs to a container locally. I don't have to send my data up to the cloud to be processed. This is great for IoT applications, for Edge Computing, and for security.
If I combine the security of Azure Sphere with these Azure Cognitive Services APIs running in a local container, then I'd have relatively secure data without sending it outside the local network.
Yeah... blew my mind.
So which Azure Cognitive Services API did I pick? Well, something that I think would make it "begin to reason".


Anomaly Detector containers

Like a bunny rabbit watching its surroundings for possible dangers, anomaly detection algorithms can detect unexpected items in a time-series data stream. It will start "jumping" on anomalies in our data. We've already experienced this in day-to-day life: if you have a credit card and you're on vacation somewhere, suddenly you can't use it anymore. There's an algorithm somewhere that runs and detects, "hey, this is not normal behavior, let's flag it." That's an example of anomaly detection working for you. It's not perfect, but it's worth it when it catches fraud.
That's where machines start to "begin to reason": something is abnormal, so it flags it as an anomaly.

The Azure Anomaly Detector API lets you easily embed anomaly detection capabilities into your apps so users can quickly identify problems. It can detect anomalies as they occur in real time because it infers the expected normal range of your data.


What is the project I did?


My goal is to show how to use the Anomaly Detector API from Azure Sphere using sensor data.


I take the accelerometer data and send it to the Anomaly Detector API running in a container on my laptop or Raspberry Pi. If it detects an anomaly, it sends a signal to the relay and makes the bunny jump.


It won't save the world, but if it inspires you to save the world through the technology used here, then I helped make the world a better place.





Besides, it's a cute bunny; who can resist? Come on, click the "Thumbs Up" button and follow me.



1. Hardware

Of course, the Azure Sphere development board from Avnet, and a battery pack to power the thing.



A Relay Click from MikroE
A remote-control bunny. Just because...
Here's what the whole setup looks like. I connected the bunny's controller to the relay.

2. Software

Since this is a contest, here are the stats.
I followed these instructions on how to set up and connect Azure Sphere. Here's one you can use: Monica has the Lazy Person's Guide to Azure Sphere. I'm lazy, so I followed that.

Then I started with this sample project. If I'm going to use Cognitive Services, I need to learn how to access an API from Azure Sphere.

I'm a C# developer, and Azure Sphere can only be programmed in C... where's the sharp?
I guess I need to be sharp to program in C again. LoL. I was surprised at how much setup is needed just to send an HTTPS call. Look:

/// <summary>
///     Download a web page over HTTPS protocol using cURL.
/// </summary>
static void PerformWebPageDownload(void)
{
    CURL *curlHandle = NULL;
    CURLcode res = 0;
    MemoryBlock block = {.data = NULL, .size = 0};
    char *certificatePath = NULL;

    bool isNetworkingReady = false;
    if ((Networking_IsNetworkingReady(&isNetworkingReady) < 0) || !isNetworkingReady) {
        Log_Debug("\nNot doing download because there is no internet connectivity.\n");
        goto exitLabel;
    }

    Log_Debug("\n -===- Starting download -===-\n");

    // Init the cURL library.
    if ((res = curl_global_init(CURL_GLOBAL_ALL)) != CURLE_OK) {
        LogCurlError("curl_global_init", res);
        goto exitLabel;
    }

    if ((curlHandle = curl_easy_init()) == NULL) {
        Log_Debug("curl_easy_init() failed\n");
        goto cleanupLabel;
    }

    // Specify URL to download.
    // Important: any change in the domain name must be reflected in the AllowedConnections
    // capability in app_manifest.json.
    if ((res = curl_easy_setopt(curlHandle, CURLOPT_URL, "https://example.com")) != CURLE_OK) {
        LogCurlError("curl_easy_setopt CURLOPT_URL", res);
        goto cleanupLabel;
    }

    // Set output level to verbose.
    if ((res = curl_easy_setopt(curlHandle, CURLOPT_VERBOSE, 1L)) != CURLE_OK) {
        LogCurlError("curl_easy_setopt CURLOPT_VERBOSE", res);
        goto cleanupLabel;
    }

    // Get the full path to the certificate file used to authenticate the HTTPS server identity.
    // The DigiCertGlobalRootCA.pem file is the certificate that is used to verify the
    // server identity.
    certificatePath = Storage_GetAbsolutePathInImagePackage("certs/DigiCertGlobalRootCA.pem");
    if (certificatePath == NULL) {
        Log_Debug("The certificate path could not be resolved: errno=%d (%s)\n", errno,
                  strerror(errno));
        goto cleanupLabel;
    }

    // Set the path for the certificate file that cURL uses to validate the server certificate.
    if ((res = curl_easy_setopt(curlHandle, CURLOPT_CAINFO, certificatePath)) != CURLE_OK) {
        LogCurlError("curl_easy_setopt CURLOPT_CAINFO", res);
        goto cleanupLabel;
    }

    // Let cURL follow any HTTP 3xx redirects.
    // Important: any redirection to different domain names requires that domain name to be added to
    // app_manifest.json.
    if ((res = curl_easy_setopt(curlHandle, CURLOPT_FOLLOWLOCATION, 1L)) != CURLE_OK) {
        LogCurlError("curl_easy_setopt CURLOPT_FOLLOWLOCATION", res);
        goto cleanupLabel;
    }

    // Set up callback for cURL to use when downloading data.
    if ((res = curl_easy_setopt(curlHandle, CURLOPT_WRITEFUNCTION, StoreDownloadedDataCallback)) !=
        CURLE_OK) {
        LogCurlError("curl_easy_setopt CURLOPT_WRITEFUNCTION", res);
        goto cleanupLabel;
    }

    // Set the custom parameter of the callback to the memory block.
    if ((res = curl_easy_setopt(curlHandle, CURLOPT_WRITEDATA, (void *)&block)) != CURLE_OK) {
        LogCurlError("curl_easy_setopt CURLOPT_WRITEDATA", res);
        goto cleanupLabel;
    }

    // Specify a user agent.
    if ((res = curl_easy_setopt(curlHandle, CURLOPT_USERAGENT, "libcurl-agent/1.0")) != CURLE_OK) {
        LogCurlError("curl_easy_setopt CURLOPT_USERAGENT", res);
        goto cleanupLabel;
    }

    // Perform the download of the web page.
    if ((res = curl_easy_perform(curlHandle)) != CURLE_OK) {
        LogCurlError("curl_easy_perform", res);
    } else {
        Log_Debug("\n -===- Downloaded content (%zu bytes): -===-\n", block.size);
        Log_Debug("%s\n", block.data);
    }

cleanupLabel:
    // Clean up allocated memory.
    free(block.data);
    free(certificatePath);

    // Clean up sample's cURL resources.
    curl_easy_cleanup(curlHandle);

    // Clean up cURL library's resources.
    curl_global_cleanup();

exitLabel:
    Log_Debug("\n -===- End of download -===-\n");
}

Yeah, I struggled writing C again.




I found out how to use the relay on the Azure Sphere dev board through this article.
If you're going to play with click boards, clone this repository to start. I used the relay demo:


git clone --recurse https://github.com/Avnet/clickboard_demos

Anomaly Detector

Ok, so how do I connect this to Anomaly Detector running in a container?
First, create a resource in Azure.
I followed the instructions here.
Here's the link to get an API key valid for 7 days.
After you get your API key, if you want to try out and test the Anomaly Detector API, here's the link https://algoevaluation.azurewebsites.net/#/
Fill in your key, click on Sample 1, then Start. Notice how it simulates the event and flags any anomaly it finds.
When you hit Stop, look at the Current Response window. It will tell you whether the last value was detected as an anomaly.


Then you can create your Azure Resource


Click Create -> fill in Name, Subscription, Location, Pricing tier, Resource Group


After it's done, click Go to resource.
Then you'll have your API key and endpoint.
I followed these instructions to run the API in a container.
Important: As of Nov 2019, the Anomaly Detector container API requires permission to log in to a private registry to download the container.
You must first complete and submit the Anomaly Detector Container Request form to request access to the container. You must use an email address associated with either a Microsoft Account (MSA) or Azure Active Directory (Azure AD) account in the form.
Once your request is approved, you will receive an email with instructions to access and log in to the private container registry.
Once you log in, you can download the image:


docker pull containerpreview.azurecr.io/microsoft/cognitive-services-anomaly-detector:latest

Then you can run


docker run --rm -it -p 5000:5000 --memory 4g --cpus 1 \
containerpreview.azurecr.io/microsoft/cognitive-services-anomaly-detector:latest \
Eula=accept \
Billing={ENDPOINT_URI} \
ApiKey={API_KEY}


Go to your favorite browser, and you can access the API.

Note: the Anomaly Detector container API needs to be occasionally connected, meaning it has to reach the internet to send billing data once in a while. That's OK; at least I don't have to send any sensor data to the Azure cloud.


Putting it all together

Now that I have the pieces together, there are a few things I need to do. Since I'm sending data to the Anomaly Detector API, I had to learn how to parse JSON in C, and how to collect data and add timestamps.

For parsing JSON, I used the parson library.

I had to re-learn C the hard way through this.



The important part here is "AllowedConnections": make sure the IP address of the laptop/Raspberry Pi running the Anomaly Detector container API is listed so it's reachable. You can also connect directly to your Azure Anomaly Detector endpoint instead of your laptop.

If you're running your container on Windows, one thing I learned is that you have to go to Windows Firewall and open port 5000 so Azure Sphere can reach the container API.



{
  "SchemaVersion": 1,
  "Name": "Anomaly Detection",
  "ComponentId": "011b3254-6b4e-4af1-bc81-22e7f677f4bf",
  "EntryPoint": "/bin/app",
  "CmdArgs": [],
  "Capabilities": {
    "AllowedConnections": [ "", "", "anomaly-detector-ron.cognitiveservices.azure.com" ],
    "AllowedTcpServerPorts": [],
    "AllowedUdpServerPorts": [],
    "Gpio": [ 8, 9, 10, 15, 16, 17, 18, 19, 20, 12, 13, 0, 1, 4, 5, 57, 58, 11, 14, 48 ],
    "Uart": [],
    "I2cMaster": [ "ISU2" ],
    "SpiMaster": [],
    "WifiConfig": true,
    "NetworkConfig": false,
    "SystemTime": false
  },
  "ApplicationType": "Default"
}


Main program


On-board Sensors

I am using the accelerometer data, and since Anomaly Detector currently supports a single value, I decided to use the y-axis acceleration.


memset(data_raw_acceleration.u8bit, 0x00, 3 * sizeof(int16_t));
lsm6dso_acceleration_raw_get(&dev_ctx, data_raw_acceleration.u8bit);
acceleration_mg[0] = lsm6dso_from_fs4_to_mg(data_raw_acceleration.i16bit[0]);
acceleration_mg[1] = lsm6dso_from_fs4_to_mg(data_raw_acceleration.i16bit[1]);
acceleration_mg[2] = lsm6dso_from_fs4_to_mg(data_raw_acceleration.i16bit[2]);
tracked_value = acceleration_mg[1];
Log_Debug("\nLSM6DSO: Acceleration [mg]  : %.4lf, %.4lf, %.4lf\n",
acceleration_mg[0], acceleration_mg[1], acceleration_mg[2]);

I also found out that the lowest data granularity the Anomaly Detector API supports is minutely. So I have to trick the API by changing the time series stored in my data list: at startup, the program captures the current time into the variable refTime; then every second I capture the accelerometer data, add one minute to refTime, and push an entry onto my List.



I also purge the List when it holds more than 200 entries, so it only tracks the last 200.
refTime = refTime + minute;
struct tm tm = *localtime(&refTime);
printf("now: %d-%02d-%02dT%02d:%02d:%02dZ\n", tm.tm_year + 1900, tm.tm_mon + 1, tm.tm_mday, tm.tm_hour, tm.tm_min, tm.tm_sec);
char* timeBuffer = (char*)malloc(TIMESIZE);
snprintf(timeBuffer, TIMESIZE, "%d-%02d-%02dT%02d:%02d:%02dZ",
tm.tm_year + 1900, tm.tm_mon + 1, tm.tm_mday, tm.tm_hour, tm.tm_min, tm.tm_sec);
Entry* entry = malloc(sizeof(Entry));
double value = (double)tracked_value;
entry->data = value;
entry->time = timeBuffer;
List_push(entries, entry);
if (entries->count > 200) {
    // drop the oldest entry here (purge helper omitted)
}
Eventually, once I have enough data to send to the Anomaly Detector API, I build the JSON string from the List.


Sending telemetry: {
  "MaxAnomalyRatio": 0.25,
  "Sensitivity": 95,
  "Granularity": "minutely",
  "Series": [
    { "Timestamp": "2019-01-01T00:02:00Z", "Value": 0.8540000319480896 },
    { "Timestamp": "2019-01-01T00:03:00Z", "Value": 68.076004028320312 },
    { "Timestamp": "2019-01-01T00:04:00Z", "Value": 71.370002746582031 },
    { "Timestamp": "2019-01-01T00:05:00Z", "Value": 71.003997802734375 },
    { "Timestamp": "2019-01-01T00:06:00Z", "Value": 70.150001525878906 },
    { "Timestamp": "2019-01-01T00:07:00Z", "Value": 71.248001098632812 }

I create a POST to the Anomaly Detector API endpoint; I'm using /anomalydetector/v1.0/timeseries/last/detect.
You can test it out on your container too.


The result looks like this. I use the parson library to read the JSON data received from the API, and I look at the isAnomaly attribute.




{
  "period": 0,
  "suggestedWindow": 1441,
  "expectedValue": 71.1259994506836,
  "upperMargin": 0.74541999816894533,
  "lowerMargin": 0.74541999816894533,
  "isAnomaly": false,
  "isNegativeAnomaly": false,
  "isPositiveAnomaly": false
}




I read the isAnomaly attribute, then set or clear the relay state accordingly.



root_object = json_value_get_object(root_value);
bool isAnomaly = json_object_get_boolean(root_object, "isAnomaly");
Log_Debug(isAnomaly ? "true" : "false");
if (isAnomaly)
    relaystate(rptr, relay1_set);
else
    relaystate(rptr, relay1_clr);

That's about it. Here's a thought:



My take on this ... focus your efforts. Glad I'm done.


If this project inspired you to work with Azure Sphere, rekindled your love of programming in C, or blew you away with the Anomaly Detector API, press the "thumbs up" button and follow me.