|Product Performed to Expectations:|10|
|Specifications were sufficient to design with:|8|
|Demo Software was of good quality:|9|
|Demo was easy to use:|10|
|Support materials were available:|8|
|The price to performance ratio was good:|9|
|Total Score:|54 / 60|
First of all, I would like to thank Randall, Daniel, and everyone at element14, Farnell and Maxim for the fantastic opportunity I have been given to test these kits. Originally, I intended to cover all the review material in one single RoadTest post, but considering its size, I thought it would make sense to split it into two separate but consecutive parts. Therefore, this is the first part of a two-part RoadTest review of the 1-Wire® Grid-EYE® Sensor w/Arduino-Compatible PCB.
The MAXREFDES131# 1-Wire GridEYE Sensor and the MAXREFDES130# Building Automation Shield combined can be used for a wide range of applications, and each kit probably deserves its own RoadTest. I will do my best to cover as much ground as possible, but inevitably some features will not be thoroughly tested (apologies for that!). In particular, I will focus on object detection leveraging thermal information, so the GridEYE sensor will be my “champion”, relegating the Building Automation Shield to the role of “sparring partner”.
This first part of the RoadTest will focus on describing the kits and going through the unboxing and the standard documentation, while also performing some basic testing of both the sensor and the board using the demo software provided. For that, I am using an Arduino Uno as the controller, and my host will be a Linux machine for all tests but one: the GridEYE sensor demo application can only run on Windows (and so be it!).
In the second and final part of the RoadTest, I will explore one of the possible applications of the kits that particularly caught my imagination: I will implement a very simplistic prototype of a safety monitoring system for a building automation application, manageable via a web dashboard. This system will monitor a well-defined area, where some hazardous device/machinery is supposed to be operating, aiming to detect human movement near the “dangerous” spot. If such movement is detected, an alarm will trigger some corrective actions (e.g. powering down/disconnecting the device), to make sure the hazard is removed and the safety of the human preserved. Using this system, I will also test the limits of the thermal sensor. The MAXREFDES131# board will provide the thermal sensing capability; the MAXREFDES130# will act both as gateway to the sensor and as actuator for the corrective actions. The shield will be connected to the Arduino Uno board, which will run the necessary firmware to manage both boards. The Arduino will be connected to a Raspberry Pi 2 node via a serial-over-USB connection. The latter will host a web server and the application logic, and will be connected to the internet via an RJ45 cable.
The two kits arrived in an element14 box, nicely protected in bubble wrap. Inside, each kit had its own box. The small black box was for the MAXREFDES131# and contained the sensor board and an RJ11 6P6C 2m cable, while the bigger box for the MAXREFDES130# contained the Building Automation Shield, another RJ11 6P6C 2m cable and the 24V power supply, complete with interchangeable plugs (EU, UK, USA, AU).
MAXREFDES131#: The board is based on the Panasonic AMG8833 GridEYE Infrared 8x8 Array Sensor (an I2C slave device). This sensor is the 3.3V high-performance, high-gain model and, according to the datasheet, should give a 0°C to 80°C measurement range, with an accuracy of ±2.5°C, a resolution of 0.25°C and a human detection distance of up to 7m. The view angle of the sensor is 60°, both vertically and horizontally. At the maximum detection distance, this view angle offers a whopping maximum coverage area of 49m² for human detection, which is rather good, especially for industrial applications, where high ceilings are common (the maximum coverage area is a 7m x 7m square, which can be fully exploited if the sensor is installed on a 6m-high ceiling, facing the floor). Nevertheless, even in non-industrial installations, like office blocks, shops or domestic areas, where the detection areas needed are typically smaller, the GridEYE still has plenty of uses.
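As a sanity check on those coverage numbers, the footprint of a ceiling-mounted sensor can be worked out from the 60° view angle alone. A minimal Python sketch (the helper name is mine, not from any Maxim/Panasonic library):

```python
import math

def floor_coverage(height_m, view_angle_deg=60.0):
    """Side length and area of the square floor patch seen by a
    downward-facing sensor mounted height_m above the floor."""
    half_angle = math.radians(view_angle_deg / 2.0)
    side = 2.0 * height_m * math.tan(half_angle)
    return side, side * side

side, area = floor_coverage(6.0)
print(f"{side:.2f} m x {side:.2f} m = {area:.1f} m2")  # ~6.93 m x 6.93 m = 48.0 m2
```

This matches the roughly 7m x 7m (49m²) figure quoted above once the side is rounded up to 7m.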
The sensor’s 64-pixel thermal image frame can be read at two capture rates: 1 frame/sec or 10 frames/sec. Internally, the capture rate is actually fixed at 10 frames/sec at all times, but when operating at 1 frame/sec the sensor output is calculated as the average of the 10 frames captured during the 1-second interval. In the 1 frame/sec mode, the noise affecting the image is also reduced to roughly one third. The rest of the board circuitry provides the 1-Wire slave to I2C master interface and some logic allowing the addressability of the sensor (multiple 1-Wire sensors can be chained on the same 1-Wire bus, leveraging the second RJ11 connector).
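That “one third” figure is consistent with plain frame averaging: averaging N frames of uncorrelated noise reduces it by a factor of √N, and 1/√10 ≈ 0.32. A quick simulation with made-up frame data (not the sensor API):

```python
import random
import statistics

random.seed(1)  # reproducible run

def noisy_frame(true_temp=25.0, sigma=0.25):
    """One simulated 8x8 frame: a flat scene plus Gaussian pixel noise."""
    return [random.gauss(true_temp, sigma) for _ in range(64)]

# Per-pixel average of 10 consecutive frames, as in the 1 frame/sec mode.
frames = [noisy_frame() for _ in range(10)]
averaged = [sum(f[i] for f in frames) / 10.0 for i in range(64)]

print(statistics.stdev(frames[0]))  # single-frame noise, around 0.25
print(statistics.stdev(averaged))   # roughly 0.25 / sqrt(10) ~ 0.08
```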
MAXREFDES130#: At the heart of this rather big Arduino-compatible shield, there is the versatile MAX11300 programmable mixed-signal I/O device (PIXI), which allows the board to deliver eight 0-10VDC analog outputs, one 4-20mA output, one 4-20mA input, eight non-latching relays (controlled via the MAX4822 8-channel driver) and three latching relays. There is also a real-time clock, provided by the DS3231 (which can optionally be backed by a CR1220 3V battery), and a 1-Wire master, provided by the DS2484. Both ICs are I2C slave devices, controllable by the Arduino host.
Looking at the back of the board, it is clear that the kit, although Arduino-compatible, has not been designed to be used as a shield for the Arduino Uno itself. In fact, when you plug in the Uno, the USB port shorts some of the latching relays' soldered pins. Good job I had some double-sided tape lying around: I just stuck some on top of the USB port and the problem was solved!
Time to look at testing the kits using the demos provided with the standard documentation. Looking up the information, I had no problem finding enough documentation for the MAXREFDES130#: the hardware is well illustrated, and the software APIs can be easily inferred by looking at the demo code and at the library code for the ICs mentioned earlier. Looking at the MAXREFDES131#, the feeling changes: as before, the hardware information is readily available, but for the software, only limited information can be inferred by inspecting the demo code and the OWGridEye library. In particular, the most interesting part of that library, the level 3 APIs, which deal with object detection and object tracking, is closed-source and completely undocumented. I believe this library is not provided by Maxim but comes directly from Panasonic. In any case, this is a serious shortcoming, considering that perhaps the strongest point and key market differentiator of the GridEYE sensor is, ironically, its ability to detect and track objects!
The first demo tested is the one for the GridEYE. The demo itself is simple but effective: it gives you an insight into what the sensor can do, without gimmicks. It would have been nice to have a couple more MAXREFDES131# boards and perhaps some longer patches of RJ11 cable (I managed to get hold of a 10m cable and used it for the tests), to verify the robustness of the 1-Wire sensor addressing function and of the sensor network itself. For the RoadTest, I'm using an Arduino Uno as the microcontroller, and to keep things as easy as possible, I'm using the Arduino IDE to build and upload the firmware. Following the instructions detailed in the Quick Start section of the online documentation, setting up the demo is quite straightforward: the only thing to pay attention to is not to install the required libraries by looking them up with the "Manage Libraries" functionality of the Arduino IDE, but rather to download them from GitHub as zip files and load them using the "Add .ZIP Library" functionality (the several OneWire libraries installable through "Manage Libraries", if used, lead to "OWGridEye.h:38:36: fatal error: Slaves/Bridges/Bridges.h: No such file or directory"). Once all the hardware has been plugged in and powered up and the firmware uploaded, all that is left to do is to plug in the USB cable from the Arduino to the Windows host and launch the MAXREFDES131_Demo.exe application.
Unlike the GridEYE demo, the demo for the Building Automation Shield requires a bit more work to set up. For the testing, I am using two digital multimeters, a breadboard, 8 green LEDs, 3 yellow LEDs, a 5V power supply and a few wires.
First, let's compile the demo firmware. Detailed information on setting up the demo is provided in the Quick Start section of the online documentation. To be able to compile the demo sketch, the following libraries need to be installed as .ZIP files (they can all be found on GitHub - links provided):
Once the freshly built firmware is uploaded, opening the serial monitor (with the speed set to 57600 baud) will start the demo. The first task is to calibrate the current source/sink, by connecting the current source port to the current sink port with the digital multimeter in series, to measure the current flowing. The calibration process needs to be repeated both for the lower current (4mA) and the higher one (20mA).
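In essence, this calibration is a two-point linear fit: the two measured currents pin down the slope and offset of the DAC-code-to-current relationship. A sketch of the idea in Python, with made-up codes and readings (this is not the MAX11300 driver API):

```python
def make_current_to_code(code_lo, i_lo_ma, code_hi, i_hi_ma):
    """Build a mapping from a target current (mA) to a DAC code, using
    the codes that produced the two measured calibration currents."""
    slope = (code_hi - code_lo) / (i_hi_ma - i_lo_ma)

    def current_to_code(i_ma):
        return round(code_lo + slope * (i_ma - i_lo_ma))

    return current_to_code

# Hypothetical calibration: code 800 measured 4.02 mA, code 3900 measured 19.97 mA.
to_code = make_current_to_code(800, 4.02, 3900, 19.97)
print(to_code(10.0))  # DAC code needed for a 10 mA output
```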
The video shows the result of a sequence of commands sent from the demo: setting some non-latching relays (the green LEDs), then setting all, unsetting all and, finally, setting/unsetting the latching relays (the yellow LEDs).
I will skip the test of the RTC and move on to testing one of the eight analog voltage output ports and the analog current output port. Setting port AO8 to 3.3V and port AIO to 10mA via the demo causes the digital multimeters to read the following from the ports (as shown in the photo):
This concludes the journey into the demo software distributed with the kits, and also the first part of this RoadTest. In general, my impression of the software is good: the code is sufficiently commented and easy to understand. It is just a pity that for such good kits, whose added value really comes from being driven by clever software (for example, think about all the logic behind automating controls), very little emphasis has been put on producing and documenting libraries that could provide building blocks and make the designer's job a little easier. Regarding the hardware, I'm very pleased with the way both kits have performed so far. The GridEYE sensor seems to fulfil its promises, although its limits will be explored in more depth during the execution of my project. The Automation Board is like a box of assorted chocolates: you can always find something you like in it! It really is a sort of Swiss Army knife for automation projects. The only downside is that, unless a project really uses all its features, it risks ending up seriously underutilised, which would give a bad return on the investment.
Reading the documentation and the marketing material about the GridEYE sensor, I was intrigued: such a small, relatively inexpensive device, with just 64-pixel resolution, could be used to detect and track human movements, and thus replace expensive and resource-intensive video cameras in disparate applications. Is it really that easy? To form a judgement on the matter, it looks like I will have to "suck it and see". And what better way than prototyping a very simple human detection system to gain that knowledge first hand? This is how the idea was born.
In its most basic form, the system needs to be able to monitor specific areas within its field of vision and detect human presence in those areas. If a human is detected, an alarm must be triggered and an action taken. Very simple, yet it identifies a pattern that can be applied to a number of scenarios in different contexts. The scenario I have chosen relates to safety in domestic/commercial/industrial environments, conceived as an extension of a building management solution.
Before the design work starts, one fundamental question needs to be answered: how do I recognise a human being using the GridEYE sensor, and what detection range should I expect to work within? To answer, I need to dig a little deeper into the sensor’s specifications and some basic thermography techniques. Generally speaking, there are two ways to detect an object using information about its temperature: the first relies on knowledge of the object’s absolute temperature (e.g. with normal clothing in a room at 15-20°C, mean skin temperature is 32-35°C); the second leverages knowledge of the environment’s temperature and detects the object by evaluating the temperature differential between the object and the environment itself. The choice of method is dictated by the measurement conditions.
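To make the differential method concrete, here is a minimal sketch (plain Python with hypothetical frame data, not the real driver API) that flags the pixels standing out from the ambient temperature:

```python
def detect_hot_pixels(frame, ambient_c, threshold_c=2.0):
    """Return (row, col) of pixels at least threshold_c above ambient.

    frame: 64 temperatures in row-major order, as read from the 8x8 array.
    """
    hits = []
    for i, temp in enumerate(frame):
        if temp - ambient_c >= threshold_c:
            hits.append((i // 8, i % 8))
    return hits

# Ambient at 21 C, with a warm "object" in the middle of the frame.
frame = [21.0] * 64
frame[27] = frame[28] = 24.5
print(detect_hot_pixels(frame, ambient_c=21.0))  # [(3, 3), (3, 4)]
```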
The GridEYE is an array of 64 thermopile sensors arranged in an 8x8 matrix, and Panasonic claims it can detect a human at 7m, based on a 4°C difference with the background temperature and a body size of 70cm x 25cm. The overall view angle of the sensor is 60 degrees horizontally and 60 degrees vertically, which gives each cell of the array a view angle of 7.5 degrees, both horizontally and vertically. This is a very valuable piece of information, as it lets us understand how much area is covered by each cell in relation to the distance.
Using some trigonometry, we can calculate the length B’B’’ covered by a single cell in relation to the distance, and hence the covered area. I have put this information in tabular form:
|Distance from the sensor|Length B’B’’ covered by a cell|Area covered by a cell|
|1m|0.13m|0.02m²|
|2m|0.26m|0.07m²|
|3m|0.39m|0.15m²|
|4m|0.52m|0.27m²|
|5m|0.66m|0.43m²|
|6m|0.79m|0.62m²|
|7m|0.92m|0.84m²|
Now, assuming a typical human body width of 50cm, covering an area of approx. 0.25m², from the table it seems reasonable to expect the sensor to be able to detect a human at a distance of 7m, with sensitivity rapidly falling once past the 5m mark. The GridEYE datasheet claims that at a 7m distance it can detect a human body with at least a 4°C differential with the background temperature.
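The figures in the table follow directly from the geometry: a cell subtending 7.5° covers a length of 2·d·tan(3.75°) at distance d. A short Python check (the helper name is mine):

```python
import math

CELL_HALF_ANGLE = math.radians(7.5 / 2.0)  # each cell spans 7.5 degrees

def cell_footprint(distance_m):
    """Side length (m) of the square patch covered by one cell at distance_m."""
    return 2.0 * distance_m * math.tan(CELL_HALF_ANGLE)

for d in range(1, 8):
    side = cell_footprint(d)
    print(f"{d} m: {side:.2f} m per cell ({side * side:.2f} m2)")
```

At 7m, a single cell already covers a patch over 0.9m wide, so a 50cm-wide body fills barely half a cell, which is consistent with the rapid sensitivity drop noted above.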
Let's now carry out some experiments to appreciate the sensitivity of the sensor and to verify to what extent the above claim holds. Unfortunately, I don’t have a low-noise room available, so to test the sensor’s limits I have chosen to measure my body temperature while lightly dressed and in a “noisy” environment (a narrow space with several walls and objects), repeating the measurements with the environment temperature ranging between 20°C and 28°C. The results are condensed into the following chart:
From the chart, I noticed the temperature differential drops below 2°C as early as 3m, and at the 7m mark the measurements only grant a differential of 1°C or less. This means that, despite the Panasonic datasheet, reliable detection of my body at distances above 5m, if possible at all, is going to be very tricky.
How do I justify the discrepancy between my measurements and the claim in the Panasonic datasheet? I believe the standard measurement carried out by Panasonic was taken in an environment with very little background noise (i.e. a big room, with walls and objects well beyond the sensor’s 7m detection range), and perhaps using an object resembling the human body shape, heated at a constant temperature. In my experiment the background noise plays an important role; moreover, my body is covered with some light clothing, which interferes with the heat radiation. Therefore, I’m assuming that in my experiment the detected object is represented mostly by the head and neck, which account for a 35cm x 25cm object, rather smaller than the one used by Panasonic (70cm x 25cm).
I can draw the following conclusions from the experiment: the reduced radiating surface of the human body, in a noisy environment, reduces the detection range. A human body immersed in such an environment shows a temperature differential with the background greater than 2°C (which makes the reading more reliable) only at distances below the 3m mark. At distances above 5m, the noise starts to become dominant and the readings unreliable. At lower temperatures, the sensor seems to perform better, which could be because skin temperature in a colder environment drops more slowly than the background noise. I hope to take further measurements in colder environments (<20°C) at some point in the future, to corroborate this.
Now that I have a better understanding of the problem, let's discuss how to realise a possible software application to do the monitoring. Hardware-wise, I have the following components available:
The UNO will communicate with the RPI2 via serial over USB. The RPI2 will be connected to the internet (either via RJ45 or Wi-Fi).
We will need three main software components to assemble the solution:
The firmware is developed in C using the Arduino IDE; all the other components are developed in Python. I have chosen Python purely because it is a language that lends itself very well to rapid prototyping. The same rationale drove the choice of communication protocol between the UNO and the RPI2, and between the RPI2 and the web browser: I used JSON (I have to say it gave me some headaches on the Arduino, because the ArduinoJson library can be very memory-hungry).
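To give an idea of the kind of traffic on the serial link, here is a minimal sketch of decoding one newline-delimited JSON message on the RPI2 side (the field names and message layout are my illustration, not the actual wire format used in the repository):

```python
import json

def parse_sensor_message(line):
    """Decode one JSON message from the UNO: an ambient reading plus
    the 64 pixel temperatures of one GridEYE frame."""
    msg = json.loads(line)
    pixels = msg["pixels"]
    if len(pixels) != 64:
        raise ValueError("expected a full 8x8 frame")
    return msg["thermistor"], pixels

# Example message, as it might arrive over the serial-over-USB link.
raw = '{"thermistor": 21.5, "pixels": ' + json.dumps([21.0] * 64) + '}'
ambient, pixels = parse_sensor_message(raw)
print(ambient, max(pixels))  # 21.5 21.0
```

On the UNO side, the matching serialisation is where ArduinoJson's memory hunger bites, since the whole document has to fit in the Uno's 2KB of SRAM.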
I will not dwell on the details of the implementation, as that would end up a lengthy (and boring) write-up, but please do feel free to get in touch if you are interested in the gory details. In any case, you can download and take a look at the whole code used for this prototype from my GitHub repository.
The video shows the prototype in action (by the way, in the video I wrongly stated that the absolute temperature set for detection in the demo was 25°C, but it was actually 24°C!). I’m planning to make another video, setting up a more “real-life” scenario, and I will add it as soon as I get around to recording it.
First of all, a couple of words about the hardware. The two boards are nicely designed and very useful, and their layout is clear and well organised. The MAXREFDES130# Building Automation Shield gets a score of 10 out of 10, as it is a very good board all round. Its price of about £130 is, I believe, justified if you use all the functionality it provides. The MAXREFDES131# GridEYE sensor, on the other hand, has provoked mixed reactions in me: on one side it is a great sensor, and the possibility of using the 1-Wire bus to daisy-chain multiple sensors is a definite plus, and it works really well within short-to-medium range (gesture recognition and small-room monitoring are probably its best applications). But, contrary to my first impression, I don't think it is well suited to industrial applications, where monitoring of larger areas might be required. The claimed 7m range seems a bit over-stretched to me: I don’t doubt the sensor can detect a human body at that distance, but the complexity of the processing needed to extrapolate reliable data would probably be overkill for the majority of real-world applications. I would have loved to test the Panasonic GridEYE library, which provides some advanced APIs for object detection and tracking (perhaps those APIs exploit the sensor’s full detection range?), but no source code has been provided, and the binary library distributed with the OWGridEYE package looks like it has been built for a 32-bit ARMv5 target platform, which I didn’t use. In any case, no documentation was available for the APIs. Therefore, the MAXREFDES131# can only score a 7 out of 10.
One final word of appreciation for the Arduino UNO: despite its limited processing power and small memory, and forgetting the few “hiccups” while prototyping the firmware, I have to say it performed very well, proving once again that this very cheap board is indeed an invaluable tool when it comes to rapid prototyping with microcontrollers.