Product Performed to Expectations: 9
Specifications were sufficient to design with: 9
Demo Software was of good quality: 8
Product was easy to use: 8
Support materials were available: 10
The price to performance ratio was good: 9
Total Score: 53 / 60
The Grid-EYE is a very neat sensor array that can be used to sense infra-red heat. This is useful for tasks such as detecting humans. It differs from the PIR sensors used in home alarms; those PIR sensors rely on heat movement in order to detect presence. The Grid-EYE will continuously sense the presence of IR radiated energy even if there is no movement of the source.
I was keen to RoadTest the Grid-EYE because such a sensor has many real-world use-cases such as security, interactive advertising, gesture control and so on. Check out the short (5-minute) video here for a brief overview and to see the experiments I tried with the Grid-EYE. (Note: I had an earlier evaluation kit, but this is now obsolete and a compatible but higher-performance version is available now).
The Grid-EYE evaluation kit consists of the Grid-EYE sensor on a PCB together with a microcontroller and a Bluetooth Smart (Bluetooth LE) module. There are several ways to work with this evaluation kit – either using Bluetooth Smart, or connecting directly to the board using USB, or by using the Arduino-compatible headers.
Internally the device consists of an array of thermopiles; one per pixel. The physics behind an example thermopile are outlined here. The implementation within the Grid-EYE will be different of course (see further below). But from a physics point of view it can be interesting to see how thermopiles function.
They consist of many thermocouple junctions all chained together in series. The hot junctions are all connected to a thin, small disk that doesn’t have much thermal mass; the thin disk can be considered to be the pixel. The cold junctions are connected to a much larger thermal mass ring. When energy hits the disk, the heating effect creates a voltage due to the temperature difference between the disk and the outer ring. The summed voltage from all the thermocouples is the sensor output. A separate temperature sensor is used to determine the temperature of the outer ring, i.e. the temperature of the cold junctions.
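To make the relationship concrete, here is a tiny sketch of the idea. The junction count and Seebeck coefficient below are illustrative numbers only, not Grid-EYE specifications:

```javascript
// Illustrative model of a thermopile's open-circuit output voltage:
// V ~= n * S * (Thot - Tcold), where n is the number of series
// thermocouple junctions and S is the per-junction Seebeck coefficient.
// The figures used below are hypothetical, not from any datasheet.
function thermopileVoltage(nJunctions, seebeckVPerK, tHotC, tColdC) {
  return nJunctions * seebeckVPerK * (tHotC - tColdC);
}

// Example: 100 junctions, 50 uV/K each, disk warmed 2 degC above the ring
const v = thermopileVoltage(100, 50e-6, 27, 25);
console.log((v * 1e3).toFixed(2) + " mV"); // prints "10.00 mV"
```

Even a couple of degrees of temperature difference on the disk only yields millivolts, which is why the on-chip amplifier stage matters.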
The Grid-EYE consists of 64 thermopiles implemented on silicon in an array, followed by a selector, an amplifier and an analogue-to-digital converter. The output can be read via an I2C interface. There is a small 2.6mm square window with a lens (possibly germanium; it is opaque to visible light) to admit infra-red energy within a 60-degree viewing angle.
In the case of the Grid-EYE the silicon thermopiles are constructed using micromachining techniques; the gap between the pixel area and the surrounding substrate is micro-machined, and the elements of the thermocouples are possibly a metal layer and a silicon layer (there are various methods according to Google; some using polysilicon, others using crystalline silicon).
In theory the applications are vast. The ability to detect humans and their behaviour means that air conditioning and lighting can make better automated decisions so there is potential to save energy in homes and offices. There are security applications too; unlike a passive infra-red (PIR) sensor the Grid-EYE can detect humans even if there is no motion. Data from the Grid-EYE can be processed to extract presence and motion and control security cameras or alert users. There are likely to be healthcare and Internet of Things (IoT) applications too; it is an ideal sensor where users might not use wearable devices. The sensor can signal whereabouts an individual is inside a room, and from this data valuable intelligence about the user’s situation may be extracted. It could have uses for monitoring patients in homes or hospitals as a result.
Beyond typical use-cases like automatic lighting switches in homes or offices, televisions could use such sensors to automatically switch off if no-one is watching. Inside a vehicle the sensor could be used to identify if air needs to be directed to a passenger. The Grid-EYE is very low cost compared to off-the-shelf thermal imaging systems and therefore could also play a role in helping people determine if parts of their home may need better insulation, or for gesture control for consumer devices.
There are actually not many thermopile array sensors available at low cost; not many manufacturers make them. One I have used in the past is from Melexis. It was straightforward to use but suffered from a few issues: firstly, many distributors do not stock it and it proved difficult to acquire a few samples for zero cost (often it is hard to purchase items quickly in a business without a lot of approvals, so samples can be critical); secondly, it is quite a large sensor in comparison to the Panasonic one. The Melexis one was most likely aimed towards in-car use since the array had an unusual shape (16x4 pixels), while the Panasonic one has a square array (8x8 pixels). The Panasonic one appears to be geared towards mass market use-cases such as air conditioning or IoT sensors and is significantly lower cost (around half the price in small quantities).
Aside from thermopile arrays, individual single thermopiles are available from quite a few sources including Texas Instruments, Amphenol Sensors and Measurement Specialities. The single sensors are useful in devices such as patient monitoring, in-ear temperature sensors and other remote temperature sensing purposes.
There is a wealth of code available for the evaluation kit on the Panasonic website. The demo code for the USB interface connection option uses NI LabVIEW, but after briefly examining it I decided to write my own code to access the data from the Grid-EYE and have a portable option (so it can run on any machine, including the Raspberry Pi). The LabVIEW code is merely a convenience to help evaluate the product, so it’s not essential anyway.
In a real scenario one would directly connect to the Grid-EYE module of course; it has an I2C interface. I decided to use Bluetooth Smart with the evaluation board for reasons described further below, however the board does allow direct I2C interfacing to Grid-EYE by moving some jumpers and using the Arduino compatible header pin locations (header pins would need to be soldered). The board is quite flexible in this respect. Source code for an Arduino DUE is available and it would be easy to port to other microcontrollers too.
Since there are 64 sensors in an 8x8 array inside the Grid-EYE module, this conveniently matched the Raspberry Pi Sense HAT’s 8x8 array of LEDs. It seemed like it would be a fun thing to connect these two devices using the Pi 3’s built-in Bluetooth Smart capability. The exercise would involve learning what format the data from the Grid-EYE arrives in and what transformations are needed to map it to a temperature. An evening of coding resulted in a nicely running system using a Pi to connect to the Grid-EYE board over Bluetooth Smart and display the data in two locations: in a web browser and on the Pi’s attached 8x8 LED display. See the Building a Thermal Imaging System with the Raspberry Pi 3, Sense HAT and Panasonic Grid-EYE blog post for more details.
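As a sketch of the temperature transformation, assuming each pixel arrives as a 12-bit two’s-complement value with a resolution of 0.25°C per count, low byte first (this matches my reading of the Grid-EYE data format, but check the current datasheet):

```javascript
// Sketch: convert one Grid-EYE pixel reading into degrees Celsius.
// Assumption: the pixel is a 12-bit two's-complement count, 0.25 degC
// per count, transmitted low byte first.
function pixelToCelsius(lowByte, highByte) {
  let raw = ((highByte & 0x0f) << 8) | lowByte; // assemble 12-bit value
  if (raw & 0x800) raw -= 0x1000;               // sign-extend negatives
  return raw * 0.25;                            // scale to degC
}

console.log(pixelToCelsius(0x64, 0x00)); // 100 counts -> prints 25
console.log(pixelToCelsius(0xfc, 0x0f)); // -4 counts  -> prints -1
```

A full frame is then just 64 of these pixel values in sequence, which maps naturally onto the Sense HAT’s 8x8 LED grid.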
Whoever decided to populate the Grid-EYE Evaluation Board with Bluetooth Smart capability made a great decision, because it is highly useful to access the data in this way! The Grid-EYE can be positioned anywhere in a room to experiment with positioning, and there is no need to worry about running cables up walls. I was able to use Blu-Tack (temporary putty) to stick it to a wall and then walk with my mobile phone (connected to the wireless LAN) and observe the image stream from the Pi. This was a very quick way of getting a good feel for the Grid-EYE’s capabilities.
The other advantage of this method is that one can easily code (in any language of choice) filters or algorithms for processing the data in real time on the Pi, before sending it to the browser running on the mobile phone. In my case the processing mapped the data into a ‘heat map’ style colour sequence (black/dull blue = cold, through red, to bright yellow = hot, for example) and also ran ‘bilinear interpolation’ on the 8x8 array of data to produce a smoother, higher-resolution image. The heat map colour sequence uses an algorithm I’ve used several times in the past, ported to Node.js this time.
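A minimal sketch of both processing steps follows; the colour ramp here is a simplified black-to-red-to-yellow stand-in for my actual heat-map algorithm:

```javascript
// Sketch: bilinearly interpolate a square frame (e.g. 8x8 temperatures,
// stored row-major in a flat array) up to a larger square output size.
function interpolate(frame, srcSize, dstSize) {
  const out = [];
  const scale = (srcSize - 1) / (dstSize - 1);
  for (let y = 0; y < dstSize; y++) {
    for (let x = 0; x < dstSize; x++) {
      const fx = x * scale, fy = y * scale;      // source-space position
      const x0 = Math.floor(fx), y0 = Math.floor(fy);
      const x1 = Math.min(x0 + 1, srcSize - 1);
      const y1 = Math.min(y0 + 1, srcSize - 1);
      const dx = fx - x0, dy = fy - y0;          // fractional offsets
      const top = frame[y0 * srcSize + x0] * (1 - dx) + frame[y0 * srcSize + x1] * dx;
      const bot = frame[y1 * srcSize + x0] * (1 - dx) + frame[y1 * srcSize + x1] * dx;
      out.push(top * (1 - dy) + bot * dy);       // blend the two rows
    }
  }
  return out;
}

// Sketch: map a temperature onto a black -> red -> yellow ramp.
function heatColour(t, tMin, tMax) {
  const f = Math.max(0, Math.min(1, (t - tMin) / (tMax - tMin)));
  const r = Math.round(255 * Math.min(1, 2 * f)); // ramp red up first
  const g = Math.round(255 * Math.max(0, 2 * f - 1)); // then add green
  return [r, g, 0]; // [R, G, B]
}
```

Interpolating the 8x8 frame up to, say, 64x64 before colouring each output pixel gives a much smoother-looking image than colouring the raw pixels directly.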
The final code doesn’t need a Pi; a Bluetooth 4.0 USB adapter can be plugged into a PC and the software run. It will display the video from Grid-EYE in real time in a web browser.
Here are the instructions to install and run this new software. To install it on a PC, type the following at a Linux command prompt (I used Ubuntu running on x86 hardware since that is something virtually everyone can install and run quickly in a virtual machine for free regardless of the desktop OS used):
As root user:
apt-get install libbluetooth-dev
Then as a normal user:
git clone https://github.com/shabaz123/grid-eye
cd grid-eye
npm install noble
npm install socket.io
npm install imagejs
The grid-eye/index_pc.js file will need a modification: search for the line containing the text progpath and set it to point to the grid-eye folder. The slash at the end is important.
That’s it! To run, ensure a Bluetooth Smart USB adapter is plugged in and then as root user type:
Use a browser (on your PC or mobile phone, for example) to navigate to http://xx.xx.xx.xx:8081/index.html (substituting the PC’s IP address) and a square window for video will appear. At any stage, power up the Grid-EYE board (using a 5V USB supply) and the video will immediately start streaming to the browser.
For the Bluetooth Smart USB adapter I tested with one that came up as a ‘Cambridge Silicon Radio’ chipset device (type lsusb in the Linux shell to see this). Another adapter was also tried and worked fine (lsusb indicates it to be a 'TDK Corp.' chipset).
The code is experimental grade so be prepared to tweak things if different temperature ranges or colours are needed for instance.
Since I’d already briefly tested the Grid-EYE in the earlier Raspberry Pi Thermal Imaging blog post, this time I focussed on human detection use-cases. The first test was a dynamic one, to see if the Grid-EYE could successfully detect me if I ran past the sensor. I fired up the software and a web browser, and the Grid-EYE captured the action; it worked very well (see the earlier video to see it properly in action). For this test I mounted the Grid-EYE evaluation board at waist height and walked or ran past about 1.5 metres away, to simulate a corridor or walkway.
To make best use of the sensor array it would be important to process the data; the raw data is just temperature values. I implemented some very simple processing for now. With more effort it would be possible to write code that could identify which direction a person walked or ran by comparing the output frames over time.
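One simple way to sketch such direction detection is to track the temperature-weighted centroid of warm pixels between frames; the threshold value here is illustrative, not tuned:

```javascript
// Sketch: find the temperature-weighted horizontal centroid of pixels
// warmer than a threshold (a crude "where is the person" estimate).
function warmCentroidX(frame, size, threshold) {
  let sum = 0, weight = 0;
  for (let y = 0; y < size; y++) {
    for (let x = 0; x < size; x++) {
      const t = frame[y * size + x];
      if (t > threshold) { sum += x * t; weight += t; }
    }
  }
  return weight > 0 ? sum / weight : null; // null: no warm object in view
}

// Compare centroids across two frames: increasing x means moving right.
function direction(prevFrame, currFrame, size, threshold) {
  const a = warmCentroidX(prevFrame, size, threshold);
  const b = warmCentroidX(currFrame, size, threshold);
  if (a === null || b === null) return "none";
  return b > a ? "right" : b < a ? "left" : "still";
}
```

In practice a little smoothing over several frames would be needed, but even this crude comparison distinguishes left-to-right from right-to-left movement past the sensor.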
Another great application came from Dave Ingles (beacon_dave) in one of the comments on the previous blog post; he wondered if it could identify where people were located so that cameras or microphones could be directed appropriately.
I set up the Grid-EYE sensor to point at a seating area where five people could potentially be located.
I then captured video with me sitting at each of those locations. The Grid-EYE successfully provided a different fingerprint for each location. It would be straightforward to sample (say) five frames, do some pattern matching and return a result indicating with extremely high probability where I’d been seated, all within half a second.
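That pattern matching could be sketched as a nearest-template classifier over stored per-seat reference frames; the sum-of-squared-differences metric below is my own choice for illustration, not anything from the shipped demo code:

```javascript
// Sketch: classify which seat is occupied by comparing a captured frame
// against stored per-seat reference frames ("fingerprints"); the
// template with the smallest sum of squared differences wins.
function closestSeat(frame, templates) {
  let best = -1, bestScore = Infinity;
  templates.forEach((tpl, i) => {
    let score = 0;
    for (let k = 0; k < frame.length; k++) {
      const d = frame[k] - tpl[k];
      score += d * d; // accumulate squared per-pixel difference
    }
    if (score < bestScore) { bestScore = score; best = i; }
  });
  return best; // index of the best-matching seat template
}
```

Averaging the five sampled frames before calling this would suppress single-frame noise, and thresholding the winning score would let the classifier report "nobody seated" as well.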
The Panasonic Grid-EYE functioned well and it was exciting to see the possibilities that exist with such a sensor. I liked that it was compact, low power and extremely quick to get started with, because the evaluation board is so well done. It was an extremely good idea for the board to implement Bluetooth Smart. I was very impressed by how easy it was to sense one of five locations in a fairly small area during the occupancy tests, something that would ordinarily be difficult (and more power-hungry) with a video camera solution.