I would like to present my final project!



I have created a smart range hood. It runs itself using a variety of sensors, and it has a local user interface as well as mobile integration. Building this device took many, many hours: I built the entire assembly from components - custom sheet metal, custom PCBs, a custom sensor-array enclosure to protect the electronics, and a touchscreen interface in a custom box to fit on the front of my range hood. I also wrote all the logic and programming over the course of this project.


This project actually completes our kitchen remodel as the final piece of the puzzle. We had a 40" range opening and wanted new appliances without paying exorbitant prices for fancy brands. Other parts of the remodel included a custom backsplash with embedded LEDs, a new stove, and new counters.


See this page for the complete build process.


Let's recap -

What did I use?


The main ingredient:

Raspberry Pi 3 Model B with NoIR camera module

- The brains of the operation





Other components:


A few bonus items I picked up for this project which gave a lot of extra polish -



Blog posts #1 and #2 outline the general ideas of what I was trying to achieve. The Cliffs Notes version: a smart device with direct local control. All logic is self-contained; however, it can be linked to other smart home systems. I wanted my wife to be able to text me when dinner is ready right from the stove, complete with a picture of that night's meal. The design also had to be robust enough for general, daily use. I've done enough "permanent breadboard" projects in the past and wanted to step things up a little.

I additionally had a requirement that whatever fancy stuff I do to this new range hood must be reversible for the eventual time when I move.


Local control was achieved through the touchscreen interface. The Raspberry Pi served very well as the main controller, doing much more than a simple microcontroller could: it handles all the I/O functions, serves a webpage for local control, and runs MQTT connectivity to link this system into OpenHAB, my existing smart home system. The PCBs and 3D prints hold all the components securely so the system is not fragile. I validated that the new range hood still works as a normal hood in blog post #8, and in this post I show all the new equipment up and running.
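As a sketch of the kind of MQTT status update this setup could publish for OpenHAB to pick up - the topic name and payload fields below are hypothetical illustrations, not my actual ones:

```javascript
// Hypothetical example of building a status message for the MQTT link.
// Topic and field names are made up; in Node-RED, a msg with "topic" and
// "payload" set like this can be passed straight to an mqtt-out node.
function buildStatusMessage(fanOn, maxTempF, gasDetected) {
    return {
        topic: "home/kitchen/rangehood/status",    // hypothetical topic
        payload: JSON.stringify({
            fan: fanOn ? "ON" : "OFF",
            maxTempF: maxTempF,
            gas: gasDetected
        })
    };
}

var statusMsg = buildStatusMessage(true, 212.5, false);
```

OpenHAB then only needs an MQTT item subscribed to that topic to mirror the hood's state.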


I was quoted $600 from Broan for a 39" basic "dumb" range hood to fit my 40.5" opening. The price jumps dramatically from there for anything nice in stainless steel. My goal was to spend less than $1000 for a truly custom hood. The sheet metal came to about $600 after tax and the donor hood was $75. Since my wife reads this blog, I won't be posting the final cost, but I think I came in under the target. I love you honey!!! I promise I'm almost done with this project!!! Thanks for supporting me!


Timeline / system planning

Blog post #2 described my general timeline for the project. It was as follows:

  1. Design; high level
    1. Final component selection; interface decisions
  2. Sheet metal design & fabrication
    1. This was a long-lead item and needed to be ordered early
  3. Component mock up
    1. This phase covered getting all the devices to work together and establish basic functionality on a breadboard
  4. Internal hood design
    1. This encompasses mechanical mounting of all components inside the existing hood
  5. PCB for interfacing electrical components
    1. This was really a stretch goal for me as I've never done a PCB before. I was keeping this one aside to see if I'd have time and it turned out I did
  6. Full implementation
    1. Final assembly
    2. Final Coding


Did I hit all of these in the correct order?

For the most part... I got the sheet metal complete in blog post #3, then went on to the component mock-up with a breadboard full of components. I spent quite a bit of time in this stage and ended up glossing over a lot of #4, internal hood design. I found just a week ago that I should have made my overall PCB smaller, since I had a significant space constraint inside the final hood. I did hit #5 for PCBs, and ended up ordering not just one, but two. Full implementation came in around blog posts #10, #11, and this one.

I did find that one difficulty with a completely brand-new project is that I was designing things around constraints that didn't fully exist yet - or at least that I hadn't spent enough time planning for.

For example - getting my custom PCB Pi hat to fit inside the box was a challenge, since I had made it so large thinking "I can put it anywhere," but it turned out I couldn't because of the length of the 40-pin GPIO cable I purchased. Designing the air baffle was also very challenging; it doesn't really mount anywhere - it just sits on the insert and falls out if you remove it. Additionally, the cabling coming into the baffle is kind of messy and should be re-evaluated. But those are all projects for another day...


Yes, but does it work?

I'm glad you asked! The very first time I got all the components installed for the final integration, I started taking images with the thermal camera to integrate the GridEye. While doing so, the exhaust fan automatically clicked on because the methane and LP sensors picked up small amounts of gas in the air. This surprised me, since I wasn't really expecting that much "actionable" data from all these sensors; but lo and behold, things actually worked. Once I integrated the GridEye sensor, I had loads of temperature data for the range hood and could easily run the fan whenever the stove was on.

So yes, it works very well! It works better than I imagined it would!


The touchscreen inside the 3D printed enclosure before I blow'd it up.


See the video at the end for the final product.


Stretch goal - Actual integration with the Grid Eye

I wasn't expecting to have time to do much more than just take nice pictures; but I was able to do some good integration.

I started with a base GridEye from Adafruit. Back in blog post #3 I mentioned that the base GridEye had a limited sensing range of 20-80 C. A commenter noted that a different version of the sensor has a higher sensing range. I was able to get the stand-alone surface-mount sensor from Newark, and I found a PCB layout available online to utilize it. I gave myself about a 20-30% chance of this all working, since I've never done surface-mount soldering before (although I've been around people who do it quite a bit). I ordered the PCBs piggy-backed with my other boards back in blog post #9.


I cut the boards apart when they arrived and went for it! Essentially, I had to prepare the solder pads for the sensor by melting solder onto them without the sensor in place. Once all the pads had solder, I placed the sensor and heated all the pads on one side by running the iron back and forth across them. Once the solder turned molten, the sensor 'sank' down into place. Then I quickly repeated the process on the second side. I did have to slide the sensor around a bit to get correct alignment, and I took it off a few times to try again. I was worried that I may have sunk too much heat into the sensor at one point, since it got very hot. The datasheet notes exactly how much heat is 'safe,' and I may have exceeded it in my rush to get this done.

Getting ready to solder the sensor on the board - wish me luck!


Back of the board with SMD resistors and caps. The address bit is the little solder pad set at bottom center. This board has a 0.1" spacing header and a 2 mm spacing JST connector.


I threw a header on the PCB along with the rest of the little 0805 resistors and caps and crossed my fingers. The first power-up didn't work, and I realized I had crimped one wire on the jacket, missing the copper core. I eventually got it to respond and could see it on the I2C bus, but the Adafruit library was failing to pull images from it. After some further investigation, I realized the library couldn't talk to it properly because the new GridEye was addressed at 0x68 instead of 0x69, which the Adafruit board uses. I switched the solder pads to the other address and voilà! Images!


I then just updated my Air Baffle mount for the new hole pattern and went ahead with integration.

Old and new cameras with mounts for the air baffle


I use the Adafruit library to take images and display them in the Node-RED dashboard.

Reading the temperature data was a little more involved. The GridEye has 64 individual sensors ("pixels"), each of which reads out a two-byte temperature measurement. That means 128 bytes of data; however, I2C reads on the Pi are limited to 32 bytes at a time. I tried reading all 128 at once and the Pi basically just locked up. I decided the next best way was to read 32 bytes of data at a time and save them to a global array. Then I can just use the complete global array from within Node-RED to get my data.
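A rough sketch of how four 32-byte reads could be stitched back into one 64-pixel frame - this is an illustration, not my exact flow, and it assumes the chunks arrive as plain byte arrays with the low byte first:

```javascript
// Combine four 32-byte I2C reads into a single 64-entry array of raw
// sensor words. Each pixel is two bytes, low byte first, so each 32-byte
// chunk carries 16 pixels.
function mergeChunks(chunks) {
    var pixels = [];
    for (var c = 0; c < chunks.length; c++) {
        for (var n = 0; n < 32; n += 2) {
            pixels.push((chunks[c][n + 1] << 8) | chunks[c][n]);
        }
    }
    return pixels;   // 64 raw pixel readings
}
```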

Translating the bytes into degrees F required a few steps. I first read in an array of 32 bytes, representing 16 "pixels". To combine each pair of bytes into a single number, I bitshift the high byte 8 bits to the left and combine the two with a bitwise OR. The value uses "two's complement," which means the highest bit in the sequence is the positive/negative indicator. The sensor has a resolution of 0.25 degrees C (and an accuracy rating of +/- 3 degrees C), so I multiply the raw count by 0.25 to obtain degrees C. Then that is converted into F and saved out to the global array.
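A quick way to sanity-check that math is a tiny helper for a single pixel, assuming 12-bit two's-complement readings as described in the datasheet:

```javascript
// Convert one pixel's byte pair to degrees F.
// The raw value is 12-bit two's complement at 0.25 C per count.
function pixelToF(lo, hi) {
    var raw = (hi << 8) | lo;
    if (raw & 0x800) raw -= 0x1000;   // sign-extend negative readings
    var c = raw * 0.25;               // counts to degrees C
    return c * 9 / 5 + 32;            // degrees C to degrees F
}

pixelToF(0x64, 0x00);   // 100 counts = 25 C = 77 F
pixelToF(0xFF, 0x0F);   // -1 count = -0.25 C = 31.55 F
```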


Read 32 bytes; combine each byte pair, convert to C, then to F. ("Get Temps full row")

var gridEye = [];
var loop = 0;

for (var n = 0; n < 32; n += 2) {
    // combine low and high bytes; reading is 12-bit two's complement
    var raw = (msg.payload[n+1] << 8) | msg.payload[n];
    if (raw & 0x800) raw -= 0x1000;   // sign-extend negative readings
    gridEye[loop] = raw * 0.25;       // 0.25 C per count
    //convert to F
    gridEye[loop] = ((9.0/5.0) * gridEye[loop] + 32.0).toFixed(2);
    loop++;
}

msg.payload = gridEye;
return msg;


Then set global variables:

global.set("RangeTemperature", []);

for (var i = 0; i < 16; i++) {
    global.set("RangeTemperature[" + i + "]", msg.payload[i]);
}

return msg;



And here is the Node-Red flow.


Once I have the global array, the rest becomes trivial. I calculate an average temperature, find the MAX, and write them out. I also calculate a temperature for each individual burner by averaging certain pixels based on their location under the camera, as follows -


Note - my sensor orientation is such that the left side of the grid above is the front edge of the stove and the right edge is the rear of the stove.



Here is how I read the temperature in the left-front zone on the stove:

var avg = 0;

// parseFloat keeps the decimal portion of the stored temperature strings
avg += parseFloat(global.get("RangeTemperature[" + 2 + "]"));
avg += parseFloat(global.get("RangeTemperature[" + 3 + "]"));
avg += parseFloat(global.get("RangeTemperature[" + 4 + "]"));
avg += parseFloat(global.get("RangeTemperature[" + 10 + "]"));
avg += parseFloat(global.get("RangeTemperature[" + 11 + "]"));
avg += parseFloat(global.get("RangeTemperature[" + 12 + "]"));
avg += parseFloat(global.get("RangeTemperature[" + 18 + "]"));
avg += parseFloat(global.get("RangeTemperature[" + 19 + "]"));
avg += parseFloat(global.get("RangeTemperature[" + 20 + "]"));

avg = (avg / 9).toFixed(2);

return {payload: avg, topic: "LF"};


- Just average all the pixels in the array which cover that section of the stove top.



Each zone is passed to a graph on the UI. The Max temperature is saved to a global variable and the main fan controller function block looks for high temperatures to know when to run the fan.
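The fan decision itself can be as simple as a threshold with a little hysteresis, so it doesn't flicker on and off around one temperature. A minimal sketch - the threshold values below are made up, not my actual tuning:

```javascript
// Fan on/off decision with hysteresis. ON_TEMP_F and OFF_TEMP_F are
// hypothetical; real values depend on the stove and sensor placement.
var ON_TEMP_F  = 120;   // turn the fan on above this max pixel temperature
var OFF_TEMP_F = 100;   // turn it back off only once below this

function fanShouldRun(maxTempF, fanCurrentlyOn) {
    if (maxTempF >= ON_TEMP_F)  return true;
    if (maxTempF <= OFF_TEMP_F) return false;
    return fanCurrentlyOn;      // inside the dead band, hold current state
}
```

The gap between the two thresholds is what keeps the fan from cycling while a pot simmers right around one setpoint.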


And finally, presenting the final product!!!



(Note: a new touchscreen will be installed on Friday when I get back from my trip; but the UI is the same as the mobile view)


Good luck to all competitors! This has been a great experience, and a huge thanks to Newark and to my wife!!