I'm a retired EE with 40+ years of experience, primarily in semiconductor test equipment design, in both hardware and software.  When I was a graduate student working at the University of Hawaii, we used seismograph readings to alert us to earthquake activity to correlate with a tsunami early warning system that we were proposing.  So, I've been interested in vibration sensors for a while.


When I saw the Experimenting with Vibration Sensors Challenge, it reminded me of an article in Scientific American about using footstep sensors to identify people by their gait.  Much of the work that I've seen in human presence detection using footsteps or gait has used other types of sensors (pressure or audio).  I thought that it would be an interesting experiment to see if it would work with the Kemet vibration sensor used in this challenge.


Problem Description

I thought that it would be a fun project to try to determine who is present in a room using the vibration pattern from their footsteps.  Of course, because of the pandemic restrictions, this will be limited to identifying me, my wife, and the dog.


Proposed Solution

I've recently been playing with TinyML using the Edge Impulse API and I'm going to use that to train and deploy a model (impulse) to do the person classification.  I originally intended to use the Nucleo-H743ZI board and the Bluetooth shield that are included in the challenge kit, but I'm having second thoughts.  The Nucleo-H743ZI is an extremely capable board, but I'm a little wary about the learning curve with a new board in the timeframe of this challenge (the board uses a Cortex-M7 core, which I haven't used before).  I am probably going to use either an Arduino Nano 33 BLE Sense or a Wio Terminal, as I've used both of these boards with Edge Impulse.  I expect that data collection and formatting will be the long poles for me in this challenge, and I'd like to focus on that rather than learning a new board set.  I'm sure the boards will be useful in the future.


Experiments Planned

I'm not sure how different the Kemet VS vibration sensor performance is relative to the MEMS accelerometers that you find everywhere.  Given the large cost differential, I assume the Kemet sensor should have much higher performance.  A lot of my early experiments will be to characterize the sensor in my use case.  The Kemet sensor has an analog signal output, so that will require using an ADC to sample the data, and that whole process could cause performance degradation.


The Kemet spec illustrates the application range for the sensor.


We'll be using the VS-BV203-B, so the measurement range is +/-50 m/s^2, or about +/-5G.  I'm not quite sure what vibration range I'll see, but I'm guessing that I may see less than 20 mV of signal, which probably means that I'll need to add some amplification before the ADC.
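To get a rough idea of how much gain I might need, here's a quick back-of-the-envelope calculation in Python.  The 3.3 V reference, 12-bit resolution, and 20 mV signal level are my assumptions, not datasheet values (both the Nano 33 BLE Sense and the Wio Terminal can be configured for 12-bit reads, though the Arduino default is 10-bit):

```python
# Rough sizing of the amplifier gain needed in front of the ADC.
# Assumptions (not from the Kemet datasheet): 3.3 V ADC reference,
# 12-bit resolution, and a ~20 mV peak-to-peak sensor signal.

V_REF = 3.3          # ADC reference voltage (V), assumed
ADC_BITS = 12        # assumed resolution (Arduino default is 10-bit)
V_SIGNAL = 0.020     # estimated peak-to-peak sensor output (V)

lsb = V_REF / (2 ** ADC_BITS)      # size of one ADC step (V)
codes_raw = V_SIGNAL / lsb         # codes spanned without a gain stage

# Target: use most of the ADC range, leaving ~20% headroom.
target_span = 0.8 * V_REF
gain = target_span / V_SIGNAL

print(f"LSB: {lsb * 1000:.2f} mV")
print(f"Raw signal spans only ~{codes_raw:.0f} of {2 ** ADC_BITS} codes")
print(f"Suggested amplifier gain: ~{gain:.0f}x")
```

An unamplified 20 mV signal would only exercise a couple dozen ADC codes, so even this crude estimate says an amplifier stage is likely needed.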


I was surprised at some omissions in the spec.  It is clear that the sensor needs to be oriented so its Z axis is in the direction of the vibration, but the spec doesn't indicate which axis is Z.  I'm going to guess from the picture of the sensor housing that the axis is perpendicular to the flat side and that positive is toward the labeled side.  This should be easy enough to check once I get the sensor, hopefully tomorrow.


The other omission is that they don't specify the connector type on the end of the sensor cable.  Worst case, I'll just cut it off and add my own connector.


Use case

I have a raised foundation with a crawl space underneath.  I plan to do the measurements on the first floor, where I have an oak hardwood floor over a wooden subfloor.  I'd rather not attach the sensor directly to the floor.  I think that attaching it to a weighted base will give me sufficient coupling, but that's something to check.  If that works, it would allow me to easily reposition the sensor to determine an optimal location.


List of experiments and tasks

  1. Determine Z axis orientation
  2. Determine connector type and order mating connector or replace
  3. Do simple sensitivity experiments (may lead to amplifier design)
  4. Build sensor mount for weighted base (3D print)
  5. Measure sensitivity on weighted base (try direct attach if this doesn't work)
  6. Set up WiFi or Bluetooth to remotely collect data samples
  7. Determine length of data window (proxy for number of steps)
  8. Collect labeled and noise data (need to trim samples to length of data window)
  9. Upload data to Edge Impulse and determine whether classification is possible (switch to a detection project if classification is not feasible)
  10. If possible build impulse and deploy to hardware (use Arduino library method)
  11. Test classification


Best case, this will all work.  Worst case, it will be a learning experience and I'll have fun, which is hard to come by nowadays.