Vision Thing


PREFACE

I was happy to be provided a BeagleBone AI to experiment with for this project.  So, my son and I scratched our heads hard on what we could do that would be intriguing to a reader while testing nearly all the functionality of the BBAI.  We wanted to hit all the marks:  GPIO Input/Output, PWM, I2C, Visual Classification with TIDL, OpenCV for a GUI, Ad Hoc Streaming, and Audio.  We also thought about how an educator could use it as a classroom project (dig out those Matchbox cars!).  So, we dreamed up the Seeing Around Corners/Talking BBAI Backup Camera.  It's not intended to be a commercial product, but to stretch our imaginations and the limits of our current knowledge so we can grow further.  It led to more than 80 hours of research, design, coding, and documentation as we went down a rabbit hole to understand the BBAI architecture.  We hope you enjoy it and gain some nuggets of your own for this great platform.

-Sean and Connor Miller

 

PROJECT INTRODUCTION

 

Even with today's 170-degree rear cameras and backup ultrasonic sensors, parking lot fender benders still occur.  This is because the people behind you don't leave enough room for you to back into a spot.  If you inch past it, they are likely to take the spot out from under you or at least ride your bumper so that you can't back up.  So, you are forced to pull straight into the spot.

 

The dilemma comes when you are ready to leave and need to back out into the steady flow of mall dwellers like yourself.  No matter how much you rubberneck, you can't see to either side because of the big vehicles around you.  Even with the wide-angle lenses on modern cars, the field of view doesn't reach far enough down the aisle to show what is coming.  If you're lucky, you have someone in the back seat who can say "Clear" or "Woa!"  If not, some texting oncoming driver could clip you in the blink of an eye.  Often, we nudge out, wait a second, then nudge again, and finally proceed on hope.

 

Backing Up on a Wing and a Prayer

 

In this project, we are going to use the new, hot-off-the-shelf BeagleBone AI's advanced vision and frame manipulation hardware to eliminate the problem altogether.  We'll design a backup camera that mounts under our trailer hitch.

 

It won't be your average, everyday backup camera, though.  It is a Visual AI-driven back seat partner!  Look at these features:

  • Turret controlled from the driver's seat, allowing you to peek around corners and cars to see what is coming
  • Streamed Video to your phone
  • Display Backup Reference Lines
  • Realtime Turret Rotation Angle Indication on the Display
  • Realtime Object Distance on the Display
  • Realtime Object Recognition (vehicles, people, pets) on the Display
  • Realtime Human Voiced Shout Outs - "Woa!", "Car behind us", "Car on the left", "Car coming on the right", "Wampa!"  (just kidding about the Wampa - that's for future releases)
  • Remote viewing to keep an eye on your car's surroundings.

Let's get to it!

 


BeagleBone AI Board Combined with our Custom Turret Camera

 

BILL OF MATERIALS

BeagleBone AI

TFMini Infrared Time of Flight Sensor

5 kΩ Potentiometer

Micro Servo

$9 Web Cam

USB to Audio Cable

3D Printed Parts:  Github

Mobile Smart Phone

 

DESIGN

This backup camera is unique in that it is a turret camera controlled by a potentiometer, with smart Vision AI processing by the BeagleBone AI board.  Custom 3D-printed components were designed to fit the trailer hitch on our Jeep.

 

Here are the key components:

1. Weather Resistant Case

2. 3D Printed Servo Arm

3. TFMini Plus

4. Precision Bearing

5. Micro Servo for Precise Targeting

6. Camera Bracket

7. Camera

 

Exploded View of the Backup Camera

 

REVERSE ENGINEERING THE BEAGLEBONE AI

After designing this project with the brand new BeagleBone AI in mind, we hit a snag: although the board is available, its GPIO-controlling libraries aren't!  Its popular "bonescript" nodejs library isn't working for GPIO yet.  The Adafruit Python library doesn't work yet.  BeagleBone Black Device Tree Source files brick the bootup process.  No analog reads or writes are possible per the literature, and the utilities that show pins give funky pin IDs.

 

Project Learning:  At first, discovering there were no high-level libraries to get the GPIO pins working, I thought maybe I'd use another microcontroller instead.  But I thought I'd first post to the community:  Accessing GPIO and PWM on BeagleBone AI.  When I did, I quickly got a lot of leads and references to chase down on Github, which ultimately led me to a chat room with folks who used the BeagleBone at their work.  They bestowed their device tree source mojo, enabling the GPIO on the BBAI.  After that, I had digital/analog read/write, PWM, and I2C communication!  I spent the next two weeks probing and writing C code to sort it all out.  Now, the BBAI is my favorite embedded solution in its form factor!

 

To keep all that research from going to waste, I posted my discoveries as a companion blog under the contest Vision Thing.  This let me avoid cluttering this one up while helping the rest of the e14 community and contestants get their ideas off the ground.  It has a fully functional device tree source file for the BBAI and a collection of tested, working code examples in C, javascript, and Python:

BeagleBone AI Survival Guide V3.8: PWM, I2C, Analog/Digital Read/Write, Vision AI, Video Text Overlays, Audio, & Hardware

 

In order to use my code for this project, you'll need to first follow the BBAI Setup Checklist at the beginning of that blog, which I won't bother repeating here.  My goal was to eliminate the need for a supplemental microcontroller in our design, and the research paid off.

 

THE CODE

My software approach is to have three services running upon bootup of the BeagleBone AI (which will be embedded in my vehicle).  They are as follows:

  • one service handles control of the camera turret
  • one logs the current distance from an object behind the vehicle
  • the third handles the visual AI, including both object classification and the text/graphic overlays on the video display.  It is also where the intelligence lives for the audio warning shoutouts from the back seat ("Woa!", "Car on your left", etc.)

 

I took the multiple-service approach so I can develop the code for each in a modular fashion and readily reuse the resulting services in other projects.
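To run each program as a boot-time service, one straightforward route on the stock Debian image is a systemd unit per program.  Here is a minimal sketch for the turret service; the unit name and binary path are just examples and need to match wherever you build the executables:

# /etc/systemd/system/bbai-turret.service  (example name and path)
[Unit]
Description=BBAI backup camera turret control

[Service]
ExecStart=/home/debian/backupcamera/servoPot
Restart=always

[Install]
WantedBy=multi-user.target

Enable it once with "systemctl enable bbai-turret.service" and it starts on every boot; the distance and vision programs follow the same pattern.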

 

Another Project Learning:  I remember the days of the Amiga and its RAM disk.  Those were the good ole' days.  Doing this project, I found that Linux has a virtual filesystem mounted at /sys.  I know I'm late to the party on this, but this is really cool.  It lets your code communicate with the kernel's device drivers the same way it would with the file system.  You can even set GPIO pins high and low as well as turn on PWM.  This approach exposes the BBAI GPIO pins so any number of programs can check their state.  In turn, the /sys directory structure is basically the light-speed central nervous system of your robot.  That led me to further learn about Linux FIFOs, which I'll talk about in a minute.
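As a quick taste of that idea, here is a minimal shell sketch.  The GPIO number 117 is just a placeholder (the real number depends on which pin you've muxed), while the pwm-0:0 paths match the ones the turret code below writes from C:

     echo 117 > /sys/class/gpio/export              # expose an example GPIO to userspace
     echo out > /sys/class/gpio/gpio117/direction   # make it an output
     echo 1   > /sys/class/gpio/gpio117/value       # drive the pin high

     echo 20000000 > /sys/class/pwm/pwm-0:0/period      # 20 ms servo frame
     echo 1500000  > /sys/class/pwm/pwm-0:0/duty_cycle  # 1.5 ms pulse, roughly centered
     echo 1        > /sys/class/pwm/pwm-0:0/enable      # start the PWM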

 

Turret Code

The turret for the backup camera is controlled by a potentiometer from the driver's seat.  Using pin P9.33 as an analog in, this code translates the user's input to a PWM out to the servo.

////////////////////////////////////////
//  servoPot.c
//      reads a pot and translates it to
//      a servos position.
//  Author:  Sean J. Miller, 11/3/2019
//  Wiring: Jumper P9.14 to a servo signal through a 220ohm resistor
//          Hook a potentiometers variable voltage to P9.33 (analog in)
//  See: https://www.element14.com/community/community/designcenter/single-board-computers/next-genbeaglebone/blog/2019/10/27/beagleboard-ai-brick-recovery-procedure
////////////////////////////////////////
#include <stdio.h>
#include <unistd.h>
#include <fcntl.h>
#include <sys/stat.h>
#include <errno.h>
int analogRead(){
    int i;
    FILE * my_in_pin = fopen("/sys/bus/iio/devices/iio:device0/in_voltage7_raw", "r");//P9.33 on the BBAI
    char the_voltage[6];//holds "0" through "4095" plus the terminating null
    fgets(the_voltage,6,my_in_pin);
    fclose(my_in_pin);
    //printf("Voltage:  %s\n", the_voltage);
    sscanf(the_voltage, "%d", &i);
    return (i);
}
void pwm_duty(int the_duty_multiplier)
{
    FILE *duty; int duty_calc;
    duty = fopen("/sys/class/pwm/pwm-0:0/duty_cycle", "w");
    fseek(duty,0,SEEK_SET);
    duty_calc=(600000 + (1700000*(float)((float)the_duty_multiplier/4095))) ;
    //printf("Duty: %d\n", duty_calc);//1ms
    fprintf(duty,"%d",duty_calc);//1ms
    fflush(duty);
    fclose(duty);
}
void setupPWM() {
    FILE *period, *pwm;
    pwm_duty(2000);
    
    period = fopen("/sys/class/pwm/pwm-0:0/period", "w");
    usleep(20000);
    fseek(period,0,SEEK_SET);
    usleep(20000);
    fprintf(period,"%d",20000000);//20ms
    usleep(20000);
    fflush(period);
    fclose(period);
    pwm = fopen("/sys/class/pwm/pwm-0:0/enable", "w");
    usleep(20000);
    fseek(pwm,0,SEEK_SET);
    usleep(20000);
    fprintf(pwm,"%d",1);
    usleep(20000);
    fflush(pwm);
    
    fclose(pwm);
}
void recordRotation(int the_rotation){
 char buffer[64];
 //printf ("In rotation: %d\n",the_rotation);
 the_rotation=(int)((((float)the_rotation)/(float)4440)*180);
// printf ("Rotation: %d\n",the_rotation);
 int ret = snprintf(buffer, sizeof buffer, "%d", the_rotation);
 int fp = open("/home/debian/ramdisk/bbaibackupcam_rotation", O_WRONLY|O_CREAT,0777 );
    if (fp>-1){
     write(fp, buffer, sizeof(buffer));
     close(fp);
    }
}
int main() {
     int ii=0;
     //printf("Setting up\n");
     setupPWM();
     
     while(1) {
        ii=analogRead();
        if (ii>1310) ii=1310;
        if (ii<140) ii=140;
        //printf("ii:%d\n",ii);
        pwm_duty(ii*2);
        recordRotation(ii*2);
        usleep(20000);
     }
}

*See my GitHub for the latest, optimized code

Vision AI Code

Among the Cloud9 examples installed on the BBAI per the Quick Start Guide is a file named classification.cpp.  It makes use of the hardware acceleration for vision AI through the TIDL libraries.  I used it as my base code for object recognition.  I added additional code for the graphic overlays on the streaming backup camera video.  Of course, this took getting a working device tree source file first, as discussed in the "Reverse Engineering the BBAI" section above.

 

To start, I needed to narrow the hundreds of trained classes that come with the TIDL example down to just the ones I would expect and care about: vehicles, people, and pets.  So, around line 106 of classification.cpp, I made the following substitution to selected_items:

    selected_items[0] = 609; /* jeep */
    selected_items[1] = 627; /* limousine */
    selected_items[2] = 654; /* minibus */
    selected_items[3] = 656; /* minivan */
    selected_items[4] = 703; /* park_bench */
    selected_items[5] = 705; /* passenger_car */
    selected_items[6] = 779; /* school_bus */
    selected_items[7] = 829; /* streetcar */
    selected_items[8] = 176; /* Saluki */
    selected_items[9] = 734; /* police_van */

*See my GitHub for the latest, optimized code

 

The indexes, such as 609 = jeep, are found by typing the following:

     nano -l /usr/share/ti/examples/tidl/classification/imagenet.txt

 

The index value is equal to the line number minus one.
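If you already know the label you're after, grep can save the scrolling; the line number it reports, minus one, is the index:

     grep -n "jeep" /usr/share/ti/examples/tidl/classification/imagenet.txt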

 

I replaced the DisplayFrame routine around line 310 with the one shown below.  This allowed for rendering text for what the BBAI sees, as well as backup assistance reference lines and the distance to an object in the line of sight.  I was also able to render a target indicating the extent of turret rotation.  Note that the custom functions distance_message and rotation are called here; they are provided in the next section, which discusses Distance Tracking.

void DisplayFrame(const ExecutionObjectPipeline* eop, Mat& dst)
{   static time_t timer=time(NULL);
    static string my_message;
    static std::string * static_message=new string("");
    static float the_temp_distance=10;
    if(configuration.enableApiTrace)
        std::cout << "postprocess()" << std::endl;
    int is_object = tf_postprocess((uchar*) eop->GetOutputBufferPtr());
    if (the_distance>the_temp_distance+.5) the_temp_distance=the_distance;
    if (the_distance<8&&the_distance<(the_temp_distance-.5)&&time(NULL)>timer) {
        system("sudo -u debian aplay /var/lib/cloud9/BeagleBone/AI/backupcamera/woa.wav");
        the_temp_distance=the_distance;
        timer=time(NULL) + 1;
    }
    if(is_object >= 0)
    {   my_message=*labels_classes[is_object];
        if (time(NULL)>(timer)) {
            if (rot<30) {
                system("sudo -u debian aplay /var/lib/cloud9/BeagleBone/AI/backupcamera/car_on_right.wav");
            } else if (rot>58){
               system("sudo -u debian aplay /var/lib/cloud9/BeagleBone/AI/backupcamera/car_on_left.wav");
            } else {
                system("sudo -u debian aplay /var/lib/cloud9/BeagleBone/AI/backupcamera/car_behind.wav");
            }
            timer=time(NULL) + 4;
        }
        
    }else {
        my_message=*static_message;
        if (time(NULL)>timer&&the_distance<1){
            
            if (rot<30) {
                system("sudo -u debian aplay /var/lib/cloud9/BeagleBone/AI/backupcamera/something_on_right.wav");
                timer=time(NULL) + 4;
            } else if (rot>58){
                system("sudo -u debian aplay /var/lib/cloud9/BeagleBone/AI/backupcamera/something_on_left.wav");
                timer=time(NULL) + 4;
            } 
            
        }
    }
    
    cv::putText(
            dst,
            my_message.c_str(),
            cv::Point(220, 420),
            cv::FONT_HERSHEY_SIMPLEX,
            1.5,
            cv::Scalar(0,0,255),
            3,  /* thickness */
            8
        );
    //Header
         cv::rectangle(
             dst,
             cv::Point(0,0),
             cv::Point(640,130),
             cv::Scalar(255,255,255),
             CV_FILLED,8,0
         );
         
         cv::rectangle(
             dst,
             cv::Point(0,130),
             cv::Point(640,170),
             cv::Scalar(0,0,0),
             CV_FILLED,8,0
         );
        
         cv::putText(
            dst,
            "BACKUP ASSISTANCE",
            cv::Point(60, 165), //origin of bottom left horizontal, vertical
            cv::FONT_HERSHEY_TRIPLEX, //fontface
            1.5, //fontscale
            cv::Scalar(255,255,255), //color
            2,  /* thickness */
            8
        );
    //backup distance meter left side
        cv::line(
            dst,
            cv::Point(104, 420),
            cv::Point(50, 480),
            cv::Scalar(0,0,255), //color red (BGR)
            5, //thickness
            8, //connected line type
            0 //fractional bits
        );
        cv::line(//inward line
            dst,
            cv::Point(104, 420),
            cv::Point(134, 420),
            cv::Scalar(0,0,255), //color red (BGR)
            5, //thickness
            8, //connected line type
            0 //fractional bits
        );
        cv::line(
            dst,
            cv::Point(158, 360),
            cv::Point(114, 411),
            cv::Scalar(0,255,255), //color yellow
            4, //thickness
            8, //connected line type
            0 //fractional bits
        );
        cv::line(//inward line
            dst,
            cv::Point(158, 360),
            cv::Point(178, 360),
            cv::Scalar(0,255,255), //color yellow
            4, //thickness
            8, //connected line type
            0 //fractional bits
        );
        cv::line(
            dst,
            cv::Point(212, 300),
            cv::Point(168, 351),
            cv::Scalar(0,255,0), //color green
            2, //thickness
            8, //connected line type
            0 //fractional bits
        );
        cv::line(//inward line
            dst,
            cv::Point(212, 300),
            cv::Point(222, 300),
            cv::Scalar(0,255,0), //color green
            2, //thickness
            8, //connected line type
            0 //fractional bits
        );
        
    //backup distance meter right side
        cv::line(
            dst,
            cv::Point(536, 420),
            cv::Point(590, 480),
            cv::Scalar(0,0,255), //color red (BGR)
            5, //thickness
            8, //connected line type
            0 //fractional bits
        );
        cv::line(//inward line
            dst,
            cv::Point(536, 420),
            cv::Point(506, 420),
            cv::Scalar(0,0,255), //color red (BGR)
            5, //thickness
            8, //connected line type
            0 //fractional bits
        );
        cv::line(
            dst,
            cv::Point(482, 360),
            cv::Point(526, 411),
            cv::Scalar(0,255,255), //color yellow
            4, //thickness
            8, //connected line type
            0 //fractional bits
        );
        cv::line(//inward line
            dst,
            cv::Point(482, 360),
            cv::Point(462, 360),
            cv::Scalar(0,255,255), //color yellow
            4, //thickness
            8, //connected line type
            0 //fractional bits
        );
        cv::line(
            dst,
            cv::Point(428, 300),
            cv::Point(472, 350),
            cv::Scalar(0,255,0), //color green
            2, //thickness
            8, //connected line type
            0 //fractional bits
        );
        cv::line( //inward line
            dst,
            cv::Point(428, 300),
            cv::Point(418, 300),
            cv::Scalar(0,255,0), //color green
            2, //thickness
            8, //connected line type
            0 //fractional bits
        );
        //Footer
        cv::rectangle(
             dst,
             cv::Point(220,480),
             cv::Point(420,440),
             cv::Scalar(0,0,0),
             CV_FILLED,8,0
         );
        cv::putText(
            dst,
            (distance_message()),
            cv::Point(255, 475), //origin of bottom left horizontal, vertical
            cv::FONT_HERSHEY_TRIPLEX, //fontface
            1.5, //fontscale
            cv::Scalar(255,255,255), //color
            2,  /* thickness */
            8
        );
        int rot=rotation();
        if (rot>44){
            cv::drawMarker( 
                dst,
                cv::Point(320+((44-rot)*5), 250),
                cv::Scalar(0,0,0), //color black (marker outline)
                MARKER_CROSS, 
                20,
                5,
                8 
            );
            cv::drawMarker( 
                dst,
                cv::Point(320+((44-rot)*5), 250),
                cv::Scalar(255,255,255), //color white
                MARKER_CROSS, 
                20,
                1,
                8 
            );
        } else if (rot<44){
            cv::drawMarker( 
                dst,
                cv::Point(320-((rot-44)*5), 250),
                cv::Scalar(0,0,0), 
                MARKER_CROSS, 
                20,
                5,
                8
            );
            cv::drawMarker( 
                dst,
                cv::Point(320-((rot-44)*5), 250),
                cv::Scalar(255,255,255),
                MARKER_CROSS, 
                20,
                1,
                8
            );
        } else {
            cv::drawMarker( 
                dst,
                cv::Point(320, 250),
                cv::Scalar(0,255,0), //color green
                MARKER_DIAMOND ,
                20,
                1,
                8 
            );
        }
    
    if(last_rpt_id != is_object) {
        if(is_object >= 0)
        {
            std::cout << "(" << is_object << ")="
                      << (*(labels_classes[is_object])).c_str() << std::endl;
        }
        last_rpt_id = is_object;
    }
}

*See my GitHub for the latest, optimized code

 

Another Project Learning:  Until this project, I thought OpenCV was all about motion detection and object recognition.  I found it's well beyond that.  I learned it has some easy-to-call functions to draw overlay graphics on the screen at the speed of the frame rate.  Amazing!  Here is a great resource for learning how to draw with OpenCV as I have done above: https://docs.opencv.org/2.4/modules/core/doc/drawing_functions.html?highlight=line#

 

 

Distance Tracking Code

To give the BBAI Backup Camera some further backup assistance, I also mounted a TFMini Plus for distance sensing.  In daylight, it can sense up to 6 meters, and it communicates with the BBAI over the I2C serial protocol.

 

Presently, the BBAI's out-of-the-box software won't be able to pull this off.  You'll have to customize a Device Tree Source file as mentioned in the Reverse Engineering the BBAI section.  In that section, I provide a link to my parallel blog that has a good working DTS file.  With it, I hooked the TFMini Plus to P9.19 and P9.20.  I originally used a Linux FIFO to pass the current distance between programs, but ended up writing it to a file on a RAM disk instead (the switch is explained in the Desktop Demo section below).  That let me run the following code as a service that constantly updates the sensed distance for the classification.cpp code to read.

 

/* ----------------------------------------------------------------------- *
 * Title:         tfmini.c                                                 *
 * Description:   C-code for TFMini Plus                                   *
 *                Tested on BeagleBone AI                                  *
 *                11/6/2019 Sean J. Miller                                 *
 *References:                       *
 *https://stackoverflow.com/questions/8507810/why-does-my-program-hang-when-opening-a-mkfifo-ed-pipe*
 *https://stackoverflow.com/questions/2988791/converting-float-to-char     *
 * Prerequisites: apt-get install libi2c-dev i2c-tools                     *
 *------------------------------------------------------------------------ */
#include <stdio.h>
#include <stdlib.h>
#include <sys/stat.h>
#include <unistd.h>
#include <linux/i2c-dev.h>
#include <fcntl.h>
#include <sys/ioctl.h>
#include <string.h>
#include <errno.h>
#define I2C_BUS        "/dev/i2c-3" // I2C bus device
#define I2C_ADDR       0x10         // I2C slave address for the TFMini module
int debug=0;
int i2cFile;
void i2c_start() {
   if((i2cFile = open(I2C_BUS, O_RDWR)) < 0) {
      printf("Error failed to open I2C bus [%s].\n", I2C_BUS);
      exit(-1);
   } else {
       if (debug)printf("Opened I2C Bus\n");
   }
   // set the I2C slave address for all subsequent I2C device transfers
   if (ioctl(i2cFile, I2C_SLAVE, I2C_ADDR) < 0) {
      printf("Error failed to set I2C address [%s].\n", I2C_ADDR);
      exit(-1);
   } else {
      if (debug) printf("Set Slave Address\n");
   }
}
float readDistance() {
 //Routine to output the distance to the console
 int distance = 0; //distance
 int strength = 0; // signal strength
 int rangeType = 0; //range scale
 unsigned char incoming[7]; //an array of bytes to hold the returned data from the TFMini.
 unsigned char cmdBuffer[] = { 0x01, 0x02, 7 }; //the bytes to send the request of distance
 
   write( i2cFile, cmdBuffer, 3 );
   usleep(100000);
   read(i2cFile, incoming, 7);
 for (int x = 0; x < 7; x++)
 {
  if (x == 0)
  {
   //Trigger done
   if (incoming[x] == 0x00)
   {
   
   }
   else if (incoming[x] == 0x01)
   {
   
   }
  }
  else if (x == 2)
   distance = incoming[x]; //LSB of the distance value "Dist_L"
  else if (x == 3)
   distance |= incoming[x] << 8; //MSB of the distance value "Dist_H"
  else if (x == 4)
   strength = incoming[x]; //LSB of signal strength value
  else if (x == 5)
   strength |= incoming[x] << 8; //MSB of signal strength value
  else if (x == 6)
   rangeType = incoming[x]; //range scale
 }
 
 float the_return = distance / (12 * 2.54); //convert cm to feet (cm / 30.48)
 return the_return;
}
void recordDistance(float the_distance){
 char buffer[20];
 int ret = snprintf(buffer, sizeof buffer, "%f", (the_distance));
 if (debug) printf("About to open for writing...\n");
 int fp = open("/home/debian/ramdisk/bbaibackupcam_distance", O_WRONLY|O_CREAT,0666);
 if (debug) printf("About to write...%d\n",fp);
 ret=write(fp, buffer, sizeof(buffer));
 close(fp);
 if (debug) printf("Written %d\n",ret);
}
 
void main() {
   float my_distance=0;
   debug=0; //change to 1 to see messages.
   
   if(debug) printf("Starting:\n");
 
 while (1) {
  i2c_start();
    my_distance = readDistance();
    if(debug) printf("the_distance: %f\n",my_distance);
    recordDistance(my_distance);
    close(i2cFile); 
    
    if(debug) printf("Looping.\n");
    
    usleep(1000000);
 }
}
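Before leaning on the service, it's worth a quick sanity check from the terminal that the sensor actually answers.  With the i2c-tools package from the prerequisites installed, the TFMini Plus should show up at address 0x10 when scanning bus 3:

     i2cdetect -y -r 3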

Last, I needed to add the custom distance_message and rotation functions mentioned in the Vision AI Code section to classification.cpp.  distance_message reads the distance so it can be added to each frame, and rotation provides the camera rotation so a target can be drawn on the display to show how far the turret has turned.

char *distance_message() {
    static char buf[20];
    static char suffix[4]=" Ft";
    static time_t timer=time(NULL);
    
    if (time(NULL)>timer) {
        int fd = open("/home/debian/ramdisk/bbaibackupcam_distance", O_RDONLY );
        if (fd>-1) {
            int result=read(fd,buf,sizeof(buf));
            if (result>-1){
                close(fd);
                memcpy(buf+3,suffix,4);
            }
            
            sscanf(buf, "%f", &the_distance);
        }
        timer=time(NULL)+.5;
    }
    return (char *)buf;
}
int rotation() {
    static char buf[20];
    
    static time_t timer=time(NULL)+.02;
    
    if (time(NULL)>timer) {
        int fd = open("/home/debian/ramdisk/bbaibackupcam_rotation", O_RDONLY );
        if (fd>-1){
            int result=read(fd,buf,sizeof(buf));
            if (result!=-1) {
                close(fd);
    
                int i;
                sscanf(buf, "%d", &i);
                rot=i;
            } 
        }
        timer=time(NULL)+.02;
    }
    return (rot);
}

Making it Talk

Since we can have the BBAI sort out what it sees, measure how far away it is, and keep track of which way the camera is looking, we might as well make it talk, right?  With a USB hub added to the mix of components, we can have both a web cam and a speaker attached.

 

To enable sound from the BBAI USB port, I simply typed the following at the terminal:

nano ~/.asoundrc

 

I pasted the following into the nano editor.  It points ALSA's default device at card 1, which is where the USB audio adapter showed up on my system (aplay -l lists the cards if yours lands elsewhere):

pcm.!default {
        type plug
        slave {
                pcm "hw:1,0"
        }
}
ctl.!default {
        type hw
        card 1
}

 

 

Then, from my code, I applied some if/then logic to fire the corresponding audio with this:

     system("sudo -u debian aplay /yourpath/yourfile.wav");

 

 

CAPE CIRCUIT DESIGN

To get reliable wire-to-board connectors to the backup camera components, I needed to design a cape.  A cape is the BeagleBoard name for a small board that plugs into its headers.  Based on the pins selected in the code, here is the mini-cape design:

 

BBAI Backup Camera Mini Cape

 

Another Project Learning:  You may notice a piezo circuit in the schematic.  In the last week of the project, I lay in bed and thought it would be neat to have the backup camera simulate someone helping you back out into traffic with voice shoutouts.  I hit Google the next morning and saw how the BBB pulled off audio.  So, I ordered a USB-to-audio cord and grabbed an old USB hub.  Lo and behold, I could hook up the camera and the audio at once!  This then allowed me to code in voice responses to the realtime data to simulate a back seat partner calling out what he sees.  So, bye-bye piezo bleeps and bloops - we have speech!

 

I made the backup camera mini-cape with a proto-board.  It fits nicely into a 3D printed enclosure.

BBAI Mini Cape to Handle our Backup Camera Peripherals

Another Project Learning: As I was building my own wire-to-board cables for the peripherals, I was having mixed results crimping pins.  I developed some workarounds I was proud of, but I was still getting less than production-quality work.  This led me to make a post: My pin crimping tips -- what are yours?  After the healthy discussion with the e14 community, I found two major issues with my crimping: I didn't really understand my crimping tool, and it wasn't made for the small pins I was attempting to crimp.  Now I have the correct tools for the job and am crimping like a machine!

 

DESKTOP DEMO

One last test before we printed the components and headed out to the garage was our desktop demo.  Our program on the BBAI streams its video to a web page, which is how we get the display on our iPhone.  In the video, we test control of the backup camera's rotation and view the rotation indication on the screen.  We also test the distance sensor and its on-screen display.

 

 

 



Another Project Learning: Earlier I mentioned I had learned about named pipes called FIFOs.  I first used them to pass data between the above programs through memory instead of the SD card's file system.  However, with the desktop demo, I found they have a drawback that causes a low frame rate for streaming video.  A FIFO expects that when one program is ready to write, the other one is ready to read.  In turn, I got a lot of latency in my streaming video as it waited for the distance to be measured and piped over.  So, I switched to a RAM disk instead.  This allowed the main program to read the current value in memory regardless of when it was written, which eliminated the hesitation in frame rate.  Here's the post that taught me how to do it: https://www.linuxbabe.com/command-line/create-ramdisk-linux .
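For reference, the RAM disk is just a tmpfs mount, as the linked post describes.  The mount point below matches the path the code above writes to; the size is an arbitrary small value:

     sudo mkdir -p /home/debian/ramdisk
     sudo mount -t tmpfs -o size=1M tmpfs /home/debian/ramdisk

Adding a matching line to /etc/fstab makes it come back on every boot:

     tmpfs  /home/debian/ramdisk  tmpfs  defaults,size=1M  0  0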

 

3D PRINTED COMPONENTS

You can get these 3D models in their native Autodesk Fusion 360 format at my Github repository:  Github.  If you don't have Autodesk Fusion 360, it's free, and it will absolutely change your world.

BBAI Case

The BBAI itself will be installed behind a panel in the back of our '99 Jeep Grand Cherokee.  To save material and allow the case to breathe, I designed it with plenty of slots.  Since it won't be visible to the vehicle's passengers, only function was of concern, not form.  It will be installed with 3M mounting tape.

Autodesk Fusion 360 Design Rendering BBAI Car Backup Camera Brains

BBAI Car Backup Camera 3D Printed Case:  the real thing

BBAI Turret Cam Case

For the turret cam, I bought a $9 web cam.  It had a tapered cylinder allowing me to model my 3D print for a snug push fit.  Above it, the TFMini Plus is firmly pressed into the turret body.

Left: Actuator Case, Right: Turret with Cam and TFMini, Back: BBAI Case

 

 

 

FINAL DEMO VIDEO

In the final demo video, we take you through assembly, tuning of the audio responses, vehicle installation, and vehicle backup test.

 

Pressing the Bearing to Assemble the Turret

 

Turret Cam Testing of Servo Limits

 

Tuning the Audio Timing With a Matchbox Test Rig

Video Coming 11/17

 

PROJECT SUMMARY

As stated in the Preface, we set out to stretch our Maker limits as newcomers to the BeagleBone platform.  For the project, we ended up creating two blogs.  One blog contributes to the community every little thing we learned about the BBAI.  The other, this blog, provides an all-inclusive, literal road test covering GPIO Input/Output, PWM, I2C, Visual Classification with TIDL, OpenCV for a GUI, Ad Hoc Streaming, and Audio.  Although our project taught us a lot and achieved our objectives, it did make us think: what would we do for a permanent solution that would be better than anything on the market today?

 

This is what we would change for a permanent Backup Assistance BBAI Device:

  • Eliminate streaming video:  The streaming resulted in motion blur and delays.  Often, the Visual Classification would occur within 0.2 seconds of the object coming into view, but the streamed video would show it with over a second of delay.  For this application, we need an immediate, clear response.
  • Go with CRT video:  Although we loved the ability to draw on the screen, I found that the biggest benefit of the BBAI was simulating the person in the back seat.  If we added a display, that would achieve the speed we are looking for, but my preference would be to buy an aftermarket $15 CRT backup camera and display solution for the video altogether.  Then...
  • Add multiple cams to have hands-free surveillance:  we learned that with a USB hub, we could add more cameras.  Since they are just $8, I prefer moving the cams inside the vehicle and pointing them out the back right, back left, and hatch.  This could then verbalize blind spot detection as well as provide backup assistance, and even help with parking straight between the lines when backing into a spot.  In addition, you could use streaming video to monitor your car within WiFi range on a multi-view page.
  • Replace the TFMini Plus with aftermarket backup assistance:  You can get those bumper-button sensors for $15; they have a wide detection angle and install directly in the bumper, with no concern of theft.  The TFMini Plus is a great component and was fun for showcasing I2C, but at $45 with a narrow beam, it's best applied to our next adventure.
  • Add additional safety:  Add a linear actuator to the brake pedal...just kidding.

 

Well, we hope you enjoyed our blog and learned about the BBAI along with us.  We'd love to hear your ideas for an epic backup assistance solution in the comments, too!

 

See you next time and happy holidays!

Sean and Connor Miller