Introduction

In this fifth episode of the Art-a-Tronic Picasso challenge, I have finished the torso motion with Arduino. At this point, before going in depth into the details, it is important to see the global design of this animatronic. I apologize for not revealing the global idea earlier, but when I started there were a number of unknowns I was not sure how to solve. Until I opened the mannequin (surgically, obviously), it was not clear how much internal space was available, how robust and thick the structure was, and other variables.

Now the roles of the Raspberry Pi 3B+ and the Arduino board are almost clear, fixed in an approximate scheme. If these posts surprise you, consider that many steps involve a level of uncertainty and some choices need improvisation.

As shown in the scheme above, I have decided to split the control of the automaton (or droid? Or Borg? Or simply mannequin? No idea) into two parts, because some tasks are better delivered to a microcontroller than to the Raspberry Pi itself, and because I also need to manage some analog signals.

In the Meantime, at Depot09...

 

Components Roles

 

Arduino Role

  • Control the lights of the NeoPixel ring
  • Control the torso rotation
  • Control the environment light intensity
  • Control the environment audio level and direction

The Arduino functions are called by the Raspberry Pi 3B+ through a USB-serial connection and a set of single-character commands. In the first experiments I used an Arduino UNO R3, but in the final version I will use an Arduino MKR 1010 so that some behaviors of the mannequin can be controlled remotely through the Arduino IoT Cloud.
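The single-character command approach can be sketched as a simple dispatch function. Note that the command characters and the Command names below are assumptions for illustration, not the actual command set of the project (which is in the GitHub sources):

```cpp
// Hypothetical single-character command set received over the
// USB-serial link; the real characters are defined in the project
// sources on GitHub.
enum class Command { TorsoLeft, TorsoRight, LightsOn, LightsOff, Unknown };

// Map one character read from the serial port to a command.
// On the Arduino this would be called from loop() on Serial.read().
Command parseCommand(char c) {
    switch (c) {
        case 'l': return Command::TorsoLeft;
        case 'r': return Command::TorsoRight;
        case 'L': return Command::LightsOn;
        case 'O': return Command::LightsOff;
        default:  return Command::Unknown;  // ignore noise on the line
    }
}
```

Keeping the commands to a single character makes the parser trivial and avoids buffering issues on the serial line.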

 

Raspberry Pi 3B+ Role

The Pi acts in this context as the brain of the system. It receives information from the Arduino and controls the responses depending on the visitors and on environment changes and interactions. For example, when a louder sound level is "heard" on one side, the torso rotates in that direction until the audio level is balanced between left and right, as much as possible.
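The balancing logic can be sketched as a pure decision function. The deadband value below is an assumption for illustration, not a value from the project:

```cpp
#include <cstdlib>

// Decide the torso rotation direction from the left/right audio levels.
// Returns -1 (rotate left), +1 (rotate right) or 0 (balanced).
// The deadband threshold is an assumed value, tuned in practice to
// avoid the torso oscillating around the balance point.
int torsoDirection(int leftLevel, int rightLevel, int deadband = 10) {
    int diff = rightLevel - leftLevel;
    if (std::abs(diff) <= deadband) return 0;  // close enough: stay put
    return (diff > 0) ? 1 : -1;                // turn toward the louder side
}
```

The deadband is important: without it, noise on the two audio channels would make the torso jitter continuously around the balanced position.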

In the meantime, the Raspberry Pi will stream the live camera online, replicated locally on a 5-inch HDMI display that is part of the equipment of the Borg. To be honest, the mannequin is not a Borg, but it will have a number of Borg-like technological implants.

The other way the Raspberry Pi 3B+ can interact is by speaking sentences through the audio output and an amplified speaker. Moreover, when certain conditions occur, the robot can express her growing emotions.

The effect is created by a flame-simulating lamp inside the torso, controlled by one of the two relays of the PiFace Digital 2. The system also detects when a user or an object is too near to the body, thanks to a PIR sensor connected to one of the I/O pins of the PiFace Digital 2.

 

Arduino Motor Control

The mannequin, when the transformation is complete, should operate alone at the Art-a-Tronic exhibition. This means that I should make sure the system can run autonomously for many hours. To be sure that the rotation starts correctly after its daily power-on, I have added another couple of 3D-printed components to support an end-stop switch used to reset the motor position. According to the design, the torso will rotate 60 degrees, with the mid-point at 30 degrees.
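The 60-degree range can also be enforced in software, so a bad command can never drive the torso past the end-stop. A minimal sketch, using the same angle limits as the design (the clampTorsoAngle name is mine, not from the project sources):

```cpp
// Rotation limits from the design: 60 degrees total travel,
// with the mid-point at 30 degrees.
const int MIN_ANGLE = 0;
const int MAX_ANGLE = 60;
const int MID_ANGLE = MAX_ANGLE / 2;

// Clamp a requested angle into the mechanical range before it is
// passed to the motor-control functions.
int clampTorsoAngle(int angle) {
    if (angle < MIN_ANGLE) return MIN_ANGLE;
    if (angle > MAX_ANGLE) return MAX_ANGLE;
    return angle;
}
```

This kind of guard matters for an unattended exhibit: a single out-of-range command during many hours of autonomous operation would otherwise grind the motor against the mechanical stop.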

Arduino Software

The part of the software described here refers to the stepper motor control. As the movements will be controlled by a command set, I have developed a series of functions to execute all the movements, including an initialization function. The initialization function searches for the 0-degree position corresponding to the end-stop switch, then executes a complete left-to-right rotation test and completes the cycle by positioning the torso at the mid position.

The Arduino sources are available on GitHub at the following link: https://github.com/alicemirror/mannequin

 struct StepProfile {
   int torsoSpeed;    // Stepper rotation speed
   int rotAngle;      // Target rotation angle (degrees)
   int lastAnglePos;  // Last reached angular position (degrees)
 };

 

The structure StepProfile contains the motion information updated by the Raspberry Pi (via the serial commands) or remotely through the IoT cloud. This part will be discussed in detail in the Project14 post IoT in the cloud.

 

/**
 * Search the endstop switch moving to left then 
 * move to the middle and start accepting commands.
 * This function is used once on startup
 */
void setTorsoZero() {
  // Initialize the parameters for zero search
  torsoControl.rotAngle = 0;
  torsoControl.lastAnglePos = 0;
  torsoControl.torsoSpeed = SPEED_ZERO;

  // Search loop
  while(checkEndStop() == false) {
    torsoControl.rotAngle += SEARCH_ZERO_STEPS;
    moveTorso();
  } // Search loop
  torsoControl.lastAnglePos = MIN_ANGLE;
  torsoControl.torsoSpeed = SPEED_HIGH;
  torsoControl.rotAngle = MAX_ANGLE;
  moveTorso();
  torsoControl.torsoSpeed = SPEED_MED;
  torsoControl.rotAngle = MIN_ANGLE;
  moveTorso();
  torsoControl.torsoSpeed = SPEED_LOW;
  torsoControl.rotAngle = MAX_ANGLE / 2;
  moveTorso();
}

 

The initialization function setTorsoZero() is emblematic, as it includes the structure settings as well as moving the motor for the testing cycle. You can see that the approach is to set or change one or more parameters, then call the moveTorso() function.

 

/**
 * Move the motor to the new angle, if it differs from the last.
 * The motion amount is the difference between the new rotation
 * angle and the last position. The angle value is converted to motor
 * steps.
 */
void moveTorso() {
  if(torsoControl.rotAngle != torsoControl.lastAnglePos) {
    // Set the motion speed
    torsoStepper.setSpeed(torsoControl.torsoSpeed);
    //! Calculate the number of steps corresponding to the algebraic difference
    //! between the new angle and the current position
    int newAngle = torsoControl.rotAngle - torsoControl.lastAnglePos;
    
    //! Number of steps based on the new position. Note the floating-point
    //! division: an integer division by 360 would truncate
    int moveSteps = (int)((double)STEPS_PER_REVOLUTION / 360.0 * ANGLE_DEMULTIPLIER * newAngle);

    // Move the motor by the calculated number of steps
    torsoStepper.step(moveSteps);
    torsoControl.lastAnglePos = torsoControl.rotAngle;
  }
}

 

The moveTorso() function moves the torso according to the parameters set, then updates the structure based on the new parameters state.
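The angle-to-steps conversion at the heart of moveTorso() can be worked through with concrete numbers. The constant values below are assumptions for illustration (2048 steps per revolution, typical of a common geared stepper, and a 2:1 mechanical reduction), not necessarily the values used in the project:

```cpp
// Assumed values for illustration only.
const int STEPS_PER_REVOLUTION = 2048;  // motor steps for one full turn
const int ANGLE_DEMULTIPLIER = 2;       // assumed 2:1 mechanical reduction

// Convert a relative angle (degrees) to motor steps:
// steps = steps-per-degree * reduction * angle.
// The floating-point division avoids integer truncation.
int angleToSteps(int angleDiff) {
    return (int)((double)STEPS_PER_REVOLUTION / 360.0
                 * ANGLE_DEMULTIPLIER * angleDiff);
}
```

With these assumed constants, a 30-degree move becomes 2048 / 360 × 2 × 30 ≈ 341 steps, and a negative angle difference simply produces a negative step count, i.e. a rotation in the opposite direction.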

 

Following this strategy, the move function can be called at any moment without risk. The next step in the Arduino software is creating the command parser and managing the other components.

 

Bonus: The Emotions Light Inside

The fire-simulating internal light, expressing emotions, has not yet been connected to the PiFace Digital 2 relay, but it is already fixed in place inside the torso.

The video below shows the effect of testing the lamp inside the body.

 

 

Previous Episodes

Next Episode