Introduction

Maybe this episode is less spectacular and intriguing than the previous ones, but this step was one of the most challenging.

After accomplishing the task – in a few words, <<Seven Of Nine extends to the social network>> – I learned interesting new things and gained new experience, so I decided to share it by dedicating an entire episode to it. Episode 14 covers over one week of work on the Picasso Challenge to add new behavior to Seven Of Nine, extending the Borg's ability to interact with humans beyond the borders of the exhibition location.

 

I'd say that in this episode our animatronic has gone where no man has gone before...

 

Credits (on behalf of Seven Of Nine)

One of the most important aspects of this episode is that the final part of the development – as well as finding a satisfactory, efficient solution – was possible thanks to the cooperation of other volunteers. Warm thanks to shabaz and Workshopshed for the debugging support and for helping me find a solution to an issue on which I spent an incredible amount of time.

Thanks also to beacon_dave, dixonselvan, Workshopshed (aka Andy Clark) and gecoz for their invaluable support in chatting for hours with the Borg, testing and producing a good series of examples.

 

The Failed Solutions and Why

After the hacked Borg Access Terminal discussed in Art-a-Tronic Episode 13, Seven Of Nine became able to hold a voice conversation with a human counterpart; the next step was expanding its possibilities with network interactivity. Activating this feature will potentially enable any remote user to access the animatronic whenever it is not possible to reach the Art-a-Tronic exhibition location.

 

The most efficient solution to reach this goal was to set up a textual chat using some kind of social network. Exploring the options available on the Internet that support Python and run on a Linux terminal, the first choice was to approach WhatsApp; it is known worldwide and used by billions of users, so what better choice? I tried a couple of Python libraries that worked for just a couple of minutes before losing the messages, or the connection altogether. The reason I gave up on WhatsApp is that all of these apparently working Python packages were unsupported and unauthorized developments, trying to reverse-engineer the original WhatsApp application.

 

In fact, WhatsApp does not offer any kind of development kit, support, or official technical documentation. I definitely decided to abandon this solution as unreliable for long-term supported software. I excluded Facebook Messenger too; Messenger may be a useful solution but it is heavy, and not so many people use it other than in conjunction with their Facebook account activity.

 

I am already registered as a Google developer and I use the Google APIs to support some features of this same project, so it seemed obvious that another possible candidate was Google Hangouts. However, enabling the Raspberry PI to manage the development kit and Python libraries for Hangouts would expose the Pi to some serious performance risks, as it is already running the Speech-to-Text Google API.

 

I was interested in a simple textual chat, avoiding large data packets sent to the social platform (media, images, sounds, video, etc.) and, at the same time, a platform popular enough that users would rarely need to install new applications.

 

Seven Of Nine Managed Tasks

Just as a reminder, it is important to consider that at the current state of the art the animatronic's Raspberry PI brain already manages many tasks:

 

  • Hear the environment
  • Speak contextualized sentences as wav sounds sampled from films, TV series and other sources
  • Act as an MQTT server accepting connections from two different clients
  • Control an Arduino UNO for motion
  • Control the PiFace Digital 2 GPIO for emotion light, laser, and eye implant lights
  • Manage the Pi Camera for image acquisition
  • Send Tweets with attached photos
  • Run an AI text analysis engine
  • Process voice sentences from local users, replying with logically related sentences
  • Run some background activities and secondary tasks

 

The above list needs no further words to explain the amount of processing power required from the Raspberry PI. One of the reasons it was possible to create this scenario is strong software optimization, as well as taking care to optimize every resource. We should not forget that all these tasks run concurrently on a small embedded Linux machine.

Above: Seven Of Nine seeing if some data is coming from the MagicMirror

 

When the Right Choice is Far From Working

Finally, I focused my choice on using Twitter for user chatting through Direct Messaging.

Maybe conditioned by the fact that I rarely use it, initially I was not so happy about using Twitter Direct Messaging. But considering the task in charge of the animatronic and the kind of chat I was searching for, this choice turned out to be probably the best solution: text-only chatting, no need to share images and other media, and it is easy to authorize a human user to start a conversation and simple to stop one if needed.

 

DM (Direct Messaging) is a subset of the Twitter social network features. It is easy to limit the users that can or can't chat via DM, and the possibility of chatting does not impact authentication, privacy, and other features. A user blocked from sending DMs is not blocked from following and sending tweets, nor from the other features of the social network like mentions (you can tweet mentioning a user that you are not following and who doesn't follow you).

 

Seven Of Nine Twitter account profile

 

Libraries & Bugs

As discussed in Art-a-Tronic Episode 11, Tweepy is a complete Python library wrapping the Twitter APIs, already used with success. So, why not? In fact, Tweepy also includes the wrapper methods for sending and receiving Twitter Direct Messages. Unfortunately, the only calls that were not working were just those I needed: every time I tried to send a Direct Message I got the Twitter error This page does not exist on the terminal.

 

Twitter Direct Messages from Seven Of Nine: Evolution

Until I started to use Tweepy in a different context than sending Tweets, I had considered this feature almost experimental. Instead, the experiment worked well and continued for over one month. I decided it was time to move Seven Of Nine to the next step: I created a dedicated Twitter account, Seven Of Nine (Twitter user @WeAreBorg7of9).

Above: Seven Of Nine Twitter home page

 

To complete the new Twitter user registration procedure it is necessary to specify an email address, so I created a dedicated email too: sevenofnine.artatronic.borg@gmail.com

BTW, I also own the domain we-are-borg.com, which will soon be online. Until the Borg website is active you can send emails to Seven Of Nine, as well as start following the Twitter account mentioned above and eventually chat with her.

After the new Twitter user was created and confirmed, I also registered Seven Of Nine as a Twitter developer and created the application We-Are-Borg, with its own public and secret keys and OAuth access token.

 

A Bug That is Not a Bug

Digging into the bug list of the Tweepy GitHub repository, I discovered the reason the DM methods no longer work.

In mid-July 2018 Twitter strongly redesigned the Direct Message API endpoint, disabling the REST approach and replacing it with a different set of parameters to be sent through a POST call.

Thanks to the community contributors and their suggestions I found a couple of notes explaining a workaround to revive the send_direct_message() call. The Direct Messages API features are documented in the Twitter Developers Documentation.

 

How to Patch a Python Package

I want to draw your attention to this paragraph. The procedure I followed to apply the suggested corrections to two source files of the Tweepy library can be applied almost without changes to any Python library you want to upgrade or customize on your own.

The Tweepy library is available in the Python repository and, as I have done with many other libraries and components, I installed it via the pip repository with the command

 

> sudo pip3 install tweepy

 

Step 1 - Remove the Package

The first step is to remove the package from the system; it is as easy as installing it if you used the pip command:

 

> sudo pip3 uninstall tweepy

 

After the removal completes your PI is in the same condition as before the installation; the package dependencies, possibly installed automatically by the pip command, are instead not removed.

 

Step 2 - Cloning and Building From Sources

The next step is cloning the package sources from the GitHub repository; I suggest cloning it in the home directory (e.g. /home/pi). Then, to check that everything is fine, launch the following commands from the terminal:

 

/home/pi>git clone https://github.com/tweepy/tweepy.git
/home/pi>cd tweepy

 

and try to reinstall the package from the sources

 

/home/pi>sudo python3 setup.py install

 

It should work. As a matter of fact, the result is the same as installing the package with pip, but now the installation proceeds from the sources in your local repository. In general – just like in this specific case – you should expect to find the file setup.py and other material in the root of the repository (here /home/pi/tweepy) and a subfolder with the same name (/home/pi/tweepy/tweepy) where we can find the Python sources we are interested in. To be sure that after every change to the sources the installation is updated accordingly, I used the following commands in sequence (from the repository main folder /home/pi/tweepy):

 

$>sudo python3 setup.py clean
$>sudo python3 setup.py build
$>sudo python3 setup.py install

 

As mentioned before, this procedure can be applied to modify or customize the sources of most Python packages.

 

Something Worked But Something Didn't

I modified the source files api.py and binder.py according to the suggestions mentioned in the bug list, creating an updated send_direct_message() method that finally worked!

At that point, the account Seven Of Nine was able to send DMs to any follower also followed by the account. Then?

 

The updated Tweepy package was able to send DMs but, unfortunately, the same Twitter changes to the DM API also affected the other methods related to Direct Messaging. I tried to understand the logic of the new API POST procedure – which replaced the previous REST call – patching the other Tweepy methods as well. After hours of tests and changes things were working better, but I was still experiencing some issues, and message replies were not yet possible. I launched a call for help to the small group of Element14 members I had already contacted some days before to test the new chat feature (as soon as it was available).

 

Lucky me: Shabaz and Andy Clark answered my request in less than an hour.

 

The three of us, trying, chatting, and discussing the results, spent almost the entire Friday afternoon (just when normal people stop working and relax as the weekend approaches); during the night Shabaz found another version of the Tweepy package: a fork by another community volunteer able to produce a list of the last messages in chronological order, a Python list of JSON objects, in accordance with Twitter's API specifications.

 

Problem solved? Not yet. But Almost!

After further testing, we were sure that this new version was working fine, except for the send_direct_message() method. I tried to proceed pragmatically:

 

  • Forked the last, partially working repository on my GitHub account
  • Applied the fix (again) to the send_direct_message() method
  • Installed the package and tested it

 

Finally, I saw the magic! Sending DMs and receiving responses was a reality, no matter which side initiated the conversation. The updated Tweepy package working with the new Twitter Direct Message API (send and receive) is available on GitHub under the same MIT license as the original version: https://github.com/alicemirror/tweepy

 

Software Notes

The image above shows the internal logic the system follows to manage the Twitter messaging autonomously. As you can see, the Tweepy package is the interface between the custom classes and the Twitter API.

The main logic implies checking whether new messages have arrived in the list of JSON objects provided by the call to direct_messages() in the Tweepy package. If new messages appear at the top of the list, the AI class processes them, generating the responses sent back to the originating users' chats. This main event control is managed by the Python script TwitterBorgChat.py.

Note that while the Twitter chat is running, the Raspberry PI is also able to maintain a voice chat with a local user.

The Main Python Script

The high-level process of check new messages -> reply to new messages (if any) should respect the Twitter directives: no more than 15 calls to this API can be made every 15 minutes (otherwise Twitter suspends the user for a period). So this main process is launched periodically (every 8 minutes) by Cron and is limited to processing up to 10 different user messages at a time, with a delay of 30 seconds between replies. Below is the source code of the main script:

 

#!/usr/bin/python3
# Manage chats via direct message with the followed users
from pyclasses.borgtwitter import BorgTwitter
from collections import namedtuple
from pyclasses.ai import AI
import time

def main():
    # Initialize the twitter class
    # (registration and connection to Twitter)
    tweet = BorgTwitter()
    # Establish a connection to the Twitter server APIs
    tweet.connectTwitter()
    # Initialize the AI class and load the trainer file for
    # the text processing engine
    chat = AI()
    chat.load('sevenofninetrainer.txt')

    # Execute a cycle processing all the queued messages
    isNew, msgList = tweet.checkMessages()

    # print("New messages: ", isNew, msgList)

    # If there are messages awaiting answer, proceed answering
    # to them
    if isNew is True:
        for msg in msgList:
            # Extract the sender ID and the message text
            userId = msg['sender_id']
            userText = msg['text']
            # Send the message text to the language processor
            # engine
            response = chat.processDM(userText)
            tweet.sendMessage(userId, response)
            # Chat delay before processing the next message (due to the
            # anti-spam Twitter limitations)
            time.sleep(30)

if __name__ == '__main__':
    main()
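The periodic launch mentioned above could look like the following crontab entry; the script path and the Python interpreter location are assumptions for illustration only:

```shell
# Hypothetical crontab entry: launch the chat handler every 8 minutes
*/8 * * * * /usr/bin/python3 /home/pi/TwitterBorgChat.py
```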

 

The process is very easy to understand, as the complex job is done by the BorgTwitter class. This class includes both the low-level private methods interfacing with the Twitter API calls via the Tweepy library and the high-level methods. The sendMessage() mechanism is really easy, consisting only of the call to send_direct_message() with the destination user and the response text, as shown below.

 

    # Send a DM to the desired recipient ID.
    # The process to manage the identification (name, screen name)
    # of the recipient ID, if needed, should be managed outside of
    # the class. The Twitter API doesn't care about the accessory
    # information and manages DMs based only on the recipient ID
    def sendMessage(self, userId, data):
        event = {
          "event": {
            "type": "message_create",
            "message_create": {
              "target": {
                "recipient_id": userId
              },
              "message_data": {
                "text": data
              }
            }
          }
        }


        # print(event)
        self.twitter.send_direct_message_new(event)

 

The complex mechanism is parsing the value returned by the checkMessages() call. In fact, this API method – I suppose easier than ever for the Twitter server – just returns a list of JSON objects with all the latest chats (for the last 10 users). Every JSON object includes the entire chat history (the last 30 messages), including the sender's own messages. The approach to parse this complex and somewhat confusing result is twofold: first, split the list into its single JSON objects, then check whether any specific conversation has the remote user as the author of its last message. If so, it means that the remote user has replied to the last sentence sent and a new reply is expected.
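As a rough illustration, the twofold parsing just described could be sketched as in the following minimal Python fragment. The conversation structure, field names, and user IDs are assumptions for illustration only, not the actual Tweepy or Twitter payload:

```python
# Hypothetical sketch of the parsing logic: find the conversations whose
# most recent message was written by the remote user, i.e. those awaiting
# a reply from the animatronic.

MY_ID = "7of9"  # the animatronic's own user ID (hypothetical)

def find_pending(conversations):
    """Return (isNew, msgList): the last messages of the conversations
    where the remote user spoke last, so a reply is expected."""
    pending = []
    for chat in conversations:          # one JSON object per user chat
        if not chat["messages"]:
            continue
        last = chat["messages"][-1]     # messages in chronological order
        if last["sender_id"] != MY_ID:  # remote user spoke last
            pending.append(last)
    return len(pending) > 0, pending

# Two fabricated conversations: user1 awaits an answer, user2 does not
convs = [
    {"messages": [{"sender_id": "user1", "text": "Resistance is futile?"}]},
    {"messages": [{"sender_id": "user2", "text": "Hi"},
                  {"sender_id": MY_ID, "text": "We are Borg."}]},
]
is_new, msgs = find_pending(convs)
```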

 

The sources of the class are available in the GitHub repository: https://alicemirror.github.io/mannequin/

 

Chatting With the Borg

Below are some excerpts of the Human-Borg chats created during testing by Dixon Selvan, Dave Ingels, Andy Clark, Fabio Origlia and myself.

Me and Andy

Dave

Dixon

Gecoz

As you can see, there is a flaw in some sentence-response pairs, where the sentence (or group of sentences) is repeated twice. After investigating, I discovered it is a problem of the Twitter API, which updates the list more slowly than the server actually sends the sentences to the chat area; sometimes when the list is read, some messages that were already processed appear to be without a response again. This flaw will be corrected in a future version of my class by adding stricter control of the received list of sentences, taking into account the timestamp, which should be later than the last time a response was sent to a given user.
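A minimal sketch of that planned timestamp control could look like the following; the field names and the per-user bookkeeping are assumptions for illustration, not the actual class implementation:

```python
# Hedged sketch of the planned fix: skip messages whose timestamp is not
# later than the one we last answered for that user, so that stale entries
# re-presented by the slowly-updated Twitter list are not processed twice.

last_answered = {}  # user_id -> timestamp of the last message we replied to

def is_already_processed(msg):
    """True when the message is the same as (or older than) the last one
    we already answered for its sender."""
    return msg["created_timestamp"] <= last_answered.get(msg["sender_id"], 0)

def mark_answered(msg):
    # Record the timestamp of the message we just replied to
    last_answered[msg["sender_id"]] = msg["created_timestamp"]

msg = {"sender_id": "user1", "created_timestamp": 100, "text": "Hello"}
first_time = is_already_processed(msg)   # False: nothing answered yet
mark_answered(msg)
seen_again = is_already_processed(msg)   # True: stale duplicate, skip it
```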

 

Seven Of Nine is happy to chat with those users who follow her Twitter account.

Previous Episodes

Next Episodes

Art-a-Tronic Episode 15