2 Replies Latest reply on Sep 7, 2019 7:59 PM by clem57

    Transfer Image from Raspberry Pi (Python) to Android app (Java)


      Hi everyone! I need help transferring an image via TCP from a Python program on my Raspberry Pi to an Android application.


      I have set up a client-server architecture in which my Raspberry Pi 3 records audio, performs some analysis on it, and then sends the results (via TCP) to the Android app to display on screen. The recording and analysis are done, and I am able to make the connection and transfer string data that displays in the app with no problem. However, I have been unsuccessful in transferring an image from the Pi to the app. So basically, the image is stored on the Pi and I am attempting to transfer it to the app to display it. I have been working on this for over a week with no luck, so any help would be greatly appreciated!


      My current implementation (code snippets provided below):

      On the Raspberry Pi (Python): Like I said, sending strings and displaying them on the Android app works without any problem. When I am sending the image portion of the audio analysis, I first send the string "?start" so that the Android side knows an image, rather than a string, is about to be sent (and will wait to update the GUI until it has received the entire image). Then I open the image stored on the Pi and read the whole file into a byte array (typically about 40-50k bytes). I get the length of the byte array and send that as a string to the app. Finally, I send the byte array itself, and the Pi waits for an OK message from the app. All of this runs without reporting any errors.
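      For clarity, the sequence I just described would look roughly like this in Python (send_image is just an illustrative helper name, and conn is assumed to be the connected TCP socket; this mirrors the protocol, not my exact code):

```python
def send_image(conn, image_path):
    """Illustrative sketch of the sending sequence described above.

    Sends a "?start" marker line, then the payload length as a text line,
    then the raw image bytes, and finally blocks waiting for the app's OK.
    """
    with open(image_path, 'rb') as image_file:
        content = image_file.read()
    conn.sendall(b"?start\n")                          # marker: an image follows
    conn.sendall((str(len(content)) + "\n").encode())  # length as a text line
    conn.sendall(content)                              # the raw image bytes
    return conn.recv(2)                                # wait for the app's "OK"
```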


      On the Android app (Java): When the app receives the "?start" string, it uses a BufferedReader (the same reader I used earlier to receive string data successfully) to read the size of the image byte array. Then I create two buffers, msg_buff and img_buff: msg_buff reads in up to 1024 bytes at a time, while img_buff accumulates the entire byte array of the image. In an infinite while loop, a DataInputStream called in reads bytes into msg_buff and returns the number of bytes read; I then copy the contents of msg_buff into img_buff. The loop breaks once in.read() returns -1, or once img_offset (the total number of bytes read so far) is greater than or equal to the size of the image byte array. After that, I attempt to save the image to Android internal storage and load it later into an ImageView to display it. This code successfully reads bytes until there are around 2000-3000 bytes left, and then it seems to freeze on the line int bytes_read = in.read(msg_buff, 0, msg_buff.length). I have never gotten past that point, so I do not know whether saving the image to internal storage and then loading it into the ImageView will work either.
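      In other words, the receive loop is doing the equivalent of this (a sketch written in Python for brevity; recv_exact is just an illustrative name, and my actual app code is the Java below):

```python
def recv_exact(stream, size, chunk_size=1024):
    """Read up to `size` bytes, mirroring the msg_buff/img_buff loop above."""
    img_buff = bytearray(size)
    img_offset = 0
    while img_offset < size:
        # read at most chunk_size bytes, but never past the end of img_buff
        chunk = stream.read(min(chunk_size, size - img_offset))
        if not chunk:  # stream closed before the full image arrived
            break
        img_buff[img_offset:img_offset + len(chunk)] = chunk
        img_offset += len(chunk)
    return bytes(img_buff[:img_offset])
```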

      I have also tried base64 encoding/decoding, but that also kept producing errors. I have tried having the Pi send only 1024 bytes of the image at a time, but that did not work either. I have tried several implementations of this approach, but nothing has worked so far. If anyone sees anything wrong or has another approach, I am all ears!


      Android Studio (app side):

      //receives the message which the server sends back
      InputStream sin = socket.getInputStream();
      OutputStream sout = socket.getOutputStream();
      DataInputStream in = new DataInputStream(sin);
      mBufferIn = new BufferedReader(new InputStreamReader(socket.getInputStream()));

      //in this while the client listens for the messages sent by the server
      while (mRun) {
           mServerMessage = mBufferIn.readLine();
           if (mServerMessage != null && mMessageListener != null) {
                //Check if data is image
                if (mServerMessage.equals("?start")) {
                     // Get length of image byte array
                     int size = Integer.parseInt(mBufferIn.readLine());
                     // Create buffers
                     byte[] msg_buff = new byte[1024];
                     byte[] img_buff = new byte[size];
                     int img_offset = 0;
                     while (true) {
                          int bytes_read = in.read(msg_buff, 0, msg_buff.length);
                          if (bytes_read == -1) { break; }

                          //copy bytes into img_buff
                          System.arraycopy(msg_buff, 0, img_buff, img_offset, bytes_read);
                          img_offset += bytes_read;
                          if (img_offset >= size) { break; }
                     }
                     try {
                          ContextWrapper cw = new ContextWrapper(ApplicationContextProvider.getContext());
                          File directory = cw.getDir("imageDir", Context.MODE_PRIVATE);
                          File mypath = new File(directory, "signal.jpeg");
                          Bitmap bitmap = BitmapFactory.decodeByteArray(img_buff, 0, img_buff.length);
                          FileOutputStream fos = new FileOutputStream(mypath);
                          //Use compress method on Bitmap object to write image to OutputStream
                          bitmap.compress(Bitmap.CompressFormat.JPEG, 100, fos);
                          fos.close();
                          //Send OK byte[]
                          byte[] OK = new byte[] {0x4F, 0x4B};
                          sout.write(OK);
                     } catch (Exception e) {
                          e.printStackTrace();
                     }
                }
           }
      }

      Raspberry Pi (Python):

      def image_to_byte_array(image_file, conn):
           with open(image_file, 'rb') as imageFile:
                content = imageFile.read()
                size = len(content)
                strSize = str(size) + "\n"

      Note that conn is the TCP connection between the app and the Pi, and the images are PNG.

      If anyone knows why this isn't working or has a better way for me to do this, I would greatly appreciate it!! Thank you in advance!!