10 Replies Latest reply on Aug 27, 2020 3:38 PM by ralphjy

    Need help debugging GStreamer RTSP pipeline on UltraZed-EV

    ralphjy

      I am using the UZ7EV_EVCC with RTSP streams from IP cameras and an NVR.  I have an issue in the GStreamer setup that I'd like to get help debugging/fixing.

       

      I have IP cameras from two vendors (Amcrest and FDT) and an Amcrest NVR.  The cameras are 1280x720@30FPS and the NVR is 1920x1080@30FPS.

       

      I'm having an issue with a simple pipeline that receives the RTSP stream, passes it through the VCU omxdecoder, and displays it through the DisplayPort interface.

      This is the pipeline for the Amcrest camera which works correctly:

      gst-launch-1.0 -v rtspsrc location="rtsp://admin:adminpw@10.0.0.212:554/cam/realmonitor?channel=1&subtype=0" ! rtph264depay ! h264parse ! omxh264dec ! kmssink bus-id="fd4a0000.zynqmp-display" fullscreen-overlay=true

       

The Amcrest camera displays and updates properly, but for the FDT cameras and the NVR the pipeline rolls and the camera image displays but does not update.  The debug logs are the same between the working and non-working cases.  The pipelines are identical in all cases except for the location string.

       

I've used these cameras' RTSP streams with VLC (ffmpeg) and GStreamer on Ubuntu, and also with Omxplayer on a Raspberry Pi.  There seems to be some sensitivity in the UZ7EV pipeline that's causing the problem, but I'm not sure how to approach debugging it since the pipeline doesn't report a failure.

       

Here's an interesting data point.  I wanted to examine the data stream without resorting to Wireshark, so I dumped it to a file.  If I play that file back, it works, although the framerate seems a bit fast.

      gst-launch-1.0 -v rtspsrc location="rtsp://admin:adminpw@10.0.0.210:554/11" ! rtph264depay ! h264parse ! video/x-h264,stream-format=byte-stream ! filesink location=/media/usb/videos/dump210.h264

      gst-launch-1.0 -v filesrc location=/media/usb/videos/dump210.h264 ! h264parse ! omxh264dec ! kmssink bus-id=fd4a0000.zynqmp-display fullscreen-overlay=true

       

It seems like I have a timing problem somewhere.  I'd appreciate some advice on how to debug it.  I did try an rtpjitterbuffer after the rtspsrc and also tried inserting a queue before the decoder, but I'm just shooting in the dark and neither helped.

       

One distinct difference I noticed is that the working camera reports a framerate of 30/1, while the non-working cases report a variable framerate of 0/1 because, for some reason, the framerate is not received from the camera.  But the file playback also works with a variable framerate, so I don't think that is the issue?

       

      Thanks,

      Ralph

        • Re: Need help debugging GStreamer RTSP pipeline on UltraZed-EV
          drozwood90

          Hi Ralph,

           

          That's an interesting situation.  I will reach out to a few colleagues that might be able to help.

Please give me some time to get an answer (time zones and what not!).

           

          --Dan

            • Re: Need help debugging GStreamer RTSP pipeline on UltraZed-EV
              drozwood90

              Hi there,

               

              Here is the response I received:
              "Please post your gstreamer debug log – specifically want to see what GStreamer is negotiating as CAPS between elements of the pipeline.  If they’re not specified, GStreamer will attempt to determine these automatically from the input stream.  It’s possible that the pipeline that’s getting created is using a default value for something that’s not correct."

               

              --Dan

                • Re: Need help debugging GStreamer RTSP pipeline on UltraZed-EV
                  ralphjy

                  Dan,

                   

                  I've attached a zip of the debug log.  This is a run with the camera that does not update on the display.  I am running with log level = 5 (debug).  Let me know if you want a different log level or some other configuration.

                   

                  Thanks,

                  Ralph

                    • Re: Need help debugging GStreamer RTSP pipeline on UltraZed-EV
                      drozwood90

                      Ralph,

                       

                      Thanks!  I'll take a look and see what we see!

                       

                      --Dan

                        • Re: Need help debugging GStreamer RTSP pipeline on UltraZed-EV
                          drozwood90

                          Hi there,

                           

                          "… I don’t see anything unusual in the log. 

                           

                          A couple of things to try:

                          1. Compare caps negotiation between the working and non-working 720p camera streams.
                          2. Try specifying/forcing the stream framerate on the receive side (vs. allowing variable framerate=0/1) to see if this changes the behavior in processing the failing camera stream.

                           

                          For #1 - Can you capture a CAPS only debug log with both the working Amcrest camera and the FDT camera?  I’d also like to see the log from the RX stream command in the debug log if possible.

                           

If you use GST_DEBUG="*CAPS*:5" before the pipeline, that should simplify the debug log a bit (if it doesn't look like this is providing much information, we can bump it up to level 6 (LOG) or 7 (TRACE) instead of DEBUG).

                           

                          I’d like to see the deltas between the 2 cameras decoded with the same pipeline.

                           

I think when you record RTSP to a file, late frames will still get recorded (in order) and then played back from the file in the right order, so you won't have to deal with late or dropped frames at variable framerates.  Maybe there is an issue with variable-framerate processing in some scenarios where this version of the framework can't recover properly (or is missing the information to do so).

                           

                          "

                           

                          --Dan

                            • Re: Need help debugging GStreamer RTSP pipeline on UltraZed-EV
                              ralphjy

                              Hi Dan,

                               

Here's the first part of 1 - the two CAPS files are attached.  Interesting that the run times are about the same but the working Amcrest camera produces a much larger log file.  Are you using a special viewer to analyze these files?  I've seen references to gst-debug-viewer but haven't looked into it.

                               

You'll need to help me with the second part of 1: can you give me an example of using the RX stream command?  Sorry if that's obvious.

                               

For 2 - I tried setting the framerate at the rtspsrc, and I recall seeing the framerate propagate through the pipeline (just looking at the verbose console output); it was okay (30/1) until it went through the h264parse element, where it changed to variable (0/1).  I'll need to look back at my notes, reproduce that, and get you a log.  That didn't fix the problem, and I was unsure whether I was doing it correctly.  Sorry, I can't seem to reproduce this, so I may be remembering incorrectly.  Do you know the correct way to set this?  What I had tried is adding caps after the rtspsrc location, i.e.:

rtspsrc location="rtsp://admin:adminpw@10.0.0.210:554/11" caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, framerate=(fraction)30/1"

                               

                              Ralph

                                • Re: Need help debugging GStreamer RTSP pipeline on UltraZed-EV
                                  drozwood90

                                  Hi there,

                                   

                                  Here is a copy of the reply from my specialist:

                                   

                                  Without having a similar rtp stream to test with I’m sort of guessing here at next steps – some of these suggestions are band-aids and don’t identify the root cause, but might assist more in debugging the issue.

                                   

1. The syntax looks correct for specifying caps for the rtspsrc stream; I'm not sure whether there is a specific error message when what the customer posted is attempted.
                                  2. You could try specifying the framerate for the output of the decoder and see if that propagates back through the pipeline negotiation (i.e. … ! omxh264dec ! video/x-raw,format=NV12,width=1280,height=720,framerate=30/1 ! … )
                                  3. Attempt to use the videoconvert/videorate elements (after the decoding) to see if this can compensate for the framerate just to get a stream actually running (i.e. … ! omxh264dec ! queue ! videoconvert ! videorate ! … )
                                  4. One thing I noticed in the stream negotiation is that the FDT Camera is using the  Constrained Baseline Profile … as far as I know, all of the VCU TRD testing is done using Baseline, High and Main profiles.  Although the VCU is supposed to support CBP (See Table 20 - https://www.xilinx.com/support/documentation/ip_documentation/vcu/v1_2/pg252-vcu.pdf#page=42) this might be something worth exploring – Is it possible to configure the FDT camera for a different H.264 profile, just to see if you’re able to decode a stream at a different profile level?

                                   

                                  The other thought is I’m close to having a v2020.1 VCU design up and running on the UltraZED-EV … once I have binaries for this it would be worth trying to see if there is something (software-wise) that may have been fixed that addresses this issue (if it’s related to an existing issue).

                                   

                                  Here are select excerpts from the debug logs just highlighting the profile differences and that the variable framerate propagates through the decode on the FDT camera pipeline.

                                   

                                  FDT Camera:

                                  0:00:00.897024892 3494   0x7fa803fe30 DEBUG GST_CAPS gstpad.c:3183:gst_pad_query_caps_default:<manager:recv_rtp_sink_0> query caps caps query: 0x7f900038a0, GstQueryCaps, filter=(GstCaps)"application/x-rtp\,\ media\=\(string\)video\,\ payload\=\(int\)96\,\ clock-rate\=\(int\)90000\,\ encoding-name\=\(string\)H264\,\ packetization-mode\=\(string\)1\,\ profile-level-id\=\(string\)4D001F\,\ sprop-parameter-sets\=\(string\)\"Z00AH5WoFAFuQA\\\=\\\=\\\,aO48gA\\\=\\\=\"\,\ a-framesize\=\(string\)1280-720\,\ ssrc\=\(uint\)1897102620\,\ npt-start\=\(guint64\)0\,\ play-speed\=\(double\)1\,\ play-scale\=\(double\)1", caps=(GstCaps)"NULL";

                                  ...

                                  0:00:03.030127826 3494   0x7f900040f0 DEBUG GST_CAPS gstutils.c:3141:gst_pad_query_accept_caps:<omxh264dec-omxh264dec0:sink> accept caps of video/x-h264, stream-format=(string)byte-stream, alignment=(string)au, width=(int)1280, height=(int)720, framerate=(fraction)0/1, interlace-mode=(string)progressive, chroma-format=(string)4:2:0, bit-depth-luma=(uint)8, bit-depth-chroma=(uint)8, colorimetry=(string)1:3:5:1, parsed=(boolean)true, profile=(string)main, level=(string)3.1

                                  ...

                                  0:00:03.070738383 3494   0x7fa803fd40 DEBUG GST_CAPS gstpad.c:2733:gst_pad_get_current_caps:<omxh264dec-omxh264dec0:src> get current pad caps video/x-raw(memory:GLMemory), format=(string)RGBA, width=(int)1280, height=(int)720, interlace-mode=(string)progressive, multiview-mode=(string)mono, multiview-flags=(GstVideoMultiviewFlagsSet)0:ffffffff:/right-view-first/left-flipped/left-flopped/right-flipped/right-flopped/half-aspect/mixed-mono, pixel-aspect-ratio=(fraction)1/1, colorimetry=(string)1:1:5:1, framerate=(fraction)0/1

                                  0:00:03.070873324 3494   0x7fa803fd40 DEBUG GST_CAPS gstutils.c:3141:gst_pad_query_accept_caps:<kmssink0:sink> accept caps of video/x-raw, format=(string)NV12, width=(int)1280, height=(int)720, interlace-mode=(string)progressive, multiview-mode=(string)mono, multiview-flags=(GstVideoMultiviewFlagsSet)0:ffffffff:/right-view-first/left-flipped/left-flopped/right-flipped/right-flopped/half-aspect/mixed-mono, pixel-aspect-ratio=(fraction)1/1, chroma-site=(string)mpeg2, colorimetry=(string)1:3:5:1, framerate=(fraction)0/1

                                   

                                   

                                  Amcrest Camera:

                                  0:00:00.363496555 3568   0x7f80041630 DEBUG GST_CAPS gstpad.c:3183:gst_pad_query_caps_default:<manager:recv_rtp_sink_0> query caps caps query: 0x7f600030a0, GstQueryCaps, filter=(GstCaps)"application/x-rtp\,\ media\=\(string\)video\,\ payload\=\(int\)96\,\ clock-rate\=\(int\)90000\,\ encoding-name\=\(string\)H264\,\ packetization-mode\=\(string\)1\,\ profile-level-id\=\(string\)64001F\,\ sprop-parameter-sets\=\(string\)\"Z2QAH6w0yAUAW///Ad0B3G4CAgKAAAH0AAB1MHQwAMN4AAw3hd5caGABhvAAGG8LvLhQAA\\\=\\\=\\\,aO48MAA\\\=\"\,\ a-packetization-supported\=\(string\)DH\,\ a-rtppayload-supported\=\(string\)DH\,\ a-framerate\=\(string\)30.000000\,\ a-recvonly\=\(string\)\"\"\,\ ssrc\=\(uint\)1840137360\,\ clock-base\=\(uint\)1366808956\,\ seqnum-base\=\(uint\)6851\,\ npt-start\=\(guint64\)0\,\ play-speed\=\(double\)1\,\ play-scale\=\(double\)1", caps=(GstCaps)"NULL";

                                  ...

                                  0:00:02.492621860 3568   0x7f680045e0 DEBUG GST_CAPS gstutils.c:3141:gst_pad_query_accept_caps:<omxh264dec-omxh264dec0:sink> accept caps of video/x-h264, stream-format=(string)byte-stream, alignment=(string)au, pixel-aspect-ratio=(fraction)477/476, width=(int)1280, height=(int)720, framerate=(fraction)30/1, interlace-mode=(string)progressive, chroma-format=(string)4:2:0, bit-depth-luma=(uint)8, bit-depth-chroma=(uint)8, colorimetry=(string)1:3:5:1, parsed=(boolean)true, profile=(string)high, level=(string)3.1

                                  ...

                                  0:00:02.535875822 3568   0x7f80041540 DEBUG GST_CAPS gstpad.c:2733:gst_pad_get_current_caps:<omxh264dec-omxh264dec0:src> get current pad caps video/x-raw(memory:GLMemory), format=(string)RGBA, width=(int)1280, height=(int)720, interlace-mode=(string)progressive, multiview-mode=(string)mono, multiview-flags=(GstVideoMultiviewFlagsSet)0:ffffffff:/right-view-first/left-flipped/left-flopped/right-flipped/right-flopped/half-aspect/mixed-mono, pixel-aspect-ratio=(fraction)477/476, colorimetry=(string)1:1:5:1, framerate=(fraction)30/1

                                  0:00:02.535983404 3568   0x7f80041540 DEBUG GST_CAPS gstutils.c:3141:gst_pad_query_accept_caps:<kmssink0:sink> accept caps of video/x-raw, format=(string)NV12, width=(int)1280, height=(int)720, interlace-mode=(string)progressive, multiview-mode=(string)mono, multiview-flags=(GstVideoMultiviewFlagsSet)0:ffffffff:/right-view-first/left-flipped/left-flopped/right-flipped/right-flopped/half-aspect/mixed-mono, pixel-aspect-ratio=(fraction)477/476, chroma-site=(string)mpeg2, colorimetry=(string)1:3:5:1, framerate=(fraction)30/1

                                   

                                   

                                  RFC6184 shows the profile ID decoding – notice that the FDT Camera is using the Constrained Baseline H.264 Profile, where the Amcrest Camera uses the High Profile.

                                  https://tools.ietf.org/pdf/rfc6184.pdf#page=41

                                   

                                  He then went on to say:

                                   

                                  Just another note, you could also do #2 on the output side of the h264 parser:

                                   

! rtph264depay ! h264parse ! capsfilter caps="video/x-h264,width=1280,height=720,framerate=(fraction)30/1" ! omxh264dec ! ...
