I am using the UZ7EV_EVCC with RTSP streams from IP cameras and an NVR. I have an issue in the GStreamer setup that I'd like to get help debugging/fixing.
I have IP cameras from two vendors (Amcrest and FDT) and an Amcrest NVR. The cameras are 1280x720@30FPS and the NVR is 1920x1080@30FPS.
I'm having an issue with a simple pipeline that receives the RTSP stream, passes it through the VCU OMX decoder (omxh264dec), and displays it through the DisplayPort interface.
This is the pipeline for the Amcrest camera which works correctly:
gst-launch-1.0 -v rtspsrc location="rtsp://admin:email@example.com:554/cam/realmonitor?channel=1&subtype=0" ! rtph264depay ! h264parse ! omxh264dec ! kmssink bus-id="fd4a0000.zynqmp-display" fullscreen-overlay=true
The Amcrest camera displays and updates properly, but with the FDT cameras and the NVR the pipeline rolls and the camera image displays but does not update. The debug logs are the same between the working and non-working cases, and the pipelines are identical except for the location string.
I've used these camera RTSP streams with VLC (ffmpeg) and GStreamer on Ubuntu, and with Omxplayer on the Raspberry Pi, without problems. There seems to be some sensitivity in the UZ7EV pipeline that's causing this, but I'm not sure how to approach debugging it since the pipeline doesn't report a failure.
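For completeness, this is the kind of debug run I've been capturing (the category/level choices are just my guesses at what's relevant):

```shell
# Raise log verbosity (0-9) for the RTP/RTSP elements only; everything
# else stays at the default level. Logs go to stderr, so redirect them.
GST_DEBUG=rtspsrc:6,rtpjitterbuffer:6,rtpsession:6 \
gst-launch-1.0 -v rtspsrc location="rtsp://admin:firstname.lastname@example.org:554/11" \
  ! rtph264depay ! h264parse ! omxh264dec \
  ! kmssink bus-id="fd4a0000.zynqmp-display" fullscreen-overlay=true \
  2> /tmp/gst-debug.log
```

Even at these levels I don't see an obvious difference between the working and non-working cameras.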
Here's an interesting data point. I wanted to examine the data stream without having to resort to Wireshark, so I dumped it to a file. If I play that file back, it works, although the framerate seems a bit fast (which may just be because the raw byte-stream dump carries no container timing, so playback isn't paced by the original framerate).
gst-launch-1.0 -v rtspsrc location="rtsp://admin:firstname.lastname@example.org:554/11" ! rtph264depay ! h264parse ! video/x-h264,stream-format=byte-stream ! filesink location=/media/usb/videos/dump210.h264
gst-launch-1.0 -v filesrc location=/media/usb/videos/dump210.h264 ! h264parse ! omxh264dec ! kmssink bus-id=fd4a0000.zynqmp-display fullscreen-overlay=true
It seems like I have a timing problem somewhere, and I'd appreciate some advice on how to debug it. I did try an rtpjitterbuffer after the rtspsrc and also tried inserting a queue before the decoder, but I'm shooting in the dark and neither helped.
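For reference, this is the rtpjitterbuffer/queue variant I tried (the latency value was an arbitrary pick on my part):

```shell
# Explicit jitter buffer between the RTSP source and the depayloader;
# latency is in milliseconds (element default is 200).
gst-launch-1.0 -v rtspsrc location="rtsp://admin:firstname.lastname@example.org:554/11" \
  ! rtpjitterbuffer latency=500 \
  ! rtph264depay ! h264parse ! queue ! omxh264dec \
  ! kmssink bus-id="fd4a0000.zynqmp-display" fullscreen-overlay=true
```

This behaves exactly like the plain pipeline: the first frame displays but never updates.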
One distinct difference I noticed: for the working camera the negotiated caps report a framerate of 30/1, while in the non-working cases they report 0/1 (variable), apparently because those cameras don't signal a framerate in the stream. But the file playback also works with a variable framerate, so I'm not sure that's the issue.
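In case the missing framerate is confusing the sink's clock handling, two things I'm considering trying next are disabling sync on the sink and forcing a fixed framerate into the caps (both are guesses on my part, not confirmed fixes):

```shell
# Variant 1: render frames as they arrive instead of waiting on timestamps.
gst-launch-1.0 -v rtspsrc location="rtsp://admin:firstname.lastname@example.org:554/11" \
  ! rtph264depay ! h264parse ! omxh264dec \
  ! kmssink bus-id="fd4a0000.zynqmp-display" fullscreen-overlay=true sync=false

# Variant 2: override the 0/1 framerate before the decoder.
gst-launch-1.0 -v rtspsrc location="rtsp://admin:firstname.lastname@example.org:554/11" \
  ! rtph264depay ! h264parse \
  ! capssetter caps="video/x-h264,framerate=30/1" \
  ! omxh264dec ! kmssink bus-id="fd4a0000.zynqmp-display" fullscreen-overlay=true
```

Does either of those seem like a sensible direction, or is there a better way to narrow down where the timing breaks?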