I have a pipeline coded in C++ that looks like this:
appsrc do-timestamp=TRUE is-live=TRUE caps=
"video/x-h264, stream-format=(string)byte-stream, alignment=(string)none, framerate=(fraction)0/1" min-latency=300000000 ! h264parse ! video/x-h264, stream-format=(string)avc, alignment=(string)au ! tee name=t \
t. ! queue ! valve drop=FALSE ! decodebin ! glupload ! glcolorconvert ! qtsink sync=FALSE \
t. ! queue ! valve drop=FALSE ! mp4mux reserved-max-duration=3600000000000 reserved-moov-update-period=10000000000 ! filesink sync=FALSE location="....../out.mp4"
appsrc injects the video coming from a drone’s USB wireless video receiver into the pipeline.
Some more context:
- The USB receiver hardware gives us 512-byte chunks of non-timestamped raw Annex-B h.264 video
- The framerate should be 60 fps, but in practice the receiver rarely keeps up, and the actual rate varies with signal strength (hence framerate=(fraction)0/1, and also the reason neither qtsink nor filesink are synced to the pipeline clock (sync=FALSE))
- The hardware introduces a minimum of 300 ms of latency, as set in appsrc
- appsrc is automatically timestamping my buffers (do-timestamp=TRUE)
- I’m using mp4mux reserved-max-duration and reserved-moov-update-period to prevent app crashes from breaking the mp4 files
- I’m using GStreamer 1.18.4 for Android
Video recording works fine while the drone is on the ground. But once it takes off, after about 15 seconds of correct recording the mp4mux element fails with the message "Buffer has no PTS". Several users have reported this consistently, but I can't reproduce it myself, as it requires flying a drone that I don't have, which doesn't make a lot of sense to me. My best guess so far is that at that particular moment the wireless video link gets congested and some video packets are held up for a few milliseconds, and that this somehow produces the problem.
Here's the (simplified) code that creates appsrc:

    _pAppSrc = gst_element_factory_make("appsrc", "artosyn_source");
    gpointer pAppSrc = static_cast<gpointer>(_pAppSrc);

    // Retain an extra ref, so the source is only destroyed
    // when we decide to release it
    gst_object_ref(_pAppSrc);

    GstCaps *pCaps = gst_caps_from_string("video/x-h264, stream-format=(string)byte-stream, alignment=(string)none, framerate=(fraction)0/1");
    g_object_set(G_OBJECT(pAppSrc), "caps", pCaps,
                 "is-live", TRUE,
                 "min-latency", G_GINT64_CONSTANT(300000000),
                 "format", GST_FORMAT_TIME,
                 "do-timestamp", TRUE,
                 nullptr);

    _pBufferPool = gst_buffer_pool_new();
    GstStructure *pConfig = gst_buffer_pool_get_config(_pBufferPool);

    static const guint kBufferSize = 512;
    static const guint kPoolSize = 0x400000;
    static const guint kPoolSizeMax = 0x600000;
    qsizetype nBuffersMin = kPoolSize / kBufferSize;
    qsizetype nBuffersMax = kPoolSizeMax / kBufferSize;

    gst_buffer_pool_config_set_params(pConfig, pCaps, kBufferSize, nBuffersMin, nBuffersMax);
    gst_buffer_pool_set_config(_pBufferPool, pConfig);
    gst_buffer_pool_set_active(_pBufferPool, TRUE);
    gst_caps_unref(pCaps);
When a new buffer is filled up by the USB driver, it’s pushed into the pipeline like this:
    bool unref = true; // default to releasing our ref; cleared only when the pipeline takes ownership
    gst_buffer_unmap(b->pBuffer, &b->mapInfo);
    gst_buffer_set_size(b->pBuffer, xfer.pXfer->actual_length);

    if(result == LIBUSB_TRANSFER_COMPLETED)
    {
        //-- DROP DATA IF NOT IN PLAYING STATE --
        GstState st, pend;
        GstStateChangeReturn scr = gst_element_get_state(GST_ELEMENT(_pAppSrc), &st, &pend, GST_CLOCK_TIME_NONE);
        Q_UNUSED(scr)
        bool drop = (st != GST_STATE_PLAYING);
        if(!drop)
        {
            // Push into pipeline
            GstFlowReturn ret = gst_app_src_push_buffer(GST_APP_SRC(_pAppSrc), b->pBuffer);
            if(ret != GST_FLOW_OK)
                qCDebug(MMCVideoLog()) << "Can't push buffer to the pipeline (" << ret << ")";
            else
                unref = false; // Don't unref: gst_app_src_push_buffer() steals the reference and takes ownership
        }
    }
    else if(result == LIBUSB_TRANSFER_CANCELLED)
    {
        qCDebug(MMCVideoLog()) << "! Buffer canceled";
    }
    else
    {
        qCDebug(MMCVideoLog()) << "? Buffer result = " << result;
    }

    if(unref)
        gst_buffer_unref(b->pBuffer);
This is what I got from Android logcat from an affected machine:
[07-22 18:37:45.753 17414:18734 E/QGroundControl]
VideoReceiverLog: GStreamer error: [element ' "mp4mux0" '] Could not multiplex stream.
[07-22 18:37:45.753 17414:18734 E/QGroundControl]
VideoReceiverLog: Details: ../gst/isomp4/gstqtmux.c(5010): gst_qt_mux_add_buffer (): /GstPipeline:receiver/GstBin:sinkbin/GstMP4Mux:mp4mux0:
Buffer has no PTS.
What I’ve tried:
- Setting GstBaseParser pts_interpolation to TRUE, and infer_ts to TRUE
So my questions are:
- Can you see anything wrong with my code? What am I missing?
- Can I rely on matroskamux to avoid the issue temporarily until I find the true cause?
Edit: I managed to reproduce it "in situ" while printing out the PTS and DTS of every buffer, using a probe attached to the sink pad of my tee element, and found that the problem buffer has neither a PTS nor a DTS. Perhaps my h264parse or my capsfilter is doing something nasty in between my appsrc and my tee?
07-28 17:54:49.025 1932 2047 D : PTS: 295659241497 DTS: 295659241497
07-28 17:54:49.026 1932 2047 D : PTS: 295682488791 DTS: 295682488791
07-28 17:54:49.053 1932 2047 D : PTS: 295710463127 DTS: 295710463127
07-28 17:54:49.054 1932 2047 D : PTS: 18446744073709551615 DTS: 18446744073709551615
07-28 17:54:49.054 1932 2047 E : ************** NO PTS
07-28 17:54:49.054 1932 2047 E : ************** NO DTS
07-28 17:54:49.110 1932 2047 D : PTS: 295738607214 DTS: 295738607214
07-28 17:54:49.111 1932 2199 E : GStreamer error: [element ' "mp4mux1" '] Could not multiplex stream.
07-28 17:54:49.111 1932 2199 E : Details: ../gst/isomp4/gstqtmux.c(5010): gst_qt_mux_add_buffer (): /GstPipeline:receiver/GstBin:sinkbin/GstMP4Mux:mp4mux1:
07-28 17:54:49.111 1932 2199 E : Buffer has no PTS.