Title: [286369] trunk
Revision: 286369
Author: ph...@webkit.org
Date: 2021-12-01 10:36:08 -0800 (Wed, 01 Dec 2021)

Log Message

[GStreamer] requestVideoFrameCallback support
https://bugs.webkit.org/show_bug.cgi?id=233541

Reviewed by Xabier Rodriguez-Calvar.

Source/WebCore:

Video frame metadata reported by the player is stored in GstBuffers as a new GstMeta.
Processing time is tracked in converters and decoders using pad probes. The GStreamer
mediastream video capturer now inserts metadata, including the capture timestamp, in
the buffers. The WebRTC incoming and mock sources wrap the metadata in the
MediaSamples they create.
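The mechanism described above can be sketched independently of GStreamer as a per-buffer metadata record: an optional capture timestamp set by the capturer, plus per-element (start, stop) processing intervals recorded by pad probes. The names below (`BufferMeta`, `processingDurationSeconds`) are illustrative stand-ins, not the actual WebKit types.

```cpp
#include <cstdint>
#include <map>
#include <optional>
#include <string>
#include <utility>

// Illustrative stand-in for the GstMeta attached to each buffer: an optional
// capture timestamp (set by the capturer) and, per element name, the
// (start, stop) processing timestamps recorded by the sink/src pad probes.
struct BufferMeta {
    std::optional<double> captureTime; // seconds, set by the capturer
    std::map<std::string, std::pair<int64_t, int64_t>> processingTimes; // nanoseconds
};

// Mirrors how the aggregate processing duration is computed when the metadata
// is read back: the sum of (stop - start) over all traced elements.
double processingDurationSeconds(const BufferMeta& meta)
{
    int64_t totalNs = 0;
    for (const auto& entry : meta.processingTimes)
        totalNs += entry.second.second - entry.second.first;
    return totalNs / 1e9;
}
```

For example, a converter taking 2 ms and a decoder taking 5 ms on the same buffer yield a 7 ms total processing duration.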

* platform/GStreamer.cmake:
* platform/VideoFrameMetadata.h:
* platform/graphics/gstreamer/GStreamerCommon.h:
(WebCore::fromGstClockTime):
* platform/graphics/gstreamer/MediaPlayerPrivateGStreamer.cpp:
(WebCore::MediaPlayerPrivateGStreamer::createGSTPlayBin):
(WebCore::MediaPlayerPrivateGStreamer::configureVideoDecoder):
(WebCore::MediaPlayerPrivateGStreamer::pushTextureToCompositor):
(WebCore::MediaPlayerPrivateGStreamer::flushCurrentBuffer):
(WebCore::MediaPlayerPrivateGStreamer::videoFrameMetadata):
* platform/graphics/gstreamer/MediaPlayerPrivateGStreamer.h:
* platform/graphics/gstreamer/MediaSampleGStreamer.cpp:
(WebCore::MediaSampleGStreamer::MediaSampleGStreamer):
(WebCore::MediaSampleGStreamer::createImageSample):
* platform/graphics/gstreamer/MediaSampleGStreamer.h:
(WebCore::MediaSampleGStreamer::create):
(WebCore::MediaSampleGStreamer::createImageSample):
(WebCore::MediaSampleGStreamer::MediaSampleGStreamer):
* platform/graphics/gstreamer/VideoFrameMetadataGStreamer.cpp: Added.
(videoFrameMetadataAPIGetType):
(videoFrameMetadataGetInfo):
(webkitGstBufferSetVideoSampleMetadata):
(webkitGstTraceProcessingTimeForElement):
(webkitGstBufferGetVideoFrameMetadata):
* platform/graphics/gstreamer/VideoFrameMetadataGStreamer.h: Added.
* platform/mediastream/gstreamer/GStreamerCapturer.cpp:
(WebCore::GStreamerCapturer::createSource):
* platform/mediastream/gstreamer/GStreamerMediaStreamSource.cpp:
(mediaStreamTrackPrivateGetTags):
* platform/mediastream/gstreamer/MockRealtimeVideoSourceGStreamer.cpp:
(WebCore::MockRealtimeVideoSourceGStreamer::updateSampleBuffer):
* platform/mediastream/libwebrtc/gstreamer/RealtimeIncomingVideoSourceLibWebRTC.cpp:
(WebCore::RealtimeIncomingVideoSourceLibWebRTC::OnFrame):
* platform/mediastream/libwebrtc/gstreamer/RealtimeIncomingVideoSourceLibWebRTC.h:

Source/WTF:

* Scripts/Preferences/WebPreferencesExperimental.yaml: Enable rvfc support in GStreamer ports.

LayoutTests:

* platform/glib/TestExpectations: Update rvfc test expectations. WebRTC-related tests fail
mostly because WebRTC encoding/decoding is currently totally broken in WPE/GTK. XR tests
will be handled in a separate patch.

Diff

Modified: trunk/LayoutTests/ChangeLog (286368 => 286369)


--- trunk/LayoutTests/ChangeLog	2021-12-01 17:50:30 UTC (rev 286368)
+++ trunk/LayoutTests/ChangeLog	2021-12-01 18:36:08 UTC (rev 286369)
@@ -1,3 +1,14 @@
+2021-12-01  Philippe Normand  <pnorm...@igalia.com>
+
+        [GStreamer] requestVideoFrameCallback support
+        https://bugs.webkit.org/show_bug.cgi?id=233541
+
+        Reviewed by Xabier Rodriguez-Calvar.
+
+        * platform/glib/TestExpectations: Update rvfc test expectations. WebRTC-related tests fail
+        mostly because WebRTC encoding/decoding is currently totally broken in WPE/GTK. XR tests
+        shall be handled in a separate patch.
+
 2021-12-01  Patrick Griffis  <pgrif...@igalia.com>
 
         CSP: Update URL stripping in reports to match other implementations

Modified: trunk/LayoutTests/platform/glib/TestExpectations (286368 => 286369)


--- trunk/LayoutTests/platform/glib/TestExpectations	2021-12-01 17:50:30 UTC (rev 286368)
+++ trunk/LayoutTests/platform/glib/TestExpectations	2021-12-01 18:36:08 UTC (rev 286369)
@@ -775,8 +775,12 @@
 webkit.org/b/231811 webaudio/audiocontext-state-interrupted.html [ Timeout ]
 webkit.org/b/231811 webaudio/suspend-context-while-interrupted.html [ Timeout ]
 
-# Also Timing out on the bots
-webkit.org/b/232629 http/tests/media/media-blocked-by-willsendrequest.html [ Crash Timeout ]
+imported/w3c/web-platform-tests/video-rvfc [ Pass ]
+fast/mediastream/getUserMedia-rvfc.html [ Pass Failure ]
+webrtc/peerConnection-rvfc.html [ Failure ]
+imported/w3c/web-platform-tests/video-rvfc/request-video-frame-callback-webrtc.https.html [ Skip ]
+imported/w3c/web-platform-tests/video-rvfc/request-video-frame-callback-before-xr-session.https.html [ Skip ]
+imported/w3c/web-platform-tests/video-rvfc/request-video-frame-callback-during-xr-session.https.html [ Skip ]
 
 #////////////////////////////////////////////////////////////////////////////////////////
 # End of GStreamer-related bugs

Modified: trunk/Source/WTF/ChangeLog (286368 => 286369)


--- trunk/Source/WTF/ChangeLog	2021-12-01 17:50:30 UTC (rev 286368)
+++ trunk/Source/WTF/ChangeLog	2021-12-01 18:36:08 UTC (rev 286369)
@@ -1,3 +1,12 @@
+2021-12-01  Philippe Normand  <pnorm...@igalia.com>
+
+        [GStreamer] requestVideoFrameCallback support
+        https://bugs.webkit.org/show_bug.cgi?id=233541
+
+        Reviewed by Xabier Rodriguez-Calvar.
+
+        * Scripts/Preferences/WebPreferencesExperimental.yaml: Enable rvfc support in GStreamer ports.
+
 2021-12-01  Martin Robinson  <mrobin...@webkit.org>
 
         Add a runtime flag to enable CSS Transforms Level 2 spec compliant behavior

Modified: trunk/Source/WTF/Scripts/Preferences/WebPreferencesExperimental.yaml (286368 => 286369)


--- trunk/Source/WTF/Scripts/Preferences/WebPreferencesExperimental.yaml	2021-12-01 17:50:30 UTC (rev 286368)
+++ trunk/Source/WTF/Scripts/Preferences/WebPreferencesExperimental.yaml	2021-12-01 18:36:08 UTC (rev 286369)
@@ -1186,9 +1186,11 @@
       default: false
     WebKit:
       "PLATFORM(COCOA) && HAVE(AVSAMPLEBUFFERVIDEOOUTPUT)" : true
+      "USE(GSTREAMER)": true
       default: false
     WebCore:
       "PLATFORM(COCOA) && HAVE(AVSAMPLEBUFFERVIDEOOUTPUT)" : true
+      "USE(GSTREAMER)": true
       default: false
 
 # FIXME: This is on by default in WebKit2. Perhaps we should consider turning it on for WebKitLegacy as well.

Modified: trunk/Source/WebCore/ChangeLog (286368 => 286369)


--- trunk/Source/WebCore/ChangeLog	2021-12-01 17:50:30 UTC (rev 286368)
+++ trunk/Source/WebCore/ChangeLog	2021-12-01 18:36:08 UTC (rev 286369)
@@ -1,3 +1,51 @@
+2021-12-01  Philippe Normand  <pnorm...@igalia.com>
+
+        [GStreamer] requestVideoFrameCallback support
+        https://bugs.webkit.org/show_bug.cgi?id=233541
+
+        Reviewed by Xabier Rodriguez-Calvar.
+
+        Video frames metadata reported by the player is stored in GstBuffers as a new GstMeta.
+        Processing times is tracked in converters and decoders using pad probes. The GStreamer
+        mediastream video capturer is now inserting metadata that includes the capture timestamp in
+        the buffers. The WebRTC incoming and mock sources are wrapping the metadata in the
+        MediaSamples they create.
+
+        * platform/GStreamer.cmake:
+        * platform/VideoFrameMetadata.h:
+        * platform/graphics/gstreamer/GStreamerCommon.h:
+        (WebCore::fromGstClockTime):
+        * platform/graphics/gstreamer/MediaPlayerPrivateGStreamer.cpp:
+        (WebCore::MediaPlayerPrivateGStreamer::createGSTPlayBin):
+        (WebCore::MediaPlayerPrivateGStreamer::configureVideoDecoder):
+        (WebCore::MediaPlayerPrivateGStreamer::pushTextureToCompositor):
+        (WebCore::MediaPlayerPrivateGStreamer::flushCurrentBuffer):
+        (WebCore::MediaPlayerPrivateGStreamer::videoFrameMetadata):
+        * platform/graphics/gstreamer/MediaPlayerPrivateGStreamer.h:
+        * platform/graphics/gstreamer/MediaSampleGStreamer.cpp:
+        (WebCore::MediaSampleGStreamer::MediaSampleGStreamer):
+        (WebCore::MediaSampleGStreamer::createImageSample):
+        * platform/graphics/gstreamer/MediaSampleGStreamer.h:
+        (WebCore::MediaSampleGStreamer::create):
+        (WebCore::MediaSampleGStreamer::createImageSample):
+        (WebCore::MediaSampleGStreamer::MediaSampleGStreamer):
+        * platform/graphics/gstreamer/VideoFrameMetadataGStreamer.cpp: Added.
+        (videoFrameMetadataAPIGetType):
+        (videoFrameMetadataGetInfo):
+        (webkitGstBufferSetVideoSampleMetadata):
+        (webkitGstTraceProcessingTimeForElement):
+        (webkitGstBufferGetVideoFrameMetadata):
+        * platform/graphics/gstreamer/VideoFrameMetadataGStreamer.h: Added.
+        * platform/mediastream/gstreamer/GStreamerCapturer.cpp:
+        (WebCore::GStreamerCapturer::createSource):
+        * platform/mediastream/gstreamer/GStreamerMediaStreamSource.cpp:
+        (mediaStreamTrackPrivateGetTags):
+        * platform/mediastream/gstreamer/MockRealtimeVideoSourceGStreamer.cpp:
+        (WebCore::MockRealtimeVideoSourceGStreamer::updateSampleBuffer):
+        * platform/mediastream/libwebrtc/gstreamer/RealtimeIncomingVideoSourceLibWebRTC.cpp:
+        (WebCore::RealtimeIncomingVideoSourceLibWebRTC::OnFrame):
+        * platform/mediastream/libwebrtc/gstreamer/RealtimeIncomingVideoSourceLibWebRTC.h:
+
 2021-12-01  Wenson Hsieh  <wenson_hs...@apple.com>
 
         Adjust a Live Text quirk so that it applies to YouTube image thumbnails

Modified: trunk/Source/WebCore/platform/GStreamer.cmake (286368 => 286369)


--- trunk/Source/WebCore/platform/GStreamer.cmake	2021-12-01 17:50:30 UTC (rev 286368)
+++ trunk/Source/WebCore/platform/GStreamer.cmake	2021-12-01 18:36:08 UTC (rev 286369)
@@ -24,6 +24,7 @@
         platform/graphics/gstreamer/TextCombinerPadGStreamer.cpp
         platform/graphics/gstreamer/TextSinkGStreamer.cpp
         platform/graphics/gstreamer/TrackPrivateBaseGStreamer.cpp
+        platform/graphics/gstreamer/VideoFrameMetadataGStreamer.cpp
         platform/graphics/gstreamer/VideoSinkGStreamer.cpp
         platform/graphics/gstreamer/VideoTrackPrivateGStreamer.cpp
         platform/graphics/gstreamer/WebKitAudioSinkGStreamer.cpp

Modified: trunk/Source/WebCore/platform/VideoFrameMetadata.h (286368 => 286369)


--- trunk/Source/WebCore/platform/VideoFrameMetadata.h	2021-12-01 17:50:30 UTC (rev 286368)
+++ trunk/Source/WebCore/platform/VideoFrameMetadata.h	2021-12-01 18:36:08 UTC (rev 286369)
@@ -27,6 +27,8 @@
 
 #if ENABLE(VIDEO)
 
+#include <optional>
+
 namespace WebCore {
 
 struct VideoFrameMetadata {

Modified: trunk/Source/WebCore/platform/graphics/gstreamer/GStreamerCommon.h (286368 => 286369)


--- trunk/Source/WebCore/platform/graphics/gstreamer/GStreamerCommon.h	2021-12-01 17:50:30 UTC (rev 286368)
+++ trunk/Source/WebCore/platform/graphics/gstreamer/GStreamerCommon.h	2021-12-01 18:36:08 UTC (rev 286369)
@@ -87,6 +87,14 @@
     return static_cast<GstClockTime>(toGstUnsigned64Time(mediaTime));
 }
 
+inline MediaTime fromGstClockTime(GstClockTime time)
+{
+    if (!GST_CLOCK_TIME_IS_VALID(time))
+        return MediaTime::invalidTime();
+
+    return MediaTime(GST_TIME_AS_USECONDS(time), G_USEC_PER_SEC);
+}
+
 class GstMappedBuffer {
     WTF_MAKE_NONCOPYABLE(GstMappedBuffer);
 public:
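The new fromGstClockTime() helper converts a nanosecond GstClockTime into a MediaTime with microsecond precision: GST_TIME_AS_USECONDS divides by 1000, and the timescale is G_USEC_PER_SEC. A minimal stand-alone sketch of the same conversion, using plain integers instead of the GStreamer/WTF types (the names here are illustrative):

```cpp
#include <cstdint>
#include <optional>

// Plain stand-in for MediaTime: a rational time value with a microsecond
// timescale, mirroring MediaTime(GST_TIME_AS_USECONDS(t), G_USEC_PER_SEC).
struct RationalTime {
    int64_t value;     // in units of 1/timescale seconds
    int64_t timescale; // 1'000'000 for microsecond precision
    double toSeconds() const { return static_cast<double>(value) / timescale; }
};

// GST_CLOCK_TIME_NONE is (guint64)-1; invalid times map to nullopt here,
// where the real code returns MediaTime::invalidTime().
std::optional<RationalTime> fromClockTimeNs(uint64_t ns)
{
    constexpr uint64_t kClockTimeNone = static_cast<uint64_t>(-1);
    if (ns == kClockTimeNone)
        return std::nullopt;
    return RationalTime { static_cast<int64_t>(ns / 1000), 1'000'000 };
}
```

For instance, 1.5 s expressed in nanoseconds (1'500'000'000) becomes value 1'500'000 over timescale 1'000'000.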

Modified: trunk/Source/WebCore/platform/graphics/gstreamer/MediaPlayerPrivateGStreamer.cpp (286368 => 286369)


--- trunk/Source/WebCore/platform/graphics/gstreamer/MediaPlayerPrivateGStreamer.cpp	2021-12-01 17:50:30 UTC (rev 286368)
+++ trunk/Source/WebCore/platform/graphics/gstreamer/MediaPlayerPrivateGStreamer.cpp	2021-12-01 18:36:08 UTC (rev 286369)
@@ -51,6 +51,7 @@
 #include "InbandTextTrackPrivateGStreamer.h"
 #include "TextCombinerGStreamer.h"
 #include "TextSinkGStreamer.h"
+#include "VideoFrameMetadataGStreamer.h"
 #include "VideoTrackPrivateGStreamer.h"
 
 #if ENABLE(MEDIA_STREAM)
@@ -2750,31 +2751,26 @@
     g_signal_connect(GST_BIN_CAST(m_pipeline.get()), "deep-element-added", G_CALLBACK(+[](GstBin*, GstBin* subBin, GstElement* element, MediaPlayerPrivateGStreamer* player) {
         GUniquePtr<char> binName(gst_element_get_name(GST_ELEMENT_CAST(subBin)));
         GUniquePtr<char> elementName(gst_element_get_name(element));
+        auto elementClass = makeString(gst_element_get_metadata(element, GST_ELEMENT_METADATA_KLASS));
+        auto classifiers = elementClass.split('/');
 
+        // Collect processing time metrics for video decoders and converters.
+        if ((classifiers.contains("Converter"_s) || classifiers.contains("Decoder"_s)) && classifiers.contains("Video"_s) && !classifiers.contains("Parser"))
+            webkitGstTraceProcessingTimeForElement(element);
+
+        if (classifiers.contains("Decoder"_s) && classifiers.contains("Video"_s)) {
+            player->configureVideoDecoder(element);
+            return;
+        }
+
         if (g_str_has_prefix(elementName.get(), "downloadbuffer")) {
             player->configureDownloadBuffer(element);
             return;
         }
 
-        if (g_str_has_prefix(elementName.get(), "uridecodebin")) {
-            // This will set the multiqueue size to the default value.
+        // This will set the multiqueue size to the default value.
+        if (g_str_has_prefix(elementName.get(), "uridecodebin"))
             g_object_set(element, "buffer-size", 2 * MB, nullptr);
-            return;
-        }
-
-        if (!g_str_has_prefix(binName.get(), "decodebin"))
-            return;
-
-        if (g_str_has_prefix(elementName.get(), "v4l2"))
-            player->m_videoDecoderPlatform = GstVideoDecoderPlatform::Video4Linux;
-        else if (g_str_has_prefix(elementName.get(), "imxvpudec"))
-            player->m_videoDecoderPlatform = GstVideoDecoderPlatform::ImxVPU;
-        else if (g_str_has_prefix(elementName.get(), "omx"))
-            player->m_videoDecoderPlatform = GstVideoDecoderPlatform::OpenMAX;
-
-#if USE(TEXTURE_MAPPER_GL)
-        player->updateTextureMapperFlags();
-#endif
     }), this);
 
     g_signal_connect_swapped(m_pipeline.get(), "source-setup", G_CALLBACK(sourceSetupCallback), this);
@@ -2819,6 +2815,27 @@
         }), this);
 }
 
+void MediaPlayerPrivateGStreamer::configureVideoDecoder(GstElement* decoder)
+{
+    GUniquePtr<char> name(gst_element_get_name(decoder));
+    if (g_str_has_prefix(name.get(), "v4l2"))
+        m_videoDecoderPlatform = GstVideoDecoderPlatform::Video4Linux;
+    else if (g_str_has_prefix(name.get(), "imxvpudec"))
+        m_videoDecoderPlatform = GstVideoDecoderPlatform::ImxVPU;
+    else if (g_str_has_prefix(name.get(), "omx"))
+        m_videoDecoderPlatform = GstVideoDecoderPlatform::OpenMAX;
+    else if (g_str_has_prefix(name.get(), "avdec")) {
+        // Set the decoder maximum number of threads to a low, fixed value, not depending on the
+        // platform. This also helps with processing metrics gathering. When using the default value
+        // the decoder introduces artificial processing latency reflecting the maximum number of threads.
+        g_object_set(decoder, "max-threads", 2, nullptr);
+    }
+
+#if USE(TEXTURE_MAPPER_GL)
+    updateTextureMapperFlags();
+#endif
+}
+
 bool MediaPlayerPrivateGStreamer::didPassCORSAccessCheck() const
 {
     if (WEBKIT_IS_WEB_SRC(m_source.get()))
@@ -2930,6 +2947,8 @@
     if (!GST_IS_SAMPLE(m_sample.get()))
         return;
 
+    ++m_sampleCount;
+
     auto internalCompositingOperation = [this](TextureMapperPlatformLayerProxy& proxy, std::unique_ptr<GstVideoFrameHolder>&& frameHolder) {
         std::unique_ptr<TextureMapperPlatformLayerBuffer> layerBuffer;
         if (frameHolder->hasMappedTextures()) {
@@ -3230,7 +3249,7 @@
 {
     Locker sampleLocker { m_sampleMutex };
 
-    if (m_sample) {
+    if (m_sample && gst_sample_get_buffer(m_sample.get())) {
         // Allocate a new copy of the sample which has to be released. The copy is necessary so that
         // the video dimensions can still be fetched and also for canvas rendering. The release is
         // necessary because the sample might have been allocated by a hardware decoder and memory
@@ -3786,6 +3805,31 @@
 }
 #endif
 
+std::optional<VideoFrameMetadata> MediaPlayerPrivateGStreamer::videoFrameMetadata()
+{
+    if (m_sampleCount == m_lastVideoFrameMetadataSampleCount)
+        return { };
+
+    m_lastVideoFrameMetadataSampleCount = m_sampleCount;
+
+    Locker sampleLocker { m_sampleMutex };
+    if (!GST_IS_SAMPLE(m_sample.get()))
+        return { };
+
+    auto* buffer = gst_sample_get_buffer(m_sample.get());
+    auto metadata = webkitGstBufferGetVideoFrameMetadata(buffer);
+    auto size = naturalSize();
+    metadata.width = size.width();
+    metadata.height = size.height();
+    metadata.presentedFrames = m_sampleCount;
+
+    // FIXME: presentationTime and expectedDisplayTime might not always have the same value, we should try getting more precise values.
+    metadata.presentationTime = MonotonicTime::now().secondsSinceEpoch().seconds();
+    metadata.expectedDisplayTime = metadata.presentationTime;
+
+    return metadata;
 }
 
+}
+
 #endif // USE(GSTREAMER)
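The deep-element-added handler above keys off the element's klass metadata (strings such as "Codec/Decoder/Video"), splitting it on '/'. The selection rule for processing-time tracing can be sketched on its own; the klass strings used below are examples, not an exhaustive list:

```cpp
#include <algorithm>
#include <sstream>
#include <string>
#include <vector>

// Split a GStreamer klass string such as "Codec/Decoder/Video" into classifiers.
std::vector<std::string> splitKlass(const std::string& klass)
{
    std::vector<std::string> parts;
    std::istringstream stream(klass);
    std::string part;
    while (std::getline(stream, part, '/'))
        parts.push_back(part);
    return parts;
}

static bool contains(const std::vector<std::string>& v, const std::string& s)
{
    return std::find(v.begin(), v.end(), s) != v.end();
}

// Mirrors the rule in createGSTPlayBin(): collect processing-time metrics for
// video converters and decoders, but skip parsers.
bool shouldTraceProcessingTime(const std::string& klass)
{
    auto c = splitKlass(klass);
    return (contains(c, "Converter") || contains(c, "Decoder"))
        && contains(c, "Video") && !contains(c, "Parser");
}
```

A decoder element whose klass also advertises "Video" additionally gets routed to configureVideoDecoder() for platform-specific setup.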

Modified: trunk/Source/WebCore/platform/graphics/gstreamer/MediaPlayerPrivateGStreamer.h (286368 => 286369)


--- trunk/Source/WebCore/platform/graphics/gstreamer/MediaPlayerPrivateGStreamer.h	2021-12-01 17:50:30 UTC (rev 286368)
+++ trunk/Source/WebCore/platform/graphics/gstreamer/MediaPlayerPrivateGStreamer.h	2021-12-01 18:36:08 UTC (rev 286369)
@@ -469,6 +469,8 @@
     void configureDownloadBuffer(GstElement*);
     static void downloadBufferFileCreatedCallback(MediaPlayerPrivateGStreamer*);
 
+    void configureVideoDecoder(GstElement*);
+
     void setPlaybinURL(const URL& urlString);
 
     void updateTracks(const GRefPtr<GstStreamCollection>&);
@@ -564,6 +566,9 @@
     DataMutex<TaskAtMediaTimeScheduler> m_TaskAtMediaTimeSchedulerDataMutex;
 
 private:
+    std::optional<VideoFrameMetadata> videoFrameMetadata() final;
+    uint64_t m_sampleCount { 0 };
+    uint64_t m_lastVideoFrameMetadataSampleCount { 0 };
 #if USE(WPE_VIDEO_PLANE_DISPLAY_DMABUF)
     GUniquePtr<struct wpe_video_plane_display_dmabuf_source> m_wpeVideoPlaneDisplayDmaBuf;
 #endif

Modified: trunk/Source/WebCore/platform/graphics/gstreamer/MediaSampleGStreamer.cpp (286368 => 286369)


--- trunk/Source/WebCore/platform/graphics/gstreamer/MediaSampleGStreamer.cpp	2021-12-01 17:50:30 UTC (rev 286368)
+++ trunk/Source/WebCore/platform/graphics/gstreamer/MediaSampleGStreamer.cpp	2021-12-01 18:36:08 UTC (rev 286369)
@@ -24,6 +24,7 @@
 
 #include "GStreamerCommon.h"
 #include "PixelBuffer.h"
+#include "VideoFrameMetadataGStreamer.h"
 #include <_javascript_Core/JSCInlines.h>
 #include <_javascript_Core/TypedArrayInlines.h>
 #include <algorithm>
@@ -32,7 +33,7 @@
 
 namespace WebCore {
 
-MediaSampleGStreamer::MediaSampleGStreamer(GRefPtr<GstSample>&& sample, const FloatSize& presentationSize, const AtomString& trackId, VideoRotation videoRotation, bool videoMirrored)
+MediaSampleGStreamer::MediaSampleGStreamer(GRefPtr<GstSample>&& sample, const FloatSize& presentationSize, const AtomString& trackId, VideoRotation videoRotation, bool videoMirrored, std::optional<VideoSampleMetadata>&& metadata)
     : m_pts(MediaTime::zeroTime())
     , m_dts(MediaTime::zeroTime())
     , m_duration(MediaTime::zeroTime())
@@ -46,15 +47,13 @@
     GstBuffer* buffer = gst_sample_get_buffer(sample.get());
     RELEASE_ASSERT(buffer);
 
-    auto createMediaTime =
-        [](GstClockTime time) -> MediaTime {
-            return MediaTime(GST_TIME_AS_USECONDS(time), G_USEC_PER_SEC);
-        };
+    if (metadata)
+        buffer = webkitGstBufferSetVideoSampleMetadata(buffer, WTFMove(metadata));
 
     if (GST_BUFFER_PTS_IS_VALID(buffer))
-        m_pts = createMediaTime(GST_BUFFER_PTS(buffer));
+        m_pts = fromGstClockTime(GST_BUFFER_PTS(buffer));
     if (GST_BUFFER_DTS_IS_VALID(buffer) || GST_BUFFER_PTS_IS_VALID(buffer))
-        m_dts = createMediaTime(GST_BUFFER_DTS_OR_PTS(buffer));
+        m_dts = fromGstClockTime(GST_BUFFER_DTS_OR_PTS(buffer));
     if (GST_BUFFER_DURATION_IS_VALID(buffer)) {
         // Sometimes (albeit rarely, so far seen only at the end of a track)
         // frames have very small durations, so small that may be under the
@@ -61,7 +60,7 @@
         // precision we are working with and be truncated to zero.
         // SourceBuffer algorithms are not expecting frames with zero-duration,
         // so let's use something very small instead in those fringe cases.
-        m_duration = createMediaTime(std::max(GST_BUFFER_DURATION(buffer), minimumDuration));
+        m_duration = fromGstClockTime(std::max(GST_BUFFER_DURATION(buffer), minimumDuration));
     } else {
         // Unfortunately, sometimes samples don't provide a duration. This can never happen in MP4 because of the way
         // the format is laid out, but it's pretty common in WebM.
@@ -68,7 +67,7 @@
         // The good part is that durations don't matter for playback, just for buffered ranges and coded frame deletion.
         // We want to pick something small enough to not cause unwanted frame deletion, but big enough to never be
         // mistaken for a rounding artifact.
-        m_duration = createMediaTime(16666667); // 1/60 seconds
+        m_duration = fromGstClockTime(16666667); // 1/60 seconds
     }
 
     m_size = gst_buffer_get_size(buffer);
@@ -101,7 +100,7 @@
     return adoptRef(*gstreamerMediaSample);
 }
 
-Ref<MediaSampleGStreamer> MediaSampleGStreamer::createImageSample(PixelBuffer&& pixelBuffer, const IntSize& destinationSize, double frameRate, VideoRotation videoRotation, bool videoMirrored)
+Ref<MediaSampleGStreamer> MediaSampleGStreamer::createImageSample(PixelBuffer&& pixelBuffer, const IntSize& destinationSize, double frameRate, VideoRotation videoRotation, bool videoMirrored, std::optional<VideoSampleMetadata>&& metadata)
 {
     ensureGStreamerInitialized();
 
@@ -120,6 +119,9 @@
     auto height = size.height();
     gst_buffer_add_video_meta(buffer.get(), GST_VIDEO_FRAME_FLAG_NONE, GST_VIDEO_FORMAT_BGRA, width, height);
 
+    if (metadata)
+        webkitGstBufferSetVideoSampleMetadata(buffer.get(), *metadata);
+
     int frameRateNumerator, frameRateDenominator;
     gst_util_double_to_fraction(frameRate, &frameRateNumerator, &frameRateDenominator);
 

Modified: trunk/Source/WebCore/platform/graphics/gstreamer/MediaSampleGStreamer.h (286368 => 286369)


--- trunk/Source/WebCore/platform/graphics/gstreamer/MediaSampleGStreamer.h	2021-12-01 17:50:30 UTC (rev 286368)
+++ trunk/Source/WebCore/platform/graphics/gstreamer/MediaSampleGStreamer.h	2021-12-01 18:36:08 UTC (rev 286369)
@@ -26,6 +26,7 @@
 #include "FloatSize.h"
 #include "GStreamerCommon.h"
 #include "MediaSample.h"
+#include "VideoSampleMetadata.h"
 #include <wtf/text/AtomString.h>
 
 namespace WebCore {
@@ -34,13 +35,13 @@
 
 class MediaSampleGStreamer : public MediaSample {
 public:
-    static Ref<MediaSampleGStreamer> create(GRefPtr<GstSample>&& sample, const FloatSize& presentationSize, const AtomString& trackId, VideoRotation videoRotation = VideoRotation::None, bool videoMirrored = false)
+    static Ref<MediaSampleGStreamer> create(GRefPtr<GstSample>&& sample, const FloatSize& presentationSize, const AtomString& trackId, VideoRotation videoRotation = VideoRotation::None, bool videoMirrored = false, std::optional<VideoSampleMetadata>&& metadata = std::nullopt)
     {
-        return adoptRef(*new MediaSampleGStreamer(WTFMove(sample), presentationSize, trackId, videoRotation, videoMirrored));
+        return adoptRef(*new MediaSampleGStreamer(WTFMove(sample), presentationSize, trackId, videoRotation, videoMirrored, WTFMove(metadata)));
     }
 
     static Ref<MediaSampleGStreamer> createFakeSample(GstCaps*, MediaTime pts, MediaTime dts, MediaTime duration, const FloatSize& presentationSize, const AtomString& trackId);
-    static Ref<MediaSampleGStreamer> createImageSample(PixelBuffer&&, const IntSize& destinationSize = { }, double frameRate = 1, VideoRotation videoRotation = VideoRotation::None, bool videoMirrored = false);
+    static Ref<MediaSampleGStreamer> createImageSample(PixelBuffer&&, const IntSize& destinationSize = { }, double frameRate = 1, VideoRotation videoRotation = VideoRotation::None, bool videoMirrored = false, std::optional<VideoSampleMetadata>&& metadata = std::nullopt);
 
     void extendToTheBeginning();
     MediaTime presentationTime() const override { return m_pts; }
@@ -64,7 +65,7 @@
     bool videoMirrored() const override { return m_videoMirrored; }
 
 protected:
-    MediaSampleGStreamer(GRefPtr<GstSample>&&, const FloatSize& presentationSize, const AtomString& trackId, VideoRotation = VideoRotation::None, bool videoMirrored = false);
+    MediaSampleGStreamer(GRefPtr<GstSample>&&, const FloatSize& presentationSize, const AtomString& trackId, VideoRotation = VideoRotation::None, bool videoMirrored = false, std::optional<VideoSampleMetadata>&& = std::nullopt);
     virtual ~MediaSampleGStreamer() = default;
 
 private:

Added: trunk/Source/WebCore/platform/graphics/gstreamer/VideoFrameMetadataGStreamer.cpp (0 => 286369)


--- trunk/Source/WebCore/platform/graphics/gstreamer/VideoFrameMetadataGStreamer.cpp	                        (rev 0)
+++ trunk/Source/WebCore/platform/graphics/gstreamer/VideoFrameMetadataGStreamer.cpp	2021-12-01 18:36:08 UTC (rev 286369)
@@ -0,0 +1,179 @@
+/*
+ * Copyright (C) 2021 Igalia S.L
+ *
+ * This library is free software; you can redistribute it and/or
+ * modify it under the terms of the GNU Library General Public
+ * License as published by the Free Software Foundation; either
+ * version 2 of the License, or (at your option) any later version.
+ *
+ * This library is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
+ * Library General Public License for more details.
+ *
+ * You should have received a copy of the GNU Library General Public License
+ * along with this library; see the file COPYING.LIB.  If not, write to
+ * the Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor,
+ * Boston, MA 02110-1301, USA.
+ */
+
+#include "config.h"
+#include "VideoFrameMetadataGStreamer.h"
+
+#if ENABLE(VIDEO) && USE(GSTREAMER)
+
+#include "GStreamerCommon.h"
+#include <wtf/HashMap.h>
+#include <wtf/glib/WTFGType.h>
+
+GST_DEBUG_CATEGORY_STATIC(webkit_video_frame_meta_debug);
+#define GST_CAT_DEFAULT webkit_video_frame_meta_debug
+
+using namespace WebCore;
+
+struct VideoFrameMetadataPrivate {
+    std::optional<VideoSampleMetadata> videoSampleMetadata;
+    HashMap<String, std::pair<GstClockTime, GstClockTime>> processingTimes;
+};
+
+WEBKIT_DEFINE_ASYNC_DATA_STRUCT(VideoFrameMetadataPrivate);
+
+typedef struct _VideoFrameMetadataGStreamer {
+    GstMeta meta;
+    VideoFrameMetadataPrivate* priv;
+} VideoFrameMetadataGStreamer;
+
+GType videoFrameMetadataAPIGetType()
+{
+    static GType type;
+    static const gchar* tags[] = { nullptr };
+    static std::once_flag onceFlag;
+    std::call_once(onceFlag, [&] {
+        type = gst_meta_api_type_register("WebKitVideoFrameMetadataAPI", tags);
+    });
+    return type;
+}
+
+#define VIDEO_FRAME_METADATA_API_TYPE videoFrameMetadataAPIGetType()
+#define VIDEO_FRAME_METADATA_CAST(p) reinterpret_cast<VideoFrameMetadataGStreamer*>(p)
+#define getInternalVideoFrameMetadata(buffer) VIDEO_FRAME_METADATA_CAST(gst_buffer_get_meta(buffer, VIDEO_FRAME_METADATA_API_TYPE))
+
+const GstMetaInfo* videoFrameMetadataGetInfo();
+
+std::pair<GstBuffer*, VideoFrameMetadataGStreamer*> ensureVideoFrameMetadata(GstBuffer* buffer)
+{
+    auto* meta = getInternalVideoFrameMetadata(buffer);
+    if (meta)
+        return { buffer, meta };
+
+    buffer = gst_buffer_make_writable(buffer);
+    return { buffer, VIDEO_FRAME_METADATA_CAST(gst_buffer_add_meta(buffer, videoFrameMetadataGetInfo(), nullptr)) };
+}
+
+const GstMetaInfo* videoFrameMetadataGetInfo()
+{
+    static const GstMetaInfo* metaInfo = nullptr;
+    static std::once_flag onceFlag;
+    std::call_once(onceFlag, [&] {
+        metaInfo = gst_meta_register(VIDEO_FRAME_METADATA_API_TYPE, "WebKitVideoFrameMetadata", sizeof(VideoFrameMetadataGStreamer),
+            [](GstMeta* meta, gpointer, GstBuffer*) -> gboolean {
+                auto* frameMeta = VIDEO_FRAME_METADATA_CAST(meta);
+                frameMeta->priv = createVideoFrameMetadataPrivate();
+                return TRUE;
+            },
+            [](GstMeta* meta, GstBuffer*) {
+                auto* frameMeta = VIDEO_FRAME_METADATA_CAST(meta);
+                destroyVideoFrameMetadataPrivate(frameMeta->priv);
+            },
+            [](GstBuffer* buffer, GstMeta* meta, GstBuffer*, GQuark type, gpointer) -> gboolean {
+                if (!GST_META_TRANSFORM_IS_COPY(type))
+                    return FALSE;
+
+                auto* frameMeta = VIDEO_FRAME_METADATA_CAST(meta);
+                auto [buf, copyMeta] = ensureVideoFrameMetadata(buffer);
+                copyMeta->priv->videoSampleMetadata = frameMeta->priv->videoSampleMetadata;
+                copyMeta->priv->processingTimes = frameMeta->priv->processingTimes;
+                return TRUE;
+            });
+    });
+    return metaInfo;
+}
+
+GstBuffer* webkitGstBufferSetVideoSampleMetadata(GstBuffer* buffer, std::optional<VideoSampleMetadata>&& metadata)
+{
+    if (!GST_IS_BUFFER(buffer))
+        return nullptr;
+
+    auto [modifiedBuffer, meta] = ensureVideoFrameMetadata(buffer);
+    meta->priv->videoSampleMetadata = WTFMove(metadata);
+    return modifiedBuffer;
+}
+
+void webkitGstTraceProcessingTimeForElement(GstElement* element)
+{
+    static std::once_flag onceFlag;
+    std::call_once(onceFlag, [&] {
+        GST_DEBUG_CATEGORY_INIT(webkit_video_frame_meta_debug, "webkitvideoframemeta", 0, "Video frame processing metrics");
+    });
+
+    GST_DEBUG("Tracing processing time for %" GST_PTR_FORMAT, element);
+    auto probeType = static_cast<GstPadProbeType>(GST_PAD_PROBE_TYPE_PUSH | GST_PAD_PROBE_TYPE_BUFFER);
+
+    auto sinkPad = adoptGRef(gst_element_get_static_pad(element, "sink"));
+    gst_pad_add_probe(sinkPad.get(), probeType, [](GstPad*, GstPadProbeInfo* info, gpointer userData) -> GstPadProbeReturn {
+        auto [modifiedBuffer, meta] = ensureVideoFrameMetadata(GST_PAD_PROBE_INFO_BUFFER(info));
+        GST_PAD_PROBE_INFO_DATA(info) = modifiedBuffer;
+        meta->priv->processingTimes.set(reinterpret_cast<char*>(userData), std::make_pair(gst_util_get_timestamp(), GST_CLOCK_TIME_NONE));
+        return GST_PAD_PROBE_OK;
+    }, gst_element_get_name(element), g_free);
+
+    auto srcPad = adoptGRef(gst_element_get_static_pad(element, "src"));
+    gst_pad_add_probe(srcPad.get(), probeType, [](GstPad*, GstPadProbeInfo* info, gpointer userData) -> GstPadProbeReturn {
+        auto* meta = getInternalVideoFrameMetadata(GST_PAD_PROBE_INFO_BUFFER(info));
+        // Some decoders (such as theoradec) do not always copy the input meta to the output frame,
+        // so we need to check the meta is valid here before accessing it.
+        if (!meta)
+            return GST_PAD_PROBE_OK;
+
+        auto* elementName = reinterpret_cast<char*>(userData);
+        auto value = meta->priv->processingTimes.get(elementName);
+        meta->priv->processingTimes.set(elementName, std::make_pair(value.first, gst_util_get_timestamp()));
+        return GST_PAD_PROBE_OK;
+    }, gst_element_get_name(element), g_free);
+}
+
+VideoFrameMetadata webkitGstBufferGetVideoFrameMetadata(GstBuffer* buffer)
+{
+    if (!GST_IS_BUFFER(buffer))
+        return { };
+
+    VideoFrameMetadata videoFrameMetadata;
+    if (GST_BUFFER_PTS_IS_VALID(buffer))
+        videoFrameMetadata.mediaTime = fromGstClockTime(GST_BUFFER_PTS(buffer)).toDouble();
+
+    auto* meta = getInternalVideoFrameMetadata(buffer);
+    if (!meta)
+        return videoFrameMetadata;
+
+    auto processingDuration = MediaTime::zeroTime();
+    for (auto& [startTime, stopTime] : meta->priv->processingTimes.values())
+        processingDuration += fromGstClockTime(GST_CLOCK_DIFF(startTime, stopTime));
+
+    if (processingDuration != MediaTime::zeroTime())
+        videoFrameMetadata.processingDuration = processingDuration.toDouble();
+
+    auto videoSampleMetadata = meta->priv->videoSampleMetadata;
+    if (!videoSampleMetadata)
+        return videoFrameMetadata;
+
+    if (videoSampleMetadata->captureTime)
+        videoFrameMetadata.captureTime = videoSampleMetadata->captureTime->value();
+
+    if (videoSampleMetadata->receiveTime)
+        videoFrameMetadata.receiveTime = videoSampleMetadata->receiveTime->value();
+
+    videoFrameMetadata.rtpTimestamp = videoSampleMetadata->rtpTimestamp;
+    return videoFrameMetadata;
+}
+
+#endif
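
[Editor's note] The per-element bookkeeping above reduces to a simple pattern: the sink-pad probe records a start timestamp keyed by element name, the src-pad probe records the matching stop timestamp, and `webkitGstBufferGetVideoFrameMetadata()` sums the per-element deltas into `processingDuration`. A minimal self-contained sketch of that accounting, using plain STL containers as hypothetical stand-ins for the `GstMeta` private data and GStreamer clock (all names here are illustrative, not WebCore API):

```cpp
#include <cassert>
#include <cstdint>
#include <map>
#include <string>
#include <utility>

// Hypothetical stand-in for the meta's priv->processingTimes map:
// element name -> (start, stop) timestamps in nanoseconds.
using ProcessingTimes = std::map<std::string, std::pair<uint64_t, uint64_t>>;

// Mirrors the sink-pad probe: record when the buffer entered the element.
void recordStart(ProcessingTimes& times, const std::string& element, uint64_t now)
{
    times[element] = { now, 0 };
}

// Mirrors the src-pad probe: record when the buffer left the element.
void recordStop(ProcessingTimes& times, const std::string& element, uint64_t now)
{
    auto it = times.find(element);
    if (it != times.end())
        it->second.second = now;
}

// Mirrors the accumulation loop in webkitGstBufferGetVideoFrameMetadata():
// total processing duration is the sum of the per-element (stop - start) deltas.
uint64_t totalProcessingDuration(const ProcessingTimes& times)
{
    uint64_t total = 0;
    for (const auto& [element, interval] : times)
        total += interval.second - interval.first;
    return total;
}
```

In the real patch the timestamps come from `gst_util_get_timestamp()` and the map lives inside the buffer's `GstMeta`, so the duration travels with the frame through the pipeline.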

Added: trunk/Source/WebCore/platform/graphics/gstreamer/VideoFrameMetadataGStreamer.h (0 => 286369)


--- trunk/Source/WebCore/platform/graphics/gstreamer/VideoFrameMetadataGStreamer.h	                        (rev 0)
+++ trunk/Source/WebCore/platform/graphics/gstreamer/VideoFrameMetadataGStreamer.h	2021-12-01 18:36:08 UTC (rev 286369)
@@ -0,0 +1,33 @@
+/*
+ * Copyright (C) 2021 Igalia S.L
+ *
+ * This library is free software; you can redistribute it and/or
+ * modify it under the terms of the GNU Library General Public
+ * License as published by the Free Software Foundation; either
+ * version 2 of the License, or (at your option) any later version.
+ *
+ * This library is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
+ * Library General Public License for more details.
+ *
+ * You should have received a copy of the GNU Library General Public License
+ * along with this library; see the file COPYING.LIB.  If not, write to
+ * the Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor,
+ * Boston, MA 02110-1301, USA.
+ */
+
+#pragma once
+
+#if ENABLE(VIDEO) && USE(GSTREAMER)
+
+#include "VideoFrameMetadata.h"
+#include "VideoSampleMetadata.h"
+
+#include <gst/gst.h>
+
+GstBuffer* webkitGstBufferSetVideoSampleMetadata(GstBuffer*, std::optional<WebCore::VideoSampleMetadata>&&);
+void webkitGstTraceProcessingTimeForElement(GstElement*);
+WebCore::VideoFrameMetadata webkitGstBufferGetVideoFrameMetadata(GstBuffer*);
+
+#endif // ENABLE(VIDEO) && USE(GSTREAMER)
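
[Editor's note] Conceptually, the `webkitGstBufferGetVideoFrameMetadata()` entry point declared above merges three sources into one result: the buffer PTS (`mediaTime`), the accumulated per-element processing duration, and the capture-side `VideoSampleMetadata` (capture/receive times, RTP timestamp). A rough model of that merge, with simplified structs standing in for the WebCore types (field names follow the diff, but these types are illustrative only):

```cpp
#include <optional>

// Stand-in for WebCore::VideoSampleMetadata as used in this patch.
struct SampleMetadata {
    std::optional<double> captureTime;
    std::optional<double> receiveTime;
    std::optional<unsigned> rtpTimestamp;
};

// Stand-in for WebCore::VideoFrameMetadata.
struct FrameMetadata {
    std::optional<double> mediaTime;
    std::optional<double> processingDuration;
    std::optional<double> captureTime;
    std::optional<double> receiveTime;
    std::optional<unsigned> rtpTimestamp;
};

// Models webkitGstBufferGetVideoFrameMetadata(): copy the PTS, report a
// processing duration only when one was actually measured, and forward the
// capture-side metadata when the buffer carries it.
FrameMetadata mergeMetadata(std::optional<double> pts, double processingDuration,
    const std::optional<SampleMetadata>& sample)
{
    FrameMetadata out;
    out.mediaTime = pts;
    if (processingDuration != 0)
        out.processingDuration = processingDuration;
    if (!sample)
        return out;
    out.captureTime = sample->captureTime;
    out.receiveTime = sample->receiveTime;
    out.rtpTimestamp = sample->rtpTimestamp;
    return out;
}
```

This is why decoders that drop the input meta (the theoradec case noted in the probe) simply yield a frame with `mediaTime` but no processing or capture fields, rather than an error.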

Modified: trunk/Source/WebCore/platform/mediastream/gstreamer/GStreamerCapturer.cpp (286368 => 286369)


--- trunk/Source/WebCore/platform/mediastream/gstreamer/GStreamerCapturer.cpp	2021-12-01 17:50:30 UTC (rev 286368)
+++ trunk/Source/WebCore/platform/mediastream/gstreamer/GStreamerCapturer.cpp	2021-12-01 18:36:08 UTC (rev 286369)
@@ -23,11 +23,14 @@
 #include "config.h"
 
 #if ENABLE(VIDEO) && ENABLE(MEDIA_STREAM) && USE(GSTREAMER)
+
 #include "GStreamerCapturer.h"
+#include "VideoFrameMetadataGStreamer.h"
 
 #include <gst/app/gstappsink.h>
 #include <gst/app/gstappsrc.h>
 #include <mutex>
+#include <wtf/MonotonicTime.h>
 
 GST_DEBUG_CATEGORY(webkit_capturer_debug);
 #define GST_CAT_DEFAULT webkit_capturer_debug
@@ -99,9 +102,20 @@
         if (GST_IS_APP_SRC(m_src.get()))
             g_object_set(m_src.get(), "is-live", true, "format", GST_FORMAT_TIME, nullptr);
 
+        auto srcPad = adoptGRef(gst_element_get_static_pad(m_src.get(), "src"));
+        if (m_deviceType == CaptureDevice::DeviceType::Camera) {
+            gst_pad_add_probe(srcPad.get(), static_cast<GstPadProbeType>(GST_PAD_PROBE_TYPE_PUSH | GST_PAD_PROBE_TYPE_BUFFER), [](GstPad*, GstPadProbeInfo* info, gpointer) -> GstPadProbeReturn {
+                VideoSampleMetadata metadata;
+                metadata.captureTime = MonotonicTime::now().secondsSinceEpoch();
+                auto* buffer = GST_PAD_PROBE_INFO_BUFFER(info);
+                auto* modifiedBuffer = webkitGstBufferSetVideoSampleMetadata(buffer, metadata);
+                GST_PAD_PROBE_INFO_DATA(info) = modifiedBuffer;
+                return GST_PAD_PROBE_OK;
+            }, nullptr, nullptr);
+        }
+
         if (m_deviceType == CaptureDevice::DeviceType::Screen) {
-            auto pad = adoptGRef(gst_element_get_static_pad(m_src.get(), "src"));
-            gst_pad_add_probe(pad.get(), GST_PAD_PROBE_TYPE_EVENT_DOWNSTREAM, [](GstPad*, GstPadProbeInfo* info, void* userData) -> GstPadProbeReturn {
+            gst_pad_add_probe(srcPad.get(), GST_PAD_PROBE_TYPE_EVENT_DOWNSTREAM, [](GstPad*, GstPadProbeInfo* info, void* userData) -> GstPadProbeReturn {
                 auto* event = gst_pad_probe_info_get_event(info);
                 if (GST_EVENT_TYPE(event) != GST_EVENT_CAPS)
                     return GST_PAD_PROBE_OK;

Modified: trunk/Source/WebCore/platform/mediastream/gstreamer/GStreamerMediaStreamSource.cpp (286368 => 286369)


--- trunk/Source/WebCore/platform/mediastream/gstreamer/GStreamerMediaStreamSource.cpp	2021-12-01 17:50:30 UTC (rev 286368)
+++ trunk/Source/WebCore/platform/mediastream/gstreamer/GStreamerMediaStreamSource.cpp	2021-12-01 18:36:08 UTC (rev 286369)
@@ -31,6 +31,7 @@
 #include "GStreamerCommon.h"
 #include "MediaSampleGStreamer.h"
 #include "MediaStreamPrivate.h"
+#include "VideoFrameMetadataGStreamer.h"
 #include "VideoTrackPrivateMediaStream.h"
 
 #include <gst/app/gstappsrc.h>
@@ -61,8 +62,10 @@
         gst_tag_list_add(tagList.get(), GST_TAG_MERGE_APPEND, WEBKIT_MEDIA_TRACK_TAG_KIND, static_cast<int>(VideoTrackPrivate::Kind::Main), nullptr);
 
         auto& settings = track->settings();
-        gst_tag_list_add(tagList.get(), GST_TAG_MERGE_APPEND, WEBKIT_MEDIA_TRACK_TAG_WIDTH, settings.width(),
-            WEBKIT_MEDIA_TRACK_TAG_HEIGHT, settings.height(), nullptr);
+        if (settings.width())
+            gst_tag_list_add(tagList.get(), GST_TAG_MERGE_APPEND, WEBKIT_MEDIA_TRACK_TAG_WIDTH, settings.width(), nullptr);
+        if (settings.height())
+            gst_tag_list_add(tagList.get(), GST_TAG_MERGE_APPEND, WEBKIT_MEDIA_TRACK_TAG_HEIGHT, settings.height(), nullptr);
     }
 
     GST_DEBUG("Track tags: %" GST_PTR_FORMAT, tagList.get());
@@ -267,19 +270,19 @@
         if (!m_parent)
             return;
 
-        auto* gstSample = static_cast<MediaSampleGStreamer*>(&sample)->platformSample().sample.gstSample;
-        auto* caps = gst_sample_get_caps(gstSample);
-        GstVideoInfo info;
-        gst_video_info_from_caps(&info, caps);
+        auto sampleSize = sample.presentationSize();
+        IntSize captureSize(sampleSize.width(), sampleSize.height());
 
-        int width = GST_VIDEO_INFO_WIDTH(&info);
-        int height = GST_VIDEO_INFO_HEIGHT(&info);
-        if (m_lastKnownSize != IntSize(width, height)) {
-            m_lastKnownSize.setWidth(width);
-            m_lastKnownSize.setHeight(height);
-            updateBlackFrame(caps);
-        }
+        auto settings = m_track.settings();
+        m_configuredSize.setWidth(settings.width());
+        m_configuredSize.setHeight(settings.height());
 
+        if (!m_configuredSize.width())
+            m_configuredSize.setWidth(captureSize.width());
+        if (!m_configuredSize.height())
+            m_configuredSize.setHeight(captureSize.height());
+
+        auto* mediaSample = static_cast<MediaSampleGStreamer*>(&sample);
         auto videoRotation = sample.videoRotation();
         bool videoMirrored = sample.videoMirrored();
         if (m_videoRotation != videoRotation || m_videoMirrored != videoMirrored) {
@@ -292,6 +295,12 @@
             gst_pad_push_event(pad.get(), gst_event_new_tag(gst_tag_list_new(GST_TAG_IMAGE_ORIENTATION, orientation.utf8().data(), nullptr)));
         }
 
+        auto* gstSample = mediaSample->platformSample().sample.gstSample;
+        if (!m_configuredSize.isEmpty() && m_lastKnownSize != m_configuredSize) {
+            m_lastKnownSize = m_configuredSize;
+            updateBlackFrame(gst_sample_get_caps(gstSample));
+        }
+
         if (m_track.enabled()) {
             GST_TRACE_OBJECT(m_src.get(), "Pushing video frame from enabled track");
             pushSample(gstSample);
@@ -336,6 +345,12 @@
     void pushBlackFrame()
     {
         GST_TRACE_OBJECT(m_src.get(), "Pushing black video frame");
+        VideoSampleMetadata metadata;
+        metadata.captureTime = MonotonicTime::now().secondsSinceEpoch();
+        auto* buffer = webkitGstBufferSetVideoSampleMetadata(gst_sample_get_buffer(m_blackFrame.get()), metadata);
+        // TODO: Use gst_sample_set_buffer() after bumping GStreamer dependency to 1.16.
+        auto* caps = gst_sample_get_caps(m_blackFrame.get());
+        m_blackFrame = adoptGRef(gst_sample_new(buffer, caps, nullptr, nullptr));
         pushSample(m_blackFrame.get());
     }
 
@@ -349,6 +364,7 @@
     bool m_isObserving { false };
     RefPtr<AudioTrackPrivateMediaStream> m_audioTrack;
     RefPtr<VideoTrackPrivateMediaStream> m_videoTrack;
+    IntSize m_configuredSize;
     IntSize m_lastKnownSize;
     GRefPtr<GstSample> m_blackFrame;
     MediaSample::VideoRotation m_videoRotation { MediaSample::VideoRotation::None };

Modified: trunk/Source/WebCore/platform/mediastream/gstreamer/MockRealtimeVideoSourceGStreamer.cpp (286368 => 286369)


--- trunk/Source/WebCore/platform/mediastream/gstreamer/MockRealtimeVideoSourceGStreamer.cpp	2021-12-01 17:50:30 UTC (rev 286368)
+++ trunk/Source/WebCore/platform/mediastream/gstreamer/MockRealtimeVideoSourceGStreamer.cpp	2021-12-01 18:36:08 UTC (rev 286369)
@@ -152,7 +152,9 @@
     if (!pixelBuffer)
         return;
 
-    auto sample = MediaSampleGStreamer::createImageSample(WTFMove(*pixelBuffer), size(), frameRate(), sampleRotation());
+    VideoSampleMetadata metadata;
+    metadata.captureTime = MonotonicTime::now().secondsSinceEpoch();
+    auto sample = MediaSampleGStreamer::createImageSample(WTFMove(*pixelBuffer), size(), frameRate(), sampleRotation(), false, WTFMove(metadata));
     sample->offsetTimestampsBy(MediaTime::createWithDouble((elapsedTime() + 100_ms).seconds()));
     dispatchMediaSampleToObservers(sample.get(), { });
 }

Modified: trunk/Source/WebCore/platform/mediastream/libwebrtc/gstreamer/RealtimeIncomingVideoSourceLibWebRTC.cpp (286368 => 286369)


--- trunk/Source/WebCore/platform/mediastream/libwebrtc/gstreamer/RealtimeIncomingVideoSourceLibWebRTC.cpp	2021-12-01 17:50:30 UTC (rev 286368)
+++ trunk/Source/WebCore/platform/mediastream/libwebrtc/gstreamer/RealtimeIncomingVideoSourceLibWebRTC.cpp	2021-12-01 18:36:08 UTC (rev 286369)
@@ -60,7 +60,8 @@
 
     callOnMainThread([protectedThis = Ref { *this }, frame] {
         auto gstSample = GStreamerSampleFromLibWebRTCVideoFrame(frame);
-        auto sample = MediaSampleGStreamer::create(WTFMove(gstSample), { }, { });
+        auto metadata = std::make_optional(metadataFromVideoFrame(frame));
+        auto sample = MediaSampleGStreamer::create(WTFMove(gstSample), { }, { }, static_cast<MediaSample::VideoRotation>(frame.rotation()), false, WTFMove(metadata));
         protectedThis->videoSampleAvailable(sample.get(), { });
     });
 }

Modified: trunk/Source/WebCore/platform/mediastream/libwebrtc/gstreamer/RealtimeIncomingVideoSourceLibWebRTC.h (286368 => 286369)


--- trunk/Source/WebCore/platform/mediastream/libwebrtc/gstreamer/RealtimeIncomingVideoSourceLibWebRTC.h	2021-12-01 17:50:30 UTC (rev 286368)
+++ trunk/Source/WebCore/platform/mediastream/libwebrtc/gstreamer/RealtimeIncomingVideoSourceLibWebRTC.h	2021-12-01 18:36:08 UTC (rev 286369)
@@ -45,8 +45,6 @@
 
     // rtc::VideoSinkInterface
     void OnFrame(const webrtc::VideoFrame&) final;
-    void setCapsFromSettings();
-    GRefPtr<GstCaps> m_caps;
 };
 
 } // namespace WebCore