Title: [203739] trunk
Revision: 203739
Author: commit-qu...@webkit.org
Date: 2016-07-26 14:52:27 -0700 (Tue, 26 Jul 2016)

Log Message

HTMLVideoElement frames do not update on iOS when src is a MediaStream blob
https://bugs.webkit.org/show_bug.cgi?id=159833
<rdar://problem/27379487>

Patch by George Ruan <gr...@apple.com> on 2016-07-26
Reviewed by Eric Carlson.

Source/WebCore:

Test: fast/mediastream/MediaStream-video-element-displays-buffer.html

* WebCore.xcodeproj/project.pbxproj:
* platform/cf/CoreMediaSoftLink.cpp: Add soft links for CMSampleBufferCreateReadyWithImageBuffer and
CMVideoFormatDescriptionCreateForImageBuffer.
* platform/cf/CoreMediaSoftLink.h: Ditto.
* platform/cocoa/CoreVideoSoftLink.cpp: Add CVPixelBufferCreate, kCVPixelBufferCGBitmapContextCompatibilityKey, and
kCVPixelBufferCGImageCompatibilityKey.
* platform/cocoa/CoreVideoSoftLink.h: Ditto.
* platform/graphics/avfoundation/MediaSampleAVFObjC.h: Change create to return a Ref<T> instead
of RefPtr<T>.
* platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.h: Make MediaPlayerPrivateMediaStreamAVFObjC an observer of
MediaStreamTrackPrivate and have it use an AVSampleBufferDisplayLayer instead of a CALayer.
* platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.mm: Ditto.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::~MediaPlayerPrivateMediaStreamAVFObjC): Clean up
observers and AVSampleBufferDisplayLayer.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::isAvailable): Ensures AVSampleBufferDisplayLayer
is available.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::enqueueAudioSampleBufferFromTrack): Placeholder.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::enqueueVideoSampleBufferFromTrack): Responsible
for enqueuing sample buffers from the active video track onto the AVSampleBufferDisplayLayer.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::ensureLayer): Ensures that an AVSampleBufferDisplayLayer
exists.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::destroyLayer): Destroys the AVSampleBufferDisplayLayer.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::platformLayer): Replace CALayer with AVSampleBufferDisplayLayer.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::currentDisplayMode): Ditto.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::play): Call updateReadyState as a deferred task.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::currentReadyState): readyState is bumped to HAVE_ENOUGH_DATA
only when the MediaPlayerPrivateMediaStreamAVFObjC has received a media sample.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::sampleBufferUpdated): Called from MediaStreamTrackPrivate when a
new SampleBuffer is available.
(WebCore::updateTracksOfType): Manage adding and removing the player as an observer of the tracks.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::updateTracks): Replace CALayer with AVSampleBufferDisplayLayer.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::acceleratedRenderingStateChanged): Copied from
MediaPlayerPrivateMediaSourceAVFObjC.mm.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::load): Deleted CALayer.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::updateDisplayMode): Deleted process of updating CALayer.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::updateIntrinsicSize): Deleted CALayer.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::createPreviewLayers): Deleted.
* platform/mediastream/MediaStreamPrivate.cpp:
(WebCore::MediaStreamPrivate::updateActiveVideoTrack): Remove redundant check.
* platform/mediastream/MediaStreamTrackPrivate.cpp:
(WebCore::MediaStreamTrackPrivate::sourceHasMoreMediaData): Called from RealtimeMediaSource when a new SampleBuffer
is available.
* platform/mediastream/MediaStreamTrackPrivate.h:
(WebCore::MediaStreamTrackPrivate::Observer::sampleBufferUpdated): Relays to MediaPlayerPrivateMediaStreamAVFObjC that
a new SampleBuffer is available to enqueue to the AVSampleBufferDisplayLayer.
* platform/mediastream/RealtimeMediaSource.cpp:
(WebCore::RealtimeMediaSource::settingsDidChange): Renamed from settingsDidChanged() to fix the grammar of the name.
(WebCore::RealtimeMediaSource::mediaDataUpdated): Relays to all observers that a new SampleBuffer is available (a condensed sketch of this relay chain follows this file list).
(WebCore::RealtimeMediaSource::settingsDidChanged): Deleted.
* platform/mediastream/RealtimeMediaSource.h:
* platform/mediastream/mac/AVVideoCaptureSource.mm:
(WebCore::AVVideoCaptureSource::processNewFrame): Calls mediaDataUpdated when a new SampleBuffer is captured.
* platform/mediastream/mac/MockRealtimeVideoSourceMac.h:
* platform/mediastream/mac/MockRealtimeVideoSourceMac.mm:
(WebCore::MockRealtimeVideoSourceMac::CMSampleBufferFromPixelBuffer): Convert CVPixelBuffer to CMSampleBuffer.
(WebCore::MockRealtimeVideoSourceMac::pixelBufferFromCGImage): Convert CGImage to CVPixelBuffer.
(WebCore::MockRealtimeVideoSourceMac::updateSampleBuffer): Creates a CMSampleBuffer from the current imageBuffer and
sends it to MediaPlayerPrivateMediaStreamAVFObjC.
* platform/mock/MockRealtimeVideoSource.cpp:
(WebCore::MockRealtimeVideoSource::setFrameRate): Rename settingsDidChanged() to settingsDidChange().
(WebCore::MockRealtimeVideoSource::setSize): Ditto.
(WebCore::MockRealtimeVideoSource::generateFrame): Call updateSampleBuffer().
* platform/mock/MockRealtimeVideoSource.h: Change elapsedTime() from private to protected.
(WebCore::MockRealtimeVideoSource::updateSampleBuffer): Overridden by MockRealtimeVideoSourceMac.
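
A condensed sketch of the sample-propagation path this patch adds, pieced together from the hunks below (error handling, asserts, and the audio case omitted):

    // RealtimeMediaSource: a capture source hands each new sample to every observer.
    void RealtimeMediaSource::mediaDataUpdated(MediaSample& mediaSample)
    {
        for (auto& observer : m_observers)
            observer->sourceHasMoreMediaData(mediaSample);
    }

    // MediaStreamTrackPrivate: tag the sample with this track's id and relay it on.
    void MediaStreamTrackPrivate::sourceHasMoreMediaData(MediaSample& mediaSample)
    {
        mediaSample.setTrackID(id());
        for (auto& observer : m_observers)
            observer->sampleBufferUpdated(*this, mediaSample);
    }

    // MediaPlayerPrivateMediaStreamAVFObjC: enqueue video samples from the active
    // track onto the AVSampleBufferDisplayLayer and refresh the ready state.
    void MediaPlayerPrivateMediaStreamAVFObjC::sampleBufferUpdated(MediaStreamTrackPrivate& track, MediaSample& mediaSample)
    {
        if (track.type() != RealtimeMediaSource::Video)
            return;
        enqueueVideoSampleBufferFromTrack(track, mediaSample.platformSample());
        m_hasReceivedMedia = true;
        scheduleDeferredTask([this] {
            updateReadyState();
        });
    }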

LayoutTests:

* fast/mediastream/MediaStream-video-element-displays-buffer-expected.txt: Added.
* fast/mediastream/MediaStream-video-element-displays-buffer.html: Added. Checks that
a video element with a MediaStream source displays frames that are neither black nor transparent.
* fast/mediastream/resources/getUserMedia-helper.js:
(setupVideoElementWithStream): Sets up the video element and stores the stream in the global mediaStream variable.

Diff

Modified: trunk/LayoutTests/ChangeLog (203738 => 203739)


--- trunk/LayoutTests/ChangeLog	2016-07-26 21:49:26 UTC (rev 203738)
+++ trunk/LayoutTests/ChangeLog	2016-07-26 21:52:27 UTC (rev 203739)
@@ -1,3 +1,17 @@
+2016-07-26  George Ruan  <gr...@apple.com>
+
+        HTMLVideoElement frames do not update on iOS when src is a MediaStream blob
+        https://bugs.webkit.org/show_bug.cgi?id=159833
+        <rdar://problem/27379487>
+
+        Reviewed by Eric Carlson.
+
+        * fast/mediastream/MediaStream-video-element-displays-buffer-expected.txt: Added.
+        * fast/mediastream/MediaStream-video-element-displays-buffer.html: Added. Checks that
+        a video element with a MediaStream source displays frames that are neither black nor transparent.
+        * fast/mediastream/resources/getUserMedia-helper.js:
+        (setupVideoElementWithStream): Sets up the video element and stores the stream in the global mediaStream variable.
+
 2016-07-26  Ryosuke Niwa  <rn...@webkit.org>
 
         Remove the tests for legacy custom elements API

Added: trunk/LayoutTests/fast/mediastream/MediaStream-video-element-displays-buffer-expected.txt (0 => 203739)


--- trunk/LayoutTests/fast/mediastream/MediaStream-video-element-displays-buffer-expected.txt	                        (rev 0)
+++ trunk/LayoutTests/fast/mediastream/MediaStream-video-element-displays-buffer-expected.txt	2016-07-26 21:52:27 UTC (rev 203739)
@@ -0,0 +1,17 @@
+Tests that the stream displays captured buffers to the video element.
+
+On success, you will see a series of "PASS" messages, followed by "TEST COMPLETE".
+
+
+PASS mediaDevices.getUserMedia generated a stream successfully.
+video.src = window.URL.createObjectURL(mediaStream)
+video.play()
+
+ === checking pixels ===
+PASS isPixelTransparent(buffer) is true
+PASS isPixelTransparent(buffer) is false
+PASS isPixelBlack(buffer) is false
+PASS successfullyParsed is true
+
+TEST COMPLETE
+ 

Added: trunk/LayoutTests/fast/mediastream/MediaStream-video-element-displays-buffer.html (0 => 203739)


--- trunk/LayoutTests/fast/mediastream/MediaStream-video-element-displays-buffer.html	                        (rev 0)
+++ trunk/LayoutTests/fast/mediastream/MediaStream-video-element-displays-buffer.html	2016-07-26 21:52:27 UTC (rev 203739)
@@ -0,0 +1,72 @@
+<!DOCTYPE html>
+<html>
+<head>
+    <script src="../../resources/js-test-pre.js"></script>
+    <script src="resources/getUserMedia-helper.js"></script>
+</head>
+<body onload="start()">
+<p id="description"></p>
+<div id="console"></div>
+<video controls width="680" height="360"></video>
+<canvas width="680" height="360"></canvas>
+<script>
+    let mediaStream;
+    let video;
+    
+    let buffer;
+    
+    function isPixelTransparent(pixel)
+    {
+        return pixel[0] === 0 && pixel[1] === 0 && pixel[2] === 0 && pixel[3] === 0;
+    }
+
+    function isPixelBlack(pixel)
+    {
+        return pixel[0] === 0 && pixel[1] === 0 && pixel[2] === 0 && pixel[3] === 255;
+    }
+
+    function verifyFramesBeingDisplayed()
+    {
+        let canvas = document.querySelector('canvas');
+        let context = canvas.getContext('2d');
+
+        debug('<br> === checking pixels ===');
+
+        context.clearRect(0, 0, canvas.width, canvas.height);
+
+        let x = canvas.width * .035;
+        let y = canvas.height * 0.6 + 2 + x;
+        
+        buffer = context.getImageData(x, y, 1, 1).data;
+        shouldBeTrue('isPixelTransparent(buffer)');
+        
+        context.drawImage(video, 0, 0, canvas.width, canvas.height);
+        
+        buffer = context.getImageData(x, y, 1, 1).data;
+        shouldBeFalse('isPixelTransparent(buffer)');
+        shouldBeFalse('isPixelBlack(buffer)');
+
+        finishJSTest();
+    }
+
+    function canplay()
+    {
+        evalAndLog('video.play()');
+    }
+
+    function start()
+    {
+        description("Tests that the stream displays captured buffers to the video element.");
+
+        video = document.querySelector('video');
+        video.addEventListener('canplay', canplay, false);
+        video.addEventListener('playing', verifyFramesBeingDisplayed, false);
+
+        getUserMedia("allow", {video:true}, setupVideoElementWithStream);
+    }
+
+    window.jsTestIsAsync = true;
+</script>
+<script src="../../resources/js-test-post.js"></script>
+</body>
+</html>
\ No newline at end of file

Modified: trunk/LayoutTests/fast/mediastream/resources/getUserMedia-helper.js (203738 => 203739)


--- trunk/LayoutTests/fast/mediastream/resources/getUserMedia-helper.js	2016-07-26 21:49:26 UTC (rev 203738)
+++ trunk/LayoutTests/fast/mediastream/resources/getUserMedia-helper.js	2016-07-26 21:52:27 UTC (rev 203739)
@@ -24,3 +24,10 @@
     testFailed('getUserMedia failed:' + e);
     finishJSTest();
 }
+
+function setupVideoElementWithStream(stream)
+{
+    mediaStream = stream;
+    testPassed('mediaDevices.getUserMedia generated a stream successfully.');
+    evalAndLog('video.src = window.URL.createObjectURL(mediaStream)');
+}
\ No newline at end of file

Modified: trunk/Source/WebCore/ChangeLog (203738 => 203739)


--- trunk/Source/WebCore/ChangeLog	2016-07-26 21:49:26 UTC (rev 203738)
+++ trunk/Source/WebCore/ChangeLog	2016-07-26 21:52:27 UTC (rev 203739)
@@ -1,3 +1,78 @@
+2016-07-26  George Ruan  <gr...@apple.com>
+
+        HTMLVideoElement frames do not update on iOS when src is a MediaStream blob
+        https://bugs.webkit.org/show_bug.cgi?id=159833
+        <rdar://problem/27379487>
+
+        Reviewed by Eric Carlson.
+
+        Test: fast/mediastream/MediaStream-video-element-displays-buffer.html
+
+        * WebCore.xcodeproj/project.pbxproj:
+        * platform/cf/CoreMediaSoftLink.cpp: Add soft links for CMSampleBufferCreateReadyWithImageBuffer and
+        CMVideoFormatDescriptionCreateForImageBuffer.
+        * platform/cf/CoreMediaSoftLink.h: Ditto.
+        * platform/cocoa/CoreVideoSoftLink.cpp: Add CVPixelBufferCreate, kCVPixelBufferCGBitmapContextCompatibilityKey, and
+        kCVPixelBufferCGImageCompatibilityKey.
+        * platform/cocoa/CoreVideoSoftLink.h: Ditto.
+        * platform/graphics/avfoundation/MediaSampleAVFObjC.h: Change create to return a Ref<T> instead
+        of RefPtr<T>.
+        * platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.h: Make MediaPlayerPrivateMediaStreamAVFObjC an observer of
+        MediaStreamTrackPrivate and have it use an AVSampleBufferDisplayLayer instead of a CALayer.
+        * platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.mm: Ditto.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::~MediaPlayerPrivateMediaStreamAVFObjC): Clean up
+        observers and AVSampleBufferDisplayLayer.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::isAvailable): Ensures AVSampleBufferDisplayLayer
+        is available.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::enqueueAudioSampleBufferFromTrack): Placeholder.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::enqueueVideoSampleBufferFromTrack): Responsible
+        for enqueuing sample buffers from the active video track onto the AVSampleBufferDisplayLayer.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::ensureLayer): Ensures that an AVSampleBufferDisplayLayer
+        exists.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::destroyLayer): Destroys the AVSampleBufferDisplayLayer.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::platformLayer): Replace CALayer with AVSampleBufferDisplayLayer.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::currentDisplayMode): Ditto.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::play): Call updateReadyState as a deferred task.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::currentReadyState): readyState is bumped to HAVE_ENOUGH_DATA
+        only when the MediaPlayerPrivateMediaStreamAVFObjC has received a media sample.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::sampleBufferUpdated): Called from MediaStreamTrackPrivate when a
+        new SampleBuffer is available.
+        (WebCore::updateTracksOfType): Manage adding and removing the player as an observer of the tracks.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::updateTracks): Replace CALayer with AVSampleBufferDisplayLayer.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::acceleratedRenderingStateChanged): Copied from
+        MediaPlayerPrivateMediaSourceAVFObjC.mm.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::load): Deleted CALayer.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::updateDisplayMode): Deleted process of updating CALayer.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::updateIntrinsicSize): Deleted CALayer.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::createPreviewLayers): Deleted.
+        * platform/mediastream/MediaStreamPrivate.cpp:
+        (WebCore::MediaStreamPrivate::updateActiveVideoTrack): Remove redundant check.
+        * platform/mediastream/MediaStreamTrackPrivate.cpp:
+        (WebCore::MediaStreamTrackPrivate::sourceHasMoreMediaData): Called from RealtimeMediaSource when a new SampleBuffer
+        is available.
+        * platform/mediastream/MediaStreamTrackPrivate.h:
+        (WebCore::MediaStreamTrackPrivate::Observer::sampleBufferUpdated): Relays to MediaPlayerPrivateMediaStreamAVFObjC that
+        a new SampleBuffer is available to enqueue to the AVSampleBufferDisplayLayer.
+        * platform/mediastream/RealtimeMediaSource.cpp:
+        (WebCore::RealtimeMediaSource::settingsDidChange): Renamed from settingsDidChanged() to fix the grammar of the name.
+        (WebCore::RealtimeMediaSource::mediaDataUpdated): Relays to all observers that a new SampleBuffer is available.
+        (WebCore::RealtimeMediaSource::settingsDidChanged): Deleted.
+        * platform/mediastream/RealtimeMediaSource.h:
+        * platform/mediastream/mac/AVVideoCaptureSource.mm:
+        (WebCore::AVVideoCaptureSource::processNewFrame): Calls mediaDataUpdated when a new SampleBuffer is captured.
+        * platform/mediastream/mac/MockRealtimeVideoSourceMac.h:
+        * platform/mediastream/mac/MockRealtimeVideoSourceMac.mm:
+        (WebCore::MockRealtimeVideoSourceMac::CMSampleBufferFromPixelBuffer): Convert CVPixelBuffer to CMSampleBuffer.
+        (WebCore::MockRealtimeVideoSourceMac::pixelBufferFromCGImage): Convert CGImage to CVPixelBuffer.
+        (WebCore::MockRealtimeVideoSourceMac::updateSampleBuffer): Creates a CMSampleBuffer from the current imageBuffer and
+        sends it to MediaPlayerPrivateMediaStreamAVFObjC.
+        * platform/mock/MockRealtimeVideoSource.cpp:
+        (WebCore::MockRealtimeVideoSource::setFrameRate): Rename settingsDidChanged() to settingsDidChange().
+        (WebCore::MockRealtimeVideoSource::setSize): Ditto.
+        (WebCore::MockRealtimeVideoSource::generateFrame): Call updateSampleBuffer().
+        * platform/mock/MockRealtimeVideoSource.h: Change elapsedTime() from private to protected.
+        (WebCore::MockRealtimeVideoSource::updateSampleBuffer): Overridden by MockRealtimeVideoSourceMac.
+
 2016-07-26  Zalan Bujtas  <za...@apple.com>
 
         Move ControlStates HashMap to RenderBox.

Modified: trunk/Source/WebCore/WebCore.xcodeproj/project.pbxproj (203738 => 203739)


--- trunk/Source/WebCore/WebCore.xcodeproj/project.pbxproj	2016-07-26 21:49:26 UTC (rev 203738)
+++ trunk/Source/WebCore/WebCore.xcodeproj/project.pbxproj	2016-07-26 21:52:27 UTC (rev 203739)
@@ -985,6 +985,7 @@
 		1AFE119A0CBFFCC4003017FA /* JSSQLResultSetRowList.h in Headers */ = {isa = PBXBuildFile; fileRef = 1AFE11980CBFFCC4003017FA /* JSSQLResultSetRowList.h */; };
 		1B124D8D1D380B7000ECDFB0 /* MediaSampleAVFObjC.h in Headers */ = {isa = PBXBuildFile; fileRef = 1B124D8C1D380B7000ECDFB0 /* MediaSampleAVFObjC.h */; };
 		1B124D8F1D380BB600ECDFB0 /* MediaSampleAVFObjC.mm in Sources */ = {isa = PBXBuildFile; fileRef = 1B124D8E1D380BB600ECDFB0 /* MediaSampleAVFObjC.mm */; };
+		1BF9DB3C1D3973AD0026AEB7 /* MediaSample.h in Headers */ = {isa = PBXBuildFile; fileRef = CD641EC7181ED60100EE4C41 /* MediaSample.h */; settings = {ATTRIBUTES = (Private, ); }; };
 		1C010700192594DF008A4201 /* InlineTextBoxStyle.cpp in Sources */ = {isa = PBXBuildFile; fileRef = 1C0106FE192594DF008A4201 /* InlineTextBoxStyle.cpp */; };
 		1C010701192594DF008A4201 /* InlineTextBoxStyle.h in Headers */ = {isa = PBXBuildFile; fileRef = 1C0106FF192594DF008A4201 /* InlineTextBoxStyle.h */; };
 		1C0939EA1A13E12900B788E5 /* CachedSVGFont.cpp in Sources */ = {isa = PBXBuildFile; fileRef = 1C0939E81A13E12900B788E5 /* CachedSVGFont.cpp */; };
@@ -27053,6 +27054,7 @@
 				75793EC90D0CE72D007FC0AC /* JSMessageEvent.h in Headers */,
 				E1ADEDDA0E76BD93004A1A5E /* JSMessagePort.h in Headers */,
 				41F584C7104652CB009CAA64 /* JSMessagePortCustom.h in Headers */,
+				1BF9DB3C1D3973AD0026AEB7 /* MediaSample.h in Headers */,
 				2D6F3E951C1F85550061DBD4 /* JSMockPageOverlay.h in Headers */,
 				E38838991BAD145F00D62EE3 /* JSModuleLoader.h in Headers */,
 				A86629D109DA2B48009633A5 /* JSMouseEvent.h in Headers */,

Modified: trunk/Source/WebCore/platform/cf/CoreMediaSoftLink.cpp (203738 => 203739)


--- trunk/Source/WebCore/platform/cf/CoreMediaSoftLink.cpp	2016-07-26 21:49:26 UTC (rev 203738)
+++ trunk/Source/WebCore/platform/cf/CoreMediaSoftLink.cpp	2016-07-26 21:52:27 UTC (rev 203739)
@@ -80,6 +80,7 @@
 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMSampleBufferCreate, OSStatus, (CFAllocatorRef allocator, CMBlockBufferRef dataBuffer, Boolean dataReady, CMSampleBufferMakeDataReadyCallback makeDataReadyCallback, void *makeDataReadyRefcon, CMFormatDescriptionRef formatDescription, CMItemCount numSamples, CMItemCount numSampleTimingEntries, const CMSampleTimingInfo *sampleTimingArray, CMItemCount numSampleSizeEntries, const size_t *sampleSizeArray, CMSampleBufferRef *sBufOut), (allocator, dataBuffer, dataReady, makeDataReadyCallback, makeDataReadyRefcon, formatDescription, numSamples, numSampleTimingEntries, sampleTimingArray, numSampleSizeEntries, sampleSizeArray, sBufOut))
 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMSampleBufferCreateCopy, OSStatus, (CFAllocatorRef allocator, CMSampleBufferRef sbuf, CMSampleBufferRef *sbufCopyOut), (allocator, sbuf, sbufCopyOut))
 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMSampleBufferCreateCopyWithNewTiming, OSStatus, (CFAllocatorRef allocator, CMSampleBufferRef originalSBuf, CMItemCount numSampleTimingEntries, const CMSampleTimingInfo *sampleTimingArray, CMSampleBufferRef *sBufCopyOut), (allocator, originalSBuf, numSampleTimingEntries, sampleTimingArray, sBufCopyOut))
+SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMSampleBufferCreateReadyWithImageBuffer, OSStatus, (CFAllocatorRef allocator, CVImageBufferRef imageBuffer, CMVideoFormatDescriptionRef formatDescription, const CMSampleTimingInfo* sampleTiming, CMSampleBufferRef* sBufOut), (allocator, imageBuffer, formatDescription, sampleTiming, sBufOut))
 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMSampleBufferGetDecodeTimeStamp, CMTime, (CMSampleBufferRef sbuf), (sbuf))
 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMSampleBufferGetDuration, CMTime, (CMSampleBufferRef sbuf), (sbuf))
 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMSampleBufferGetImageBuffer, CVImageBufferRef, (CMSampleBufferRef sbuf), (sbuf))
@@ -93,6 +94,7 @@
 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMTimebaseSetRate, OSStatus, (CMTimebaseRef timebase, Float64 rate), (timebase, rate))
 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMTimebaseSetTime, OSStatus, (CMTimebaseRef timebase, CMTime time), (timebase, time))
 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMTimeCopyAsDictionary, CFDictionaryRef, (CMTime time, CFAllocatorRef allocator), (time, allocator))
+SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMVideoFormatDescriptionCreateForImageBuffer, OSStatus, (CFAllocatorRef allocator, CVImageBufferRef imageBuffer, CMVideoFormatDescriptionRef* outDesc), (allocator, imageBuffer, outDesc))
 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMVideoFormatDescriptionGetDimensions, CMVideoDimensions, (CMVideoFormatDescriptionRef videoDesc), (videoDesc))
 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMVideoFormatDescriptionGetPresentationDimensions, CGSize, (CMVideoFormatDescriptionRef videoDesc, Boolean usePixelAspectRatio, Boolean useCleanAperture), (videoDesc, usePixelAspectRatio, useCleanAperture))
 

Modified: trunk/Source/WebCore/platform/cf/CoreMediaSoftLink.h (203738 => 203739)


--- trunk/Source/WebCore/platform/cf/CoreMediaSoftLink.h	2016-07-26 21:49:26 UTC (rev 203738)
+++ trunk/Source/WebCore/platform/cf/CoreMediaSoftLink.h	2016-07-26 21:52:27 UTC (rev 203739)
@@ -127,6 +127,8 @@
 #define CMSampleBufferCreateCopy softLink_CoreMedia_CMSampleBufferCreateCopy
 SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMSampleBufferCreateCopyWithNewTiming, OSStatus, (CFAllocatorRef allocator, CMSampleBufferRef originalSBuf, CMItemCount numSampleTimingEntries, const CMSampleTimingInfo *sampleTimingArray, CMSampleBufferRef *sBufCopyOut), (allocator, originalSBuf, numSampleTimingEntries, sampleTimingArray, sBufCopyOut))
 #define CMSampleBufferCreateCopyWithNewTiming softLink_CoreMedia_CMSampleBufferCreateCopyWithNewTiming
+SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMSampleBufferCreateReadyWithImageBuffer, OSStatus, (CFAllocatorRef allocator, CVImageBufferRef imageBuffer, CMVideoFormatDescriptionRef formatDescription, const CMSampleTimingInfo *sampleTiming, CMSampleBufferRef *sBufOut), (allocator, imageBuffer, formatDescription, sampleTiming, sBufOut))
+#define CMSampleBufferCreateReadyWithImageBuffer softLink_CoreMedia_CMSampleBufferCreateReadyWithImageBuffer
 SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMSampleBufferGetDecodeTimeStamp, CMTime, (CMSampleBufferRef sbuf), (sbuf))
 #define CMSampleBufferGetDecodeTimeStamp softLink_CoreMedia_CMSampleBufferGetDecodeTimeStamp
 SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMSampleBufferGetDuration, CMTime, (CMSampleBufferRef sbuf), (sbuf))
@@ -153,6 +155,8 @@
 #define CMTimebaseSetTime softLink_CoreMedia_CMTimebaseSetTime
 SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMTimeCopyAsDictionary, CFDictionaryRef, (CMTime time, CFAllocatorRef allocator), (time, allocator))
 #define CMTimeCopyAsDictionary softLink_CoreMedia_CMTimeCopyAsDictionary
+SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMVideoFormatDescriptionCreateForImageBuffer, OSStatus, (CFAllocatorRef allocator, CVImageBufferRef imageBuffer, CMVideoFormatDescriptionRef *outDesc), (allocator, imageBuffer, outDesc))
+#define CMVideoFormatDescriptionCreateForImageBuffer softLink_CoreMedia_CMVideoFormatDescriptionCreateForImageBuffer
 SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMVideoFormatDescriptionGetDimensions, CMVideoDimensions, (CMVideoFormatDescriptionRef videoDesc), (videoDesc))
 #define CMVideoFormatDescriptionGetDimensions softLink_CoreMedia_CMVideoFormatDescriptionGetDimensions
 SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMVideoFormatDescriptionGetPresentationDimensions, CGSize, (CMVideoFormatDescriptionRef videoDesc, Boolean usePixelAspectRatio, Boolean useCleanAperture), (videoDesc, usePixelAspectRatio, useCleanAperture))
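
The SOFT_LINK_FUNCTION_FOR_HEADER declarations and their #defines above let call sites use the plain CoreMedia names while the symbols are resolved lazily on first use. A minimal sketch of the pattern the two new soft links enable (not part of the patch; it mirrors MockRealtimeVideoSourceMac::CMSampleBufferFromPixelBuffer further down):

    #include "CoreMediaSoftLink.h"
    #include <wtf/RetainPtr.h>

    // Wrap an already-filled CVPixelBuffer in a "ready" CMSampleBuffer.
    static RetainPtr<CMSampleBufferRef> wrapPixelBuffer(CVPixelBufferRef pixelBuffer, const CMSampleTimingInfo& timing)
    {
        CMVideoFormatDescriptionRef formatDescription = nullptr;
        if (CMVideoFormatDescriptionCreateForImageBuffer(kCFAllocatorDefault, pixelBuffer, &formatDescription) != noErr)
            return nullptr;

        CMSampleBufferRef sampleBuffer = nullptr;
        OSStatus status = CMSampleBufferCreateReadyWithImageBuffer(kCFAllocatorDefault, pixelBuffer, formatDescription, &timing, &sampleBuffer);
        CFRelease(formatDescription);
        if (status != noErr)
            return nullptr;

        return adoptCF(sampleBuffer);
    }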

Modified: trunk/Source/WebCore/platform/cocoa/CoreVideoSoftLink.cpp (203738 => 203739)


--- trunk/Source/WebCore/platform/cocoa/CoreVideoSoftLink.cpp	2016-07-26 21:49:26 UTC (rev 203738)
+++ trunk/Source/WebCore/platform/cocoa/CoreVideoSoftLink.cpp	2016-07-26 21:52:27 UTC (rev 203739)
@@ -46,6 +46,9 @@
 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreVideo, CVOpenGLESTextureCacheFlush, void, (CVOpenGLESTextureCacheRef textureCache, CVOptionFlags options), (textureCache, options))
 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreVideo, CVOpenGLESTextureGetTarget, GLenum, (CVOpenGLESTextureRef image), (image))
 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreVideo, CVOpenGLESTextureGetName, GLuint, (CVOpenGLESTextureRef image), (image))
+SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreVideo, CVPixelBufferCreate, CVReturn, (CFAllocatorRef allocator, size_t width, size_t height, OSType pixelFormatType, CFDictionaryRef pixelBufferAttributes, CVPixelBufferRef *pixelBufferOut), (allocator, width, height, pixelFormatType, pixelBufferAttributes, pixelBufferOut))
+SOFT_LINK_CONSTANT_FOR_SOURCE(WebCore, CoreVideo, kCVPixelBufferCGBitmapContextCompatibilityKey, CFStringRef)
+SOFT_LINK_CONSTANT_FOR_SOURCE(WebCore, CoreVideo, kCVPixelBufferCGImageCompatibilityKey, CFStringRef)
 SOFT_LINK_CONSTANT_FOR_SOURCE(WebCore, CoreVideo, kCVPixelBufferIOSurfaceOpenGLESFBOCompatibilityKey, CFStringRef)
 #else
 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreVideo, CVOpenGLTextureCacheCreate, CVReturn, (CFAllocatorRef allocator, CFDictionaryRef cacheAttributes, CGLContextObj cglContext, CGLPixelFormatObj cglPixelFormat, CFDictionaryRef textureAttributes, CVOpenGLTextureCacheRef* cacheOut), (allocator, cacheAttributes, cglContext, cglPixelFormat, textureAttributes, cacheOut))

Modified: trunk/Source/WebCore/platform/cocoa/CoreVideoSoftLink.h (203738 => 203739)


--- trunk/Source/WebCore/platform/cocoa/CoreVideoSoftLink.h	2016-07-26 21:49:26 UTC (rev 203738)
+++ trunk/Source/WebCore/platform/cocoa/CoreVideoSoftLink.h	2016-07-26 21:52:27 UTC (rev 203739)
@@ -62,7 +62,13 @@
 #define CVOpenGLESTextureGetTarget softLink_CoreVideo_CVOpenGLESTextureGetTarget
 SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreVideo, CVOpenGLESTextureGetName, GLuint, (CVOpenGLESTextureRef image), (image))
 #define CVOpenGLESTextureGetName softLink_CoreVideo_CVOpenGLESTextureGetName
+SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreVideo, CVPixelBufferCreate, CVReturn, (CFAllocatorRef allocator, size_t width, size_t height, OSType pixelFormatType, CFDictionaryRef pixelBufferAttributes, CVPixelBufferRef *pixelBufferOut), (allocator, width, height, pixelFormatType, pixelBufferAttributes, pixelBufferOut))
+#define CVPixelBufferCreate softLink_CoreVideo_CVPixelBufferCreate
 
+SOFT_LINK_CONSTANT_FOR_HEADER(WebCore, CoreVideo, kCVPixelBufferCGBitmapContextCompatibilityKey, CFStringRef)
+#define kCVPixelBufferCGBitmapContextCompatibilityKey get_CoreVideo_kCVPixelBufferCGBitmapContextCompatibilityKey()
+SOFT_LINK_CONSTANT_FOR_HEADER(WebCore, CoreVideo, kCVPixelBufferCGImageCompatibilityKey, CFStringRef)
+#define kCVPixelBufferCGImageCompatibilityKey get_CoreVideo_kCVPixelBufferCGImageCompatibilityKey()
 SOFT_LINK_CONSTANT_FOR_HEADER(WebCore, CoreVideo, kCVPixelBufferIOSurfaceOpenGLESFBOCompatibilityKey, CFStringRef)
 #define kCVPixelBufferIOSurfaceOpenGLESFBOCompatibilityKey get_CoreVideo_kCVPixelBufferIOSurfaceOpenGLESFBOCompatibilityKey()
 #else

Modified: trunk/Source/WebCore/platform/graphics/avfoundation/MediaSampleAVFObjC.h (203738 => 203739)


--- trunk/Source/WebCore/platform/graphics/avfoundation/MediaSampleAVFObjC.h	2016-07-26 21:49:26 UTC (rev 203738)
+++ trunk/Source/WebCore/platform/graphics/avfoundation/MediaSampleAVFObjC.h	2016-07-26 21:52:27 UTC (rev 203739)
@@ -32,8 +32,8 @@
 
 class MediaSampleAVFObjC final : public MediaSample {
 public:
-    static RefPtr<MediaSampleAVFObjC> create(CMSampleBufferRef sample, int trackID) { return adoptRef(new MediaSampleAVFObjC(sample, trackID)); }
-    static RefPtr<MediaSampleAVFObjC> create(CMSampleBufferRef sample) { return adoptRef(new MediaSampleAVFObjC(sample)); }
+    static Ref<MediaSampleAVFObjC> create(CMSampleBufferRef sample, int trackID) { return adoptRef(*new MediaSampleAVFObjC(sample, trackID)); }
+    static Ref<MediaSampleAVFObjC> create(CMSampleBufferRef sample) { return adoptRef(*new MediaSampleAVFObjC(sample)); }
 
 private:
     MediaSampleAVFObjC(CMSampleBufferRef sample)
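
Returning Ref<> rather than RefPtr<> documents that create() can never return null. A hypothetical caller (not from the patch) showing what that buys call sites:

    #include "MediaSampleAVFObjC.h"

    void enqueue(WebCore::MediaSample&);

    void handleSample(CMSampleBufferRef sampleBuffer)
    {
        // create() now returns Ref<MediaSampleAVFObjC>, so no null check is needed
        // and the result can be passed wherever a MediaSample& is expected.
        Ref<WebCore::MediaSampleAVFObjC> sample = WebCore::MediaSampleAVFObjC::create(sampleBuffer);
        enqueue(sample.get());
    }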

Modified: trunk/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.h (203738 => 203739)


--- trunk/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.h	2016-07-26 21:49:26 UTC (rev 203738)
+++ trunk/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.h	2016-07-26 21:52:27 UTC (rev 203739)
@@ -29,6 +29,7 @@
 #if ENABLE(MEDIA_STREAM) && USE(AVFOUNDATION)
 
 #include "MediaPlayerPrivate.h"
+#include "MediaSample.h"
 #include "MediaStreamPrivate.h"
 #include <wtf/Function.h>
 #include <wtf/MediaTime.h>
@@ -36,6 +37,7 @@
 #include <wtf/WeakPtr.h>
 
 OBJC_CLASS AVSampleBufferAudioRenderer;
+OBJC_CLASS AVSampleBufferDisplayLayer;
 OBJC_CLASS AVStreamSession;
 typedef struct opaqueCMSampleBuffer *CMSampleBufferRef;
 
@@ -51,7 +53,7 @@
 class VideoFullscreenLayerManager;
 #endif
 
-class MediaPlayerPrivateMediaStreamAVFObjC : public MediaPlayerPrivateInterface, public MediaStreamPrivate::Observer {
+class MediaPlayerPrivateMediaStreamAVFObjC final : public MediaPlayerPrivateInterface, private MediaStreamPrivate::Observer, private MediaStreamTrackPrivate::Observer {
 public:
     explicit MediaPlayerPrivateMediaStreamAVFObjC(MediaPlayer*);
     virtual ~MediaPlayerPrivateMediaStreamAVFObjC();
@@ -70,6 +72,9 @@
 
     WeakPtr<MediaPlayerPrivateMediaStreamAVFObjC> createWeakPtr() { return m_weakPtrFactory.createWeakPtr(); }
 
+    void ensureLayer();
+    void destroyLayer();
+
 private:
     // MediaPlayerPrivateInterface
 
@@ -117,10 +122,14 @@
 
     void setSize(const IntSize&) override { /* No-op */ }
 
+    void enqueueAudioSampleBufferFromTrack(MediaStreamTrackPrivate&, PlatformSample);
+    void enqueueVideoSampleBufferFromTrack(MediaStreamTrackPrivate&, PlatformSample);
+
     void paint(GraphicsContext&, const FloatRect&) override;
     void paintCurrentFrameInContext(GraphicsContext&, const FloatRect&) override;
     bool metaDataAvailable() const { return m_mediaStreamPrivate && m_readyState >= MediaPlayer::HaveMetadata; }
 
+    void acceleratedRenderingStateChanged() override;
     bool supportsAcceleratedRendering() const override { return true; }
 
     bool hasSingleSecurityOrigin() const override { return true; }
@@ -139,7 +148,6 @@
     void updateReadyState();
 
     void updateIntrinsicSize(const FloatSize&);
-    void createPreviewLayers();
     void updateTracks();
     void renderingModeChanged();
 
@@ -160,6 +168,13 @@
     void didAddTrack(MediaStreamTrackPrivate&) override;
     void didRemoveTrack(MediaStreamTrackPrivate&) override;
 
+    // MediaStreamPrivateTrack::Observer
+    void trackEnded(MediaStreamTrackPrivate&) override { };
+    void trackMutedChanged(MediaStreamTrackPrivate&) override { };
+    void trackSettingsChanged(MediaStreamTrackPrivate&) override { };
+    void trackEnabledChanged(MediaStreamTrackPrivate&) override { };
+    void sampleBufferUpdated(MediaStreamTrackPrivate&, MediaSample&) override;
+
 #if PLATFORM(MAC) && ENABLE(VIDEO_PRESENTATION_MODE)
     void setVideoFullscreenLayer(PlatformLayer*, std::function<void()> completionHandler) override;
     void setVideoFullscreenFrame(FloatRect) override;
@@ -168,8 +183,7 @@
     MediaPlayer* m_player { nullptr };
     WeakPtrFactory<MediaPlayerPrivateMediaStreamAVFObjC> m_weakPtrFactory;
     RefPtr<MediaStreamPrivate> m_mediaStreamPrivate;
-    mutable RetainPtr<CALayer> m_previewLayer;
-    mutable RetainPtr<PlatformLayer> m_videoBackgroundLayer;
+    RetainPtr<AVSampleBufferDisplayLayer> m_sampleBufferDisplayLayer;
     RetainPtr<CGImageRef> m_pausedImage;
     std::unique_ptr<Clock> m_clock;
 
@@ -185,6 +199,8 @@
     bool m_muted { false };
     bool m_haveEverPlayed { false };
     bool m_ended { false };
+    bool m_hasEverEnqueuedVideoFrame { false };
+    bool m_hasReceivedMedia { false };
 
 #if PLATFORM(MAC) && ENABLE(VIDEO_PRESENTATION_MODE)
     std::unique_ptr<VideoFullscreenLayerManager> m_videoFullscreenLayerManager;

Modified: trunk/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.mm (203738 => 203739)


--- trunk/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.mm	2016-07-26 21:49:26 UTC (rev 203738)
+++ trunk/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.mm	2016-07-26 21:52:27 UTC (rev 203739)
@@ -36,6 +36,7 @@
 #import "Logging.h"
 #import "MediaStreamPrivate.h"
 #import "VideoTrackPrivateMediaStream.h"
+#import <AVFoundation/AVSampleBufferDisplayLayer.h>
 #import <QuartzCore/CALayer.h>
 #import <QuartzCore/CATransaction.h>
 #import <objc/runtime.h>
@@ -52,6 +53,8 @@
 
 SOFT_LINK_FRAMEWORK_OPTIONAL(AVFoundation)
 
+SOFT_LINK_CLASS_OPTIONAL(AVFoundation, AVSampleBufferDisplayLayer)
+
 namespace WebCore {
 
 #pragma mark -
@@ -71,8 +74,17 @@
 MediaPlayerPrivateMediaStreamAVFObjC::~MediaPlayerPrivateMediaStreamAVFObjC()
 {
     LOG(Media, "MediaPlayerPrivateMediaStreamAVFObjC::~MediaPlayerPrivateMediaStreamAVFObjC(%p)", this);
-    if (m_mediaStreamPrivate)
+    if (m_mediaStreamPrivate) {
         m_mediaStreamPrivate->removeObserver(*this);
+
+        for (auto& track : m_mediaStreamPrivate->tracks())
+            track->removeObserver(*this);
+    }
+
+    m_audioTrackMap.clear();
+    m_videoTrackMap.clear();
+
+    destroyLayer();
 }
 
 #pragma mark -
@@ -87,7 +99,7 @@
 
 bool MediaPlayerPrivateMediaStreamAVFObjC::isAvailable()
 {
-    return AVFoundationLibrary() && isCoreMediaFrameworkAvailable();
+    return AVFoundationLibrary() && isCoreMediaFrameworkAvailable() && getAVSampleBufferDisplayLayerClass();
 }
 
 void MediaPlayerPrivateMediaStreamAVFObjC::getSupportedTypes(HashSet<String, ASCIICaseInsensitiveHash>& types)
@@ -105,6 +117,60 @@
 }
 
 #pragma mark -
+#pragma mark AVSampleBuffer Methods
+
+void MediaPlayerPrivateMediaStreamAVFObjC::enqueueAudioSampleBufferFromTrack(MediaStreamTrackPrivate&, PlatformSample)
+{
+    // FIXME: https://bugs.webkit.org/show_bug.cgi?id=159836
+}
+
+void MediaPlayerPrivateMediaStreamAVFObjC::enqueueVideoSampleBufferFromTrack(MediaStreamTrackPrivate& track, PlatformSample platformSample)
+{
+    if (&track != m_mediaStreamPrivate->activeVideoTrack())
+        return;
+
+    if (m_displayMode == LivePreview && [m_sampleBufferDisplayLayer isReadyForMoreMediaData]) {
+        [m_sampleBufferDisplayLayer enqueueSampleBuffer:platformSample.sample.cmSampleBuffer];
+        
+        if (!m_hasEverEnqueuedVideoFrame) {
+            m_hasEverEnqueuedVideoFrame = true;
+            m_player->firstVideoFrameAvailable();
+        }
+    }
+}
+
+void MediaPlayerPrivateMediaStreamAVFObjC::ensureLayer()
+{
+    if (m_sampleBufferDisplayLayer)
+        return;
+    
+    m_sampleBufferDisplayLayer = adoptNS([allocAVSampleBufferDisplayLayerInstance() init]);
+#ifndef NDEBUG
+    [m_sampleBufferDisplayLayer setName:@"MediaPlayerPrivateMediaStreamAVFObjC AVSampleBufferDisplayLayer"];
+#endif
+    
+    renderingModeChanged();
+    
+#if PLATFORM(MAC) && ENABLE(VIDEO_PRESENTATION_MODE)
+    m_videoFullscreenLayerManager->setVideoLayer(m_sampleBufferDisplayLayer.get(), snappedIntRect(m_player->client().mediaPlayerContentBoxRect()).size());
+#endif
+}
+
+void MediaPlayerPrivateMediaStreamAVFObjC::destroyLayer()
+{
+    if (!m_sampleBufferDisplayLayer)
+        return;
+    
+    [m_sampleBufferDisplayLayer flush];
+    m_sampleBufferDisplayLayer = nullptr;
+    renderingModeChanged();
+    
+#if PLATFORM(MAC) && ENABLE(VIDEO_PRESENTATION_MODE)
+    m_videoFullscreenLayerManager->didDestroyVideoLayer();
+#endif
+}
+
+#pragma mark -
 #pragma mark MediaPlayerPrivateInterface Overrides
 
 void MediaPlayerPrivateMediaStreamAVFObjC::load(const String&)
@@ -129,7 +195,6 @@
 {
     LOG(Media, "MediaPlayerPrivateMediaStreamAVFObjC::load(%p)", this);
 
-    m_previewLayer = nullptr;
     m_intrinsicSize = FloatSize();
 
     m_mediaStreamPrivate = &stream;
@@ -157,19 +222,19 @@
 
 PlatformLayer* MediaPlayerPrivateMediaStreamAVFObjC::platformLayer() const
 {
-    if (!m_videoBackgroundLayer || m_displayMode == None)
+    if (!m_sampleBufferDisplayLayer || m_displayMode == None)
         return nullptr;
 
 #if PLATFORM(MAC) && ENABLE(VIDEO_PRESENTATION_MODE)
     return m_videoFullscreenLayerManager->videoInlineLayer();
 #else
-    return m_videoBackgroundLayer.get();
+    return m_sampleBufferDisplayLayer.get();
 #endif
 }
 
 MediaPlayerPrivateMediaStreamAVFObjC::DisplayMode MediaPlayerPrivateMediaStreamAVFObjC::currentDisplayMode() const
 {
-    if (m_ended || m_intrinsicSize.isEmpty() || !metaDataAvailable() || !m_videoBackgroundLayer)
+    if (m_ended || m_intrinsicSize.isEmpty() || !metaDataAvailable() || !m_sampleBufferDisplayLayer)
         return None;
 
     if (m_mediaStreamPrivate->activeVideoTrack() && !m_mediaStreamPrivate->activeVideoTrack()->enabled())
@@ -194,51 +259,6 @@
 
     if (m_displayMode == None)
         return;
-
-    [CATransaction begin];
-    [CATransaction setAnimationDuration:0];
-    [CATransaction setDisableActions:YES];
-
-    do {
-        if (m_displayMode < LivePreview) {
-
-            if (m_displayMode == PausedImage) {
-                if (m_videoBackgroundLayer.get().contents)
-                    break;
-
-                RefPtr<Image> image = m_mediaStreamPrivate->currentFrameImage();
-                if (!image) {
-                    m_displayMode = PaintItBlack;
-                    continue;
-                }
-
-                m_pausedImage = image->getCGImageRef();
-                if (!m_pausedImage) {
-                    m_displayMode = PaintItBlack;
-                    continue;
-                }
-
-                m_videoBackgroundLayer.get().contents = (id)m_pausedImage.get();
-                m_videoBackgroundLayer.get().backgroundColor = nil;
-            } else {
-                m_videoBackgroundLayer.get().contents = nil;
-                m_videoBackgroundLayer.get().backgroundColor = cachedCGColor(Color::black);
-                m_pausedImage = nullptr;
-            }
-
-            m_previewLayer.get().hidden = true;
-
-        } else {
-
-            m_previewLayer.get().hidden = false;
-            m_videoBackgroundLayer.get().contents = nil;
-            m_pausedImage = nullptr;
-        }
-
-        break;
-    } while (1);
-
-    [CATransaction commit];
 }
 
 void MediaPlayerPrivateMediaStreamAVFObjC::play()
@@ -251,8 +271,9 @@
     m_clock->start();
     m_playing = true;
     m_haveEverPlayed = true;
-    updateDisplayMode();
-    updateReadyState();
+    scheduleDeferredTask([this] {
+        updateReadyState();
+    });
 }
 
 void MediaPlayerPrivateMediaStreamAVFObjC::pause()
@@ -341,7 +362,9 @@
     if (!m_mediaStreamPrivate)
         return MediaPlayer::ReadyState::HaveNothing;
 
-    if (m_mediaStreamPrivate->active()) {
+    // https://w3c.github.io/mediacapture-main/ Change 8. from July 4, 2013.
+    // FIXME: Only update readyState to HAVE_ENOUGH_DATA when all active tracks have sent a sample buffer.
+    if (m_mediaStreamPrivate->active() && m_hasReceivedMedia) {
         if (!m_haveEverPlayed)
             return MediaPlayer::ReadyState::HaveFutureData;
         return MediaPlayer::ReadyState::HaveEnoughData;
@@ -387,46 +410,8 @@
         return;
 
     m_intrinsicSize = size;
-
-    if (m_videoBackgroundLayer || !m_player || !m_player->client().mediaPlayerRenderingCanBeAccelerated(m_player))
-        return;
-
-    if (!m_mediaStreamPrivate || !m_mediaStreamPrivate->platformLayer())
-        return;
-
-    createPreviewLayers();
 }
 
-void MediaPlayerPrivateMediaStreamAVFObjC::createPreviewLayers()
-{
-    if (!m_videoBackgroundLayer) {
-        m_videoBackgroundLayer = adoptNS([[CALayer alloc] init]);
-        m_videoBackgroundLayer.get().name = @"MediaPlayerPrivateMediaStreamAVFObjC preview background layer";
-
-#if PLATFORM(MAC) && ENABLE(VIDEO_PRESENTATION_MODE)
-        m_videoFullscreenLayerManager->setVideoLayer(m_videoBackgroundLayer.get(), snappedIntRect(m_player->client().mediaPlayerContentBoxRect()).size());
-#endif
-    }
-
-    if (!m_previewLayer) {
-        m_previewLayer = m_mediaStreamPrivate->platformLayer();
-        if (m_previewLayer) {
-            m_previewLayer.get().contentsGravity = kCAGravityResizeAspect;
-            m_previewLayer.get().anchorPoint = CGPointZero;
-            if (!m_playing)
-                m_previewLayer.get().hidden = true;
-
-            [m_videoBackgroundLayer addSublayer:m_previewLayer.get()];
-#if PLATFORM(MAC)
-            [m_previewLayer setFrame:[m_videoBackgroundLayer bounds]];
-            [m_previewLayer setAutoresizingMask:(kCALayerWidthSizable | kCALayerHeightSizable)];
-#endif
-        }
-    }
-
-    renderingModeChanged();
-}
-
 void MediaPlayerPrivateMediaStreamAVFObjC::renderingModeChanged()
 {
     updateDisplayMode();
@@ -473,6 +458,29 @@
     updateTracks();
 }
 
+void MediaPlayerPrivateMediaStreamAVFObjC::sampleBufferUpdated(MediaStreamTrackPrivate& track, MediaSample& mediaSample)
+{
+    ASSERT(track.id() == mediaSample.trackID());
+    ASSERT(mediaSample.platformSample().type == PlatformSample::CMSampleBufferType);
+    ASSERT(m_mediaStreamPrivate);
+
+    switch (track.type()) {
+    case RealtimeMediaSource::None:
+        // Do nothing.
+        break;
+    case RealtimeMediaSource::Audio:
+        // FIXME: https://bugs.webkit.org/show_bug.cgi?id=159836
+        break;
+    case RealtimeMediaSource::Video:
+        enqueueVideoSampleBufferFromTrack(track, mediaSample.platformSample());
+        m_hasReceivedMedia = true;
+        scheduleDeferredTask([this] {
+            updateReadyState();
+        });
+        break;
+    }
+}
+
 #if PLATFORM(MAC) && ENABLE(VIDEO_PRESENTATION_MODE)
 void MediaPlayerPrivateMediaStreamAVFObjC::setVideoFullscreenLayer(PlatformLayer *videoFullscreenLayer, std::function<void()> completionHandler)
 {
@@ -486,7 +494,7 @@
 #endif
 
 template <typename RefT, typename PassRefT>
-void updateTracksOfType(HashMap<String, RefT>& trackMap, RealtimeMediaSource::Type trackType, MediaStreamTrackPrivateVector& currentTracks, RefT (*itemFactory)(MediaStreamTrackPrivate&), MediaPlayer* player, void (MediaPlayer::*removedFunction)(PassRefT), void (MediaPlayer::*addedFunction)(PassRefT), std::function<void(RefT, int)> configureCallback)
+void updateTracksOfType(HashMap<String, RefT>& trackMap, RealtimeMediaSource::Type trackType, MediaStreamTrackPrivateVector& currentTracks, RefT (*itemFactory)(MediaStreamTrackPrivate&), MediaPlayer* player, void (MediaPlayer::*removedFunction)(PassRefT), void (MediaPlayer::*addedFunction)(PassRefT), std::function<void(RefT, int)> configureCallback, MediaStreamTrackPrivate::Observer* trackObserver)
 {
     Vector<RefT> removedTracks;
     Vector<RefT> addedTracks;
@@ -519,17 +527,20 @@
     for (const auto& track : trackMap.values())
         configureCallback(track, index++);
 
-    for (auto& track : removedTracks)
+    for (auto& track : removedTracks) {
         (player->*removedFunction)(track);
+        track->streamTrack()->removeObserver(*trackObserver);
+    }
 
-    for (auto& track : addedTracks)
+    for (auto& track : addedTracks) {
         (player->*addedFunction)(track);
+        track->streamTrack()->addObserver(*trackObserver);
+    }
 }
 
 void MediaPlayerPrivateMediaStreamAVFObjC::updateTracks()
 {
     MediaStreamTrackPrivateVector currentTracks = m_mediaStreamPrivate->tracks();
-    bool selectedVideoTrackChanged = false;
 
     std::function<void(RefPtr<AudioTrackPrivateMediaStream>, int)> enableAudioTrack = [this](auto track, int index)
     {
@@ -536,24 +547,17 @@
         track->setTrackIndex(index);
         track->setEnabled(track->streamTrack()->enabled() && !track->streamTrack()->muted());
     };
-    updateTracksOfType(m_audioTrackMap, RealtimeMediaSource::Audio, currentTracks, &AudioTrackPrivateMediaStream::create, m_player, &MediaPlayer::removeAudioTrack, &MediaPlayer::addAudioTrack, enableAudioTrack);
+    updateTracksOfType(m_audioTrackMap, RealtimeMediaSource::Audio, currentTracks, &AudioTrackPrivateMediaStream::create, m_player, &MediaPlayer::removeAudioTrack, &MediaPlayer::addAudioTrack, enableAudioTrack, (MediaStreamTrackPrivate::Observer*) this);
 
-    std::function<void(RefPtr<VideoTrackPrivateMediaStream>, int)> enableVideoTrack = [this, &selectedVideoTrackChanged](auto track, int index)
+    std::function<void(RefPtr<VideoTrackPrivateMediaStream>, int)> enableVideoTrack = [this](auto track, int index)
     {
-        bool wasSelected = track->selected();
         track->setTrackIndex(index);
         track->setSelected(track->streamTrack() == m_mediaStreamPrivate->activeVideoTrack());
-        if (wasSelected != track->selected())
-            selectedVideoTrackChanged = true;
+
+        if (track->selected())
+            ensureLayer();
     };
-    updateTracksOfType(m_videoTrackMap, RealtimeMediaSource::Video, currentTracks, &VideoTrackPrivateMediaStream::create, m_player, &MediaPlayer::removeVideoTrack, &MediaPlayer::addVideoTrack, enableVideoTrack);
-
-    if (selectedVideoTrackChanged) {
-        if (m_previewLayer)
-            m_previewLayer = nullptr;
-
-        createPreviewLayers();
-    }
+    updateTracksOfType(m_videoTrackMap, RealtimeMediaSource::Video, currentTracks, &VideoTrackPrivateMediaStream::create, m_player, &MediaPlayer::removeVideoTrack, &MediaPlayer::addVideoTrack, enableVideoTrack, (MediaStreamTrackPrivate::Observer*) this);
 }
 
 std::unique_ptr<PlatformTimeRanges> MediaPlayerPrivateMediaStreamAVFObjC::seekable() const
@@ -593,6 +597,14 @@
     }
 }
 
+void MediaPlayerPrivateMediaStreamAVFObjC::acceleratedRenderingStateChanged()
+{
+    if (m_player->client().mediaPlayerRenderingCanBeAccelerated(m_player))
+        ensureLayer();
+    else
+        destroyLayer();
+}
+
 String MediaPlayerPrivateMediaStreamAVFObjC::engineDescription() const
 {
     static NeverDestroyed<String> description(ASCIILiteral("AVFoundation MediaStream Engine"));

Modified: trunk/Source/WebCore/platform/mediastream/MediaStreamPrivate.cpp (203738 => 203739)


--- trunk/Source/WebCore/platform/mediastream/MediaStreamPrivate.cpp	2016-07-26 21:49:26 UTC (rev 203738)
+++ trunk/Source/WebCore/platform/mediastream/MediaStreamPrivate.cpp	2016-07-26 21:52:27 UTC (rev 203739)
@@ -257,7 +257,7 @@
 {
     m_activeVideoTrack = nullptr;
     for (auto& track : m_trackSet.values()) {
-        if (!track->ended() && track->type() == RealtimeMediaSource::Type::Video && !track->ended()) {
+        if (!track->ended() && track->type() == RealtimeMediaSource::Type::Video) {
             m_activeVideoTrack = track.get();
             break;
         }

Modified: trunk/Source/WebCore/platform/mediastream/MediaStreamTrackPrivate.cpp (203738 => 203739)


--- trunk/Source/WebCore/platform/mediastream/MediaStreamTrackPrivate.cpp	2016-07-26 21:49:26 UTC (rev 203738)
+++ trunk/Source/WebCore/platform/mediastream/MediaStreamTrackPrivate.cpp	2016-07-26 21:52:27 UTC (rev 203739)
@@ -213,6 +213,13 @@
     return !m_isEnded;
 }
 
+void MediaStreamTrackPrivate::sourceHasMoreMediaData(MediaSample& mediaSample)
+{
+    mediaSample.setTrackID(id());
+    for (auto& observer : m_observers)
+        observer->sampleBufferUpdated(*this, mediaSample);
+}
+
 } // namespace WebCore
 
 #endif // ENABLE(MEDIA_STREAM)

Modified: trunk/Source/WebCore/platform/mediastream/MediaStreamTrackPrivate.h (203738 => 203739)


--- trunk/Source/WebCore/platform/mediastream/MediaStreamTrackPrivate.h	2016-07-26 21:49:26 UTC (rev 203738)
+++ trunk/Source/WebCore/platform/mediastream/MediaStreamTrackPrivate.h	2016-07-26 21:52:27 UTC (rev 203739)
@@ -37,6 +37,7 @@
 
 class AudioSourceProvider;
 class GraphicsContext;
+class MediaSample;
 class MediaSourceSettings;
 class RealtimeMediaSourceCapabilities;
 
@@ -50,6 +51,7 @@
         virtual void trackMutedChanged(MediaStreamTrackPrivate&) = 0;
         virtual void trackSettingsChanged(MediaStreamTrackPrivate&) = 0;
         virtual void trackEnabledChanged(MediaStreamTrackPrivate&) = 0;
+        virtual void sampleBufferUpdated(MediaStreamTrackPrivate&, MediaSample&) { };
     };
     
     static RefPtr<MediaStreamTrackPrivate> create(RefPtr<RealtimeMediaSource>&&);
@@ -104,6 +106,7 @@
     void sourceMutedChanged() final;
     void sourceSettingsChanged() final;
     bool preventSourceFromStopping() final;
+    void sourceHasMoreMediaData(MediaSample&) final;
 
     Vector<Observer*> m_observers;
     RefPtr<RealtimeMediaSource> m_source;

Modified: trunk/Source/WebCore/platform/mediastream/RealtimeMediaSource.cpp (203738 => 203739)


--- trunk/Source/WebCore/platform/mediastream/RealtimeMediaSource.cpp	2016-07-26 21:49:26 UTC (rev 203738)
+++ trunk/Source/WebCore/platform/mediastream/RealtimeMediaSource.cpp	2016-07-26 21:52:27 UTC (rev 203739)
@@ -90,12 +90,18 @@
         observer->sourceMutedChanged();
 }
 
-void RealtimeMediaSource::settingsDidChanged()
+void RealtimeMediaSource::settingsDidChange()
 {
     for (auto& observer : m_observers)
         observer->sourceSettingsChanged();
 }
 
+void RealtimeMediaSource::mediaDataUpdated(MediaSample& mediaSample)
+{
+    for (auto& observer : m_observers)
+        observer->sourceHasMoreMediaData(mediaSample);
+}
+
 bool RealtimeMediaSource::readonly() const
 {
     return m_readonly;

Modified: trunk/Source/WebCore/platform/mediastream/RealtimeMediaSource.h (203738 => 203739)


--- trunk/Source/WebCore/platform/mediastream/RealtimeMediaSource.h	2016-07-26 21:49:26 UTC (rev 203738)
+++ trunk/Source/WebCore/platform/mediastream/RealtimeMediaSource.h	2016-07-26 21:52:27 UTC (rev 203739)
@@ -39,6 +39,7 @@
 #include "AudioSourceProvider.h"
 #include "Image.h"
 #include "MediaConstraints.h"
+#include "MediaSample.h"
 #include "PlatformLayer.h"
 #include "RealtimeMediaSourceCapabilities.h"
 #include <wtf/RefCounted.h>
@@ -66,6 +67,9 @@
 
         // Observer state queries.
         virtual bool preventSourceFromStopping() = 0;
+        
+        // Media data changes.
+        virtual void sourceHasMoreMediaData(MediaSample&) = 0;
     };
 
     virtual ~RealtimeMediaSource() { }
@@ -86,7 +90,8 @@
 
     virtual RefPtr<RealtimeMediaSourceCapabilities> capabilities() = 0;
     virtual const RealtimeMediaSourceSettings& settings() = 0;
-    void settingsDidChanged();
+    void settingsDidChange();
+    void mediaDataUpdated(MediaSample&);
     
     bool stopped() const { return m_stopped; }
 

Modified: trunk/Source/WebCore/platform/mediastream/mac/AVVideoCaptureSource.mm (203738 => 203739)


--- trunk/Source/WebCore/platform/mediastream/mac/AVVideoCaptureSource.mm	2016-07-26 21:49:26 UTC (rev 203738)
+++ trunk/Source/WebCore/platform/mediastream/mac/AVVideoCaptureSource.mm	2016-07-26 21:52:27 UTC (rev 203739)
@@ -34,6 +34,7 @@
 #import "IntRect.h"
 #import "Logging.h"
 #import "MediaConstraints.h"
+#import "MediaSampleAVFObjC.h"
 #import "NotImplemented.h"
 #import "PlatformLayer.h"
 #import "RealtimeMediaSourceCenter.h"
@@ -298,7 +299,9 @@
     }
 
     if (settingsChanged)
-        this->settingsDidChanged();
+        settingsDidChange();
+
+    mediaDataUpdated(MediaSampleAVFObjC::create(sampleBuffer.get()));
 }
 
 void AVVideoCaptureSource::captureOutputDidOutputSampleBufferFromConnection(AVCaptureOutputType*, CMSampleBufferRef sampleBuffer, AVCaptureConnectionType*)

Modified: trunk/Source/WebCore/platform/mediastream/mac/MockRealtimeVideoSourceMac.h (203738 => 203739)


--- trunk/Source/WebCore/platform/mediastream/mac/MockRealtimeVideoSourceMac.h	2016-07-26 21:49:26 UTC (rev 203738)
+++ trunk/Source/WebCore/platform/mediastream/mac/MockRealtimeVideoSourceMac.h	2016-07-26 21:52:27 UTC (rev 203739)
@@ -37,6 +37,10 @@
 #include "MockRealtimeVideoSource.h"
 #include <wtf/RunLoop.h>
 
+typedef struct __CVBuffer *CVBufferRef;
+typedef CVBufferRef CVImageBufferRef;
+typedef CVImageBufferRef CVPixelBufferRef;
+
 namespace WebCore {
 
 class MockRealtimeVideoSourceMac final : public MockRealtimeVideoSource {
@@ -48,8 +52,12 @@
     friend class MockRealtimeVideoSource;
     MockRealtimeVideoSourceMac();
 
+    RetainPtr<CMSampleBufferRef> CMSampleBufferFromPixelBuffer(CVPixelBufferRef);
+    RetainPtr<CVPixelBufferRef> pixelBufferFromCGImage(CGImageRef) const;
+
     PlatformLayer* platformLayer() const override;
     void updatePlatformLayer() const override;
+    void updateSampleBuffer() override;
 
     mutable RetainPtr<CGImageRef> m_previewImage;
     mutable RetainPtr<PlatformLayer> m_previewLayer;

Modified: trunk/Source/WebCore/platform/mediastream/mac/MockRealtimeVideoSourceMac.mm (203738 => 203739)


--- trunk/Source/WebCore/platform/mediastream/mac/MockRealtimeVideoSourceMac.mm	2016-07-26 21:49:26 UTC (rev 203738)
+++ trunk/Source/WebCore/platform/mediastream/mac/MockRealtimeVideoSourceMac.mm	2016-07-26 21:52:27 UTC (rev 203739)
@@ -35,6 +35,7 @@
 #import "GraphicsContextCG.h"
 #import "ImageBuffer.h"
 #import "MediaConstraints.h"
+#import "MediaSampleAVFObjC.h"
 #import "NotImplemented.h"
 #import "PlatformLayer.h"
 #import "RealtimeMediaSourceSettings.h"
@@ -42,6 +43,9 @@
 #import <QuartzCore/CATransaction.h>
 #import <objc/runtime.h>
 
+#import "CoreMediaSoftLink.h"
+#import "CoreVideoSoftLink.h"
+
 namespace WebCore {
 
 Ref<MockRealtimeVideoSource> MockRealtimeVideoSource::create()
@@ -54,6 +58,57 @@
 {
 }
 
+RetainPtr<CMSampleBufferRef> MockRealtimeVideoSourceMac::CMSampleBufferFromPixelBuffer(CVPixelBufferRef pixelBuffer)
+{
+    if (!pixelBuffer)
+        return nullptr;
+
+    CMSampleTimingInfo timingInfo;
+
+    timingInfo.presentationTimeStamp = CMTimeMake(elapsedTime() * 1000, 1000);
+    timingInfo.decodeTimeStamp = kCMTimeInvalid;
+    timingInfo.duration = kCMTimeInvalid;
+
+    CMVideoFormatDescriptionRef formatDescription = nullptr;
+    OSStatus status = CMVideoFormatDescriptionCreateForImageBuffer(kCFAllocatorDefault, (CVImageBufferRef)pixelBuffer, &formatDescription);
+    if (status != noErr) {
+        LOG_ERROR("Failed to initialize CMVideoFormatDescription with error code: %d", status);
+        return nullptr;
+    }
+
+    CMSampleBufferRef sampleBuffer;
+    status = CMSampleBufferCreateReadyWithImageBuffer(kCFAllocatorDefault, (CVImageBufferRef)pixelBuffer, formatDescription, &timingInfo, &sampleBuffer);
+    CFRelease(formatDescription);
+    if (status != noErr) {
+        LOG_ERROR("Failed to initialize CMSampleBuffer with error code: %d", status);
+        return nullptr;
+    }
+
+    return adoptCF(sampleBuffer);
+}
+
+RetainPtr<CVPixelBufferRef> MockRealtimeVideoSourceMac::pixelBufferFromCGImage(CGImageRef image) const
+{
+    CGSize frameSize = CGSizeMake(CGImageGetWidth(image), CGImageGetHeight(image));
+    CFDictionaryRef options = (__bridge CFDictionaryRef) @{
+        (__bridge NSString *)kCVPixelBufferCGImageCompatibilityKey: @(NO),
+        (__bridge NSString *)kCVPixelBufferCGBitmapContextCompatibilityKey: @(NO)
+    };
+    CVPixelBufferRef pixelBuffer;
+    CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, frameSize.width, frameSize.height, kCVPixelFormatType_32ARGB, options, &pixelBuffer);
+    if (status != kCVReturnSuccess)
+        return nullptr;
+
+    CVPixelBufferLockBaseAddress(pixelBuffer, 0);
+    void* data = CVPixelBufferGetBaseAddress(pixelBuffer);
+    auto rgbColorSpace = adoptCF(CGColorSpaceCreateDeviceRGB());
+    auto context = adoptCF(CGBitmapContextCreate(data, frameSize.width, frameSize.height, 8, CVPixelBufferGetBytesPerRow(pixelBuffer), rgbColorSpace.get(), (CGBitmapInfo) kCGImageAlphaNoneSkipFirst));
+    CGContextDrawImage(context.get(), CGRectMake(0, 0, CGImageGetWidth(image), CGImageGetHeight(image)), image);
+    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
+
+    return adoptCF(pixelBuffer);
+}
+
 PlatformLayer* MockRealtimeVideoSourceMac::platformLayer() const
 {
     if (m_previewLayer)
@@ -97,6 +152,14 @@
     [CATransaction commit];
 }
 
+void MockRealtimeVideoSourceMac::updateSampleBuffer()
+{
+    auto pixelBuffer = pixelBufferFromCGImage(imageBuffer()->copyImage()->getCGImageRef());
+    auto sampleBuffer = CMSampleBufferFromPixelBuffer(pixelBuffer.get());
+    
+    mediaDataUpdated(MediaSampleAVFObjC::create(sampleBuffer.get()));
+}
+
 } // namespace WebCore
 
 #endif // ENABLE(MEDIA_STREAM)

Modified: trunk/Source/WebCore/platform/mock/MockRealtimeVideoSource.cpp (203738 => 203739)


--- trunk/Source/WebCore/platform/mock/MockRealtimeVideoSource.cpp	2016-07-26 21:49:26 UTC (rev 203738)
+++ trunk/Source/WebCore/platform/mock/MockRealtimeVideoSource.cpp	2016-07-26 21:52:27 UTC (rev 203739)
@@ -134,7 +134,7 @@
     if (m_timer.isActive())
         m_timer.startRepeating(std::chrono::milliseconds(lround(1000 / m_frameRate)));
 
-    settingsDidChanged();
+    settingsDidChange();
 }
 
 void MockRealtimeVideoSource::setSize(const IntSize& size)
@@ -169,7 +169,7 @@
     m_imageBuffer = nullptr;
     updatePlatformLayer();
 
-    settingsDidChanged();
+    settingsDidChange();
 }
 
 void MockRealtimeVideoSource::drawAnimation(GraphicsContext& context)
@@ -331,6 +331,7 @@
     drawBoxes(context);
 
     updatePlatformLayer();
+    updateSampleBuffer();
 }
 
 ImageBuffer* MockRealtimeVideoSource::imageBuffer() const

Modified: trunk/Source/WebCore/platform/mock/MockRealtimeVideoSource.h (203738 => 203739)


--- trunk/Source/WebCore/platform/mock/MockRealtimeVideoSource.h	2016-07-26 21:49:26 UTC (rev 203738)
+++ trunk/Source/WebCore/platform/mock/MockRealtimeVideoSource.h	2016-07-26 21:52:27 UTC (rev 203739)
@@ -59,9 +59,12 @@
 protected:
     MockRealtimeVideoSource(const String& name = ASCIILiteral("Mock video device"));
     virtual void updatePlatformLayer() const { }
+    virtual void updateSampleBuffer() { }
 
     ImageBuffer* imageBuffer() const;
 
+    double elapsedTime();
+
 private:
     void updateSettings(RealtimeMediaSourceSettings&) override;
     void initializeCapabilities(RealtimeMediaSourceCapabilities&) override;
@@ -80,8 +83,6 @@
 
     void generateFrame();
 
-    double elapsedTime();
-
     float m_baseFontSize { 0 };
     FontCascade m_timeFont;
 