Title: [214044] trunk
Revision: 214044
Author: commit-qu...@webkit.org
Date: 2017-03-16 09:09:50 -0700 (Thu, 16 Mar 2017)

Log Message

Improve WebRTC track enabled support
https://bugs.webkit.org/show_bug.cgi?id=169727

Patch by Youenn Fablet <you...@apple.com> on 2017-03-16
Reviewed by Alex Christensen.

Source/WebCore:

Tests: webrtc/peer-connection-audio-mute2.html
       webrtc/peer-connection-remote-audio-mute.html
       webrtc/video-remote-mute.html

Make sure muted/disabled sources produce silence/black frames.
For outgoing audio/video sources, this should be done by the actual A/V providers;
we keep this filtering here until we are sure they implement it.
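The audio half of this pattern — keep delivering buffers of the same size for a disabled track, but filled with silence, so downstream consumers see a continuous timeline — can be sketched as follows (a minimal illustration under assumed names, not WebKit's actual API):

```cpp
#include <cassert>
#include <vector>

// Hypothetical sketch: a muted/disabled track still produces a chunk of the
// same frame count, but every sample is zero (silence). Dropping the chunk
// instead would stall timing for consumers such as WebAudio analysers.
std::vector<float> renderChunk(const std::vector<float>& captured, bool enabled)
{
    if (enabled)
        return captured; // pass real samples through unchanged
    // Same size, all zeros: silence with an intact timeline.
    return std::vector<float>(captured.size(), 0.0f);
}
```

This mirrors the `memset`-to-zero paths added below in AudioSampleDataSource, MockRealtimeAudioSourceMac and RealtimeIncomingAudioSource.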

* platform/audio/mac/AudioSampleDataSource.mm:
(WebCore::AudioSampleDataSource::pullAvalaibleSamplesAsChunks): Ensuring disabled audio tracks send silence.
Used for outgoing webrtc tracks.
* platform/mediastream/mac/MockRealtimeAudioSourceMac.mm:
(WebCore::MockRealtimeAudioSourceMac::render): Ditto.
* platform/mediastream/mac/RealtimeIncomingAudioSource.cpp:
(WebCore::RealtimeIncomingAudioSource::OnData): Ditto.
* platform/mediastream/mac/RealtimeIncomingVideoSource.cpp:
(WebCore::RealtimeIncomingVideoSource::pixelBufferFromVideoFrame): Generating black frames if muted.
(WebCore::RealtimeIncomingVideoSource::OnFrame):
* platform/mediastream/mac/RealtimeIncomingVideoSource.h:
* platform/mediastream/mac/RealtimeOutgoingVideoSource.cpp:
(WebCore::RealtimeOutgoingVideoSource::videoSampleAvailable): Ensuring we quit after sending black frame.

LayoutTests:

* TestExpectations:
* webrtc/audio-peer-connection-webaudio.html:
* webrtc/peer-connection-audio-mute-expected.txt:
* webrtc/peer-connection-audio-mute.html:
* webrtc/peer-connection-audio-mute2-expected.txt: Added.
* webrtc/peer-connection-audio-mute2.html: Added.
* webrtc/peer-connection-remote-audio-mute-expected.txt: Added.
* webrtc/peer-connection-remote-audio-mute.html: Added.
* webrtc/video-mute-expected.txt:
* webrtc/video-mute.html:
* webrtc/video-remote-mute-expected.txt: Added.
* webrtc/video-remote-mute.html: Added.

Diff

Modified: trunk/LayoutTests/ChangeLog (214043 => 214044)


--- trunk/LayoutTests/ChangeLog	2017-03-16 15:23:00 UTC (rev 214043)
+++ trunk/LayoutTests/ChangeLog	2017-03-16 16:09:50 UTC (rev 214044)
@@ -1,3 +1,23 @@
+2017-03-16  Youenn Fablet  <you...@apple.com>
+
+        Improve WebRTC track enabled support
+        https://bugs.webkit.org/show_bug.cgi?id=169727
+
+        Reviewed by Alex Christensen.
+
+        * TestExpectations:
+        * webrtc/audio-peer-connection-webaudio.html:
+        * webrtc/peer-connection-audio-mute-expected.txt:
+        * webrtc/peer-connection-audio-mute.html:
+        * webrtc/peer-connection-audio-mute2-expected.txt: Added.
+        * webrtc/peer-connection-audio-mute2.html: Added.
+        * webrtc/peer-connection-remote-audio-mute-expected.txt: Added.
+        * webrtc/peer-connection-remote-audio-mute.html: Added.
+        * webrtc/video-mute-expected.txt:
+        * webrtc/video-mute.html:
+        * webrtc/video-remote-mute-expected.txt: Added.
+        * webrtc/video-remote-mute.html: Added.
+
 2017-03-16  Manuel Rego Casasnovas  <r...@igalia.com>
 
         [css-grid] Crash on debug removing a positioned child

Modified: trunk/LayoutTests/TestExpectations (214043 => 214044)


--- trunk/LayoutTests/TestExpectations	2017-03-16 15:23:00 UTC (rev 214043)
+++ trunk/LayoutTests/TestExpectations	2017-03-16 16:09:50 UTC (rev 214044)
@@ -711,7 +711,10 @@
 # GTK enables some of these tests on their TestExpectations file.
 [ Release ] webrtc [ Skip ]
 
-[ Debug ] webrtc/audio-peer-connection-webaudio.html [ Failure ]
+[ Debug ] webrtc/peer-connection-audio-mute.html [ Pass Failure ]
+[ Debug ] webrtc/peer-connection-audio-mute2.html [ Pass Failure ]
+[ Debug ] webrtc/peer-connection-remote-audio-mute.html [ Pass Failure ]
+[ Debug ] webrtc/peer-connection-remote-audio-mute2.html [ Pass Failure ]
 fast/mediastream/getUserMedia-webaudio.html [ Skip ]
 fast/mediastream/RTCPeerConnection-AddRemoveStream.html [ Skip ]
 fast/mediastream/RTCPeerConnection-closed-state.html [ Skip ]

Modified: trunk/LayoutTests/webrtc/audio-peer-connection-webaudio.html (214043 => 214044)


--- trunk/LayoutTests/webrtc/audio-peer-connection-webaudio.html	2017-03-16 15:23:00 UTC (rev 214043)
+++ trunk/LayoutTests/webrtc/audio-peer-connection-webaudio.html	2017-03-16 16:09:50 UTC (rev 214044)
@@ -11,7 +11,7 @@
         if (window.testRunner)
             testRunner.setUserMediaPermission(true);
 
-       return navigator.mediaDevices.getUserMedia({audio: true}).then((stream) => {
+        return navigator.mediaDevices.getUserMedia({audio: true}).then((stream) => {
             if (window.internals)
                 internals.useMockRTCPeerConnectionFactory("TwoRealPeerConnections");
             return new Promise((resolve, reject) => {
@@ -21,14 +21,14 @@
                     secondConnection._onaddstream_ = (streamEvent) => { resolve(streamEvent.stream); };
                 });
                 setTimeout(() => reject("Test timed out"), 5000);
-            }).then((stream) => {
-                return analyseAudio(stream, 1000);
-            }).then((results) => {
-                assert_true(results.heardHum, "heard hum");
-                assert_true(results.heardBip, "heard bip");
-                assert_true(results.heardBop, "heard bop");
             });
-         });
+        }).then((remoteStream) => {
+            return analyseAudio(remoteStream, 1000);
+        }).then((results) => {
+            assert_true(results.heardHum, "heard hum");
+            assert_true(results.heardBip, "heard bip");
+            assert_true(results.heardBop, "heard bop");
+        });
     }, "Basic audio playback through a peer connection");
     </script>
 </head>

Modified: trunk/LayoutTests/webrtc/peer-connection-audio-mute-expected.txt (214043 => 214044)


--- trunk/LayoutTests/webrtc/peer-connection-audio-mute-expected.txt	2017-03-16 15:23:00 UTC (rev 214043)
+++ trunk/LayoutTests/webrtc/peer-connection-audio-mute-expected.txt	2017-03-16 16:09:50 UTC (rev 214044)
@@ -1,3 +1,3 @@
 
-FAIL Muting and unmuting an audio track assert_true: heard hum expected true got false
+PASS Muting a local audio track and making sure the remote track is silent 
 

Modified: trunk/LayoutTests/webrtc/peer-connection-audio-mute.html (214043 => 214044)


--- trunk/LayoutTests/webrtc/peer-connection-audio-mute.html	2017-03-16 15:23:00 UTC (rev 214043)
+++ trunk/LayoutTests/webrtc/peer-connection-audio-mute.html	2017-03-16 16:09:50 UTC (rev 214044)
@@ -13,17 +13,19 @@
         if (window.testRunner)
             testRunner.setUserMediaPermission(true);
 
-        return navigator.mediaDevices.getUserMedia({audio: true}).then((stream) => {
+        var localTrack;
+        return navigator.mediaDevices.getUserMedia({audio: true}).then((localStream) => {
             if (window.internals)
                 internals.useMockRTCPeerConnectionFactory("TwoRealPeerConnections");
 
-            var stream;
+            localTrack = localStream.getAudioTracks()[0];
+            var remoteStream;
             return new Promise((resolve, reject) => {
                 createConnections((firstConnection) => {
-                    firstConnection.addStream(stream);
+                    firstConnection.addStream(localStream);
                 }, (secondConnection) => {
                     secondConnection._onaddstream_ = (streamEvent) => {
-                        stream = streamEvent.stream;
+                        remoteStream = streamEvent.stream;
                         resolve();
                     };
                 });
@@ -30,36 +32,19 @@
             }).then(() => {
                 return waitFor(500);
             }).then(() => {
-                return analyseAudio(stream, 500).then((results) => {
-                    assert_true(results.heardHum, "heard hum");
-                    assert_true(results.heardBip, "heard bip");
-                    assert_true(results.heardBop, "heard bop");
+                return analyseAudio(remoteStream, 500).then((results) => {
+                    assert_true(results.heardHum, "heard hum from remote enabled track");
                 });
             }).then(() => {
-                stream.getAudioTracks().forEach((track) => {
-                    track.enabled = false;
-                });
+                localTrack.enabled = false;
                 return waitFor(500);
             }).then(() => {
-                return analyseAudio(stream, 500).then((results) => {
-                    assert_false(results.heardHum, "heard hum");
-                    assert_false(results.heardBip, "heard bip");
-                    assert_false(results.heardBop, "heard bop");
+                return analyseAudio(remoteStream, 500).then((results) => {
+                    assert_false(results.heardHum, "not heard hum from remote disabled track");
                 });
-            }).then(() => {
-                stream.getAudioTracks().forEach((track) => {
-                    track.enabled = true;
-                });
-                return waitFor(500);
-            }).then(() => {
-                return analyseAudio(stream, 500).then((results) => {
-                    assert_true(results.heardHum, "heard hum");
-                    assert_true(results.heardBip, "heard bip");
-                    assert_true(results.heardBop, "heard bop");
-                });
             });
         });
-    }, "Muting and unmuting an audio track");
+    }, "Muting a local audio track and making sure the remote track is silent");
     </script>
 </body>
 </html>

Added: trunk/LayoutTests/webrtc/peer-connection-audio-mute2-expected.txt (0 => 214044)


--- trunk/LayoutTests/webrtc/peer-connection-audio-mute2-expected.txt	                        (rev 0)
+++ trunk/LayoutTests/webrtc/peer-connection-audio-mute2-expected.txt	2017-03-16 16:09:50 UTC (rev 214044)
@@ -0,0 +1,3 @@
+
+PASS Muting and unmuting a local audio track 
+

Copied: trunk/LayoutTests/webrtc/peer-connection-audio-mute2.html (from rev 214043, trunk/LayoutTests/webrtc/peer-connection-audio-mute.html) (0 => 214044)


--- trunk/LayoutTests/webrtc/peer-connection-audio-mute2.html	                        (rev 0)
+++ trunk/LayoutTests/webrtc/peer-connection-audio-mute2.html	2017-03-16 16:09:50 UTC (rev 214044)
@@ -0,0 +1,57 @@
+<!DOCTYPE html>
+<html>
+<head>
+    <meta charset="utf-8">
+    <title>Testing local audio capture playback causes "playing" event to fire</title>
+    <script src=""
+    <script src=""
+</head>
+<body>
+    <script src =""
+    <script>
+    promise_test((test) => {
+        if (window.testRunner)
+            testRunner.setUserMediaPermission(true);
+
+        var localTrack;
+        return navigator.mediaDevices.getUserMedia({audio: true}).then((localStream) => {
+            if (window.internals)
+                internals.useMockRTCPeerConnectionFactory("TwoRealPeerConnections");
+
+            localTrack = localStream.getAudioTracks()[0];
+            var remoteStream;
+            return new Promise((resolve, reject) => {
+                createConnections((firstConnection) => {
+                    firstConnection.addStream(localStream);
+                }, (secondConnection) => {
+                    secondConnection._onaddstream_ = (streamEvent) => {
+                        remoteStream = streamEvent.stream;
+                        resolve();
+                    };
+                });
+            }).then(() => {
+                return waitFor(500);
+            }).then(() => {
+                return analyseAudio(remoteStream, 500).then((results) => {
+                    assert_true(results.heardHum, "heard hum from remote enabled track");
+                });
+            }).then(() => {
+                localTrack.enabled = false;
+                return waitFor(500);
+            }).then(() => {
+                return analyseAudio(remoteStream, 500).then((results) => {
+                    assert_false(results.heardHum, "not heard hum from remote disabled track");
+                });
+            }).then(() => {
+                localTrack.enabled = true;
+                return waitFor(500);
+            }).then(() => {
+                return analyseAudio(remoteStream, 500).then((results) => {
+                    assert_true(results.heardHum, "heard hum from remote reenabled track");
+                });
+            });
+        });
+    }, "Muting and unmuting a local audio track");
+    </script>
+</body>
+</html>

Added: trunk/LayoutTests/webrtc/peer-connection-remote-audio-mute-expected.txt (0 => 214044)


--- trunk/LayoutTests/webrtc/peer-connection-remote-audio-mute-expected.txt	                        (rev 0)
+++ trunk/LayoutTests/webrtc/peer-connection-remote-audio-mute-expected.txt	2017-03-16 16:09:50 UTC (rev 214044)
@@ -0,0 +1,3 @@
+
+PASS Muting an incoming audio track 
+

Added: trunk/LayoutTests/webrtc/peer-connection-remote-audio-mute.html (0 => 214044)


--- trunk/LayoutTests/webrtc/peer-connection-remote-audio-mute.html	                        (rev 0)
+++ trunk/LayoutTests/webrtc/peer-connection-remote-audio-mute.html	2017-03-16 16:09:50 UTC (rev 214044)
@@ -0,0 +1,47 @@
+<!DOCTYPE html>
+<html>
+<head>
+    <meta charset="utf-8">
+    <title>Testing local audio capture playback causes "playing" event to fire</title>
+    <script src=""
+    <script src=""
+</head>
+<body>
+    <script src =""
+    <script>
+    promise_test((test) => {
+        if (window.testRunner)
+            testRunner.setUserMediaPermission(true);
+
+        return navigator.mediaDevices.getUserMedia({audio: true}).then((localStream) => {
+            if (window.internals)
+                internals.useMockRTCPeerConnectionFactory("TwoRealPeerConnections");
+
+            var remoteTrack;
+            var remoteStream;
+            return new Promise((resolve, reject) => {
+                createConnections((firstConnection) => {
+                    firstConnection.addStream(localStream);
+                }, (secondConnection) => {
+                    secondConnection._onaddstream_ = (streamEvent) => {
+                        remoteStream = streamEvent.stream;
+                        remoteTrack = remoteStream.getAudioTracks()[0];
+                        resolve();
+                    };
+                });
+            }).then(() => {
+                return analyseAudio(remoteStream, 500).then((results) => {
+                    assert_true(results.heardHum, "heard hum from remote enabled track");
+                });
+            }).then(() => {
+                remoteTrack.enabled = false;
+            }).then(() => {
+                return analyseAudio(remoteStream, 500).then((results) => {
+                    assert_false(results.heardHum, "not heard hum from remote disabled track");
+                });
+            });
+        });
+    }, "Muting an incoming audio track");
+    </script>
+</body>
+</html>

Added: trunk/LayoutTests/webrtc/peer-connection-remote-audio-mute2-expected.txt (0 => 214044)


--- trunk/LayoutTests/webrtc/peer-connection-remote-audio-mute2-expected.txt	                        (rev 0)
+++ trunk/LayoutTests/webrtc/peer-connection-remote-audio-mute2-expected.txt	2017-03-16 16:09:50 UTC (rev 214044)
@@ -0,0 +1,3 @@
+
+PASS Muting and unmuting an incoming audio track 
+

Added: trunk/LayoutTests/webrtc/peer-connection-remote-audio-mute2.html (0 => 214044)


--- trunk/LayoutTests/webrtc/peer-connection-remote-audio-mute2.html	                        (rev 0)
+++ trunk/LayoutTests/webrtc/peer-connection-remote-audio-mute2.html	2017-03-16 16:09:50 UTC (rev 214044)
@@ -0,0 +1,53 @@
+<!DOCTYPE html>
+<html>
+<head>
+    <meta charset="utf-8">
+    <title>Testing local audio capture playback causes "playing" event to fire</title>
+    <script src=""
+    <script src=""
+</head>
+<body>
+    <script src =""
+    <script>
+    promise_test((test) => {
+        if (window.testRunner)
+            testRunner.setUserMediaPermission(true);
+
+        return navigator.mediaDevices.getUserMedia({audio: true}).then((localStream) => {
+            if (window.internals)
+                internals.useMockRTCPeerConnectionFactory("TwoRealPeerConnections");
+
+            var remoteTrack;
+            var remoteStream;
+            return new Promise((resolve, reject) => {
+                createConnections((firstConnection) => {
+                    firstConnection.addStream(localStream);
+                }, (secondConnection) => {
+                    secondConnection._onaddstream_ = (streamEvent) => {
+                        remoteStream = streamEvent.stream;
+                        remoteTrack = remoteStream.getAudioTracks()[0];
+                        resolve();
+                    };
+                });
+            }).then(() => {
+                return analyseAudio(remoteStream, 500).then((results) => {
+                    assert_true(results.heardHum, "heard hum from remote enabled track");
+                });
+            }).then(() => {
+                remoteTrack.enabled = false;
+            }).then(() => {
+                return analyseAudio(remoteStream, 500).then((results) => {
+                    assert_false(results.heardHum, "not heard hum from remote disabled track");
+                });
+            }).then(() => {
+                remoteTrack.enabled = true;
+            }).then(() => {
+                return analyseAudio(remoteStream, 500).then((results) => {
+                    assert_true(results.heardHum, "heard hum from remote reenabled track");
+                });
+            });
+        });
+    }, "Muting and unmuting an incoming audio track");
+    </script>
+</body>
+</html>

Modified: trunk/LayoutTests/webrtc/video-mute-expected.txt (214043 => 214044)


--- trunk/LayoutTests/webrtc/video-mute-expected.txt	2017-03-16 15:23:00 UTC (rev 214043)
+++ trunk/LayoutTests/webrtc/video-mute-expected.txt	2017-03-16 16:09:50 UTC (rev 214044)
@@ -1,4 +1,4 @@
 
 
-PASS Video muted/unmuted track 
+PASS Outgoing muted/unmuted video track 
 

Modified: trunk/LayoutTests/webrtc/video-mute.html (214043 => 214044)


--- trunk/LayoutTests/webrtc/video-mute.html	2017-03-16 15:23:00 UTC (rev 214043)
+++ trunk/LayoutTests/webrtc/video-mute.html	2017-03-16 16:09:50 UTC (rev 214044)
@@ -21,10 +21,11 @@
     canvas.height = video.videoHeight;
     canvas.getContext('2d').drawImage(video, 0, 0, canvas.width, canvas.height);
 
-    imageData = canvas.getContext('2d').getImageData(10, 325, 250, 1);
+    imageData = canvas.getContext('2d').getImageData(0, 0, canvas.width, canvas.height);
     data = ""
     for (var cptr = 0; cptr < canvas.width * canvas.height; ++cptr) {
-        if (data[4 * cptr] || data[4 * cptr + 1] || data[4 * cptr + 2])
+        // Approximatively black pixels.
+        if (data[4 * cptr] > 10 || data[4 * cptr + 1] > 10 || data[4 * cptr + 2] > 10)
             return false;
     }
     return true;
@@ -35,35 +36,36 @@
     if (window.testRunner)
         testRunner.setUserMediaPermission(true);
 
-    return navigator.mediaDevices.getUserMedia({ video: true}).then((stream) => {
+    return navigator.mediaDevices.getUserMedia({ video: true}).then((localStream) => {
         return new Promise((resolve, reject) => {
             if (window.internals)
                 internals.useMockRTCPeerConnectionFactory("TwoRealPeerConnections");
 
+            track = localStream.getVideoTracks()[0];
+
             createConnections((firstConnection) => {
-                firstConnection.addStream(stream);
+                firstConnection.addStream(localStream);
             }, (secondConnection) => {
                 secondConnection._onaddstream_ = (streamEvent) => { resolve(streamEvent.stream); };
             });
             setTimeout(() => reject("Test timed out"), 5000);
         });
-    }).then((stream) => {
-        video.srcObject = stream;
-        track = stream.getVideoTracks()[0];
+    }).then((remoteStream) => {
+        video.srcObject = remoteStream;
         return video.play();
     }).then(() => {
-         assert_false(isVideoBlack());
+         assert_false(isVideoBlack(), "track is enabled, video is not black");
     }).then(() => {
         track.enabled = false;
         return waitFor(500);
     }).then(() => {
-        assert_true(isVideoBlack());
+        assert_true(isVideoBlack(), "track is disabled, video is black");
         track.enabled = true;
         return waitFor(500);
     }).then(() => {
-        assert_false(isVideoBlack());
+        assert_false(isVideoBlack(), "track is reenabled, video is not black");
     });
-}, "Video muted/unmuted track");
+}, "Outgoing muted/unmuted video track");
         </script>
     </body>
 </html>

Added: trunk/LayoutTests/webrtc/video-remote-mute-expected.txt (0 => 214044)


--- trunk/LayoutTests/webrtc/video-remote-mute-expected.txt	                        (rev 0)
+++ trunk/LayoutTests/webrtc/video-remote-mute-expected.txt	2017-03-16 16:09:50 UTC (rev 214044)
@@ -0,0 +1,4 @@
+
+
+PASS Incoming muted/unmuted video track 
+

Copied: trunk/LayoutTests/webrtc/video-remote-mute.html (from rev 214043, trunk/LayoutTests/webrtc/video-mute.html) (0 => 214044)


--- trunk/LayoutTests/webrtc/video-remote-mute.html	                        (rev 0)
+++ trunk/LayoutTests/webrtc/video-remote-mute.html	2017-03-16 16:09:50 UTC (rev 214044)
@@ -0,0 +1,69 @@
+<!doctype html>
+<html>
+    <head>
+        <meta charset="utf-8">
+        <title>Testing basic video exchange from offerer to receiver</title>
+        <script src=""
+        <script src=""
+    </head>
+    <body>
+        <video id="video" autoplay=""></video>
+        <canvas id="canvas" width="640" height="480"></canvas>
+        <script src =""
+        <script>
+video = document.getElementById("video");
+canvas = document.getElementById("canvas");
+// FIXME: We should use tracks
+
+function isVideoBlack()
+{
+    canvas.width = video.videoWidth;
+    canvas.height = video.videoHeight;
+    canvas.getContext('2d').drawImage(video, 0, 0, canvas.width, canvas.height);
+
+    imageData = canvas.getContext('2d').getImageData(10, 325, 250, 1);
+    data = ""
+    for (var cptr = 0; cptr < canvas.width * canvas.height; ++cptr) {
+        if (data[4 * cptr] || data[4 * cptr + 1] || data[4 * cptr + 2])
+            return false;
+    }
+    return true;
+}
+
+var track;
+promise_test((test) => {
+    if (window.testRunner)
+        testRunner.setUserMediaPermission(true);
+
+    return navigator.mediaDevices.getUserMedia({ video: true}).then((localStream) => {
+        return new Promise((resolve, reject) => {
+            if (window.internals)
+                internals.useMockRTCPeerConnectionFactory("TwoRealPeerConnections");
+
+            createConnections((firstConnection) => {
+                firstConnection.addStream(localStream);
+            }, (secondConnection) => {
+                secondConnection._onaddstream_ = (streamEvent) => { resolve(streamEvent.stream); };
+            });
+            setTimeout(() => reject("Test timed out"), 5000);
+        });
+    }).then((remoteStream) => {
+        video.srcObject = remoteStream;
+        track = remoteStream.getVideoTracks()[0];
+        return video.play();
+    }).then(() => {
+         assert_false(isVideoBlack());
+    }).then(() => {
+        track.enabled = false;
+        return waitFor(500);
+    }).then(() => {
+        assert_true(isVideoBlack());
+        track.enabled = true;
+        return waitFor(500);
+    }).then(() => {
+        assert_false(isVideoBlack());
+    });
+}, "Incoming muted/unmuted video track");
+        </script>
+    </body>
+</html>

Modified: trunk/Source/WebCore/ChangeLog (214043 => 214044)


--- trunk/Source/WebCore/ChangeLog	2017-03-16 15:23:00 UTC (rev 214043)
+++ trunk/Source/WebCore/ChangeLog	2017-03-16 16:09:50 UTC (rev 214044)
@@ -1,5 +1,34 @@
 2017-03-16  Youenn Fablet  <you...@apple.com>
 
+        Improve WebRTC track enabled support
+        https://bugs.webkit.org/show_bug.cgi?id=169727
+
+        Reviewed by Alex Christensen.
+
+        Tests: webrtc/peer-connection-audio-mute2.html
+               webrtc/peer-connection-remote-audio-mute.html
+               webrtc/video-remote-mute.html
+
+        Making sure muted/disabled sources produce silence/black frames.
+        For outgoing audio/video sources, this should be done by the actual a/v providers.
+        We keep this filtering here until we are sure they implement that.
+
+        * platform/audio/mac/AudioSampleDataSource.mm:
+        (WebCore::AudioSampleDataSource::pullAvalaibleSamplesAsChunks): Ensuring disabled audio tracks send silence.
+        Used for outgoing webrtc tracks.
+        * platform/mediastream/mac/MockRealtimeAudioSourceMac.mm:
+        (WebCore::MockRealtimeAudioSourceMac::render): Ditto.
+        * platform/mediastream/mac/RealtimeIncomingAudioSource.cpp:
+        (WebCore::RealtimeIncomingAudioSource::OnData): Ditto.
+        * platform/mediastream/mac/RealtimeIncomingVideoSource.cpp:
+        (WebCore::RealtimeIncomingVideoSource::pixelBufferFromVideoFrame): Generating black frames if muted.
+        (WebCore::RealtimeIncomingVideoSource::OnFrame):
+        * platform/mediastream/mac/RealtimeIncomingVideoSource.h:
+        * platform/mediastream/mac/RealtimeOutgoingVideoSource.cpp:
+        (WebCore::RealtimeOutgoingVideoSource::videoSampleAvailable): Ensuring we quit after sending black frame.
+
+2017-03-16  Youenn Fablet  <you...@apple.com>
+
         LibWebRTC outgoing source should be thread safe refcounted
         https://bugs.webkit.org/show_bug.cgi?id=169726
 

Modified: trunk/Source/WebCore/platform/audio/mac/AudioSampleDataSource.mm (214043 => 214044)


--- trunk/Source/WebCore/platform/audio/mac/AudioSampleDataSource.mm	2017-03-16 15:23:00 UTC (rev 214043)
+++ trunk/Source/WebCore/platform/audio/mac/AudioSampleDataSource.mm	2017-03-16 16:09:50 UTC (rev 214044)
@@ -311,6 +311,16 @@
         timeStamp = startFrame;
 
     startFrame = timeStamp;
+
+    if (m_muted) {
+        AudioSampleBufferList::zeroABL(buffer, sampleCountPerChunk * m_outputDescription->bytesPerFrame());
+        while (endFrame - startFrame >= sampleCountPerChunk) {
+            consumeFilledBuffer();
+            startFrame += sampleCountPerChunk;
+        }
+        return true;
+    }
+
     while (endFrame - startFrame >= sampleCountPerChunk) {
         if (m_ringBuffer->fetch(&buffer, sampleCountPerChunk, startFrame, CARingBuffer::Copy))
             return false;

Modified: trunk/Source/WebCore/platform/mediastream/mac/MockRealtimeAudioSourceMac.mm (214043 => 214044)


--- trunk/Source/WebCore/platform/mediastream/mac/MockRealtimeAudioSourceMac.mm	2017-03-16 15:23:00 UTC (rev 214043)
+++ trunk/Source/WebCore/platform/mediastream/mac/MockRealtimeAudioSourceMac.mm	2017-03-16 16:09:50 UTC (rev 214044)
@@ -147,9 +147,6 @@
 
 void MockRealtimeAudioSourceMac::render(double delta)
 {
-    if (m_muted || !m_enabled)
-        return;
-
     if (!m_audioBufferList)
         reconfigure();
 
@@ -162,8 +159,11 @@
         uint32_t bipBopCount = std::min(frameCount, bipBopRemain);
         for (auto& audioBuffer : m_audioBufferList->buffers()) {
             audioBuffer.mDataByteSize = frameCount * m_streamFormat.mBytesPerFrame;
-            memcpy(audioBuffer.mData, &m_bipBopBuffer[bipBopStart], sizeof(Float32) * bipBopCount);
-            addHum(HumVolume, HumFrequency, m_sampleRate, m_samplesRendered, static_cast<float*>(audioBuffer.mData), bipBopCount);
+            if (!m_muted && m_enabled) {
+                memcpy(audioBuffer.mData, &m_bipBopBuffer[bipBopStart], sizeof(Float32) * bipBopCount);
+                addHum(HumVolume, HumFrequency, m_sampleRate, m_samplesRendered, static_cast<float*>(audioBuffer.mData), bipBopCount);
+            } else
+                memset(audioBuffer.mData, 0, sizeof(Float32) * bipBopCount);
         }
         emitSampleBuffers(bipBopCount);
         m_samplesRendered += bipBopCount;

Modified: trunk/Source/WebCore/platform/mediastream/mac/RealtimeIncomingAudioSource.cpp (214043 => 214044)


--- trunk/Source/WebCore/platform/mediastream/mac/RealtimeIncomingAudioSource.cpp	2017-03-16 15:23:00 UTC (rev 214043)
+++ trunk/Source/WebCore/platform/mediastream/mac/RealtimeIncomingAudioSource.cpp	2017-03-16 16:09:50 UTC (rev 214044)
@@ -95,13 +95,17 @@
             m_audioSourceProvider->prepare(&m_streamFormat);
     }
 
+    // FIXME: We should not need to do the extra memory allocation and copy.
+    // Instead, we should be able to directly pass audioData pointer.
     WebAudioBufferList audioBufferList { CAAudioStreamDescription(m_streamFormat), WTF::safeCast<uint32_t>(numberOfFrames) };
     audioBufferList.buffer(0)->mDataByteSize = numberOfChannels * numberOfFrames * bitsPerSample / 8;
     audioBufferList.buffer(0)->mNumberChannels = numberOfChannels;
-    // FIXME: We should not need to do the extra memory allocation and copy.
-    // Instead, we should be able to directly pass audioData pointer.
-    memcpy(audioBufferList.buffer(0)->mData, audioData, audioBufferList.buffer(0)->mDataByteSize);
 
+    if (muted() || !enabled())
+        memset(audioBufferList.buffer(0)->mData, 0, audioBufferList.buffer(0)->mDataByteSize);
+    else
+        memcpy(audioBufferList.buffer(0)->mData, audioData, audioBufferList.buffer(0)->mDataByteSize);
+
     audioSamplesAvailable(mediaTime, audioBufferList, CAAudioStreamDescription(m_streamFormat), numberOfFrames);
 }
 

Modified: trunk/Source/WebCore/platform/mediastream/mac/RealtimeIncomingVideoSource.cpp (214043 => 214044)


--- trunk/Source/WebCore/platform/mediastream/mac/RealtimeIncomingVideoSource.cpp	2017-03-16 15:23:00 UTC (rev 214043)
+++ trunk/Source/WebCore/platform/mediastream/mac/RealtimeIncomingVideoSource.cpp	2017-03-16 16:09:50 UTC (rev 214044)
@@ -92,13 +92,40 @@
         m_videoTrack->RemoveSink(this);
 }
 
+CVPixelBufferRef RealtimeIncomingVideoSource::pixelBufferFromVideoFrame(const webrtc::VideoFrame& frame)
+{
+    if (muted() || !enabled()) {
+        if (!m_blackFrame || m_blackFrameWidth != frame.width() || m_blackFrameHeight != frame.height()) {
+            CVPixelBufferRef pixelBuffer = nullptr;
+            auto status = CVPixelBufferCreate(kCFAllocatorDefault, frame.width(), frame.height(), kCVPixelFormatType_420YpCbCr8Planar, nullptr, &pixelBuffer);
+            ASSERT_UNUSED(status, status == noErr);
+
+            m_blackFrame = pixelBuffer;
+            m_blackFrameWidth = frame.width();
+            m_blackFrameHeight = frame.height();
+
+            status = CVPixelBufferLockBaseAddress(pixelBuffer, 0);
+            ASSERT(status == noErr);
+            void* data = ""
+            size_t yLength = frame.width() * frame.height();
+            memset(data, 0, yLength);
+            memset(static_cast<uint8_t*>(data) + yLength, 128, yLength / 2);
+
+            status = CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
+            ASSERT(!status);
+        }
+        return m_blackFrame.get();
+    }
+    auto buffer = frame.video_frame_buffer();
+    return static_cast<CVPixelBufferRef>(buffer->native_handle());
+}
+
 void RealtimeIncomingVideoSource::OnFrame(const webrtc::VideoFrame& frame)
 {
     if (!m_isProducingData)
         return;
 
-    auto buffer = frame.video_frame_buffer();
-    CVPixelBufferRef pixelBuffer = static_cast<CVPixelBufferRef>(buffer->native_handle());
+    auto pixelBuffer = pixelBufferFromVideoFrame(frame);
 
     // FIXME: Convert timing information from VideoFrame to CMSampleTimingInfo.
     // For the moment, we will pretend that frames should be rendered asap.
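The cached black frame built in `pixelBufferFromVideoFrame()` above uses the planar 4:2:0 YCbCr layout, where "black" is a zeroed luma plane with both chroma planes at the neutral value 128. A standalone sketch of that buffer layout (plain vector instead of a CVPixelBuffer):

```cpp
#include <cassert>
#include <cstdint>
#include <cstring>
#include <vector>

// Sketch of the black-frame fill used above: planar 4:2:0 has width*height
// luma bytes followed by width*height/2 bytes of subsampled Cb+Cr. Black is
// Y = 0 with chroma at the neutral midpoint 128.
std::vector<uint8_t> makeBlackFrame420(size_t width, size_t height)
{
    size_t yLength = width * height;
    std::vector<uint8_t> frame(yLength + yLength / 2);
    std::memset(frame.data(), 0, yLength);                 // luma plane: black
    std::memset(frame.data() + yLength, 128, yLength / 2); // chroma planes: neutral
    return frame;
}
```

Caching the frame (as the patch does with `m_blackFrame`) avoids reallocating and refilling it on every incoming frame while the track stays disabled.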

Modified: trunk/Source/WebCore/platform/mediastream/mac/RealtimeIncomingVideoSource.h (214043 => 214044)


--- trunk/Source/WebCore/platform/mediastream/mac/RealtimeIncomingVideoSource.h	2017-03-16 15:23:00 UTC (rev 214043)
+++ trunk/Source/WebCore/platform/mediastream/mac/RealtimeIncomingVideoSource.h	2017-03-16 16:09:50 UTC (rev 214044)
@@ -70,6 +70,8 @@
     // rtc::VideoSinkInterface
     void OnFrame(const webrtc::VideoFrame&) final;
 
+    CVPixelBufferRef pixelBufferFromVideoFrame(const webrtc::VideoFrame&);
+
     RefPtr<Image> m_currentImage;
     RealtimeMediaSourceSettings m_currentSettings;
     RealtimeMediaSourceSupportedConstraints m_supportedConstraints;
@@ -79,6 +81,9 @@
     rtc::scoped_refptr<webrtc::VideoTrackInterface> m_videoTrack;
     RetainPtr<CMSampleBufferRef> m_buffer;
     PixelBufferConformerCV m_conformer;
+    RetainPtr<CVPixelBufferRef> m_blackFrame;
+    int m_blackFrameWidth { 0 };
+    int m_blackFrameHeight { 0 };
 };
 
 } // namespace WebCore

Modified: trunk/Source/WebCore/platform/mediastream/mac/RealtimeOutgoingVideoSource.cpp (214043 => 214044)


--- trunk/Source/WebCore/platform/mediastream/mac/RealtimeOutgoingVideoSource.cpp	2017-03-16 15:23:00 UTC (rev 214043)
+++ trunk/Source/WebCore/platform/mediastream/mac/RealtimeOutgoingVideoSource.cpp	2017-03-16 16:09:50 UTC (rev 214044)
@@ -92,6 +92,7 @@
         auto blackBuffer = m_bufferPool.CreateBuffer(settings.width(), settings.height());
         blackBuffer->SetToBlack();
         sendFrame(WTFMove(blackBuffer));
+        return;
     }
 
     ASSERT(sample.platformSample().type == PlatformSample::CMSampleBufferType);
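The one-line RealtimeOutgoingVideoSource fix above adds an early `return`: without it, the muted path fell through and sent the real frame right after the black one. A toy model of that control flow (names are illustrative, not WebKit's):

```cpp
#include <cassert>
#include <string>
#include <vector>

// Illustration of the control-flow bug fixed above: when the track is muted
// we must return immediately after sending the black frame; otherwise
// execution falls through and the real captured frame is also sent.
std::vector<std::string> framesSentForSample(bool muted)
{
    std::vector<std::string> sent;
    if (muted) {
        sent.push_back("black");
        return sent; // the added early return: stop here when muted
    }
    sent.push_back("real");
    return sent;
}
```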