Title: [177058] trunk/Source/WebCore
Revision: 177058
Author: ph...@webkit.org
Date: 2014-12-10 08:00:45 -0800 (Wed, 10 Dec 2014)

Log Message

[GStreamer] AudioSourceProvider support in the MediaPlayer
https://bugs.webkit.org/show_bug.cgi?id=78883

Reviewed by Gustavo Noronha Silva.

Add a GStreamer-based audio source provider for the GTK and EFL
ports. This new component gathers decoded raw audio data from the
MediaPlayer and pipes it to an AudioBus when required by the
User Agent.
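The central transformation the new component relies on is splitting playbin's interleaved stereo stream into planar per-channel buffers; in the patch this is done by GStreamer's deinterleave element before each channel reaches its own appsink. A minimal, GStreamer-free sketch of that operation (the function name `deinterleaveStereo` is illustrative, not part of the patch):

```cpp
#include <cstddef>
#include <vector>

// Split an interleaved stereo stream (L R L R ...) into planar
// per-channel buffers, which is what the deinterleave element does
// before each channel is routed to an appsink for data extraction.
static void deinterleaveStereo(const std::vector<float>& interleaved,
                               std::vector<float>& left,
                               std::vector<float>& right)
{
    size_t frames = interleaved.size() / 2;
    left.resize(frames);
    right.resize(frames);
    for (size_t i = 0; i < frames; ++i) {
        left[i] = interleaved[2 * i];      // even samples: front left
        right[i] = interleaved[2 * i + 1]; // odd samples: front right
    }
}
```

For example, an input of {0.1, -0.1, 0.2, -0.2} yields a left channel of {0.1, 0.2} and a right channel of {-0.1, -0.2}.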

* PlatformEfl.cmake: New files in the build.
* PlatformGTK.cmake: Ditto.
* platform/audio/gstreamer/AudioSourceProviderGStreamer.cpp: Added.
(WebCore::onAppsinkNewBufferCallback): Function called when a new
buffer can be pulled from appsink.
(WebCore::onGStreamerDeinterleavePadAddedCallback): Function
called when a new source pad has been added to deinterleave.
(WebCore::onGStreamerDeinterleaveReadyCallback): Function called
when the deinterleave element completed the configuration of all
its source pads.
(WebCore::copyGStreamerBuffersToAudioChannel): Called for each
channel of the AudioBus that needs data as input.
(WebCore::AudioSourceProviderGStreamer::AudioSourceProviderGStreamer):
Create an audio bin that by default routes buffers only to
autoaudiosink. A new route is added if the provider has a client.
(WebCore::AudioSourceProviderGStreamer::~AudioSourceProviderGStreamer):
Clean buffer adapters and audio bin.
(WebCore::AudioSourceProviderGStreamer::configureAudioBin):
(WebCore::AudioSourceProviderGStreamer::provideInput): Transfer
data from the buffer adapters to the bus channels.
(WebCore::AudioSourceProviderGStreamer::handleAudioBuffer): Pull a
buffer from appsink and queue it to the buffer adapter.
(WebCore::AudioSourceProviderGStreamer::setClient): Complete the
construction of the audio bin by adding a new chain to the tee
element. This new chain will deinterleave the buffer stream to
planar audio channels and route them to an appsink per channel for
data extraction.
(WebCore::AudioSourceProviderGStreamer::handleNewDeinterleavePad):
Plug in a new appsink after a new source pad has been added to deinterleave.
(WebCore::AudioSourceProviderGStreamer::deinterleavePadsConfigured):
Configure the client Node format (number of channels and sample
rate) once the provider knows how many audio channels are managed
by the pipeline.
(WebCore::cleanUpElementsAfterDeinterleaveSourcePadCallback):
(WebCore::AudioSourceProviderGStreamer::cleanUpElementsAfterDeinterleaveSourcePad):
Remove the elements after the given deinterleave source pad.
(WebCore::AudioSourceProviderGStreamer::reset): Clean up the
deinterleave source pads. This is especially needed before the
whole pipeline goes to NULL and later on prerolls again.
* platform/audio/gstreamer/AudioSourceProviderGStreamer.h: Added.
(WebCore::AudioSourceProviderGStreamer::create): Use this to
create the provider and get an OwnPtr of it.
(WebCore::AudioSourceProviderGStreamer::client): Provider client getter.
(WebCore::AudioSourceProviderGStreamer::getAudioBin): Audio bin
getter, used by the media player to configure its
playbin::audio-sink property.
* platform/graphics/gstreamer/GRefPtrGStreamer.cpp:
* platform/graphics/gstreamer/MediaPlayerPrivateGStreamer.cpp:
(WebCore::MediaPlayerPrivateGStreamer::MediaPlayerPrivateGStreamer):
Provider life cycle management and reset the audio provider before
going to NULL.
(WebCore::MediaPlayerPrivateGStreamer::~MediaPlayerPrivateGStreamer): Ditto.
(WebCore::MediaPlayerPrivateGStreamer::handlePluginInstallerResult): Ditto.
(WebCore::MediaPlayerPrivateGStreamer::cancelLoad): Ditto.
(WebCore::MediaPlayerPrivateGStreamer::didEnd): Ditto.
(WebCore::MediaPlayerPrivateGStreamer::createAudioSink): Configure
the audio source provider if needed.
* platform/graphics/gstreamer/MediaPlayerPrivateGStreamer.h:
(WebCore::MediaPlayerPrivateGStreamer::audioSourceProvider):
Provider getter, used by MediaPlayer and MediaElement.
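The provideInput() path copies framesToProcess floats per channel out of a GstAdapter (a byte FIFO), or leaves the bus silent when no data is buffered. A simplified stand-alone sketch of that logic; `ByteAdapter` is a hypothetical stand-in for GstAdapter, and unlike the patch it zeroes the destination whenever the request cannot be fully satisfied:

```cpp
#include <algorithm>
#include <cstddef>
#include <cstdint>
#include <cstring>
#include <deque>

// Minimal stand-in for GstAdapter: a byte FIFO that either satisfies a
// full request of framesToProcess floats or zeroes the destination.
class ByteAdapter {
public:
    void push(const float* samples, size_t count)
    {
        const uint8_t* bytes = reinterpret_cast<const uint8_t*>(samples);
        m_fifo.insert(m_fifo.end(), bytes, bytes + count * sizeof(float));
    }

    // Returns true if the request was satisfied, false if the channel
    // was silenced because not enough data was buffered.
    bool provide(float* destination, size_t framesToProcess)
    {
        size_t bytes = framesToProcess * sizeof(float); // frames -> bytes
        if (m_fifo.size() < bytes) {
            std::memset(destination, 0, bytes);
            return false;
        }
        std::copy(m_fifo.begin(), m_fifo.begin() + bytes,
                  reinterpret_cast<uint8_t*>(destination));
        m_fifo.erase(m_fifo.begin(), m_fifo.begin() + bytes); // flush consumed bytes
        return true;
    }

private:
    std::deque<uint8_t> m_fifo;
};
```

The all-or-nothing copy mirrors how the patch only calls gst_adapter_copy()/gst_adapter_flush() once at least framesToProcess * sizeof(float) bytes are available.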

Modified Paths

trunk/Source/WebCore/ChangeLog
trunk/Source/WebCore/PlatformEfl.cmake
trunk/Source/WebCore/PlatformGTK.cmake
trunk/Source/WebCore/platform/graphics/gstreamer/GRefPtrGStreamer.cpp
trunk/Source/WebCore/platform/graphics/gstreamer/MediaPlayerPrivateGStreamer.cpp
trunk/Source/WebCore/platform/graphics/gstreamer/MediaPlayerPrivateGStreamer.h

Added Paths

trunk/Source/WebCore/platform/audio/gstreamer/AudioSourceProviderGStreamer.cpp
trunk/Source/WebCore/platform/audio/gstreamer/AudioSourceProviderGStreamer.h
Diff

Modified: trunk/Source/WebCore/ChangeLog (177057 => 177058)


--- trunk/Source/WebCore/ChangeLog	2014-12-10 13:00:05 UTC (rev 177057)
+++ trunk/Source/WebCore/ChangeLog	2014-12-10 16:00:45 UTC (rev 177058)
@@ -1,3 +1,76 @@
+2014-12-08  Philippe Normand  <pnorm...@igalia.com>
+
+        [GStreamer] AudioSourceProvider support in the MediaPlayer
+        https://bugs.webkit.org/show_bug.cgi?id=78883
+
+        Reviewed by Gustavo Noronha Silva.
+
+        GStreamer-based audio source provider for the GTK and EFL
+        ports. This new component gathers decoded raw audio data from the
+        MediaPlayer and pipes it to an AudioBus when required by the
+        User Agent.
+
+        * PlatformEfl.cmake: New files in the build.
+        * PlatformGTK.cmake: Ditto.
+        * platform/audio/gstreamer/AudioSourceProviderGStreamer.cpp: Added.
+        (WebCore::onAppsinkNewBufferCallback): Function called when a new
+        buffer can be pulled from appsink.
+        (WebCore::onGStreamerDeinterleavePadAddedCallback): Function
+        called when a new source pad has been added to deinterleave.
+        (WebCore::onGStreamerDeinterleaveReadyCallback): Function called
+        when the deinterleave element completed the configuration of all
+        its source pads.
+        (WebCore::copyGstreamerBuffersToAudioChannel): Called for each
+        channel of the AudioBus that needs data as input.
+        (WebCore::AudioSourceProviderGStreamer::AudioSourceProviderGStreamer):
+        Create an audio bin that by default routes buffers only to
+        autoaudiosink. A new route is added if the provider has a client.
+        (WebCore::AudioSourceProviderGStreamer::~AudioSourceProviderGStreamer):
+        Clean buffer adapters and audio bin.
+        (WebCore::AudioSourceProviderGStreamer::configureAudioBin):
+        (WebCore::AudioSourceProviderGStreamer::provideInput): Transfer
+        data from the buffer adapters to the bus channels.
+        (WebCore::AudioSourceProviderGStreamer::handleAudioBuffer): Pull a
+        buffer from appsink and queue it to the buffer adapter.
+        (WebCore::AudioSourceProviderGStreamer::setClient): Complete the
+        construction of the audio bin by adding a new chain to the tee
+        element. This new chain will deinterleave the buffer stream to
+        planar audio channels and route them to an appsink per channel for
+        data extraction.
+        (WebCore::AudioSourceProviderGStreamer::handleNewDeinterleavePad):
+        A new appsink after a new source pad has been added to deinterleave.
+        (WebCore::AudioSourceProviderGStreamer::deinterleavePadsConfigured):
+        Configure the client Node format (number of channels and sample
+        rate) once the provider knows how many audio channels are managed
+        by the pipeline.
+        (WebCore::cleanUpElementsAfterDeinterleaveSourcePadCallback):
+        (WebCore::AudioSourceProviderGStreamer::cleanUpElementsAfterDeinterleaveSourcePad):
+        Remove the elements after the given deinterleave source pad.
+        (WebCore::AudioSourceProviderGStreamer::reset): Cleanup the
+        deinterleave source pads. This is especially needed before the
+        whole pipeline goes to NULL and later on prerolls again.
+        * platform/audio/gstreamer/AudioSourceProviderGStreamer.h: Added.
+        (WebCore::AudioSourceProviderGStreamer::create): Use this to
+        create the provider and get an OwnPtr of it.
+        (WebCore::AudioSourceProviderGStreamer::client): Provider client getter.
+        (WebCore::AudioSourceProviderGStreamer::getAudioBin): Audio bin
+        getter, used by the media player to configure its
+        playbin::audio-sink property.
+        * platform/graphics/gstreamer/GRefPtrGStreamer.cpp:
+        * platform/graphics/gstreamer/MediaPlayerPrivateGStreamer.cpp:
+        (WebCore::MediaPlayerPrivateGStreamer::MediaPlayerPrivateGStreamer):
+        Provider life cycle management and reset the audio provider before
+        going to NULL.
+        (WebCore::MediaPlayerPrivateGStreamer::~MediaPlayerPrivateGStreamer): Ditto.
+        (WebCore::MediaPlayerPrivateGStreamer::handlePluginInstallerResult): Ditto.
+        (WebCore::MediaPlayerPrivateGStreamer::cancelLoad): Ditto.
+        (WebCore::MediaPlayerPrivateGStreamer::didEnd): Ditto.
+        (WebCore::MediaPlayerPrivateGStreamer::createAudioSink): Configure
+        the audio source provider if needed.
+        * platform/graphics/gstreamer/MediaPlayerPrivateGStreamer.h:
+        (WebCore::MediaPlayerPrivateGStreamer::audioSourceProvider):
+        Provider getter, used by MediaPlayer and MediaElement.
+
 2014-12-09  Myles C. Maxfield  <mmaxfi...@apple.com>
 
         Scrolling to anchor tags does nothing in vertical-rl writing mode

Modified: trunk/Source/WebCore/PlatformEfl.cmake (177057 => 177058)


--- trunk/Source/WebCore/PlatformEfl.cmake	2014-12-10 13:00:05 UTC (rev 177057)
+++ trunk/Source/WebCore/PlatformEfl.cmake	2014-12-10 16:00:45 UTC (rev 177058)
@@ -72,6 +72,7 @@
 
     platform/audio/gstreamer/AudioDestinationGStreamer.cpp
     platform/audio/gstreamer/AudioFileReaderGStreamer.cpp
+    platform/audio/gstreamer/AudioSourceProviderGStreamer.cpp
     platform/audio/gstreamer/FFTFrameGStreamer.cpp
     platform/audio/gstreamer/WebKitWebAudioSourceGStreamer.cpp
 

Modified: trunk/Source/WebCore/PlatformGTK.cmake (177057 => 177058)


--- trunk/Source/WebCore/PlatformGTK.cmake	2014-12-10 13:00:05 UTC (rev 177057)
+++ trunk/Source/WebCore/PlatformGTK.cmake	2014-12-10 16:00:45 UTC (rev 177058)
@@ -56,6 +56,7 @@
 
     platform/audio/gstreamer/AudioDestinationGStreamer.cpp
     platform/audio/gstreamer/AudioFileReaderGStreamer.cpp
+    platform/audio/gstreamer/AudioSourceProviderGStreamer.cpp
     platform/audio/gstreamer/FFTFrameGStreamer.cpp
     platform/audio/gstreamer/WebKitWebAudioSourceGStreamer.cpp
 

Added: trunk/Source/WebCore/platform/audio/gstreamer/AudioSourceProviderGStreamer.cpp (0 => 177058)


--- trunk/Source/WebCore/platform/audio/gstreamer/AudioSourceProviderGStreamer.cpp	                        (rev 0)
+++ trunk/Source/WebCore/platform/audio/gstreamer/AudioSourceProviderGStreamer.cpp	2014-12-10 16:00:45 UTC (rev 177058)
@@ -0,0 +1,349 @@
+/*
+ *  Copyright (C) 2014 Igalia S.L
+ *
+ *  This library is free software; you can redistribute it and/or
+ *  modify it under the terms of the GNU Lesser General Public
+ *  License as published by the Free Software Foundation; either
+ *  version 2 of the License, or (at your option) any later version.
+ *
+ *  This library is distributed in the hope that it will be useful,
+ *  but WITHOUT ANY WARRANTY; without even the implied warranty of
+ *  MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
+ *  Lesser General Public License for more details.
+ *
+ *  You should have received a copy of the GNU Lesser General Public
+ *  License along with this library; if not, write to the Free Software
+ *  Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA  02110-1301  USA
+ */
+
+#include "config.h"
+#include "AudioSourceProviderGStreamer.h"
+
+#if ENABLE(WEB_AUDIO) && ENABLE(VIDEO) && USE(GSTREAMER)
+
+#include "AudioBus.h"
+#include "AudioSourceProviderClient.h"
+#include <gst/app/gstappsink.h>
+#include <gst/audio/audio.h>
+#include <gst/base/gstadapter.h>
+#include <wtf/gobject/GMutexLocker.h>
+
+
+namespace WebCore {
+
+// For now the provider supports only stereo files at a fixed sample
+// bitrate.
+static const int gNumberOfChannels = 2;
+static const float gSampleBitRate = 44100;
+
+static GstFlowReturn onAppsinkNewBufferCallback(GstAppSink* sink, gpointer userData)
+{
+    return static_cast<AudioSourceProviderGStreamer*>(userData)->handleAudioBuffer(sink);
+}
+
+static void onGStreamerDeinterleavePadAddedCallback(GstElement*, GstPad* pad, AudioSourceProviderGStreamer* provider)
+{
+    provider->handleNewDeinterleavePad(pad);
+}
+
+static void onGStreamerDeinterleaveReadyCallback(GstElement*, AudioSourceProviderGStreamer* provider)
+{
+    provider->deinterleavePadsConfigured();
+}
+
+static void onGStreamerDeinterleavePadRemovedCallback(GstElement*, GstPad* pad, AudioSourceProviderGStreamer* provider)
+{
+    provider->handleRemovedDeinterleavePad(pad);
+}
+
+static GstPadProbeReturn onAppsinkFlushCallback(GstPad*, GstPadProbeInfo* info, gpointer userData)
+{
+    if (GST_PAD_PROBE_INFO_TYPE(info) & (GST_PAD_PROBE_TYPE_EVENT_DOWNSTREAM | GST_PAD_PROBE_TYPE_EVENT_FLUSH)) {
+        GstEvent* event = GST_PAD_PROBE_INFO_EVENT(info);
+        if (GST_EVENT_TYPE(event) == GST_EVENT_FLUSH_STOP) {
+            AudioSourceProviderGStreamer* provider = reinterpret_cast<AudioSourceProviderGStreamer*>(userData);
+            provider->clearAdapters();
+        }
+    }
+    return GST_PAD_PROBE_OK;
+}
+
+static void copyGStreamerBuffersToAudioChannel(GstAdapter* adapter, AudioBus* bus, int channelNumber, size_t framesToProcess)
+{
+    if (!gst_adapter_available(adapter)) {
+        bus->zero();
+        return;
+    }
+
+    size_t bytes = framesToProcess * sizeof(float);
+    if (gst_adapter_available(adapter) >= bytes) {
+        gst_adapter_copy(adapter, bus->channel(channelNumber)->mutableData(), 0, bytes);
+        gst_adapter_flush(adapter, bytes);
+    }
+}
+
+AudioSourceProviderGStreamer::AudioSourceProviderGStreamer()
+    : m_client(0)
+    , m_deinterleaveSourcePads(0)
+    , m_deinterleavePadAddedHandlerId(0)
+    , m_deinterleaveNoMorePadsHandlerId(0)
+    , m_deinterleavePadRemovedHandlerId(0)
+{
+    g_mutex_init(&m_adapterMutex);
+    m_frontLeftAdapter = gst_adapter_new();
+    m_frontRightAdapter = gst_adapter_new();
+}
+
+AudioSourceProviderGStreamer::~AudioSourceProviderGStreamer()
+{
+    GRefPtr<GstElement> deinterleave = adoptGRef(gst_bin_get_by_name(GST_BIN(m_audioSinkBin.get()), "deinterleave"));
+    if (deinterleave) {
+        g_signal_handler_disconnect(deinterleave.get(), m_deinterleavePadAddedHandlerId);
+        g_signal_handler_disconnect(deinterleave.get(), m_deinterleaveNoMorePadsHandlerId);
+        g_signal_handler_disconnect(deinterleave.get(), m_deinterleavePadRemovedHandlerId);
+    }
+
+    g_object_unref(m_frontLeftAdapter);
+    g_object_unref(m_frontRightAdapter);
+    g_mutex_clear(&m_adapterMutex);
+}
+
+void AudioSourceProviderGStreamer::configureAudioBin(GstElement* audioBin, GstElement* teePredecessor)
+{
+    m_audioSinkBin = audioBin;
+
+    GstElement* audioTee = gst_element_factory_make("tee", "audioTee");
+    GstElement* audioQueue = gst_element_factory_make("queue", 0);
+    GstElement* audioConvert = gst_element_factory_make("audioconvert", 0);
+    GstElement* audioConvert2 = gst_element_factory_make("audioconvert", 0);
+    GstElement* audioResample = gst_element_factory_make("audioresample", 0);
+    GstElement* audioResample2 = gst_element_factory_make("audioresample", 0);
+    GstElement* volumeElement = gst_element_factory_make("volume", "volume");
+    GstElement* audioSink = gst_element_factory_make("autoaudiosink", 0);
+
+    gst_bin_add_many(GST_BIN(m_audioSinkBin.get()), audioTee, audioQueue, audioConvert, audioResample, volumeElement, audioConvert2, audioResample2, audioSink, nullptr);
+
+    // In cases where the audio-sink needs elements before tee (such
+    // as scaletempo) they need to be linked to tee which in this case
+    // doesn't need a ghost pad. It is assumed that the teePredecessor
+    // chain already configured a ghost pad.
+    if (teePredecessor)
+        gst_element_link_pads_full(teePredecessor, "src", audioTee, "sink", GST_PAD_LINK_CHECK_NOTHING);
+    else {
+        // Add a ghostpad to the bin so it can proxy to tee.
+        GRefPtr<GstPad> audioTeeSinkPad = adoptGRef(gst_element_get_static_pad(audioTee, "sink"));
+        gst_element_add_pad(m_audioSinkBin.get(), gst_ghost_pad_new("sink", audioTeeSinkPad.get()));
+    }
+
+    // Link a new src pad from tee to queue ! audioconvert !
+    // audioresample ! volume ! audioconvert ! audioresample !
+    // autoaudiosink. The audioresample and audioconvert are needed to
+    // ensure the audio sink receives buffers in the correct format.
+    gst_element_link_pads_full(audioTee, "src_%u", audioQueue, "sink", GST_PAD_LINK_CHECK_NOTHING);
+    gst_element_link_pads_full(audioQueue, "src", audioConvert, "sink", GST_PAD_LINK_CHECK_NOTHING);
+    gst_element_link_pads_full(audioConvert, "src", audioResample, "sink", GST_PAD_LINK_CHECK_NOTHING);
+    gst_element_link_pads_full(audioResample, "src", volumeElement, "sink", GST_PAD_LINK_CHECK_NOTHING);
+    gst_element_link_pads_full(volumeElement, "src", audioConvert2, "sink", GST_PAD_LINK_CHECK_NOTHING);
+    gst_element_link_pads_full(audioConvert2, "src", audioResample2, "sink", GST_PAD_LINK_CHECK_NOTHING);
+    gst_element_link_pads_full(audioResample2, "src", audioSink, "sink", GST_PAD_LINK_CHECK_NOTHING);
+}
+
+void AudioSourceProviderGStreamer::provideInput(AudioBus* bus, size_t framesToProcess)
+{
+    GMutexLocker<GMutex> lock(m_adapterMutex);
+    copyGStreamerBuffersToAudioChannel(m_frontLeftAdapter, bus, 0, framesToProcess);
+    copyGStreamerBuffersToAudioChannel(m_frontRightAdapter, bus, 1, framesToProcess);
+}
+
+GstFlowReturn AudioSourceProviderGStreamer::handleAudioBuffer(GstAppSink* sink)
+{
+    if (!m_client)
+        return GST_FLOW_OK;
+
+    // Pull a buffer from appsink and store it the appropriate buffer
+    // list for the audio channel it represents.
+    GRefPtr<GstSample> sample = adoptGRef(gst_app_sink_pull_sample(sink));
+    if (!sample)
+        return gst_app_sink_is_eos(sink) ? GST_FLOW_EOS : GST_FLOW_ERROR;
+
+    GstBuffer* buffer = gst_sample_get_buffer(sample.get());
+    if (!buffer)
+        return GST_FLOW_ERROR;
+
+    GstCaps* caps = gst_sample_get_caps(sample.get());
+    if (!caps)
+        return GST_FLOW_ERROR;
+
+    GstAudioInfo info;
+    gst_audio_info_from_caps(&info, caps);
+
+    GMutexLocker<GMutex> lock(m_adapterMutex);
+
+    // Check the first audio channel. The buffer is supposed to store
+    // data of a single channel anyway.
+    switch (GST_AUDIO_INFO_POSITION(&info, 0)) {
+    case GST_AUDIO_CHANNEL_POSITION_FRONT_LEFT:
+    case GST_AUDIO_CHANNEL_POSITION_MONO:
+        gst_adapter_push(m_frontLeftAdapter, gst_buffer_ref(buffer));
+        break;
+    case GST_AUDIO_CHANNEL_POSITION_FRONT_RIGHT:
+        gst_adapter_push(m_frontRightAdapter, gst_buffer_ref(buffer));
+        break;
+    default:
+        break;
+    }
+
+    return GST_FLOW_OK;
+}
+
+void AudioSourceProviderGStreamer::setClient(AudioSourceProviderClient* client)
+{
+    ASSERT(client);
+    m_client = client;
+
+    // The volume element is used to mute audio playback towards the
+    // autoaudiosink. This is needed to avoid double playback of audio
+    // from our audio sink and from the WebAudio AudioDestination node
+    // supposedly configured already by application side.
+    GRefPtr<GstElement> volumeElement = adoptGRef(gst_bin_get_by_name(GST_BIN(m_audioSinkBin.get()), "volume"));
+    g_object_set(volumeElement.get(), "mute", TRUE, nullptr);
+
+    // The audioconvert and audioresample elements are needed to
+    // ensure deinterleave and the sinks downstream receive buffers in
+    // the format specified by the capsfilter.
+    GstElement* audioQueue = gst_element_factory_make("queue", 0);
+    GstElement* audioConvert  = gst_element_factory_make("audioconvert", 0);
+    GstElement* audioResample = gst_element_factory_make("audioresample", 0);
+    GstElement* capsFilter = gst_element_factory_make("capsfilter", 0);
+    GstElement* deInterleave = gst_element_factory_make("deinterleave", "deinterleave");
+
+    g_object_set(deInterleave, "keep-positions", TRUE, nullptr);
+    m_deinterleavePadAddedHandlerId = g_signal_connect(deInterleave, "pad-added", G_CALLBACK(onGStreamerDeinterleavePadAddedCallback), this);
+    m_deinterleaveNoMorePadsHandlerId = g_signal_connect(deInterleave, "no-more-pads", G_CALLBACK(onGStreamerDeinterleaveReadyCallback), this);
+    m_deinterleavePadRemovedHandlerId = g_signal_connect(deInterleave, "pad-removed", G_CALLBACK(onGStreamerDeinterleavePadRemovedCallback), this);
+
+    GstCaps* caps = gst_caps_new_simple("audio/x-raw", "rate", G_TYPE_INT, static_cast<int>(gSampleBitRate),
+        "channels", G_TYPE_INT, gNumberOfChannels,
+        "format", G_TYPE_STRING, GST_AUDIO_NE(F32),
+        "layout", G_TYPE_STRING, "interleaved", nullptr);
+
+    g_object_set(capsFilter, "caps", caps, nullptr);
+    gst_caps_unref(caps);
+
+    gst_bin_add_many(GST_BIN(m_audioSinkBin.get()), audioQueue, audioConvert, audioResample, capsFilter, deInterleave, nullptr);
+
+    GRefPtr<GstElement> audioTee = adoptGRef(gst_bin_get_by_name(GST_BIN(m_audioSinkBin.get()), "audioTee"));
+
+    // Link a new src pad from tee to queue ! audioconvert !
+    // audioresample ! capsfilter ! deinterleave. Later
+    // on each deinterleaved planar audio channel will be routed to an
+    // appsink for data extraction and processing.
+    gst_element_link_pads_full(audioTee.get(), "src_%u", audioQueue, "sink", GST_PAD_LINK_CHECK_NOTHING);
+    gst_element_link_pads_full(audioQueue, "src", audioConvert, "sink", GST_PAD_LINK_CHECK_NOTHING);
+    gst_element_link_pads_full(audioConvert, "src", audioResample, "sink", GST_PAD_LINK_CHECK_NOTHING);
+    gst_element_link_pads_full(audioResample, "src", capsFilter, "sink", GST_PAD_LINK_CHECK_NOTHING);
+    gst_element_link_pads_full(capsFilter, "src", deInterleave, "sink", GST_PAD_LINK_CHECK_NOTHING);
+
+    gst_element_sync_state_with_parent(audioQueue);
+    gst_element_sync_state_with_parent(audioConvert);
+    gst_element_sync_state_with_parent(audioResample);
+    gst_element_sync_state_with_parent(capsFilter);
+    gst_element_sync_state_with_parent(deInterleave);
+}
+
+void AudioSourceProviderGStreamer::handleNewDeinterleavePad(GstPad* pad)
+{
+    m_deinterleaveSourcePads++;
+
+    if (m_deinterleaveSourcePads > 2) {
+        g_warning("The AudioSourceProvider supports only mono and stereo audio. Silencing out this new channel.");
+        GstElement* queue = gst_element_factory_make("queue", 0);
+        GstElement* sink = gst_element_factory_make("fakesink", 0);
+        g_object_set(sink, "async", FALSE, nullptr);
+        gst_bin_add_many(GST_BIN(m_audioSinkBin.get()), queue, sink, nullptr);
+
+        GRefPtr<GstPad> sinkPad = adoptGRef(gst_element_get_static_pad(queue, "sink"));
+        gst_pad_link_full(pad, sinkPad.get(), GST_PAD_LINK_CHECK_NOTHING);
+
+        GQuark quark = g_quark_from_static_string("peer");
+        g_object_set_qdata(G_OBJECT(pad), quark, sinkPad.get());
+        gst_element_link_pads_full(queue, "src", sink, "sink", GST_PAD_LINK_CHECK_NOTHING);
+        gst_element_sync_state_with_parent(queue);
+        gst_element_sync_state_with_parent(sink);
+        return;
+    }
+
+    // A new pad for a planar channel was added in deinterleave. Plug
+    // in an appsink so we can pull the data from each
+    // channel. Pipeline looks like:
+    // ... deinterleave ! queue ! appsink.
+    GstElement* queue = gst_element_factory_make("queue", 0);
+    GstElement* sink = gst_element_factory_make("appsink", 0);
+
+    GstAppSinkCallbacks callbacks;
+    callbacks.eos = 0;
+    callbacks.new_preroll = 0;
+    callbacks.new_sample = onAppsinkNewBufferCallback;
+    gst_app_sink_set_callbacks(GST_APP_SINK(sink), &callbacks, this, 0);
+
+    g_object_set(sink, "async", FALSE, nullptr);
+
+    GRefPtr<GstCaps> caps = adoptGRef(gst_caps_new_simple("audio/x-raw", "rate", G_TYPE_INT, static_cast<int>(gSampleBitRate),
+        "channels", G_TYPE_INT, 1,
+        "format", G_TYPE_STRING, GST_AUDIO_NE(F32),
+        "layout", G_TYPE_STRING, "interleaved", nullptr));
+
+    gst_app_sink_set_caps(GST_APP_SINK(sink), caps.get());
+
+    gst_bin_add_many(GST_BIN(m_audioSinkBin.get()), queue, sink, nullptr);
+
+    GRefPtr<GstPad> sinkPad = adoptGRef(gst_element_get_static_pad(queue, "sink"));
+    gst_pad_link_full(pad, sinkPad.get(), GST_PAD_LINK_CHECK_NOTHING);
+
+    GQuark quark = g_quark_from_static_string("peer");
+    g_object_set_qdata(G_OBJECT(pad), quark, sinkPad.get());
+
+    gst_element_link_pads_full(queue, "src", sink, "sink", GST_PAD_LINK_CHECK_NOTHING);
+
+    sinkPad = adoptGRef(gst_element_get_static_pad(sink, "sink"));
+    gst_pad_add_probe(sinkPad.get(), GST_PAD_PROBE_TYPE_EVENT_FLUSH, onAppsinkFlushCallback, this, nullptr);
+
+    gst_element_sync_state_with_parent(queue);
+    gst_element_sync_state_with_parent(sink);
+}
+
+void AudioSourceProviderGStreamer::handleRemovedDeinterleavePad(GstPad* pad)
+{
+    m_deinterleaveSourcePads--;
+
+    // Remove the queue ! appsink chain downstream of deinterleave.
+    GQuark quark = g_quark_from_static_string("peer");
+    GstPad* sinkPad = reinterpret_cast<GstPad*>(g_object_get_qdata(G_OBJECT(pad), quark));
+    GRefPtr<GstElement> queue = adoptGRef(gst_pad_get_parent_element(sinkPad));
+    GRefPtr<GstPad> queueSrcPad = adoptGRef(gst_element_get_static_pad(queue.get(), "src"));
+    GRefPtr<GstPad> appsinkSinkPad = adoptGRef(gst_pad_get_peer(queueSrcPad.get()));
+    GRefPtr<GstElement> sink = adoptGRef(gst_pad_get_parent_element(appsinkSinkPad.get()));
+    gst_element_set_state(sink.get(), GST_STATE_NULL);
+    gst_element_set_state(queue.get(), GST_STATE_NULL);
+    gst_element_unlink(queue.get(), sink.get());
+    gst_bin_remove_many(GST_BIN(m_audioSinkBin.get()), queue.get(), sink.get(), nullptr);
+}
+
+void AudioSourceProviderGStreamer::deinterleavePadsConfigured()
+{
+    ASSERT(m_client);
+    ASSERT(m_deinterleaveSourcePads == gNumberOfChannels);
+
+    m_client->setFormat(m_deinterleaveSourcePads, gSampleBitRate);
+}
+
+void AudioSourceProviderGStreamer::clearAdapters()
+{
+    GMutexLocker<GMutex> lock(m_adapterMutex);
+    gst_adapter_clear(m_frontLeftAdapter);
+    gst_adapter_clear(m_frontRightAdapter);
+}
+
+} // WebCore
+
+#endif // ENABLE(WEB_AUDIO) && ENABLE(VIDEO) && USE(GSTREAMER)

Added: trunk/Source/WebCore/platform/audio/gstreamer/AudioSourceProviderGStreamer.h (0 => 177058)


--- trunk/Source/WebCore/platform/audio/gstreamer/AudioSourceProviderGStreamer.h	                        (rev 0)
+++ trunk/Source/WebCore/platform/audio/gstreamer/AudioSourceProviderGStreamer.h	2014-12-10 16:00:45 UTC (rev 177058)
@@ -0,0 +1,72 @@
+/*
+ *  Copyright (C) 2014 Igalia S.L
+ *
+ *  This library is free software; you can redistribute it and/or
+ *  modify it under the terms of the GNU Lesser General Public
+ *  License as published by the Free Software Foundation; either
+ *  version 2 of the License, or (at your option) any later version.
+ *
+ *  This library is distributed in the hope that it will be useful,
+ *  but WITHOUT ANY WARRANTY; without even the implied warranty of
+ *  MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
+ *  Lesser General Public License for more details.
+ *
+ *  You should have received a copy of the GNU Lesser General Public
+ *  License along with this library; if not, write to the Free Software
+ *  Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA  02110-1301  USA
+ */
+
+#ifndef AudioSourceProviderGStreamer_h
+#define AudioSourceProviderGStreamer_h
+
+#if ENABLE(WEB_AUDIO) && ENABLE(VIDEO) && USE(GSTREAMER)
+
+#include "AudioSourceProvider.h"
+#include "GRefPtrGStreamer.h"
+#include <gst/gst.h>
+#include <wtf/Forward.h>
+#include <wtf/Noncopyable.h>
+#include <wtf/PassOwnPtr.h>
+
+typedef struct _GstAdapter GstAdapter;
+typedef struct _GstAppSink GstAppSink;
+
+namespace WebCore {
+
+class AudioSourceProviderGStreamer : public AudioSourceProvider {
+    WTF_MAKE_NONCOPYABLE(AudioSourceProviderGStreamer);
+public:
+    static PassOwnPtr<AudioSourceProviderGStreamer> create() { return adoptPtr(new AudioSourceProviderGStreamer()); }
+    AudioSourceProviderGStreamer();
+    ~AudioSourceProviderGStreamer();
+
+    void configureAudioBin(GstElement* audioBin, GstElement* teePredecessor);
+
+    void provideInput(AudioBus*, size_t framesToProcess);
+    void setClient(AudioSourceProviderClient*);
+    const AudioSourceProviderClient* client() const { return m_client; }
+
+    void handleNewDeinterleavePad(GstPad*);
+    void deinterleavePadsConfigured();
+    void handleRemovedDeinterleavePad(GstPad*);
+
+    GstFlowReturn handleAudioBuffer(GstAppSink*);
+    GstElement* getAudioBin() const { return m_audioSinkBin.get(); }
+    void clearAdapters();
+
+private:
+    GRefPtr<GstElement> m_audioSinkBin;
+    AudioSourceProviderClient* m_client;
+    int m_deinterleaveSourcePads;
+    GstAdapter* m_frontLeftAdapter;
+    GstAdapter* m_frontRightAdapter;
+    unsigned long m_deinterleavePadAddedHandlerId;
+    unsigned long m_deinterleaveNoMorePadsHandlerId;
+    unsigned long m_deinterleavePadRemovedHandlerId;
+    GMutex m_adapterMutex;
+};
+
+}
+#endif // ENABLE(WEB_AUDIO) && ENABLE(VIDEO) && USE(GSTREAMER)
+
+#endif // AudioSourceProviderGStreamer_h

Modified: trunk/Source/WebCore/platform/graphics/gstreamer/MediaPlayerPrivateGStreamer.cpp (177057 => 177058)


--- trunk/Source/WebCore/platform/graphics/gstreamer/MediaPlayerPrivateGStreamer.cpp	2014-12-10 13:00:05 UTC (rev 177057)
+++ trunk/Source/WebCore/platform/graphics/gstreamer/MediaPlayerPrivateGStreamer.cpp	2014-12-10 16:00:45 UTC (rev 177058)
@@ -64,6 +64,10 @@
 #include "WebKitMediaSourceGStreamer.h"
 #endif
 
+#if ENABLE(WEB_AUDIO)
+#include "AudioSourceProviderGStreamer.h"
+#endif
+
 // Max interval in seconds to stay in the READY state on manual
 // state change requests.
 static const unsigned gReadyStateTimerInterval = 60;
@@ -212,6 +216,9 @@
     , m_hasAudio(false)
     , m_totalBytes(0)
     , m_preservesPitch(false)
+#if ENABLE(WEB_AUDIO)
+    , m_audioSourceProvider(AudioSourceProviderGStreamer::create())
+#endif
     , m_requestedState(GST_STATE_VOID_PENDING)
     , m_missingPlugins(false)
 {
@@ -265,6 +272,10 @@
         GRefPtr<GstPad> videoSinkPad = adoptGRef(gst_element_get_static_pad(m_webkitVideoSink.get(), "sink"));
         g_signal_handlers_disconnect_by_func(videoSinkPad.get(), reinterpret_cast<gpointer>(mediaPlayerPrivateVideoSinkCapsChangedCallback), this);
     }
+
+#if ENABLE(WEB_AUDIO)
+    m_audioSourceProvider.release();
+#endif
 }
 
 void MediaPlayerPrivateGStreamer::load(const String& urlString)
@@ -1845,35 +1856,56 @@
     m_autoAudioSink = gst_element_factory_make("autoaudiosink", 0);
     g_signal_connect(m_autoAudioSink.get(), "child-added", G_CALLBACK(setAudioStreamPropertiesCallback), this);
 
-    // Construct audio sink only if pitch preserving is enabled.
-    if (!m_preservesPitch)
-        return m_autoAudioSink.get();
+    GstElement* audioSinkBin;
 
-    // On 1.4.2 and newer we use the audio-filter property instead.
-    if (webkitGstCheckVersion(1, 4, 2))
+    if (webkitGstCheckVersion(1, 4, 2)) {
+#if ENABLE(WEB_AUDIO)
+        audioSinkBin = gst_bin_new("audio-sink");
+        m_audioSourceProvider->configureAudioBin(audioSinkBin, nullptr);
+        return audioSinkBin;
+#else
         return m_autoAudioSink.get();
-
-    GstElement* scale = gst_element_factory_make("scaletempo", 0);
-    if (!scale) {
-        GST_WARNING("Failed to create scaletempo");
-        return m_autoAudioSink.get();
+#endif
     }
 
-    GstElement* audioSinkBin = gst_bin_new("audio-sink");
-    GstElement* convert = gst_element_factory_make("audioconvert", 0);
-    GstElement* resample = gst_element_factory_make("audioresample", 0);
+    // Construct audio sink only if pitch preserving is enabled.
+    // If GStreamer 1.4.2 is used the audio-filter playbin property is used instead.
+    if (m_preservesPitch) {
+        GstElement* scale = gst_element_factory_make("scaletempo", nullptr);
+        if (!scale) {
+            GST_WARNING("Failed to create scaletempo");
+            return m_autoAudioSink.get();
+        }
 
-    gst_bin_add_many(GST_BIN(audioSinkBin), scale, convert, resample, m_autoAudioSink.get(), NULL);
+        audioSinkBin = gst_bin_new("audio-sink");
+        gst_bin_add(GST_BIN(audioSinkBin), scale);
+        GRefPtr<GstPad> pad = adoptGRef(gst_element_get_static_pad(scale, "sink"));
+        gst_element_add_pad(audioSinkBin, gst_ghost_pad_new("sink", pad.get()));
 
-    if (!gst_element_link_many(scale, convert, resample, m_autoAudioSink.get(), NULL)) {
-        GST_WARNING("Failed to link audio sink elements");
-        gst_object_unref(audioSinkBin);
-        return m_autoAudioSink.get();
+#if ENABLE(WEB_AUDIO)
+        m_audioSourceProvider->configureAudioBin(audioSinkBin, scale);
+#else
+        GstElement* convert = gst_element_factory_make("audioconvert", nullptr);
+        GstElement* resample = gst_element_factory_make("audioresample", nullptr);
+
+        gst_bin_add_many(GST_BIN(audioSinkBin), convert, resample, m_autoAudioSink.get(), nullptr);
+
+        if (!gst_element_link_many(scale, convert, resample, m_autoAudioSink.get(), nullptr)) {
+            GST_WARNING("Failed to link audio sink elements");
+            gst_object_unref(audioSinkBin);
+            return m_autoAudioSink.get();
+        }
+#endif
+        return audioSinkBin;
     }
 
-    GRefPtr<GstPad> pad = adoptGRef(gst_element_get_static_pad(scale, "sink"));
-    gst_element_add_pad(audioSinkBin, gst_ghost_pad_new("sink", pad.get()));
+#if ENABLE(WEB_AUDIO)
+    audioSinkBin = gst_bin_new("audio-sink");
+    m_audioSourceProvider->configureAudioBin(audioSinkBin, nullptr);
     return audioSinkBin;
+#endif
+    ASSERT_NOT_REACHED();
+    return 0;
 }
 
 GstElement* MediaPlayerPrivateGStreamer::audioSink() const

Modified: trunk/Source/WebCore/platform/graphics/gstreamer/MediaPlayerPrivateGStreamer.h (177057 => 177058)


--- trunk/Source/WebCore/platform/graphics/gstreamer/MediaPlayerPrivateGStreamer.h	2014-12-10 13:00:05 UTC (rev 177057)
+++ trunk/Source/WebCore/platform/graphics/gstreamer/MediaPlayerPrivateGStreamer.h	2014-12-10 16:00:45 UTC (rev 177058)
@@ -50,6 +50,11 @@
 
 namespace WebCore {
 
+#if ENABLE(WEB_AUDIO)
+class AudioSourceProvider;
+class AudioSourceProviderGStreamer;
+#endif
+
 class AudioTrackPrivateGStreamer;
 class InbandMetadataTextTrackPrivateGStreamer;
 class InbandTextTrackPrivateGStreamer;
@@ -125,6 +130,10 @@
 
     bool changePipelineState(GstState);
 
+#if ENABLE(WEB_AUDIO)
+    AudioSourceProvider* audioSourceProvider() { return reinterpret_cast<AudioSourceProvider*>(m_audioSourceProvider.get()); }
+#endif
+
 private:
     MediaPlayerPrivateGStreamer(MediaPlayer*);
 
@@ -211,6 +220,9 @@
     mutable unsigned long long m_totalBytes;
     URL m_url;
     bool m_preservesPitch;
+#if ENABLE(WEB_AUDIO)
+    OwnPtr<AudioSourceProviderGStreamer> m_audioSourceProvider;
+#endif
     GstState m_requestedState;
     GRefPtr<GstElement> m_autoAudioSink;
     bool m_missingPlugins;