Here's where I am at on this:
After the WebRTC caller connects to the webcaster radio host (that part
works fine; the show host and caller can talk to each other), I do the
following, using what I believe is the webcaster's current AudioContext.
Here event.stream is the caller's MediaStream object (taking WebRTC
audio into Web Audio for processing, which is definitely possible):
source = Webcaster.node.context.createMediaStreamSource(event.stream);
gain_node = Webcaster.node.context.createGain();
source.connect(gain_node);

// At this point I have a gain node with the caller's audio stream
// connected to it, or so I think.
// Now connect the gain node to the webcaster context's destination:
gain_node.connect(Webcaster.node.context.destination);
No errors appear in the JS console, but this does not cause the caller's
audio stream to be sent out via webcaster to Liquidsoap and on to
Icecast (I'm monitoring the stream on a different computer with an MP3
stream player connected to the Icecast mount). The webcaster host's own
audio does go through Liquidsoap to Icecast; I can hear that.
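For completeness, here is the whole thing as one handler, as a sketch. I'm using the legacy onaddstream callback on my RTCPeerConnection (the variable name peerConnection and the handler wiring below are just my illustration), and Webcaster.node.context is what I believe to be the AudioContext that webcast.js encodes from:

```javascript
// Sketch of my current wiring. "peerConnection" is my RTCPeerConnection;
// Webcaster.node.context is (I believe) the webcaster's AudioContext.
peerConnection.onaddstream = function (event) {
  var context = Webcaster.node.context;

  // Bring the caller's WebRTC audio into the Web Audio graph.
  var source = context.createMediaStreamSource(event.stream);

  // Gain node so I can control the caller's level in the mix.
  var gain_node = context.createGain();
  gain_node.gain.value = 1.0;

  source.connect(gain_node);

  // This is the step I suspect is wrong: destination is the local
  // audio output, which may not be what webcast.js encodes from.
  gain_node.connect(context.destination);
};
```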
What am I doing wrong?
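One suspicion I have: context.destination is the browser's local audio output, so connecting there would at most play the caller through my speakers rather than feed the encoder. If webcast.js exposes the encoding node on the model, something like Webcaster.node.webcast (that property name is a guess on my part from skimming the source; please correct me), then perhaps I need:

```javascript
// Guess: route the caller into the node webcast.js actually encodes
// from. "Webcaster.node.webcast" is my guess at that node's name.
gain_node.connect(Webcaster.node.webcast);

// Optionally also monitor the caller locally through the speakers:
gain_node.connect(Webcaster.node.context.destination);
```

Is that the right integration point, or is there a better one?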
Mark
On 02/17/2017 08:08 AM, Mark E wrote:
> Hi,
>
> First let me say that webcaster is a great tool - thank you very much
> for creating it along with webcast.js and liquidsoap.
>
> I would like to integrate a simple WebRTC audio-only chat into the
> webcaster client code so that an inbound WebRTC caller is connected to
> the voice audio of the webcaster host (the person operating webcaster)
> and mixed into the outbound audio stream being sent to liquidsoap.
>
> I have the chat built and it works fine. Now I need to integrate that
> into the webcaster app. I need some advice please. This is what I think
> should be done:
>
> - When the "Start streaming" button is clicked and the app does its
> navigator.mediaDevices.getUserMedia() call to get permission for the mic
> and the resulting stream, add code to connect to my WebRTC signaling server.
>
> - On a different web page where people listen to the stream, add a
> button to initiate a call to the studio. When the button is clicked the
> listener is asked for permission to use the mic and then on success is
> connected to the signaling server.
>
> - The signaling server sends an offer to connect to the webcaster host,
> which (after the usual WebRTC negotiation) results in an RTC peer
> connection between the show host and the caller.
>
> - At that point I need to somehow connect the caller's audio stream
> (object) into the overall webcaster audio mix so that the conversation
> goes out to liquidsoap as usual.
>
> The question I have is where and how within the webcaster client.js I
> should connect the caller's stream to the outbound audio stream. I'm
> not an expert with Backbone and Underscore (used by webcaster), but I
> can wing it as I learn if someone can point me to the right things to do.
>
> Thanks for your time in considering this!
>
> Mark
>
> ------------------------------------------------------------------------------
> Check out the vibrant tech community on one of the world's most
> engaging tech sites, SlashDot.org! http://sdm.link/slashdot
> _______________________________________________
> Savonet-users mailing list
> [email protected]
> https://lists.sourceforge.net/lists/listinfo/savonet-users