
Support for broadcasting to icecast2 #28

Open
jmcclelland opened this issue May 23, 2013 · 18 comments

@jmcclelland

It would be great if we could use this software to broadcast webrtc to an icecast2 server.

There's a thread on the icecast2-dev list on this topic here:

http://lists.xiph.org/pipermail/icecast-dev/2013-May/thread.html#2168

@muaz-khan
Owner

You would need to open a peer-to-icecast2 connection. That is only possible if icecast2 is capable of generating SDP accordingly.

If a peer connection is opened between a browser and the icecast2 server, RTP packets will flow toward icecast2.

You would then be able to read those DTLS/SRTP packets as a byte stream, which makes it easy to either broadcast them or store (i.e. record) them.

Remember, capturing packets as ArrayBuffer or DataURL etc. on the client side and posting them to the icecast2 server using HTTP PUT or POST is not a good idea; it will never be realtime.

You can use hacks on icecast2 to transcode RTP streams if the codecs are not supported in v3 beta or earlier.

@dm8tbr

dm8tbr commented Jun 17, 2013

Icecast only supports HTTP.

Also the idea of @jmcclelland is to forward a webRTC conversation to an Icecast server for broadcasting. At this point it is no longer required to be real time. Think about hosting a radio discussion spread over several studios. The participants must communicate in real time, the listeners don't care about possible delay on their side. (Also well established TV/Radio broadcasting technologies have delays that can reach several seconds or even more)

As Icecast only supports HTTP PUT and Ogg or WebM containers, that puts the requirement on the source client. This can either be a webRTC participant directly, with additional logic to repackage opus/vp8, or e.g. KRADradio taking screen content and audio.

@jmcclelland
Author

Thanks for the follow up - I agree completely that we don't need realtime. In fact, using most icecast clients we get a delay of up to 45 - 60 seconds. This is acceptable. Can you say more about KRADradio? I'm not sure I follow that part.

@dm8tbr

dm8tbr commented Jun 17, 2013

@jmcclelland check out @krad-radio
It would be a bit of a kludge, as you'd need something running that connects to webRTC and then KRAD would capture the screen and audio. This of course also means reencoding, which is computationally expensive.

Hence I do believe that remuxing vp8+vorbis into WebM, or just vorbis in Ogg, is the way to go if you want to bridge webRTC to broadcast streaming.

@muaz-khan
Owner

HTTP PUT/POST and Firefox Nightly
<script src="https://webrtc-experiment.appspot.com/AudioVideoRecorder.js"></script>
AudioVideoRecorder({
    // MediaStream object
    stream: MediaStream,

    // mime-type of the output blob
    mimeType: 'audio/ogg',

    // set time-interval to get the blob i.e. 1 second
    interval: 1000,

    // get access to the recorded blob
    onRecordedMedia: function (blob) {
        // POST/PUT blob using FormData/XMLHttpRequest
    }
});

Reference

AFAIK, ogg opus streams are supported on icecast2.

@danielhjames

Opus works with the Icecast 2.4 beta series; you can see the packages here:

https://build.opensuse.org/package/show?package=icecast&project=home%3Adm8tbr

Direct downloads are here:

http://download.opensuse.org/repositories/home:/dm8tbr/

@jmcclelland
Author

Thank you! I've spent some time experimenting and still don't have solid results. It seems as though with this code I can make a PUT request once every second (or some other interval). However, I'd like to make a single PUT request that sends the video as one continuous stream.

The action I'm trying to replicate via html5/js is:

cat foo.ogg | curl -u source:hackme -H "Content-type: application/ogg" -T - http://hobo:8000/webrtc.ogg

Am I doing something wrong? Here's the code I have at this point:

<audio id="audio" autoplay="autoplay" controls></audio>
<script type="text/javascript" src="/js/jquery-1.10.2.min.js"></script>
<script src="/js/AudioVideoRecorder.js"></script>
<script type="text/javascript">
  console.log("starting...");
  navigator.getMedia = ( navigator.getUserMedia ||
                       navigator.webkitGetUserMedia ||
                       navigator.mozGetUserMedia ||
                       navigator.msGetUserMedia);
  navigator.getMedia (
   // constraints
   {
      video: false,
      audio: true
   },

   // successCallback
   function(localMediaStream) {
      console.log("Success call back started.");
      var audio = document.querySelector('audio');
      audio.src = window.URL.createObjectURL(localMediaStream);
      audio.onloadedmetadata = function(e) {
        console.log("onloadmetadata...");
        AudioVideoRecorder({
            // MediaStream object
            stream: localMediaStream,

            // mime-type of the output blob
            mimeType: 'audio/ogg',

            // set time-interval to get the blob i.e. 1 second
            interval: 1000,

            // get access to the recorded blob
            onRecordedMedia: function (blob) {
              console.log("Sending ajax put request");
              $.ajax({
                type: "PUT",
                url: "/webrtc.ogg",
                data: blob,
                processData: false,
                contentType: "audio/ogg",
                username: "source",
                password: "hackme",
                success: function( data, txtSuccess, jqXHR ) {
                  alert( "Ajax put successful: " + txtSuccess );
                }
              });
            }
        })
      };
   },

   // errorCallback
   function(err) {
    console.log("The following error occured: " + err);
   }

);
</script>

@yannickmodahgouez

@jmcclelland is your piece of code working? I didn't test it, but I assume you could implement a proxy that handles each 1-second chunk and then streams it on to the icecast server, couldn't you?

@jmcclelland
Author

No - not working, but I didn't think of using a proxy to handle each 1 sec chunk. Please follow up if you are able to get it working.

@peili

peili commented May 24, 2014

Any progress publishing stream to icecast?

@tim-phillips

Obviously this is way later, but Webcast.js might achieve what you're looking for.

It uses websockets, so you'll need a receiving server running liquidsoap or the like to receive the socket stream and send to icecast using libshout.

webcast.js -> liquidsoap -> icecast
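The Liquidsoap half of that pipeline is roughly the following, assuming a Liquidsoap version whose `input.harbor` accepts Webcast's websocket protocol; the mount names, ports and passwords are placeholders:

```liquidsoap
# Receive the websocket stream from Webcast.js on the harbor,
# then re-encode and relay it to Icecast via libshout.
live = input.harbor("webcast", port=8080, password="hackme")
radio = mksafe(live)  # fall back to silence when no source is connected

output.icecast(
  %vorbis,
  host="localhost", port=8000,
  password="hackme", mount="webrtc.ogg",
  radio)
```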

@jmcclelland
Author

Never too late! But... it only seems to support audio. I'm looking to stream video.

@Keyne

Keyne commented Feb 21, 2018

Late to the conversation, but these are my attempts: https://stackoverflow.com/questions/48913474/how-to-getusermedia-video-and-audio-to-icecast-or-progressive-http-streaming

If anyone got any progress, please let me know.

@jmcclelland
Author

Never too late :) - that link seems broken though - can you repaste?

@Keyne

Keyne commented Feb 22, 2018

@jmcclelland Oops! I deleted the post since, at first glance, I had achieved what I wanted. I didn't use any of @muaz-khan's code; I wrote it using just the browser's APIs. It's actually a combination of the Media Capture and Streams API with nodejs+ffmpeg and icecast. But it's simple; there's another post on Stack Overflow that describes the architecture better: https://stackoverflow.com/q/48891897/260610

Just a side note: you made a reference to WebRTC, but this is actually not WebRTC. As stated on MDN: "The Media Capture and Streams API, often called the Media Stream API or the Stream API, is an API related to WebRTC which supports streams of audio or video data, the methods for working with them, the constraints associated with the type of data, the success and error callbacks when using the data asynchronously, and the events that are fired during the process."
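The browser half of that architecture can be sketched as follows: capture with getUserMedia, slice the stream with MediaRecorder, and ship each chunk to the node+ffmpeg side (here over a WebSocket). The endpoint URL, option names and timeslice value are illustrative, not from the linked post:

```javascript
// Capture-side sketch for the MediaRecorder -> node+ffmpeg -> Icecast
// architecture. Runs in a browser; nothing here is invoked at load time.
const RECORDER_OPTIONS = { mimeType: 'video/webm;codecs=vp8,opus' };
const TIMESLICE_MS = 1000; // fire a dataavailable event every second

function streamToServer(mediaStream, wsUrl) {
  const ws = new WebSocket(wsUrl);
  const recorder = new MediaRecorder(mediaStream, RECORDER_OPTIONS);
  recorder.ondataavailable = function (e) {
    if (e.data.size > 0 && ws.readyState === WebSocket.OPEN) {
      ws.send(e.data); // each chunk continues the same WebM stream
    }
  };
  ws.onopen = function () { recorder.start(TIMESLICE_MS); };
  return recorder;
}

// Usage in the browser:
// navigator.mediaDevices.getUserMedia({ video: true, audio: true })
//   .then(function (s) { streamToServer(s, 'ws://localhost:3000'); });
```

With a timeslice, only the first MediaRecorder chunk carries the container header; the rest are continuations, which is exactly what a piped ffmpeg stdin expects.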

@jmcclelland
Author

@Keyne Wow - thanks for pointing me in the right direction. Thanks to you (and the comments on your post) I have finally succeeded! I've published here:
https://gitlab.com/jamie/icecast-web-source-client

@Keyne

Keyne commented Feb 22, 2018

@jmcclelland I'm glad that you achieved the same results. Now I need a way to play it cross-browser. If you figure it out, let me know!

@Keyne

Keyne commented Feb 23, 2018

@jmcclelland The following seems to play in Chrome:

const { spawn } = require('child_process');

// Transcode whatever arrives on stdin to a WebM (VP8 + Opus) Icecast stream.
const ffmpeg = spawn('ffmpeg', [
        '-i', 'pipe:0',               // read input from stdin
        '-f', 'webm',                 // output container
        '-cluster_size_limit', '2M',  // keep WebM clusters small for live playback
        '-cluster_time_limit', '5100',
        '-content_type', 'video/webm',
        '-r', '30',                   // 30 fps
        '-ac', '2',                   // stereo audio
        '-acodec', 'libopus',
        '-b:a', '96K',
        '-vcodec', 'libvpx',
        '-b:v', '1M',
        '-crf', '60',
        '-bufsize', '2M',
        '-g', '10',                   // short GOP so new listeners sync quickly
        '-deadline', 'realtime',
        '-threads', '8',
        '-keyint_min', '10',
        'icecast://source:hackme@localhost:8004/live'
    ]);
