In-browser broadcasting
Learn how to broadcast using WebRTC
We demonstrate below how to broadcast from the web using Livepeer's low-latency WebRTC broadcast. Developers can either use the Livepeer Broadcast React component or build their own WebRTC solution.
Using UI Kit Broadcast
The example below shows how to use the Livepeer UI Kit Broadcast component to broadcast from the web.
Broadcast
This guide assumes you have configured a Livepeer JS SDK client with an API key.
We can use the Broadcast primitives with a stream key from a stream we created.
We show some simple styling below with Tailwind CSS, but you can use any styling library, since the primitives ship as unstyled, composable components.
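As a sketch, the composed primitives might look like the following. The component and helper names (Broadcast.Root, getIngest, and so on) are taken from the @livepeer/react package and may differ across versions, so treat this as illustrative rather than exact:

```tsx
import * as Broadcast from "@livepeer/react/broadcast";
import { getIngest } from "@livepeer/react/external";

// Sketch only: component and helper names are from @livepeer/react
// and may differ across versions. Pass in your stream's key.
export function BroadcastWithControls({ streamKey }: { streamKey: string }) {
  return (
    <Broadcast.Root ingestUrl={getIngest(streamKey)}>
      <Broadcast.Container className="h-full w-full bg-black">
        <Broadcast.Video title="Livestream" className="h-full w-full" />
        <Broadcast.Controls className="flex items-center justify-center">
          <Broadcast.EnabledTrigger className="h-10 w-10">
            <Broadcast.EnabledIndicator asChild matcher={false}>
              <span>Start broadcast</span>
            </Broadcast.EnabledIndicator>
            <Broadcast.EnabledIndicator asChild>
              <span>Stop broadcast</span>
            </Broadcast.EnabledIndicator>
          </Broadcast.EnabledTrigger>
        </Broadcast.Controls>
      </Broadcast.Container>
    </Broadcast.Root>
  );
}
```

The Tailwind classes here are placeholders; swap them for whatever styling approach your app uses.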
Embeddable broadcast
This is one of the easiest ways to broadcast video from your website or application: embed the iframe using the code snippet below.
You can replace STREAM_KEY with the stream key for your stream.
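For illustration, the embed might look like this (a sketch assuming Livepeer's hosted broadcast page at lvpr.tv; confirm the exact embed URL in the Livepeer dashboard):

```html
<iframe
  src="https://lvpr.tv/broadcast/{STREAM_KEY}"
  allow="camera; microphone; fullscreen; display-capture; autoplay; encrypted-media; picture-in-picture"
  allowfullscreen
  frameborder="0"
></iframe>
```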
This will automatically stream from the browser with a fully composed UI, using STUN/TURN servers to avoid network firewall issues.
Adding custom broadcasting
If you want to add custom broadcasting to your app and handle the WebRTC SDP negotiation without using the Livepeer React primitives, you can follow the steps below.
Get the SDP Host
First, you will need to make a request to get the proper ingest URL for the region your end user is in. We have a global presence, and we handle redirects based on GeoDNS so that users reach the lowest-latency server.
To do this, make a HEAD request to the WebRTC redirect endpoint:
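As a sketch, this can be done with fetch (the livepeer.studio redirect endpoint shown here is an assumption; substitute the WebRTC redirect endpoint documented for your environment):

```javascript
// NOTE: this redirect endpoint is an assumption -- use the WebRTC
// redirect endpoint documented for your Livepeer environment.
const redirectEndpointFor = (streamKey) =>
  `https://livepeer.studio/webrtc/${streamKey}`;

// Resolve the region-specific WebRTC ingest URL for a stream key.
async function resolveIngestUrl(streamKey) {
  // fetch follows the GeoDNS redirect automatically, so the final
  // response URL is the lowest-latency region's ingest endpoint.
  const res = await fetch(redirectEndpointFor(streamKey), { method: "HEAD" });
  return res.url;
}
```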
We are only interested in getting the redirect URL from the response, so that we can set up the correct ICE servers.
From the above response headers, the best WebRTC ingest URL for the user is https://lax-prod-catalyst-2.lp-playback.studio/webrtc/{STREAM_KEY}. We will use this in the next step.
The process will change in the future to remove the need for this extraneous HEAD request; please check back later.
Broadcast
Now that we have the endpoint for the ICE servers, we can start SDP negotiation following the WHIP spec and kick off a livestream.
The outline of the steps is:
- Create a new RTCPeerConnection with the ICE servers from the redirect URL.
- Construct an SDP offer using the library of your choice.
- Wait for ICE gathering to complete.
- Send the SDP offer to the server and get the response.
- Use the response to set the remote description on the RTCPeerConnection.
- Get a local media stream, add its tracks to the peer connection, and set the stream as the srcObject of the video element.
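The steps above can be sketched in plain browser JavaScript (a minimal example with our own helper names; error handling and reconnection are omitted, and we capture media before creating the offer so the offer includes the camera and microphone tracks):

```javascript
// Build the WHIP POST options for an SDP offer.
function whipRequestInit(sdpOffer) {
  return {
    method: "POST",
    headers: { "Content-Type": "application/sdp" },
    body: sdpOffer,
  };
}

// Minimal sketch of the broadcast flow. `ingestUrl` comes from the
// redirect step above; `videoElement` is the <video> for local preview.
async function startBroadcast(ingestUrl, iceServers, videoElement) {
  // Create the peer connection with the ICE servers from the redirect URL.
  const pc = new RTCPeerConnection({ iceServers });

  // Get a local camera/microphone stream, add its tracks, and preview it.
  const stream = await navigator.mediaDevices.getUserMedia({
    video: true,
    audio: true,
  });
  stream.getTracks().forEach((track) => pc.addTrack(track, stream));
  videoElement.srcObject = stream;

  // Construct the SDP offer and wait for ICE gathering to complete.
  await pc.setLocalDescription(await pc.createOffer());
  await new Promise((resolve) => {
    if (pc.iceGatheringState === "complete") return resolve();
    pc.addEventListener("icegatheringstatechange", () => {
      if (pc.iceGatheringState === "complete") resolve();
    });
  });

  // POST the offer to the WHIP endpoint and apply the SDP answer.
  const res = await fetch(ingestUrl, whipRequestInit(pc.localDescription.sdp));
  await pc.setRemoteDescription({ type: "answer", sdp: await res.text() });
  return pc;
}
```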
We just performed SDP negotiation following the WHIP spec (which outlines the structure for the POST requests seen above) to create a new livestream, then retrieved a local camera source and started a broadcast!
To make the above process clearer, here is the flow (credit to the authors of the WHIP spec):
The final HTTP DELETE is not needed for our media server, since we detect the end of broadcast by the lack of incoming packets from the gateway.