Start here in 5 minutes
What do you want to build?
- I want to add video to my app
- I want to run AI on video
- I want to run a gateway
- I want to contribute GPU compute
- I want to extend the protocol
You are: An application developer adding live streaming or video-on-demand to a product.
Your path: Use a hosted gateway service. You do not need to run infrastructure.
What you’ll use: livepeer npm package, @livepeer/react Player and Broadcast components, RTMP ingest, HLS playback, Livepeer Studio dashboard.
Primary CTA: Start implementation with the quickstart.
Secondary CTA: Use Studio product docs for production API details.
Video Streaming Quickstart
Create a livestream, get a stream key, and play back with the Livepeer Player in minutes.
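As a sketch of what the quickstart covers, the snippet below creates a stream through the Livepeer Studio REST API and derives an HLS playback URL for the Player. The `/api/stream` endpoint, response field names, and the `livepeercdn.studio` URL pattern are assumptions based on current Studio behavior; treat the quickstart as the authoritative reference.

```typescript
// Sketch only: endpoint path, response fields, and CDN URL pattern are
// assumptions -- confirm them against the Livepeer Studio API docs.

// Build the HLS playback URL from a stream's playbackId.
function hlsPlaybackUrl(playbackId: string): string {
  return `https://livepeercdn.studio/hls/${playbackId}/index.m3u8`;
}

// Create a stream; returns the stream key (for RTMP ingest) and playbackId.
async function createStream(apiKey: string, name: string) {
  const res = await fetch("https://livepeer.studio/api/stream", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${apiKey}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({ name }),
  });
  if (!res.ok) throw new Error(`stream creation failed: ${res.status}`);
  const stream = await res.json();
  return { streamKey: stream.streamKey, playbackId: stream.playbackId };
}
```

From there, push RTMP to the ingest URL shown in the Studio dashboard using the stream key, and point the @livepeer/react Player at the playback URL.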
Livepeer Studio
Hosted video gateway - REST API, SDKs, dashboard. Best for production video applications.
Three ways to go deeper
The tabs above get you to a first win quickly. Once you know how you want to engage with the network, most developer journeys settle into one of three longer-term paths.
Workload Provider
Build workloads that orchestrators run, whether that means BYOC, custom
routing, or direct smart contract control.
Workload Consumer
Consume existing AI or video workloads through hosted gateways, Daydream, or
other higher-level products.
Core Contributor
Work directly on go-livepeer, the protocol, or the supporting OSS stack that
powers the network.
Workload Provider
Workload Providers define what runs on Livepeer compute. That can mean packaging a BYOC container, choosing your own routing layer, or interacting with the protocol more directly when you need full operational control.
Start with the standard BYOC route
BYOC is the clearest path for most builders. Package the workload, understand the pipeline model, and validate the routing flow end to end.
BYOC
Learn how custom containers attach to Livepeer’s inference and routing model.
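At its core, a BYOC workload is a container exposing an HTTP interface that orchestrators can route jobs to. The handler shape below is an illustrative assumption, not the official BYOC job schema; the BYOC docs define the real request/response contract.

```typescript
import { createServer } from "node:http";

// Hypothetical job envelope -- the real BYOC schema is defined by the
// BYOC docs; these types are placeholders for illustration.
interface JobRequest { id: string; input: string; }
interface JobResult { id: string; output: string; }

// Pure job handler: your actual model or workload logic would live here.
function handleJob(job: JobRequest): JobResult {
  return { id: job.id, output: job.input.toUpperCase() }; // placeholder work
}

// Wrap the handler in an HTTP endpoint an orchestrator could call.
const server = createServer((req, res) => {
  if (req.method !== "POST" || req.url !== "/process") {
    res.writeHead(404);
    res.end();
    return;
  }
  let body = "";
  req.on("data", (chunk) => (body += chunk));
  req.on("end", () => {
    const result = handleJob(JSON.parse(body) as JobRequest);
    res.writeHead(200, { "Content-Type": "application/json" });
    res.end(JSON.stringify(result));
  });
});

// Only bind a port when explicitly asked to serve.
if (process.env.BYOC_SERVE) {
  server.listen(8000, () => console.log("BYOC worker listening on :8000"));
}
```

The point is the shape, not the logic: a stateless HTTP endpoint is what lets the routing layer treat your container like any other pipeline.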
Add a gateway when you need routing control
If you need your own routing, auth, or SLA layer, pair BYOC with a gateway path instead of relying only on hosted products.
Run a Gateway
Understand when gateway control is worth the extra operational complexity.
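For a sense of scale, running your own gateway is a single go-livepeer process. The invocation below is a hedged sketch of an off-chain gateway pointed at a known orchestrator; flag names follow recent go-livepeer releases, so confirm them with `livepeer -help` for your version.

```shell
# Illustrative only: off-chain gateway routing jobs to one orchestrator.
# <orchestrator-host> is a placeholder -- substitute a real address.
livepeer \
  -gateway \
  -orchAddr <orchestrator-host>:8935 \
  -httpAddr 0.0.0.0:8937
```

On-chain gateways add wallet, deposit, and discovery configuration on top of this; the gateway guide walks through those steps.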
livepeer-ops
A direct smart contract and operator-management toolkit for advanced
workload providers.
Embody pipeline reference
Reference implementation for a real-time avatar workflow built on direct
orchestration patterns.
Workload Consumer
Workload Consumers do not need to run infrastructure. They use hosted APIs, higher-level products, or existing workloads already available on the network.
AI Quickstart
Get your first response from Livepeer AI without setting up infrastructure.
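As a taste of what the AI quickstart walks through, the sketch below posts a text-to-image job to a hosted AI gateway. The gateway URL, path, and payload field names are assumptions for illustration; the quickstart lists the real endpoint and parameters.

```typescript
// Hypothetical gateway endpoint and payload shape -- placeholders only;
// the AI quickstart documents the real values.
const GATEWAY = "https://dream-gateway.livepeer.cloud";

// Build the request for a text-to-image pipeline call.
function buildTextToImageRequest(prompt: string, modelId: string) {
  return {
    url: `${GATEWAY}/text-to-image`,
    init: {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ model_id: modelId, prompt }),
    },
  };
}

// Fire the request and return the image URLs from the response.
async function textToImage(prompt: string, modelId: string): Promise<string[]> {
  const { url, init } = buildTextToImageRequest(prompt, modelId);
  const res = await fetch(url, init);
  if (!res.ok) throw new Error(`inference failed: ${res.status}`);
  const data = await res.json();
  return (data.images ?? []).map((img: { url: string }) => img.url);
}
```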
Daydream
Explore the productized real-time generative workflow built on Livepeer
infrastructure.
AI on Livepeer
Compare hosted APIs, ComfyStream, and custom workload options before you go
deeper.
Core Contributor
Core Contributors work on the repos that power the network itself:go-livepeer, the protocol contracts, the AI runtime, ComfyStream, and the docs and tooling around them.
OSS Stack
See how the main Livepeer repos fit together and where each kind of
contribution starts.
Contribution Guide
Review contribution standards, repo expectations, and submission
conventions.
go-livepeer
Start with the main node implementation if you want to work on gateways,
orchestrators, or the protocol runtime.
Zero-to-Hero Progression
Each builder path has a clear progression from first action to ecosystem contribution:
| Stage | Application Dev | Gateway Operator | GPU Operator | AI Developer |
|---|---|---|---|---|
| Start | API key + first stream | Read requirements | Check GPU compat. | API key + first inference |
| First Win | Stream playing in app | Gateway running locally | Orchestrator registered | First AI result returned |
| Production | Live app with users | On-chain gateway routing jobs | Earning ETH + LPT | AI pipeline in product |
| Hero | Build tools for other devs | Run multi-region gateway product | Top-tier orchestrator | Ship novel AI pipeline |