Livepeer serves multiple builder profiles. The fastest path to your first success depends on what you’re trying to build. Use this guide to find your lane.

Start here in 5 minutes

What do you want to build?

You are: An application developer adding live streaming or video-on-demand to a product.
Your path: Use a hosted gateway service. You do not need to run infrastructure.
What you'll use: the livepeer npm package, the @livepeer/react Player and Broadcast components, RTMP ingest, HLS playback, and the Livepeer Studio dashboard.
Primary CTA: Start implementation with the quickstart.
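RTMP ingest and HLS playback addresses follow predictable URL patterns. A minimal sketch of the two helpers you would end up writing; the hostnames below are illustrative assumptions, so copy the authoritative values from your stream's page in the Livepeer Studio dashboard:

```typescript
// Build the RTMP ingest URL and HLS playback URL for a stream.
// NOTE: the hostnames below are illustrative assumptions; check the
// Livepeer Studio dashboard for the real values for your stream.

function rtmpIngestUrl(streamKey: string): string {
  // Broadcast software (e.g. OBS or ffmpeg) pushes video to this address.
  return `rtmp://rtmp.livepeer.com/live/${streamKey}`;
}

function hlsPlaybackUrl(playbackId: string): string {
  // Any HLS-capable player can consume this manifest URL.
  return `https://livepeercdn.studio/hls/${playbackId}/index.m3u8`;
}
```

In practice you rarely build the playback URL by hand: the @livepeer/react Player accepts the playback ID directly.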

Video Streaming Quickstart

Create a livestream, get a stream key, and play back with the Livepeer Player in minutes.
Secondary CTA: Use Studio product docs for production API details.
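The quickstart flow above can be sketched in a few lines. This assumes the Studio REST endpoint `POST https://livepeer.studio/api/stream` with a Bearer API key, and response fields named `streamKey` and `playbackId`; verify both against the current Studio API reference before relying on them:

```typescript
// Sketch: create a livestream via the Livepeer Studio REST API.
// Assumes POST https://livepeer.studio/api/stream with Bearer auth;
// verify endpoint and response fields against the Studio API docs.

interface CreateStreamRequest {
  url: string;
  init: { method: string; headers: Record<string, string>; body: string };
}

// Pure helper so the request shape is easy to inspect and test.
function buildCreateStreamRequest(apiKey: string, name: string): CreateStreamRequest {
  return {
    url: "https://livepeer.studio/api/stream",
    init: {
      method: "POST",
      headers: {
        Authorization: `Bearer ${apiKey}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({ name }),
    },
  };
}

// Send it and pull out the stream key (RTMP ingest) and playback ID (HLS).
async function createStream(apiKey: string, name: string) {
  const { url, init } = buildCreateStreamRequest(apiKey, name);
  const res = await fetch(url, init);
  if (!res.ok) throw new Error(`Studio API error: ${res.status}`);
  const stream = await res.json();
  return { streamKey: stream.streamKey, playbackId: stream.playbackId };
}
```

Feed the returned stream key to your broadcaster and the playback ID to the Player, and you have the full quickstart loop.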

Livepeer Studio

Hosted video gateway: REST API, SDKs, and dashboard. Best for production video applications.

Three ways to go deeper

The tabs above get you to a first win quickly. Once you know how you want to engage with the network, most developer journeys settle into one of three longer-term paths. Here is the deeper operating model behind those roles:

Workload Provider

Workload Providers define what runs on Livepeer compute. That can mean packaging a BYOC container, choosing your own routing layer, or interacting with the protocol more directly when you need full operational control.

Start with the standard BYOC route

BYOC is the clearest path for most builders. Package the workload, understand the pipeline model, and validate the routing flow end to end.

BYOC

Learn how custom containers attach to Livepeer’s inference and routing model.
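To make the "package the workload" step concrete, here is a hypothetical sketch of a containerized worker exposing an HTTP endpoint. The `/process` route and payload shape are illustrative assumptions, not the BYOC contract; the BYOC docs define the actual request/response interface your container must implement:

```typescript
// Hypothetical BYOC-style worker: an HTTP service inside a container.
// The /process route and payload shape are illustrative only; the real
// contract is defined by the BYOC documentation.
import { createServer } from "node:http";

// Pure handler so the job logic is testable without a running server.
function handleJob(payload: { input: string }): { output: string } {
  // Stand-in for real work (e.g. running a model on the input).
  return { output: payload.input.toUpperCase() };
}

const server = createServer((req, res) => {
  if (req.method === "POST" && req.url === "/process") {
    let body = "";
    req.on("data", (chunk) => (body += chunk));
    req.on("end", () => {
      const result = handleJob(JSON.parse(body));
      res.writeHead(200, { "Content-Type": "application/json" });
      res.end(JSON.stringify(result));
    });
  } else {
    res.writeHead(404).end();
  }
});

// Containers conventionally take their port from the environment.
if (process.env.RUN_WORKER) server.listen(Number(process.env.PORT ?? 8080));
```

Keeping the job logic in a pure function separate from the HTTP plumbing makes it easy to swap the transport for whatever interface the BYOC pipeline actually expects.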

Add a gateway when you need routing control

If you need your own routing, auth, or SLA layer, pair BYOC with a gateway path instead of relying only on hosted products.

Run a Gateway

Understand when gateway control is worth the extra operational complexity.
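A common pattern once you run your own gateway is pointing your client at the gateway's base URL instead of a hosted endpoint, keeping auth and routing under your control. A minimal sketch, assuming your gateway exposes a compatible HTTP API; the base URL, path, and auth scheme here are all illustrative assumptions:

```typescript
// Sketch: route requests through a self-hosted gateway rather than a
// hosted endpoint. Base URL, path, and auth scheme are assumptions.

interface GatewayClientOptions {
  baseUrl: string;   // e.g. "https://gateway.internal.example.com" (hypothetical)
  apiKey?: string;   // omit if your gateway handles auth upstream
}

function buildGatewayRequest(opts: GatewayClientOptions, path: string, payload: unknown) {
  const headers: Record<string, string> = { "Content-Type": "application/json" };
  if (opts.apiKey) headers.Authorization = `Bearer ${opts.apiKey}`;
  return {
    // Strip a trailing slash so concatenation never doubles it.
    url: `${opts.baseUrl.replace(/\/$/, "")}${path}`,
    init: { method: "POST", headers, body: JSON.stringify(payload) },
  };
}
```

The point is the indirection: the rest of your application code is unchanged whether requests go to a hosted product or to your own routing layer.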

Use direct contract tooling for advanced control

The DeFine-maintained livepeer-ops workflow and Embody reference implementation show the more protocol-native route: direct orchestrator management, remote coordination, and custom control planes.

Workload Consumer

Workload Consumers do not need to run infrastructure. They use hosted APIs, higher-level products, or existing workloads already available on the network.

Core Contributor

Core Contributors work on the repos that power the network itself: go-livepeer, the protocol contracts, the AI runtime, ComfyStream, and the docs and tooling around them.

Zero-to-Hero Progression

Each builder path has a clear progression from first action to ecosystem contribution:
| Stage | Application Dev | Gateway Operator | GPU Operator | AI Developer |
| --- | --- | --- | --- | --- |
| Start | API key + first stream | Read requirements | Check GPU compatibility | API key + first inference |
| First Win | Stream playing in app | Gateway running locally | Orchestrator registered | First AI result returned |
| Production | Live app with users | On-chain gateway routing jobs | Earning ETH + LPT | AI pipeline in product |
| Hero | Build tools for other devs | Run multi-region gateway product | Top-tier orchestrator | Ship novel AI pipeline |

Not sure yet? Browse by use case

Last modified on March 16, 2026