Technical Role
An Orchestrator is the compute supply layer of the Livepeer network. It accepts jobs from Gateways, routes them to GPU workers, executes the work, and returns results. Orchestrators perform the actual processing - transcoding video frames, running AI inference pipelines, executing BYOC containers.

Core responsibilities:
- Job execution - receive segments or inference requests from Gateways and route them to GPU workers
- Capability advertisement - broadcast supported pipelines, models, GPU types, and price per unit
- Payment receipt - collect probabilistic micropayment tickets per segment or pixel from Gateways
- Reward calling - trigger the protocol reward mechanism each round to claim LPT inflation rewards
- Worker management - coordinate transcoder workers (video) and AI runners (inference)
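The payment-receipt responsibility above can be sketched numerically. A winning ticket pays its full face value, and each ticket wins with a small probability, so the expected payment per ticket is the product of the two. The face value and win probability below are illustrative numbers, not protocol constants:

```python
def ticket_expected_value(face_value_wei: int, win_prob: float) -> float:
    """A winning probabilistic micropayment ticket pays its full face value;
    each ticket wins with probability win_prob, so the expected payment per
    ticket is face_value * win_prob."""
    return face_value_wei * win_prob

# Illustrative numbers only: 0.05 ETH face value, 1-in-500 win probability.
ev_wei = ticket_expected_value(50_000_000_000_000_000, 1 / 500)
print(ev_wei)  # expected earnings per ticket, in wei
```

Over many segments the Orchestrator's ETH income converges on this expected value, while each individual payment stays small enough to settle off-chain.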
Network Role
Orchestrators are the supply side of the Livepeer marketplace. Where Gateways aggregate application demand, Orchestrators provide the GPU compute that fulfils it.

- Active set participation - only the top 100 Orchestrators by total bonded stake are eligible to receive work in any given round
- Staking and security - LPT staked to an Orchestrator signals economic commitment; Delegators extend this stake in exchange for a share of earnings
- Governance - Orchestrators participate in protocol governance via LPT voting weight
- Capability discovery - Orchestrators register capabilities and prices on-chain so Gateways can find them via the ServiceRegistry contract on Arbitrum
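The active-set rule above amounts to a top-k selection by total bonded stake each round. A minimal sketch, assuming a hypothetical `total_stake` field that sums own and delegated stake:

```python
def active_set(orchestrators: list[dict], size: int = 100) -> list[dict]:
    # Rank candidates by total bonded stake (own stake plus delegated stake)
    # and keep only the top `size`; everyone below the cutoff receives no
    # work for the round.
    return sorted(orchestrators, key=lambda o: o["total_stake"], reverse=True)[:size]

# 150 hypothetical candidates with increasing stake; only 100 make the set.
candidates = [{"addr": f"0x{i:02x}", "total_stake": i * 1_000} for i in range(150)]
active = active_set(candidates)
```

This is why Delegators matter to operators: delegated stake counts toward the ranking, so attracting delegation is often the difference between being in or out of the active set.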
Orchestrators interact with the BondingManager, RoundsManager, TicketBroker, and
ServiceRegistry contracts on Arbitrum. Gateways interact only with TicketBroker and ServiceRegistry.
This protocol depth is what distinguishes the Orchestrator role from the Gateway role.
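The split in contract surface area described above can be stated as simple set arithmetic; the difference shows exactly where the extra protocol depth lives:

```python
# On-chain contracts each role touches, per the description above.
CONTRACTS = {
    "Orchestrator": {"BondingManager", "RoundsManager", "TicketBroker", "ServiceRegistry"},
    "Gateway": {"TicketBroker", "ServiceRegistry"},
}

# Contracts only Orchestrators must interact with: staking and round
# progression are Orchestrator-side concerns.
orchestrator_only = CONTRACTS["Orchestrator"] - CONTRACTS["Gateway"]
print(sorted(orchestrator_only))  # ['BondingManager', 'RoundsManager']
```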
Deployment Types
Orchestrators run in five common configurations. Choose the setup that matches your hardware scale, workload mix, and operating model. See the Navigator to find the right setup path for your goals and hardware.

Who Should Operate One
Orchestrators are infrastructure operators, not application builders. The role requires sustained uptime, GPU hardware investment, and protocol participation (LPT staking or pool membership).

The Miner - Can I earn from my GPU?
Existing GPU operators can direct spare capacity at video transcoding or AI inference and earn ETH for the work. Start with the Incentive Model to understand what earnings look like for your hardware tier.
The Easy Earner - Simplest path?
Joining an existing pool is the fastest path when you want to participate without managing LPT staking or on-chain activation. Bring GPU hardware; the pool handles the rest. See for options.
The Pro Operator - Adding AI to an existing setup?
Operators who already run video transcoding can add AI inference workloads from the same node. New capabilities are advertised automatically once configured. See Orchestrator Capabilities for the workload overview.
The Business - Building at scale?
Commercial Orchestrators serving application workloads (Daydream, Livepeer Studio, other products) operate differently from solo GPU miners. The incentives, pricing strategy, and operational requirements differ significantly. See the Incentive Model for the revenue model breakdown.
Related Pages
Orchestrator Capabilities
Workload types, execution boundaries, and Gateway selection signals.
Orchestrator Architecture
How Orchestrators connect to Gateways, the protocol layer, and GPU workers.
Incentive Model
Revenue streams, cost structure, and how operating an Orchestrator earns.
Navigator
Find the right setup path for your hardware, goals, and experience level.