What a Gateway Operator Does
Gateway operators handle:
- Job intake and API requests
- Routing workloads to the best Orchestrator (GPU Node)
- Managing pricing, capabilities, and service metadata
- Publishing offerings (AI inference, video transcoding, and more) to the Marketplace
- Monitoring job performance, latency, and reliability
Gateways do not perform the AI inference or transcoding themselves; that work is done by orchestrators.
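As an illustration, the gateway's routing step can be sketched as picking the cheapest orchestrator that meets a latency bound. This is a minimal sketch with hypothetical field names (`latency_ms`, `price_per_unit`), not the actual Livepeer selection algorithm:

```python
from dataclasses import dataclass

@dataclass
class Orchestrator:
    address: str          # on-chain address of the orchestrator
    latency_ms: float     # measured round-trip latency from the gateway
    price_per_unit: int   # advertised price (e.g. wei per compute unit)

def route_job(orchestrators, max_latency_ms=500):
    """Pick the cheapest orchestrator that satisfies the latency bound."""
    eligible = [o for o in orchestrators if o.latency_ms <= max_latency_ms]
    if not eligible:
        raise RuntimeError("no orchestrator meets the latency requirement")
    return min(eligible, key=lambda o: o.price_per_unit)

pool = [
    Orchestrator("0xaaa", 120.0, 900),
    Orchestrator("0xbbb", 80.0, 1200),
    Orchestrator("0xccc", 650.0, 400),  # cheapest, but filtered out as too slow
]
print(route_job(pool).address)  # → 0xaaa
```

Real gateways may weigh more signals (stake, historical reliability, region), but the shape of the decision is the same: filter on constraints, then optimize on cost.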
Key Marketplace Features
1. Capability Discovery
Gateways and orchestrators list:
- AI model support
- Versioning and model weights
- Pipeline compatibility
- GPU type and compute class
Applications can programmatically choose the best provider.
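Capability listings can be consumed programmatically to select a provider. A hedged sketch, assuming a provider listing with hypothetical keys (`models`, `gpu`, `pipelines`) rather than the actual API schema:

```python
providers = [
    {"id": "orch-1", "models": {"sdxl"}, "gpu": "rtx-4090", "pipelines": {"text-to-image"}},
    {"id": "orch-2", "models": {"llama-3-8b"}, "gpu": "a100", "pipelines": {"llm"}},
]

def find_providers(providers, model, pipeline):
    """Return providers advertising both the requested model and pipeline."""
    return [p for p in providers
            if model in p["models"] and pipeline in p["pipelines"]]

matches = find_providers(providers, "llama-3-8b", "llm")
print([p["id"] for p in matches])  # → ['orch-2']
```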
2. Dynamic Pricing
Pricing can vary by:
- GPU class
- Model complexity
- Latency SLA
- Throughput requirements
- Region
Gateways expose pricing APIs for transparent selection.
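The pricing factors above can be combined into a single quote. A purely illustrative formula using integer percent multipliers (exact integer arithmetic matters when prices are denominated in wei); the multiplier values and field names are assumptions, not real network prices:

```python
# Hypothetical multipliers, expressed in percent (100 = 1.0x).
GPU_CLASS_PCT = {"rtx-4090": 100, "a100": 180, "h100": 250}
SLA_PCT = {"best-effort": 100, "low-latency": 140}

def quote(base_price_wei, gpu, sla, throughput_units):
    """Scale a base price by GPU class and latency SLA, then by volume."""
    return base_price_wei * GPU_CLASS_PCT[gpu] * SLA_PCT[sla] * throughput_units // 10_000

print(quote(1000, "rtx-4090", "best-effort", 1))  # → 1000 (base case)
print(quote(1000, "a100", "low-latency", 10))     # → 25200
```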
3. Open Competition
Orchestrators compete on:
- Speed
- Reliability
- GPU quality
- Cost efficiency
Gateways compete on:
- Routing quality
- Supported features
- Latency
- Developer ecosystem fit
This creates a healthy decentralized market.
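An application choosing among competing nodes often reduces to a weighted score across these dimensions. A sketch with made-up weights and node fields; a real deployment would tune both to its own SLA:

```python
def score(node, weights):
    """Higher is better: reward uptime, penalize latency and price."""
    return (weights["reliability"] * node["uptime"]
            + weights["speed"] * (1.0 / node["latency_ms"])
            + weights["cost"] * (1.0 / node["price"]))

nodes = [
    {"id": "orch-a", "uptime": 0.999, "latency_ms": 100, "price": 1000},
    {"id": "orch-b", "uptime": 0.950, "latency_ms": 40, "price": 1500},
]
weights = {"reliability": 1.0, "speed": 50.0, "cost": 200.0}

best = max(nodes, key=lambda n: score(n, weights))
print(best["id"])  # → orch-b (its low latency outweighs the higher price here)
```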
4. BYOC Integration
Any container-based pipeline can be brought into the marketplace:
- Run custom AI models
- Run ML workflows
- Execute arbitrary compute
- Support enterprise workloads
Gateways advertise BYOC offerings; orchestrators execute containers.
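A BYOC offering is ultimately a container spec that a gateway can sanity-check before advertising it. A minimal sketch; the required fields and registry URL are hypothetical, not an actual Livepeer job schema:

```python
REQUIRED_FIELDS = {"image", "command", "resources"}

def validate_byoc_job(spec):
    """Reject specs missing the fields an orchestrator needs to run the container."""
    missing = REQUIRED_FIELDS - spec.keys()
    if missing:
        raise ValueError(f"missing fields: {sorted(missing)}")
    return True

job = {
    "image": "registry.example.com/acme/custom-model:1.2",  # hypothetical image
    "command": ["python", "serve.py"],
    "resources": {"gpu": "a100", "vram_gb": 40},
}
print(validate_byoc_job(job))  # → True
```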
Marketplace Benefits
- Developer choice — choose the best model, price, and performance
- Economic incentives — better nodes earn more work
- Scalability — network supply grows independently of demand
- Innovation unlock — new models and pipelines can be added instantly
- Decentralization — no single operator controls the workload flow
Summary
The Marketplace turns Livepeer into a competitive, discoverable, real-time AI compute layer.
- Gateways expose services
- Orchestrators execute them
- Applications choose the best fit
Last modified on March 1, 2026