## When to Use BYOC
| Use BYOC when… | Use ComfyStream or the AI gateway API instead when… |
|---|---|
| Your model does not fit into a ComfyUI node graph | Your model is already a ComfyUI workflow |
| You need full control over the inference runtime | You want a hosted or managed inference path |
| You are using a non-standard model architecture | You are running standard batch pipelines (text-to-image, etc.) |
| You want to earn network fees as an AI worker | You are building a client application, not a worker |
| Your pipeline requires Python packages not available in ComfyStream | — |
## Prerequisites
- Docker installed on a Linux machine with NVIDIA GPU
- Your AI model or processing function implemented and tested locally
- go-livepeer — to register your container as a worker on the network
- Familiarity with the trickle streaming protocol (you do not need to implement it directly — PyTrickle handles this)
## How BYOC Works
Your BYOC container does two things:

- Exposes a REST API that the Livepeer gateway calls to start, stop, and update your processing session
- Connects to the trickle streaming layer — subscribes to an input stream URL and publishes to an output stream URL
You implement only the frame-level processing logic (a `FrameProcessor`), and PyTrickle handles the streaming, encoding, decoding, and API surface.
## Step 1 — Implement Your Processor
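Assuming PyTrickle is published as the `pytrickle` package, installation plus a minimal processor might look like the sketch below. The `FrameProcessor` name comes from this guide, but the exact base-class API is an assumption, so treat this as a shape to follow rather than a drop-in implementation:

```python
# Install first (package name assumed): pip install pytrickle
#
# Stand-in for PyTrickle's FrameProcessor interface. The class name is
# taken from this guide; the method signatures are assumptions -- check
# the PyTrickle source for the real base class.

class FrameProcessor:
    """Receives decoded frames from the input stream, returns processed ones."""

    def process_frame(self, frame):
        raise NotImplementedError

class InvertProcessor(FrameProcessor):
    """Toy example: invert every byte of the frame payload."""

    def __init__(self, **params):
        # Parameters arrive via /api/stream/start and /api/stream/params.
        self.params = params

    def process_frame(self, frame: bytes) -> bytes:
        # Replace this with your model's inference call.
        return bytes(255 - b for b in frame)
```

The key design point is that your code never touches the trickle protocol: you transform one frame at a time, and PyTrickle feeds you input frames and publishes your output.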
Install PyTrickle, then implement your processor by subclassing the interface it provides.

## Step 2 — Define the REST API Contract
PyTrickle automatically exposes these endpoints on your container. The Livepeer gateway calls them to manage your processing session.

| Endpoint | Method | Request body | Purpose |
|---|---|---|---|
| `/api/stream/start` | POST | `{subscribe_url, publish_url, gateway_request_id, params}` | Start a new stream processing session |
| `/api/stream/params` | POST | `{key: value, ...}` | Update parameters mid-stream |
| `/api/stream/status` | GET | — | Return the current session status |
| `/api/stream/stop` | POST | — | Stop the current session |
You do not implement these endpoints yourself; PyTrickle's `StreamServer` provides them.
Example `/api/stream/start` body:
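For illustration, here is a body matching the fields listed in the table above; the URLs, request ID, and the `strength` parameter are all placeholders:

```json
{
  "subscribe_url": "https://gateway.example.com/trickle/abc123-in",
  "publish_url": "https://gateway.example.com/trickle/abc123-out",
  "gateway_request_id": "abc123",
  "params": {
    "strength": 0.8
  }
}
```

The `params` object is passed through to your processor, so its keys are whatever your `FrameProcessor` accepts.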
## Step 3 — Build Your Docker Container
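A minimal Dockerfile sketch. The base image, exposed port, and file names are assumptions for illustration, not PyTrickle requirements; use whatever CUDA base and entrypoint your model needs:

```dockerfile
# Base image and port are assumptions; adjust for your model's CUDA needs.
FROM nvidia/cuda:12.1.1-runtime-ubuntu22.04

RUN apt-get update && \
    apt-get install -y --no-install-recommends python3 python3-pip && \
    rm -rf /var/lib/apt/lists/*

WORKDIR /app
COPY requirements.txt .
RUN pip3 install --no-cache-dir -r requirements.txt
COPY . .

# Port your StreamServer listens on (assumed here to be 8000).
EXPOSE 8000
CMD ["python3", "server.py"]
```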
## Step 4 — Test Locally
Before deploying to the Livepeer network, verify your container processes a stream end-to-end. Prerequisites for local testing:

- Install `http-trickle` (the trickle protocol server)
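With `http-trickle` and your container both running, a session can be started by hand against the REST API from Step 2. The host, ports, and stream paths below are placeholders for your local setup:

```shell
# Placeholders: adjust host, ports, and stream paths to your local setup.
curl -X POST http://localhost:8000/api/stream/start \
  -H "Content-Type: application/json" \
  -d '{
        "subscribe_url": "http://localhost:<http-trickle-port>/in",
        "publish_url": "http://localhost:<http-trickle-port>/out",
        "gateway_request_id": "local-test",
        "params": {}
      }'
```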
Check `GET /api/stream/status` to confirm the session is active:
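For example, with the container listening on port 8000 (an assumption carried over from the earlier sketches):

```shell
curl http://localhost:8000/api/stream/status
```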
## Step 5 — Push to a Container Registry
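The orchestrator needs to pull your image, so publish it to a registry it can reach. The image name, tag, and registry below are placeholders:

```shell
# Placeholders: substitute your registry, image name, and tag.
docker tag my-byoc-worker:latest registry.example.com/my-byoc-worker:v1
docker push registry.example.com/my-byoc-worker:v1
```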
## Step 6 — Deploy to the Livepeer Network
Your BYOC container runs on an orchestrator: the orchestrator pulls your image, starts it, and routes live-video-to-video jobs to it. To register your container, you (or the orchestrator you are working with) configure go-livepeer to run in BYOC mode and point it at your container image.

BYOC orchestrator onboarding is actively scaling as of Phase 4 (January 2026). If you cannot find a willing orchestrator, reach out in the Livepeer Discord `#developers` channel.

## Building a Client Application on Top of BYOC
Once your BYOC container is live on the network, applications connect to it through a Livepeer gateway using the `@muxionlabs/byoc-sdk` package:
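Assuming the SDK is published to npm under the name shown above, start by installing it:

```shell
npm install @muxionlabs/byoc-sdk
```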
## Variants
### ComfyStream as a BYOC container
ComfyStream is already integrated with PyTrickle (Phase 4). To run ComfyStream as a BYOC worker, use the `muxionlabs/comfystream` image instead of building from scratch:
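A pull-and-run sketch; the tag, GPU flag, and port mapping are assumptions, so check the image's own documentation for its actual defaults:

```shell
# Tag, GPU flag, and port mapping are assumptions; check the image docs.
docker pull muxionlabs/comfystream:latest
docker run --gpus all -p 8000:8000 muxionlabs/comfystream:latest
```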