Cloud SPE operates free gateways as a public good for the Livepeer ecosystem. These gateways are ideal for experimentation, development, and open-source projects. For production applications with SLA requirements, consider Livepeer Studio. As of 02-March-2026, Cloud SPE publishes AI gateway access through tools.livepeer.cloud; check provider docs for the current direct API base URL and auth requirements.
What Cloud SPE Provides
Cloud SPE operates two types of gateways:

- RTMP Gateway - Free-to-use RTMP ingest for live video streaming, powered by the Livepeer transcoding network. Ideal for live streaming applications, Owncast integrations, and developers building on decentralized video infrastructure.
- AI Gateway - Free access point to the Livepeer AI network for running inference pipelines, including text-to-image, image-to-image, LLM inference, and more. Access and current endpoint details are published at tools.livepeer.cloud.

Getting Started
Access the AI Gateway
The Cloud SPE AI gateway is available at tools.livepeer.cloud. As of 02-March-2026, this portal is the canonical source for current AI endpoint and auth details. The tools page also shows available AI pipelines, network capabilities, and which orchestrators are warm (ready to serve requests without model-loading delay).

Cloud SPE Tools
Browse AI pipelines, check network capabilities, and access the AI gateway.
First Request Example (Portal-Guided API)
Use the tools portal as the source of truth for the active endpoint and required auth mode. As of this writing, the active endpoint is https://dream-gateway.livepeer.cloud, and unauthenticated POST /text-to-image requests succeed. If Cloud SPE changes its auth policy, use the current requirement shown at tools.livepeer.cloud.
Expected success signal: HTTP 200 response with an output object (for example, an image URL, base64 payload, or job/result handle depending on the active route).
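A minimal sketch of building that first request, assuming the dream-gateway endpoint and a JSON body with `prompt`, `width`, and `height` fields (parameter names and the accepted values should be confirmed against the routes published on tools.livepeer.cloud):

```python
import json
import urllib.request

# Assumption: the /text-to-image route accepts a JSON body with these
# field names; confirm the current schema at tools.livepeer.cloud.
GATEWAY = "https://dream-gateway.livepeer.cloud"

def build_text_to_image_request(prompt: str, width: int = 1024, height: int = 1024):
    """Build (but do not send) a POST request for the /text-to-image route."""
    body = json.dumps({
        "prompt": prompt,
        "width": width,
        "height": height,
    }).encode("utf-8")
    return urllib.request.Request(
        f"{GATEWAY}/text-to-image",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_text_to_image_request("a watercolor fox in a forest")
print(req.full_url)
# To send it: with urllib.request.urlopen(req) as resp: result = json.load(resp)
```

On success you should see the HTTP 200 response described above; parse the JSON body to retrieve the image URL or payload.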
Access the RTMP Gateway
Cloud SPE provides RTMP ingest for live streaming. Visit livepeer.cloud to get started and to find documentation on configuring your encoder.

Livepeer.Cloud
Get started with Cloud SPE’s free RTMP streaming infrastructure.
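As a rough illustration of what an encoder configuration looks like, the sketch below assembles a typical ffmpeg command for pushing a local file to an RTMP ingest. The ingest host and stream key are placeholders, not real Cloud SPE values; use whatever livepeer.cloud shows you after signing up.

```python
# Placeholders only: substitute the real ingest URL and stream key
# from livepeer.cloud before going live.
INGEST_HOST = "rtmp://ingest.example.invalid/live"  # placeholder, not a real host
STREAM_KEY = "your-stream-key"                      # placeholder

ffmpeg_cmd = [
    "ffmpeg", "-re", "-i", "input.mp4",          # read the source in real time
    "-c:v", "libx264", "-preset", "veryfast",    # H.264 video, low-latency preset
    "-b:v", "3000k", "-g", "60",                 # ~3 Mbps, keyframe every 2 s at 30 fps
    "-c:a", "aac", "-b:a", "128k",               # AAC audio
    "-f", "flv", f"{INGEST_HOST}/{STREAM_KEY}",  # RTMP carries an FLV container
]
print(" ".join(ffmpeg_cmd))
# To go live: subprocess.run(ffmpeg_cmd, check=True)
```

H.264 video with AAC audio in an FLV container is the standard combination RTMP servers expect, so most encoders (OBS, ffmpeg, hardware encoders) default to something close to this.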
What Cloud SPE Has Built
Beyond running gateways, Cloud SPE contributes tooling to the Livepeer ecosystem:

- Owncast Integration - Integration between Livepeer’s decentralized transcoding network and Owncast, the open-source self-hosted live streaming platform. Stream through decentralized infrastructure without centralized dependencies.
- Ollama-Based LLM Runner - A custom AI runner optimized for LLM inference on GPUs with as little as 8GB VRAM (GTX 1080, 1070 Ti, RTX 2060, and others), lowering the GPU barrier for AI orchestrators well below the official 16GB VRAM requirement.
- Network Capabilities Dashboard - Live visibility into which AI pipelines are available on the network and which orchestrators are serving them warm. Available at tools.livepeer.cloud/ai/network-capabilities.
- Decentralized Metrics Initiative - Cloud SPE proposed and is delivering decentralized SLA metrics and reliability infrastructure for the Livepeer network, improving transparency and usability for the whole ecosystem.
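Since the Ollama-based runner serves LLM inference on the network, an LLM request can be sketched the same way as the text-to-image example. This assumes the gateway exposes an OpenAI-style LLM route at /llm taking `model` and `messages` fields; the route, request shape, model name, and any auth requirement are assumptions to verify on tools.livepeer.cloud.

```python
import json
import urllib.request

# Assumptions: /llm route, OpenAI-style body, and an example model id.
# Check tools.livepeer.cloud for the routes and models actually warm.
GATEWAY = "https://dream-gateway.livepeer.cloud"

payload = {
    "model": "llama-3.1-8b-instruct",  # example id, not guaranteed to be served
    "messages": [
        {"role": "user", "content": "In one sentence, what is Livepeer?"},
    ],
}
req = urllib.request.Request(
    f"{GATEWAY}/llm",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
print(req.full_url)
# To send it: with urllib.request.urlopen(req) as resp: reply = json.load(resp)
```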
Community and Resources
- livepeer.cloud - Cloud SPE website and RTMP gateway
- tools.livepeer.cloud - AI tools, network capabilities, and gateway access
- Livepeer Forum - Cloud SPE - Treasury proposals, financial reports, and community updates
- Discord - #orchestrating - Get help from the Cloud SPE team (ping @mike_zoop)