NDI in the Cloud: Convert SRT to NDI Devices in Minutes
Aug 24, 2025
Moving live production to the cloud no longer means giving up NDI. With a modern gateway, you can ingest SRT from anywhere, convert it to NDI in the cloud, and expose those NDI sources to your mixers—locally or remotely.
Why SRT → NDI in the cloud
- Global capture: Field encoders push SRT through the public internet with ARQ and encryption.
- Production-ready I/O: Convert streams to NDI to feed switchers, graphics, replay, and multiviews.
- Elastic compute: Spin up instances per event, then shut them down to save costs.
Reference architecture
- Ingress: Contributors send SRT (caller) to a public SRT server (listener) with main/backup ports.
- Gateway: A cloud app receives SRT, normalizes audio/video, and exposes each feed as an NDI source.
- Discovery: Deploy an NDI Discovery Server in the same VPC/subnet. Point adapters and mixers to it.
- Production tools: vMix/OBS/Wirecast-style switchers connect to the cloud NDI sources or pull return feeds.
- Distribution: Encode ladders to HLS/DASH, or return SRT/RTMP to platforms and affiliates.
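To make the moving parts concrete, here is a minimal sketch of how this topology could be described in code. Every name, address, and port is a hypothetical placeholder, not part of any specific gateway product.

```python
from dataclasses import dataclass

@dataclass
class SrtIngest:
    """One contribution feed arriving over SRT (contributor = caller, cloud = listener)."""
    name: str          # production-facing label, reused as the NDI source name
    main_port: int     # primary SRT listener port
    backup_port: int   # backup SRT listener port
    latency_ms: int    # SRT receiver latency budget
    passphrase: str    # shared AES passphrase for this contributor

@dataclass
class CloudGateway:
    """The SRT-to-NDI gateway plus the discovery server it registers with."""
    public_host: str        # DNS name contributors dial into
    discovery_server: str   # NDI Discovery Server in the same VPC/subnet
    feeds: list[SrtIngest]

# Hypothetical two-camera event
gateway = CloudGateway(
    public_host="ingest.example.com",
    discovery_server="10.0.1.10:5959",
    feeds=[
        SrtIngest("Cam-1-Paris", 9001, 9101, 120, "s3cret-cam1"),
        SrtIngest("Cam-2-Lyon",  9002, 9102, 150, "s3cret-cam2"),
    ],
)
```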
Step-by-step setup
- Create two SRT listeners (main/backup). Share the hostname, ports, and passphrase with contributors (see the URL sketch after this list).
- Map each incoming SRT stream to an NDI output and name it clearly (e.g., Cam-1-Paris).
- Start the NDI Discovery Server and point every node's NDI configuration (or environment) to it (see the config sketch after this list).
- Add NDI sources in your switcher. Confirm resolution, fps, and audio mapping.
- Optionally publish a web player for QA so non-NDI users can monitor in a browser.
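If your gateway or a tool such as ffmpeg accepts SRT URLs, the main/backup listeners can be expressed as URIs with common libsrt query parameters (latency in milliseconds, passphrase, key length). Exact parameter support varies by tool, so treat this as a sketch with hypothetical hosts and ports:

```python
from urllib.parse import urlencode

def srt_listener_url(port: int, passphrase: str, latency_ms: int = 120) -> str:
    """Build an SRT listener URI using common libsrt query parameters."""
    params = {
        "mode": "listener",        # the cloud side waits; contributors dial in as callers
        "latency": latency_ms,     # receiver latency budget in milliseconds
        "passphrase": passphrase,  # enables AES encryption on the link
        "pbkeylen": 32,            # AES-256 key length
    }
    return f"srt://0.0.0.0:{port}?{urlencode(params)}"

# Hypothetical main/backup pair shared with one contributor
main_url = srt_listener_url(9001, "s3cret-cam1", latency_ms=120)
backup_url = srt_listener_url(9101, "s3cret-cam1", latency_ms=150)
print(main_url)
print(backup_url)
```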
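For the discovery step, many NDI tools read the Discovery Server address from the local NDI configuration file (on Linux, typically ~/.ndi/ndi-config.v1.json). The exact path and key names vary by SDK version and application, so verify against your tooling; this is only a sketch of pointing one node at the discovery server:

```python
import json
from pathlib import Path

def point_node_at_discovery(server: str) -> None:
    """Write an NDI config file telling this node to use a Discovery Server
    instead of mDNS. Path and schema are assumptions; check your NDI SDK docs."""
    config_path = Path.home() / ".ndi" / "ndi-config.v1.json"
    config_path.parent.mkdir(parents=True, exist_ok=True)
    config = {"ndi": {"networks": {"discovery": server}}}
    config_path.write_text(json.dumps(config, indent=2))

# Hypothetical discovery server in the same VPC/subnet as the gateway
point_node_at_discovery("10.0.1.10:5959")
```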
Latency & sync tips
- Set SRT latency to match path quality (e.g., 80–200 ms); don't over-tune (see the rule-of-thumb sketch after this list).
- Align frame rates and color spaces across cameras to reduce conversions.
- For multi-cam sports, use timestamp offsets or genlock-like features to line up shots.
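A common starting point in SRT deployment guides is a latency of roughly three to four times the measured round-trip time, kept inside a sane window. A quick sketch of that arithmetic, with the multiplier and bounds as tunable assumptions rather than fixed rules:

```python
def recommended_srt_latency_ms(rtt_ms: float, multiplier: float = 4.0,
                               floor_ms: int = 80, ceiling_ms: int = 200) -> int:
    """Rule-of-thumb SRT latency: a few RTTs of headroom for retransmissions,
    clamped to the 80-200 ms window suggested above."""
    return int(min(max(rtt_ms * multiplier, floor_ms), ceiling_ms))

# Example: a 35 ms round trip from a field encoder suggests ~140 ms of SRT latency
print(recommended_srt_latency_ms(35))  # 140
```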
Security & reliability
- Protect SRT with passphrases and firewall allow-lists.
- Run dual gateways across availability zones and route traffic to the healthy one automatically on failure (see the probe sketch after this list).
- Use IAM roles for storage and API access; rotate credentials.
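Failover between gateways can be as simple as a health probe that flips a DNS record or load-balancer target. A minimal sketch of the probe loop, assuming each gateway exposes a hypothetical /healthz endpoint and that promote_backup() is wired into your routing layer:

```python
import time
import urllib.request

GATEWAYS = {
    "primary": "http://10.0.1.20:8080/healthz",  # hypothetical health endpoints
    "backup":  "http://10.0.2.20:8080/healthz",
}

def healthy(url: str, timeout: float = 2.0) -> bool:
    """Treat any HTTP 200 within the timeout as healthy."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except OSError:
        return False

def promote_backup() -> None:
    """Placeholder: update DNS or the load balancer so traffic hits the backup gateway."""
    print("routing contributors to the backup gateway")

if __name__ == "__main__":
    while True:
        if not healthy(GATEWAYS["primary"]) and healthy(GATEWAYS["backup"]):
            promote_backup()
        time.sleep(5)
```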
Right-sizing instances
Match CPU/GPU/FPGA to your workloads: CPU for light I/O, GPU/FPGA for dense ladders and HEVC. Keep NDI and discovery in the same zone to minimize jitter.
Result: You get the reliability of SRT for contribution and the flexibility of NDI for production—fully cloud-based, fast to set up, and easy to scale.