SRT contribution boundary

Use an SRT server as the clean ingest edge for live production.

Bring remote guests, field feeds, cameras, and vMix outputs into one stable SRT contribution workflow. Callaba helps broadcast teams accept the signal, monitor RTT and bitrate, and turn the same input into browser playback, multi-destination delivery, and recording.

Built for producers and technical directors first, with API control available when engineering teams need protocol-level automation.

Remote contribution: Accept remote guests and field feeds safely

Keep noisy public-network inputs at a clean ingest edge before they hit production.

Runtime visibility: Watch RTT, bitrate, and live state

See the path while the stream is still recoverable instead of guessing after failure.

Operational outcome: Send the same signal to vMix, viewers, and archives

Use one controlled SRT boundary to feed playback, distribution, and recording together.

“Good service for organizing SRT broadcasts.”

Andy Smith, verified AWS customer
What an SRT server solves in practice
Broadcast teams first

Callaba turns one SRT ingest point into a real live production surface.

The SRT server is not the final product the audience sees. It is the controlled ingest and monitoring layer that lets the same contribution feed vMix, browser playback, distribution, and recording without fragile glue.

Source: Encoder, field kit, remote guest, or vMix output

Accept live SRT from the place where the feed actually enters the workflow.

Control layer: SRT ingest + monitoring

See RTT, bitrate, and connection health before the contribution becomes a fire drill.

Outputs: vMix, browser playback, distribution, and archive

Restream destinations include YouTube, Twitch, Facebook, Instagram Live, TikTok, X Live, LinkedIn Live, Kick, Vimeo, Dailymotion, Bilibili, Douyin, Kuaishou, WeChat Channels, XiaoHongShu, Niconico, LINE VOOM, and Trovo.

Route the same live feed into operator workflows and viewer-facing delivery without rebuilding the source path.

Remote guests, field contribution, and multi-location live production

An SRT server is a live video ingest endpoint that receives, sends, or relays SRT streams. It is usually used to move live video from a camera, encoder, remote venue, studio, or mobile device into a controlled media workflow.

SRT stands for Secure Reliable Transport. It runs over UDP and adds packet recovery, encryption, latency control, and runtime statistics. This makes it useful for live contribution over the public internet, venue networks, long-distance paths, and other imperfect connections.

An SRT server is not the same thing as a web video player, CDN, or viewer playback server. In most live workflows, SRT is used for contribution and ingest. After the SRT server receives the live feed, the stream can be routed, recorded, transcoded, restreamed, or converted to viewer formats such as HLS, WebRTC, or RTMP outputs.

Reading the delivery signals

Use RTT for the first sanity check, then switch to bitrate to confirm media is still arriving into the same live path.

What is an SRT server?

An SRT server is the endpoint that accepts or manages SRT connections. It can receive a live SRT stream from an encoder, relay that stream to another system, or act as a controlled handoff point between a remote source and a production platform.

In a typical live workflow, the SRT server does four practical jobs:

  • Receives the live feed: a camera encoder, OBS, vMix, FFmpeg, Larix, or another source sends video to the server.
  • Protects the contribution path: SRT can recover lost packets and encrypt the transport session.
  • Exposes live statistics: operators can monitor bitrate, RTT, packet loss, retransmissions, and connection state.
  • Passes the stream downstream: the server can route the feed into recording, transcoding, restreaming, switching, or playback workflows.

This makes the SRT server the boundary between the source side and the platform side. When a remote crew says “we are sending,” the SRT server is where you check whether the signal is actually arriving, whether it is stable, and whether the media can be used downstream.
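
As a minimal sketch of the first and last of those jobs, assuming an FFmpeg build with SRT support, the fragment below starts a listener on an example UDP port and copies whatever arrives into a local recording. The port, latency, and file name are illustrative assumptions.

    import subprocess

    # Assumed example values: UDP port 9000, 120 ms latency (FFmpeg's srt
    # "latency" option is expressed in microseconds), local archive file.
    INGEST_URL = "srt://0.0.0.0:9000?mode=listener&latency=120000"

    # FFmpeg waits for an incoming SRT caller, then writes the stream to
    # disk without re-encoding (-c copy), so the ingest stays cheap.
    subprocess.run([
        "ffmpeg",
        "-i", INGEST_URL,
        "-c", "copy",
        "-f", "mpegts",
        "archive.ts",
    ], check=True)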

SRT server vs SRT protocol

The SRT protocol is the transport method. The SRT server is the system or software endpoint that uses that protocol to receive or send live streams.

  • SRT protocol: the transport layer used to move live media over UDP with recovery, encryption, and latency control.
  • SRT server: the ingest or relay endpoint that accepts SRT sessions and connects them to the rest of the workflow.

For example, a field encoder may send an SRT stream to Callaba. In that case, the encoder is the sender, Callaba is the SRT server, and the SRT protocol is the transport method used between them.

What is an SRT live server?

An SRT live server is an SRT server used for real-time or near-real-time live video contribution. It receives a live stream while the event is happening and forwards it into a live production, recording, or distribution workflow.

Teams use SRT live servers for:

  • remote event contribution
  • cloud ingest from cameras and encoders
  • studio-to-cloud transport
  • partner feed handoff
  • backup contribution paths
  • remote production workflows
  • multi-destination restreaming

The word “live” matters because SRT tuning is different from simple file transfer. The server has to balance delay and recovery. If latency is too low for the real network path, the stream may connect but still break up during packet loss or jitter.
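
A common starting point in SRT tuning guides is to set latency near four times the measured RTT, and never below SRT's 120 ms default, then adjust for observed loss. A minimal sketch of that rule of thumb:

    def suggested_srt_latency_ms(rtt_ms: float, multiplier: float = 4.0,
                                 floor_ms: float = 120.0) -> float:
        """Rule-of-thumb SRT latency: about 4x RTT, never below the default floor.

        The 4x multiplier and 120 ms floor are common starting points, not
        protocol requirements; lossy paths often need a higher multiplier.
        """
        return max(rtt_ms * multiplier, floor_ms)

    # Example: a 60 ms RTT path suggests about 240 ms of SRT latency.
    print(suggested_srt_latency_ms(60))  # 240.0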

How an SRT server works

An SRT server receives encoded audio and video over an SRT connection. The media is already encoded before SRT carries it. For example, the video may be H.264 or H.265/HEVC, and the audio may be AAC.

The basic flow looks like this:

  1. An encoder creates the live audio and video stream.
  2. The encoder sends the stream to an SRT server.
  3. The SRT server receives the stream and tracks connection health.
  4. If packets are lost, SRT can request retransmission while the packets are still useful.
  5. The server passes the stream to the next workflow step: recorder, transcoder, restream, switcher, API workflow, or playback system.

This is why SRT is useful on real networks. It does not require the internet path to be perfect. It gives the stream a controlled recovery window, which helps when packet loss, jitter, or routing instability appears during a live event.
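
The sender side of that flow can be sketched the same way. Assuming an FFmpeg build with SRT support, the snippet below generates a test pattern, encodes it as H.264/AAC, and pushes it to a placeholder ingest endpoint in caller mode; a real encoder or camera feed would replace the synthetic inputs.

    import subprocess

    # Assumed example endpoint; replace with your server's address and port.
    TARGET_URL = "srt://ingest.example.com:9000?mode=caller&latency=120000"

    # -re paces the synthetic source in real time, like a live encoder would.
    subprocess.run([
        "ffmpeg", "-re",
        "-f", "lavfi", "-i", "testsrc2=size=1280x720:rate=30",
        "-f", "lavfi", "-i", "sine=frequency=440",
        "-c:v", "libx264", "-preset", "veryfast", "-b:v", "2500k",
        "-c:a", "aac", "-b:a", "128k",
        "-f", "mpegts",
        TARGET_URL,
    ], check=True)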

Caller, listener, and rendezvous modes

SRT uses three connection modes. The mode decides which side starts the connection and how the session works through firewalls and NAT.

  • Listener: waits for an incoming SRT connection on a known UDP port. This is the common mode for a cloud ingest server or data center endpoint.
  • Caller: starts the connection to a listener. This is common for field encoders, OBS, vMix, FFmpeg, mobile apps, and remote sources.
  • Rendezvous: both sides initiate the connection. This can help in some NAT cases, but it should be tested carefully before production.

The most common production pattern is simple: the SRT server is listener, and the remote encoder is caller. This works well when the server has a public IP, a known UDP port, and clear firewall rules.
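
In URL form, the mode is usually an explicit query parameter. The addresses and port below are placeholders:

    # Server side: bind locally and wait for an incoming connection.
    listener_url = "srt://0.0.0.0:9000?mode=listener"

    # Encoder side: dial out to the server's public address.
    caller_url = "srt://203.0.113.10:9000?mode=caller"

    # Rendezvous: both sides initiate and must agree on ports in advance.
    rendezvous_url = "srt://203.0.113.10:9000?mode=rendezvous"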

SRT server ports and firewall rules

SRT uses UDP. That means the server must have the right UDP port open in the cloud security group, host firewall, router, or network policy.

Before testing an SRT server, check these points:

  • the server has a public IP or reachable network address
  • the correct UDP port is open
  • the encoder is using the correct caller/listener mode
  • the stream ID matches the server routing rule, if stream ID is used
  • the passphrase matches on both sides, if encryption is enabled
  • the receiving workflow is mapped to the correct downstream output

A common mistake is to check only that the encoder says “connected.” A connection is not enough. You also need to confirm that media is arriving, bitrate is stable, and the downstream system can use the stream.
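
Before the real server goes live, you can stand in for it with ffprobe and confirm that the encoder's media, not just its handshake, arrives as expected. This sketch assumes an FFmpeg build with SRT support and a placeholder port:

    import json
    import subprocess

    # ffprobe plays the listener role on the ingest port; the encoder
    # then connects as caller, exactly as it would on event day.
    PROBE_URL = "srt://0.0.0.0:9000?mode=listener"

    # Ask for the stream layout as JSON; raises TimeoutExpired if no
    # caller connects and sends analyzable media within 30 seconds.
    result = subprocess.run(
        ["ffprobe", "-v", "error",
         "-show_entries", "stream=codec_type,codec_name,width,height",
         "-of", "json", PROBE_URL],
        capture_output=True, text=True, timeout=30,
    )
    streams = json.loads(result.stdout or "{}").get("streams", [])
    if not streams:
        print("Handshake may have worked, but no usable media arrived.")
    for stream in streams:
        print(stream)  # e.g. {"codec_type": "video", "codec_name": "h264", ...}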

Where SRT servers fit in live streaming workflows

SRT is usually strongest on the contribution side of the workflow. It is the part where the live feed travels from the source to the platform.

A typical SRT workflow looks like this:

camera or encoder → SRT server → transcoder / recorder / restream / player workflow

After the SRT server receives the stream, the platform can prepare it for other tasks:

  • Restreaming: send the same source to Twitch, YouTube, Facebook, or another RTMP destination.
  • Recording: save the live contribution feed for replay, archive, or editing.
  • Transcoding: convert the incoming stream to another bitrate, resolution, or codec.
  • Playback: package the stream into HLS, WebRTC, or another viewer format.
  • Routing: forward the stream to another server, region, switcher, or production tool.

This is the practical role of an SRT server: it receives the live contribution feed and gives the rest of the system something stable to work with.
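
The restreaming case, for example, can be sketched as a plain copy from the SRT source to an RTMP ingest. The URLs and stream key are placeholders, and -c copy assumes the source is already H.264/AAC as the destination expects:

    import subprocess

    SRT_SOURCE = "srt://ingest.example.com:9000?mode=caller"    # placeholder
    RTMP_TARGET = "rtmp://a.rtmp.youtube.com/live2/STREAM-KEY"  # placeholder key

    # -c copy forwards the already-encoded stream untouched; -f flv is
    # the container RTMP expects.
    subprocess.run([
        "ffmpeg",
        "-i", SRT_SOURCE,
        "-c", "copy",
        "-f", "flv",
        RTMP_TARGET,
    ], check=True)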

SRT server vs RTMP server

SRT and RTMP are both used in live streaming, but they usually suit different jobs.

  • SRT server: better for contribution over unstable networks, long-distance paths, public internet, and professional ingest workflows.
  • RTMP server: still useful for simple publishing, legacy tools, and platforms that expect RTMP ingest.

Many production workflows use both. For example, a remote encoder sends SRT to Callaba, then Callaba restreams the signal to a social platform over RTMP or RTMPS.

If you are comparing protocols, read SRT vs RTMP.

SRT server vs HLS, WebRTC, and NDI

SRT is not a replacement for every video technology. It solves a specific part of the workflow.

  • SRT: best for live contribution, ingest, and transport between controlled endpoints.
  • HLS: best for large-scale viewer playback across browsers, TVs, and mobile devices.
  • WebRTC: best for interactive real-time video, calls, return feeds, and sub-second participation.
  • NDI: best for low-latency production networking inside controlled LAN or studio environments.

A clean live workflow often uses more than one technology. SRT brings the stream into the system. HLS, WebRTC, RTMP, or NDI may be used later depending on the output, audience, or production tool.
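
The handoff from SRT to a viewer format can be sketched as a packaging step: receive on a listener, then segment into HLS. The paths and segment settings below are illustrative assumptions:

    import subprocess

    # Receive on a local listener and package into 4-second HLS segments,
    # keeping a rolling playlist of the six most recent segments.
    subprocess.run([
        "ffmpeg",
        "-i", "srt://0.0.0.0:9000?mode=listener",
        "-c", "copy",
        "-f", "hls",
        "-hls_time", "4",
        "-hls_list_size", "6",
        "-hls_flags", "delete_segments",
        "/var/www/live/stream.m3u8",
    ], check=True)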

How to deploy an SRT server

The exact setup depends on your software, cloud provider, and workflow, but the deployment logic is usually the same.

  1. Create a server or cloud instance with enough CPU, network capacity, and storage for your workflow.
  2. Open the required UDP port in the cloud security group and host firewall.
  3. Create an SRT listener that will receive the incoming stream.
  4. Set stream ID and passphrase rules if you need routing and encryption.
  5. Connect the encoder as SRT caller and send the stream to the listener endpoint.
  6. Check live statistics such as bitrate, RTT, packet loss, retransmissions, and connection state.
  7. Route the stream downstream to recording, restreaming, transcoding, or playback.

For Callaba, you can create an SRT server from the dashboard or through the API. Then you can use that SRT server as an input for restreaming, recording, routing, or other live workflows.
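
Step 2 causes more failed first tests than any other item. As one concrete sketch, assuming AWS and boto3, the call below opens the UDP ingest port in a security group; the region, group ID, port, and CIDR are placeholders:

    import boto3

    ec2 = boto3.client("ec2", region_name="eu-west-1")  # assumed region

    # Allow inbound UDP on the SRT ingest port. Restrict the CIDR to your
    # encoder's network instead of 0.0.0.0/0 whenever you can.
    ec2.authorize_security_group_ingress(
        GroupId="sg-0123456789abcdef0",  # placeholder security group
        IpPermissions=[{
            "IpProtocol": "udp",
            "FromPort": 9000,
            "ToPort": 9000,
            "IpRanges": [{"CidrIp": "0.0.0.0/0",
                          "Description": "SRT ingest (tighten for production)"}],
        }],
    )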

How to use an SRT server in Callaba

Inside Callaba, an SRT server is usually used as a controlled ingest point. A remote source sends a stream to Callaba, and Callaba makes that stream available for the next workflow step.

Common Callaba workflows include (see the API sketch after this list):

  • SRT encoder to Callaba, then restream to Twitch or YouTube
  • OBS to Callaba over SRT, then record the stream
  • vMix to Callaba over SRT, then route the feed to another destination
  • mobile app to Callaba over SRT, then restream to social platforms
  • remote venue to Callaba, then package for browser playback
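
Any of these can also be scripted. The endpoint and payload below are purely hypothetical placeholders, not Callaba's documented API; check the actual API reference for the real routes and fields:

    import requests

    BASE = "https://callaba.example.internal"  # placeholder host
    TOKEN = "YOUR-API-TOKEN"                   # placeholder credential

    # Hypothetical request shape for creating an SRT ingest point; the
    # real Callaba routes and field names may differ.
    resp = requests.post(
        f"{BASE}/api/srt-servers",             # hypothetical route
        headers={"Authorization": f"Bearer {TOKEN}"},
        json={"name": "event-ingest", "mode": "listener", "port": 9000},
        timeout=10,
    )
    resp.raise_for_status()
    print(resp.json())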

What to monitor on an SRT server

A connected SRT session does not always mean the stream is healthy. Monitor both transport health and media health.

Transport signals

  • Connection state: connected, disconnected, reconnecting, or failed.
  • Incoming bitrate: whether media is still flowing at the expected rate.
  • RTT: round-trip time between sender and receiver.
  • Packet loss: how much data is being lost on the path.
  • Retransmissions: how often SRT has to recover missing packets.
  • Jitter: how much packet timing varies.
  • Receive buffer pressure: whether the connection is running too close to its recovery limit.

Media signals

  • black video
  • frozen video
  • missing audio
  • silent audio
  • wrong codec
  • wrong frame rate or resolution
  • bad timestamps
  • missing keyframes

This distinction matters. SRT can transport packets correctly while the media inside the stream is still wrong. For production workflows, always check both the SRT session and the actual audio/video payload.
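
A monitoring loop can reduce the transport signals to a few alert rules. The sketch below uses field names modeled on libsrt's statistics counters (msRTT, mbpsRecvRate, pktRcvLoss) and assumed thresholds; adapt both to whatever stats interface your server exposes:

    def check_transport(stats: dict) -> list[str]:
        """Flag transport problems from a single stats sample.

        Thresholds are assumptions to tune per workflow, not standards.
        """
        alerts = []
        if stats.get("msRTT", 0) > 250:
            alerts.append("RTT above 250 ms: raise latency or fix the path")
        if stats.get("mbpsRecvRate", 0) < 1.0:
            alerts.append("incoming bitrate under 1 Mbps: media may have stopped")
        if stats.get("pktRcvLoss", 0) > 100:
            alerts.append("heavy receive-side packet loss in this interval")
        return alerts

    # Example sample: healthy RTT, but the bitrate has collapsed.
    print(check_transport({"msRTT": 42.0, "mbpsRecvRate": 0.1, "pktRcvLoss": 3}))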

Common SRT server problems

The SRT connection does not start

Check the connection mode, UDP port, public IP, firewall rules, stream ID, and encryption passphrase. Most failed SRT handshakes come from a wrong mode, blocked UDP traffic, wrong port, or mismatched security settings.

The stream connects but video is unstable

Look at RTT, jitter, packet loss, retransmissions, and latency settings. If the latency value is too aggressive for the network path, SRT may not have enough time to recover lost packets before they become useless.

The stream connects but there is no audio

Check the encoder first. Make sure the audio source is enabled, the right audio device is selected, the audio codec is compatible with the next workflow step, and the receiving application can read the audio track.

The SRT stats look good but viewers still have problems

If the SRT link is healthy but viewers see stalls or artifacts, the problem may be downstream. Check transcoding, packaging, origin, CDN, player behavior, and output format. Do not blame the SRT server until you have checked the rest of the chain.

Self-hosted vs managed SRT server

You can run an SRT server yourself or use a managed platform. The better choice depends on how much operational control your team wants to own.

Use a self-hosted SRT server when:

  • you need full control over network placement
  • you need custom routing logic
  • you already operate media infrastructure
  • you need strict compliance or internal deployment rules
  • you have a team that can monitor and maintain the system

Use a managed SRT server when:

  • you need to launch quickly
  • you do not want to maintain the full media stack yourself
  • you need monitoring, routing, recording, or restreaming in one place
  • you want to reduce event-day operational risk
  • your team is small and does not want to carry 24/7 infrastructure ownership

Callaba can be used as a cloud or self-hosted SRT workflow platform. You can launch it on AWS, install it on your own server, create SRT ingest points, and connect those streams to restreaming, recording, routing, and API workflows.

Event-day checklist for an SRT server

  • Confirm the server IP or hostname.
  • Confirm the UDP port is open.
  • Confirm caller/listener/rendezvous mode on both sides.
  • Confirm stream ID, if used.
  • Confirm encryption passphrase, if used.
  • Confirm expected bitrate, codec, frame rate, resolution, and audio format.
  • Start the stream and verify incoming bitrate.
  • Check RTT, packet loss, retransmissions, and jitter.
  • Check actual video and audio, not only connection state.
  • Confirm the downstream route: recording, restreaming, transcoding, or playback.
  • Test the backup path before the event starts; a scripted version of this step is sketched below.
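
The last item can be scripted so both paths get the same treatment. Assuming an FFmpeg build with SRT support and placeholder endpoints, the sketch pushes five seconds of test pattern at each ingest; confirm on the server side that both pushes actually arrived.

    import subprocess

    # Placeholder ingest endpoints for the primary and backup paths.
    ENDPOINTS = {
        "primary": "srt://ingest-a.example.com:9000?mode=caller",
        "backup": "srt://ingest-b.example.com:9000?mode=caller",
    }

    for name, url in ENDPOINTS.items():
        # A zero exit code means the handshake and the media path both
        # worked for this endpoint, not merely that a socket opened.
        result = subprocess.run([
            "ffmpeg", "-v", "error",
            "-f", "lavfi", "-i", "testsrc2=rate=30", "-t", "5",
            "-c:v", "libx264", "-preset", "veryfast",
            "-f", "mpegts", url,
        ])
        status = "OK" if result.returncode == 0 else "FAILED"
        print(f"{name} ingest pre-flight: {status}")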

FAQ

What is an SRT server?

An SRT server is a live video ingest or relay endpoint that receives, sends, or routes SRT streams. It is usually used for contribution workflows where a live feed must travel from an encoder, camera, venue, studio, or cloud source into a controlled media system.

What is an SRT live server?

An SRT live server is an SRT server used for real-time live video contribution. It receives live video while an event is happening and passes the stream into recording, restreaming, transcoding, switching, or playback workflows.

Is an SRT server the same as a streaming server?

Not always. An SRT server usually handles live ingest or transport between controlled endpoints. A full streaming server or platform may also handle transcoding, recording, viewer playback, analytics, access control, and CDN delivery.

Does an SRT server use UDP?

Yes. SRT runs over UDP. That is why the correct UDP port must be open on the server firewall, cloud security group, router, or network policy.

What port does an SRT server use?

SRT does not require one universal fixed port. The port is defined by the server configuration. In production, teams usually reserve a clear UDP port or UDP port range for SRT ingest and document which feeds, tenants, or events use each port.

Do I need an SRT server for live streaming?

You need an SRT server when you want to receive SRT streams from encoders, cameras, remote sites, software tools, or partner systems. If you only stream directly to a platform that accepts RTMP, you may not need SRT. If the contribution path is unstable or important, SRT is often a better choice.

Can OBS send to an SRT server?

Yes. OBS can send video to an SRT server when configured with an SRT output URL. The SRT server receives the stream and can then route it into recording, restreaming, transcoding, or playback workflows.

Can vMix send to an SRT server?

Yes. vMix supports SRT workflows and can send or receive SRT streams. In a common setup, vMix sends an SRT feed to a server such as Callaba, and the server handles routing, recording, or restreaming.

Is SRT better than RTMP?

SRT is usually better than RTMP for live contribution over unstable, lossy, or long-distance networks. RTMP is still common for simple publishing and social platform ingest. Many workflows use SRT for contribution and RTMP or RTMPS for the final push to a platform.

Can browsers play SRT directly?

In normal web workflows, browsers do not play SRT directly. An SRT server usually receives the stream first, and then the platform converts or packages it into a viewer format such as HLS or WebRTC.

Why does my SRT stream connect but show no video?

The SRT connection may be working while the media payload is wrong. Check codec, container, timestamps, keyframes, audio tracks, stream mapping, and downstream compatibility. Also confirm that bitrate is actually arriving at the server.

How do I make an SRT server more reliable?

Use a stable server, open the correct UDP ports, choose realistic latency, monitor RTT and retransmissions, keep enough bandwidth headroom, validate the media payload, and test a backup endpoint before the event.
