Hello Community!
I have this diagram here:
My issue is low upstream bandwidth, and I am trying to find a way around it, maybe with a VM or cloud hosting solution.
I know that HAProxy is a load balancer, but can it balance the bandwidth on the frontend?
The problem is that, due to the low upstream, multiple users would each create their own connection through my ISP. I am trying to avoid this by creating one single upstream connection, then having that same stream repeated to each user without increasing the upstream usage.
If I have a single stream that connects at 6,000 Kbps (6 Mbps), can HAProxy repeat that same stream on the frontend without creating additional upstream connections?
I think what you’re looking for is some sort of caching or broadcasting solution, and which one heavily depends on what is meant by “streaming”. If you are live streaming, as shown in the diagram, then you need some sort of repeater that is not behind your ISP, perhaps a remote OBS server. To my knowledge, these types of streams cannot be shared or repeated except by the live stream server that the clients are connecting to.
If you mean streaming as in watching a YouTube-like video that is not live, that is typically done by chunking the video into pieces that HAProxy can cache, so that the stream server only has to serve each piece to HAProxy once every so often, and HAProxy serves it repeatedly as requested. This also assumes that HAProxy is not behind your ISP.
Hi,
Thanks for taking the time to reach back out to me.
I am not using OBS as the actual server; I am using a Linux-based server that broadcasts in .m3u8 format. The server sends the HLS stream in chunks, but at a faster speed.
Hopefully this gives you more insight into the following diagram.
Yes, I would be hosting HAProxy on a VPS provider for this task.
I have local access to the livestream server on site, but I am looking to make this one single upstream connection to HAProxy, then expand the connections out to each user from the VPS, if that makes sense.
Yes, live streaming as in broadcasting… sorry about that.
This is quite a bit outside my area of knowledge, so please allow others to confirm, or test it yourself if you can.
Just searching around, it looks like HLS can be proxied, but I can’t tell whether HLS can be cached. I found a forum thread about Caddy caching it and plenty about Nginx just being used as a reverse proxy, but those are not HAProxy.
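For what it’s worth, a live HLS stream is really just a small playlist text file that the player re-downloads every few seconds, plus the media segments it points at, all fetched as ordinary HTTP objects. The playlist changes constantly, but each segment never changes once published, which is why caching the segments might still help even if the playlist itself shouldn’t be cached for long. A rough example of such a live playlist (file names are hypothetical):

#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:6
#EXT-X-MEDIA-SEQUENCE:1042
#EXTINF:6.0,
segment1042.ts
#EXTINF:6.0,
segment1043.ts
#EXTINF:6.0,
segment1044.ts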
If you wanted to test HAProxy caching the content, you’d have to set it up in http mode. The configs might look something like this:
cache cache
    total-max-size 256         # RAM cache size in megabytes
    max-object-size 10485760   # max cacheable object size in bytes
    max-age 300                # max cache duration in seconds
    process-vary on            # handle the Vary header (otherwise don't cache)

frontend entrypoint
    mode http                  # caching only works in HTTP mode
    # For HTTP (no SSL/TLS)
    bind *:80
    # For HTTPS
    bind *:443 ssl crt /path/to/cert.pem
    http-request cache-use cache        # Respond to requests from the cache when possible
    http-response cache-store cache     # Store cacheable responses in the cache
    default_backend stream_server

backend stream_server
    mode http
    # Choose one:
    server streamserver 1.2.3.4:80 check                    # If Stream Server is listening on 80 for HTTP
    server streamserver 1.2.3.4:443 check ssl verify none   # If Stream Server is listening on 443 for HTTPS (verify none skips certificate checks; use ca-file to verify instead)
With this in place, you would point a couple of streaming clients at HAProxy’s IP address and watch the logs for <CACHE>. If it shows up, then it’s a fair bet that you can reduce the load on your source server this way.
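Note that the <CACHE> marker only appears if HAProxy is actually producing HTTP logs, so you would also want something like the following added (the syslog address is just an assumption; point it at wherever you collect logs):

global
    log 127.0.0.1:514 local0   # assumed local syslog daemon listening on UDP 514

frontend entrypoint
    log global                 # send this frontend's traffic logs to the global log target
    option httplog             # HTTP log format; on a cache hit the server name field shows <CACHE>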
HAProxy is the wrong tool for the job; this is not what a reverse proxy and load balancer does, at all.
You can receive a stream over unicast with VLC or dvblast (part of the VLC project) and then distribute it locally with multicast (or even unicast), so you don’t waste WAN bandwidth. I’m sure there are plenty of other tools that can do the same, but you have to look for video-specific solutions, not HTTP proxies.
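To sketch that idea (the source URL and multicast address are placeholders, not something tested here), VLC’s stream output can pull the stream once over the WAN and re-send it on the LAN:

# pull the HLS stream once and re-send it as an MPEG-TS multicast on the LAN
cvlc http://example.com/live/index.m3u8 --sout '#std{access=udp,mux=ts,dst=239.255.1.1:5004}' --sout-keep

# clients on the LAN then open udp://@239.255.1.1:5004 instead of the WAN URL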