HAProxy fails to display a website correctly that displays fine locally

website: https://juimonos.com/
Through HAProxy it shows missing images, and internally the admin area doesn't list users.
HAProxy IP:
The certificate shows as OK.

But internally, on my local area network, without HAProxy or HTTPS, the website loads everything fine, as expected:

webserver IP:
type: nginx virtual host named juimonos.com
port: 443

HAProxy configuration:
global
    log /dev/log local0
    log /dev/log local1 notice
    chroot /var/lib/haproxy
    stats socket /run/haproxy/admin.sock mode 660 level admin expose-fd listeners
    stats timeout 30s
    user haproxy
    group haproxy

    # Default SSL material locations
    ca-base /etc/ssl/certs
    crt-base /etc/ssl/private

    # See: https://ssl-config.mozilla.org/#server=haproxy&server-version=2.0.3&config=intermediate
    ssl-default-bind-ciphersuites TLS_AES_128_GCM_SHA256:TLS_AES_256_GCM_SHA384:TLS_CHACHA20_POLY1305_SHA256
    ssl-default-bind-options ssl-min-ver TLSv1.2 no-tls-tickets

defaults
    log global
    mode http
    option httplog
    option dontlognull
    timeout connect 3s
    timeout client 5s
    timeout server 5s
    timeout http-request 10s # preventing Slowloris-like attacks
    option http-buffer-request
    errorfile 400 /etc/haproxy/errors/400.http
    errorfile 403 /etc/haproxy/errors/403.http
    errorfile 408 /etc/haproxy/errors/408.http
    errorfile 500 /etc/haproxy/errors/500.http
    errorfile 502 /etc/haproxy/errors/502.http
    errorfile 503 /etc/haproxy/errors/503.http
    errorfile 504 /etc/haproxy/errors/504.http

frontend haprx01
    bind ssl crt /etc/ssl/sw.prem crt /etc/ssl/juimonos.prem crt /etc/ssl/sym.prem crt /etc/ssl/larifanet.prem crt /etc/ssl/ticofds.prem
    mode http
    http-request deny if HTTP_1.0
    http-request deny if { req.hdr(user-agent) -i -m sub curl phantomjs slimerjs }
    http-request deny unless { req.hdr(user-agent) -m found }
    http-request deny if { src -f /etc/haproxy/blacklist.acl }
    acl whitelist src -f /etc/haproxy/whitelist.lst

    # ACL for "serviciosymas.com"
    acl ACL_sym hdr(host) -i serviciosymas.com www.serviciosymas.com
    use_backend sym if ACL_sym

    # ACL for "juimonos.com"
    acl ACL_juimonos hdr(host) -i juimonos.com www.juimonos.com
    use_backend juimonos if ACL_juimonos

nginx virtual host config (SSL requested and installed here as well):
server {
    root /wbs/carpool/www/public/;
    index index.php index.html index.htm index.nginx-debian.html;
    server_name juimonos.com www.juimonos.com;

    location / {
            try_files $uri $uri/ /index.php?q=$uri&$args;
    }

    location ~ \.php$ {
       include snippets/fastcgi-php.conf;
       fastcgi_pass unix:/run/php/php7.4-fpm.sock;
    }

    listen [::]:443 ssl ipv6only=on; # managed by Certbot
    listen 443 ssl; # managed by Certbot
    ssl_certificate /etc/letsencrypt/live/juimonos.com/fullchain.pem; # managed by Certbot
    ssl_certificate_key /etc/letsencrypt/live/juimonos.com/privkey.pem; # managed by Certbot
    include /etc/letsencrypt/options-ssl-nginx.conf; # managed by Certbot
    ssl_dhparam /etc/letsencrypt/ssl-dhparams.pem; # managed by Certbot
}

server {
    if ($host = www.juimonos.com) {
        return 301 https://$host$request_uri;
    } # managed by Certbot

    if ($host = juimonos.com) {
        return 301 https://$host$request_uri;
    } # managed by Certbot

    listen 80;
    listen [::]:80;
    server_name juimonos.com www.juimonos.com;
    return 404; # managed by Certbot
}


Cloudflare setting: (screenshot not reproduced here)

Please help; I want this website to load fully over HTTPS, just like it loads normally over HTTP internally.

What am I doing wrong?

HAProxy version: 2.4

I checked the HAProxy logs under /var/log/ and found hundreds of errors saying, basically, that the connection was dropped before the request was even sent to the backend.
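Requests that HAProxy denies itself (never forwarding them to a backend) are logged with the termination state `PR--` and `<NOSRV>` in the server field. A quick way to count them is a sketch like the one below; the log line here is a made-up sample, and on a default Debian setup the real file is usually /var/log/haproxy.log:

```shell
# Count log lines with termination state "PR--", which means HAProxy
# itself denied the request before contacting any backend server.
# The sample line below is fabricated to mimic the HTTP log format.
printf '%s\n' \
  'haproxy[123]: 10.0.0.5:4432 [01/Jan/2024:00:00:00.000] haprx01~ haprx01/<NOSRV> -1/-1/-1/-1/0 403 188 - - PR-- 1/1/0/0/0 0/0 "GET /img/logo.png HTTP/1.1"' \
  | grep -c 'PR--'
# → 1
```

Against the real log, replace the `printf` with `cat /var/log/haproxy.log` to see how many requests are being denied.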

I went by process of elimination and found that this rule, which I had copied from some HAProxy guide, was the culprit:

http-request deny if { sc_http_req_rate(0) gt 10 }

Now, what is that rule for? It's supposed to block DDoS attacks, but is it malformed?

From the docs:

let’s say that you wanted to block any client making more than 10 requests per second. The http_req_rate(10s) counter that you added will report the number of requests over 10 seconds. So, to cap requests at 10 per second, set the limit to 100.

Maybe 10 requests per second is too low: on a site with lots of images, CSS, and JavaScript, 10 requests can easily be reached…
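Putting the docs quote into config form: `sc_http_req_rate(0)` only returns a meaningful count if the client is being tracked in a stick table, and since that counter is usually measured over a 10-second window, a cap of 10 requests per second means a limit of 10 × 10 = 100 per window, not 10. A sketch of the complete rule set (the table size, expiry, and window are assumptions, not values from the original config):

```
frontend haprx01
    # Track each client IP in a stick table that counts HTTP requests
    # over a sliding 10-second window.
    stick-table type ip size 100k expire 30s store http_req_rate(10s)
    http-request track-sc0 src

    # 10 requests/second * 10-second window = 100 requests per window.
    http-request deny deny_status 429 if { sc_http_req_rate(0) gt 100 }
```

With the original `gt 10`, a single page load pulling a dozen images or scripts would already exceed the limit, which matches the symptom of missing images and a half-broken admin page.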