HAProxy re-executing Master process

Hi everyone,

I’m observing a strange behavior which has occurred twice so far. For some reason HAProxy re-executes its master process every 21 days, which forks a new worker process and kills the old one. We lose the stats data too! Below are the HAProxy version, configuration files, and logs.

HAProxy Version:

HA-Proxy version 2.0.13-2ubuntu0.3 2021/08/27 - https://haproxy.org/
Build options :
  TARGET  = linux-glibc
  CPU     = generic
  CC      = gcc
  CFLAGS  = -O2 -g -O2 -fdebug-prefix-map=/build/haproxy-jeVpgs/haproxy-2.0.13=. -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fno-strict-aliasing -Wdeclaration-after-statement -fwrapv -Wno-address-of-packed-member -Wno-unused-label -Wno-sign-compare -Wno-unused-parameter -Wno-old-style-declaration -Wno-ignored-qualifiers -Wno-clobbered -Wno-missing-field-initializers -Wno-implicit-fallthrough -Wno-stringop-overflow -Wno-cast-function-type -Wtype-limits -Wshift-negative-value -Wshift-overflow=2 -Wduplicated-cond -Wnull-dereference
  OPTIONS = USE_PCRE2=1 USE_PCRE2_JIT=1 USE_REGPARM=1 USE_OPENSSL=1 USE_LUA=1 USE_ZLIB=1 USE_SYSTEMD=1

Feature list : +EPOLL -KQUEUE -MY_EPOLL -MY_SPLICE +NETFILTER -PCRE -PCRE_JIT +PCRE2 +PCRE2_JIT +POLL -PRIVATE_CACHE +THREAD -PTHREAD_PSHARED +REGPARM -STATIC_PCRE -STATIC_PCRE2 +TPROXY +LINUX_TPROXY +LINUX_SPLICE +LIBCRYPT +CRYPT_H -VSYSCALL +GETADDRINFO +OPENSSL +LUA +FUTEX +ACCEPT4 -MY_ACCEPT4 +ZLIB -SLZ +CPU_AFFINITY +TFO +NS +DL +RT -DEVICEATLAS -51DEGREES -WURFL +SYSTEMD -OBSOLETE_LINKER +PRCTL +THREAD_DUMP -EVPORTS

Default settings :
  bufsize = 16384, maxrewrite = 1024, maxpollevents = 200

Built with multi-threading support (MAX_THREADS=64, default=48).
Built with OpenSSL version : OpenSSL 1.1.1f  31 Mar 2020
Running on OpenSSL version : OpenSSL 1.1.1f  31 Mar 2020
OpenSSL library supports TLS extensions : yes
OpenSSL library supports SNI : yes
OpenSSL library supports : TLSv1.0 TLSv1.1 TLSv1.2 TLSv1.3
Built with Lua version : Lua 5.3.3
Built with network namespace support.
Built with transparent proxy support using: IP_TRANSPARENT IPV6_TRANSPARENT IP_FREEBIND
Built with zlib version : 1.2.11
Running on zlib version : 1.2.11
Compression algorithms supported : identity("identity"), deflate("deflate"), raw-deflate("deflate"), gzip("gzip")
Built with PCRE2 version : 10.34 2019-11-21
PCRE2 library supports JIT : yes
Encrypted password support via crypt(3): yes
Built with the Prometheus exporter as a service

Available polling systems :
      epoll : pref=300,  test result OK
       poll : pref=200,  test result OK
     select : pref=150,  test result OK
Total: 3 (3 usable), will use epoll.

Available multiplexer protocols :
(protocols marked as <default> cannot be specified using 'proto' keyword)
              h2 : mode=HTX        side=FE|BE     mux=H2
              h2 : mode=HTTP       side=FE        mux=H2
       <default> : mode=HTX        side=FE|BE     mux=H1
       <default> : mode=TCP|HTTP   side=FE|BE     mux=PASS

Available services :
	prometheus-exporter

Available filters :
	[SPOE] spoe
	[COMP] compression
	[CACHE] cache
	[TRACE] trace

Operating System: Ubuntu 20.04.2 LTS (Focal Fossa)

Systemd:

[Unit]
Description=HAProxy Load Balancer
Documentation=man:haproxy(1)
Documentation=file:/usr/share/doc/haproxy/configuration.txt.gz
After=network.target rsyslog.service

[Service]
EnvironmentFile=-/etc/default/haproxy
EnvironmentFile=-/etc/sysconfig/haproxy
Environment="CONFIG=/etc/haproxy/haproxy.cfg" "PIDFILE=/run/haproxy.pid" "EXTRAOPTS=-S /run/haproxy-master.sock"
ExecStartPre=/usr/sbin/haproxy -f $CONFIG -c -q $EXTRAOPTS
ExecStart=/usr/sbin/haproxy -Ws -f $CONFIG -p $PIDFILE $EXTRAOPTS
ExecReload=/usr/sbin/haproxy -f $CONFIG -c -q $EXTRAOPTS
ExecReload=/bin/kill -USR2 $MAINPID
KillMode=mixed
Restart=always
SuccessExitStatus=143
Type=notify

# The following lines leverage SystemD's sandboxing options to provide
# defense in depth protection at the expense of restricting some flexibility
# in your setup (e.g. placement of your configuration files) or possibly
# reduced performance. See systemd.service(5) and systemd.exec(5) for further
# information.

# NoNewPrivileges=true
# ProtectHome=true
# If you want to use 'ProtectSystem=strict' you should whitelist the PIDFILE,
# any state files and any other files written using 'ReadWritePaths' or
# 'RuntimeDirectory'.
# ProtectSystem=true
# ProtectKernelTunables=true

Configuration:

global
    log /dev/log local0
    log /dev/log local1 notice

    chroot /var/lib/haproxy

    stats socket /run/haproxy/admin.sock mode 660 level admin expose-fd listeners
    stats timeout 30s

    user haproxy
    group haproxy

    daemon
    maxconn 4000

    tune.ssl.default-dh-param 2048
    ssl-dh-param-file /etc/ssl/dhparam.pem

    ssl-default-bind-ciphers ALL:!RSA:!CAMELLIA:!aNULL:!eNULL:!LOW:!3DES:!MD5:!EXP:!PSK:!SRP:!DSS:!RC4:!SHA1:!SHA256:!SHA384
    ssl-default-bind-options ssl-min-ver TLSv1.2 no-tls-tickets

defaults
    log     global
    option  httplog
    option  dontlognull

    mode    http

    option httpchk HEAD /
    option redispatch

    timeout http-request    10s
    timeout queue           1m
    timeout connect         10s
    timeout client          1m
    timeout server          1m
    timeout http-keep-alive 10s
    timeout check           10s

    errorfile 400 /etc/haproxy/errors/400.http
    errorfile 403 /etc/haproxy/errors/403.http
    errorfile 408 /etc/haproxy/errors/408.http
    errorfile 500 /etc/haproxy/errors/500.http
    errorfile 502 /etc/haproxy/errors/502.http
    errorfile 503 /etc/haproxy/errors/503.http
    errorfile 504 /etc/haproxy/errors/504.http

frontend fe_http
    bind *:80
    use_backend %[req.hdr(Host),lower]

frontend fe_https
    bind *:443 ssl crt /etc/ssl/private/cert_key.pem alpn h2,http/1.1
    use_backend %[req.hdr(Host),lower]

backend pe.gw.local
    server s1 10.234.7.44:8000 weight 1 check rise 2 fall 5
    server s2 10.234.7.45:8000 weight 1 check rise 2 fall 5
    server s3 10.234.7.46:8000 weight 1 check rise 2 fall 5

frontend hastats_ro_prometheus_exporter
    bind *:9000
    option http-use-htx
    http-request use-service prometheus-exporter if { path /metrics }
    stats enable
    stats uri /stats
    http-response set-header Cache-Control "no-cache"
    stats refresh 10s

listen admin_hap_stats
    bind *:9001
    stats enable
    stats uri /stats
    http-response set-header Cache-Control "no-cache"
    stats refresh 10s
    stats admin if TRUE
    stats auth username:passw0rd

Logs:

On the 19th of August:

Aug 19 06:16:10 node-05 haproxy[3044578]: [WARNING] 230/061610 (3044578) : Reexecuting Master process
Aug 19 06:16:10 node-05 haproxy[3044578]: Proxy fe_http started.
Aug 19 06:16:10 node-05 haproxy[3044579]: Stopping frontend fe_http in 0 ms.
Aug 19 06:16:10 node-05 haproxy[3044579]: Stopping frontend fe_http in 0 ms.
Aug 19 06:16:10 node-05 haproxy[3044579]: Stopping frontend fe_https in 0 ms.
Aug 19 06:16:10 node-05 haproxy[3044579]: Stopping frontend fe_https in 0 ms.
Aug 19 06:16:10 node-05 haproxy[3044579]: Stopping backend pe.gw.local in 0 ms.
Aug 19 06:16:10 node-05 haproxy[3044579]: Stopping backend pe.gw.local in 0 ms.
Aug 19 06:16:10 node-05 haproxy[3044579]: Stopping frontend hastats_ro_prometheus_exporter in 0 ms.
Aug 19 06:16:10 node-05 haproxy[3044579]: Stopping frontend hastats_ro_prometheus_exporter in 0 ms.
Aug 19 06:16:10 node-05 haproxy[3044579]: Stopping proxy admin_hap_stats in 0 ms.
Aug 19 06:16:10 node-05 haproxy[3044578]: Proxy fe_http started.
Aug 19 06:16:10 node-05 haproxy[3044579]: [WARNING] 230/061610 (3044579) : Stopping frontend GLOBAL in 0 ms.
Aug 19 06:16:10 node-05 haproxy[3044579]: [WARNING] 230/061610 (3044579) : Stopping frontend fe_http in 0 ms.
Aug 19 06:16:10 node-05 haproxy[3044579]: [WARNING] 230/061610 (3044579) : Stopping frontend fe_https in 0 ms.
Aug 19 06:16:10 node-05 haproxy[3044579]: [WARNING] 230/061610 (3044579) : Stopping backend pe.gw.local in 0 ms.
Aug 19 06:16:10 node-05 haproxy[3044579]: [WARNING] 230/061610 (3044579) : Stopping frontend hastats_ro_prometheus_exporter in 0 ms.
Aug 19 06:16:10 node-05 haproxy[3044579]: [WARNING] 230/061610 (3044579) : Stopping proxy admin_hap_stats in 0 ms.
Aug 19 06:16:10 node-05 haproxy[3044578]: Proxy fe_https started.
Aug 19 06:16:10 node-05 haproxy[3044578]: [NOTICE] 230/061610 (3044578) : New worker #1 (4108017) forked
Aug 19 06:16:10 node-05 haproxy[3044578]: Proxy fe_https started.
Aug 19 06:16:10 node-05 haproxy[3044578]: Proxy pe.gw.local started.
Aug 19 06:16:10 node-05 haproxy[3044578]: Proxy pe.gw.local started.
Aug 19 06:16:10 node-05 haproxy[3044578]: Proxy hastats_ro_prometheus_exporter started.
Aug 19 06:16:10 node-05 haproxy[3044578]: Proxy hastats_ro_prometheus_exporter started.
Aug 19 06:16:10 node-05 haproxy[3044578]: Proxy admin_hap_stats started.
Aug 19 06:16:10 node-05 haproxy[3044578]: Proxy admin_hap_stats started.
Aug 19 06:16:10 node-05 haproxy[3044579]: [WARNING] 230/061610 (3044579) : Proxy GLOBAL stopped (FE: 1 conns, BE: 1 conns).
Aug 19 06:16:10 node-05 haproxy[3044579]: [WARNING] 230/061610 (3044579) : Proxy fe_http stopped (FE: 667909 conns, BE: 2211 conns).
Aug 19 06:16:10 node-05 haproxy[3044579]: Stopping proxy admin_hap_stats in 0 ms.
Aug 19 06:16:10 node-05 haproxy[3044579]: Proxy fe_http stopped (FE: 667909 conns, BE: 2211 conns).
Aug 19 06:16:10 node-05 haproxy[3044579]: Proxy fe_http stopped (FE: 667909 conns, BE: 2211 conns).
Aug 19 06:16:10 node-05 haproxy[3044579]: Proxy fe_https stopped (FE: 55702 conns, BE: 32222 conns).
Aug 19 06:16:10 node-05 haproxy[3044579]: [WARNING] 230/061610 (3044579) : Proxy fe_https stopped (FE: 55702 conns, BE: 32222 conns).
Aug 19 06:16:10 node-05 haproxy[3044579]: Proxy fe_https stopped (FE: 55702 conns, BE: 32222 conns).
Aug 19 06:16:10 node-05 haproxy[3044579]: Proxy pe.gw.local stopped (FE: 0 conns, BE: 1142360 conns).
Aug 19 06:16:10 node-05 haproxy[3044579]: [WARNING] 230/061610 (3044579) : Proxy pe.gw.local stopped (FE: 0 conns, BE: 1142360 conns).
Aug 19 06:16:10 node-05 haproxy[3044579]: Proxy pe.gw.local stopped (FE: 0 conns, BE: 1142360 conns).
Aug 19 06:16:10 node-05 haproxy[3044579]: Proxy hastats_ro_prometheus_exporter stopped (FE: 236257 conns, BE: 51041 conns).
Aug 19 06:16:10 node-05 haproxy[3044579]: [WARNING] 230/061610 (3044579) : Proxy hastats_ro_prometheus_exporter stopped (FE: 236257 conns, BE: 51041 conns).
Aug 19 06:16:10 node-05 haproxy[3044579]: Proxy hastats_ro_prometheus_exporter stopped (FE: 236257 conns, BE: 51041 conns).
Aug 19 06:16:10 node-05 haproxy[3044579]: Proxy admin_hap_stats stopped (FE: 21 conns, BE: 19 conns).
Aug 19 06:16:10 node-05 haproxy[3044579]: [WARNING] 230/061610 (3044579) : Proxy admin_hap_stats stopped (FE: 21 conns, BE: 19 conns).
Aug 19 06:16:11 node-05 haproxy[3044578]: [WARNING] 230/061611 (3044578) : Former worker #1 (3044579) exited with code 0 (Exit)
Aug 19 06:16:10 node-05 haproxy[3044579]: Proxy admin_hap_stats stopped (FE: 21 conns, BE: 19 conns).

On the 9th of September:

Sep  9 06:03:31 node-05 haproxy[3044578]: [WARNING] 251/060331 (3044578) : Reexecuting Master process
Sep  9 06:03:31 node-05 haproxy[3044578]: Proxy fe_http started.
Sep  9 06:03:31 node-05 haproxy[4108017]: Stopping frontend fe_http in 0 ms.
Sep  9 06:03:31 node-05 haproxy[4108017]: Stopping frontend fe_http in 0 ms.
Sep  9 06:03:31 node-05 haproxy[4108017]: Stopping frontend fe_https in 0 ms.
Sep  9 06:03:31 node-05 haproxy[4108017]: Stopping frontend fe_https in 0 ms.
Sep  9 06:03:31 node-05 haproxy[4108017]: Stopping backend pe.gw.local in 0 ms.
Sep  9 06:03:31 node-05 haproxy[4108017]: Stopping backend pe.gw.local in 0 ms.
Sep  9 06:03:31 node-05 haproxy[4108017]: Stopping frontend hastats_ro_prometheus_exporter in 0 ms.
Sep  9 06:03:31 node-05 haproxy[3044578]: Proxy fe_http started.
Sep  9 06:03:31 node-05 haproxy[3044578]: Proxy fe_https started.
Sep  9 06:03:31 node-05 haproxy[4108017]: Stopping frontend hastats_ro_prometheus_exporter in 0 ms.
Sep  9 06:03:31 node-05 haproxy[4108017]: Stopping proxy admin_hap_stats in 0 ms.
Sep  9 06:03:31 node-05 haproxy[4108017]: [WARNING] 251/060331 (4108017) : Stopping frontend GLOBAL in 0 ms.
Sep  9 06:03:31 node-05 haproxy[4108017]: Stopping proxy admin_hap_stats in 0 ms.
Sep  9 06:03:31 node-05 haproxy[4108017]: Proxy fe_http stopped (FE: 78633 conns, BE: 2869 conns).
Sep  9 06:03:31 node-05 haproxy[4108017]: [WARNING] 251/060331 (4108017) : Stopping frontend fe_http in 0 ms.
Sep  9 06:03:31 node-05 haproxy[4108017]: Proxy fe_http stopped (FE: 78633 conns, BE: 2869 conns).
Sep  9 06:03:31 node-05 haproxy[4108017]: Proxy fe_https stopped (FE: 106918 conns, BE: 1189 conns).
Sep  9 06:03:31 node-05 haproxy[4108017]: [WARNING] 251/060331 (4108017) : Stopping frontend fe_https in 0 ms.
Sep  9 06:03:31 node-05 haproxy[4108017]: Proxy fe_https stopped (FE: 106918 conns, BE: 1189 conns).
Sep  9 06:03:31 node-05 haproxy[4108017]: Proxy pe.gw.local stopped (FE: 0 conns, BE: 168650 conns).
Sep  9 06:03:31 node-05 haproxy[4108017]: [WARNING] 251/060331 (4108017) : Stopping backend pe.gw.local in 0 ms.
Sep  9 06:03:31 node-05 haproxy[4108017]: Proxy pe.gw.local stopped (FE: 0 conns, BE: 168650 conns).
Sep  9 06:03:31 node-05 haproxy[4108017]: Proxy hastats_ro_prometheus_exporter stopped (FE: 215256 conns, BE: 22848 conns).
Sep  9 06:03:31 node-05 haproxy[4108017]: [WARNING] 251/060331 (4108017) : Stopping frontend hastats_ro_prometheus_exporter in 0 ms.
Sep  9 06:03:31 node-05 haproxy[4108017]: Proxy hastats_ro_prometheus_exporter stopped (FE: 215256 conns, BE: 22848 conns).
Sep  9 06:03:31 node-05 haproxy[4108017]: Proxy admin_hap_stats stopped (FE: 3 conns, BE: 1 conns).
Sep  9 06:03:31 node-05 haproxy[4108017]: [WARNING] 251/060331 (4108017) : Stopping proxy admin_hap_stats in 0 ms.
Sep  9 06:03:31 node-05 haproxy[3044578]: Proxy fe_https started.
Sep  9 06:03:31 node-05 haproxy[3044578]: [NOTICE] 251/060331 (3044578) : New worker #1 (3380821) forked
Sep  9 06:03:31 node-05 haproxy[3044578]: Proxy pe.gw.local started.
Sep  9 06:03:31 node-05 haproxy[4108017]: [WARNING] 251/060331 (4108017) : Proxy GLOBAL stopped (FE: 1 conns, BE: 1 conns).
Sep  9 06:03:31 node-05 haproxy[4108017]: [WARNING] 251/060331 (4108017) : Proxy fe_http stopped (FE: 78633 conns, BE: 2869 conns).
Sep  9 06:03:31 node-05 haproxy[4108017]: [WARNING] 251/060331 (4108017) : Proxy fe_https stopped (FE: 106918 conns, BE: 1189 conns).
Sep  9 06:03:31 node-05 haproxy[4108017]: [WARNING] 251/060331 (4108017) : Proxy pe.gw.local stopped (FE: 0 conns, BE: 168650 conns).
Sep  9 06:03:31 node-05 haproxy[4108017]: [WARNING] 251/060331 (4108017) : Proxy hastats_ro_prometheus_exporter stopped (FE: 215256 conns, BE: 22848 conns).
Sep  9 06:03:31 node-05 haproxy[4108017]: [WARNING] 251/060331 (4108017) : Proxy admin_hap_stats stopped (FE: 3 conns, BE: 1 conns).
Sep  9 06:03:31 node-05 haproxy[3044578]: Proxy pe.gw.local started.
Sep  9 06:03:31 node-05 haproxy[3044578]: Proxy hastats_ro_prometheus_exporter started.
Sep  9 06:03:31 node-05 haproxy[3044578]: Proxy hastats_ro_prometheus_exporter started.
Sep  9 06:03:31 node-05 haproxy[3044578]: Proxy admin_hap_stats started.
Sep  9 06:03:31 node-05 haproxy[3044578]: Proxy admin_hap_stats started.
Sep  9 06:03:32 node-05 haproxy[3044578]: [WARNING] 251/060332 (3044578) : Former worker #1 (4108017) exited with code 0 (Exit)
Sep  9 06:03:31 node-05 haproxy[4108017]: Proxy admin_hap_stats stopped (FE: 3 conns, BE: 1 conns).

The same issue is happening on the other HAProxy node too! Is this a known issue (I can’t find it reported anywhere), or am I doing something wrong?

It either gets a SIGUSR2 signal from somewhere (your unit’s ExecReload= runs /bin/kill -USR2 $MAINPID, so anything calling systemctl reload haproxy would do it), or it gets the reload command on the master CLI socket (-S /run/haproxy-master.sock).

To pin it down, intercept signals with something like auditd or systemtap, and temporarily close down the admin CLI.
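
For example, a minimal auditd sketch (assuming a 64-bit kernel; 3044578 is the master PID from your logs, substitute whatever the current master PID is):

# log every kill() syscall aimed at the HAProxy master PID
auditctl -a always,exit -F arch=b64 -S kill -F a0=3044578 -k hap_reexec
# after the next re-exec, show which process and user sent the signal
ausearch -k hap_reexec -i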

Thanks for the reply! Will check and report back.
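
For reference, the first thing I’ll check is what else ran on the node around the re-exec timestamps (the date below is a placeholder, to be adjusted to the incident):

# anything calling 'systemctl reload haproxy' (cron, logrotate, a deploy job)
# should show up in the journal right around the event
journalctl --since "<date> 06:15:00" --until "<date> 06:17:00"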

Also, I have keepalived configured on these nodes, in case it’s relevant.

The config is as follows:

vrrp_script track_haproxy {
  script "killall -0 haproxy"
  interval 1
}

vrrp_instance V_1 {
  interface bond0
  state MASTER 
  virtual_router_id 97
  priority 100
  advert_int 1
  virtual_ipaddress {
    10.234.7.60/21
  }
  track_script {
    track_haproxy
  }
}
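
For what it’s worth, my understanding is that "killall -0" only performs an existence check (signal 0 is never actually delivered to the process), so the track_script itself shouldn’t be able to send anything that triggers the re-exec.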