HAProxy 2.2.3 stops accepting connections during Lua execution - what diagnostic info should I collect?

Greetings from down under :slight_smile:

I have configured HAProxy 2.2.3 on an RHEL7 VM to accept HTTPS browser traffic and forward it to an HTTP back end, while intercepting two specific URLs:

  • the redirect resulting from the Okta Authorization Code Flow (i.e. the user was successfully authenticated by Okta), and

  • a request for token introspection from the back-end application. The pertinent details are supplied as headers, and Lua code extracts those values (user name, JWT access token, and Okta refresh token). HAProxy invokes the Lua code to open a separate HTTPS conversation with Okta so that Okta performs the JWT token introspection - rather than using any 'roll your own' JWT validation library, we just ask Okta - and then returns 200 (good) or 401 (bad) depending on Okta's response.

At very low (unit-test) volumes (1 or 2 concurrent sessions) everything appears to work fine, but sometimes, during the Lua JWT token introspection routine, HAProxy just halts and stops accepting new connections (the logging in the Lua routines indicates that two or more instances of the introspection routine were running in parallel at the time). Nothing in the logs indicates a crash.
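In case it matters, the runtime API is reachable through the stats socket in my config, so next time it hangs I could try to pull state out of the live process over that socket. This is just my guess at what would be useful (assuming socat is installed; I believe these commands all exist in 2.2, and my build shows +THREAD_DUMP):

```shell
# Query the HAProxy runtime API over the admin socket from the config above.
# "show threads" should reveal whether a thread is stuck executing Lua;
# "show info" includes connection and memory counters;
# "show fd" lists the open file descriptors.
echo "show threads" | socat stdio /run/haproxy/admin.sock
echo "show info"    | socat stdio /run/haproxy/admin.sock
echo "show fd"      | socat stdio /run/haproxy/admin.sock
```

Would the output of these, captured while it is hung, be enough to work with?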

I captured what I hope is useful diagnostic info (below), but I was hoping for some guidance about where to actually start looking and what else I need to collect to get better help.

Not being a full-time Linux admin, I understand the concept of running out of memory, but I don't really know how to tell whether that is happening.
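For instance, is something like this a reasonable way to check? These are just my guesses at the right commands (24766 being the worker pid from the ps output further down):

```shell
# Resident and virtual memory of the haproxy worker process
grep -E 'VmRSS|VmSize' /proc/24766/status

# System-wide view: if "available" stays well above zero and swap use
# stays near zero, the box as a whole is probably not out of memory
free -k

# If the kernel's OOM killer had fired, it would be recorded here
dmesg | grep -iE 'oom|killed process'
```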

I also wondered whether I am expected to upgrade HAProxy to the most recent patch release before seeking help here?

haproxy config
global
    log /dev/log len 65535 local0 debug # I want all the log entries, not just debug
    chroot /var/lib/haproxy
    stats socket /run/haproxy/admin.sock mode 660 level admin expose-fd listeners
    stats timeout 30s
    daemon
    lua-load /home/someuser/base64.lua
    lua-load /home/someuser/executeOACF.lua
    lua-load /home/someuser/introspect.lua
    lua-load /home/someuser/dumpHeaders.lua

    # Trying to resolve badreq errors
    # https://serverfault.com/questions/291467/haproxy-badreq-errors
    tune.maxrewrite 16384
    tune.bufsize    32768

    # Default SSL material locations
    ca-base /etc/ssl/certs
    crt-base /etc/ssl/private    # this doesn't exist as of 25/09/2020

defaults
    log     global
    option  httplog
    mode    http
    timeout connect 60s
    timeout client  60s
    timeout server  60s

frontend theapp
    bind :443 ssl crt /etc/haproxy/pem/wildcard.pem
    bind :9000  # this is so people incorrectly connecting to port 9000 will get redirected to the ssl port

    option logasap

    acl http  ssl_fc,not
    http-request redirect scheme https code 301 unless { ssl_fc }

    # https://www.haproxy.com/blog/haproxy-and-http-strict-transport-security-hsts-header-in-http-redirects/
    http-response set-header Strict-Transport-Security "max-age=16000000; includeSubDomains; preload;"

    acl p_tokenvalidation path_beg /doTokenIntrospection

    # token validation looks for the header values first, and if missing, goes for query parameters
    # query format is accessToken=<access token without username>&refreshToken=<refresh token>
    http-request set-var(txn.token_active)       str("dummyTokenState")   if p_tokenvalidation !http

    # Update the value of token_active to 'true' or 'false' from the [active] json value
    # If the token was expired, this routine will refresh it, providing new values for access_token and refresh_token
    # The introspection routine will hunt for headers, and if they don't exist, look for query parameters
    http-request lua.executeIntrospect if p_tokenvalidation !http

    acl p_active var(txn.token_active) -m str true

    # Responses just for Token verification/renewal (always return body of true or false and only if successful return HAP headers)
    http-request return status 200 content-type text/plain lf-string "%[var(txn.token_active)]" hdr hap-access-token %[var(txn.access_token)] hdr hap-refresh-token %[var(txn.refresh_token)] hdr hap-user-name %[var(txn.b64name)]  if p_tokenvalidation p_active !http
    http-request return status 401 content-type text/plain lf-string "%[var(txn.token_active)]" if p_tokenvalidation !p_active !http

##############  We've completed all the special cases, just hand everything else off to the back end
    default_backend theapp_servers

frontend stats
    bind :80 ssl crt /etc/haproxy/pem/wildcard.pem
    stats enable
    stats show-node
    stats uri /stats
    stats refresh 300s
    stats admin if LOCALHOST

frontend stdws
    bind :442 ssl crt /etc/haproxy/pem/wildcard.pem
    default_backend stdws_servers

backend theapp_servers
    # https://discourse.haproxy.org/t/mixed-content-warning-when-using-https/981/9
    http-response set-header Content-Security-Policy upgrade-insecure-requests

    acl p_back_from_okta path_beg /backfromokta

    acl p_generate_headers always_true
    acl p_generate_query   always_false
    acl p_dumpHeaders      always_false

    http-request set-var(txn.access_token) str("dummaccesstoken")    if p_back_from_okta
    http-request set-var(txn.id_token) str("dummyidtoken")           if p_back_from_okta
    http-request set-var(txn.refresh_token) str("dummyrefreshtoken") if p_back_from_okta
    http-request set-var(txn.b64name) str("dummyb64name")            if p_back_from_okta
    http-request set-var(txn.username) str("dummyusername")          if p_back_from_okta
    http-request set-var(txn.okta_state_UID) str("dummystate")       if p_back_from_okta

    # Conduct OACF flow to exchange code for tokens
    http-request lua.executeOACF                                     if p_back_from_okta

    # Turn each query parameter extracted from the okta response into a separate header
    http-request set-header HAP-Access-Token  %[var(txn.access_token)]   if p_back_from_okta p_generate_headers
    http-request set-header HAP-User-Name     %[var(txn.b64name)]        if p_back_from_okta p_generate_headers
    http-request set-header HAP-Refresh-Token %[var(txn.refresh_token)]  if p_back_from_okta p_generate_headers

    http-request lua.dumpHeaders 'be-req' if p_back_from_okta p_generate_headers p_dumpHeaders

    # the state parameter is supposed to be generated by theapp, but currently hard-coded in theapp core.properties file, but we aren't shipping it on to theapp because they don't handle it yet

    # Use a rewrite to avoid exposing the tokens to the end user
    # two commands based on whether the query needs to be supplied, if the headers were generated, they are attached to whichever uri is passed on to the back end
    http-request set-uri https://haproxy-vm.pdqn/web/theapp/home?HAP-Access-Token=%[var(txn.access_token)]&HAP-User-Name=%[var(txn.b64name)]&HAP-Refresh-Token=%[var(txn.refresh_token)]&state=%[var(txn.okta_state_UID)] if p_back_from_okta p_generate_query
    http-request set-uri https://haproxy-vm.pdqn/web/theapp/home if p_back_from_okta p_generate_headers

    balance roundrobin
    server theapp_9000 theapp.pdqn:9000 check

backend stdws_servers
    balance roundrobin
    server stdws_9001 theapp.pdqn:9001 check

resolvers dnsresolver
    nameserver dns1 192.168.10.10:53
    resolve_retries 3
    timeout retry 1s
    hold nx 10s
    hold valid 10s

haproxy version info
/opt/haproxy/sbin/haproxy -vv
HA-Proxy version 2.2.3-0e58a34 2020/09/08 - https://haproxy.org/
Status: long-term supported branch - will stop receiving fixes around Q2 2025.
Known bugs: http://www.haproxy.org/bugs/bugs-2.2.3.html
Running on: Linux 3.10.0-1160.2.1.el7.x86_64 #1 SMP Mon Sep 21 21:00:09 EDT 2020 x86_64
Build options :
TARGET = linux-glibc
CPU = generic
CC = gcc
CFLAGS = -O2 -g -Wall -Wextra -Wdeclaration-after-statement -fwrapv -Wno-unused-label -Wno-sign-compare -Wno-unused-parameter -Wno-clobbered -Wno-missing-field-initializers -Wtype-limits
OPTIONS = USE_PCRE=1 USE_OPENSSL=1 USE_LUA=1 USE_ZLIB=1 USE_SYSTEMD=1

Feature list : +EPOLL -KQUEUE +NETFILTER +PCRE -PCRE_JIT -PCRE2 -PCRE2_JIT +POLL -PRIVATE_CACHE +THREAD -PTHREAD_PSHARED +BACKTRACE -STATIC_PCRE -STATIC_PCRE2 +TPROXY +LINUX_TPROXY +LINUX_SPLICE +LIBCRYPT +CRYPT_H +GETADDRINFO +OPENSSL +LUA +FUTEX +ACCEPT4 +ZLIB -SLZ +CPU_AFFINITY +TFO +NS +DL +RT -DEVICEATLAS -51DEGREES -WURFL +SYSTEMD -OBSOLETE_LINKER +PRCTL +THREAD_DUMP -EVPORTS

Default settings :
  bufsize = 16384, maxrewrite = 1024, maxpollevents = 200

Built with multi-threading support (MAX_THREADS=64, default=4).
Built with OpenSSL version : OpenSSL 1.0.2k-fips  26 Jan 2017
Running on OpenSSL version : OpenSSL 1.0.2k-fips  26 Jan 2017
OpenSSL library supports TLS extensions : yes
OpenSSL library supports SNI : yes
OpenSSL library supports : SSLv3 TLSv1.0 TLSv1.1 TLSv1.2
Built with Lua version : Lua 5.3.5
Built with network namespace support.
Built with zlib version : 1.2.7
Running on zlib version : 1.2.7
Compression algorithms supported : identity("identity"), deflate("deflate"), raw-deflate("deflate"), gzip("gzip")
Built with transparent proxy support using: IP_TRANSPARENT IPV6_TRANSPARENT IP_FREEBIND
Built with PCRE version : 8.32 2012-11-30
Running on PCRE version : 8.32 2012-11-30
PCRE library supports JIT : no (USE_PCRE_JIT not set)
Encrypted password support via crypt(3): yes
Built with gcc compiler version 4.8.5 20150623 (Red Hat 4.8.5-44)

Available polling systems :
      epoll : pref=300,  test result OK
       poll : pref=200,  test result OK
     select : pref=150,  test result OK
Total: 3 (3 usable), will use epoll.

Available multiplexer protocols :
(protocols marked as <default> cannot be specified using 'proto' keyword)
            fcgi : mode=HTTP       side=BE        mux=FCGI
       <default> : mode=HTTP       side=FE|BE     mux=H1
              h2 : mode=HTTP       side=FE|BE     mux=H2
       <default> : mode=TCP        side=FE|BE     mux=PASS

Available services : none

Available filters :
        [SPOE] spoe
        [COMP] compression
        [TRACE] trace
        [CACHE] cache
        [FCGI] fcgi-app

lua version info
lua -v
Lua 5.3.5 Copyright © 1994-2018 Lua.org, PUC-Rio

what am I running haproxy on
uname -a
Linux haproxy-vm 3.10.0-1160.2.1.el7.x86_64 #1 SMP Mon Sep 21 21:00:09 EDT 2020 x86_64 x86_64 x86_64 GNU/Linux

what was running
[someuser@haproxy-vm ~]$ ps -ef | grep hap
root 24764 1 0 18:06 ? 00:00:00 /opt/haproxy/sbin/haproxy -Ws -f /etc/haproxy/haproxy.cfg -p /run/haproxy.pid -S /run/haproxy-master.sock
root 24766 24764 99 18:06 ? 00:37:35 /opt/haproxy/sbin/haproxy -Ws -f /etc/haproxy/haproxy.cfg -p /run/haproxy.pid -S /run/haproxy-master.sock

memory in the machine at the time haproxy had stopped (just prior to restarting haproxy)
[someuser@haproxy-vm ~]$ free
              total        used        free      shared  buff/cache   available
Mem:        6642448     3694548      286512      201428     2661388     2678904
Swap:      20967420        2048    20965372

what was in the mix (the machine only runs haproxy, nothing else)
top
top - 18:22:21 up 40 days, 19:46, 1 user, load average: 2.86, 2.73, 1.97
Tasks: 244 total, 1 running, 243 sleeping, 0 stopped, 0 zombie
%Cpu(s): 20.8 us, 58.3 sy, 0.0 ni, 20.8 id, 0.0 wa, 0.0 hi, 0.0 si, 0.0 st
KiB Mem : 6642448 total, 280876 free, 3699924 used, 2661648 buff/cache
KiB Swap: 20967420 total, 20965372 free, 2048 used. 2673440 avail Mem
PID USER PR NI VIRT RES SHR S %CPU %MEM TIME+ COMMAND
24766 root 20 0 330072 23320 3272 S 276.5 0.4 43:02.20 haproxy
26091 root 20 0 162248 2668 1820 R 5.9 0.0 0:00.05 top
1 root 20 0 194220 7348 4220 S 0.0 0.1 13:29.90 systemd
2 root 20 0 0 0 0 S 0.0 0.0 0:03.16 kthreadd
4 root 0 -20 0 0 0 S 0.0 0.0 0:00.00 kworker/0:0H
6 root 20 0 0 0 0 S 0.0 0.0 0:12.02 ksoftirqd/0
7 root rt 0 0 0 0 S 0.0 0.0 0:05.23 migration/0
8 root 20 0 0 0 0 S 0.0 0.0 0:00.00 rcu_bh
9 root 20 0 0 0 0 S 0.0 0.0 12:34.53 rcu_sched
10 root 0 -20 0 0 0 S 0.0 0.0 0:00.00 lru-add-drain
11 root rt 0 0 0 0 S 0.0 0.0 0:16.53 watchdog/0
12 root rt 0 0 0 0 S 0.0 0.0 0:14.16 watchdog/1
13 root rt 0 0 0 0 S 0.0 0.0 0:05.23 migration/1
14 root 20 0 0 0 0 S 0.0 0.0 0:10.69 ksoftirqd/1
16 root 0 -20 0 0 0 S 0.0 0.0 0:00.00 kworker/1:0H
17 root rt 0 0 0 0 S 0.0 0.0 0:13.56 watchdog/2
18 root rt 0 0 0 0 S 0.0 0.0 0:05.69 migration/2
19 root 20 0 0 0 0 S 0.0 0.0 0:10.35 ksoftirqd/2
21 root 0 -20 0 0 0 S 0.0 0.0 0:00.00 kworker/2:0H
22 root rt 0 0 0 0 S 0.0 0.0 0:13.00 watchdog/3
23 root rt 0 0 0 0 S 0.0 0.0 0:05.26 migration/3
24 root 20 0 0 0 0 S 0.0 0.0 0:09.87 ksoftirqd/3
26 root 0 -20 0 0 0 S 0.0 0.0 0:00.00 kworker/3:0H
28 root 20 0 0 0 0 S 0.0 0.0 0:00.00 kdevtmpfs
29 root 0 -20 0 0 0 S 0.0 0.0 0:00.00 netns
30 root 20 0 0 0 0 S 0.0 0.0 0:01.58 khungtaskd
31 root 0 -20 0 0 0 S 0.0 0.0 0:00.00 writeback
32 root 0 -20 0 0 0 S 0.0 0.0 0:00.00 kintegrityd
33 root 0 -20 0 0 0 S 0.0 0.0 0:00.00 bioset
34 root 0 -20 0 0 0 S 0.0 0.0 0:00.00 bioset
35 root 0 -20 0 0 0 S 0.0 0.0 0:00.00 bioset
36 root 0 -20 0 0 0 S 0.0 0.0 0:00.00 kblockd
37 root 0 -20 0 0 0 S 0.0 0.0 0:00.00 md
38 root 0 -20 0 0 0 S 0.0 0.0 0:00.00 edac-poller
39 root 0 -20 0 0 0 S 0.0 0.0 0:00.00 watchdogd
45 root 20 0 0 0 0 S 0.0 0.0 0:00.41 kswapd0
46 root 25 5 0 0 0 S 0.0 0.0 0:00.00 ksmd
47 root 39 19 0 0 0 S 0.0 0.0 0:08.05 khugepaged
48 root 0 -20 0 0 0 S 0.0 0.0 0:00.00 crypto
56 root 0 -20 0 0 0 S 0.0 0.0 0:00.00 kthrotld
58 root 0 -20 0 0 0 S 0.0 0.0 0:00.00 kmpath_rdacd
59 root 0 -20 0 0 0 S 0.0 0.0 0:00.00 kaluad
61 root 0 -20 0 0 0 S 0.0 0.0 0:00.00 kpsmoused
62 root 0 -20 0 0 0 S 0.0 0.0 0:00.00 ipv6_addrconf
76 root 0 -20 0 0 0 S 0.0 0.0 0:00.00 deferwq
112 root 20 0 0 0 0 S 0.0 0.0 0:01.34 kauditd
666 root 0 -20 0 0 0 S 0.0 0.0 0:00.00 hv_vmbus_con
667 root 0 -20 0 0 0 S 0.0 0.0 0:00.00 hv_pri_chan
669 root 0 -20 0 0 0 S 0.0 0.0 0:00.00 hv_sub_chan
699 root 20 0 0 0 0 S 0.0 0.0 0:00.00 scsi_eh_0
701 root 0 -20 0 0 0 S 0.0 0.0 0:00.00 scsi_tmf_0
702 root 0 -20 0 0 0 S 0.0 0.0 0:00.00 storvsc_error_w
771 root 0 -20 0 0 0 S 0.0 0.0 0:00.00 kdmflush
772 root 0 -20 0 0 0 S 0.0 0.0 0:00.00 bioset
781 root 0 -20 0 0 0 S 0.0 0.0 0:00.00 kdmflush
782 root 0 -20 0 0 0 S 0.0 0.0 0:00.00 bioset
806 root 0 -20 0 0 0 S 0.0 0.0 0:00.00 bioset
807 root 0 -20 0 0 0 S 0.0 0.0 0:00.00 xfsalloc
808 root 0 -20 0 0 0 S 0.0 0.0 0:00.00 xfs_mru_cache
817 root 0 -20 0 0 0 S 0.0 0.0 0:11.38 kworker/0:1H
827 root 0 -20 0 0 0 S 0.0 0.0 0:00.00 xfs-buf/dm-0
828 root 0 -20 0 0 0 S 0.0 0.0 0:00.00 xfs-data/dm-0
829 root 0 -20 0 0 0 S 0.0 0.0 0:00.00 xfs-conv/dm-0
830 root 0 -20 0 0 0 S 0.0 0.0 0:00.00 xfs-cil/dm-0

more detailed memory info
cat /proc/meminfo
MemTotal: 6642448 kB
MemFree: 281660 kB
MemAvailable: 2674076 kB
Buffers: 52 kB
Cached: 2471304 kB
SwapCached: 60 kB
Active: 1710496 kB
Inactive: 1198720 kB
Active(anon): 317676 kB
Inactive(anon): 321612 kB
Active(file): 1392820 kB
Inactive(file): 877108 kB
Unevictable: 0 kB
Mlocked: 0 kB
SwapTotal: 20967420 kB
SwapFree: 20965372 kB
Dirty: 16 kB
Writeback: 0 kB
AnonPages: 437848 kB
Mapped: 119548 kB
Shmem: 201428 kB
Slab: 276648 kB
SReclaimable: 190056 kB
SUnreclaim: 86592 kB
KernelStack: 10400 kB
PageTables: 22796 kB
NFS_Unstable: 0 kB
Bounce: 0 kB
WritebackTmp: 0 kB
CommitLimit: 24288644 kB
Committed_AS: 2685748 kB
VmallocTotal: 34359738367 kB
VmallocUsed: 253604 kB
VmallocChunk: 34359476360 kB
Percpu: 108480 kB
HardwareCorrupted: 0 kB
AnonHugePages: 139264 kB
CmaTotal: 0 kB
CmaFree: 0 kB
HugePages_Total: 0
HugePages_Free: 0
HugePages_Rsvd: 0
HugePages_Surp: 0
Hugepagesize: 2048 kB
DirectMap4k: 169372 kB
DirectMap2M: 6776832 kB
DirectMap1G: 0 kB

the tail of my (sanitised) log
[someuser@haproxy-vm ~]$ tail -n1000 /var/log/haproxy.log
Dec 10 18:09:23 haproxy-vm haproxy[24766]: introspect.lua: - active=true
Dec 10 18:09:23 haproxy-vm haproxy[24766]: introspect.lua: - access_token=######
Dec 10 18:09:23 haproxy-vm haproxy[24766]: introspect.lua: - refresh_token=norefreshtokensupplied
Dec 10 18:09:23 haproxy-vm haproxy[24766]: introspect.lua: - headers 22 and stream 0 should have arrived
Dec 10 18:09:23 haproxy-vm haproxy[24766]: introspect.lua: - okta introspection result = 200
Dec 10 18:09:23 haproxy-vm haproxy[24766]: introspect.lua: - okta introspection call content ######
Dec 10 18:09:23 haproxy-vm haproxy[24766]: introspect.lua: - Exiting Okta token introspection routine
Dec 10 18:09:23 haproxy-vm haproxy[24766]: 172.18.131.111:12283 [10/Dec/2020:18:09:20.665] theapp~ theapp/ 3181/-1/-1/-1/+3181 200 +948 - - LR-- 3/3/0/0/0 0/0 “GET /doTokenIntrospection HTTP/1.1”
Dec 10 18:09:23 haproxy-vm haproxy[24766]: introspect.lua: - active=true
Dec 10 18:09:23 haproxy-vm haproxy[24766]: introspect.lua: - access_token=######
Dec 10 18:09:23 haproxy-vm haproxy[24766]: introspect.lua: - refresh_token=norefreshtokensupplied
Dec 10 18:09:23 haproxy-vm haproxy[24766]: introspect.lua: - Exiting Okta token introspection routine
Dec 10 18:09:23 haproxy-vm haproxy[24766]: 172.18.131.111:12308 [10/Dec/2020:18:09:20.983] theapp~ theapp/ 2924/-1/-1/-1/+2924 200 +1293 - - LR-- 3/3/0/0/0 0/0 “GET /doTokenIntrospection HTTP/1.1”
Dec 10 18:09:23 haproxy-vm haproxy[24766]: introspect.lua: - headers 22 and stream 0 should have arrived
Dec 10 18:09:23 haproxy-vm haproxy[24766]: introspect.lua: - okta introspection result = 200
Dec 10 18:09:23 haproxy-vm haproxy[24766]: introspect.lua: - okta introspection call content ######
Dec 10 18:09:23 haproxy-vm haproxy[24766]: introspect.lua: - active=true
Dec 10 18:09:23 haproxy-vm haproxy[24766]: introspect.lua: - access_token=######
Dec 10 18:09:23 haproxy-vm haproxy[24766]: introspect.lua: - refresh_token=norefreshtokensupplied
Dec 10 18:09:23 haproxy-vm haproxy[24766]: introspect.lua: - Exiting Okta token introspection routine
Dec 10 18:09:23 haproxy-vm haproxy[24766]: 172.18.131.111:12291 [10/Dec/2020:18:09:20.921] theapp~ theapp/ 2927/-1/-1/-1/+2927 200 +948 - - LR-- 3/3/0/0/0 0/0 “GET /doTokenIntrospection HTTP/1.1”
Dec 10 18:09:23 haproxy-vm haproxy[24766]: introspect.lua: - Okta token introspection start
Dec 10 18:09:23 haproxy-vm haproxy[24766]: introspect.lua: - Okta token introspection fault - no refresh token
Dec 10 18:09:23 haproxy-vm haproxy[24766]: introspect.lua: - Received access token #######
Dec 10 18:09:23 haproxy-vm haproxy[24766]: introspect.lua: - Received refresh token “norefreshtokensupplied”
Dec 10 18:09:23 haproxy-vm haproxy[24766]: introspect.lua: - Received user name “0oa1jxm3q4eQamlnm0h8”
Dec 10 18:09:23 haproxy-vm haproxy[24766]: introspect.lua: - HAP-User-Name & sub claim (without @domain) from HAP-Access-Token match
Dec 10 18:09:23 haproxy-vm haproxy[24766]: introspect.lua: - About to call https://customokta.okta.com/oauth2/default/v1/introspect?token=###########&token_type_hint=access_token
Dec 10 18:09:24 haproxy-vm haproxy[24766]: introspect.lua: - Okta token introspection start
Dec 10 18:09:24 haproxy-vm haproxy[24766]: introspect.lua: - Received access token #######
Dec 10 18:09:24 haproxy-vm haproxy[24766]: introspect.lua: - Received refresh token “norefreshtokensupplied”
Dec 10 18:09:24 haproxy-vm haproxy[24766]: introspect.lua: - Received user name “0oa1jxm3q4eQamlnm0h8”
Dec 10 18:09:24 haproxy-vm haproxy[24766]: introspect.lua: - HAP-User-Name & sub claim (without @domain) from HAP-Access-Token match
Dec 10 18:09:24 haproxy-vm haproxy[24766]: introspect.lua: - About to call https://customokta.okta.com/oauth2/default/v1/introspect?token=###########&token_type_hint=access_token
Dec 10 18:09:25 haproxy-vm haproxy[24766]: introspect.lua: - Okta token introspection start
Dec 10 18:09:25 haproxy-vm haproxy[24766]: introspect.lua: - Received access token #######
Dec 10 18:09:25 haproxy-vm haproxy[24766]: introspect.lua: - Received refresh token “norefreshtokensupplied”
Dec 10 18:09:25 haproxy-vm haproxy[24766]: introspect.lua: - Received user name “TestSSTAgent”
Dec 10 18:09:25 haproxy-vm haproxy[24766]: introspect.lua: - HAP-User-Name & sub claim (without @domain) from HAP-Access-Token match
Dec 10 18:09:25 haproxy-vm haproxy[24766]: introspect.lua: - About to call https://customokta.okta.com/oauth2/default/v1/introspect?token=###########&token_type_hint=access_token
Dec 10 18:09:26 haproxy-vm haproxy[24766]: introspect.lua: - headers 22 and stream 0 should have arrived
Dec 10 18:09:26 haproxy-vm haproxy[24766]: introspect.lua: - okta introspection result = 200
Dec 10 18:09:26 haproxy-vm haproxy[24766]: introspect.lua: - okta introspection call content ######
Dec 10 18:09:26 haproxy-vm haproxy[24766]: introspect.lua: - active=true
Dec 10 18:09:26 haproxy-vm haproxy[24766]: introspect.lua: - access_token=######
Dec 10 18:09:26 haproxy-vm haproxy[24766]: introspect.lua: - refresh_token=norefreshtokensupplied
Dec 10 18:09:26 haproxy-vm haproxy[24766]: introspect.lua: - Exiting Okta token introspection routine
Dec 10 18:09:26 haproxy-vm haproxy[24766]: 172.18.131.111:12308 [10/Dec/2020:18:09:24.223] theapp~ theapp/ 2714/-1/-1/-1/+2714 200 +948 - - LR-- 3/3/0/0/0 0/0 “GET /doTokenIntrospection HTTP/1.1”
Dec 10 18:09:26 haproxy-vm haproxy[24766]: introspect.lua: - headers 22 and stream 0 should have arrived
Dec 10 18:09:26 haproxy-vm haproxy[24766]: introspect.lua: - okta introspection result = 200
Dec 10 18:09:26 haproxy-vm haproxy[24766]: introspect.lua: - okta introspection call content ######
Dec 10 18:09:26 haproxy-vm haproxy[24766]: introspect.lua: - active=true
Dec 10 18:09:26 haproxy-vm haproxy[24766]: introspect.lua: - access_token=######
Dec 10 18:09:26 haproxy-vm haproxy[24766]: introspect.lua: - refresh_token=norefreshtokensupplied
Dec 10 18:09:26 haproxy-vm haproxy[24766]: introspect.lua: - Exiting Okta token introspection routine
Dec 10 18:09:26 haproxy-vm haproxy[24766]: 172.18.131.111:12283 [10/Dec/2020:18:09:23.935] theapp~ theapp/ 3003/-1/-1/-1/+3003 200 +948 - - LR-- 3/3/0/0/0 0/0 “GET /doTokenIntrospection HTTP/1.1”
Dec 10 18:09:26 haproxy-vm haproxy[24766]: introspect.lua: - headers 22 and stream 0 should have arrived
Dec 10 18:09:26 haproxy-vm haproxy[24766]: introspect.lua: - okta introspection result = 200
Dec 10 18:09:26 haproxy-vm haproxy[24766]: introspect.lua: - okta introspection call content ######
Dec 10 18:09:26 haproxy-vm haproxy[24766]: introspect.lua: - active=true
Dec 10 18:09:26 haproxy-vm haproxy[24766]: introspect.lua: - access_token=######
Dec 10 18:09:26 haproxy-vm haproxy[24766]: introspect.lua: - refresh_token=norefreshtokensupplied
Dec 10 18:09:26 haproxy-vm haproxy[24766]: introspect.lua: - Exiting Okta token introspection routine
Dec 10 18:09:26 haproxy-vm haproxy[24766]: 172.18.131.111:12291 [10/Dec/2020:18:09:24.048] theapp~ theapp/ 2892/-1/-1/-1/+2892 200 +1293 - - LR-- 3/3/0/0/0 0/0 “GET /doTokenIntrospection HTTP/1.1”
Dec 10 18:09:27 haproxy-vm haproxy[24766]: introspect.lua: - Okta token introspection start
Dec 10 18:09:27 haproxy-vm haproxy[24766]: introspect.lua: - Received access token #######
Dec 10 18:09:27 haproxy-vm haproxy[24766]: introspect.lua: - Received refresh token “norefreshtokensupplied”
Dec 10 18:09:27 haproxy-vm haproxy[24766]: introspect.lua: - Received user name “0oa1jxm3q4eQamlnm0h8”
Dec 10 18:09:27 haproxy-vm haproxy[24766]: introspect.lua: - HAP-User-Name & sub claim (without @domain) from HAP-Access-Token match
Dec 10 18:09:27 haproxy-vm haproxy[24766]: introspect.lua: - About to call https://customokta.okta.com/oauth2/default/v1/introspect?token=###########&token_type_hint=access_token
Dec 10 18:09:28 haproxy-vm haproxy[24766]: introspect.lua: - Okta token introspection start
Dec 10 18:09:28 haproxy-vm haproxy[24766]: introspect.lua: - Received access token #######
Dec 10 18:09:28 haproxy-vm haproxy[24766]: introspect.lua: - Received refresh token “norefreshtokensupplied”
Dec 10 18:09:28 haproxy-vm haproxy[24766]: introspect.lua: - Received user name “TestSSTAgent”
Dec 10 18:09:28 haproxy-vm haproxy[24766]: introspect.lua: - Okta token introspection start
Dec 10 18:09:28 haproxy-vm haproxy[24766]: introspect.lua: - Received access token #######
Dec 10 18:09:28 haproxy-vm haproxy[24766]: introspect.lua: - Received refresh token “norefreshtokensupplied”
Dec 10 18:09:28 haproxy-vm haproxy[24766]: introspect.lua: - Received user name “0oa1jxm3q4eQamlnm0h8”
Dec 10 18:09:28 haproxy-vm haproxy[24766]: introspect.lua: - HAP-User-Name & sub claim (without @domain) from HAP-Access-Token match
Dec 10 18:09:28 haproxy-vm haproxy[24766]: introspect.lua: - HAP-User-Name & sub claim (without @domain) from HAP-Access-Token match

From the log I can tell that at least three separate instances of the introspect.lua routine were active when the log stopped receiving new entries (which I assume is when HAProxy stopped accepting new connections). But that same condition (3 active instances of introspect.lua) occurs throughout the day without HAProxy freezing.
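If it would help, I could also attach gdb to the hung worker next time and capture per-thread backtraces without killing it (assuming gdb and debug symbols are installed on the VM; 24766 is the worker pid from the ps output above):

```shell
# Dump a backtrace of every thread in the hung worker, then detach.
# "bt full" includes local variables, which might show where the Lua
# introspection routine is stuck.
gdb -p 24766 -batch -ex 'set pagination off' -ex 'thread apply all bt full'
```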

I have seen another (older) topic about HAProxy stopping after 2000 connections, but I think I have enough data to say it is not the same problem: I have had at least 14K connections today just for the token introspection routine, and the system appears to have locked up only twice, so I figure it's not the same issue.

I would be grateful for any advice, even if that is… yep, install 2.2.6 and then see if it keeps happening