How do I serve a single static file from HAProxy?


#1

We have a custom backend we use to serve a single static file:

acl is_robots_txt path /robots.txt
use_backend robotstxt if is_robots_txt is_cdn


backend robotstxt
  mode http
  errorfile 503 /etc/haproxy/errors/200robots.http

The file contains:

HTTP/1.0 200 OK
Cache-Control: no-cache
Connection: close
Content-Type: text/plain

# Discourse CDN, all crawling is allowed (cdn contains images and text files)
User-Agent: *
Allow: /

This means that when people hit https://cdn.discourse.org/robots.txt, they get the robots file.

This hack works OK, except that we are now logging a bunch of 503 statuses that are in reality 200s, which makes log analysis a bit annoying.

Is there another hack we can use that allows us to serve a single static file from a backend and retain status 200 in the logs?


#2

You may disable logging in this backend, which is probably the best thing to do anyway since you don’t care about who retrieves this file:

http-request set-log-level silent
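
A sketch of how this slots into the backend from the first post, keeping the same errorfile trick but silencing logging for this one backend only:

backend robotstxt
  mode http
  # suppress the log line for requests handled by this backend
  http-request set-log-level silent
  # the "503" errorfile actually contains a hand-written 200 response
  errorfile 503 /etc/haproxy/errors/200robots.http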

I wanted to have the “return-raw” and “return-html” actions merged into 1.6 but we didn’t have time to do them.
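
For readers on newer releases: HAProxy 2.2 and later provide an http-request return action that covers this use case directly, serving a small file with a genuine 200 that is logged as such. A rough sketch, assuming HAProxy 2.2 or newer and a hypothetical file at /etc/haproxy/static/robots.txt:

# HAProxy 2.2+ only: two frontend lines serve robots.txt directly,
# and the access log records the real 200 status
acl is_robots_txt path /robots.txt
http-request return status 200 content-type text/plain file /etc/haproxy/static/robots.txt if is_robots_txt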


#3

I see. Ideally I want to keep the logging; can I achieve my desired result with a Lua script running in the backend that serves the request?


#4

Hi Sam,

Yes, you may be able to create a Lua service which delivers the content of your robots.txt.
Then you would have a 200 in the logs.

Baptiste
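
For reference, a minimal sketch of such an applet; the file path and service name here are made up, and the later replies show working variants:

-- /etc/haproxy/robots.lua (hypothetical path), loaded via "lua-load" in the
-- global section; registers an HTTP applet that answers with a real 200
local function serve_robots(applet)
    local body = "User-Agent: *\nAllow: /\n"
    applet:set_status(200)
    applet:add_header("Content-Type", "text/plain")
    applet:add_header("Content-Length", string.len(body))
    applet:start_response()
    applet:send(body)
end

core.register_service("robots-txt", "http", serve_robots)

The applet would then be selected with http-request use-service lua.robots-txt on the matching ACL.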


#5

Could it be done for 1.6.4 maybe, or does it require bigger changes?

Also, yay, my favorite forum software for my favorite proxy!


#6

Time to find out how stable 1.7 is :stuck_out_tongue:


#7

Does anyone have a good example of how to serve a single static file using Lua?
It looks a bit less hacky than the above; I would like to create a landing page for an ssl verify required scenario, rather than just presenting a request for a client certificate.

There is a hello world example here, but it doesn’t have much to it.


#8

Also I saw someone just do:

listen http-webservices
    bind :8080 
    monitor-uri /c 
    errorfile 200 /etc/haproxy/errorfiles/200.http

#9

I got a Lua response working based on the rabbit trail @mattpark started me down. I am using this with HAProxy running on pfSense.

I have several responses in a single Lua file and just call them by function name (the service registration is at the bottom).

Here is an example from my Lua file; just replace the response data with the contents of your robots.txt, indicating new lines with \n. If you want this to be HTML (or any other file type), just change the Content-Type header as needed.

http_resp = function(applet)
    local response = "Your Content goes here.\n use backslash+n to generate a new line"
    applet:add_header("Content-Length", string.len(response))
    applet:add_header("Content-Type", "text/plain")
    applet:set_status(200)
    applet:start_response()
    applet:send(response)
end

core.register_service("http_resp", "http", http_resp)
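
For anyone wiring this up by hand rather than through a GUI, a sketch of the config side; the file path, frontend name, and bind address here are assumptions:

global
    # load the Lua file that defines and registers http_resp
    lua-load /etc/haproxy/responses.lua

frontend http-in
    bind :80
    mode http
    # hand matching requests to the Lua applet registered as "http_resp"
    http-request use-service lua.http_resp if { path /robots.txt }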

#10

@sam, I’m sure you found an acceptable solution a long time ago, but I found this thread today and am now using this solution to disallow robot indexing on all of my backends at once with this frontend Lua config.

My robots.txt response uses the following ACL to decide when to call the function:

Name: robots
Expression: Path ends with
Value: robots.txt

frontend config:

    acl robots path_end -i robots.txt
    http-request use-service lua.robots if robots aclcrt_SharedFront

Lua file:

robots = function(applet)
    local response = "User-agent: *\nDisallow: /"
    applet:add_header("Content-Length", string.len(response))
    applet:add_header("Content-Type", "text/plain")
    applet:set_status(200)
    applet:start_response()
    applet:send(response)
end

core.register_service("robots", "http", robots)