@sam, I’m sure you found an acceptable solution a long time ago, but I found this thread today and am now using this solution to disallow robot indexing on all of my backends at once with this frontend Lua config.
My specific robots.txt response uses this as the ACL to match to call the function (as shown in the UI):

Name: robots
Expression: Path ends with: robots.txt
frontend config:
acl robots path_end -i robots.txt
http-request use-service lua.robots if robots aclcrt_SharedFront
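For completeness, the Lua file also has to be loaded in the global section before the frontend can reference the service; a minimal sketch, assuming the file is saved as /etc/haproxy/robots.lua (path is an example, adjust to your layout):

global
    # load the Lua script that registers the "robots" service
    lua-load /etc/haproxy/robots.lua

The service name used in `http-request use-service lua.robots` must match the first argument passed to `core.register_service` in the Lua file.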
Lua file:
robots = function(applet)
    -- serve a robots.txt body that disallows all crawlers
    local response = "User-agent: *\nDisallow: /"
    applet:add_header("Content-Length", string.len(response))
    applet:add_header("Content-Type", "text/plain")
    applet:set_status(200)
    applet:start_response()
    applet:send(response)
end

core.register_service("robots", "http", robots)