Robots.txt file on HAProxy


We currently host multiple sites behind our HAProxy load balancer, and I would like to prevent robots from crawling and indexing them. Rather than putting a robots.txt file on every site we host behind the load balancer, I would prefer to serve the file from the load balancer itself. Does anyone know if this is possible, through an ACL or some other kind of configuration?
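For context, something like the sketch below is what I have in mind. It is untested and assumes HAProxy 2.2 or newer (which added the `http-request return` action); the frontend/backend names and the file path are just placeholders for my setup:

```
frontend fe_sites
    bind *:80

    acl is_robots path /robots.txt

    # Answer /robots.txt directly from the load balancer so the
    # request never reaches any backend site (HAProxy 2.2+ only)
    http-request return status 200 content-type "text/plain" \
        file /etc/haproxy/robots.txt if is_robots

    default_backend be_sites
```

where /etc/haproxy/robots.txt on the load balancer would contain a deny-all policy:

```
User-agent: *
Disallow: /
```

If something like this works, every hosted site would get the same robots.txt without me touching the individual backends. I would also be interested in hearing how people handled this on HAProxy versions before 2.2, if that requires a different trick.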