
We have a dedicated development server that runs only test PHP applications on a public network.
We have set up session-based authentication for the site.

The issue is that lots of 404s for robots.txt are being logged in the access log, so we want to block or ignore these requests in lighttpd to save some bandwidth.

How can we achieve this in lighttpd/1.4.31?
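
A minimal sketch of one possible approach, assuming lighttpd's mod_access is listed in server.modules (that module choice is an assumption, not something from the question). Matching requests get a short 403 instead of the full 404 error page, so a small error response is still sent, just much less than before:

    # Sketch: deny /robots.txt outright via mod_access (lighttpd 1.4.x).
    # An empty string in url.access-deny matches every URL inside the
    # conditional, so /robots.txt is refused with a 403.
    $HTTP["url"] == "/robots.txt" {
        url.access-deny = ( "" )
    }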

  • Wouldn't it be better to put a dummy robots.txt file? Nov 19, 2012 at 14:20
  • @WaleedHamra, but isn't serving a dummy robots.txt file also part of the bandwidth? The applications on the server take around 150K requests per day, so I suppose serving a dummy file would use 150K*1K of bandwidth. Am I right? Nov 19, 2012 at 14:25
  • 4
    a 1 byte file will generate 150K bytes of data... much smaller than a complete 404 page, if your server serves one. Nov 19, 2012 at 14:40
  • Why are search engines crawling your development server in the first place?!? Nov 19, 2012 at 16:40
  • @WaleedHamra, oh OK, I didn't think of it that way. Thanks for pointing that out. @MichaelHampton, no idea; I have a domain through which all the applications are accessed. Could that be the reason? Nov 20, 2012 at 11:29
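
For scale, the dummy-file approach discussed in the comments can be as small as a two-line robots.txt that also tells compliant crawlers to stay away from the development site entirely. At roughly 26 bytes for the two directive lines and about 150K requests per day, that is on the order of 4 MB of body traffic per day, far below the 150K*1K estimate above (the 1 KB file size there is an assumption, not a requirement):

    # Sketch of a minimal robots.txt that disallows all crawling
    User-agent: *
    Disallow: /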
