Clickability has created a really nice Robots.txt Builder that helps you configure your robots.txt file. You can easily build a robots.txt file that disallows robots from parts of your site's file structure. There are options for easily adding rules for web search robots, image search, contextual ads, web archivers, and even "bad robots" — the bad-robots option inserts a default list of known misbehaving crawlers that you can keep out.
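The output of a tool like this is just a plain-text robots.txt file with `User-agent` and `Disallow` directives. A minimal sketch of what the generated file might look like (the paths and the blocked crawler name are illustrative, not what the builder actually emits):

```
# Allow all well-behaved crawlers, but keep them out of private areas
User-agent: *
Disallow: /admin/
Disallow: /cgi-bin/

# Block a hypothetical bad robot entirely
User-agent: BadBot
Disallow: /
```

Each `User-agent` line starts a rule group, and the `Disallow` lines under it list path prefixes that group may not crawl.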
With the announcement of the new Sitemap autodiscovery directive for robots.txt, I hope they add support for it too, even though it's pretty easy to implement yourself.
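"Easy to implement yourself" really does mean one line: the autodiscovery convention is a single `Sitemap` directive pointing at your sitemap's full URL, placed anywhere in robots.txt (the example.com URL below is a placeholder for your own site):

```
Sitemap: http://www.example.com/sitemap.xml
```

Crawlers that support autodiscovery will fetch the sitemap from that URL without you having to submit it to each search engine separately.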