If your server supports User-agent logging, you can check for retrievals with unusual User-agent header values.
Finally, if you notice a site repeatedly requesting the file '/robots.txt', chances are it is a robot too.
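To illustrate the '/robots.txt' check, here is a minimal sketch that scans access-log lines in Common Log Format and counts which hosts requested '/robots.txt'. The sample log lines and the function name are hypothetical; point it at your own server's log instead.

```python
import re
from collections import Counter

# Hypothetical sample entries in Common Log Format; in practice these
# would come from your server's access log.
SAMPLE_LOG = """\
10.0.0.1 - - [10/Oct/1996:13:55:36 +0000] "GET /robots.txt HTTP/1.0" 200 68
10.0.0.1 - - [10/Oct/1996:13:55:40 +0000] "GET /index.html HTTP/1.0" 200 1043
10.0.0.2 - - [10/Oct/1996:14:01:12 +0000] "GET /index.html HTTP/1.0" 200 1043
"""

# Matches host, method and path at the start of a Common Log Format line.
LOG_RE = re.compile(r'^(\S+) \S+ \S+ \[[^\]]+\] "(\S+) (\S+)')

def robots_txt_hosts(log_text):
    """Return a Counter of hosts that requested /robots.txt."""
    hits = Counter()
    for line in log_text.splitlines():
        m = LOG_RE.match(line)
        if m and m.group(3) == "/robots.txt":
            hits[m.group(1)] += 1
    return hits

print(robots_txt_hosts(SAMPLE_LOG))
```

A host that fetches '/robots.txt' repeatedly over time is almost certainly a robot; a single request could still be a curious human.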
I've been visited by a robot! Now what?
Well, nothing :-) The whole idea is they are automatic; you don't need to do anything.
If you think you have discovered a new robot (i.e. one that is not listed on the list of active robots), and it does more than sporadic visits, drop me a line so I can make a note of it for future reference. But please don't tell me about every robot that happens to drop by!