- Certain robot implementations can overload networks and servers, and some have done so in the past. This happens especially when someone is just starting to write a robot; these days enough information about robots is available to prevent some of these mistakes.
- Robots are operated by humans, who make configuration mistakes or simply don't consider the implications of their actions. This means operators need to be careful, and robot authors need to make it hard for users to cause harm through misconfiguration.
- Web-wide indexing robots build a central database of documents, which doesn't scale well to millions of documents spread across millions of sites.
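The first two points come down to politeness: a robot should honour a site's robots.txt and throttle its request rate. A minimal sketch in Python using the standard-library `urllib.robotparser` (the bot name, URLs, and robots.txt content here are hypothetical, and the actual HTTP fetch is omitted):

```python
import time
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content; a real robot would fetch
# http://example.com/robots.txt instead of using a literal string.
ROBOTS_TXT = """\
User-agent: *
Disallow: /private/
"""

def make_parser(robots_txt: str) -> RobotFileParser:
    """Build a parser from robots.txt text (offline, for illustration)."""
    rp = RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return rp

def polite_fetch(rp: RobotFileParser, url: str,
                 user_agent: str = "ExampleBot", delay: float = 1.0):
    """Skip disallowed URLs and pause between requests so the robot
    doesn't overload a server. Returns the URL as a stand-in for the
    fetched document, or None if robots.txt disallows it."""
    if not rp.can_fetch(user_agent, url):
        return None  # honour the site's robots.txt
    time.sleep(delay)  # simple rate limit: at most one request per `delay` seconds
    return url

rp = make_parser(ROBOTS_TXT)
print(polite_fetch(rp, "http://example.com/index.html", delay=0.0))
print(polite_fetch(rp, "http://example.com/private/data.html", delay=0.0))
```

A fixed delay between requests is the crudest form of rate limiting; real robots also spread load across many sites rather than hammering one host.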
So no, robots aren't inherently bad, nor inherently brilliant; they simply need careful attention.