Use a robots.txt file to tell bots which links they shouldn't follow. Serve it from http://example.com/robots.txt
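A minimal robots.txt might look like this (the `/vote/` and `/admin/` paths are just examples; substitute the URLs you actually want bots to skip):

```
User-agent: *
Disallow: /vote/
Disallow: /admin/
```

`User-agent: *` applies the rules to all crawlers, and each `Disallow` line names a path prefix they shouldn't request.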
You can read more about robots.txt here: http://www.robotstxt.org/
Google and every other well-behaved bot will read robots.txt and follow its directives.
If you still have problems with bots ignoring those directives, you'll have to write some logic to block them, or at least reduce their impact. For example, you can log how many votes you've received from each IP address within a certain time frame and reject any votes above that threshold. Another option is to only accept votes via POST and add some JavaScript logic (or similar) to filter out spam bots, but that's much more work than robots.txt, so only invest in it if this becomes a real problem.
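The per-IP rate limit described above can be sketched roughly like this (the one-hour window and ten-vote cap are arbitrary assumptions; tune them for your site, and use your framework's storage instead of an in-memory dict if you run multiple processes):

```python
import time
from collections import defaultdict, deque

WINDOW_SECONDS = 3600   # assumed time frame: one hour
MAX_VOTES = 10          # assumed cap: ten votes per IP per window

# ip -> timestamps of that IP's recent votes
_votes = defaultdict(deque)

def allow_vote(ip, now=None):
    """Return True and record the vote if this IP is under the limit."""
    now = time.time() if now is None else now
    q = _votes[ip]
    # Drop timestamps that have aged out of the window.
    while q and now - q[0] > WINDOW_SECONDS:
        q.popleft()
    if len(q) >= MAX_VOTES:
        return False  # over the limit: reject this vote
    q.append(now)
    return True
```

Call `allow_vote(request_ip)` in your vote handler and discard the vote when it returns False.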
You can also block badly behaved bots entirely by denying their IPs in your web server. There are several published lists of known bad bots you can try if you prefer that approach.
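In nginx, for example, IP blocking is a couple of `deny` directives in the server block (the addresses below are placeholders from the documentation range, not real bot IPs):

```
server {
    # ...
    deny 203.0.113.45;       # a single misbehaving bot
    deny 198.51.100.0/24;    # an entire network range
    allow all;               # everyone else gets through
}
```

Apache and most other web servers have an equivalent access-control mechanism.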