Do we really need CAPTCHAs to prevent robots posting to our web forms? Not if you run ASP.NET 2.0. Whether you use a form for member logins, blog comments or a web shop, you want to keep it as secure and tamper-proof as possible. Brute-force attacks on a login form and robots spamming the blog comments are scenarios you can avoid by leveraging existing ASP.NET 2.0 features.
Event validation makes sure that a postback comes from a control on the page and not from a direct POST request sent by an external application (read: robot).
ASP.NET has event validation turned on by default, but many people turn it off for various reasons. One common reason is that their web application logs exceptions caused by event validation. Mine does too, but those exceptions occur precisely when an unauthorized POST request is made, which is the feature doing its job. Don't turn it off.
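If you have turned it off, turning it back on is a one-line change in web.config (the same setting can also be controlled per page through the EnableEventValidation attribute of the @ Page directive):

```xml
<!-- web.config: event validation is on by default; this makes it explicit -->
<system.web>
  <pages enableEventValidation="true" />
</system.web>
```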
When ViewStateMac is enabled, ASP.NET appends a message authentication code (a keyed hash) to the ViewState so that any tampering by evildoers is detected and the postback rejected. It is enabled by default, but it is worth stating explicitly in web.config so nobody switches it off by accident:
```xml
<pages enableViewStateMac="true" />
```
When enabling ViewStateMac you must also add a machine key to web.config so that all the servers in a web farm use the same validation and decryption keys. Otherwise you can end up with invalid ViewState errors. Here is an example of such a machine key.
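A minimal sketch of the machineKey element follows. The key values below are obvious placeholders; generate your own random hex strings (128 hex characters for the SHA1 validation key, 48 for the AES decryption key) and deploy the same values to every server in the farm:

```xml
<!-- web.config: placeholder keys shown for illustration only. -->
<!-- Replace with your own randomly generated values. -->
<system.web>
  <machineKey
    validationKey="0123456789ABCDEF0123456789ABCDEF0123456789ABCDEF0123456789ABCDEF0123456789ABCDEF0123456789ABCDEF0123456789ABCDEF0123456789ABCDEF"
    decryptionKey="0123456789ABCDEF0123456789ABCDEF0123456789ABCDEF"
    validation="SHA1"
    decryption="AES" />
</system.web>
```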
You can take it a step further and add a user-specific key to the ViewState through the page's ViewStateUserKey property. That ties the ViewState to a single user and makes it even harder to tamper with.
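The key must be set no later than Page_Init. A sketch for a code-behind file, using the session ID as the per-user key (the authenticated user name works just as well):

```csharp
// ViewStateUserKey must be assigned during page initialization;
// setting it any later throws an exception. Tying it to the session
// means ViewState captured in one session fails MAC validation
// when replayed in another.
protected override void OnInit(EventArgs e)
{
    base.OnInit(e);
    ViewStateUserKey = Session.SessionID;
}
```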
All the rest
These are two built-in features you can use against robots, but you still have to do your own custom form field validation and so on. No CAPTCHA is needed when using these two techniques. All it requires is that you use a <form runat="server"> and the standard postback feature of ASP.NET to post the form. If you don't believe me, try it out. It eliminates the need for CAPTCHAs.