10 things to check before opting for a CAPTCHA

Tags: captcha

Predicting the destiny of CAPTCHA systems is no longer difficult: most of them can be broken with relative ease these days. Yet this has not stopped web developers from using CAPTCHAs on web pages, in e-mail forms and elsewhere.

Used wisely, a CAPTCHA system can stop bots without making sign-ups hard for genuine users.

From a user’s point of view, it can safely be said that nobody likes seeing CAPTCHAs. A CAPTCHA cannot be used by everyone, especially visitors with impaired vision, and it causes trouble when graphics are disabled. A poorly implemented CAPTCHA can slow down a site’s registration process, thereby resulting in fewer registrations than expected.

The worst part of using CAPTCHAs is that the whole burden is placed on the user’s shoulders. Hence, according to experts, a CAPTCHA should be the last choice, not the first.

Recent findings suggest that many hacking attempts and bots can be stopped even without the aid of CAPTCHAs.

Here are some tips to stop spoofing attempts.

1. Validate everything server-side
Validate every field with server-side code, even if you are confident in your client-side validation. Be particularly careful with fields that end up in e-mail headers. E-mail addresses are possibly the most important values to check: use a good regular expression and look out for HTML tags, SQL injection, and return characters (\n and \r in PHP).
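As a rough sketch of that server-side check, here is an illustrative Python version; the regular expression is deliberately conservative and is an assumption you should adapt to your own requirements.

```python
import re

# Hypothetical server-side check for a submitted e-mail field.
# Rejects header-injection attempts (\r, \n), HTML tags, and
# obviously malformed addresses.
EMAIL_RE = re.compile(r"^[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}$")

def is_valid_email(value: str) -> bool:
    if "\r" in value or "\n" in value:   # e-mail header injection
        return False
    if "<" in value or ">" in value:     # HTML tags
        return False
    return bool(EMAIL_RE.match(value))
```

The same idea applies in any server-side language; the point is that this check runs on the server regardless of what the client claims to have validated.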

2. Is there spam-like content?
Most spammers include links to websites. If a submission contains links you don’t expect, it may come from a spam bot. A third-party tool such as Akismet (http://akismet.com/) can help you here.
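A very crude local version of this idea is to count the links in a message; the threshold below is an arbitrary assumption, and a real deployment would more likely hand suspicious posts to a service such as Akismet.

```python
import re

# Crude heuristic: messages stuffed with links are likely spam.
LINK_RE = re.compile(r"https?://", re.IGNORECASE)

def looks_spammy(text: str, max_links: int = 3) -> bool:
    # Count link prefixes; more than max_links is suspicious.
    return len(LINK_RE.findall(text)) > max_links
```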

3. Beware of rogue POST and GET values
Is your form expecting three POSTed fields? Do you see a fourth one? Be careful, as it can indicate a hacking attempt. Also make sure that no additional GET values have been passed.
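One way to express this check, sketched here framework-independently in Python (the field names are illustrative): compare the submitted keys against the exact set the form defines.

```python
# Hypothetical check that a submission contains exactly the
# fields the form defined -- nothing extra, nothing missing.
EXPECTED_FIELDS = {"name", "email", "message"}

def fields_ok(posted: dict, get_params: dict) -> bool:
    # Any unexpected POST key, or any GET parameter at all on a
    # POST-only form, is treated as a probe.
    return set(posted) == EXPECTED_FIELDS and not get_params
```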

4. Check the HTTP headers
Simple spam bots rarely set a user agent (HTTP_USER_AGENT) or a referring page (HTTP_REFERER). Make sure the referrer is the page where the form is located.
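A minimal sketch of that header check, using the CGI-style key names mentioned above:

```python
# Reject requests whose headers do not look like a real browser
# submitting the form. The URL is an example value.
def headers_ok(headers: dict, form_url: str) -> bool:
    if not headers.get("HTTP_USER_AGENT"):
        return False
    # Referrer should point back at the page hosting the form.
    return headers.get("HTTP_REFERER", "").startswith(form_url)
```

Keep in mind that both headers are trivially forged, so this only filters the simplest bots, as the point above says.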

5. Use a honeypot field
Spam bots usually fill in every form field so that they pass basic validation.
Create a honeypot form field that is meant to be left blank, and use CSS to hide it from human users (but not from bots). When the form is submitted, check whether the honeypot field is still empty; if it is not, a bot filled it in.
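The server-side half of that check might look like this; the field name "website" and the markup in the comment are illustrative assumptions.

```python
# Honeypot check: the "website" field is hidden with CSS, so a
# human never fills it in. Example markup:
#   <input type="text" name="website" style="display:none">
# A non-empty value at submission time means a bot filled it in.
HONEYPOT_FIELD = "website"

def honeypot_triggered(posted: dict) -> bool:
    return bool(posted.get(HONEYPOT_FIELD, "").strip())
```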

6. Detect the presence of JavaScript
Can the page run JavaScript? If so, you are probably dealing with a human, because most robots cannot interpret the JavaScript on a website; developing a JavaScript interpreter is far harder than developing a crawler that follows HTML links. A standard browser driven by a human loads the page’s HTML along with its other resources, including those loaded by JavaScript, whereas robots typically fetch only the HTML, and perhaps the graphics on the page, nothing more.
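One common way to turn this observation into a check, sketched here with illustrative field and token names: ship the form with an empty hidden field that a small script fills in when the page loads, then verify it server-side.

```python
# Client-side (illustrative markup):
#   <input type="hidden" id="js_check" name="js_check" value="">
#   <script>document.getElementById('js_check').value = 'loaded';</script>
# A submission arriving without the token came from a client that
# never executed the page's JavaScript.
JS_FIELD = "js_check"
EXPECTED_VALUE = "loaded"

def ran_javascript(posted: dict) -> bool:
    return posted.get(JS_FIELD) == EXPECTED_VALUE
```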

7. Show a verification page
It is very difficult for bots to react to a server response. If you have doubts about the validity of a post, show an intermediate page that asks the user to confirm the data and press a submit button again.
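A minimal sketch of that two-step flow, under the assumption that a one-time token is embedded in the confirmation page and checked on the second POST (the in-memory set stands in for real session storage):

```python
import secrets

# First POST: issue a one-time token and render a confirm page
# that echoes the data plus the token in hidden fields.
# Second POST: accept only if the token is known, then burn it.
pending_tokens: set = set()

def start_confirmation(data: dict) -> str:
    token = secrets.token_hex(16)
    pending_tokens.add(token)
    return token  # embed in a hidden field on the confirm page

def confirm_submission(token: str) -> bool:
    if token in pending_tokens:
        pending_tokens.discard(token)  # one use only
        return True
    return False
```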

8. Time the user response
Human users normally take a little time to complete a form, whereas bots do it almost instantaneously. A suspiciously fast submission can therefore be taken as a sign of a bot.
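A sketch of this heuristic: record when the form was rendered (for example in a hidden field) and reject submissions that come back implausibly fast. The 3-second floor is an assumption to tune, and in production the timestamp should be signed so bots cannot forge it.

```python
import time
from typing import Optional

MIN_SECONDS = 3.0  # assumed minimum time a human needs

def submitted_too_fast(rendered_at: float, now: Optional[float] = None) -> bool:
    # rendered_at is the epoch time when the form was served.
    now = time.time() if now is None else now
    return (now - rendered_at) < MIN_SECONDS
```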

9. Log everything
Keep a record of everything that happens during a form submission. This is not always 100% practical, but it can help you detect hacking attempts later on.
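As an illustration, a submission log entry might capture the client address, the field names and the outcome; the format and destination below are assumptions, and a real site would log to a file or central collector.

```python
import logging

logging.basicConfig(level=logging.INFO,
                    format="%(asctime)s %(levelname)s %(message)s")

def log_submission(remote_ip: str, posted: dict, accepted: bool) -> str:
    # Log field names only, not values, to avoid storing user data.
    line = "ip=%s accepted=%s fields=%s" % (remote_ip, accepted, sorted(posted))
    logging.info(line)
    return line
```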

10. Whitelist, don’t blacklist
Also note that any regex or other validation you use should act as a whitelist: allow only input that matches a known-safe pattern and reject everything else.
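A whitelist check in that spirit might look like this; the field names and patterns are purely examples to adapt.

```python
import re

# Whitelist approach: define exactly which values each field may
# take, and reject anything else (including unknown fields).
FIELD_PATTERNS = {
    "username": re.compile(r"[A-Za-z0-9_]{3,20}"),
    "zip_code": re.compile(r"\d{5}"),
}

def field_allowed(field: str, value: str) -> bool:
    pattern = FIELD_PATTERNS.get(field)
    # fullmatch ensures the whole value matches, not just a prefix.
    return bool(pattern and pattern.fullmatch(value))
```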
