Introduction
Today, society depends heavily on the Internet, which has become its most important medium of communication. Broad access to the Internet is therefore significant for the growth of civil society. The expansion and success of the Internet have, for example, transformed conventional services such as marketing, banking, and electoral systems, which are now rapidly being replaced by efficient web-based applications. At the same time, the intrinsic vulnerabilities of web applications open the door to a range of attacks against them.
The web bot, for instance, is a class of web security threat that endangers virtually every web application and web service (Heartfield et al., 2013). A web bot is a script or program developed to execute fully automated, repetitive tasks on web applications. The name "bot" is derived from "robot", so it is also known as a web robot. Web bots can be developed with either good or bad intentions, and a bot's functionality and actions determine whether it is classified as good or bad (Gilani et al., 2016). The most extensively used good web bot is the web crawler, whose purpose is to index web sites and web applications for search engines (Thelwall, 2001). Bad web bots, in contrast, are developed to carry out a range of destructive tasks (Rahman & Tomar, 2018).
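The article gives no code for a web bot, but the core idea of an "automated, repetitive task" can be sketched. The fragment below (illustrative only, using Python's standard library) shows one step a benign crawler repeats on every page it visits: extracting the links to follow next. The `LinkExtractor` and `extract_links` names are our own, not from any cited tool.

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects the href targets of anchor tags, as a crawler would."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def extract_links(html):
    """One repetitive crawler step: pull every outgoing link from a page."""
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links

page = '<html><body><a href="/about">About</a> <a href="/contact">Contact</a></body></html>'
print(extract_links(page))  # ['/about', '/contact']
```

A full crawler would fetch each extracted URL in turn and repeat this step, which is precisely what makes the task automated and repetitive.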
Bad web bots are responsible for the majority of security attacks on web sites. These attacks range from small-scale cyber crimes, such as click fraud, backlink creation, and mass registration, to serious crimes such as theft of credit card information and credential stuffing (Thelwall et al., 2009). According to a global web bot traffic report, bot traffic comprises about fifty percent of all web traffic (Zelfman, 2017). The distribution of web traffic is shown in Figure 1: roughly thirty percent of total traffic comes from good web bots and the remaining twenty percent from bad web bots (Wang et al., 2014).
When the report is examined critically, it shows that web sites of every size, small, medium, and large, were exposed to web bot attacks in 2014, and bad bots attacked all of these categories. As depicted in Figure 2, the proportion of web traffic coming from bad web bots holds steady at about thirty percent, irrespective of site size.
In practice, bad bots such as ScrapeBox (Shin et al., 2011) and XRumer (Hayati et al., 2012) are built for generating backlinks, web scraping, content scraping, form spamming, and automated registration for web services such as mail. Web bot defense mechanisms can be categorized into two main approaches: preventive and detective. The preventive approach requires direct human participation, such as a Turing test in the form of a CAPTCHA (Rahman & Tomar, 2012). However, advanced web bots such as XRumer can bypass this widely used preventive approach by solving CAPTCHAs with optical character recognition (OCR). Indeed, it was reported in 2008 that XRumer effectively evaded the Google and Hotmail CAPTCHAs to create an enormous number of accounts with these services. Similarly, the Decaptcha application defeated Wikipedia's CAPTCHA about twenty-five percent of the time (Bursztein & Bethard, 2009).
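To make the preventive/detective distinction concrete, the sketch below illustrates one simple detective technique: flagging clients whose request rate within a sliding time window exceeds what a human plausibly produces. This is a minimal illustration of the general idea, not any specific detector from the literature; the class name, threshold, and window size are assumptions chosen for the example.

```python
from collections import defaultdict, deque

class RateDetector:
    """Flags a client as a likely bot when it issues more than
    max_requests within a sliding window of `window` seconds.
    Thresholds here are illustrative, not taken from any report."""
    def __init__(self, max_requests=5, window=1.0):
        self.max_requests = max_requests
        self.window = window
        self.history = defaultdict(deque)  # client -> recent timestamps

    def record(self, client_ip, timestamp):
        q = self.history[client_ip]
        q.append(timestamp)
        # Drop timestamps that fell out of the sliding window.
        while q and timestamp - q[0] > self.window:
            q.popleft()
        return len(q) > self.max_requests  # True means "likely bot"

det = RateDetector(max_requests=5, window=1.0)
# A human-paced client: one request every half second.
print(any(det.record("10.0.0.1", t * 0.5) for t in range(6)))   # False
# A scripted client firing 20 requests in about 0.2 seconds.
print(any(det.record("10.0.0.2", t * 0.01) for t in range(20)))  # True
```

Unlike a CAPTCHA, such a detector needs no human interaction, which is exactly what distinguishes the detective approach from the preventive one.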