How to block bad bots in WordPress?

What is an Internet bot?

A bot, also known as an Internet bot, is a program that runs automated tasks over the Internet. Typically intended to perform simple, repetitive tasks, bots let their operators do things quickly and at scale. For example, search engines like Google, Bing, Yandex, and Baidu use crawler bots to periodically collect information from hundreds of millions of domains and index it into their result pages.

Good bots vs bad bots: what’s the difference?

Nowadays, Internet bots dominate web traffic. According to a May 2017 survey, about 66 percent of bot traffic served malicious purposes. That makes bad bots a real risk for any company or person running an online business.

Good bots are genuinely useful and help keep the Internet running; they perform valuable functions such as operating search engines, powering APIs, monitoring websites, and scanning for vulnerabilities. Bad bots, on the other hand, are responsible for almost all of the malicious automated activity we encounter on our sites. One of the most serious problems they cause is distributed denial-of-service (DDoS) attacks.

Good bots vs bad bots

How to block bad bots?

There are three easy methods to block bad bots:

  1. Using the .htaccess file
  2. Using robots.txt
  3. Using a plugin

Method 1: Using the .htaccess file

With the .htaccess file, you can easily block known bad bots by their user-agent strings. Log in to your file manager through your hosting provider, or use an FTP client such as FileZilla (recommended), WinSCP, or Cyberduck. Once you are logged in, back up your .htaccess file first (don’t skip this; it’s important). Then copy the following code and add it at the end of the file. You can edit .htaccess with any text editor, such as Notepad.

RewriteEngine On 
RewriteCond %{HTTP_USER_AGENT} ^BlackWidow [OR] 
RewriteCond %{HTTP_USER_AGENT} ^Bot\ mailto:craftbot@yahoo.com [OR] 
RewriteCond %{HTTP_USER_AGENT} ^ChinaClaw [OR] 
RewriteCond %{HTTP_USER_AGENT} ^Custo [OR] 
RewriteCond %{HTTP_USER_AGENT} ^DISCo [OR] 
RewriteCond %{HTTP_USER_AGENT} ^Download\ Demon [OR] 
RewriteCond %{HTTP_USER_AGENT} ^eCatch [OR] 
RewriteCond %{HTTP_USER_AGENT} ^EirGrabber [OR] 
RewriteCond %{HTTP_USER_AGENT} ^EmailSiphon [OR] 
RewriteCond %{HTTP_USER_AGENT} ^EmailWolf [OR] 
RewriteCond %{HTTP_USER_AGENT} ^Express\ WebPictures [OR] 
RewriteCond %{HTTP_USER_AGENT} ^ExtractorPro [OR] 
RewriteCond %{HTTP_USER_AGENT} ^EyeNetIE [OR] 
RewriteCond %{HTTP_USER_AGENT} ^FlashGet [OR] 
RewriteCond %{HTTP_USER_AGENT} ^GetRight [OR] 
RewriteCond %{HTTP_USER_AGENT} ^GetWeb! [OR] 
RewriteCond %{HTTP_USER_AGENT} ^Go!Zilla [OR] 
RewriteCond %{HTTP_USER_AGENT} ^Go-Ahead-Got-It [OR] 
RewriteCond %{HTTP_USER_AGENT} ^GrabNet [OR] 
RewriteCond %{HTTP_USER_AGENT} ^Grafula [OR] 
RewriteCond %{HTTP_USER_AGENT} ^HMView [OR] 
RewriteCond %{HTTP_USER_AGENT} HTTrack [NC,OR] 
RewriteCond %{HTTP_USER_AGENT} ^Image\ Stripper [OR] 
RewriteCond %{HTTP_USER_AGENT} ^Image\ Sucker [OR] 
RewriteCond %{HTTP_USER_AGENT} Indy\ Library [NC,OR] 
RewriteCond %{HTTP_USER_AGENT} ^InterGET [OR] 
RewriteCond %{HTTP_USER_AGENT} ^Internet\ Ninja [OR] 
RewriteCond %{HTTP_USER_AGENT} ^JetCar [OR] 
RewriteCond %{HTTP_USER_AGENT} ^JOC\ Web\ Spider [OR] 
RewriteCond %{HTTP_USER_AGENT} ^larbin [OR] 
RewriteCond %{HTTP_USER_AGENT} ^LeechFTP [OR] 
RewriteCond %{HTTP_USER_AGENT} ^Mass\ Downloader [OR] 
RewriteCond %{HTTP_USER_AGENT} ^MIDown\ tool [OR] 
RewriteCond %{HTTP_USER_AGENT} ^Mister\ PiX [OR] 
RewriteCond %{HTTP_USER_AGENT} ^Navroad [OR] 
RewriteCond %{HTTP_USER_AGENT} ^NearSite [OR] 
RewriteCond %{HTTP_USER_AGENT} ^NetAnts [OR] 
RewriteCond %{HTTP_USER_AGENT} ^NetSpider [OR] 
RewriteCond %{HTTP_USER_AGENT} ^Net\ Vampire [OR] 
RewriteCond %{HTTP_USER_AGENT} ^NetZIP [OR] 
RewriteCond %{HTTP_USER_AGENT} ^Octopus [OR] 
RewriteCond %{HTTP_USER_AGENT} ^Offline\ Explorer [OR] 
RewriteCond %{HTTP_USER_AGENT} ^Offline\ Navigator [OR] 
RewriteCond %{HTTP_USER_AGENT} ^PageGrabber [OR] 
RewriteCond %{HTTP_USER_AGENT} ^Papa\ Foto [OR] 
RewriteCond %{HTTP_USER_AGENT} ^pavuk [OR] 
RewriteCond %{HTTP_USER_AGENT} ^pcBrowser [OR] 
RewriteCond %{HTTP_USER_AGENT} ^RealDownload [OR] 
RewriteCond %{HTTP_USER_AGENT} ^ReGet [OR] 
RewriteCond %{HTTP_USER_AGENT} ^SiteSnagger [OR] 
RewriteCond %{HTTP_USER_AGENT} ^SmartDownload [OR] 
RewriteCond %{HTTP_USER_AGENT} ^SuperBot [OR] 
RewriteCond %{HTTP_USER_AGENT} ^SuperHTTP [OR] 
RewriteCond %{HTTP_USER_AGENT} ^Surfbot [OR] 
RewriteCond %{HTTP_USER_AGENT} ^tAkeOut [OR] 
RewriteCond %{HTTP_USER_AGENT} ^Teleport\ Pro [OR] 
RewriteCond %{HTTP_USER_AGENT} ^VoidEYE [OR] 
RewriteCond %{HTTP_USER_AGENT} ^Web\ Image\ Collector [OR] 
RewriteCond %{HTTP_USER_AGENT} ^Web\ Sucker [OR] 
RewriteCond %{HTTP_USER_AGENT} ^WebAuto [OR] 
RewriteCond %{HTTP_USER_AGENT} ^WebCopier [OR] 
RewriteCond %{HTTP_USER_AGENT} ^WebFetch [OR] 
RewriteCond %{HTTP_USER_AGENT} ^WebGo\ IS [OR] 
RewriteCond %{HTTP_USER_AGENT} ^WebLeacher [OR] 
RewriteCond %{HTTP_USER_AGENT} ^WebReaper [OR] 
RewriteCond %{HTTP_USER_AGENT} ^WebSauger [OR] 
RewriteCond %{HTTP_USER_AGENT} ^Website\ eXtractor [OR] 
RewriteCond %{HTTP_USER_AGENT} ^Website\ Quester [OR] 
RewriteCond %{HTTP_USER_AGENT} ^WebStripper [OR] 
RewriteCond %{HTTP_USER_AGENT} ^WebWhacker [OR] 
RewriteCond %{HTTP_USER_AGENT} ^WebZIP [OR] 
RewriteCond %{HTTP_USER_AGENT} ^Wget [OR] 
RewriteCond %{HTTP_USER_AGENT} ^Widow [OR] 
RewriteCond %{HTTP_USER_AGENT} ^WWWOFFLE [OR] 
RewriteCond %{HTTP_USER_AGENT} ^Xaldon\ WebSpider [OR] 
RewriteCond %{HTTP_USER_AGENT} ^Zeus 
RewriteRule ^.* - [F,L]

Once you have finished, save the file and check your website.
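How do these rules work? Each RewriteCond tests the User-Agent header against a regular expression: ^ anchors the match at the start of the header, [OR] chains the conditions so any single match is enough, and [NC] makes a pattern case-insensitive. When a condition matches, the final RewriteRule returns 403 Forbidden. Here is a rough Python sketch of that same matching logic, using a few bot names taken from the list above:

```python
import re

# A few user-agent patterns from the .htaccess list above.
# "^" anchors at the start of the header, as in RewriteCond;
# HTTrack is matched anywhere and case-insensitively ([NC] in the original).
BLOCKED = [
    re.compile(r"^Wget"),
    re.compile(r"^WebZIP"),
    re.compile(r"HTTrack", re.IGNORECASE),
    re.compile(r"^Zeus"),
]

def is_blocked(user_agent: str) -> bool:
    """Return True if any pattern matches, mirroring the [OR] chain."""
    return any(p.search(user_agent) for p in BLOCKED)

print(is_blocked("Wget/1.21.3"))                    # True: starts with "Wget"
print(is_blocked("Mozilla/5.0 httrack"))            # True: HTTrack anywhere, any case
print(is_blocked("Mozilla/5.0 (Windows NT 10.0)"))  # False: normal browser
```

Keep in mind this kind of blocking only stops bots that report an honest user agent; a determined attacker can spoof the header of a normal browser.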

Important: Do not change anything else unless you know what you're doing!
A mistake in .htaccess can take your whole site down.

Method 2: Using robots.txt

This method is very simple: it restricts bots using the robots.txt file. Log in to your file manager through your hosting provider, or use an FTP client such as FileZilla (recommended), WinSCP, or Cyberduck. Once you are logged in, back up your robots.txt first (don’t skip this; it’s important). Then replace its contents with the following. You can edit robots.txt with any text editor, such as Notepad. Note that robots.txt is advisory: well-behaved crawlers honor it, but malicious bots can simply ignore it, so treat this as a complement to the other methods rather than a replacement.

User-agent: *
Disallow: / 

User-agent: Googlebot 
Allow: / 

User-agent: Yahoo-slurp 
Allow: / 

User-agent: Msnbot 
Allow: /

Once you have finished, save the file and check your website.
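The rules above disallow everyone by default, then allow the named search-engine crawlers through their own more specific records. You can see how a compliant crawler interprets them using Python's standard urllib.robotparser (the example.com URL below is just a placeholder):

```python
import urllib.robotparser

# The first two records from the robots.txt above, fed straight to the parser.
rules = """\
User-agent: *
Disallow: /

User-agent: Googlebot
Allow: /
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Googlebot has its own record, so the blanket Disallow does not apply to it.
print(rp.can_fetch("Googlebot", "https://example.com/some-page"))      # True
# An unlisted crawler falls through to "User-agent: *" and is disallowed.
print(rp.can_fetch("SomeRandomBot", "https://example.com/some-page"))  # False
```

Again, this only governs crawlers that choose to obey robots.txt; it is not an enforcement mechanism.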

Important: Do not change anything else unless you know what you're doing!
A mistake in robots.txt can block search engines from your entire site.

Method 3: Using plugins

If you’re not comfortable editing configuration files, we recommend this method to avoid problems. It is the easiest of the three: everything is done with a few mouse clicks and requires no programming knowledge.

Here are some popular plugins that help you block bad bots:

Top 5 plugins to block bad bots in WordPress

  1. Cerber Security & Limit Login Attempts (recommended; free version available)
  2. Wordfence Security
  3. Sucuri Security – Auditing, Malware Scanner and Security Hardening
  4. Blackhole for Bad Bots
  5. StopBadBots

Install and activate any one of the above plugins, then follow the setup guidelines provided by the plugin author.

If you have any questions, please ask us in the comments!
