Publishers and advertisers are wrestling with the challenge of the rise in bot traffic, which is also sometimes referred to as non-human traffic.
Bots accounted for nearly two-thirds of global internet traffic in the first half of 2021, according to research conducted by data security service provider Barracuda Networks. Malicious bots, meanwhile, accounted for nearly 40% of all traffic.
With this in mind, we've sought to answer some key questions about bot traffic in this article, including what it is, what it does, and how to block/remove it.
What Is Bot Traffic?
Bot traffic describes any non-human traffic that visits a website. Whether the website is a hugely popular news site or a small, newly published startup, the site is bound to be visited by a certain number of bots over time.
While the term bot traffic is often misconstrued to be inherently harmful, this is not always the case. There is no doubt that some bot traffic is designed to be malicious and can negatively affect Google Analytics data. These web crawlers can be used for credential stuffing, data scraping, and in some cases, even launching distributed denial of service (DDoS) attacks.
However, there are legitimate bots that are essential for the operation of specific web services, such as search engines and digital assistants. Therefore, digital publishers need to use their analytics data to discern between human behavior and the good, the bad, and the ugly of bot traffic.
Types of Bots to Watch Out for
As mentioned, certain bots are required for the operation and optimal performance of search engines and digital assistants, and these are explicitly designed not to damage sites or degrade the user experience. Other bots are not so benign.
The types of bot traffic to watch out for include:
Click Bots
Click bots are used in click spamming to make fraudulent ad clicks. For most web publishers, particularly those using pay-per-click (PPC) ads, this is considered the most damaging type of bot: click bots skew data analytics and replicate web traffic that erodes advertising budgets without delivering any benefit.
Download Bots
Similar to click bots, download bots interfere with genuine user engagement data. However, rather than inflating the ad click count, they create a fake download count. This is most pertinent when a publisher uses a marketing funnel such as a free ebook download: download bots create phony downloads, leading to false performance data.
Spambots
Spambots, also known as form-filling bots, are the most common type of bot. A spambot's purpose is often to scrape contact information, including email addresses and phone numbers, create fake user accounts, or operate stolen social media accounts. They also disrupt user engagement by distributing unwarranted content, such as:
- Spam comments, including referral spam
- Phishing emails
- Website redirects
- Negative SEO against competitors
Spy Bots
Spy bots are so named because they act precisely as spies, stealing data and information such as email addresses from websites, chat rooms, social media sites, and forums.
Scraper Bots
Scraper bots visit websites with the sole intent of stealing publishers' content, and they pose a real threat to a business and its web pages. Operated by third-party scrapers, often on behalf of business competitors, they steal valuable content such as product lists and prices, which is then repurposed and published on rival sites.
Imposter Bots
Imposter bots replicate human behavior by appearing as genuine website visitors. They are designed to bypass online security measures and are most often responsible for DDoS activity.
What Is Good Bot Traffic?
While the above examples are undoubtedly cases of harmful bot traffic, what are some instances of good bot traffic?
The following bots are legitimate and are there to provide helpful solutions for websites and applications.
Search Engine Bots
Search engine bots are the most obvious and well-known of the “good” bots. Search engine bots crawl the web and help website owners get their websites listed in search results on Google, Yahoo, and Bing. These bots are helpful SEO tools.
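Legitimate crawlers identify themselves in the User-Agent header of each request. A minimal sketch of checking for well-known crawler tokens is below; note that user-agent strings can be spoofed, so this check alone does not prove a visitor is a genuine search engine bot.

```python
# Well-known user-agent tokens for major search engine crawlers.
KNOWN_CRAWLER_TOKENS = {
    "Googlebot": "Google",
    "bingbot": "Bing",
    "Slurp": "Yahoo",
}

def identify_crawler(user_agent):
    """Return the search engine name if the user-agent claims to be a
    known crawler, or None for everything else."""
    for token, engine in KNOWN_CRAWLER_TOKENS.items():
        if token in user_agent:
            return engine
    return None
```

Because these strings are trivially forged, serious verification also involves a reverse DNS lookup on the requesting IP, which is beyond this sketch.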
Monitoring Bots
Monitoring bots help publishers ensure their website is healthy and accessible while operating at peak performance. Monitoring bots operate by automatically pinging the site to ensure it is still online. If anything breaks or the site goes offline, the publisher will be automatically notified, making these bots very useful to website owners.
SEO Crawlers
SEO crawlers are programs that fetch and index a website and its competitors to provide data and analytics on page views, users, and content. Web admins can then use these reports to plan their content to improve their referral traffic, search visibility, and organic traffic.
Copyright Bots
Copyright bots crawl the internet, scanning for copyrighted images to ensure no one is using copyrighted content without permission.
What Is Bad Bot Traffic?
Unlike the good bots we just covered, bad bot traffic can inflict a lot of harm on a website if left unchecked. This can range from sending false or spam traffic to something far more disruptive, such as ad fraud.
DDoS (Distributed Denial of Service) Bots
The DDoS bot has to be one of the oldest and nastiest bots out there. Distributed denial of service bots are placed on unsuspecting victims' computers and used to take down a specific website or server.
DDoS attacks have been blamed for causing significant financial harm in the past, with network security provider Corero estimating that such an attack in the US costs around $218,000 on average.
Web Scrapers
Web scrapers scrape web pages for valuable data such as email addresses and contact information. In some circumstances, they also steal text and images from websites and reuse them on other websites or social media profiles without permission.
Click Fraud Bots
There are numerous sophisticated bots generating malicious bot traffic exclusively toward paid advertisements. Unlike bots that generate unwanted website traffic, these bots engage in ad fraud.
As the name suggests, this non-human traffic drives clicks to paid ads and costs advertisers billions of dollars every year. Frequently disguised as legitimate traffic, publishers have a good many reasons to adopt bot detection tools to help weed out the illicit traffic.
Vulnerability Scanners
Various malicious bots exist to search millions of websites for vulnerabilities and report them to their creators. Unlike genuine bots that notify the website owner, these malicious bots send the information to a third party, which can sell it or use it to hack the websites.
Spam Bots
Spam bots are designed to leave messages written by the bot's creator in a website's comment section. These bots need to create accounts, and while CAPTCHA tests are designed to filter out software-driven account creation, they are not always successful.
How Does Bot Traffic Affect Websites?
An important thing to understand about bots is that most scripts and programmes are designed to perform the same task multiple times. The bot's creator undoubtedly wants the task done as quickly as possible, but this can cause a lot of issues for your website.
Bot traffic can effectively destroy businesses that don't learn how to identify, manage, and filter it. Sites that rely on advertising, along with sites that sell products and merchandise with limited inventory, are particularly vulnerable.
For sites that are running ads, bots that land on the site and click on various page elements can trigger fake ad clicks. This is known as click fraud, and while it may initially increase ad revenue, once online advertising networks detect the fraud, it will usually result in the site and the owner being banned from their network.
For eCommerce sites with limited inventory, inventory hoarding bots can virtually shut down their shop by filling carts with tons of merchandise, making it unavailable for genuine shoppers to purchase.
If a bot requests information from your site regularly, it can cause it to slow down. This means that the site will be slow for everyone who visits it, which can cause significant issues for an online business. Too much bot traffic can take your entire website offline in severe circumstances.
Fortunately, this only happens in the most severe cases; most of the time, the effects of bot traffic on your website are minor. When your website receives a lot of unauthorized bot traffic, you can expect to see things like:
- More page views
- High bandwidth usage
- Incorrect Google Analytics reports
- Conversion decreases
- Junk emails
- Longer load times
- Increased bounce rate
How to Identify Bot Traffic (In Google Analytics & Other Tools)
As we move into an increasingly tech-driven future, bots are becoming smarter by the day. A report released by Imperva last year found that bots comprised almost 41% of internet traffic, with bad bots accounting for more than a quarter of all traffic.
Web publishers and designers can identify bot traffic by examining the network requests to their sites. Utilizing an integrated analytics tool such as Google Analytics will further help website owners identify traffic bots in their website traffic.
The hallmarks of bot traffic include the following characteristics:
Abnormally High Pageviews
When a website sees a sudden unexpected and unprecedented spike in page views, bots are generally the culprit.
Abnormally High Bounce Rate
Bounce rate is a metric that measures the percentage of visitors who land on your website and leave without taking any action on the page. An unexpected increase in bounce rate can indicate that bots are being directed to a single page.
Surprisingly High or Low Session Duration
Session duration is the amount of time users spend on a website once they are there. Human behavior dictates that this should remain relatively steady. However, a sudden and unexpected increase in session duration likely indicates that a bot is browsing the site at an unusually slow rate.
Conversely, if there is an unusually low session duration, this could indicate a bot crawling pages at a much faster rate than a human.
Junk Conversions
A surge in fraudulent conversions is another giveaway. Junk conversions appear as an increase in accounts created with nonsensical email addresses, or contact forms filled out with a fake name, phone number, or address.
Spike in Traffic From an Unexpected Location
A sudden spike in website traffic from a specific geographic region, particularly an area that is unlikely to have native speakers of the language the site is written in, is another standard indicator of bot traffic.
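The indicators above can be combined into a simple scoring heuristic against a historical baseline. The sketch below assumes you export daily aggregates (page views, bounce rate, average session duration) from your analytics tool; the threshold values are illustrative only, not industry standards.

```python
def bot_traffic_signals(day, baseline):
    """Compare one day's analytics aggregates against a baseline and
    return a list of bot-like warning signals.

    Both arguments are dicts with keys: pageviews, bounce_rate (0-1),
    avg_session_seconds. All thresholds are illustrative assumptions.
    """
    signals = []
    # Abnormally high page views: more than 3x the baseline.
    if day["pageviews"] > 3 * baseline["pageviews"]:
        signals.append("pageview spike")
    # Abnormally high bounce rate: a jump of 25+ percentage points.
    if day["bounce_rate"] > baseline["bounce_rate"] + 0.25:
        signals.append("bounce-rate jump")
    # Surprisingly high or low session duration vs. the baseline.
    ratio = day["avg_session_seconds"] / max(baseline["avg_session_seconds"], 1)
    if ratio < 0.2 or ratio > 5:
        signals.append("abnormal session duration")
    return signals
```

A day that triggers two or more signals at once is a good candidate for closer inspection in the analytics reports.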
How to Stop Bot Traffic
Once a company or agency has learned how to identify bot traffic, it is imperative that they gain the knowledge and tools needed to stop bot traffic from negatively affecting their site.
The following tools can help minimize threats:
Only Buy Traffic From Reputable Sources
Traffic arbitrage is the practice of paying to bring traffic to a website in order to profit from high-yielding PPC/CPM-based campaigns. By only purchasing traffic from known sources, site owners can reduce the risk of bad bot traffic.
Use Robots.txt
Placing a robots.txt file will assist in keeping bad bots away from a site.
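Well-behaved crawlers honor robots.txt directives, though malicious bots routinely ignore them, so treat this as a first line of defence rather than a guarantee. A minimal sketch (the scraper's user-agent string here is hypothetical):

```txt
# Allow Google's crawler everywhere.
User-agent: Googlebot
Allow: /

# Deny a hypothetical scraper by its claimed user-agent.
User-agent: BadScraperBot
Disallow: /

# Keep all other bots out of admin pages.
User-agent: *
Disallow: /admin/
```

The file must be served at the site root (e.g. `https://example.com/robots.txt`) for crawlers to find it.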
Block Known Bad IP Addresses
Publishers can compile a list of offensive IP addresses and deny those visit requests on their website, thereby reducing the number of DDoS attacks.
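An IP denylist check can be sketched in a few lines. The addresses below come from the reserved documentation ranges and stand in for whatever offending IPs or subnets a publisher has actually recorded.

```python
import ipaddress

# Denylist of offending networks; these documentation-range addresses
# are placeholders for a publisher's real blocklist.
DENYLIST = [
    ipaddress.ip_network("203.0.113.0/24"),   # an entire bad subnet
    ipaddress.ip_network("198.51.100.7/32"),  # a single offending host
]

def is_blocked(client_ip):
    """Return True if the client IP falls inside any denied network."""
    ip = ipaddress.ip_address(client_ip)
    return any(ip in net for net in DENYLIST)
```

In practice this check usually lives at the web server or CDN layer (e.g. an nginx `deny` rule or a firewall) rather than in application code, so blocked requests never reach the site at all.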
Use Challenge-Response Tests
One of the simplest and most common ways to detect bot traffic is to utilize CAPTCHA on the sign-up or download form. This is particularly useful in stopping download and spambots.
Scrutinize Log Files
For web admins who have a sophisticated understanding of data and analytics, examining server error log files can help find and fix website errors caused by bots.
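A simple log-scrutiny pass can surface both self-identified bots and IPs making an abnormal number of requests. The sketch below assumes logs in the common combined format; the regex, the bot-marker keywords, and the request threshold are all illustrative assumptions.

```python
import re
from collections import Counter

# Matches combined-format access log lines: capture the client IP at the
# start and the quoted user-agent string at the end.
LOG_PATTERN = re.compile(r'^(\S+) .*" \d{3} \d+ "[^"]*" "([^"]*)"$')

# Substrings that commonly appear in self-identified bot user-agents.
BOT_MARKERS = ("bot", "crawler", "spider", "scraper")

def summarize_log(lines, max_hits=100):
    """Return (IPs over the request threshold, counter of bot user-agents)."""
    hits = Counter()
    bot_agents = Counter()
    for line in lines:
        m = LOG_PATTERN.match(line)
        if not m:
            continue  # skip lines that don't match the expected format
        ip, agent = m.groups()
        hits[ip] += 1
        if any(marker in agent.lower() for marker in BOT_MARKERS):
            bot_agents[agent] += 1
    noisy_ips = [ip for ip, n in hits.items() if n > max_hits]
    return noisy_ips, bot_agents
```

Feeding the output into the IP blocklist described earlier closes the loop: scrutinize, identify, then block.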
How to Detect Bot Traffic in Google Analytics
For publishers using Google Analytics, there are some simple ways to set up your site to filter out bot traffic.
- First, log in to your Google Analytics account.
- Open the Google Analytics Admin panel.
- Navigate to View Settings in the View tab.
- Scroll down to the Bot Filtering checkbox.
- Check the box if it is unchecked.
- Finally, hit Save.
Why Is It Important to Protect Your Ads?
Any website that runs pay-per-click (PPC) ads will at some point be hit by bot traffic in one form or another. It is imperative that publishers take steps to protect their ads, or bot traffic will eventually cause the following issues:
- Website data and analytics may become skewed
- Website load time and performance may begin to deteriorate
- Websites become vulnerable to botnets, DDoS attacks, and ultimately negative SEO results
- CPC is negatively affected, and ultimately revenue may be lost
It is important that you don't ignore bot traffic, as it can become incredibly costly for any company with a digital presence. While there are several techniques to reduce abusive bot traffic, the most successful is to invest in a specialised bot management solution.
Are you a digital publisher who needs help monitoring and detecting bot traffic?
Publift helps digital publishers get the most out of the ads on their websites. Publift has helped its clients realize an average 55% uplift in ad revenue since 2015, through the use of cutting-edge programmatic advertising technology paired with impartial and ethical guidance.
Contact us today to learn more about how Publift can help boost your ad revenue and grow your business!
Knowing Bot Traffic FAQ
1. What Is an Internet Bot?
An internet bot is any non-human visitor to a website. Bots appear and act almost like a human would, except they have been built to perform a specific task by their creator.
2. What Causes Bot Traffic?
Bots can visit a website to determine search engine rankings or to analyze SEO. However, malicious bots can visit a website to steal contact information, create phishing accounts or conduct DDoS attacks.
3. Should You Block Bots?
Not every bot is malicious, and websites should allow bots that visit a site to determine its search ranking or monitor its health. However, websites should use CAPTCHA to block scraping bots, spam bots, or any other bot that will harm a site.
4. Does Bot Traffic Affect SEO?
Malicious bots negatively affect SEO. They do this by slowing a website's load and response times and coordinating DDoS attacks.
5. What is Fake Traffic?
Fake traffic refers to the amount of non-human traffic—bots—that visit a website. These are not real people or customers so they are considered fake.
6. Are Traffic Bots Legal?
Traffic bots are considered legal, but some US state governments have started to take action against malicious bots. If this trend continues, traffic bot legality may be questioned at a national level.
7. What is a Search Bot?
Search engine bots, also known as search bots, are used by search engines to crawl websites and determine how they rank in a user's search results.
8. What is Referral Spam?
Referral spam is when bots create fake traffic on a site to stuff a Google Analytics referral report with spammy links. The goal is to encourage a GA user to click on the link, which then takes them to a malware-ridden site or service scam.
9. Can Direct Traffic Include Crawlers?
In some cases, direct traffic can include search crawlers. Google Analytics tends to filter out most crawlers, but occasionally some are still misrepresented as human traffic.
10. What Percentage Of Internet Traffic is Bots?
Bots make up 66% of all internet traffic with malicious bots accounting for 40% of all traffic, according to research conducted in 2021 by Barracuda Networks.