Everything to Know About Bot Traffic

Publift
April 15, 2021


What Is Bot Traffic?

Bot traffic describes any non-human traffic that visits a website. Whether the website is a hugely popular news site or a small, newly published startup, the site is bound to be visited by a certain number of bots over time.

While the term 'bot traffic' is often assumed to be inherently harmful, this is not always the case. Some bot traffic certainly is malicious and can skew analytics data. These malicious bots can be used for credential stuffing, data scraping, and in some cases, even launching distributed denial of service (DDoS) attacks.

However, web robots are also essential to the operation of services such as search engines and digital assistants. Therefore, digital publishers need to use their analytics data to distinguish human behavior from bot activity, and to sort the good bots from the bad and the ugly.


Types of Bots to Watch Out For

As mentioned, certain bots are required for the operation and optimal performance of search engines and digital assistants. However, other bots are designed explicitly to damage websites and degrade user experience.

The types of bots to watch out for include:

Click Bots

Click bots carry out click spam by making fraudulent ad clicks. For most web publishers, particularly those running pay-per-click (PPC) ads, this is the most damaging type of bot. Click bots skew analytics data by mimicking genuine web traffic, eroding ad budgets with no benefit to the publisher.

Download Bots

Similar to click bots, download bots interfere with genuine user engagement data. Rather than inflating the ad click count, however, they inflate the download count. This matters most when a publisher uses downloads in a marketing funnel, for example, a free e-book. Download bots create phony downloads, leading to false performance data.

Spam Bots



Spambots are the most common type of bot. A spambot's purpose is often to scrape contact information, including email addresses and phone numbers, create fake user accounts, or operate stolen social media accounts. They also disrupt user engagement by distributing unwanted content, such as:

  • Spam comments, including referral spam
  • Phishing emails
  • Ads
  • Website redirects
  • Negative SEO against competitors

Spy Bots

Spy bots are so named because they act precisely as spies do: they steal data and information, such as email addresses, from websites, chat rooms, social media sites, and forums.

Scraper Bots

Scraper bots visit websites with the sole malicious intent of stealing publishers' content, and they pose a real threat to a business and its web pages. Operated by third-party scrapers, often on behalf of business competitors, they steal valuable content, such as product lists and prices, which is then repurposed and republished on competing sites.

Imposter Bots

Imposter bots replicate human behavior by appearing as genuine website visitors. They are designed to bypass online security measures, and they are the bots most often responsible for DDoS attacks.

What Is Good Bot Traffic?



While the above examples are undoubtedly cases of harmful bot traffic, what are some instances of good bot traffic?

The following bots are legitimate and provide helpful services for websites and applications.

Search Engine Bots

Search engine bots are the most obvious and well-known of the 'good' bots. Search engine bots crawl the web and help website owners get their sites listed in search results on Google, Yahoo, and Bing. These bots are helpful SEO tools.

Monitoring Bots

Monitoring bots help publishers ensure their website is healthy and accessible while operating at peak performance. Monitoring bots operate by automatically pinging the site to ensure it is still online. If anything breaks or the site goes offline, the publisher will be automatically notified, making these bots very useful to site owners.

SEO Crawlers

SEO crawlers are programs that crawl a website and its competitors to provide data and analytics on page views, users, and content. Web admins can then use these reports to plan their content and improve their referral traffic, search visibility, and organic traffic.

Copyright Bots

Copyright bots crawl the internet scanning for copyrighted images, to ensure no one is using copyrighted content without permission.

How Does Bot Traffic Affect Websites?

Bot traffic can effectively destroy a business that doesn't learn to identify and manage it. Sites that rely on advertising, as well as sites that sell products or merchandise with limited inventory, are particularly vulnerable.

For sites running ads, bots that land on the site and click on page elements can trigger fake ad clicks. This is known as click fraud, and while it may initially increase ad revenue, online advertising networks are adept at detecting it; once they do, the site and its owner will usually be banned from the network.

For ecommerce sites with limited inventory, inventory-hoarding bots can virtually shut down a shop by filling carts with merchandise, making it unavailable for purchase by genuine shoppers.

How to Identify Bot Traffic

As we move into an increasingly tech-driven future, bots are getting smarter by the day. A report released by Imperva in 2020 found that bots comprised almost 40% of internet traffic, with bad bots responsible for the larger share.

Web publishers and designers can identify bot traffic by examining the network requests to their sites. An integrated analytics tool such as Google Analytics will further help website owners identify bots in their traffic.
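One widely documented check when examining those requests is to verify that a visitor claiming to be a search engine crawler is genuine. Google, for example, recommends a reverse DNS lookup on the requesting IP, followed by a forward lookup to confirm the result. Below is a minimal Node.js sketch of that check in TypeScript; treat it as an illustration rather than production code.

    // Verify a visitor claiming to be Googlebot via reverse DNS.
    // Google's documented check: the PTR hostname should end in
    // googlebot.com or google.com, and must resolve back to the same IP.
    import { reverse, resolve4 } from "dns/promises";

    async function isGenuineGooglebot(ip: string): Promise<boolean> {
      try {
        const hosts = await reverse(ip);
        const host = hosts.find(
          (h) => h.endsWith(".googlebot.com") || h.endsWith(".google.com")
        );
        if (!host) return false;       // PTR record doesn't match Google
        const forward = await resolve4(host);
        return forward.includes(ip);   // forward-confirm the hostname
      } catch {
        return false;                  // DNS failure: treat as unverified
      }
    }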

The hallmarks of bot traffic include the following characteristics:

Abnormally High Pageviews

When a website sees a sudden, unexpected, and unprecedented spike in pageviews, bots are generally the culprit.

Abnormally High Bounce Rate

Bounce rate is a metric that measures the percentage of visitors who land on a page of your website and leave without taking any action. An unexpected increase in bounce rate can indicate that bots are being directed to a single page.

Surprisingly High or Low Session Duration

Session duration is the amount of time users spend on a website per visit. Human behavior dictates that this should remain relatively steady. A sudden, unexpected increase in session duration likely indicates a bot browsing the site at an unusually slow rate, while an unusually low session duration can indicate a bot crawling pages much faster than a human would.

Junk Conversions

Junk conversions appear as a surge in fraudulent conversions: accounts created with nonsensical email addresses, or contact forms filled out with fake names, phone numbers, or addresses.

Spike in Traffic From an Unexpected Location

A sudden spike in website traffic from a specific geographic region, particularly an area that is unlikely to have native speakers of the language the site is written in, is another standard indicator of bot traffic.
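Taken together, these hallmarks can be turned into simple automated checks. The TypeScript sketch below flags suspicious sessions from a hypothetical analytics export; the record shape and the thresholds are assumptions to be tuned against your own site's baseline.

    // Flag sessions matching the bot-traffic hallmarks described above.
    // SessionRecord and all thresholds are illustrative assumptions.
    interface SessionRecord {
      ip: string;
      pageviews: number;
      durationSeconds: number;
      bounced: boolean;
    }

    const MAX_PAGEVIEWS = 50;   // far beyond a typical human session
    const MIN_DURATION = 2;     // seconds; faster suggests rapid crawling
    const MAX_DURATION = 3600;  // an hour-long "visit" is suspicious

    function looksLikeBot(s: SessionRecord): boolean {
      return (
        s.pageviews > MAX_PAGEVIEWS ||
        s.durationSeconds < MIN_DURATION ||
        s.durationSeconds > MAX_DURATION
      );
    }

    function flagSuspiciousSessions(sessions: SessionRecord[]): SessionRecord[] {
      return sessions.filter(looksLikeBot);
    }

    // Aggregate signal: an abnormally high bounce rate across sessions.
    function bounceRate(sessions: SessionRecord[]): number {
      const bounces = sessions.filter((s) => s.bounced).length;
      return sessions.length ? bounces / sessions.length : 0;
    }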


How to Stop Bot Traffic

Once a company or agency has learned to identify bot traffic, it needs the knowledge and tools to stop bot traffic from negatively affecting its site.

The following tools will help minimize threats:

  1. Legitimate Arbitrage

Traffic arbitrage is the practice of paying to bring traffic to a website in order to profit from high-yielding PPC/CPM-based campaigns. By purchasing traffic only from known, reputable sources, site owners can reduce the risk of bad bot traffic.

  2. Use Robots.txt

A properly configured robots.txt file tells crawlers which parts of a site they may visit. It helps keep well-behaved bots away from restricted pages, though malicious bots routinely ignore it, so it should not be the only line of defense.
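As a minimal sketch, the file below allows compliant crawlers everywhere except a private directory and bans one crawler outright; 'SomeScraperBot' is a placeholder name, and the paths are assumptions.

    # robots.txt - honored only by compliant crawlers
    User-agent: SomeScraperBot
    Disallow: /

    User-agent: *
    Disallow: /private/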

  3. JavaScript for Alerts

Site owners can add contextual JavaScript (JS) that alerts them whenever a bot appears to enter the website.
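A minimal browser-side sketch follows. The navigator.webdriver flag is a standard signal set by many automation tools; the /bot-alert endpoint and the payload shape are assumptions for illustration.

    // Report likely automated visitors to a hypothetical endpoint.
    function reportLikelyBot(): void {
      const signals = {
        webdriver: navigator.webdriver === true, // set by many automation tools
        noLanguages: navigator.languages.length === 0,
        userAgent: navigator.userAgent,
      };
      if (signals.webdriver || signals.noLanguages) {
        // sendBeacon posts asynchronously without delaying the page
        navigator.sendBeacon("/bot-alert", JSON.stringify(signals));
      }
    }

    reportLikelyBot();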

  4. DDoS Lists

Publishers can compile a blocklist of offending IP addresses and deny requests from those addresses, reducing their exposure to DDoS attacks.
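As a sketch of how such a list might be enforced at the application layer, the Express middleware below rejects requests from blocklisted addresses; the IPs shown are reserved documentation addresses (RFC 5737), not real offenders.

    // Deny requests from a manually maintained IP blocklist.
    import express, { Request, Response, NextFunction } from "express";

    const blockedIps = new Set<string>(["203.0.113.7", "198.51.100.23"]);

    const app = express();

    app.use((req: Request, res: Response, next: NextFunction) => {
      if (blockedIps.has(req.ip ?? "")) {
        res.status(403).send("Forbidden"); // drop the blocklisted visitor
        return;
      }
      next();
    });

    app.listen(3000);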

  5. Use Challenge-Response Tests

One of the simplest and most common ways to detect bot traffic is to use a CAPTCHA on sign-up or download forms. This is particularly useful for stopping download bots and spambots.
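On the server side, the form handler should verify the CAPTCHA token before accepting the submission. The sketch below uses Google reCAPTCHA's siteverify endpoint; the RECAPTCHA_SECRET environment variable is an assumption, and other CAPTCHA providers expose similar verification APIs.

    // Verify a reCAPTCHA token submitted with a form (Node 18+ fetch).
    async function verifyCaptcha(token: string): Promise<boolean> {
      const res = await fetch("https://www.google.com/recaptcha/api/siteverify", {
        method: "POST",
        body: new URLSearchParams({
          secret: process.env.RECAPTCHA_SECRET ?? "", // assumed env variable
          response: token,
        }),
      });
      const data = (await res.json()) as { success: boolean };
      return data.success; // reject the submission when false
    }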

  6. Scrutinize Log Files

For web admins with a sophisticated understanding of data and analytics, examining server log files (both access and error logs) can help identify bot activity and find and fix website errors caused by bots.
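A simple starting point is to tally requests per IP address and review the heaviest hitters. The sketch below reads an Nginx-style access log; the log path and the threshold are assumptions to adapt to your own server.

    // Count requests per IP in an access log and flag heavy hitters.
    import { readFileSync } from "fs";

    const LOG_PATH = "/var/log/nginx/access.log"; // adjust for your server
    const THRESHOLD = 1000;                       // requests per log window

    const counts = new Map<string, number>();
    for (const line of readFileSync(LOG_PATH, "utf8").split("\n")) {
      const ip = line.split(" ")[0]; // first field is the client IP
      if (ip) counts.set(ip, (counts.get(ip) ?? 0) + 1);
    }

    for (const [ip, n] of [...counts.entries()].sort((a, b) => b[1] - a[1])) {
      if (n >= THRESHOLD) console.log(`${ip}: ${n} requests - review for bots`);
    }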


How to Detect Bot Traffic in Google Analytics



For publishers using Google Analytics, there are some simple ways to set up your site to filter out bot traffic.

  1. Firstly, visit the Google Analytics Admin Panel. 
  2. Next, navigate to View Settings in the View tab.
  3. Scroll down to the Bot Filtering checkbox.
  4. Tick the checkbox if it is unchecked.
  5. Finally, hit Save.


Why Is It Important to Protect Your Ads?

Any website running pay-per-click ads will at some point be hit by bot traffic of one form or another. It is imperative that publishers take steps to protect their ads, or bot traffic will eventually cause the following issues:

  • Website data and analytics may become skewed
  • Website load time and performance may begin to deteriorate
  • Websites become vulnerable to botnets, DDoS attacks, and ultimately negative SEO results
  • CPC is negatively affected, and ultimately revenue may be lost

Are you a digital publisher who needs help monitoring bots?

At Publift, we pair simplified, cutting-edge programmatic advertising technology with impartial guidance to help our clients understand the ad tech landscape and get the most out of the ads on their websites.
Contact our friendly team to learn more today.
