Automated attacks skyrocket with bad bots making up 40 per cent of all traffic

Most bot traffic originates from public data centres.

Automated bot traffic has increased significantly over the past few years, with bad bots now accounting for 40 per cent of all traffic, according to a new report from Barracuda, a provider of cloud-enabled security solutions.

The new report, Bot attacks: Top Threats and Trends, offers insights into the growing number of automated attacks, exploring emerging traffic patterns, live examples of bot behaviour and detection, and the steps IT teams should take to protect their businesses.

Originally used primarily by search engines, bots now have a variety of uses, both good and bad. Good bots such as search engine crawlers, social network bots, aggregator crawlers and monitoring bots carry out useful tasks, whereas bad bots, or ‘malware bots’, are often used for hacking, spamming, spying and compromising websites.

Analysing internet traffic patterns over the first six months of 2021, the report reveals that only a quarter (25 per cent) of all traffic comes from good bots, while bad bots account for 40 per cent, ranging from basic scrapers used to steal data from applications to inventory hoarding, account takeover attacks, distributed denial of service (DDoS) attacks and advanced persistent bots that seek to evade detection.

Results reveal that most of the bot traffic analysed in the report came from the AWS and Microsoft Azure public clouds, which Barracuda researchers suggest is because it is easy for threat actors to set up accounts with these providers and use them for malicious bot activity.
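One practical takeaway from this finding is that the source network itself is a cheap first signal for defenders. As a minimal sketch, assuming a team wants to flag requests arriving from public-cloud address space, the Python snippet below checks a client IP against a list of data-centre ranges. The CIDR blocks and the is_datacentre_ip helper are illustrative placeholders rather than anything from Barracuda's report; in practice the lists would be loaded from the providers' published ranges (for example, AWS's ip-ranges.json and Azure's downloadable service tags).

# Minimal sketch: flag requests whose source IP sits inside public-cloud
# address space. The CIDR blocks below are placeholders for illustration only.
import ipaddress

EXAMPLE_CLOUD_RANGES = [
    ipaddress.ip_network("3.0.0.0/8"),    # placeholder, AWS-style block
    ipaddress.ip_network("20.0.0.0/8"),   # placeholder, Azure-style block
]

def is_datacentre_ip(client_ip: str) -> bool:
    """Return True if the client address falls inside a listed cloud range."""
    addr = ipaddress.ip_address(client_ip)
    return any(addr in net for net in EXAMPLE_CLOUD_RANGES)

if __name__ == "__main__":
    for ip in ("3.12.45.7", "203.0.113.10"):
        label = "data centre" if is_datacentre_ip(ip) else "other"
        print(f"{ip}: {label}")

A flag like this would not block traffic on its own, but it can feed into rate limiting or challenge decisions alongside other signals.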

According to the report, North America accounted for 67 per cent of bad bot traffic, followed by Europe (22 per cent) and Asia (8 per cent). And although automated, these attacks are designed to follow a normal workday, which allows them to blend into other traffic.

When it comes to targets, Barracuda found that e-commerce apps and login portals are the most common targets of advanced persistent bots, which are harder to detect because they closely imitate human behaviour.

“While some bots like search engine crawlers are good, our research shows that a much larger number of bots are dedicated to carrying out malicious activities at scale,” said Barracuda’s VP of product management, application security, Nitzan Miron.

“When left unchecked, these bad bots can have serious consequences for businesses and ultimately lead to a breach. That’s why it’s critically important to be prepared to detect and block these attacks.”
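As a rough illustration of what "detect and block" can mean at the simplest level, the Python sketch below combines a user-agent screen with a sliding-window rate limit per source IP. The thresholds and keyword list are assumptions made for the example, not figures or recommendations from Barracuda's report, and advanced persistent bots that imitate human behaviour would need behavioural analysis well beyond a check like this.

# Minimal sketch of a first-pass bot screen: reject obvious scripting tools
# by user agent, then apply a per-IP sliding-window rate limit.
import time
from collections import defaultdict, deque

RATE_LIMIT = 100         # assumed: max requests per client per window
WINDOW_SECONDS = 60      # assumed: sliding-window length in seconds
SCRIPT_AGENTS = ("python-requests", "curl", "scrapy")  # illustrative keywords

_request_log = defaultdict(deque)  # client IP -> timestamps of recent requests

def allow_request(client_ip, user_agent, now=None):
    """Return False when a request looks automated and should be blocked."""
    now = time.time() if now is None else now

    # Crude screen: obvious scripting tools are rejected outright.
    if any(keyword in user_agent.lower() for keyword in SCRIPT_AGENTS):
        return False

    # Sliding-window rate limit per source IP.
    recent = _request_log[client_ip]
    while recent and now - recent[0] > WINDOW_SECONDS:
        recent.popleft()
    if len(recent) >= RATE_LIMIT:
        return False
    recent.append(now)
    return True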
