Not all bots are bad. But in the wrong hands, botnets can be commanded to do some very nefarious things, such as Distributed Denial of Service (DDoS) attacks that disrupt and bring down websites. Malware-based bots are also increasingly being used to steal data and personal information.

By some accounts, bots generate 40 percent of many businesses' online traffic. That is a strong incentive for Federal managers to take a proactive stance in identifying and mitigating bad bots, especially as more government services move online and public-sector websites become more interactive.

“Right now, many Federal websites are one-way consumption,” said Drew Reinders, a solution engineer manager with Akamai’s public sector and Latin America markets. “As we expose more two-way interaction with public-facing sites where user names and passwords are required, that is where the risk will increase.”

In fact, President Trump’s Executive Order on Strengthening the Cybersecurity of Federal Networks and Critical Infrastructure, signed on May 11, 2017, called for “resilience against botnets and other automated, distributed threats.” The order directed the Secretary of Commerce, together with the Secretary of Homeland Security, to “lead an open and transparent process to identify and promote action by appropriate stakeholders” with the goal of “dramatically reducing threats perpetrated by automated and distributed attacks (e.g., botnets).”

Given the prevalence and variety of bots online, agencies cannot rely solely on blocking or allowing bots; they need technology that analyzes bot behavior and distinguishes benign traffic from malicious traffic, industry experts said.

A bot, or Internet robot, is simply a piece of software running on a server connected to the Internet. As with any software, bot operators, ranging from individuals to criminal organizations to legitimate businesses, create bots to perform a wide variety of tasks, according to Akamai, from search-engine indexing and website monitoring to distributed denial-of-service attacks.

Based on conversations with Federal managers, Reinders sees a need for bot management tools that provide a range of capabilities, such as the ability to:

  • Detect known and unknown bots. Artificial intelligence (AI)-based technology that analyzes bot behavior is well suited here.
  • Categorize bots based on business impact. Bot operators are becoming more sophisticated and can harness the power of the entire Internet, so blocking at the network layer is no longer effective on its own.
  • Apply policies based on the type of bot (see the sketch after this list).
  • Employ a wide range of detection techniques.
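
To make the policy capability concrete, here is a minimal Python sketch of what applying policies by bot type might look like. The category names, actions, and policy table are all invented for illustration; this is not drawn from Akamai's or any vendor's product.

    from dataclasses import dataclass
    from enum import Enum

    class BotCategory(Enum):
        SEARCH_ENGINE = "search_engine"            # well-known crawlers
        MONITORING = "monitoring"                  # uptime/performance checkers
        SCRAPER = "scraper"                        # content or price scrapers
        CREDENTIAL_STUFFER = "credential_stuffer"  # login-abuse bots
        UNKNOWN = "unknown"

    class Action(Enum):
        ALLOW = "allow"
        RATE_LIMIT = "rate_limit"
        CHALLENGE = "challenge"   # e.g., serve a CAPTCHA or JavaScript check
        BLOCK = "block"

    # Hypothetical policy table mapping each bot category to an action.
    POLICY = {
        BotCategory.SEARCH_ENGINE: Action.ALLOW,
        BotCategory.MONITORING: Action.ALLOW,
        BotCategory.SCRAPER: Action.RATE_LIMIT,
        BotCategory.CREDENTIAL_STUFFER: Action.BLOCK,
        BotCategory.UNKNOWN: Action.CHALLENGE,
    }

    @dataclass
    class Request:
        source_ip: str
        user_agent: str
        category: BotCategory  # assumed to come from an upstream detector

    def apply_policy(request: Request) -> Action:
        """Look up the action for the request's bot category."""
        return POLICY.get(request.category, Action.CHALLENGE)

    if __name__ == "__main__":
        req = Request("203.0.113.7", "example-bot/1.0", BotCategory.SCRAPER)
        print(apply_policy(req))  # Action.RATE_LIMIT

The point of the table-driven approach is that detection and response stay separate: a good bot is allowed through, a suspected scraper is slowed down rather than blocked outright, and anything unclassified gets a challenge instead of an outright denial.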

“Machine learning, a form of AI in which systems learn from data and improve over time, is an excellent technology that Federal agencies can deploy to analyze bot behavior,” said Sven Krasser, chief scientist with CrowdStrike, a provider of cloud-based endpoint protection tools that employ machine learning capabilities.

“Benign entities and malicious entities tend to differ,” he continued. “Machine learning works well for picking those differences up, especially if you have a nice cloud tie-in. The cloud gives you a nice vantage point because you are not just looking at the data on one system. You are looking at the behavior on many systems and then the data can be correlated for more informed analysis.”
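
As a rough illustration of that cross-system vantage point, the Python sketch below uses invented event records and an arbitrary threshold to show how correlating the same behavior across many hosts can surface a source that looks harmless on any single system. It is not CrowdStrike's telemetry format or logic.

    from collections import defaultdict

    # Hypothetical event records as they might arrive from many endpoints.
    # Field names and values are illustrative only.
    events = [
        {"host": "web-01", "source_ip": "198.51.100.9", "event": "login_failure"},
        {"host": "web-02", "source_ip": "198.51.100.9", "event": "login_failure"},
        {"host": "web-03", "source_ip": "198.51.100.9", "event": "login_failure"},
        {"host": "web-01", "source_ip": "192.0.2.44",  "event": "login_failure"},
    ]

    def correlate(events, host_threshold=3):
        """Count how many distinct hosts each source IP failed logins on.

        An IP that fails once on one host looks like a typo; an IP that
        fails on many hosts at once looks like a distributed bot.
        """
        hosts_per_ip = defaultdict(set)
        for e in events:
            if e["event"] == "login_failure":
                hosts_per_ip[e["source_ip"]].add(e["host"])
        return [ip for ip, hosts in hosts_per_ip.items()
                if len(hosts) >= host_threshold]

    print(correlate(events))  # ['198.51.100.9']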

Bot detection is complicated because security teams don’t always know what they are looking for. But if they can collect a very broad set of detailed event data, a machine learning algorithm can help analysts pinpoint the most important signals in that data.

“The security space has a lot of good data, and with the emergence of cloud there is a nice collection of consolidated data we can work with,” Krasser said. It is hard for humans to wrap their heads around all these nuances. Machine learning can take large data sets, extract the gist of what is important, and find the things that a person intuitively knows should stand out in a data set, Krasser explained. The key will be integrating technologies such as machine learning with human operators to achieve optimal results.
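
One simple way to picture that is a classifier trained on broad event data whose feature importances show which signals matter most. The Python sketch below uses synthetic session features and made-up distributions purely for illustration; it is not CrowdStrike's methodology.

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    rng = np.random.default_rng(0)
    n = 2000

    # Synthetic per-session features: requests per minute, distinct URLs hit,
    # average seconds between clicks, and share of login-page requests.
    benign = np.column_stack([
        rng.normal(4, 1.5, n),     # requests_per_min
        rng.normal(8, 3, n),       # distinct_urls
        rng.normal(12, 4, n),      # avg_think_time_s
        rng.normal(0.05, 0.02, n), # login_request_ratio
    ])
    bots = np.column_stack([
        rng.normal(40, 10, n),     # much faster request rate
        rng.normal(3, 1, n),       # hammer a few URLs
        rng.normal(0.5, 0.2, n),   # almost no think time
        rng.normal(0.6, 0.1, n),   # mostly hitting the login page
    ])

    X = np.vstack([benign, bots])
    y = np.array([0] * n + [1] * n)  # 0 = benign session, 1 = bot session

    model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

    # Rank which features the model found most useful for telling the
    # two classes apart -- the "what should stand out" signal.
    features = ["requests_per_min", "distinct_urls",
                "avg_think_time_s", "login_request_ratio"]
    for name, importance in sorted(zip(features, model.feature_importances_),
                                   key=lambda p: -p[1]):
        print(f"{name}: {importance:.2f}")

On data like this, the ranking tells an analyst which handful of behavioral measurements to watch, rather than asking a person to eyeball every field in every event.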
