Behavioral Biometrics Can Supplement Traditional Identity Verification

Technology can identify users by how they use their keyboard and mouse.

Artificial intelligence and machine learning are powering a new generation of technology that can identify computer users by the way they handle their keyboard and mouse.

Known as behavioral biometrics, the technology provides a way to continuously authenticate users—guarding against credential theft and account takeover, two of the most common forms of online attacks.

Behavioral biometrics isn’t designed to replace so-called static authentication—the login with a PIN, password, token or conventional biometric. Instead, behavioral biometrics supplements traditional forms of online identity verification, according to attendees at AFCEA’s Federal Identity Summit.

“On average, it takes about 20 minutes of normal user activity [at a computer] to create a profile,” explained Jim Fischer, the federal practice lead for Plurilock. A software agent installed on the endpoint collects data about the user’s behavior and feeds it to Plurilock’s AI engine, which can be installed either on premises or in the cloud.

The profile is built on data about the way users type and manipulate the mouse, which means it can be gathered in the background while the user goes about their business on the network. “It’s basically invisible to the user,” Fischer said.
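
To make the idea concrete, the sketch below shows the kind of keyboard and mouse measurements such a background agent might summarize: how long each key is held (dwell time), the gap between keystrokes (flight time) and pointer speed. It is an illustrative assumption about typical behavioral-biometrics features, not Plurilock’s actual agent or data format.

```python
# Illustrative sketch only -- not Plurilock's implementation. It summarizes the
# kind of keyboard and mouse behavior a background agent might collect.
from dataclasses import dataclass
from statistics import mean

@dataclass
class KeyEvent:
    key: str
    press_time: float      # seconds since session start
    release_time: float

@dataclass
class MouseSample:
    x: float
    y: float
    timestamp: float

def keystroke_features(events: list[KeyEvent]) -> dict:
    """Summarize typing rhythm from a window of key events."""
    dwell = [e.release_time - e.press_time for e in events]  # how long each key is held
    flight = [b.press_time - a.release_time
              for a, b in zip(events, events[1:])]           # gap between consecutive keys
    return {
        "mean_dwell": mean(dwell) if dwell else 0.0,
        "mean_flight": mean(flight) if flight else 0.0,
    }

def mouse_features(samples: list[MouseSample]) -> dict:
    """Summarize pointer speed from a window of mouse samples."""
    speeds = []
    for a, b in zip(samples, samples[1:]):
        dt = b.timestamp - a.timestamp
        if dt > 0:
            dist = ((b.x - a.x) ** 2 + (b.y - a.y) ** 2) ** 0.5
            speeds.append(dist / dt)
    return {"mean_speed": mean(speeds) if speeds else 0.0}
```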

Once the profile is created, the AI engine checks the user’s behavior against it every three to five seconds. The checks generate “a kind of scoring system … a confidence level about whether or not the user really is” the person they logged in as, said Fischer.
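
The periodic check Fischer describes can be pictured as a simple loop: every few seconds, score the latest window of behavior against the stored profile. The scoring function and data structures below are placeholders chosen for illustration, not the vendor’s AI engine.

```python
# Hypothetical sketch of a continuous-authentication check loop. The profile is
# assumed to be a dict of expected feature values, e.g. from keystroke_features().
import time

CHECK_INTERVAL_SECONDS = 4  # the article cites checks every three to five seconds

def confidence_score(profile: dict, observed: dict) -> float:
    """Crude similarity score: 1.0 means observed behavior matches the profile."""
    diffs = []
    for feature, expected in profile.items():
        seen = observed.get(feature, 0.0)
        if expected:
            diffs.append(min(abs(seen - expected) / expected, 1.0))
    return 1.0 - (sum(diffs) / len(diffs)) if diffs else 0.0

def continuous_check(profile: dict, collect_window) -> None:
    """Score each new window of behavior against the user's profile."""
    while True:
        observed = collect_window()        # e.g. keystroke_features(...) above
        score = confidence_score(profile, observed)
        print(f"confidence={score:.2f}")
        time.sleep(CHECK_INTERVAL_SECONDS)
```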

Although the initial profile is generated within minutes, “the software keeps learning. … It takes about a week of normal activity for the profile to become fully mature.”

Anomalous behavior—like a user typing much faster or slower than normal—triggers a response that, depending on how system administrators configure the software, can range from a simple alert to the network’s security operations center, through a challenge to the user to log in again, all the way to kicking the user off altogether.

Fischer said administrators can calibrate the response based on the scoring system and other factors like, “Is the user a privileged user? Are they working on sensitive data? What’s their location? … It’s very granular.”
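
A minimal sketch of that kind of graduated, context-aware policy is shown below. The thresholds, field names and actions are hypothetical, chosen only to illustrate how a confidence score plus factors such as privilege, data sensitivity and location could map to an alert, a re-authentication challenge or session termination; they do not reflect a real product configuration.

```python
# Hypothetical policy table for the graduated responses described above.
from dataclasses import dataclass

@dataclass
class SessionContext:
    score: float             # confidence from the behavioral check
    privileged_user: bool
    sensitive_data: bool
    unusual_location: bool

def choose_response(ctx: SessionContext) -> str:
    """Map the confidence score plus context to an escalating response."""
    # Privileged users, sensitive data and odd locations shrink the tolerance
    # for low scores by raising the threshold.
    threshold = 0.6
    if ctx.privileged_user or ctx.sensitive_data:
        threshold = 0.75
    if ctx.unusual_location:
        threshold += 0.1

    if ctx.score >= threshold:
        return "allow"                     # behavior matches the profile
    if ctx.score >= threshold - 0.2:
        return "alert_soc"                 # notify the security operations center
    if ctx.score >= threshold - 0.4:
        return "challenge_reauthenticate"  # force the user to log in again
    return "terminate_session"             # kick the user off

# Example: a privileged user with a middling score is challenged rather than dropped.
print(choose_response(SessionContext(score=0.45, privileged_user=True,
                                     sensitive_data=False, unusual_location=False)))
```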

He said the speed at which the software reacts was key to its value proposition. “It’s an automatic response generated within seconds,” he said. “You have to act at machine speed to mitigate [cyber] attacks because the damage can be done within minutes or even seconds.”

According to the International Biometrics and Identity Association—a trade group for identity vendors—behavioral biometrics have already been deployed in the commercial sector in the Internet banking, e-commerce and online payments markets.

But Plurilock and other vendors hope to bring it to the federal market, and Fischer said his company’s technology was piloted last year by the Defense Information Systems Agency, or DISA, and successfully field-tested this summer by the Army’s Network Enterprise Technology Command, or NETCOM, based at Fort Huachuca, Arizona.

Similar technology was examined several years ago by the Defense Advanced Research Projects Agency (DARPA), he said, but was plagued by “too many false positives … The [AI] engine wasn’t sophisticated enough” to distinguish between normal variations in user behavior and anomalies that might mean an attacker had stolen credentials and was impersonating the legitimate user.

“When I’m caffeinated in the mornings, my interactions [with the keyboard] are different,” Fischer said.