Next-Generation Cybersecurity Telemetry Offers Promise
Across the federal government, agencies are dealing with an explosion of cybersecurity data from new sensors, hyper-scale cloud infrastructure, microservices and a geographically distributed workforce—and the pace shows no sign of slowing.
Automation gives agencies the ability to process and analyze these massive workloads, but automated tools that are not deployed and managed with proper expertise can add complexity and risk.
That, in turn, points to an even greater issue: the hundreds of thousands of open jobs in cyber analytics and related domains. This labor shortage has persisted for more than a decade and only accelerated during the pandemic.
Finally, cyber attacks are evolving so rapidly that today’s sophisticated campaigns reportedly make ransomware look like child’s play. Yet far more dangerous are the mountains of poorly written software, misconfigured systems, shortcuts and ignored problems that leave infrastructure vulnerable in the first place.
Security and Risk Management (SRM) leaders are always looking to improve the productivity of infrastructure security operations, along with detection and response. Too often they face a sprawl of security tools from different vendors with subpar integration and little automation for incident response, and those tools frequently operate in isolated data silos without visibility or correlation. How can SRM leaders ensure that their agencies’ information assets are secure, that their cybersecurity protocols are clearly defined and measurable, and that their cyber workforce is empowered to use advanced capabilities to deter sophisticated cyber attacks?
The answer: telemetry, the use of automation to manage communications across multiple data sources and speed the detection of threats. What matters is not just the collection of data but how that data is used at wire speed, including alert and incident correlation as well as built-in automation. One approach that has been used successfully follows six steps:
Automation-friendly incident response plans (IRPs)—for automation to work at scale, an agency needs tightly coupled processes baked into its IRP. The model IRP must incorporate industry-leading standards, a detailed understanding of the information technology (IT) environment and mission, and the ability to respond to emerging threats with agility and speed.
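What “automation-friendly” can look like in practice is an IRP whose steps are encoded as structured data an orchestration tool can execute, not prose a responder must interpret mid-incident. The sketch below is a minimal, hypothetical illustration; the trigger names, actions and approval workflow are invented for the example.

```python
# Minimal, hypothetical sketch of an automation-friendly IRP: each
# playbook step is structured data a SOAR tool could execute, rather
# than prose a human must interpret mid-incident.
from dataclasses import dataclass

@dataclass
class PlaybookStep:
    trigger: str             # detection that activates this step
    action: str              # automated action to run
    requires_approval: bool  # True = pause for a human decision
    escalation_contact: str = "soc-on-call"

RANSOMWARE_PLAYBOOK = [
    PlaybookStep("edr:mass_file_encryption", "isolate_host", False),
    PlaybookStep("edr:mass_file_encryption", "snapshot_disk", False),
    PlaybookStep("any", "notify_incident_commander", False),
    PlaybookStep("confirmed_ransomware", "block_c2_domains", True),
]

def run(playbook: list, trigger: str) -> None:
    """Execute every step whose trigger matches the incoming detection."""
    for step in playbook:
        if step.trigger in (trigger, "any"):
            if step.requires_approval:
                print(f"PENDING APPROVAL: {step.action} -> {step.escalation_contact}")
            else:
                print(f"AUTO-EXECUTE: {step.action}")

run(RANSOMWARE_PLAYBOOK, "edr:mass_file_encryption")
```

The useful property is that the playbook a human reviews is the same one the machine runs, so tabletop exercises and live automation cannot drift apart.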
Strategic application—for years, government agencies have been tasked with doing more with less. Decision-makers must empower people and machines to do what each does best, and the clock is ticking: wire-speed attacks demand wire-speed responses, which means automation coupled with human expertise. Machine learning tools can handle detection and pattern matching in fractions of a second, speeding and sharpening the human decision on how best to respond to a cyber attack.
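To make that division of labor concrete, the sketch below uses an off-the-shelf anomaly detector, scikit-learn’s IsolationForest, to triage event volume so analysts see only the outliers. The tool choice and the login-event features are assumptions made for illustration, not a statement of what any agency runs.

```python
# Hypothetical triage sketch: the machine scores every event at machine
# speed; the human judgment is reserved for the handful it escalates.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
# Illustrative features per login event: [hour_of_day, bytes_out_mb, failed_attempts]
baseline = rng.normal([13, 2.0, 0.2], [3, 1.0, 0.5], size=(5000, 3))
model = IsolationForest(contamination=0.01, random_state=0).fit(baseline)

new_events = np.array([
    [14, 2.1, 0.0],    # routine midday login
    [3, 250.0, 9.0],   # 3 a.m. login, huge upload, repeated failures
])
for event, verdict in zip(new_events, model.predict(new_events)):
    label = "ESCALATE TO ANALYST" if verdict == -1 else "auto-close"
    print(event, "->", label)
```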
Detection as code as the norm—“Detection as Code” is gaining popularity in the cybersecurity community, and rightfully so. It is one way to automate security analytics and help agencies move from manual to automated threat detection. By incorporating threat detection into the software development process, agencies can dramatically improve their cybersecurity postures. Much like DevOps, the approach changes the way teams operate: integrating threat detection with software engineering lets development teams tailor alerts and responses to their own code rather than rely on cyber tools crawling through the mountains of log data generated by modern applications and infrastructure. The approach is already being incorporated into cyber range work for the U.S. Army, and there is movement toward applying it to the Department of Homeland Security’s Continuous Diagnostics and Mitigation (CDM) program, where implementing CDM capabilities as a true enterprise service on agency networks helps automate detection and enables automatic reporting to the CDM Federal Dashboard.
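As a hedged illustration of the model, a detection-as-code rule is just a version-controlled function that ships with its own tests and moves through review and deployment like any other code. The log fields and rule below are hypothetical, patterned on open approaches such as Sigma rules, not drawn from the Army or CDM work described above.

```python
# Hypothetical detection-as-code rule: a plain function, kept in git,
# unit-tested in CI and deployed through the same pipeline as the
# application it protects. Field names are illustrative.
def rule_impossible_travel(event: dict) -> bool:
    """Fire when one account logs in from two countries within an hour."""
    return (
        event.get("event_type") == "login.success"
        and event.get("prior_country") is not None
        and event.get("country") != event.get("prior_country")
        and event.get("minutes_since_prior_login", 9999) < 60
    )

# The test ships with the rule, so a bad edit fails the build, not the SOC.
def test_rule_impossible_travel():
    hit = {"event_type": "login.success", "country": "RO",
           "prior_country": "US", "minutes_since_prior_login": 12}
    miss = dict(hit, minutes_since_prior_login=600)
    assert rule_impossible_travel(hit)
    assert not rule_impossible_travel(miss)

test_rule_impossible_travel()
print("detection rule tests passed")
```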
Continued CI/CD—continuous integration/continuous deployment must continue, along with accelerated implementation of security testing as part of the CI/CD pipeline. This approach improves existing cyber defenses and drives fresh innovation, creating safer, large-volume data processing workflows.
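One way that looks in practice is a pipeline stage that fails the build on any security finding, exactly as a failing unit test would. The sketch below is illustrative; the scanner commands are placeholders for whatever tools an agency has standardized on.

```python
# Hypothetical CI gate: run security checks as pipeline stages and block
# promotion on any finding. Scanner commands are placeholders.
import subprocess
import sys

CHECKS = [
    ("dependency audit", ["pip-audit"]),           # known-vulnerable packages
    ("static analysis", ["bandit", "-r", "src"]),  # insecure code patterns
]

def main() -> int:
    for name, cmd in CHECKS:
        print(f"--- {name}: {' '.join(cmd)}")
        result = subprocess.run(cmd)
        if result.returncode != 0:
            print(f"FAILED: {name} -- blocking deployment")
            return 1
    print("all security gates passed; promoting build")
    return 0

if __name__ == "__main__":
    sys.exit(main())
```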
Centralization—most agencies operate in multiple cloud environments, which creates several challenges for information technology leaders and expands the threat surface. To reduce costs and manage cyber risk, agencies must avoid shuttling data from environment to environment. New approaches to cyber analytics allow agencies to deploy centralized consoles in conjunction with search infrastructure distributed across multiple cloud platforms. Distributing analytic processing across clouds significantly reduces data movement and the costs associated with cloud data egress.
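A rough sketch of the pattern: the central console pushes the query to a search engine co-located with each data set and pulls back only small, aggregated results, so raw logs never cross cloud boundaries. The endpoints and result shape below are invented for the example.

```python
# Hypothetical "query in place" sketch: a central console fans a search
# out to engines co-located with the data in each cloud and collects
# only aggregated results, avoiding bulk log egress.
from concurrent.futures import ThreadPoolExecutor

CLOUD_SEARCH_NODES = {
    "aws-gov": "https://search.aws.example.gov/query",
    "azure-gov": "https://search.azure.example.gov/query",
    "on-prem": "https://search.dc1.example.gov/query",
}

def search_in_place(node: str, endpoint: str, query: str) -> dict:
    # Placeholder for an HTTP call to the remote engine; in practice only
    # this small summary, never raw logs, crosses the cloud boundary.
    return {"node": node, "endpoint": endpoint, "matches": 0, "query": query}

def federated_search(query: str) -> list:
    # Fan out concurrently; each cloud scans its own data locally.
    with ThreadPoolExecutor() as pool:
        futures = [
            pool.submit(search_in_place, node, endpoint, query)
            for node, endpoint in CLOUD_SEARCH_NODES.items()
        ]
        return [f.result() for f in futures]

for result in federated_search("event_type:login.failure | count by source_ip"):
    print(result["node"], "->", result["matches"], "matches")
```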
Federated cloud computing—as multicloud environments proliferate, so will federated cloud computing. And it makes sense: the deployment and management of multiple external and internal cloud services can accelerate mission success safely and securely. Cloud providers can also offer high-speed analytics capabilities that work on premises, eliminating the need for data transfer between environments, and the risk that comes with it.
The goal of using advanced cybersecurity telemetry to detect and deter threats is within reach. All it takes is a coordinated focus on harnessing the promise of data and the power of automation to defend at machine speed and deny adversaries the opportunity to win.
Dan Smith is vice president of strategic initiatives at ManTech.
ManTech sponsors AFCEA International’s annual The Cyber Edge writing contest, open to thought leaders and subject matter experts in the military, government, academia and industry. This year’s contest theme is Emerging Technologies in the Cyber Realm. Three authors will win monetary prizes and publication in SIGNAL Media. Top prize is $5,000; second prize is $2,000; and third prize is $1,000. Submissions are due February 28.