Cyber Data Sharing Is Tinisha McMillan’s Mission
DISA’s division chief improves cyber situational awareness.
Network data collection, analysis and sharing are core to cyber defense, and Tinisha McMillan is on a mission to improve all three.
As division chief for the Cyber Situational Awareness and NetOps Division within the Defense Information Systems Agency (DISA), McMillan is responsible for building and providing cyber analytics and tools to enhance the department’s cyber information sharing to protect the Department of Defense Information Network (DODIN).
The division is the largest within the agency’s Cyber Development Directorate and has a unique mission to provide capabilities both to the agency itself and across the department. McMillan’s team works with Joint Force Headquarters-DODIN (JFHQ-DODIN), the four military services, the Joint Cyber Command and Control Office and even the Department of Energy.
One of McMillan’s initiatives is to streamline the number of cyber tools within the agency. This spring, she passed her team’s recommendations to the agency’s senior leadership. “One of the things our agency has been looking at is whether any duplicative cyber tools exist within the agency, and looking for efficiencies. Our team has been responsible for identifying what cyber tools have been procured within the agency and providing a recommendation to senior leadership where there is duplicative effort,” she explains. “Given our fiscal climate, it’s incumbent upon us to put forth those recommendations where we might have duplicative efforts and where recommended cyber tools can be leveraged to meet increased requirements in mission activities.”
And the team left no cyber tool unturned. “A large gamut of categories was identified across the agency that the seniors are starting to take a look at. For example, how we’re doing incident response management, or how we’re doing our ticketing system, and whether there were any duplicative efforts going on.”
Delivering the recommendations to senior leaders, she says, is just the beginning of the process, and she cannot yet estimate how many tools might ultimately be eliminated. “We’re still in the conversation phase right now. I don’t think the percentage of efficiencies has truly been identified yet,” she says. “As that evolves, definitely the efficiency, if we acted upon some of those convergence efforts, would truly be recognized.”
Her team also is developing an enterprise sensor strategy. For example, she indicates, the sensors could stay at Internet access points, or could be placed elsewhere for a more comprehensive view of the network. “We have been trying to sit down with our operators and understand their requirements and where we want to go with our sensing strategy, [and] where we place our sensors,” she states. “In the past, we’ve been more reactionary. I’m trying to very much be proactive in how we’re looking at the problem set.”
The strategy is designed to evolve over time. “As different threats emerge, you might have to tailor it in the future. It is an evolving process that we will continue to enhance in partnership with our stakeholders,” she offers.
Additionally, her division has been leading the department in an effort to improve efficiencies. To accomplish that goal, McMillan’s team is reviewing requirements and looking for ways to enhance continuous monitoring and provide intuitive displays of information. One of the first things she did as the division chief was to study what cyber analysts go through to perform their mission. She discovered that they have to log into far too many systems to find the information they need.
“I watched how many of the analysts had to go through multiple systems and platforms to gather data for one piece of intel to respond to senior [leaders], and it became even more important to me that we started working toward fixing the problem and working better as a team internal to our own agency,” she says.
She emphasizes the importance of an effective and efficient infrastructure to help analysts accomplish the mission. “The criticality of this mission is that we’re looking at our architecture to ensure that it’s supporting the need for more data.”
A data repository also plays a critical role. “The goal is to have a data repository that supports endpoint data, sensors and critical ingest by our analysts. Our data repository capability will better enable a single area in which our endpoint data is being ingested. How we tailor our environment for how we ingest such a large data set is critical and valuable for the Department of Defense,” she elaborates.
As part of an effort to better share data, her team has been evaluating the department’s enterprise requirements, exploring ways to enhance continuous monitoring and seeking solutions for more intuitive displays. That array of data includes information from the Host Based Security System and the Assured Compliance Assessment Solution, as well as comply-to-connect data, which ensures a degree of security before a device is allowed to connect to the network.
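The comply-to-connect idea can be illustrated with a small sketch. The device attributes and policy below are invented for illustration and are not DISA’s actual checks; the point is simply that a device’s security posture is evaluated against a policy before it is allowed onto the network.

```python
from dataclasses import dataclass

@dataclass
class DevicePosture:
    """Hypothetical snapshot of a device's security posture."""
    os_patched: bool
    antivirus_current: bool
    encryption_enabled: bool

def comply_to_connect(posture: DevicePosture) -> bool:
    """Grant network access only if every required control is in place."""
    return all((posture.os_patched,
                posture.antivirus_current,
                posture.encryption_enabled))

# A laptop missing disk encryption is denied access.
laptop = DevicePosture(os_patched=True, antivirus_current=True,
                       encryption_enabled=False)
print(comply_to_connect(laptop))  # False
```

In a real deployment, the posture data would come from endpoint agents and the policy would be centrally managed; the gating logic, however, reduces to exactly this kind of all-controls-pass check.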
“As you can imagine, that’s not a small task. With the ever-increasing growth of that information, you have to be able to ensure you have an infrastructure that’s supporting that large and growing demand for information,” she offers.
For example, JFHQ-DODIN personnel may need to visualize data from U.S. Cyber Command. “We want to ensure that we’re giving them not only the data that’s available to make decisions but also that the data is query-able rapidly in order for them to do that,” McMillan says.
The division has been providing a great deal of cyber-related data during the COVID-19 pandemic. Due in part to the sudden boom in teleworking, the Defense Department saw a surge in cyber attacks, Essye Miller, the department’s principal deputy chief information officer, revealed during a virtual town hall meeting in March.
And, of course, with those increased attacks came an increased need for data. “Our teams have been working very closely with the operators to ensure the availability of visualization of those analytics supporting COVID-19 responses. This is such a unique time in our cyber community in which data reliability, availability and the visualization that our team provides is critical to defend our network,” McMillan says.
To accomplish the mission, her team relies largely on DISA’s Big Data Platform, a DISA-developed open source system that supports the data ingest, correlation and visualization infrastructure. The Big Data Platform’s common architecture can be installed across hundreds of servers in several hours, according to a DISA webpage. It also enables data, visualizations and cyber analytics to be shared with mission partners, including Defense Department cyber operators in other organizations, enterprise service users, cyber mission forces and cyber protection teams, and other federal agencies. That sharing significantly enhances the agency’s ability to rapidly deliver capabilities to end users.
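The ingest-correlate-visualize flow such a platform supports can be sketched in miniature. The record format, field names and threshold below are invented for illustration, not taken from the Big Data Platform itself:

```python
import json
from collections import Counter

# Ingest: hypothetical raw sensor events, one JSON record per line.
raw_events = [
    '{"src": "10.1.1.5", "action": "login_fail"}',
    '{"src": "10.1.1.5", "action": "login_fail"}',
    '{"src": "10.2.7.9", "action": "login_ok"}',
    '{"src": "10.1.1.5", "action": "login_fail"}',
]
events = [json.loads(line) for line in raw_events]

# Correlate: count failed logins per source address across all sensors.
failures = Counter(e["src"] for e in events if e["action"] == "login_fail")

# Visualize: surface only the sources exceeding a threshold for the analyst.
for src, count in failures.items():
    if count >= 3:
        print(f"ALERT {src}: {count} failed logins")  # ALERT 10.1.1.5: 3 failed logins
```

At enterprise scale the same three stages run continuously over distributed storage and streaming pipelines, but the analytic shape — parse, aggregate, present — is the same.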
She also cites a Security Information and Event Management (SIEM) capability, which combines event, threat and risk data into a single system to improve the detection and remediation of security issues and provide an extra layer of defense in depth. “It’s a capability that provides real-time analysis of security alerts for data ingest that you get maybe from applications or network hardware,” she explains.
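The SIEM pattern of combining event, threat and risk data can be sketched as a simple enrichment join. The threat feed, events and risk scores here are fabricated examples, not any real intelligence source:

```python
# Hypothetical threat-intelligence feed: known-bad addresses with a risk score.
threat_feed = {"203.0.113.7": 90, "198.51.100.4": 60}

# Hypothetical security events ingested from applications and network hardware.
events = [
    {"host": "web01", "remote": "203.0.113.7", "msg": "inbound connection"},
    {"host": "db01", "remote": "192.0.2.10", "msg": "inbound connection"},
]

# SIEM-style correlation: enrich each event with threat risk, then rank alerts
# so the highest-risk items reach the analyst first.
alerts = sorted(
    ({**e, "risk": threat_feed[e["remote"]]}
     for e in events if e["remote"] in threat_feed),
    key=lambda a: a["risk"], reverse=True)

for a in alerts:
    print(f'{a["host"]} contacted {a["remote"]} (risk {a["risk"]})')
```

Only the event touching a known-bad address produces an alert; the benign connection is filtered out, which is the core value a SIEM adds over raw log collection.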
McMillan cites automation as an emerging technology that will make a difference. “The disparity of environments has truly impacted the responsiveness of our analysts. Most of our analysts have to pivot through a lot of systems to gather information, so we are looking at technologies to help automate incident responses, as well as at how we are automating our business processes to adapt, defend and respond in defending the DODIN,” she says. “It’s critical that we are proactively enabling our mission partners to respond to and address issues in real time.”
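Automated incident response is often expressed as a playbook: a fixed sequence of actions that runs without an analyst pivoting between systems. The step names and messages below are hypothetical placeholders for whatever isolation, ticketing and notification systems an organization actually uses:

```python
# A minimal incident-response playbook sketch (all step names are invented).
def isolate_host(host: str) -> str:
    return f"{host} isolated from network"

def open_ticket(host: str, reason: str) -> str:
    return f"ticket opened: {host} - {reason}"

def notify_analyst(summary: str) -> str:
    return f"analyst notified: {summary}"

def run_playbook(host: str, reason: str) -> list[str]:
    """Execute the response steps in order and return an audit trail."""
    return [
        isolate_host(host),
        open_ticket(host, reason),
        notify_analyst(f"{host}: {reason}"),
    ]

for step in run_playbook("ws42", "malware beacon detected"):
    print(step)
```

Real orchestration platforms add retries, approvals and integrations with live systems, but the design idea is the same: encode the response once so it executes consistently in seconds rather than through manual console-hopping.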