
Determining What Is Secure or Not Secure: That Is the Challenge

The roots of the Army's Research and Technology Protection Center (ARTPC) don't date back to Shakespearean times, but the center's genesis took place early in the 21st century with the original mission of protecting technologies of the now-canceled Future Combat Systems program. In this month's SIGNAL Magazine, Henry S. Kenyon takes to the stage to describe the evolution and goals of the ARTPC in his article, "Center Spotlights Critical Information." A 2001 security report commissioned by then Army Chief of Staff Gen. Eric Shinseki, USA, made clear the need for coordinated security efforts. ARTPC's chief, Richard Henson, explains the general's reasoning:
A lot of people were working on it [security], but there was no one place where it either all came together or where it was all visible. As a result, the Army created the ARTPC as a center of excellence to focus on research and technology protection activities.
The ARTPC officially opened its curtains in 2002. It's up to the center's experts to help program executive officers (PEOs) and program managers (PMs) identify critical program information and decide how to protect it, Henson says. During its FCS days, the center focused on developing a systematic way of identifying crucial data and standardizing its protection. Those efforts are evident in the role the ARTPC plays today.

Using a two-act ensemble of technology protection engineers and program protection architects, the ARTPC supports Army programs, with the two groups addressing different sides of the identification and classification process. In act one, technology protection engineers embedded in major Army research and development efforts work directly with PMs. They identify a program's critical technologies, seek a balance between security and information sharing, and bridge the language gap between intelligence experts and researchers. Act two features the program protection architects, who help a PM develop specific protection plans and then identify and recommend protection methods for particular scenarios.

Figuring out what should be classified is the next step. Does the program have an updated guide to classification levels? Is the technology so old that it's already recognizable in the public domain? Is it service-unique? Will its dissemination place the nation in harm's way?

To achieve horizontal protection, the ARTPC and the Office of the Undersecretary of Defense for Acquisition, Technology and Logistics (AT&L) are developing an acquisition security database. Implementing Defense Department Instruction 5200.39, which sets guidelines for protecting critical program information, is another of the center's tasks. The ARTPC has undergone enormous changes in the last two years to meet its goals, one of the most significant being a restructuring of its staff. Although Defense Department auditing found shortcomings in the center's implementation of countermeasures and oversight, the ARTPC is now a government organization that can assume responsibilities it couldn't with a contractor-only staff.
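The classification questions above amount to a simple triage, and they can be sketched as one. The following is a minimal, illustrative model only: the `ProgramInfo` record, the field names, and the result strings are all hypothetical, not part of any actual ARTPC process or tool.

```python
from dataclasses import dataclass

@dataclass
class ProgramInfo:
    """Hypothetical record capturing the article's four triage questions."""
    has_current_classification_guide: bool  # updated guide to classification levels?
    already_public: bool                    # recognizable in the public domain?
    service_unique: bool                    # unique to this service?
    harmful_if_disclosed: bool              # would dissemination cause harm?

def triage(info: ProgramInfo) -> str:
    """Walk the questions in roughly the order the article lists them."""
    if not info.has_current_classification_guide:
        return "update the classification guide first"
    if info.already_public:
        return "likely unclassified: already in the public domain"
    if info.harmful_if_disclosed or info.service_unique:
        return "candidate for protection: develop a protection plan"
    return "review with program protection architects"
```

The deny-nothing-by-accident ordering matters: an out-of-date classification guide short-circuits everything else, since later answers can't be trusted without it.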
Improvements in structure and processes are ongoing, as with all organizations that must remain relevant, so no curtain call is imminent for the center. Are there more efficient ways to streamline the ARTPC's data identification process, or would these just add layers to what is envisioned as an already more simplified effort? Discuss your ideas, express your concerns here. We welcome your unclassified data input.

Comment

Data classification and its close cousin, asset identification, are the gnarly knot of security programs. Are there any data or servers that are *not* critical on a program? Probably not. Every risk management regime starts with "identify critical information assets." Those responsible for that classification soon run headlong into the fact that no one is willing to stand up and volunteer their data or servers as "non-critical." I propose a different approach: that of a perimeter (sorry, Jericho Forum :-). Identify the organizations and people who have access to the particular program that needs to be secured. Build a virtual wall around those people and the networks they use to accomplish the assigned task. Harden servers, encrypt data, require strong authentication and monitor everyone's activity. Only in this way can some level of security be applied to a particular program. -Stiennon
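The perimeter idea in this comment reduces to a deny-by-default access check: enumerate who is inside the wall for a program, then gate every access on membership plus strong authentication. Here is a minimal sketch; the program name, roster, and function names are all illustrative assumptions, not anything Stiennon or the ARTPC actually uses.

```python
# Hypothetical roster of cleared personnel, keyed by program.
PROGRAM_PERIMETER = {
    "example-program": {"alice", "bob"},
}

def allow_access(program: str, user: str, strongly_authenticated: bool) -> bool:
    """Deny by default; allow only cleared, strongly authenticated users.

    An unknown program yields an empty roster, so access is refused
    rather than accidentally granted.
    """
    cleared = PROGRAM_PERIMETER.get(program, set())
    return user in cleared and strongly_authenticated
```

Note what the sketch avoids: it never asks whether a given file is "critical." Everything inside the wall gets the same treatment, which is exactly the trade the comment proposes.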
