
SYSTEM IMPROVEMENTS AFTER THE CHRISTMAS TERROR ATTACK?

Information Technology in the federal enterprise does not work like it does in Hollywood. Although there are plenty of success stories to go around, federal IT is more limited and constrained than we would all want, for lots of reasons. Some of the reasons are just due to complexity and limited budgets. Some of the reasons are for security. Some of the reasons are because of the way the government funds its agencies and manages programs. And some of the reasons are because we humans have designed things using the wrong models and implemented them to serve workflows that are flawed to begin with.

This last point is something I hope all enterprise technologists work to improve across all our systems.

But perhaps the most important systems to consider improving right now are the ones supporting our national security decision-makers in the field. These systems will very likely be under review as part of the investigation into the Christmas Day terror attack.

A huge number of data-focused systems could have played a part in stopping this event. The most famous of these are the Terrorist Identities Datamart Environment (TIDE) and the Terrorist Screening Database (TSDB). Other data sources that will need to be scrutinized exist at the Department of Homeland Security, Department of State (especially visa applications), and the intelligence community.

We can expect a great deal of thought to be put into how these very large, enterprise-grade data systems interconnect and interoperate as part of the reviews and after-action efforts underway right now. I’m sure everyone involved in reviewing what went wrong in the national security community will realize the problems were not IT problems, and that the human element is absolutely the most important thing to understand. But there is still more that systems can smartly do to help humans do the right thing.

I have two recommended pieces of reading for anyone involved in studying what went wrong and how to fix it. Both are from the same thinker, Jeff Jonas. Jeff publishes great thoughts on his blog at http://jeffjonas.typepad.com, and frankly I recommend all enterprise technologists familiarize themselves with all his writings. But two pieces stand out as being incredibly relevant to designing systems that stop bad actors:

Perpetual Analytics and You won’t have to ask – data will find data and relevance will find the user.

In the first piece, on Perpetual Analytics, Jeff describes the approach to data analytics most organizations find themselves stuck with, where data is extracted from existing systems and secondary analysis is conducted on the extracted data across the entire enterprise whenever a question must be answered. The ocean must be boiled every time a new analysis is conducted. This boil-the-ocean approach does not scale, as the sketch below illustrates.
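To make that concrete, here is a minimal Python sketch of the boil-the-ocean pattern. The source systems, records, and function names are all hypothetical, invented just to show why the cost grows with every question asked:

```python
# A hypothetical sketch of the "boil the ocean" pattern: every new
# question triggers a full re-extraction and re-scan of all records
# across every source system. All names here are made up.

class SourceSystem:
    """Stand-in for an existing enterprise data store."""
    def __init__(self, records):
        self.records = records

    def export_all_records(self):
        return list(self.records)  # full extract, every single time

def answer_question(source_systems, matches_question):
    # Re-extract everything, then run secondary analysis over the
    # whole corpus, just to answer one question.
    extracted = []
    for system in source_systems:
        extracted.extend(system.export_all_records())
    return [r for r in extracted if matches_question(r)]

systems = [SourceSystem(["visa application X123", "routine report"])]
print(answer_question(systems, lambda r: "X123" in r))
# ['visa application X123']
```

Because every query re-scans the full extract, cost scales with total data volume on every question, and the approach only gets slower as the enterprise accumulates history.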

He uses the term “perpetual analytics” to “…describe the process of performing real-time analytics on data streams. Think of this like ‘directing the rain drops’ as they fall into the ocean – placing each drop in the right place and measuring the ripples (i.e., finding relationships and relevance to the historical knowledge). Discovery is made during ingestion and relevant insight is published at that magical moment.”
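Here, by contrast, is my own hedged sketch of what that pattern might look like in code. This is an illustration of the idea, not Jeff’s implementation, and every name and data structure in it is made up: each record is analyzed once, as it arrives, against an index of accumulated context, and relationships surface at the moment of ingestion.

```python
# Hypothetical sketch of "perpetual analytics": each record is
# analyzed once, at ingest, against accumulated context, instead
# of re-scanning the whole corpus per question.

from collections import defaultdict

class PerpetualAnalytics:
    def __init__(self):
        # Accumulated context, indexed by identifying feature
        # (e.g., a passport number or phone number).
        self.index = defaultdict(list)

    def ingest(self, record):
        """Place each 'raindrop' and measure its ripples as it lands."""
        alerts = []
        for key in record["keys"]:           # identifying features
            for related in self.index[key]:  # relationships found at ingest
                alerts.append((record["id"], related["id"], key))
            self.index[key].append(record)
        return alerts  # insight published at the moment of ingestion

stream = PerpetualAnalytics()
stream.ingest({"id": "visa-001", "keys": ["passport:X123"]})
print(stream.ingest({"id": "tip-002", "keys": ["passport:X123"]}))
# [('tip-002', 'visa-001', 'passport:X123')]
```

The work per record stays proportional to that record’s connections, not to the size of the whole ocean.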

Check out the post for more; you will be glad you did, I’m sure.

The other piece I’d recommend you study is You won’t have to ask – data will find data and relevance will find the user. It highlights a very important shift in approach: letting the right data find the right data, and letting relevance find the user. Here is how Jeff put it:

“Next generations of information management systems will not principally rely on users dreaming up smart questions to ask computers. Rather, this new breed of technology will make it possible for data to find itself and relevant discoveries to find the consumer (e.g., a user). And all in real time of course. While this will bring with it new policy debates like which data will be permitted to find which data and who is notified of what relevance, I am going to stay focused in this post on what this technology will enable.”
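Again as my own hypothetical illustration, not Jeff’s design: the same ingest-time pattern can route discoveries to people. Users register standing interests once, and when relevant data arrives, the notification finds them rather than waiting for them to think of the right question:

```python
# Hypothetical sketch of "relevance finds the user": analysts register
# standing interests once; arriving data is matched against those
# interests at ingest, and the right user is notified automatically.

def notify(user, record):
    print(f"alert for {user}: {record['summary']}")

class RelevanceRouter:
    def __init__(self):
        self.subscriptions = []  # (user, predicate) pairs

    def subscribe(self, user, predicate):
        """A user declares, once, what would be relevant to them."""
        self.subscriptions.append((user, predicate))

    def publish(self, record):
        """New data arrives; relevance finds the user, not vice versa."""
        for user, predicate in self.subscriptions:
            if predicate(record):
                notify(user, record)

router = RelevanceRouter()
router.subscribe("visa-desk", lambda r: "watchlist-hit" in r["tags"])
router.publish({"summary": "new derogatory report", "tags": ["watchlist-hit"]})
# alert for visa-desk: new derogatory report
```

Note how this inverts the traditional flow: no one has to dream up the smart question after the fact, because the question was standing there waiting when the data arrived.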

As all of us who have ever worked in the federal IT business know, it can be very hard to change or improve legacy systems and legacy approaches. But our existing systems are calling out for improvement, and I’m certain Jeff Jonas is spelling out a model that will improve upon the current approach.

Do you agree?