Got Big Data? Free It For The Greatest Results

Measurement from [The Deep Space Climate Observatory] will be intercepted by ground stations and piped to NOAA’s Space Weather Prediction Center in Boulder, Colo. There, automated software will process the data within seconds and turn them into “actionable information,” … “It cannot be more than 5 minutes between measurement on the spacecraft and when an operator in Boulder picks up the phone or sends out an e-mail warning.” Protecting the Power Grid From Solar Storms, IEEE Spectrum, April 2013.

So a message that comes from a satellite 1.5 million kilometers out in space and gets processed within seconds still requires someone to pick up a phone or type an email before the information reaches those who need it?

I hope not.

Last century, as a young lieutenant in the Air Force, I was responsible for ensuring that any early-warning message from our spaceborne sensors got passed within seconds to the national command authority in Washington, DC. This was during the Cold War, and the purpose was to alert everyone to an impending nuclear attack. Back then we had our system immediately forward messages to everyone who needed to see them, worldwide.

This century, as a director of software development, I watched us roll out a fancy new time-tracking system that everyone had to use to record the work they did every day. As a director, one would think I could simply pull up the results and review them each day. Nope. Instead, the data was kept by the human resources department and only passed to me and my fellow directors after the general manager and COO had seen it first.

For the rest of the story, see What To Do With Your Staff Numbers.

I’m always surprised at how we acquire and implement these wonderful automated tools and then put in a large air gap, or otherwise disconnect the results from everyone who could use the data. There is often a mindset that now that we’ve captured all this data, someone needs to check it, control it, and hold it back. We feel the data can only be released to small, controlled groups or with restrictions.

At the National Security Agency, we were always very stingy about explaining where we got our data, or even sharing the exact data we had gotten, because we didn’t want to compromise our sources. So while I agree that in some situations control of information is important, in the rest of the world I’ve rarely seen this kind of overcontrol amount to anything more than control for control’s sake, or fear of the unknown.

The best approach I’ve seen is this: when we’ve turned on a new system, we’ve let the information flow out to everyone, with the warning that we need everyone to verify their data is accurate and that everything is working correctly before we start drawing conclusions. Yes, we still had people who trumpeted problems with what turned out to be bad data. But having multiple sets of eyes looking at and trying to use a new system has always meant a faster bring-up and a quicker path to usefulness than holding it all back and saying, “we’ll let you see some of your results after someone looks at them first.”

Is that powerful new tool you are implementing being held back by a reluctance to let the data flow unimpeded to everyone who can use it?
