Data, Data Everywhere and Not a Thought to Think! (redux)


OK, I’ve blogged about this before.  I wrote that aggregating application performance data was the answer to ‘too much data’.  I was wr…um, I was wro…uh, I wasn’t exactly right.  In my defense, I was young and naive, and I’ve learned from my mistakes.

Data Smog.  Signal-to-Noise Ratio.  Data Glut.  We call it many things.  In a world where data has started consuming us, how do we sift through it all and extract meaningful information?  Merely aggregating the data into two dimensions of visualization isn’t enough, and it’s just too much work.

Yesterday I drilled down into an application performance alert, trying to make sense of the data, only to find, eventually, that it was a false alarm.  If I don’t trust an alert, I either turn it off or ignore it.  Or cover it with electrician’s tape.

Some people solve the problem of too much data by implementing “Business Intelligence” software.  These are typically simple, visualized mashups of different datasets.  Pretty charts; little value.  Others attempt a more comprehensive data-analytics approach, but I haven’t seen any of them cross the 80% probability threshold.

An 80% probability that it will rain today may make me take my umbrella to work, but I’m not going to cancel a golf outing based on 80% confidence.

Trusting APM alerts requires that the information have a high probability of accuracy.  We compile a lot of data with intelligent analytics, but the output still needs validation and corroboration.  So, how do you raise the confidence of the analytics to the point where you can trust the answer?

Humans!  HA!  Humans have an amazing ability to solve complex problems.  We just don’t scale very well.

Computer weather models may only give us 80% probability, but add in the knowledge and real-time analysis of sky-watchers and storm-chasers, and you now have a forecast with a confidence level that lets you plan your outdoor activities.  Some of us may love to play “amateur meteorologist” and dig through all those statistics, but all I want to know is whether I can go outside at recess today.  The same goes for application monitoring.  Just tell me what the problem is so that I can get it resolved quickly.
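
To make that intuition concrete, here’s a toy Bayesian sketch of why human corroboration moves the needle.  The 80% model confidence and the human accuracy figures are made-up numbers, purely for illustration:

```python
def corroborate(prior: float, tpr: float, fpr: float) -> float:
    """How much should we trust an alert after a human confirms it?

    prior -- confidence from the automated model (e.g. 0.80)
    tpr   -- P(human confirms | real problem)
    fpr   -- P(human confirms | false alarm)
    """
    numerator = prior * tpr
    return numerator / (numerator + (1.0 - prior) * fpr)

# An 80%-confidence alert, corroborated by a human who is right 90%
# of the time and fooled 20% of the time:
print(corroborate(0.80, 0.90, 0.20))  # ~0.95 -- now worth acting on
```

One confirmation from a person who knows the system takes the alert from “take an umbrella” territory to “cancel the golf outing” territory.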

To solve this problem, we need BIG DATA to store every object we can and grind through complex analytics to discover what is truly worth noticing.  Once noticed, we need a correlation engine that will help us decide which problem we need to act upon.  Once the decision to act is trusted, we need a collaboration platform that will help us crowd-source the problem, finding the technicians, subject-matter experts and decision-makers required to solve it.  And once we solve the problem, we need to capture the knowledge that solved it so that the correlation and collaboration technologies can leverage that intelligence for future reference.
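
In code, that notice → correlate → collaborate → capture loop might look something like the sketch below.  Every name and number here is hypothetical; this is a shape, not a product:

```python
from dataclasses import dataclass, field

@dataclass
class Alert:
    source: str
    metric: str
    confidence: float           # confidence from the analytics layer

@dataclass
class KnowledgeBase:
    """Captured resolutions that feed back into future correlation."""
    resolutions: dict = field(default_factory=dict)

    def recall(self, metric: str):
        return self.resolutions.get(metric)

    def capture(self, metric: str, fix: str):
        self.resolutions[metric] = fix

def correlate(alerts, kb):
    """Pick the alert most worth acting on, boosted by past knowledge."""
    def score(a):
        return a.confidence + (0.15 if kb.recall(a.metric) else 0.0)
    return max(alerts, key=score)

def collaborate(alert, experts):
    """Crowd-source the problem to the right subject-matter experts."""
    return [e for e in experts if alert.metric in e["domains"]]

# --- one turn of the loop, with made-up data ---
kb = KnowledgeBase()
alerts = [Alert("app-tier", "gc_pause", 0.80),
          Alert("db-tier", "slow_query", 0.65)]
experts = [{"name": "Ana", "domains": {"gc_pause", "heap"}},
           {"name": "Raj", "domains": {"slow_query"}}]

problem = correlate(alerts, kb)        # notice + correlate
team = collaborate(problem, experts)   # crowd-source the fix
kb.capture(problem.metric, "tuned JVM heap; pauses gone")  # close the loop
```

The point of the last line is the whole game: next time that metric fires, the correlation engine already knows what worked.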

The interesting thing about this solution is that it crosses many use cases.  A couple of startups I’m working with are solving their particular problems in much the same way: making their digital analytics smarter with human intelligence, so that the answer can be reached quickly and with confidence.

So, Mr. Kurzweil, until we achieve the Singularity (http://en.wikipedia.org/wiki/Technological_singularity), we simple humans still have relevance.  We just need to invent technology that scales our domain knowledge so that we don’t have to ‘redo’ our analysis over and over.

Yay us!

P-Cz