The perception that big data is still incredibly complex to understand is right on the money. There are examples of that complexity being abstracted away, but even the data scientists I talk to say they are still mastering the tools needed simply to show data in something besides a spreadsheet.
Marck Vaisman, a freelance data scientist, said in an interview that the issue for most organizations is as much about the people trying to figure out what to do as it is about the technology's complexity. Too often, only a few people actually understand how to use the data they have, and their ability to explain it is limited. Further, few organizations take a holistic approach to data analytics; instead, they run a fragmented array of overlapping projects with differing approaches that do little to provide value.
Pitfalls abound, but the fundamentals are what matter. Vaisman distills them down to five commandments, a set of principles he has developed and written about in the O'Reilly book "Bad Data Handbook."
In this confusion, we find vendors who promise great speed and capability. But often what companies need is a more calibrated approach, one that matches the volume of data they want to put to work with applications operating at a corresponding velocity. Executives from Accenture, which has built a practice around data analytics, outlined how they view the market in a brief interview I also did at Strata.
Data has to be treated as a strategic asset. The presence of consultants at a conference like Strata shows how much confusion people still have about how to realize the value that vendors promise in such bountiful amounts.