What causes some Big Data projects to fail?
These days, we see many articles about Big Data across verticals – manufacturing, media, insurance, oil & gas, finance, retail, and more. We are living in interesting times, and the next five years will be fantastic with respect to quality and safety of life on many fronts – travel safety, innovation in the life sciences, new product discoveries in manufacturing, targeting customers with the right products that they actually care about, and assessing the degree of risk to a corporation at any given point with very high accuracy. All these benefits are made possible primarily by ‘Big Data Technologies’ – Hadoop, Cassandra, and MongoDB, to name a few.
At Orzota, we have helped customers ranging from SMBs to large corporations in many verticals – retail, financial, and manufacturing. We have provided solutions on public clouds like AWS and Rackspace, on private clouds built with OpenStack, and, of course, in data centers, on clusters ranging from a few nodes to thousands of nodes.
With this experience and knowledge, I’d like to share some of the scenarios that can go wrong, if one is not careful, and cause big data projects to fail. Hopefully, these highlighted areas will provoke some thought and help you plan and execute your big data projects correctly – and, of course, within budget!
Here are a few areas that can cause big data projects to fail:
1. Traditional way of thinking
2. Not having a clear strategy and roadmap
3. Treating Hadoop as yet another data platform
4. Not clearly defining the use case(s) to solve
5. Technology focus rather than business focus
6. Selecting the wrong tool for the job
7. Not knowing and planning data access patterns
8. Not having the right team
And of course, there are many other ways a Big Data project can go wrong. We will continue to share our experiences. Do contact us for help at any stage of your Big Data project life cycle.