Define 25% · Source & Structure 20% · Customise 30% · Optimally Visualise 25%
The World of Data Analytics: Myths and Reality
In the evolving digital world, data is the new currency, and there is considerable hype surrounding data and its analytics. Organisations are looking to monetise data and generate value in unprecedented ways. There is little doubt that data generates great value for businesses across sectors. However, commensurate risks come with analytics if it is not done well.
For instance, a telecom operator seeking to predict customer churn recently drew wrong conclusions about customer lifetime profiles and targeted the wrong customers, incurring major unwanted marketing costs. The flawed analytics was the result of analysing a data set with conflicting values.
Thus, data may be a treasure trove but analytics is certainly not a magic wand.
Many questions commonly arise around the ‘real objectives of analytics,’ ‘data sourcing and mapping,’ ‘data architecture,’ ‘bespoke algorithms,’ and ‘visualisation and presentation for smart and effective decision-making.’ In this article we explore answers to these questions.
Understanding the Ingredients
Let’s start with the 1st theme, ‘real objective of analytics.’ More often than not, organisations embark on the data analytics journey by buying a ready-made solution. However, after the initial euphoria, the solution fizzles out. There are three main reasons for this:
- First, non-involvement of relevant functional owners during the analytics definition phase;
- Second, absence of a bespoke analytics solution/approach/mind-set: most available solutions are off-the-shelf and preach a bolt-on approach; and
- Third, a limited sectoral view in the solutions available in the market.
To avoid these pitfalls, it is imperative that every organisation spends considerable time on the conceptualisation phase: involving the actual users of the solution, defining the objectives of the analytics, and customising the nature/type of analytics in line with the organisation’s objectives, business needs and culture. As a ballpark, roughly 25% of the total project time should be spent on this phase.
The 2nd theme involves the data itself. The large volume, variety, velocity and veracity of data result in data overloading, time-consuming integration and interoperability issues. There are two reasons for this:
- First, data is available in multiple formats, such as text, Excel spreadsheets, PDFs, images, speech and video, from varied sources; and
- Second, people do not understand data fully. ‘More is not always better’: the vast generation of data (especially from social and digital media) adds a lot of noise, lowering overall quality. Often, the data to be used for analysis is incomplete and inconsistent.
As a solution, a data-filtering workshop is required to screen the relevant data and shortlist a logical match. It is important to map the data sources minutely for each analytical test to avoid false positives/negatives.
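The screening step above can be sketched in a few lines of code. This is a minimal illustration, not a prescribed tool: the record format, field names and sanity rules are assumptions made purely for the example.

```python
# A minimal sketch of the data-screening idea, assuming a toy record
# format; field names and consistency rules are illustrative only.

REQUIRED_FIELDS = {"customer_id", "tenure_months", "monthly_spend"}

def is_usable(record):
    """Keep a record only if it is complete and internally consistent."""
    # Completeness: every required field must be present and non-empty.
    if any(record.get(f) in (None, "") for f in REQUIRED_FIELDS):
        return False
    # Consistency: a simple sanity rule -- tenure and spend cannot be negative.
    if record["tenure_months"] < 0 or record["monthly_spend"] < 0:
        return False
    return True

def screen(records):
    """Split raw records into a usable set and a rejected set for review."""
    usable = [r for r in records if is_usable(r)]
    rejected = [r for r in records if not is_usable(r)]
    return usable, rejected
```

The point of returning the rejected set, rather than silently dropping it, is that inconsistent records go back to the data owners for correction instead of feeding the analysis, which is exactly how the churn example above went wrong.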
The 3rd theme is around developing a good data architecture and obtaining data from various sources efficiently. Most ERP extraction tools are limited in their ability to pull data from multiple sources. For instance, data on customers’ browsing preferences, content consumption, purchasing behaviour and preferred payment medium is available; however, due to unsophisticated platforms and silos between functions, companies fail to generate a consolidated profile of the customer. Recently, a major global insurance company invested US$300 million in an advanced big data solution to get a single, holistic view of the customer across the enterprise. As a solution, various ETL tools should be evaluated alongside the ERP extractor. Depending upon the ERP, some of these are quite cost-effective and certified by the ERP vendor itself.
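The “single customer view” idea can be sketched as a simple merge of extracts from separate functional silos, keyed on a shared customer identifier. The source names and fields below are hypothetical and do not reflect any specific ERP or ETL product.

```python
# A hedged sketch of consolidating per-customer records from several
# siloed sources into one profile. All field names are assumptions.

from collections import defaultdict

def consolidate(*sources):
    """Merge records from several sources, keyed on customer_id."""
    profiles = defaultdict(dict)
    for source in sources:
        for record in source:
            cid = record["customer_id"]
            # Later sources fill in fields the earlier ones did not provide;
            # the first source to supply a field wins.
            for key, value in record.items():
                profiles[cid].setdefault(key, value)
    return dict(profiles)
```

In practice an ETL tool would also handle scheduling, schema mapping and conflict resolution, but the core of the consolidated profile is this join across silos.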
The 4th theme is around coding algorithms to extract tangible insights. The most fundamental issue is flexibility in coding the algorithms to factor in both structured (ERP etc.) and unstructured data (emails etc.) in order to provide decision-making insights to the reader. A good way is to look for a bespoke solution that is fully customisable in line with user requirements.
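As a toy illustration of mixing the two kinds of input, the sketch below combines a structured ERP field with a signal derived from unstructured email text. The keyword list, field names and threshold are assumptions for illustration, not a real churn model.

```python
# An illustrative sketch of combining structured and unstructured data.
# The keyword set and spend threshold are assumptions, not a real model.

COMPLAINT_KEYWORDS = {"refund", "cancel", "complaint", "disappointed"}

def churn_signal(erp_record, emails):
    """Flag a customer using one structured and one text-derived feature."""
    # Structured part: assume low monthly spend indicates churn risk.
    low_spend = erp_record["monthly_spend"] < 10.0
    # Unstructured part: does any recent email contain a complaint keyword?
    words = {w.strip(".,!?").lower() for e in emails for w in e.split()}
    complained = bool(words & COMPLAINT_KEYWORDS)
    return low_spend or complained
```

A bespoke solution would replace both rules with features tuned to the organisation’s own data, but the structural point stands: the algorithm must accept both data types in one decision.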
The 5th and last theme is around visualisation. The learnings from the data need to be disseminated to the right people in a timely manner. Further, the output should be compatible with handheld devices such as smartphones and tablets, where information can be displayed using advanced visualisation techniques. A great example is the automotive industry, where real-time vehicle diagnostic data (such as tyre pressure and oil, engine or brake issues) is used to identify potential risks, and less congested routes and weather conditions are communicated to drivers based on sensor and telematics data. Such analysis should be shared in a timely manner with the right audience, using the right visualisation platforms, to generate value.
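The telematics example boils down to turning raw sensor readings into timely, human-readable alerts. The sketch below shows that step; the thresholds and field names are illustrative assumptions, not manufacturer specifications.

```python
# A minimal sketch of the telematics example: mapping one vehicle
# reading to driver-facing alerts. All thresholds are assumptions.

SAFE_TYRE_PRESSURE_PSI = (30, 36)  # assumed acceptable range

def diagnose(reading):
    """Return a list of alerts for a single vehicle sensor reading."""
    alerts = []
    lo, hi = SAFE_TYRE_PRESSURE_PSI
    if not lo <= reading["tyre_pressure_psi"] <= hi:
        alerts.append("Check tyre pressure")
    if reading["oil_level_pct"] < 15:
        alerts.append("Low oil level")
    if reading["brake_pad_mm"] < 3:
        alerts.append("Brake pads worn")
    return alerts
```

The visualisation layer then decides how and where these alerts surface, on a dashboard, a smartphone notification or an in-vehicle display, which is the timeliness point made above.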
To succeed, intelligent adoption is key to avoiding a data analytics backlash
In a nutshell, though many may claim to understand analytics, only a few have actually mastered it to generate intelligent insights.
Incorrect data can be catastrophic for the user and can cause more harm than good. A data analytics solution should be sophisticated enough to discover opportunities, manage risk and reduce fraud. It should be able to handle complex data while testing its quality using proper monitoring parameters.
Moreover, the data disconnect between different functions of an organisation should be addressed through a holistic analysis of data.
There is also a need for competent resources that can deal with the complexities of data. A team with an effective mix of business intelligence professionals, data scientists and visualisation experts is required to form a data analytics strategy and manage its risks.
Lastly, data security and privacy have become major concerns for organisations. Companies should effectively manage cyber and regulatory risks, which can cause monetary as well as reputational losses. Anonymisation, de-identification and encryption are some of the methods to address privacy concerns. Hence, a well-thought-out strategy addressing all the themes highlighted above can help firms leverage the big opportunities in data.
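The de-identification technique mentioned above can be illustrated with a keyed hash: analysts receive a stable pseudonym instead of the raw identifier, so profiles can still be joined without exposing the customer. The key handling and field choice here are assumptions for the sketch only.

```python
# A small sketch of pseudonymisation via HMAC-SHA256: the raw customer
# ID is replaced with a stable, non-reversible token. The secret key
# below is a placeholder; in practice it would live in a secrets vault.

import hashlib
import hmac

SECRET_KEY = b"replace-with-a-managed-secret"  # assumption: vault-managed

def pseudonymise(customer_id):
    """Return a stable pseudonym for a customer ID."""
    digest = hmac.new(SECRET_KEY, str(customer_id).encode(), hashlib.sha256)
    return digest.hexdigest()[:16]
```

Because the same ID always maps to the same token, analysts can still link records across data sets, while recovering the original ID requires the secret key, which never leaves the security function.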