Tapping Into the Value of Analytics Through Use Case Discovery Workshops
We’ve all heard promising statistics about how investing in data analytics can drive powerful returns, yet in a recent Bain survey of companies that are heavily invested in big data, a full third said they lack a clear strategy for embedding data and analytics into their companies.¹
The truth is, many large analytics investments are still not paying off.
So, what gives?
The field of analytics—and, more recently, the related fields of machine learning and artificial intelligence—has been on a steep growth trajectory for 15 years. We can see this in Google Trends data.²
We can also see this growth in the number of undergraduate and graduate programs sending throngs of hopeful data scientists into the workforce each year. And, even more telling, (spoiler alert) the Kree in Captain Marvel were led by an AI, the Supreme Intelligence.
But, despite an increase in interest and investment, there is still disillusionment, and two-thirds of analytics efforts have yet to pay off.³ And while it’s true that with every new technology there is a pattern of excitement followed by disappointment, pitfalls seem to be more prevalent in the field of analytics.
The good news is that, when you take a closer look, these pitfalls are easily surmountable. At Columbus Collaboratory, we often hold Use Case Discovery Sessions that bring together business stakeholders, process subject matter experts (SMEs), and data scientists to identify and prioritize high-impact analytics use cases within a department or organization, while ensuring common pitfalls are surfaced and avoided.
Following are some common pitfalls we regularly encounter:
Common Analytics Pitfall #1:
Focusing on the techniques and not the solution
Right now, every vendor in the world claims to “do AI,” yet a recent report found that 40% of Europe’s artificial intelligence start-ups do not actually use any AI in their products!
Many vendors out there also sell “deep learning,” which, for the right use case, may be wonderful. However, for some use cases, techniques such as logistic regression or generalized additive models might actually perform better. For example, at Columbus Collaboratory, we’ve started several “analytics projects” only to find that some automation and merging of databases was all that was really required.
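Sometimes the fix really is that modest. As a minimal sketch of the “merging of databases” case above (all record and field names here are hypothetical, for illustration only), joining two departmental record sets on a shared key can deliver the answer a business team needs without any modeling at all:

```python
# Hypothetical example: two departmental "databases" as lists of dicts.
customers = [
    {"customer_id": 1, "name": "Acme Corp"},
    {"customer_id": 2, "name": "Globex"},
]
invoices = [
    {"customer_id": 1, "amount": 1200.0},
    {"customer_id": 2, "amount": 450.0},
    {"customer_id": 1, "amount": 300.0},
]

def merge_on_key(left, right, key):
    """Inner-join two lists of dicts on a shared key."""
    # Index the left-hand records by the join key for fast lookup.
    index = {}
    for row in left:
        index.setdefault(row[key], []).append(row)
    merged = []
    for row in right:
        for match in index.get(row[key], []):
            # Combine the matching records; right-hand fields win ties.
            merged.append({**match, **row})
    return merged

merged = merge_on_key(customers, invoices, "customer_id")
# Each invoice row is now enriched with the customer name.
```

In practice this kind of join would live in a database or an ETL tool, but the point stands: a plain data-plumbing step, not deep learning, solved the problem.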
Having a partner who can match the right solution to the use case at hand will help you avoid investing in techniques that aren’t the right fit.
Common Analytics Pitfall #2:
Not spending enough time on change management or implementation
It doesn't matter how good your analytics are if the following two things don't happen:
- The analytics solution is integrated with other systems
- People actually use it
Integration is really hard—often harder and more time-consuming than building the model in the first place. Why? Because you'll have to get into IT’s project plan, find people with access to the databases, and enlist experts on end-user systems who are already stretched too thin.
As if that’s not enough, change management is really hard, too. If you are creating innovative solutions, it is likely you are changing someone’s process, and there can be resistance.
Before you undertake these efforts, you should know what you’re up against and have access to people and processes that can help you succeed.
Common Analytics Pitfall #3:
Limiting projects or picking the wrong projects
Many business units are underserved by their analytics teams, which are often built for a specific use case and never expand beyond it. Even in organizations with large analytics teams, there is almost always more opportunity for growth and exploration.
But being open to new ideas doesn’t mean you have to go after every opportunity. It is often better to start with a project where success is likely to be high, data is available, and integration is relatively simple. With a little bit of scoping and data exploration, one can usually predict the likelihood of success.
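One lightweight way to apply that scoping is a simple weighted score across the factors just mentioned. The candidate names, criteria, and weights below are purely illustrative assumptions, not a prescribed method:

```python
# Hypothetical scoring sketch: rank candidate use cases by expected
# impact, data availability, and ease of integration (1-5 scales).
candidates = [
    {"name": "Churn model",        "impact": 5, "data_ready": 2, "integration_ease": 2},
    {"name": "Invoice automation", "impact": 3, "data_ready": 5, "integration_ease": 4},
    {"name": "Demand forecast",    "impact": 4, "data_ready": 4, "integration_ease": 3},
]

def priority(case, weights=(0.4, 0.3, 0.3)):
    """Weighted score favoring likely, low-friction wins."""
    w_impact, w_data, w_integration = weights
    return (w_impact * case["impact"]
            + w_data * case["data_ready"]
            + w_integration * case["integration_ease"])

# Highest-scoring candidates first.
ranked = sorted(candidates, key=priority, reverse=True)
```

Note that in this toy scoring, the highest-impact idea (the churn model) ranks last because its data isn't ready and integration is hard—exactly the kind of early win/late win trade-off a discovery session is meant to make explicit.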
There are also a few guiding principles that can help prevent many of the “red flags” that can signal your analytics program will fail, such as poorly defined analytics roles, a lack of analytics translators, or analytics that are isolated from the business at large.⁴ These are:
- ALL of the right people need to be there – stakeholders, users, data stewards, and a facilitator who is also a practitioner.
- A longer, more intensive meeting is more effective than many smaller meetings held over time.
- A focus on solutions is better than a focus on techniques.
- It’s OK to identify areas of improvement that aren’t necessarily “analytics,” such as RPA and data governance.
If you’d like some help in uncovering opportunities for analytics in your organization, contact us to plan a half-day Use Case Discovery Session for your team. Or download the Use Case Discovery Session Fact Sheet that outlines our process and what you can expect from the resulting report.
1. Brahm and Sherer. (2018). Closing the Results Gap in Advanced Analytics: Lessons from the Front Lines. Retrieved from https://www.bain.com/insights/closing-the-results-gap-in-advanced-analytics-lessons-from-the-front-lines/
2. Google Trends.
3. Zetlin, M. (2017). What is a chief analytics officer? The exec who turns data into decisions. Retrieved from https://www.cio.com/article/3235706/chief-analytics-officer.html
4. Fleming et al. (2018). Ten red flags signaling your analytics program will fail. Retrieved from https://www.mckinsey.com/business-functions/mckinsey-analytics/our-insights/ten-red-flags-signaling-your-analytics-program-will-fail