By Mathias Coopmans, Principal Business Solution Manager at SAS
This blog is part of a tailor-made content series centred around the SAS Forum Belux 2015. It is linked with the event track called ‘Digital Society’. Click here to join the event and learn more about the other 3 tracks (Internet of Things, Data Science and Data Management).
Large(r) organisations nowadays are terrified of being disrupted. They’re afraid that a whizz kid with a cool idea and a platform is going to pinch their customers and partners and erode their margins. And they really should be. Scared, I mean. Because it is happening everywhere. But I consider myself a positive person. I always try to see the silver lining. And I feel that most companies are forgetting one key aspect. Yes, long-established companies tend to be slower to react to emerging trends, but they have one – very big – asset over the agile newcomers: data, and lots of it.
One of the best defences incumbents have against brand-new disruptors is the massive amount of historical and real-time data they hold. Data - Big Data to be precise - is one of the best ways to innovate and stay relevant in these fast-moving times. With the right data and analytics, you could find new markets for existing products, discover new business models and uncover blue oceans. Big Data is so much more than an operational by-product. It is the very fuel of innovation, because it allows you to detect patterns in consumer behaviour, shows how that behaviour is changing, and helps you uncover how you could adapt in order to profit from it.
The tragic part is: most companies know this. They just have cold feet, because they have absolutely no idea where to start when it comes to leveraging Big Data. For most organisations, Big Data analytics is an intangible trend that seems almost impossible to turn into practical value. So they order a dry business case from a big consultancy firm, and receive a thick, intelligent-sounding report and some very ‘sexy’ slides in return. But the gap between the ideal situation portrayed therein and the reality of the company is so wide that management still does not know what to do. So they wait. Which is definitely the wrong reaction in these fast-moving times.
At SAS, we like to help our customers out of this unnecessary paralysis with a concept we call the ‘Big Data Innovation Lab’. If you have no idea which approach to choose, a great solution is to experiment in small ways: check what works with little trials, keep the best parts, scale them up and quickly chuck away what does not work. This approach suits Big Data projects well, because you can find out quickly and practically whether the analytics project you had in mind is relevant to your company. Do you have the right data for it, and is it clean? Do you need more data? Is the solution you chose the best fit? Are the insights coming from it as useful as you expected? Do you have the human resources to keep it running, or should you be looking for outside help? Do you have the requisite budgets? All these questions can be answered in ways that are so much more accurate than a mere dry business case report. Don’t get me wrong, I have nothing against business cases, obviously. But some projects are better served with an immediate trial in the field than others.
Let’s say that you are a telco, and want to improve your upselling strategy as well as boost the experience and loyalty of your customers. Real-time Big Data analytics would be a great answer to this business conundrum. But how exactly would you go about it? A small-scale experiment could show you the best times to upsell a data top-up to a customer who has hit their limit, so that your communication is perceived by your customers as welcome help instead of a pushy sales effort. Even better, it could tell you how much you should propose: 100 MB or perhaps 1 GB, because offering too much or too little could very well cost you the sale. It will help you detect whether you have the data you need for that, or whether you ought to gather more. It might even convince you to simply stick to upselling the same package to everyone who is out of data, because personalising might, in this case, not yield a return on your investment. In short, it is a perfect way to check quickly whether the Big Data project you had in mind is as useful as you first thought, whether it will enhance your customers’ experience like you expected, and what should be changed for it to deliver a steady ROI. And that’s just one simple example, from one industry. Just think of what retailers, banks, logistics, transportation, healthcare, etc. stand to gain from Big Data. Its potential is enormous, and small-scale trials are the best way to uncover how it should be handled.
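To make the idea of a small-scale trial concrete, here is a minimal sketch of how such an experiment could be evaluated. Everything in it is hypothetical: the offer names, the acceptance probabilities and the sample size are illustrative assumptions, not SAS methodology or real telco figures. In a real trial, the responses would come from observed customer behaviour rather than a simulation.

```python
import random

random.seed(42)  # fixed seed so the illustrative run is reproducible

# Hypothetical upsell offers shown to customers who hit their data limit.
OFFERS = ["100 MB top-up", "1 GB top-up"]

# Assumed acceptance probabilities, purely for illustration. In a live
# experiment these are unknown and are exactly what the trial estimates.
ASSUMED_ACCEPT_RATE = {"100 MB top-up": 0.18, "1 GB top-up": 0.11}

def simulate_customer(offer):
    """Simulate whether one customer accepts the given offer."""
    return random.random() < ASSUMED_ACCEPT_RATE[offer]

def run_trial(n_customers=1000):
    """Randomly assign each customer one offer and record the outcome."""
    results = {offer: {"shown": 0, "accepted": 0} for offer in OFFERS}
    for _ in range(n_customers):
        offer = random.choice(OFFERS)
        results[offer]["shown"] += 1
        if simulate_customer(offer):
            results[offer]["accepted"] += 1
    return results

def conversion_rates(results):
    """Conversion rate per offer: accepted / shown."""
    return {offer: r["accepted"] / r["shown"] for offer, r in results.items()}

if __name__ == "__main__":
    results = run_trial()
    for offer, rate in sorted(conversion_rates(results).items()):
        print(f"{offer}: {rate:.1%} conversion")
```

The point of the sketch is the shape of the experiment, not the numbers: a few hundred customers, random assignment, and a simple comparison of conversion rates already tells you which offer size to scale up and which to chuck away.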
But Big Data experiments can go further than this type of beginner’s experiment, and be of use to companies that are much more mature in their digitalisation. Larger companies with a fixed innovation budget could, for instance, run this kind of Innovation Lab on a permanent, structured basis: a fixed number of Big Data experiments backed by a dedicated team.
When you read about disruptive innovation, the answers for speeding up a company and boosting its innovation potential mostly focus on culture and management: flat organisational models like holacracy, reversed coaching, more sharing, better collaboration, and so on. Although all these factors have merit, I still believe that data is the most important parameter of them all. It makes sense: if digital is what caused the problem (it is what enables bright young kids with no budget and some tech savviness to disrupt an entire industry), digital should also be the solution. Data and analytics are what will save you from being disrupted. More specifically: Big Data analytics.
So, in what area would you like to start a data experiment? Let me know.