Welcome to the 'New Normal'.

By now, we’re all fed up with seeing the C-word. But just like the dreaded B-word that preceded it (Brexit), COVID-19 is definitely going to stick around for a while. And just like Brexit, it will have repercussions for years to come.

The ‘orthodox’ ways in which organisations use customer, employee and market data to make astute business decisions are already changing. ‘Novel’ approaches are being deployed in response to the pandemic – and they’re quickly becoming ‘normal’. It’s reactive, adaptive and dynamic. Frankly, it’s exciting too. But what does this mean for the way we’ll manage data in the future?

Equally, the new ‘business as usual’ brought about by working from home means we should question the legitimacy of these new approaches to data management. And how do issues of connectivity and security play into an effective data strategy now? Let’s try to address some of these questions here…

What is ‘normal’?

It might sound obvious, but the use of data and data-driven decision making varies wildly between markets, organisations and even individual departments.

For example, vertical markets such as retail, media and finance have been using data intensively for many years and, as a result, benefit from a more data-driven approach to decision making, and even a more data-driven culture. By contrast, organisations such as local governments, which have only recently invested in a consistent focus on data to aid their decision-making processes, are just starting to reap the rewards.

Either way, following the hype of ‘big data’ that emerged from the Web 2.0 era, we have certainly seen a shift towards a greater appreciation of the value of data, and of the ability to merge structured and unstructured sources into the data-refining process.

In addition, more open application architectures have allowed for more effective data interchange through a variety of APIs. And as a result, greater availability of data and data tooling has increased the potential to validate new strategies and drive business growth opportunities. It all sounds great, and it is. However… the world’s most popular data science tools remain Microsoft Excel, and pen and paper.

When I think of the orthodox way of making business decisions with data, these basic tools still rule the roost – even in some organisations with very mature data science capabilities.

Essentially, we are still in the transition phase from static, localised reporting to modern, interactive analytics and intelligence. Many businesses will use these two capabilities side-by-side and, although they may see the benefits of a modern data architecture, are still very much ‘on the journey’ to implementation.

Data under the microscope

So far, the current pandemic has magnified many of the negative – and positive – elements within business processes, systems and operations that we’ve lived with for years. In particular, it’s highlighted the shortcomings of historic approaches in the clarity of information, and the speed of essential decision making.

With the conventional work environment turned on its head, the ability to ensure data is correct and can be understood in the same way by all stakeholders is of critical importance. The ambiguity of poor-quality data and traditional reporting presents heightened risk in an uncertain world, where communication is challenging, and time is of the essence. For instance, the need for rapid action will stretch traditional data systems that rely on overnight batch processes, manual reporting procedures and centralised creation, all of which have an impact on ‘business as usual’. Given these issues, many organisations have looked to quickly deploy or iterate live dashboards, modern data visualisations and democratised, distributed BI, often powered by SaaS platforms.

While some of these approaches are not especially novel in a broad market context, they may be to the end users.

Therefore, given how critical the decisions being made are to people’s lives, it is equally important that the approaches we take are ‘battle hardened’ and proven to work. Many organisations are looking to work with trusted solutions that can quickly demonstrate a ‘minimum viable product’ in the shortest time with easily accessible data sources.

These approaches are often not cutting-edge, but they are pragmatic and work with the data to hand. These are, of course, unprecedented times. And ‘unprecedented’ means that the large, historical data sets we would normally work from are at best of limited relevance, and at worst unavailable altogether. So, with this in mind, the most valuable approaches are perhaps not the sexiest, but they are definitely the most prudent.

Introducing the new ‘business as usual’

Any approach to data that provides reliable, consistent, accurate and truthful results that can be easily understood by the intended audience is inherently legitimate. However, with the context and outcomes changed by a new definition of ‘business as usual’, the utility of these approaches is being challenged.

Priorities have changed. People have too. And many businesses are already significantly different from what they were just a few weeks ago, operationally, financially and culturally. The need for rapid, accurate decision making has never been more critical, and as a result, leaders are demanding more from their data environments.

Connectivity and security concerns have increased the desire to use SaaS and cloud services, which not only offer simplified experiences and rapid deployment, but by their very nature are designed to be configured and accessed remotely.

Hate to say I told you so, but...

Connectivity, security and access are all key factors in any successful data strategy and should really have been considered as part of any deployment – irrespective of circumstances.

Those organisations with technical and technology deficiencies in these areas will be experiencing issues. And, unfortunately, with further potential changes to business structure and the nature of working practices, what might have been previously considered minor inconveniences may quickly become major hurdles, both to business continuity and to what comes later.

Back to the future?

We’re in a period of flux, and of course the current ‘business as usual’ will not be a long-term situation. However, certain things will change for good.

While the legacy approach of hoarding data behind several ‘walls’ has been criticised for its shortcomings for years, it is still in place in many organisations. And the businesses with an ‘if it works, don’t fix it’ attitude are about to find out that the higher ‘priorities’ on their IT work lists might not have been so vital after all.

Sure, no-one could have anticipated the current situation and the pressure on business continuity planning. But limited remote access, failure to embrace the cloud, failure to categorise and classify data, poor collaboration and more were all problems back in the long-forgotten land of 2019.

The good news, however, is that the organisations that have reacted quickly will have both changed their business data strategies for today and laid the groundwork for a different level of operation tomorrow. And those organisations that are earlier on the journey will have gained valuable lessons that will potentially accelerate the adoption of robust data strategies, including connectivity and security, which will underpin a brand new ‘business as usual’.