
How do we access our Data?

Jonathan David

Services Portfolio Insights Manager

Every organisation uses data, and our Data Driven Success blog series delves into the challenges organisations face in how they use it. Our expert Asset Intelligence Data Analyst, Dr Jon David, looks at the complete data ‘journey’, covering a range of topics to help organisations understand how to unlock the power behind their data.

Prospecting for data: the new oil?

There is an analogy, popularised by a 2017 article in ‘The Economist’, that data is now the world’s most valuable resource: the ‘new oil’. In terms of the pervasiveness and transformative power of data, it is difficult to argue with this, and we are seeing the same pattern as with any new and valuable resource: the first to harness it effectively are the first to reap the rewards that uncharted territory brings, whether that is the oil barons of 100 years ago or the Amazons and Facebooks of today.

However, beyond this the analogy rapidly breaks down. Whilst oil is a finite and ever-dwindling resource, data is infinitely durable and theoretically limitless. Similarly, as oil becomes scarcer it becomes more expensive to extract, whereas obtaining data is becoming ever easier and cheaper thanks to advances in computer science, the proliferation of sensors and our transition towards ‘online’ lifestyles. Furthermore, whilst both require processing to realise their potential, transforming oil is permanent and wasteful, whereas data may gain value with each transformation. The list goes on. In short, because of its unique properties and potential, we need to move away from the old ways of accessing and disseminating valuable resources when we consider data.

Is there a better way?

Nevertheless, for the average organisation, gaining access to their data can still feel a little like prospecting for oil. How do they discover the data they require, and what infrastructure do they need to mine it? Moreover, an organisation may have tapped their data ‘well’ only to siphon it into silos where it remains unused. They are now sitting on a wealth of data that, without the correct technical expertise or knowledge of their IT landscape, they cannot access.

For an organisation to truly tap into the potential of their data, they must implement a centralised data strategy that focuses on data orchestration, transaction and manipulation and emphasises the concept of a ‘single source of truth’. The strategy must strike a balance between data accessibility and security, yet remain agile enough to accommodate future change. Furthermore, data should be readily accessible regardless of where it physically resides or how it is stored.

A modern, data-driven solution

To achieve this, it is necessary in most cases to move away from point-to-point connectivity, where custom scripts and stored procedures have traditionally dominated how data is moved and manipulated. This approach is typically a major bottleneck to accessing an organisation’s core systems of record (e.g. ERP, customer and billing systems, databases), because of the highly specific skills required to work with each proprietary connectivity interface.

Instead, organisations should move towards Application Programming Interface (API)-led connectivity as a purposeful and robust approach to enterprise data architecture. APIs are clearly defined methods of communication between software components that provide the building blocks for simpler application development. Purpose-specific software, or ‘middleware’, utilises these APIs to provide common access and process orchestration across an organisation’s data sources and to align business requests with applications, data and infrastructure.
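
To make this concrete, the short Python sketch below shows what consuming data through such an API might look like. The host, endpoint path and field names are purely illustrative assumptions rather than any specific product; the point is that the consumer works against a documented HTTP interface instead of the proprietary connectivity of the underlying system.

    import requests

    # Hypothetical system API exposed by the middleware/integration layer.
    # The host, path and field names are illustrative assumptions only.
    BILLING_API = "https://api.example.internal/system/billing/v1"

    def get_outstanding_balance(account_id: str) -> float:
        """Fetch an account balance through the documented system API,
        rather than querying the billing database directly."""
        response = requests.get(
            f"{BILLING_API}/accounts/{account_id}/balance",
            timeout=10,
        )
        response.raise_for_status()
        return response.json()["outstanding_balance"]

    if __name__ == "__main__":
        print(get_outstanding_balance("ACC-0001"))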

Creating a streamlined data access experience

A robust approach is to adopt a layered solution of system, process and experience APIs. This enables users to access data without needing to learn the underlying systems, so system experts are no longer required on every project that needs access to an organisation’s data. This frees up development capacity and focusses the efforts of central IT on maintaining and governing the system APIs, which in turn ensures stable, consistent and managed access to the critical underlying data.
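
As an illustration of the system layer, the following sketch assumes a simple FastAPI service with an in-memory stand-in for the system of record; in practice this layer would wrap the proprietary interface of an ERP, CRM or billing system. All names and fields are hypothetical.

    from fastapi import FastAPI, HTTPException
    from pydantic import BaseModel

    app = FastAPI(title="Customer System API")

    # Illustrative stand-in for the underlying system of record (e.g. a CRM
    # database); a real system API would wrap the proprietary interface.
    _CUSTOMERS = {
        "C-100": {"name": "Acme Ltd", "segment": "Enterprise"},
    }

    class Customer(BaseModel):
        customer_id: str
        name: str
        segment: str

    @app.get("/customers/{customer_id}", response_model=Customer)
    def get_customer(customer_id: str) -> Customer:
        """Governed, read-only access to customer master data."""
        record = _CUSTOMERS.get(customer_id)
        if record is None:
            raise HTTPException(status_code=404, detail="Customer not found")
        return Customer(customer_id=customer_id, **record)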

The central layer of process APIs interacts with and shapes data across systems. For example, a customer API may be created that provides a holistic view of a customer by composing fields from multiple system APIs that hold customer data. Finally, experience APIs are user-focussed and provide a mechanism for reconfiguring data from a common source, typically multiple process APIs, so that it is most easily consumed by its intended audience.
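
Continuing the same illustrative setup, the sketch below shows how process-API logic might compose two hypothetical system APIs into a single customer view, and how experience-API logic might then trim that view down for a particular consumer such as a mobile app. Again, the URLs and fields are assumptions made for the sake of the example.

    import requests

    # Hypothetical system APIs maintained by central IT; URLs are illustrative.
    CRM_API = "https://api.example.internal/system/crm/v1"
    BILLING_API = "https://api.example.internal/system/billing/v1"

    def customer_360(customer_id: str) -> dict:
        """Process API logic: compose a holistic customer view from the
        system APIs, each of which owns one slice of the data."""
        profile = requests.get(
            f"{CRM_API}/customers/{customer_id}", timeout=10
        ).json()
        billing = requests.get(
            f"{BILLING_API}/accounts/{customer_id}/balance", timeout=10
        ).json()
        return {
            "customer_id": customer_id,
            "name": profile["name"],
            "segment": profile["segment"],
            "outstanding_balance": billing["outstanding_balance"],
        }

    def customer_summary_for_mobile(customer_id: str) -> dict:
        """Experience API logic: reshape the common process-API view into
        the minimal payload a mobile app actually needs."""
        view = customer_360(customer_id)
        return {"name": view["name"], "balance": view["outstanding_balance"]}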

Summary: A change worth making

For most organisations, significant time, effort and money will need to be invested to transition towards the streamlined, data-driven infrastructure described above. However, those forward-thinking organisations that do implement a centralised and open data source are far better placed to dig into their data and reap the associated rewards, which range from streamlined application development through to forecasting business revenue using machine learning and AI. If you are one of these organisations, please get in touch so we can help you realise the true potential of your data.