Thu, 22 September 2022

An agile approach to interoperable data sharing in the energy industry

The energy industry needs to modernise rapidly to meet compliance requirements from Ofgem. In our first blog of this series, we looked at why sharing data is so important for the energy industry, and we introduced Ofgem’s Presumed Open regulations.

Here, we look at how businesses can modernise their approach to data in order to become compliant and realise the benefits of better data management.

Understanding Ofgem’s requirements

As we explored in the first blog, Ofgem is looking for distribution network operators to accelerate their digital transformation efforts in the upcoming price control period, beginning in 2023. Ofgem has published 11 data guidelines, designed to ensure data is treated as an asset and used effectively for the benefit of customers, stakeholders, and the public interest.

These best practice guidelines are as follows:

1. Identify the roles of stakeholders of Data Assets.

2. Use common terms within Data Assets, Metadata, and supporting information.

3. Describe data accurately using industry-standard Metadata.

4. Enable potential Data Users to understand Data Assets by providing supporting information.

5. Make Data Assets discoverable for potential Data Users.

6. Learn and deliver to the needs of current and prospective Data Users.

7. Ensure data quality maintenance and improvement are prioritised by Data User needs.

8. Ensure Data Assets are interoperable with Data Assets from other data and digital services.

9. Protect Data Assets and systems in accordance with Security, Privacy, and Resilience (SPaR) best practice.

10. Store, archive and provide access to Data Assets in ways that ensure sustained benefits.

11. Treat all Data Assets, their associated Metadata, and Software Scripts used to process Data Assets as Presumed Open.

Compliance with these guidelines will mean energy operators need to open up common data sets to other players in the market. So, what steps do they need to take to achieve this?

NTT DATA’s methodology

NTT DATA has developed a four-step evolutionary and agile methodology to help organisations modernise their data and meet Ofgem’s compliance requirements. The four actionable steps can be summarised as follows:

1. Plan: identify your data assets and stakeholders, and define the data-sharing strategy and use cases.

2. Standardise: create trusted interoperable data assets and products.

3. Secure: protect the data with techniques appropriate to each data-sharing request.

4. Open: make data discoverable, searchable, understandable, and shareable.

The aim of this methodology is to enable organisations to adopt Ofgem's Presumed Open best practices across their operations. This involves opening and sharing data sets iteratively until the data-sharing scope is fully covered and the required capabilities reach full maturity. Let’s look at each of the steps in turn:

Planning

In the initial discovery phase, organisations will need to appoint a leader responsible for overseeing the data modernisation process. This leader will evaluate the organisation’s in-house data governance and management capabilities to determine whether the right expertise and infrastructure are available to create governed data assets – where data quality is enforced, sensitive data is secured, access to data is controlled, and the data lifecycle is managed.

At this stage, organisations should define their initial data-sharing scope, including justifying why some data cannot be opened – due to privacy concerns, for example. If an organisation is already sharing data with external parties, that will count as part of the initial scope for this initiative.

The planning phase should also outline the architectural capabilities needed to open and share data, along with the cultural changes required to support the strategy. The data culture will need to adapt to the ‘domain-driven data product’ paradigm, which demands strong data ownership and high-quality data that can be shared with consumers from related data domains.
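
As a loose illustration of what ‘strong data ownership’ can mean in practice, the sketch below describes a data product with an owning domain, an accountable owner, and explicit quality expectations. The field names, thresholds, and the example product are assumptions for illustration only, not part of Ofgem’s guidance or a prescribed NTT DATA format.

```python
from dataclasses import dataclass, field

@dataclass
class DataProduct:
    """Illustrative descriptor for a domain-owned data product (field names are assumptions)."""
    name: str               # e.g. "substation-load-profiles"
    domain: str             # owning data domain, e.g. "network-operations"
    owner: str              # accountable data owner (person or team)
    description: str        # plain-language description for consumers
    quality_slo: dict = field(default_factory=dict)  # e.g. completeness, freshness targets

# Hypothetical example of a domain taking ownership of a data set
load_profiles = DataProduct(
    name="substation-load-profiles",
    domain="network-operations",
    owner="network-data@example-dno.co.uk",
    description="Half-hourly aggregated load per primary substation.",
    quality_slo={"completeness": 0.99, "max_staleness_hours": 24},
)
```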

Standardisation

The second phase of NTT DATA’s methodology involves sharing data products that are interoperable because they are built on industry standards. A potential challenge here is that some existing data sets may not have been designed around industry data-modelling standards, and transforming them to be fully interoperable can be costly.

In such cases, ‘metadata interoperability’ is advisable. This involves mapping existing data onto an industry-standard business glossary to make it understandable for consumers. The related data product is then shared together with its metadata, providing the right semantics to the consumer. For new data products, organisations can use industry-standard data models from the outset to provide interoperable data sets to consumers.
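
A minimal sketch of what such a mapping could look like, assuming a simple lookup from internal field names to glossary terms. The field names and glossary entries are hypothetical; a real implementation would typically reference an agreed industry model such as the IEC Common Information Model.

```python
# Hypothetical mapping of internal field names to industry-standard glossary terms.
FIELD_TO_GLOSSARY = {
    "sub_id":  {"term": "Substation.mRID", "definition": "Unique identifier of the substation"},
    "kw_avg":  {"term": "AnalogValue.value", "definition": "Average active power over the interval", "unit": "kW"},
    "read_ts": {"term": "MeasurementValue.timeStamp", "definition": "Timestamp of the reading (UTC)"},
}

def describe(record: dict) -> dict:
    """Attach glossary semantics to a raw record so consumers can interpret it."""
    return {
        name: {"value": value, **FIELD_TO_GLOSSARY.get(name, {"term": None})}
        for name, value in record.items()
    }

raw = {"sub_id": "SUB-0042", "kw_avg": 317.5, "read_ts": "2022-09-22T10:00:00Z"}
print(describe(raw))  # each value now carries its glossary term and definition
```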

Securing data

Security is a crucial part of our methodology for data modernisation. As a first step, organisations will need the data management capabilities required to protect data products in line with regulations such as GDPR.

Before opening data to consumers, additional data privacy risk mitigation will need to be applied based on the data consumption context. This could involve anonymisation, pseudonymisation, randomisation, or aggregation to limit the detail provided to consumers. This process, known as ‘data triage’, also documents for data stakeholders how the related risks are mitigated.
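
For illustration only, the sketch below applies two of these techniques to a handful of records: pseudonymisation of a customer identifier using a keyed hash, and aggregation of consumption to postcode-area level. The field names and the choice of HMAC-SHA-256 are assumptions, not prescriptions from the methodology.

```python
import hashlib
import hmac
from collections import defaultdict

SECRET_KEY = b"replace-with-a-managed-secret"  # would come from a key vault in practice

def pseudonymise(customer_id: str) -> str:
    """Replace a direct identifier with a keyed hash; re-identification requires the key."""
    return hmac.new(SECRET_KEY, customer_id.encode(), hashlib.sha256).hexdigest()[:16]

def aggregate_by_area(readings: list) -> dict:
    """Aggregate half-hourly consumption to postcode-area level to limit detail."""
    totals = defaultdict(float)
    for r in readings:
        totals[r["postcode_area"]] += r["kwh"]
    return dict(totals)

readings = [
    {"customer_id": "C-1001", "postcode_area": "SW1", "kwh": 0.42},
    {"customer_id": "C-1002", "postcode_area": "SW1", "kwh": 0.35},
    {"customer_id": "C-2001", "postcode_area": "M1",  "kwh": 0.51},
]

pseudonymised = [{**r, "customer_id": pseudonymise(r["customer_id"])} for r in readings]
print(aggregate_by_area(pseudonymised))  # area-level totals only, no household detail
```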

Making data open

Finally, interoperable data products will need to be shared based on specific data consumption requests. These requests could be to share data within an organisation, or across organisations. Data sharing requests might also come from external partners in the industry. In addition, organisations might be required to share their data with public government portals.

For each request, the data needs to be prepared accordingly. This might require creating new data fields and mitigating data privacy risks with additional data protection techniques. Related metadata, published for example as JSON documents, also needs to be made available to the consumer. The ‘Open Data’ process should be automated as far as possible with self-service capabilities that allow data domains to prepare the required data rapidly for each data consumption context.
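
As an example of the kind of accompanying metadata document mentioned above, the snippet below assembles a simple JSON description of a shared data product. The schema shown is an illustrative assumption; in practice the metadata would follow the organisation’s chosen standard, for example a Dublin Core-style or CIM-aligned profile.

```python
import json
from datetime import date

# Illustrative metadata for a shared data product; the field names are assumptions,
# not a prescribed schema.
metadata = {
    "title": "Half-hourly substation load profiles",
    "description": "Aggregated half-hourly load per primary substation.",
    "publisher": "Example DNO Ltd",
    "licence": "https://creativecommons.org/licenses/by/4.0/",
    "issued": date.today().isoformat(),
    "update_frequency": "daily",
    "sensitivity": "open",  # outcome of the data triage assessment
    "fields": [
        {"name": "substation_id", "type": "string", "glossary_term": "Substation.mRID"},
        {"name": "interval_start", "type": "datetime", "glossary_term": "MeasurementValue.timeStamp"},
        {"name": "load_kw", "type": "number", "unit": "kW"},
    ],
}

with open("substation-load-profiles.metadata.json", "w") as f:
    json.dump(metadata, f, indent=2)
```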

For data that cannot be shared, Ofgem asks operators to maintain a ‘data triage’ log that justifies why certain assets cannot be opened to the industry. This report must be created as part of the ‘Open Data’ process.
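
Ofgem does not prescribe a format for this log; the entry below is a hypothetical sketch of the kind of record an operator might keep for a data set that is withheld after triage.

```python
# Hypothetical data triage log entry for a data set that cannot be opened as-is.
triage_entry = {
    "data_asset": "smart-meter-readings-raw",
    "decision": "closed",
    "reason": "Contains personal data at household level (GDPR); open only in aggregated form.",
    "mitigations_offered": ["postcode-area aggregation", "pseudonymised extracts on request"],
    "reviewed_by": "data-governance-board",
    "review_date": "2022-09-22",
}
```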

Once the data is opened – and risk mitigation techniques have been applied – it can be shared with consumers through channels such as a data marketplace, an analytics tool of choice, or an API flexible enough to meet all data consumption requirements. Data sharing should observe the ‘no data copy’ principle, and manual extracts should not be allowed, as they make it difficult to track how the data is being consumed.
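
A minimal sketch of the API option, assuming Flask and an in-memory data set (the endpoint path, parameters, and data are hypothetical). Serving data through a single read-only endpoint like this, rather than handing out file extracts, keeps consumption on one governed path that can be logged and access-controlled.

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

# In practice this would query the governed data platform, not an in-memory list.
LOAD_PROFILES = [
    {"substation_id": "SUB-0042", "interval_start": "2022-09-22T10:00:00Z", "load_kw": 317.5},
    {"substation_id": "SUB-0042", "interval_start": "2022-09-22T10:30:00Z", "load_kw": 305.2},
]

@app.route("/data-products/substation-load-profiles")
def load_profiles():
    """Serve the data product read-only, so consumption stays on one auditable path."""
    substation = request.args.get("substation_id")
    rows = [r for r in LOAD_PROFILES if not substation or r["substation_id"] == substation]
    return jsonify({"count": len(rows), "data": rows})

if __name__ == "__main__":
    app.run(port=8080)
```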

Starting your journey

In the past, data sharing has largely taken place in an ad-hoc way. Now, it is increasingly a necessity, and an automated, self-service approach to ‘Open Data’ and data sharing is needed to enable organisations to share data easily and rapidly with consumers.

Organisations need to consider an overarching data (sharing) strategy with the required architectural capabilities and cultural changes to unlock the potential of data, deliver business objectives, promote industry innovation, and meet compliance requirements. If you’re looking to progress your data modernisation strategy, get in touch today.
