A digital twin is a virtual model of a physical object. It comprises an interconnected system of sensors and data-analysis methods that use aspects of artificial intelligence to learn about the systems it is monitoring and to predict when issues may arise, based on the data it has already seen.
The impact on an organisation of having a fully functional digital twin could be vast: knowing what’s going to happen before it happens; knowing exactly where an action needs to take place; replacing equipment before it breaks down; and comparing the performance of different component models. All of these things can give housing providers insights that will save them money, improve customer service and provide better homes for their tenants.
Getting an application with this much impact up and running isn’t as easy as installing a new piece of software. The integrated nature of the digital twin needs to be carefully considered, not just the physical location of sensors and how they capture and transmit data, but also how that data is interpreted within business processes. You would also need to consider what data you are feeding into the system, how reliable that data is and how frequently it’s updated. The last thing you want is to rely on a system that is using incorrect information and therefore recommending poor-quality actions. Just as a new development needs strong foundations for its buildings, a digital twin relies on the quality of your housing and asset data.
How many storeys?
For example, consider an eight-storey building where the light goes out in the top-floor stairwell. Your property database may have this (incorrectly) recorded as a seven-storey building, a surprisingly likely mistake because storeys are numbered from the ground up (ground, first… seventh), so the top floor of an eight-storey building is labelled the seventh. The digital twin reports the issue and a ticket is raised to replace the lightbulb on the eighth storey. The person who picks up the ticket may review the location and reject it, saying there must be something wrong with the new system because the property database says there are only seven storeys. The lightbulb therefore remains broken and the user’s confidence in the digital twin is reduced. The customers then have to report the broken light manually and the benefit of the digital twin is lost.
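The off-by-one trap in this example can be caught with a simple validation rule. This is an illustrative sketch only: the field names (`storey_count`, `top_floor_label`) and the record structure are assumptions, not a real housing-database schema.

```python
# Sketch of a consistency check between a recorded storey count and the
# zero-indexed top-floor label (ground = 0, first = 1, ... seventh = 7).
# Field names are hypothetical, chosen for illustration.

def expected_storey_count(top_floor_label: int) -> int:
    """UK convention numbers floors from the ground up, so a top floor
    labelled 'seventh' (7) implies an eight-storey building."""
    return top_floor_label + 1


def flag_storey_mismatch(record: dict) -> bool:
    """Return True when the recorded storey count disagrees with the
    count implied by the top-floor label."""
    return record["storey_count"] != expected_storey_count(record["top_floor_label"])


# The worked example: top floor labelled 'seventh' but only seven
# storeys recorded - this is the mis-recorded eight-storey block.
print(flag_storey_mismatch({"storey_count": 7, "top_floor_label": 7}))  # True
print(flag_storey_mismatch({"storey_count": 8, "top_floor_label": 7}))  # False
```

Running a rule like this across the whole property database before go-live would surface every building recorded with this off-by-one error, rather than discovering them one rejected ticket at a time.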
Meet your digital cousin
If your data is incomplete or inaccurate, you won’t have a digital twin so much as a digital cousin: something that looks and acts similar but in practice is really quite different. You could still use it, but its output wouldn’t be accurate and might cause further issues. This isn’t an insurmountable problem; it just means there is more work to be done, and a larger margin of error to allow for, until the data has been cleansed.
Data cleansing is often a daunting task that has been attempted before, but over time the quality has deteriorated again. Unfortunately, having high-quality data is not a ‘one and done’ task. The data needs to be maintained and managed but with the right processes in place this doesn’t need to be an onerous task.
To really make the most out of any data cleansing activities, you need to ensure that you’ve created robust lifecycle processes for your data, from initial collection through maintaining the data and into the end of its life.
Implementing effective data controls and clear processes for changing data will maintain both the quality of your data and your confidence in it, because every change that is needed will be actioned appropriately and promptly. This is also the point at which to go over your existing data and ensure that it’s correct, fixing errors where you find them and following the data-change protocols you have already designed and implemented.
Qualitative & quantitative reporting
Once the data has been cleansed, it needs to be monitored and maintained through effective and timely reporting; this should combine quantitative and qualitative reporting to ensure the data is complete as well as valid and accurate. Where data is reported, it’s important to consider who it’s reported to; in this context, your data improvement programme needs to establish data governance roles and tailor reporting to the needs of each one.
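A quantitative report of the kind described here often starts with a completeness score: what share of the fields you care about are actually populated. The sketch below assumes a list of asset records as dictionaries; the required field names are illustrative, not a real schema.

```python
# Illustrative completeness metric for asset records.
# REQUIRED_FIELDS is an assumption chosen for the example.

REQUIRED_FIELDS = ["address", "storey_count", "build_year", "tenure"]


def completeness(records: list) -> float:
    """Fraction (0.0-1.0) of required fields populated across all records.
    Empty strings and None both count as missing."""
    total = len(records) * len(REQUIRED_FIELDS)
    if total == 0:
        return 1.0
    filled = sum(
        1
        for record in records
        for field in REQUIRED_FIELDS
        if record.get(field) not in (None, "")
    )
    return filled / total


records = [
    {"address": "1 High St", "storey_count": 8, "build_year": 1998, "tenure": "social"},
    {"address": "2 High St", "storey_count": None, "build_year": 2004, "tenure": ""},
]
print(completeness(records))  # 6 of 8 required fields populated -> 0.75
```

A score like this tells you *how much* data is missing (quantitative); it says nothing about whether the populated values are right, which is where the qualitative side, such as spot checks and surveyor validation, comes in.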
Improving your data quality will have a significant impact across the business and improve the effectiveness of any new technologies.
Quite simply, you will stop looking after lots of tenants apparently born on 01/01/1900, stop inspecting phantom components and gain true confidence in your compliance position. You can then be sure that your digital twin is actually a twin, not a distant cousin.
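The 01/01/1900 problem mentioned above is a classic placeholder-value issue, and it can be caught with a simple validity rule. This is a sketch under assumptions: the set of placeholder dates (01/01/1900 plus the Unix epoch, another common system default) is illustrative and would need tuning to your own systems.

```python
from datetime import date

# Sketch of a validity rule catching placeholder dates of birth.
# The PLACEHOLDER_DOBS set is an assumption: common system defaults,
# not a definitive list for any particular housing system.

PLACEHOLDER_DOBS = {
    date(1900, 1, 1),  # the classic 01/01/1900 default
    date(1970, 1, 1),  # Unix epoch, another frequent placeholder
}


def suspicious_dob(dob: date) -> bool:
    """Flag dates of birth that are likely system defaults, not real."""
    return dob in PLACEHOLDER_DOBS


print(suspicious_dob(date(1900, 1, 1)))   # True - almost certainly a placeholder
print(suspicious_dob(date(1985, 6, 12)))  # False
```

Flagged records then feed the qualitative side of the reporting loop: someone contacts the tenant or checks the tenancy file, and the corrected value goes back through the data-change process.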
Camilla Shrieve is a data consultant at Data Futurists.