For years, digital twins in the built environment were a consultant’s dream and an operator’s headache. Beautiful 3D models. Impressive demos. Expensive pilots that never quite made it into daily use.
That is starting to change. But not in the way the hype suggested it would.
In January 2026, Twinview director Neil Hancock gave one of the most honest assessments of the digital twin market you are likely to read. Not a sales pitch. Not a white paper. A frank conversation about where real adoption is happening — and why so many projects still fail to deliver.
His conclusion: 2026 will be the year digital twins move from visualisation tools into genuine operational infrastructure. But only for organisations that have done the unglamorous work first.
Where Digital Twins Are Actually Working
Forget smart offices and shiny new commercial developments. According to Hancock, the sectors seeing the fastest adoption are the ones under the most operational pressure: higher education, healthcare, social housing, and large estates portfolios.
These are environments with ageing buildings, tight budgets, regulatory compliance pressures, and FM teams stretched thin. A digital twin that connects live building systems, asset registers, energy data, and maintenance workflows in one place is not a luxury for these organisations. It is survival.
The organisations pulling ahead are not the ones with the best 3D models. They are the ones that have made the digital twin part of how the building is actually managed. Twinview, developed by BIM Technologies, positions itself exactly here — as an operational layer connecting BIM, IoT sensor data, and CAFM systems into a single working environment for estates teams.
When it works, you see it in behaviour. People stop treating it as a separate system. They just go straight to the twin because it is the fastest way to understand what is happening and what to do next.
The Data Problem Nobody Wants to Admit

Here is the part most vendors gloss over. Hancock puts it plainly:
“A lot of estates teams don’t have normalised asset data, consistent naming, integrated systems or even a trusted single view of what’s happening across the building. You’ve got data sitting in the BMS, in spreadsheets, in CAFM, in energy portals, and half of it doesn’t line up.”
This is the hidden reason up to 75% of digital twin initiatives fail to deliver meaningful ROI. Not because the technology does not work. Because the data foundation underneath it is broken.
Asset registers that have not been updated since the building was handed over. Sensor data arriving in incompatible formats. Energy metering that does not map to any recognisable zone in the building model. For quantity surveyors and FM professionals, this will sound familiar. It is the same data quality problem that undermines cost planning, life cycle cost analysis, and condition surveys. Different system, same root cause.
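The "half of it doesn't line up" problem often comes down to the same physical asset carrying different identifiers in different systems. A minimal sketch of how a reconciliation pass can surface those mismatches; the asset names and systems here are invented for illustration, not taken from any real register:

```python
# Illustrative only: the same assets named inconsistently in a BMS
# export and a CAFM register. Normalising identifiers before joining
# reveals what matches and what is orphaned in each system.

def normalise(asset_id: str) -> str:
    """Canonicalise an identifier: uppercase, strip separators."""
    return asset_id.upper().replace("-", "").replace("_", "").replace(" ", "")

bms_assets = {"AHU-01", "AHU-02", "chw pump 1"}      # from the BMS
cafm_assets = {"AHU_01", "AHU_03", "CHWPUMP1"}       # from CAFM

bms_norm = {normalise(a) for a in bms_assets}
cafm_norm = {normalise(a) for a in cafm_assets}

matched = sorted(bms_norm & cafm_norm)
bms_only = sorted(bms_norm - cafm_norm)
cafm_only = sorted(cafm_norm - bms_norm)

print("Matched after normalising:", matched)   # ['AHU01', 'CHWPUMP1']
print("In BMS only:", bms_only)                # ['AHU02']
print("In CAFM only:", cafm_only)              # ['AHU03']
```

A naive join on the raw strings would have found zero matches; even this crude normalisation recovers two. Real reconciliation is harder, but the principle is the same.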
And it is getting worse, not better, as organisations layer more systems on top of each other without ever fixing the foundations.
Why This Is an AI-Readiness Issue
Here is where it gets important for anyone thinking about AI in building operations. Digital twins are increasingly the delivery mechanism for AI-powered building management — predictive maintenance, automated fault detection, energy optimisation. But as Hancock notes: “The quality of insight you can get from AI correlates directly with the quality and structure of your building data. If that foundation isn’t there, you’re basically just generating plausible-sounding answers.”
That is a sentence worth reading twice. AI tools in the built environment are only as trustworthy as the data they run on. Garbage in, garbage out — but now the garbage is dressed up in a confident-sounding dashboard.
Getting AI-ready in the built environment means getting your data ready first. That means centralising asset records, standardising naming conventions, integrating systems, and establishing clear data governance so you know what is reliable and what is not. It is less exciting than demos of autonomous buildings. But it is the work that determines whether your AI investments deliver real outcomes or just expensive theatre.
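In practice, "standardising naming conventions" and "knowing what is reliable" often start with something as unglamorous as validating records against an agreed pattern before they feed a twin or an AI layer. A hedged sketch; the convention (BUILDING-SYSTEM-NUMBER, e.g. "B1-AHU-01") and the records are illustrative assumptions, not a real standard:

```python
# Illustrative governance check: flag asset records that do not
# follow an agreed naming convention, so downstream tools know
# which data can be trusted and which needs cleanup first.
import re

# Assumed convention: building code, system code, two-digit number.
NAMING_CONVENTION = re.compile(r"^B\d+-[A-Z]{2,5}-\d{2}$")

records = ["B1-AHU-01", "B1-AHU-2", "pump?", "B2-CHWP-04"]

trusted = [r for r in records if NAMING_CONVENTION.fullmatch(r)]
needs_cleanup = [r for r in records if not NAMING_CONVENTION.fullmatch(r)]

print("Trusted:", trusted)              # ['B1-AHU-01', 'B2-CHWP-04']
print("Needs cleanup:", needs_cleanup)  # ['B1-AHU-2', 'pump?']
```

The point is not the regex; it is that the pass/fail split gives an estates team an explicit, auditable boundary between data an AI tool can lean on and data it should not.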
The Responsible with AI programme is built around exactly this principle: that responsible AI adoption in construction and FM starts with competence, governance, and data infrastructure — not with the tool itself. A digital twin powered by clean, structured data with proper human oversight is a genuine operational asset. Without those foundations, it is a liability dressed as innovation.