Who Owns the Data When AI Surveys Your Building?

By Responsible with AI Team | Last updated: 18 Apr 2026 | 5 min read

A surveyor walks around your building with a tablet. AI tools process the photos, generate a defect report, produce a structural summary, and flag maintenance priorities. The report lands in your inbox 24 hours later.

Now ask yourself: who owns that data? Who owns the report? And what can the vendor do with the information that just got fed into their system?

Most firms in the built environment have not thought this through yet. The vendors have.

Three parties, three competing interests

When AI processes a building survey, you end up with a three-way tangle. The client owns the building and wants the data. The surveyor produced the professional judgement. The AI vendor built the tool and almost certainly wants to improve it using your project data.

Standard contracts were not written for this. The RICS standard forms and JCT appointments predate AI-generated outputs by some margin. Most say little about what happens when third-party AI tools are embedded in the workflow. That gap is where disputes begin.

A recent JD Supra analysis of AI in construction contracts identifies data ownership as one of the highest-risk areas. The core question is whether the client owns the inputs fed into the AI, the outputs it generates, or both. In many current contracts, the answer is unclear. Some vendor terms explicitly reserve the right to use project data for model training. Unless you have a clause prohibiting this, you may have already agreed to it.

The Data Ownership Gap

60% — the share of AI projects Gartner estimates may fail because of data governance gaps

11,520 — responses to the UK government's Copyright and AI consultation (2024-25)

9 Mar 2026 — the date the RICS AI standard takes effect, requiring data governance protocols

IP in AI-generated reports: the UK picture

Copyright in AI-generated works is actively contested in the UK. The UK government's report on Copyright and AI, published in March 2026 under the Data (Use and Access) Act 2025, acknowledged that under current UK law, unlicensed use of copyright-protected works in AI training is likely restricted. The debate is far from settled.

For surveyors, there is a practical wrinkle. The UK's Copyright, Designs and Patents Act 1988 (Section 9(3)) provides a framework for computer-generated works, attributing authorship to the person who made the arrangements for the work. But when an AI tool produces a report from your building data, who made the arrangements? The surveyor who commissioned the tool? The client who owns the building? The vendor who built the algorithm?

The RICS global standard for responsible use of AI in surveying, mandatory from 9 March 2026, requires members to address data governance, system governance, and output reliability. It specifically covers AI procurement and due diligence. It does not resolve the IP question for you. But it makes clear you cannot ignore it.

"Many vendor contracts give them rights to use your project data for model training. Unless you have a clause prohibiting this, you may have already agreed."

GDPR and building data

Building surveys increasingly capture personal data, whether that is images of residents' possessions in a flat inspection, floor plans that reveal movement patterns, or access control data. Once personal data enters an AI processing pipeline, GDPR applies. That means purpose limitation, data minimisation, appropriate processing agreements with vendors, and documented lawful basis.
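Data minimisation is concrete enough to automate. As an illustrative sketch only (the `pseudonymise` helper, the field names, and `SECRET_KEY` are assumptions for this example, not part of any vendor's API), a firm could replace direct identifiers in inspection records with keyed hashes before the records ever reach a vendor's AI pipeline:

```python
import hashlib
import hmac

# Assumption: this key is held by the surveying firm (the data controller),
# never shared with the AI vendor, and managed under the firm's key policy.
SECRET_KEY = b"replace-with-a-managed-secret"

def pseudonymise(record, fields=("occupant_name", "contact_email")):
    """Return a copy of an inspection record with direct identifiers
    replaced by keyed hashes before upload to an AI platform.

    The HMAC is deterministic, so the firm can re-link AI outputs to
    the original record; the vendor cannot reverse it without the key.
    """
    clean = dict(record)
    for field in fields:
        if field in clean:
            digest = hmac.new(SECRET_KEY, clean[field].encode("utf-8"),
                              hashlib.sha256).hexdigest()
            clean[field] = digest[:16]  # short, opaque token
    return clean

record = {"flat": "4B", "occupant_name": "J. Smith",
          "defect": "damp in kitchen"}
print(pseudonymise(record))  # flat and defect survive; the name does not
```

Note that pseudonymised data is still personal data under GDPR while the firm holds the key, so a step like this narrows exposure rather than removing it; a processing agreement with the vendor is still required.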

If the AI tool is cloud-hosted, that adds another layer. Where is the data processed? Who has access? What happens if the vendor is acquired or goes under? These are not hypothetical concerns. Construction tech firms have come and gone with project data still on their servers.

What your contracts need to say

The JD Supra construction AI contract guide recommends explicit clauses covering five areas: data ownership of both inputs and outputs, restrictions on vendor use of data for model training, confidentiality of project information uploaded to AI platforms, cybersecurity and breach-notification obligations if the data is compromised, and record retention and audit trail requirements for AI-generated documents.

On the surveyor side, professional appointments need to specify whether AI tools will be used, which tools, and what the surveyor's liability is for outputs they did not directly produce. The standard of care question is live. If an AI flags a defect incorrectly and a surveyor signs off the report without questioning it, who carries the liability?

None of this means avoiding AI in building surveys. The efficiency gains are real. But the data ownership questions need to be resolved before the surveyor walks through the door, not after the report is disputed in court.

Building AI Competence in the Built Environment

The Responsible with AI programme equips built environment professionals to evaluate, procure, and govern AI tools. Explore how to build the right frameworks for your firm.

Explore the Programme →
