Can AI Write a Construction Contract? Lawyers Say the Risks Are Still Too High

By Micah Stennett | Last updated: 2 Feb 2026 | 5 min read

Someone, somewhere in the built environment, has already done it. Opened ChatGPT, typed a prompt, and used whatever came out as the basis for a real construction contract. Maybe on a domestic extension. Maybe on a larger scheme. Maybe without telling their client.

It’s not hard to imagine why. A decent-looking contract lands in seconds. It has headings, clauses, the right-sounding legal language. If you don’t know what a proper NEC or JCT suite looks like, it feels like the real thing.

But it isn’t. And the lawyers who’ve tested it are unequivocal.

What Happens When You Ask ChatGPT to Draft a Contract

Construction Dive ran the experiment in 2023: ask ChatGPT to generate a design-bid-build contract for a 600-unit mixed-use project in San Jose, California. The result was a document with scope of work, payment terms, a termination clause, indemnification, insurance, and change order language. Superficially reasonable. See: Construction Dive (2023).

But construction attorneys tore it apart. The contract was under two pages. A comparable real-world agreement would run 60 to 100 pages before exhibits. It was missing special inspection requirements, obligations to pay subcontractors, liquidated damages, dispute resolution provisions, and lien rights clauses. Attorney Carol Sigmond put it plainly: “It looks complete, but it is actually missing a lot of key provisions. It’s enforceable, but there would be lots of disputes on the open issues.”

Construction Dive followed up in November 2025, two and a half years later. The conclusion? The risks have got worse, not better. Attorney Megan Shapiro said: “Oh my God, no. I would be terrified if I heard of anybody using it to generate a contract.” She added that consumer confidence in AI output is increasing even as the quality of that output may be declining. See: Construction Dive (2025).

The specific problem for construction is that contract law is jurisdictionally specific. NEC4 terms don’t translate to Texas lien statutes. JCT conditions don’t map to Scottish adjudication rules. An AI model trained on internet text doesn’t know your project, your jurisdiction, your trade, or the clause you need because your last client tried to do exactly this.

“AI cannot replace the need for lawyers to analyse and assess risk in construction contracts because risk analysis is highly project-specific. It’s unique to each project.”

The Hallucination Problem Is Not a Minor Footnote

Here’s the part that should alarm every quantity surveyor, project manager, and FM professional reaching for an AI drafting tool: the models lie. Not deliberately. But confidently and convincingly.

Stanford’s Human-Centered AI Institute found that general-purpose AI tools hallucinate on legal queries between 58% and 82% of the time. Even specialist legal AI tools marketed as “hallucination-free” produced false information in roughly one in six queries when tested. See: Stanford HAI (2024).

The consequences are now on the record. Damien Charlotin, a French lawyer and data scientist, tracks every court decision involving AI-generated hallucinations globally. His database now contains over 1,000 cases. Lawyers have been sanctioned, fined, referred to bar associations, and had cases dismissed. In one 2025 California appeal, 21 out of 23 case citations in a brief were fabricated by AI. The attorney was fined and referred to the state bar. See: Charlotin AI Hallucination Database.

This isn’t a legal sector problem that sits safely away from construction. Your contract documents, your dispute submissions, your adjudication bundles — if they’re drafted or reviewed using AI without proper oversight, you are exposed in exactly the same way.

The Air Canada case is instructive. The airline’s chatbot gave a passenger incorrect information about bereavement fares. Air Canada argued the chatbot was a “separate legal entity” responsible for its own actions. The tribunal called this a “remarkable submission” and ruled that the company was liable for everything on its website, chatbot included. If you deploy AI and it misleads a client, counterparty, or court, the liability lands with you. See: BBC (2024).

So What Should You Actually Do?

The answer isn’t to keep AI away from contract work entirely. That’s not realistic, and there are legitimate use cases: summarising contract terms, flagging missing clauses in a draft you’ve been sent, running a quick comparison between two versions of an agreement. AI is a useful first-pass tool in experienced hands.
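The “quick comparison between two versions” use case doesn’t even need AI. A minimal sketch, using Python’s standard `difflib` (the clause text here is invented for illustration): the diff flags textual changes for a human reviewer, but it says nothing about legal effect, and it is no substitute for a solicitor’s review.

```python
import difflib

# Two hypothetical versions of the same contract clause (invented text).
old_clause = [
    "The Contractor shall complete the Works by 30 June.",
    "Liquidated damages of 500 per week apply thereafter.",
]
new_clause = [
    "The Contractor shall complete the Works by 31 August.",
    "Liquidated damages of 500 per week apply thereafter.",
]

# unified_diff marks removed lines with "-" and added lines with "+",
# so a reviewer can see exactly which wording changed between versions.
diff = list(difflib.unified_diff(old_clause, new_clause,
                                 fromfile="v1", tofile="v2", lineterm=""))
for line in diff:
    print(line)
```

Here the changed completion date surfaces immediately; whether that change matters is still a judgement for a person who understands the project.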

But drafting from scratch? On a live project? Without a construction lawyer reviewing the output? That’s where it goes wrong. A professionally drafted contract suite from a construction solicitor typically costs £1,500 to £5,000. One unenforceable contract, one missing lien clause, one disputed payment term: the legal fees to fix that start at five figures and climb fast.

The firms that will get this right are the ones building governance around their AI tools now. That means knowing which tasks AI is appropriate for, who reviews the output before it goes anywhere, and what the liability position is if something goes wrong. That’s not a nice-to-have. For construction professionals working with contracts, it’s a duty of care question.

The Responsible with AI programme exists precisely because this gap is real. AI tools are powerful. They will become more embedded in how built environment professionals work. But power without governance isn’t progress; it’s risk transfer onto the people who don’t know what they’ve signed.



The Responsible with AI Training Platform offers accessible training on responsible AI principles, enabling professionals to build knowledge in ethical AI practices and governance.