AI Is Valuing Your Property. It Might Also Be Discriminating Against You

By Micah Stennett

Imagine two identical houses on the same street. Same floor plan. Same condition. Same extension. One is valued at 15% less than the other. The only difference? The neighbourhood demographics fed into the algorithm.

That is not a hypothetical. A 2024 survey reported by MPA Magazine found that 87% of UK estate agents believe automated valuation models undervalue properties, with 73% saying they do not trust AVMs to produce accurate results. These are not Luddites resisting technology. They are frontline professionals seeing the gap between what the algorithm says and what the market does.

This sits at the heart of property valuation, mortgage lending, and housing access across the UK. And the stakes are higher than most people realise.

The Numbers Tell an Uncomfortable Story

  • 87% of UK estate agents say AVMs undervalue properties (MPA Magazine)
  • 73% of agents do not trust AVMs to produce accurate results (MPA, 2024)
  • 1.2% ethnic minority representation among UK chartered surveyors (RICS)

The survey covered hundreds of estate agents across England and Wales. Their concerns are specific: AVMs struggle with non-standard properties, miss the impact of local amenities, and produce valuations that do not reflect what buyers actually pay. When those models inform mortgage decisions, the consequences are real.

Meanwhile, RICS has acknowledged that racism has caused lasting damage in real estate, with just 1.2% of UK chartered surveyors coming from ethnic minority backgrounds.

Why This Matters for Housing Equality

The Joseph Rowntree Foundation reports that more than a quarter of BAME working adults in the UK spend over 33% of their income on housing costs. When automated valuations systematically undervalue properties in certain areas, the effects compound.

“The algorithm does not need to see the applicant's ethnicity. It just needs to see the postcode, the property type, and the neighbourhood profile. The bias is baked into the data before the model even starts.”

Most AVMs are trained on historical transaction data, and that data reflects decades of systemic inequality in UK housing markets. A model that learns to reproduce past prices also learns to reproduce the undervaluation embedded in them: it never needs a protected characteristic as an input when postcode and neighbourhood profile act as proxies.
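A minimal sketch of that mechanism, using entirely invented figures: a toy AVM that simply predicts the average of past sales in a postcode. If historical prices in one postcode were depressed by discrimination rather than by any property difference, the model faithfully carries that gap forward.

```python
import statistics

# Synthetic illustration (all figures invented): two postcodes with
# physically identical housing stock. Past sale prices in postcode "B"
# are depressed by historical bias, not by any property difference.
history = {
    "A": [300_000, 310_000, 305_000, 295_000],
    "B": [255_000, 263_000, 259_000, 251_000],  # same houses, biased prices
}

def naive_avm(postcode: str) -> float:
    """Toy AVM: predict the mean of past sales in the postcode.
    Because the training data encodes historical bias, so does the model."""
    return statistics.mean(history[postcode])

value_a = naive_avm("A")
value_b = naive_avm("B")
gap = (value_a - value_b) / value_a  # relative undervaluation

print(f"Postcode A: £{value_a:,.0f}")
print(f"Postcode B: £{value_b:,.0f}")
print(f"Gap for identical houses: {gap:.0%}")
```

Ethnicity never appears as an input, yet the toy model reproduces a roughly 15% gap between identical houses, purely from postcode. Real AVMs are far more sophisticated, but the underlying risk is the same whenever the training data is biased.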

The legal precedent is already forming. In the Manjang v Uber Eats case, a driver of colour was locked out of his account after the platform's facial recognition system failed to verify his identity.

The UK Regulatory Landscape

The Equality Act 2010 covers indirect discrimination in housing and services. If an AVM systematically disadvantages certain groups, the organisation using it could face legal action.

The ICO's guidance on fairness, bias and discrimination in AI makes clear that organisations must assess whether their AI systems produce discriminatory outcomes.
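What does such an assessment look like in practice? One simple, hedged starting point (the grouping and figures below are invented for illustration) is to compare the model's valuation error against achieved sale prices across neighbourhood groups:

```python
import statistics

# Hypothetical audit sketch: compare an AVM's relative valuation error
# across neighbourhood groups. All figures are invented for illustration.
# error = (avm_value - achieved_sale_price) / achieved_sale_price
errors_by_group = {
    "majority-white areas": [-0.01, 0.02, 0.00, -0.02, 0.01],
    "minority-ethnic areas": [-0.08, -0.11, -0.06, -0.09, -0.10],
}

def mean_error(errors: list[float]) -> float:
    """Average signed valuation error; persistently negative means
    the model systematically undervalues properties in that group."""
    return statistics.mean(errors)

for group, errors in errors_by_group.items():
    print(f"{group}: mean error {mean_error(errors):+.1%}")
```

A mean error near zero in one group alongside a persistent negative error in another is exactly the kind of disparate outcome the ICO guidance asks organisations to test for before relying on the model. Real audits would go further, with significance testing and intersectional breakdowns, but a check like this is where most start.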

The answer is not to avoid AI in property valuation. The efficiency benefits are too significant. The answer is to adopt it responsibly.

Sources: MPA Magazine / Alto Survey (2024). RICS, Repairing the Damage of Racism in Real Estate. Hometrack, UK AVM Guide. ICO, Fairness, Bias and Discrimination in AI. EHRC, Assessing AI Equality Impact. JRF, Housing Costs. RSW Law, Manjang v Uber Eats.

Building AI Competence in the Built Environment

The Responsible with AI programme helps surveyors, valuers, and property professionals understand AI bias risks and build practical governance frameworks.

Explore the Programme → Responsible with AI
