Izzy Lepone 11 August 2025

AI-based gender bias could thwart care delivery, research finds


Gender bias introduced by the use of AI tools could threaten councils’ delivery of care services, a study has revealed.

Research undertaken by the London School of Economics (LSE) found that AI tools such as large language models (LLMs), which are used in social work contexts by more than half of the local authorities in England, could be ‘introducing gender bias into care decisions’.

As reported exclusively in The Guardian, the study evaluated adult social care records for 617 individuals, using Google’s ‘Gemma’ and Meta’s ‘Llama 3’ models.
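
For readers curious how such a comparison can be set up, the sketch below is a minimal illustration: it summarises the same case note twice, once as written and once with the gender swapped, so that any difference in the two outputs comes from the model rather than the case. The model checkpoint, swap rules and prompt wording here are assumptions for illustration, not the study’s published protocol.

```python
# A minimal sketch, assuming access to the Hugging Face 'transformers' library and
# the public 'google/gemma-2b-it' checkpoint. The swap map, prompt wording and model
# choice are illustrative assumptions, not the LSE study's actual protocol.
from transformers import pipeline

GENDER_SWAP = {"Mrs": "Mr", "She": "He", "she": "he", "her": "his", "woman": "man"}

def swap_gender(text: str) -> str:
    """Naive token-level gender swap; a real audit would need a far more careful rewrite."""
    out = []
    for word in text.split():
        core = word.strip(".,;:")
        out.append(word.replace(core, GENDER_SWAP.get(core, core)))
    return " ".join(out)

summariser = pipeline("text-generation", model="google/gemma-2b-it")

def summarise(case_note: str) -> str:
    prompt = f"Summarise this adult social care case note in two sentences:\n{case_note}\nSummary:"
    return summariser(prompt, max_new_tokens=120, do_sample=False)[0]["generated_text"]

note = "Mrs Smith, 84, lives alone; she has chronic pain and struggles with her mobility."
female_summary = summarise(note)
male_summary = summarise(swap_gender(note))
# The paired outputs can then be compared for tone, severity language and omissions.
```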

According to Dr Sam Rickman, the report’s lead author and a researcher in LSE’s Care Policy and Evaluation Centre (CPEC), women’s needs were ‘downplayed’ more frequently than those of their male counterparts in long-term care summaries produced by Gemma.

Google’s LLM was also found to use more ‘direct’ language in its summaries of male case notes, compared with the ‘euphemistic’ language in those produced for women.

The report also found that mental and physical health issues were ‘mentioned significantly more in male summaries’ generated by Gemma, while the health problems identified for women were presented as ‘less severe than men’s’, with specifics about women’s needs ‘sometimes omitted’.

While highlighting that LLMs could provide ‘substantial benefits’ for councils, such as ‘easing administrative burden’ in the care sector, the study emphasised that gender bias must be evaluated to achieve ‘fairness’.

The report said: ‘As generative models become more widely used for creating documentation, any bias within these models risks becoming part of official records’.

It also recommended that biases relating to ‘gender, ethnicity, and other legally protected characteristics’ within LLMs should be assessed, and provided ‘practical’, data-driven methods for doing so.
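
As a rough illustration of what one such ‘practical’, data-driven check might look like, the sketch below counts health-related terms in paired male and female summaries of the same cases and reports the average gap. The term list, pairing format and example text are assumptions for illustration only, not the report’s recommended method.

```python
# Rough illustration of a data-driven bias check: count health-related terms in
# paired male/female summaries and report the average difference. The term list and
# example pairs are illustrative assumptions, not taken from the report.
import re
from statistics import mean

HEALTH_TERMS = {"pain", "mobility", "dementia", "depression", "falls", "frailty", "medication"}

def health_mentions(summary: str) -> int:
    """Count case-insensitive occurrences of health-related terms in a summary."""
    tokens = re.findall(r"[a-z]+", summary.lower())
    return sum(1 for token in tokens if token in HEALTH_TERMS)

def average_mention_gap(pairs: list) -> float:
    """Mean difference (male minus female) in health mentions across paired summaries;
    a positive value means the male versions mention health needs more often."""
    return mean(health_mentions(male) - health_mentions(female) for male, female in pairs)

pairs = [
    ("Mr Smith has severe chronic pain and serious mobility problems.",
     "Mrs Smith has some discomfort and difficulty getting around."),
]
print(f"Average extra health mentions in male summaries: {average_mention_gap(pairs):.2f}")
```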

Dr Rickman explained: ‘If social workers are relying on biased AI-generated summaries that systematically downplay women’s health needs, they may assess otherwise identical cases differently based on gender rather than actual need. Since access to social care is determined by perceived need, this could result in unequal care provision for women.’

He added: ‘While my research highlights issues with one model, more are being deployed all the time, making it essential that all AI systems are transparent, rigorously tested for bias and subject to robust legal oversight.’

To learn more about digital transformation, check out: 10 Ways Councils Are Using AI to Transform Public Services.
