Izzy Lepone 11 August 2025

AI-based gender bias could thwart care delivery, research finds


Issues of gender bias bred by the use of AI tools could threaten councils’ delivery of care services, a study has revealed.

Research undertaken by the London School of Economics (LSE) found that AI tools such as large language models (LLMs), which are used in social work contexts by more than half of the local authorities in England, could be ‘introducing gender bias into care decisions’.

As first reported in an exclusive in The Guardian, the study evaluated adult social care records for 617 individuals using two LLMs: ‘Gemma’, created by Google, and Meta’s ‘Llama 3’.

According to Dr Sam Rickman, the lead author of the report and researcher in LSE’s Care Policy and Evaluation Centre (CPEC), women’s needs were ‘downplayed’ more frequently than those of their male counterparts in long-term care summaries produced by Gemma.

Google’s LLM was also revealed to use more ‘direct’ language in its summaries of male case notes, compared to the ‘euphemistic’ language in those produced for women.

The report found that mental and physical health issues were ‘mentioned significantly more in male summaries’ generated by Gemma, while the health problems identified for women were presented as ‘less severe than men’s’, with specifics about women’s needs ‘sometimes omitted’.

While highlighting that LLMs could provide ‘substantial benefits’ for councils, such as ‘easing administrative burden’ in the care sector, the study emphasised that gender bias must be evaluated to achieve ‘fairness’.

The report said: ‘As generative models become more widely used for creating documentation, any bias within these models risks becoming part of official records’.

It also recommended that biases relating to ‘gender, ethnicity, and other legally protected characteristics’ within LLMs should be assessed, and provided ‘practical’, data-driven methods for doing so.
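The report’s actual evaluation methods are not reproduced in this article. As a purely illustrative sketch of what a simple, data-driven bias check might look like, the snippet below compares how often severity-laden terms appear in summaries produced for otherwise identical male and female case notes. The `HEALTH_TERMS` word list and the two sample summaries are invented for illustration and are not drawn from the LSE study.

```python
import re

# Hypothetical list of severity-laden terms; a real audit would use a
# validated lexicon or a statistical comparison across many records.
HEALTH_TERMS = {"frail", "unable", "complex", "risk", "pain", "disabled"}

def term_rate(summary: str, terms: set[str] = HEALTH_TERMS) -> float:
    """Fraction of words in a summary that come from the severity lexicon."""
    words = re.findall(r"[a-z']+", summary.lower())
    hits = sum(1 for word in words if word in terms)
    return hits / max(len(words), 1)

# Invented example summaries for the same underlying case notes.
male_summary = "Mr Smith is frail and unable to manage complex needs; high risk of falls."
female_summary = "Mrs Smith has some difficulties with daily living and may need support."

# A large gap between the two rates on identical cases would flag
# possible gendered differences in how severity is described.
print(round(term_rate(male_summary), 3))    # rate for the male-framed summary
print(round(term_rate(female_summary), 3))  # rate for the female-framed summary
```

Run over many gender-swapped pairs of records, a systematic difference in rates like these is one signal that a model is describing identical needs differently by gender.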

Dr Rickman explained: ‘If social workers are relying on biased AI-generated summaries that systematically downplay women’s health needs, they may assess otherwise identical cases differently based on gender rather than actual need. Since access to social care is determined by perceived need, this could result in unequal care provision for women.’

He added: ‘While my research highlights issues with one model, more are being deployed all the time, making it essential that all AI systems are transparent, rigorously tested for bias and subject to robust legal oversight.’

To learn more about digital transformation, check out: 10 Ways Councils Are Using AI to Transform Public Services.
