Gender bias introduced by the use of AI tools could threaten councils’ delivery of care services, a study has revealed.
Research undertaken by the London School of Economics (LSE) found that AI tools such as large language models (LLMs), which are used in social work contexts by more than half of the local authorities in England, could be ‘introducing gender bias into care decisions’.
As reported exclusively by The Guardian, the study evaluated adult social care records for 617 individuals, using Google’s ‘Gemma’ and Meta’s ‘Llama 3’ models.
According to Dr Sam Rickman, the lead author of the report and a researcher at LSE’s Care Policy and Evaluation Centre (CPEC), women’s needs were ‘downplayed’ more frequently than those of their male counterparts in long-term care summaries produced by Gemma.
Google’s LLM was also found to use more ‘direct’ language in its summaries of male case notes, compared with the ‘euphemistic’ language in those produced for women.
The report found that mental and physical health issues were ‘mentioned significantly more in male summaries’ generated by Gemma, while the health problems identified for women were presented as ‘less severe than men’s’, with specifics about women’s needs ‘sometimes omitted’.
While highlighting that LLMs could provide ‘substantial benefits’ for councils, such as ‘easing administrative burden’ in the care sector, the study emphasised that gender bias must be evaluated to achieve ‘fairness’.
The report said: ‘As generative models become more widely used for creating documentation, any bias within these models risks becoming part of official records’.
It also recommended that biases relating to ‘gender, ethnicity, and other legally protected characteristics’ within LLMs should be assessed, and provided ‘practical’, data-driven methods for doing so.
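The article does not reproduce those methods, but in outline one common approach is a counterfactual ‘gender-swap’ test: take a case note, produce an otherwise identical version with the gendered terms switched, generate a summary of each with the model under test, and compare how the two summaries describe health needs. The Python sketch below illustrates the idea only; the generate_summary callable, the swap table and the list of health terms are hypothetical assumptions for illustration, not anything drawn from the LSE study.

```python
# Illustrative sketch of a counterfactual gender-swap bias check for LLM-generated
# care summaries. generate_summary(), SWAPS and HEALTH_TERMS are assumptions made
# for this example, not the study's actual method or code.

import re

# Crude female-to-male substitution table; a real analysis would need far more care
# with pronoun cases, names and titles.
SWAPS = {"she": "he", "her": "his", "hers": "his", "mrs": "mr", "ms": "mr",
         "woman": "man", "female": "male"}

HEALTH_TERMS = {"pain", "mobility", "depression", "anxiety", "medication", "falls"}


def swap_gender(text: str) -> str:
    """Return a copy of the case note with gendered terms swapped, preserving capitalisation."""
    def repl(match: re.Match) -> str:
        word = match.group(0)
        swapped = SWAPS[word.lower()]
        return swapped.capitalize() if word[0].isupper() else swapped

    pattern = r"\b(" + "|".join(SWAPS) + r")\b"
    return re.sub(pattern, repl, text, flags=re.IGNORECASE)


def health_mentions(summary: str) -> int:
    """Count occurrences of health-related terms in a generated summary."""
    words = re.findall(r"[a-z]+", summary.lower())
    return sum(1 for w in words if w in HEALTH_TERMS)


def bias_gap(case_note: str, generate_summary) -> int:
    """Difference in health mentions between male- and female-framed versions of one record.

    generate_summary is a hypothetical callable wrapping whichever LLM is being tested.
    A consistently positive gap across many records would suggest women's health needs
    are being described less fully than those in otherwise identical male records.
    """
    female_summary = generate_summary(case_note)
    male_summary = generate_summary(swap_gender(case_note))
    return health_mentions(male_summary) - health_mentions(female_summary)
```

In practice such a check would be run over a large sample of records and the gap tested statistically, and similar paired comparisons could be constructed for other protected characteristics.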
Dr Rickman explained: ‘If social workers are relying on biased AI-generated summaries that systematically downplay women’s health needs, they may assess otherwise identical cases differently based on gender rather than actual need. Since access to social care is determined by perceived need, this could result in unequal care provision for women.’
He added: ‘While my research highlights issues with one model, more are being deployed all the time, making it essential that all AI systems are transparent, rigorously tested for bias and subject to robust legal oversight.’