Future of Social Work Archives - Community Care
http://www.communitycare.co.uk/future-of-social-work/

AI tool improves direct work in adult social care despite accuracy concerns, practitioners report
Mon, 10 Feb 2025
https://www.communitycare.co.uk/2025/02/10/ai-tool-adult-social-care-accuracy-issues-practitioners-report/

This article is part of our new ‘Future of Social Work’ series, where we’ll be reporting on innovative practice approaches and technology driving social work forward. Get in contact with us to flag up anything that you think ticks either of those boxes at anastasia.koutsounia@markallengroup.com

Adult social care professionals have reported that using an artificial intelligence (AI) tool saved them time and improved direct work, despite some concerns over accuracy.

Research commissioned by Beam, the company behind the Magic Notes tool, found practitioners reported reduced time spent on case recording and assessments, enhanced engagement with people they supported, and improved quality of supervision meetings.

The tool was also commended by a non-native English speaker and a neurodivergent practitioner. However, concerns were raised over inaccuracies and assumptions in the summaries and scripts, requiring practitioners to make, sometimes time-consuming, edits.

Magic Notes, currently used by 85 councils in the UK, records meetings and generates a transcript, summary and suggested actions based on council-agreed prompts.

The evaluation, produced by Rob Procter and Stanislav Zhydkov of AI research body the Alan Turing Institute, was based on analysis of usage by, and feedback from, 91 staff in three councils in England collected during a trial of the tool between April and September 2024. Researchers also carried out in-depth interviews with 11 of the social care professionals, six of whom were social workers with one to six years’ experience.

Staff from the three authorities – Swindon, Barnet, and Camden – gave an average feedback rating of 4.26 out of 5.

‘I feel on top of my caseload’

Several interviewees said the tool significantly reduced the time they spent writing up notes and assessments. This reflects data from Swindon showing that the tool cut the average time to conduct a Care Act assessment conversation from 90 to 35 minutes and time spent on follow-up case notes from four to one-and-a-half hours, during its trial from April to July 2024.

One practitioner said the quick turnaround of assessments made a tangible difference to their workload and wellbeing.

“I feel that I’m getting through things more quickly. They’re good assessments. It’s allowing me time to go out and see more people more frequently. And just for my own wellbeing, I feel that I’m on top of my caseload, which you don’t hear social workers say that a lot.”

Beneficial for neurodivergent and disabled social workers

A social worker with ADHD, dyslexia and arthritis also found that ‘talking’ to the app significantly improved their workflow.

“It takes me a long time to structure things in a way that is cohesive for someone else. I’m not great with my sentence formation,” they said.

“It could take me a couple of days to make sure that my care assessment is [accurate], strength-based, and is looking at that person. Whereas if I have this 20-minute conversation with myself on Magic Notes, explaining my thought-process, saying what they said and summarising my thinking – it takes two minutes to generate. I have to change things but it cuts [the time] that assessment [takes] down by at least half.”

‘There is far better engagement’

By reducing time spent on note taking, Magic Notes allowed for better engagement in meetings with people needing support and enhanced the quality of supervision sessions, reported practitioners.

“[There is] far better engagement. […] We’ve got note-taking skills, but it’s giving me the opportunity to not think about that so much and just have the conversation,” said one participant.

Another social worker added: “I used to have a manager that would type on the laptop supervision notes as you go. And they spent 90% of the time looking at the laptop and about 10% looking at me and talking and it just felt like you weren’t having a natural conversation.

“With Magic Notes you could just put the phone in front of you and have a full conversation. I’m not having to pause to write something.”

However, one participant noted that the tool struggled in team meetings where multiple cases were discussed.

“I’ve had colleagues who’ve tried using it in meetings where it’s really not done well. When you’re talking about multiple different clients, it doesn’t capture all the information. And even when they’ve tried to edit it, it’s not managed to pull out more information.”

Celebrate those who’ve inspired you


Do you have a colleague, mentor or social work figure whom you can’t help but gush about?

Our My Brilliant Colleague series invites you to celebrate anyone within social work who has inspired you – whether current or former colleagues, managers, students, lecturers, mentors or prominent past or present sector figures whom you have admired from afar.

Nominate your colleague or social work inspiration by filling in our nominations form with a few paragraphs (100-250 words) explaining how and why the person has inspired you.

*Please note that, despite the need to provide your name and role, you or the nominee can be anonymous in the published entry*

If you have any questions, email our community journalist, Anastasia Koutsounia, at anastasia.koutsounia@markallengroup.com

‘Sometimes it makes assumptions’


Some practitioners noted inaccuracies, including name misspellings, repetitions and assumptions in the AI-generated script and summaries.

This required reviewing and editing, which sometimes proved time-consuming due to the tool’s slow processing of alterations, said professionals.

One interviewee said: “It’s not perfect yet. I wouldn’t just copy and paste [a summary] straight over because sometimes it makes assumptions. I think it can’t pick up on nuances and it did use to make quite a bit of assumption about someone’s needs or their understanding of things.”

“I’ll fact-check it to make sure that there isn’t anything kind of added in. What’s the word I’m looking for? Presumptuous. Sometimes it can add in detail based on what’s not been discussed.”

Another highlighted the importance of maintaining professional judgment: “I wouldn’t want to just copy-paste a full summary and be like that’s my assessment. I take ownership. I’m accountable for making recommendations and judgments about things that weren’t in the conversation or weren’t said. It was almost making professional judgements, which is the domain of a professional, whoever they are, not the AI.”

Prime minister hails AI’s impact on social work

 

Keir Starmer meets Ealing council staff and Beam chief executive Alex Stephany at Downing Street (photo: Simon Dawson / No 10 Downing Street)

The report comes with prime minister Keir Starmer having highlighted AI’s impact on social work, in setting out government plans to use the technology to transform services.

Launching the government’s AI opportunities action plan last month, Starmer said that artificial intelligence could “almost halve the time social workers spend on paperwork”.

He recently met Beam chief executive Alex Stephany and representatives from Ealing Council, where more than 100 social workers use Magic Notes, to discuss the impact of AI on practitioners’ work.

Ealing has calculated that the tool has reduced the time adult social care staff spend on admin by 44-50%.

Social worker Joanna James, who attended the 10 Downing Street reception, said: “I got into social work to support people and the AI tool has helped me and the team really focus on that part of the job by cutting down the time I spend doing paperwork.”

Directors welcome use of AI in social care

President of the Association of Directors of Adult Social Services (ADASS) Melanie Williams welcomed the use of AI in adult social care, emphasising the need to “harness” its potential to improve the sector.

“Using AI to speed up admin tasks will give care workers more time to spend with people they support, which is key to good quality social care,” she added.

“With most of us needing care and support at one point in our lives, cost of care increasing and workforce challenges, using technology in the right way can also bring real cost benefits to local councils, as well as improving outcomes for people.”

Calls for regulation and further evaluation of AI tools

However, the British Association of Social Workers (BASW) urged caution, calling for more evaluation of tools being used in social work.

A spokesperson said: “The evaluation of AI tools in the workplace is to be welcomed to improve understanding of how such tools are best used. A wide range of AI tools are now in use in workplaces without such evaluation and with a limited understanding of their flaws and restrictions. There are also concerns about data protection and security.

“AI tools, used appropriately, could potentially benefit social workers and the people with whom they work, but the question remains as to whether such time-saving tools free up time for relationship-based social work or whether any time saved will simply result in increased caseloads.”

Social Workers Union general secretary John McGowan raised similar concerns, saying: “[AI] should not be used as a quick fix to kick much-needed support into the long grass once again. Right now it is crucial that the proper AI frameworks, regulations, oversight, and guidance are developed and delivered so that this new technology can be used safely and efficiently.

“Social workers must be supported to use AI in a way that is designed to uphold human rights and does not risk perpetuating inequalities that many people they support already experience or undermine trust in the profession.”

Social Work England to review AI’s impact

Social Work England has also recently commissioned two pieces of research to examine how artificial intelligence is shaping social work practice and education.

The research will look at:

  • The areas of Social Work England’s professional standards that may be affected by social workers’ use of AI in their work.
  • The types of AI being used across health and social care in England and their application in social work practice, including the risks of bias and discrimination.
  • If social workers feel confident and prepared to use AI ethically and appropriately, in line with Social Work England’s professional standards, and how employers are supporting them to do this.
  • How social work education providers are preparing students for AI in their future work.
  • Data protection and confidentiality when using AI with people using services and the public.
Social Work England probes AI’s impact on profession
Tue, 04 Feb 2025
https://www.communitycare.co.uk/2025/02/04/social-work-england-probes-ais-impact-on-profession/

This article is part of our new ‘Future of Social Work’ series, where we’ll be reporting on innovative practice approaches and technology driving social work forward. Get in contact with us to flag up anything that you think ticks either of those boxes at anastasia.koutsounia@markallengroup.com

Social Work England has commissioned research to examine how artificial intelligence is affecting the profession, while also hosting a summit on the issue.

The two pieces of research – one of which is a literature review – are exploring how AI is shaping social work practice and education.

The regulator said the purpose of the research was to help it understand:

  • The areas of Social Work England’s professional standards that may be affected by social workers’ use of AI in their work.
  • The types of AI being used across health and social care in England and their application in social work practice, including the risks of bias and discrimination.
  • If social workers feel confident and prepared to use AI ethically and appropriately, in line with Social Work England’s professional standards, and how employers are supporting them to do this.
  • How social work education providers are preparing students for AI in their future work.
  • Data protection and confidentiality when using AI with people using services and the public.

Summit on AI amid increasing use in social work

The summit with sector leaders, held on 4 February 2025, covered the current extent of AI use in social work practice, the opportunities the technology can bring to a relationship-based profession, the risks it carries and the concerns being raised within the profession, and the ethical implications, particularly regarding equality, diversity and inclusion.

The news comes with increasing numbers of councils testing the impact of AI tools on practice, including in helping practitioners save time on recording and summarising case notes and suggesting actions to take following assessments or visits.

About one in five practitioners were using such tools for day-to-day case work as of October 2024, according to a Community Care poll.

Other usages for AI in the sector include supporting student and practitioner learning and predicting future needs for social care.

However, social work bodies have raised concerns about the technology’s impact on the profession, including in relation to the quality and reliability of tools, their susceptibility to bias and discrimination and their implications for the privacy of the people social workers work with.

Government plans to roll out AI in public sector

At the same time, the government is planning to roll out the use of artificial intelligence across the public sector in order to reform services.

The implications of this for social work and social care are as yet unclear, though prime minister Keir Starmer pointed to reductions in the time social workers spent on administration as a benefit of the technology, in launching the government’s AI opportunities plan last month.

In a LinkedIn post following the summit, Social Work England’s executive director of professional practice and external engagement, Sarah Blackmore, said: “While already in use, this is a new area for social workers to get to grips with. We are also keen to develop our knowledge through connecting and working with the experts.

“Holistically, there is real value in the tech and social work sectors working together with the potential for real positive impact on people across the country.”

Should the government use AI to reform public services?
Wed, 29 Jan 2025
https://www.communitycare.co.uk/2025/01/29/government-ai-reform-public-services-readers-take/

Earlier this month, prime minister Keir Starmer announced his intention to “harness” artificial intelligence (AI) to “transform public services”.

Launching the government’s AI opportunities action plan, Starmer said AI had the potential to make services “more human” by freeing up time spent on admin, giving staff “more time for the personal touch, the connection, the service that people really value”. This applied to social work too, he argued.

“AI could save hundreds of thousands of hours lost to missed appointments, […] it can spot potholes quicker, speed up planning applications [and job centre form filling], help in the fight against tax avoidance and almost halve the time social workers spend on paperwork,” said Starmer.

“This means they can refocus on the care and connection aspects of their job that so often get buried beneath the bureaucracy.”

Impact on social work

What data exists on the impact of AI on social workers’ admin time comes from local authority pilots of different tools.

For example, when piloting an AI tool in 2024, Swindon Council found it reduced the average time to conduct a Care Act assessment conversation from 90 to 35 minutes and time spent on follow-up case notes from four to one-and-a-half hours.

However, it is too soon to draw definitive conclusions from these initiatives, while there are also widespread concerns about the ethical implications of AI in social work, including in relation to potential bias and the privacy of those practitioners work with.

‘More research needed’

But what do social workers think of the government’s plans to incorporate AI into its reform plans for public services?

The majority of practitioners (66%) in a Community Care poll with over 600 votes seemed mostly positive about the prospect. However, while 16% gave their full support, on the grounds that AI had been “proven to boost productivity”, half of respondents said more research was needed to “understand potential risks”.

One-third of readers said AI would cause “more harm than good”. 

Calls for regulation

Social work bodies, including the British Association of Social Workers (BASW), have previously called for the regulation of AI to address the ethical implications amid its increasing use in the sector.

“Right now the onus should be on the social work regulators to produce guidance for using AI and on the government for centrally regulating AI,” said Social Workers Union general secretary John McGowan in October last year.

“This would put protections in place for social workers and the people and families they support, as this technology has known issues, including biases, presenting false or misleading information as fact, data governance and growing concerns about environmental impact.”


Social workers split over impact of AI on professional skills
Fri, 25 Oct 2024
https://www.communitycare.co.uk/2024/10/25/social-workers-split-over-impact-of-ai-on-professional-skills/

The use of artificial intelligence (AI) tools for administrative tasks in social work has sparked debate among practitioners.

This follows calls from the British Association of Social Workers (BASW) and the Social Workers Union for government to regulate AI and address ethical concerns, such as around privacy, bias and quality of practice.

Currently, 28 councils in England are using or testing the tool Magic Notes to produce case notes from visits, with a pilot finding that it reduced time spent on assessments and case recording.

Other local authorities are using Microsoft’s Copilot, an AI program which, similarly to Magic Notes, transcribes meetings and generates notes and actions based on prompts.

However, the use of AI in social work remains in its early stages. A recent Community Care poll with 713 respondents found that 79% did not use tools such as Copilot or Magic Notes.

Yet the issue prompted significant debate in the comments section of the related article.

‘Making good notes is a key social work skill’

Victoria warned that “saving time isn’t conducive to promoting social work skills and knowledge”.

“We dismiss the professional skills and learning gained through summarising notes and interactions, making choices and the need for writing good notes to be understood as a key social work skill which aids reflection and decision-making,” she added.

Victoria also questioned councils’ consent policies, including whether people were clear that their information would be interpreted or summarised using AI.

DK, who said they were not opposed to AI assistance in recording meetings, said note taking was an essential skill in social work decision making.

“It is often the act of writing notes that prompts the thinking and reflection that prompts action, that moves practitioners from simply recording ‘what [happened]’ to thinking about ‘so what’ and ‘now what’,” they added.

‘AI may lead to greater risk aversion’

Sally Pepper, a mental health social worker, “cautiously welcomed” the use of AI as a time-saving tool but questioned whether its recommendations would become “a standard to measure my work by”.

“I am concerned that AI may lead to greater risk aversion, counter to a strengths-based approach and human rights,” she added.

“What’s AI going to recommend about a person who frequently self harms and has suicidal intent?”

She called for social workers to be part of the tools’ design process, which should be adjusted to local needs.

Beam, the company behind Magic Notes, has confirmed that prior to rolling out the tool, it works with the local authority for a few weeks, tailoring the programme to its needs.

“If it can help me focus on the difficult dilemmas by spending less time on things that are more straightforward, I’m all for it,” added Sally.

“It’s a complex system though, becoming more complex, with all the hazards and ethical problems that entails.”

‘Tools may make mistakes’

However, some social workers warned that AI may only succeed in adding to their workload.

“Social work is in the mess it is now because leaders are beguiled by ‘innovation’,” said Tahin.

“Electronic notes were meant to free up social workers to spend more time out of the office. Result? More time spent on admin in front of a screen.”

Another practitioner, Sabine, warned that AI tools may get the assessment or follow-up recommendations wrong.

“I just hope that no one ends up in a situation where managers say, ‘Well, do what the AI says’.”

Senior social work lecturer David Gaylard warned that even “well-meaning innovations require careful regulation”.

“Such technical advances should not replace crucial professional reflection, judgment, and decision making. Otherwise, what’s the point of becoming a registered professional if prefixed words or prompts alongside set algorithms can determine complex social work decisions?”

‘If we do not embrace change, change will change us’

Yet, not all views were negative.

Social worker Jimmy Choo praised Copilot for helping him save time and improve the spelling and professionalism of his writing, although his employer has since stopped using the system.

Another, Fab, called the implementation of AI across social work “essential”, adding that it could “significantly enhance productivity”.

“Social workers must not be left behind,” added Kudakwashe Kurashwa. “We need to engage with AI and work on addressing the bad bits of the technology.

“If we do not embrace change, change will change us. Am ready for the future of social work, with AI and other disruptive technologies such as fintech as part of it!”

Would you use AI tools for recording case notes?

AI could be time-saving for social workers but needs regulation, say sector bodies
Fri, 04 Oct 2024
https://www.communitycare.co.uk/2024/10/04/ai-could-be-time-saving-for-social-workers-but-needs-regulation-say-sector-bodies/

Social work bodies have called for the regulation of artificial intelligence (AI) to address the ethical implications, as more councils employ AI tools to save time on administration.

Currently, 28 councils in England are using or testing the AI tool Magic Notes in children’s and adults’ services, to produce case notes from visits and assessments.

Developed by AI company Beam alongside social workers, Magic Notes records meetings and emails the practitioner a transcript, summary and suggested actions for inclusion in case notes based on council-agreed prompts.

According to Beam, the technology complies with social care statutory requirements and, in all cases, practitioners must review the documents before adding them to their case management systems.
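The workflow described here (record, transcribe, summarise against a council-agreed prompt, then require practitioner sign-off) amounts to a human-in-the-loop pipeline, which can be sketched as follows. The function and field names are illustrative assumptions, not Beam's actual implementation, and the AI summarisation step is stubbed out:

```python
# Toy sketch of a meeting-notes pipeline with mandatory human review.
# All names are hypothetical; the real tool applies a council-agreed
# prompt to the transcript via an AI model, stubbed here by truncation.

def generate_case_notes(transcript: str, council_prompt: str) -> dict:
    summary = transcript[:120]  # stand-in for the AI-generated summary
    actions = ["Confirm next steps with the person supported"]
    return {
        "prompt": council_prompt,
        "summary": summary,
        "suggested_actions": actions,
        # Notes stay out of the case management system until a
        # practitioner has reviewed and, where needed, corrected them.
        "practitioner_reviewed": False,
    }

notes = generate_case_notes(
    "Visit to review a care plan; the person reports difficulty preparing meals.",
    "Summarise this visit and list suggested follow-up actions.",
)
```

Modelling the review step as an explicit flag, rather than a convention, mirrors the requirement that practitioners check AI output before it enters official records.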

Swindon Council, which piloted Magic Notes with 19 adult social workers between April and July 2024, found it reduced the average time to conduct a Care Act assessment conversation from 90 to 35 minutes and time spent on follow-up case notes from four to one-and-a-half hours.

The authority said the tool particularly benefited practitioners with learning difficulties and visual impairments, along with those who were not native English speakers.

Meanwhile, other local authorities, like Barnsley, are using another AI tool, Microsoft’s Copilot, with similar functions of transcribing meetings and generating notes and actions based on prompts.

Calls for regulation

However, the rise of AI in social work has also sparked concerns about data privacy for families, bias, and whether AI-generated actions will be adequately reviewed before being carried out.

Social Workers Union general secretary John McGowan said AI could be a “helpful time-saving tool”, but should not be used as a “quick fix” for the lack of funding and staff in the sector.

“Right now the onus should be on the social work regulators to produce guidance for using AI and on the government for centrally regulating AI,” he said.

“This would put protections in place for social workers and the people and families they support, as this technology has known issues, including biases, presenting false or misleading information as fact, data governance and growing concerns about environmental impact.”

The British Association of Social Workers (BASW) also called for the regulation of AI, along with a national framework of ethical principles for its use, to ensure accountability to citizens and to uphold human rights.

‘We need to apply the brakes on AI’

Christian Kerr, senior social work lecturer at Leeds Beckett University, questioned whether local authorities had carefully considered AI’s implications for privacy and human rights.

“We need to apply the brakes on AI, or at least slow down considerably, to allow the social work regulator, our professional association, education providers and practising social workers to come to grips with the myriad ethical implications and challenges,” said Kerr.

“From my interactions with social workers across the country, it is clear to me that it is practitioners who are leading the ethical debate in local authorities and they need the support of social work leaders, organisations and educators to do that to best effect.”

Swindon’s adult social care privacy notice states that individuals are informed when Magic Notes is used for the recording of calls, with an option to opt out, and that personal data is automatically deleted after a maximum of one month.

Beam has also confirmed that no data is used to train AI systems and the tool has undergone data protection assessments prior to testing.

‘Social work must not be left behind’

However, while BASW chair Julia Ross acknowledged that AI had “bad bits”, she stressed that social workers must engage with the technology to avoid being left behind.

“We can promote what we do and give our input as social workers, but we’ve got to be there,” she said. “We can’t afford to be left behind. We’ll lose huge opportunities for ourselves, our practice and the people we work with.”

Welcoming the debate around AI in social work, she also urged the sector to take the time to understand its varying applications and potential uses before rejecting it.

“You don’t just get into it hook, line, and sinker,” Ross added. “What you do is adapt that tool and merge it with the emotional intelligence social workers are so good at. If we just stand back and say we don’t like it, then we won’t do ourselves, the profession or the people we work with any advantage.”

“We need to remember that social work operates in the real world, and the real world now is an AI world.”

What do you think about the use of AI tools in social work?

‘No evidence’ machine learning works well in children’s social care, study finds https://www.communitycare.co.uk/2020/09/10/evidence-machine-learning-works-well-childrens-social-care-study-finds/ Wed, 09 Sep 2020 23:01:29 +0000

There is no evidence that using machine learning to predict outcomes for families involved with children’s social care services is effective, research has found.

Models built by What Works for Children’s Social Care and trialled over 18 months in four local authority areas failed to identify, on average, four out of every five children at risk.

Where the models flagged a child as being at risk, meanwhile, they were wrong six out of 10 times.
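In machine-learning terms, the two figures above correspond to recall (roughly 20%: the share of genuinely at-risk children the models identified) and precision (roughly 40%: the share of flagged cases that were correct). A minimal sketch of the arithmetic, using hypothetical counts chosen to match those reported rates rather than the study's actual data:

```python
# Illustrative confusion-matrix arithmetic. The counts below are hypothetical,
# chosen only to reproduce "missed four out of five children at risk"
# (recall = 20%) and "wrong six out of ten times when flagging" (precision = 40%).

def precision_recall(true_pos, false_pos, false_neg):
    """Return (precision, recall) from confusion-matrix counts."""
    precision = true_pos / (true_pos + false_pos)
    recall = true_pos / (true_pos + false_neg)
    return precision, recall

# Suppose 100 children were genuinely at risk and the model flagged 50 cases:
# 20 correctly identified (true positives), 80 missed (false negatives),
# and 30 wrongly flagged (false positives).
p, r = precision_recall(true_pos=20, false_pos=30, false_neg=80)
print(f"precision={p:.0%}, recall={r:.0%}")  # precision=40%, recall=20%
```

The asymmetry matters: a model can look superficially useful on one metric while failing badly on the other, which is why the study reports both.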

The research found introducing text information extracted from social work reports did not reliably improve models’ performance, despite this offering a more nuanced picture of families than can be gleaned from demographic information and data tracking interactions with practitioners.

What is machine learning?

Machine learning (ML) seeks to find patterns in data. What Works examined a type of ML called predictive analytics, under which models use patterns from historic data to learn to what extent certain inputs or decisions are associated with particular outcomes. It then uses these patterns to predict the likelihood of the specified outcome in future, given the relevant input data.
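The idea of learning outcome rates from historic cases and projecting them forward can be sketched in a few lines. This is a deliberately crude frequency-based model with invented feature names, far simpler than anything the What Works study trialled, included only to make the mechanism concrete:

```python
# Toy predictive analytics: tally how often an outcome followed each
# combination of input features in historic data, then use those rates as
# likelihood estimates for new cases. Illustrative only; the feature names
# and data below are hypothetical.
from collections import defaultdict

def fit(historic_cases):
    """historic_cases: list of (features, outcome) pairs, where features is a
    hashable tuple and outcome is True/False."""
    counts = defaultdict(lambda: [0, 0])  # features -> [outcome_count, total]
    for features, outcome in historic_cases:
        counts[features][0] += int(outcome)
        counts[features][1] += 1
    return counts

def predict_likelihood(model, features):
    """Estimated probability of the outcome for a case with these features."""
    outcomes, total = model.get(features, (0, 0))
    return outcomes / total if total else None  # None: no historic pattern seen

# Hypothetical inputs: (prior_referrals_band, age_band) -> did the case escalate?
history = [(("2+", "under5"), True), (("2+", "under5"), False),
           (("none", "over10"), False), (("none", "over10"), False)]
model = fit(history)
print(predict_likelihood(model, ("2+", "under5")))   # 0.5
print(predict_likelihood(model, ("none", "over10"))) # 0.0
```

Even this toy version shows the core weakness the study ran into: the predictions are only as good as the historic data, and patterns that held in past records may not hold for new families.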

The study report called on councils already trialling predictive technology in children’s social work to be transparent about its limitations. One such council, Hackney, axed its Early Help Profiling System (EHPS), commissioned from the private provider Xantura, late in 2019 after it did not “realise the expected benefits”.

“Given the extent of the real-world impact a recommendation from a predictive model could have on a family’s life, it is of utmost importance we work together as a sector to ensure these techniques are used responsibly if at all,” the report concluded.

‘Time to stop and reflect’

The new research follows on from a separate What Works review, published in January 2020, which questioned how ethically compatible machine learning was with children’s social work.

Michael Sanders, the What Works executive director and co-author of the study report, said the findings indicated that it was time for the children’s social care sector “to stop and reflect”.

“The onus is now on anyone who is trying to say [that predictive analytics] does work, to come out and transparently publish how effective their models are,” Sanders told Community Care.

“What we have shown in our research is that with a lot of the best techniques available to us, the data across these four local authorities says it’s not working,” he added.

Sanders, who has also researched machine learning in children’s social care as part of the Behavioural Insights Team (BIT), formerly part of government and known as the ‘nudge unit’, said his views had changed, in line with available evidence, as to the technology’s potential benefits.

“We don’t think we are infallible – if someone can find a mistake we’ve made, or can take our code [which will be publicly available] and do something good with it, then I am happy for that to happen,” he said. “But it needs to be in an open and transparent way, not behind closed doors.”

Sanders added that central government, or bodies such as the Local Government Association (LGA) or Association of Directors of Children’s Services (ADCS), could now take a lead in policing the use of machine learning until such a time as its worth could be demonstrated.

‘Surprisingly bad performance’

The What Works study’s models were developed to predict eight separate outcomes (see box), using three to seven years of data provided by the four councils, from the North West, South West, West Midlands and South East regions.

The eight predictions

The What Works study looked at eight different scenarios, each based on a decision-making point for a social worker in a case and looking at whether the case would escalate at a later point in time. They were:

  • Is a child re-referred within 12 months of a ‘no further action’ decision, and does the case then escalate to statutory intervention?
  • Does a child’s case progress to a child protection plan or beyond within six months of a contact?
  • Is a child’s case open to children’s social care, but below statutory intervention, within 12 months of a ‘no further action’?
  • Is a child’s case escalated to a child protection plan or beyond between three months and two years of a referral?
  • Is a child’s case escalated to a child protection plan or beyond within six to 12 months of a contact?
  • After successfully engaging with early help, is a child referred to statutory services within 12 months?
  • Does a child’s case escalate to a child protection plan between one and 12 months of an assessment authorisation date?
  • Does a child’s case escalate to them becoming looked-after between one and 12 months of an assessment authorisation date?

Each was tested in four different builds, so as to gauge whether including pseudonymised text data from social work records improved performance, and what impact only using historical data (thereby simulating real-world usage) had.

In each instance, the models failed to reach a pre-specified ‘success’ threshold of 65% precision. “This is lower than the threshold we would recommend for putting a model into practice but provides a useful low benchmark,” the report said.

In particular, the study found, the models tended to miss the majority of children at risk of a given outcome, which could potentially lead to results discouraging social workers from intervening.

In models where text had been introduced, performance improved in some scenarios. But it worsened in others, giving an overall picture of no consistent benefit – a result Sanders said was unexpected.

“I was surprised by just how bad the models performed overall,” he said. “From my previous research [with BIT, in a single borough], we found quite a big benefit to using text data as well, but that picture is much cloudier coming out of this piece of research.”

Sanders said that it was likely the evolution of systems and practice models, and turnover of staff, meant text data “is particularly vulnerable to changing over time”, making it less reliable as a basis for predictions.

A poll of 129 social workers conducted as part of the study uncovered no clear support for the use of predictive analytics across a range of scenarios, with a tool to support practitioners to identify early help for families the most popular, but backed by only 26% of respondents. Just over a third (34%) of respondents said they did not think it should be used at all.

‘We are far from achieving minimum standards’

Responding to the new findings, Claudia Megele, the chair of the Principal Children and Families Social Worker (PCFSW) Network and co-chair of the Social Work Practice and Digital Network, said: “There is a fascination with the use of machine learning and predictive technologies in children’s social care and as demonstrated by previous research and experiences of several local authorities as well as this report, there are significant risks, biases and ethical challenges associated with the application of such models in practice.

“In fact, local authority data often does not have the depth, breadth, structure and detailed information required by such systems,” Megele added. “Therefore, both ethically and practically we are far from achieving the minimum standards for use of machine learning for predictive purposes in practice.”

But, Megele noted, algorithms and machine learning can be used in other areas to intelligently support practitioners.

“It would be helpful if local authority resources were focused on the aspects of technology that are proven to be effective and that can support practitioners in time-taking tasks, ranging from gathering historical data to information sharing and partnership working with other agencies,” she said. These could include automating chronologies, or intelligently routing required information, such as court orders or child protection plans, to relevant agencies and professionals.

“Such automations will offer immediate practical support for practitioners while reducing costs and increasing the accuracy, timeliness and availability of information,” Megele said.

‘Human connection is the heartbeat of social work’

Meanwhile Rebekah Pierre, a professional officer at the British Association of Social Workers (BASW), said the What Works report reinforced “that human connection is the heartbeat of social work, [which] is not an ‘exact science’ that can be replicated by automated processes”.

Pierre added that social work was “founded on relationships, person-centred practice, and meaningful interactions – none of which can be achieved by data sets”, saying that BASW’s 80:20 campaign was continuing to champion this at a time when the coronavirus pandemic had diminished face-to-face contact.

“The margin of error [identified in the study] is deeply concerning – if applied to practice, most children would be left without support,” Pierre said. “It is unsurprising that there is a low level of acceptance of the use of these techniques in children’s social care among social workers. Being experts in the field, it is imperative that frontline practitioners are listened to.

“The recent A-level fiasco, which saw the opportunities of millions of children diminished at the expense of an algorithm, highlights the devastating consequences of predictive technology,” Pierre added. “The safety and wellbeing of society’s most vulnerable children must not be gambled with in the same way within social work.”

Jenny Coles, the president of the Association of Directors of Children’s Services, said: “This report highlights the challenges of trying to predict human behaviour and gives policy makers, local authorities and others a lot to consider. Children’s social care is complex, no two families or situations are the same and building relationships are central to the work of social workers and other professionals supporting families in times of need.”

‘Could be worthwhile exploring further’

But Coles also flagged up the fact that the study did not seek to answer whether or not machine learning could ever work in this context. “We know some local authorities are developing or exploring the use of machine learning models in children’s social care as an additional tool to support professional decision making,” she said. “It could be worthwhile exploring further, particularly if it could help us to be effective in identifying opportunities to support children and families earlier before they reach crisis point.”

A Department for Education spokesperson said: “This report was testing a children’s social care model that used predictive analysis, and its findings will help refine and improve the evidence available. In the coming years we expect to see more local authorities using new technology such as machine learning or artificial intelligence, so it is right that we improve our understanding of how it can improve practice.”

National standards for machine learning in social care needed to protect against misuse, urges review https://www.communitycare.co.uk/2020/01/31/national-standards-machine-learning-social-care-needed-protect-misuse-urges-review/ Fri, 31 Jan 2020 12:21:35 +0000

National standards on the responsible use of machine learning in children’s social care should be introduced to protect against misuse, a study has recommended.

The ethics review, conducted by What Works for Children’s Social Care and academic partners, warned of “legitimate hesitation with regard to the moral justifiability” of predictive risk modelling, through which some councils have used data on factors such as poverty to identify children at risk.

Increasing concerns have been raised in recent years around the use of such models, including around their potential to discriminate against poorer families, depersonalise relationships with service users and generate errors, particularly in a context of rising demand and shrinking resources.

The review, which drew on existing literature and the perspectives of service users and sector experts, said more research was needed around whether and how children’s social care’s current environment could support the ethical use of machine learning tools that directly affected individuals.

Key recommendations

  • The responsible design and use of machine learning models in children’s social care should be mandated via national standards.
  • Open communication between social workers and data scientists across local authorities should be encouraged to improve the national knowledge base on machine learning in social care.
  • Local authorities that develop machine learning applications should engage with citizens to gain consent for their use.
  • The use of data science should be refocused away from individual risk and towards exploring “deeper social-structural problems” driving rising social care demand, and also be focused on promoting better outcomes for families and strengths-based approaches.

But by “redirecting the energies” of data science towards analysing the root causes behind the need for social work interventions, and exploring how positive as well as negative measures can be integrated into analytics, machine learning could potentially become better aligned with social work values, the study report said.

People likely to be affected by machine-learning systems should be consensually involved in their development, it added.

‘Take a step back’

Speaking to Community Care ahead of the review’s launch yesterday, Michael Sanders, the executive director at What Works, said it would be “wrong to pass judgment” on children’s services already experimenting with machine learning.

But he added that “local authorities and social workers want to be doing their jobs in the most ethical way possible”.

“Hopefully this [research] helps them, when they’re existing in a messy and complicated environment, to take a step back and consider – are we doing this thing we want to do as ethically as we can, and under some circumstances, is it worth doing it, because it’s not ethical enough?” Sanders said.

“Something that’s unethical can’t be good social work.”

What is machine learning?

The debate around the interface between technology and children’s social care has seen some mixing and matching of terms such as ‘machine learning’, ‘artificial intelligence’ or simply ‘algorithms’.

While an algorithm, at its simplest, is an automated instruction, machine learning involves applications being able to independently ‘learn’ from new data they are exposed to, without being fully preprogrammed by designers. Most machine learning models within children’s social care are classed by the What Works review as involving ‘supervised learning’, with “numerous examples [being] used to train an algorithm to map input variables onto desired outcomes”.

Based on these examples, the ML model ‘learns’ to identify patterns that link inputs to outputs. This has led to disquiet when the approach is combined with predictive analytics, which identify possible future outcomes on the basis of inputs and estimate the likelihood of such outcomes.
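The supervised-learning loop described above can be sketched with one of the simplest possible learners, a perceptron, which adjusts its weights whenever a labelled training example is misclassified. The feature names and data here are invented for illustration; real children's-social-care models are vastly more complex:

```python
# Minimal supervised learning: a perceptron 'learns' a mapping from input
# variables to a binary outcome using labelled examples. Illustrative only;
# the features and labels below are hypothetical.

def train_perceptron(examples, epochs=20, lr=0.1):
    """examples: list of (inputs, label) pairs, inputs a list of numbers,
    label 0 or 1. Returns learned weights and bias."""
    n = len(examples[0][0])
    weights, bias = [0.0] * n, 0.0
    for _ in range(epochs):
        for inputs, label in examples:
            pred = 1 if sum(w * x for w, x in zip(weights, inputs)) + bias > 0 else 0
            error = label - pred  # non-zero only when the example is misclassified
            weights = [w + lr * error * x for w, x in zip(weights, inputs)]
            bias += lr * error
    return weights, bias

def predict(weights, bias, inputs):
    return 1 if sum(w * x for w, x in zip(weights, inputs)) + bias > 0 else 0

# Hypothetical training data: [prior_referrals, household_moves] -> escalated?
data = [([3, 2], 1), ([4, 3], 1), ([0, 0], 0), ([1, 0], 0)]
w, b = train_perceptron(data)
print([predict(w, b, x) for x, _ in data])  # [1, 1, 0, 0]
```

The disquiet the review describes arises when a learner like this, trained on historic administrative data, is pointed at future families: whatever biases the training labels encode are reproduced in the predictions.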

David Leslie, an ethics expert at the Alan Turing Institute, who led the study with the University of Oxford’s Lisa Holmes, said the review was “aiming to set the bar of best practice” and that there remained “a stretch and a distance that needs to be covered, in terms of the various components of readiness and resource allocation”.

“I think that shouldn’t be prohibitive of ways in which data science can be used as a tool to improve the social care environment,” added Leslie, who last year told Community Care that he hoped children and families’ experience, and a social justice perspective, could be placed at the centre of machine learning applications in the future.

But Claudia Megele, chair of the Principal Children & Families Social Worker (PCFSW) Network, who also spoke at the review launch, stressed that the time was not yet right for the introduction of such systems, despite the technical ability being there.

“Machine learning can result in encoding the current challenge, issues or inadequacies that are the result of austerity and its impact on practice,” she said. “This obviously would be detrimental to services and people who access those services.”

Megele added that machine learning was inherently less suited to social work than to medicine’s diagnostic model. “The processes of identifying and managing risks in social work are much more challenging,” she said. “As a result decisions are more complex and can have significant and lasting impact on people’s lives; therefore, they require professional judgement.”

‘Algorithmic reproduction of social injustice’ risk

The new review sought to question whether machine learning should be used in the children’s social care sector at all – and concluded that common ethical values between the two fields, including the need to be objective, impartial and use evidence-based judgments, offered a theoretical way forward.

But participants in workshops that formed part of the study raised concerns about the real-world use of predictive analytics being fuelled by the need to find efficient ways of working against a backdrop of cuts.

The report concluded that systems that devalued person-centred approaches to children’s social care were “not ethically permissible”.

Researchers also raised concerns that information fed into machine learning systems would inevitably reinforce correlations between deprivation and being from disadvantaged ethnic groups, and involvement with children’s services.

“In this connection, the major problem that arises regarding the use of predictive analytics in children’s social care is that the features that are indicative of social disadvantage and deprivation, and that are simultaneously linked to socioeconomic and racial discrimination, are also highly predictive of the adverse outcomes used to measure child maltreatment and neglect,” the report said.

It said approaches that inferred a causal link between disadvantage and risk and attributed this to individuals “should be scrutinised accordingly and handled with extreme interpretive caution and care”.

‘Exceptionally demanding’ task

Besides big-picture considerations as to whether machine learning can be justifiable in children’s social care, the review also discussed the practical challenges around responsibly setting up systems.

The report said it was “exceptionally demanding” to ensure good data quality and model design, effective testing and validation and sufficient staff training to implement them appropriately, and all of these features were needed to make systems ethically justifiable.

Some of the most crucial concerns surrounded the quality of data held by local authorities, including how representative, relevant, recent and accurate it was.

Data used for predictive risk models was based on the past provision of public services, in which certain groups were under- or over-represented, creating a sampling bias. The report said there was “no clear-cut technical solution to create optimally representative datasets”.

The validity of data was also time-sensitive due to the impact of contextual changes, including to laws and procedures, as well as shifts in thresholds due to demand, resource availability or inspection results.

Contested information

Researchers also noted the potential impacts of human error, bias and the ways in which electronic case management systems can “oversimplify situations”.

Meanwhile workshop participants highlighted issues around contested information being included in records – and around the influence of incentives on data collection, including by payment-by-results initiatives such as the Troubled Families programme, from which some councils’ children’s social care machine learning has derived.

Megele said it was “well known” that the data held by most local authorities could not provide a reliable and appropriate pool for machine learning.

“Even if we had the required data quality there are many other questions and challenges that need to be addressed,” she said. “For example, who and how will we select the ‘right’ data, who will drive the design and implementation of such systems and what is the level of transparency and scrutiny involved? All these questions pose significant difficulties in achieving a machine learning system that is ‘fair’, ethical and effective for children’s social care.”

 
