
By David Sibley

In children’s social care, the rise of artificial intelligence (AI) has felt nothing like a science fiction plot. It has arrived slowly, through small changes to case recording. Beneath this slow rollout sits a struggle between a Californian optimism about what technology can do, the everyday bureaucracy and audit culture of social work, and the realities of social hardship, inequality and under-resourced practice. This article explores how that tension could leave social work intensified rather than eased, and asks what kind of thinking and organisational choices are needed when AI is built into everyday social work practice.

Introduction

The rollout of AI into public services has begun. I work as an organisational consultant at Tavistock Consulting part of the Tavistock and Portman NHS Foundation Trust. I spend much of my time in conversations with leaders and practitioners working in high demand systems. A systems psychodynamic view invites me to notice both what is happening on the surface of work, such as tasks, roles and processes, and what is happening underneath, the feelings, relationships, cultures and anxieties that shape how people take up their roles.

When a new technology arrives, it does not only change productivity and working practices; it also stirs up hopes, fears and phantasies. Socio-technical1 approaches to work have long suggested that any technical change will reverberate through the social system, affecting both productivity and the lived experience of work. Early experiments with AI in children’s social care are a live example of this. The regulator, Social Work England, has commissioned research into how AI is being used in practice and education, how it intersects with professional standards and what it might mean for public trust in social work.2

This article draws on an interview with Flo*, a frontline child protection social worker. We explore how AI may reshape not only administrative work, but also the ways practitioners cope with the emotional impact of their role and how AI changes the way case records are written and read. We also look at the hopes and anxieties that these technologies stir up for practitioners and for the wider system.

The interview

In November 2025, ten months after Prime Minister Keir Starmer’s speech3 on AI opportunities, setting out how AI will revolutionise public services and drive economic growth, I interviewed Flo*, an advanced practitioner in a long-term child and families team in an English local authority.

Flo’s everyday work is the familiar core of statutory children’s social care: long-term child in need and child protection cases, repeated meetings with parents, schools and health professionals, and continual assessment of risk and safety. Flo has been qualified for eleven years and describes social work as a profession that is under-resourced, frequently misunderstood by the public, and often pushed into what she called “a firefighting role”. Her time is scarce. Every hour spent writing minutes or updating plans is an hour she is not in direct contact with children and parents.

Into this already stretched system, AI has quietly entered as an additional character in the growing digitally driven systems that govern families in modern social care.4 A few weeks before our conversation, Flo’s local authority introduced a trial of specialist AI software called Magic Notes,5 designed for social workers and used with the necessary data protection, policy safeguards and consents. She attended a training session in which she described herself as “mesmerised” by the technology. She watched it record a meeting, generate minutes and produce a child’s plan. “It was incredible”, she said, that “something on my laptop” appeared to “understand my job role”.

Since then, in meetings with adults, Flo has been using the software routinely, pressing record in case conferences and child in need meetings, then choosing a template so that the software converts the conversation into a combined meeting record and child’s plan in a single document. Flo has been impressed by its ability to record these meetings accurately. The child protection record, always a crucial record in the system, is now co-authored by a machine that listens and summarises. These recordings and transcripts also create a new level of visibility over practice, generating detailed digital trails that can be searched, monitored and audited.

Saving time and losing reflection

Bureaucracy and managerialism, with a focus on efficiency, performance metrics and audit, generate high demands for paperwork in children’s social care. Systematic reviews show that these managerial cultures can negatively affect social work practice, increasing stress and squeezing time for relationship-based work.6 Against this backdrop AI is being introduced as a possible answer to workload problems, but the question is whether it will genuinely ease the pressure or simply become another layer in the audit and surveillance culture.7

In children’s social care, every child open to the service must have an up-to-date plan, and meetings must be held and recorded within strict timescales. In Flo’s team the meeting record and the plan are combined in one document, which has to be completed promptly after each review. In the past this meant someone taking minutes, then hours of writing into a rigid case management system. Now, as she explains, Magic Notes can “record meetings, phone calls, conversations and it will then process that information in the way you need it to be”, generating the minutes and the plan from the same recording. For a single complex meeting, Flo estimates that the software saves “a few hours easily”. In a week of heavy workload, that can be the difference between writing until late in the evening and getting home to her own family.

The software can also be used for case notes. In the past Flo would dictate her notes into Word, something she has found easier as a neurodivergent practitioner. Reading back over the notes helped her “process it”, supporting her to understand and make sense of a visit or meeting and to check that it was accurate. Writing or dictating a visit forced her to revisit the encounter, to choose what to include and to check what she had heard against her own sense of the family and their situation. This slow loop of dictating, reading back and revising was one of the main ways she digested the emotional impact of the work, making sense of what she had seen and felt.

With basic dictation, Flo still had to compose each visit in her own words. The new AI software Magic Notes goes further. It can be used to record the whole encounter with adults, produce a transcript, then offer back a structured summary and plan in its own templated language. Her task is no longer to decide what to say, but to read what the system has produced and decide what to accept, correct or delete. That is a different kind of thinking, quieter and more evaluative, and it risks leaving less time in contact with the raw experience of the visit.

AI optimises the managerial function of case notes, supporting speed and compliance with deadlines. At the same time, it eats into the function of notes as a space in which practitioners link what they feel to what they think.8 Case recording is a key practice for developing social workers’ skills in analysis, critical thinking and decision making.9,10,11

Bion, a British psychoanalyst, developed a theory of thinking.12 He suggested that the capacity to think grows out of emotional experience and that learning from experience is part of our emotional development. We learn to think by staying with frustration long enough to make sense of it, rather than evacuating it through projection. AI promises to bridge an impossible gap between demand and capacity, yet it risks eroding the everyday “digestion” of the work.

Flo spoke to this by saying:

Early this morning I was doing some write ups, and I flew through them. I am happy with what I have recorded, I am happy with what is documented, I feel it is comprehensive, it covers everything. But did I have that time to reflect? No. It is like a conveyor belt.

The risk of speeding up administration tasks in social work is that it takes away the reflection that helps practitioners digest and think about their work, so they can provide a professional analysis and evaluation of what is happening in families.

Flo shared that she chooses to work compressed hours and often finds herself processing experiences from her work on her days off. AI risks deepening this pattern. By taking some of the emotional digestion out of work time, it may push more of it into social workers’ private time, while the hours that are saved are absorbed back into throughput and targets. The work risks becoming intensified rather than eased. A key question for the profession is which parts of case recording AI should do, which parts are unnecessary bureaucracy, and which are central to social workers’ expertise.

AI also alters another function of case notes and meeting records, their role as a future record of the child’s history.

Social work records as an enduring object

For Flo, writing up is never just an administrative exercise. She holds on to a line: “Whoever has the pen has the power”. When you are writing minutes or visits, you “automatically take control of that story and whatever you put down becomes fact and could be shared for years”. She routinely reads histories that go back as far as the system allows, looking for patterns and previous concerns. She is clear about human fallibility here. What a worker wrote in a hurry ten or twenty years ago now appears as truth about a family.

AI changes who ‘holds the pen’. The software listens, segments and decides “what the issues are and then the actions that need to go with that”. The initial framing of the meeting is produced by a system that was not physically in the room in the same sense that the practitioners and family were. Flo checks and edits, yet the structure, the headings and much of the wording arrive already formed.

This matters for the children who will later read their records. Flo notes that assessments in her local authority are written “to children” on the basis that, when they are older, they might read them and try to make sense of what happened. For many care-experienced people, accessing social work files is a painful attempt to reconstruct why certain decisions were made and how professionals understood their families. We are already familiar with stories of people who find their childhood described in blunt, sometimes shaming language.13,14,15 In future, some will find something else: key meetings and plans written in a distinctive AI voice.

Social work records are enduring objects that carry the projections of parents, workers, managers and courts, and they are becoming more layered with AI generated text being added. An AI voice is now part author of the story that the child will turn to when they want to know who saw them, who believed them, and what was done or not done to keep them safe.

Nice language that avoids reality

When we talk about the tone of AI generated text, Flo describes it as “worded in a very restorative way, compassionate and kind”. That can fit well with a relational, strengths based ethos. She often appreciates the richer vocabulary and the attempt at empathy. At the same time, she has noticed how easily serious harm can be wrapped in soft language.

She gives the example of situations in which parents have clearly hurt their children. She explains that AI generated paragraphs lean heavily into understanding the parent’s experiences, justification and struggles, yet dilute the simple, necessary statements that offer clarity about harm that has been caused. In the interview I offered the term “sugar coating” to describe this drift into sunny, Californian optimism even in the context of abuse, and she agreed that she sometimes has to dial the tone back.

By Californian optimism16 I mean a bright, solution focused tech mindset that assumes new digital tools are the solution to deep social problems. When that tone seeps into AI generated social work notes it can act as a quiet social defence, softening the descriptions of harm and helping workers and organisations step back from the full impact of what is being described.

This has consequences. For social work managers reviewing AI notes, softened language may make it harder to analyse information and to see risks once the feeling has been wrung out of it. For children reading about their own abuse years later, that smoothing over may not feel kind. The optimistic tone may feel humane and compassionate in documentation yet fail the child’s need for clear recognition that what happened to them was wrong.

Seen this way, AI’s reassuring voice could become part of the wider pattern of social defences in children’s services, including how workers manage pain and vicarious trauma.

Defence against pain and vicarious trauma

Flo links our discussion of AI directly to “vicarious trauma”. She notices that after a really challenging visit she can be “still full of feeling” and worries about whether she will be objective enough if she writes up straight away, so she sometimes waits until her feelings have settled.

Frontline children’s social work involves repeated exposure to stories and scenes of neglect, violence and emotional and sexual abuse.

It is already normalised in this job, pain and abuse, because we write about it. We look at it, we see it, and we hear about it every day. Family after family after family, there is not a conversation that does not have a reference to harm.

Part of the work has always been to transform the raw experience of trauma into something thinkable.17 Writing has been one of the places where this transformation occurs. It is not comfortable, because it involves engaging with pain and trauma. AI offers a tempting defence against engaging with this pain.

The idea of social defences described by Isabel Menzies Lyth in her study of nursing18 helps to make sense of how organisational structures and policies can serve to protect staff members from the intense, primitive anxieties inherent in their work.19 Menzies Lyth showed how organisations unconsciously reshape tasks that expose workers most directly to suffering. Work is restructured, routinised or pushed away so that contact with distress is reduced. The consequences of social defences can be a secondary anxiety, because the root causes of anxiety have not been addressed; learning is reduced and staff are prevented from developing their capacity to tolerate the reality of the work.

With AI it may be that the work gets written up quickly and is presented as up to date. However, what has been skipped is the social worker’s professional analysis and evaluation of what is happening in a family. An illusion is created.

AI as Trojan horse, organisational phantasy and workforce politics

Towards the end of our interview, I asked Flo what image or metaphor came to mind for AI arriving in her workplace. She replied, without hesitation, that “the first thing that came into my head was a Trojan horse”.

Flo believes that senior leaders “genuinely want for workers to have time freed up, to do your job properly” and “to spend more time sat with families as opposed to writing things down”. In a profession where people are drowning in paperwork, that is a realistic desire. AI tools arrive speaking exactly this language.20 They slot neatly into national narratives, with the UK Prime Minister21 publicly heralding AI’s potential to almost halve social workers’ paperwork and drive innovation, modernisation and doing more with less.

However, Flo senses “there’s always an ulterior motive”. The same technology that frees time could be used to justify higher caseloads and further managerial demands. She links AI explicitly to “a resource light way of addressing workforce issues without spending any money or investment or taking the time to look at what this profession needs”.

Here the concept of the phantastic object is helpful. Tuckett and Taffler22 use this term for a mental object imagined as if it could perfectly fulfil one’s deepest wishes, giving a feeling of omnipotence. In a social work context, the public and government can invest AI with an unconscious phantasy that it will resolve workforce issues in social work, perhaps even the problems of poverty, inequality and abuse. For social workers, the wish may be for AI as a magical assistant that can make their job easier and rescue them from bureaucracy, taking away not only the “bad parts” of the job but the painful feelings too. The phantastic object helps us see both the intensity of hope that AI will meet these wishes, and the questions that arise when the fantasy inevitably fails. Where might disappointment and rage then be directed: towards the technology, managers or frontline workers?

The excitement that Flo notices among colleagues, as they share tips and feel “new and exciting” possibilities opening up, can be genuine and also vulnerable to co-option. The National Workload Action Group has recently made recommendations on how to reduce social worker workload.23 The final report frames AI as just one strand in a wider programme. It recommends evaluating whether time saved from administrative tasks is genuinely translated into more direct work with children and families rather than simply absorbed into the system. This gives the profession a small but important lever for arguing that any time released by AI should be protected for relationship-based practice, and for asking how and where AI should be used, alongside safe workload limits for practitioners.

Microsoft Copilot Chat

Flo also explained that there is another tool available to her with a different function: Microsoft Copilot Chat.24 This is an AI chat assistant powered by the latest large language models. The local authority has built in data protection and safeguards so it can be used with families’ records. She can now upload documents, for example reports and assessments, and ask Copilot to summarise and analyse the information, answer questions and provide responses in any format she wishes. Flo explained that she often works with large numbers of reports and data about families, and Copilot means that staff do not have to read all of these reports in full. She is clear that she sees ethical issues with this.

Flo slightly sheepishly admitted, “I used it to help me with my Social Work England renewal”, and that she found it helpful and felt it did a good job. She had joked with her partner that it is typical of this profession that “the renewal becomes something that you want to get out of the way”. Flo recognises that renewal should be an opportunity for reflection, yet being time poor she used AI to help her complete it and now realises this may have reduced her chance to think more deeply about her practice. Getting things out of the way becomes both the attraction and the risk of tools like Copilot, relieving pressure in the short term but potentially squeezing out exactly the reflective work that renewal and reading full records are meant to support.

AI in social work supervision

Flo is excited about using AI in her supervision. She explained that “computers are really getting in the way of quality supervision” and that her supervisor, as is commonplace in social work, has to type throughout their meetings, splitting their attention between her and the screen.

This is because, as well as supervision being a reflective space, it also has a case management aspect, and any decisions need to be recorded and available for Ofsted and audit purposes. Although she understands this, she notices that it impacts her relationship with her supervisor and the quality of the discussion. The possibility that supervision could now be recorded and summarised by AI excites her. She hopes it will allow for a greater sense of presence together without “distraction of technology”. At the same time, recording supervision conversations for later review increases the sense that supervision is itself under observation.

Perhaps this points to a different use of AI, that with appropriate safeguards on data use, allows a freeing up of attention for a more relational and reflective focus in supervision25, rather than a normative and administrative one.

Conclusion

Flo’s experience sits at the meeting point of Californian optimism about what AI can do and the stubborn realities of frontline social work. Tools like AI software can genuinely save hours of administration for social workers, and it is easy to see why they are welcomed by practitioners who are exhausted by paperwork and want to spend more time with children and families.

As AI is integrated into high demand systems like social work, the pressing questions are what tasks it is allowed to take on, which parts of thinking and decision making are consciously or unconsciously handed over to it, what policy boundaries are appropriate, and what needs to remain firmly in human hands when staff in the system are already so overburdened.

If we are to live with AI in children’s social care, then leaders, practitioners and those of us who work alongside them need to make these choices consciously. That means deciding which tasks AI assists with, and using any time it frees up to support more meaningful contact with children and families and to create space for supervision and reflection, rather than simply absorbing it into higher caseloads and new targets. It also means staying in touch with how the emotional impact of the work is processed when note writing shifts from authoring to reviewing drafts, and paying close attention to the tone and accuracy of records, especially for care-experienced readers who may one day meet this AI voice. Above all, it means treating AI as a tool to support thinking, not a phantastic fix, a kind of magical solution, so that professional authority and judgement remain at the centre of practice.

Californian optimism will always be tempting in systems under strain, since it promises that complexity can be mastered and suffering softened by technology. Hope and optimism26 also have an important place in the relationship between social workers and families, sustaining belief that change is possible. Yet the same optimism can become part of social defences,27 sometimes described as ‘the rule of optimism’, where signs of abuse are minimised or explained away and where, as Kettle and Jackson28 note, optimism can be used in serious case reviews to locate fault in individual social workers rather than in the structural conditions that shape their practice.

One of the central tasks in social work is to stay with experience long enough to digest it into thought, so that professional analysis is anchored in workers’ direct experience of families rather than outsourced, consciously or not, to AI. Any technological solution should remain in service of the human, relational and ethical core of the work, so that it supports practice rather than becoming the author of it.

*Flo’s name has been changed to protect her confidentiality and to support her being candid about her experiences of work. If you are interested in supporting future articles about how AI is affecting work, please get in touch with David Sibley dsibley@tavistockconsulting.co.uk

AI has been used in the editing of the article.

Header image: Photo by Yoksel 🌿 Zok on Unsplash

References

  1. Trist, E. L., & Bamforth, K. W. (1951). Some Social and Psychological Consequences of the Longwall Method of Coal-Getting: An Examination of the Psychological Situation and Defences of a Work Group in Relation to the Social Structure and Technological Content of the Work System. Human Relations, 4(1), 3-38. https://doi.org/10.1177/001872675100400101 ↩︎
  2. Blackmore, S. (2025) ‘Artificial intelligence in social work’, Social Work England, 4 February. Available at: https://www.socialworkengland.org.uk/news/artificial-intelligence-in-social-work/ (Accessed: 12 December 2025).  ↩︎
  3. Starmer, K. (2025) ‘PM speech on AI Opportunities Action Plan: 13 January 2025’, GOV.UK, 13 January. Available at: https://www.gov.uk/government/speeches/pm-speech-on-ai-opportunities-action-plan-13-january-2025 (Accessed: 12 December 2025).   ↩︎
  4. Edwards, R., Gillies, V., Vannier-Ducasse, H., & Gorin, S. (2024). The moral, the political and social licence in digitally-driven family policy and intervention: Parents negotiating experiential knowledge and ‘other’ families. Social Policy & Administration, 58(5), 856–869. https://doi.org/10.1111/spol.12997 ↩︎
  5. Beam (2026) Magic Notes: The AI assessment solution powered by Beam, Magicnotes.ai, https://magicnotes.ai/ (Accessed 12 January 2026). ↩︎
  6. Pascoe, K.M., Bradley, B., & McGinn, T. (2022) ‘Social Workers’ Experiences of Bureaucracy: A Systematic Synthesis of Qualitative Studies’, The British Journal of Social Work, Volume 53, Issue 1, January 2023, Pages 513–533. ↩︎
  7. Cooper, A. and Lousada, J. (2005) ‘The state of mind we’re in: sincerity, anxiety, and the audit society’, in Borderline welfare: feeling and fear of feeling in modern welfare. London: Karnac Books, pp. 59–82. ↩︎
  8. Bion, W.R., 2014. A theory of thinking. In: C. Mawson, ed. The Complete Works of W.R. Bion, Vol. 6. London: Routledge, pp.153–161. ↩︎
  9. Macdonald, M. A. (2025) National Workload Action Group Final Report: research and recommendations on reducing social worker workload. London: Department for Education, September. Available at: https://assets.publishing.service.gov.uk/media/68d51a8030734bac9ba0fcbc/National_Workload_Action_Group_Final_Report_September_2025.pdf (Accessed: 12 December 2025).   ↩︎
  10. Social Care Institute for Excellence (SCIE) (2021) Social work recording. Available at: Social work recording – SCIE (Accessed: 17 December 2025). ↩︎
  11. Stanley, Y. (2019) ‘What makes an effective case record?’, Ofsted: social care, 24 July [Blog]. Available at: https://socialcareinspection.blog.gov.uk/2019/07/24/what-makes-an-effective-case-record/ (Accessed: 17 December 2025). ↩︎
  12. Bion, W.R., 2014. A theory of thinking. In: C. Mawson, ed. The Complete Works of W.R. Bion, Vol. 6. London: Routledge, pp.153–161. ↩︎
  13. FamilyConnect (n.d.) ‘What records look like, and the impact they might have on you’, FamilyConnect. Available at: https://www.familyconnect.org.uk/considering-the-impact-of-care-records/ (Accessed: 12 December 2025).   ↩︎
  14. Koutsounia, A. (2025) ‘I’ve had bin collection letters with more warmth: Rebekah Pierre on the reality of accessing care records’, Community Care, 8 August 2025. Available at: https://www.communitycare.co.uk/2025/08/08/care-files-rebekah-pierre-accountability/ (Accessed: 12 December 2025).   ↩︎
  15. Pierre, R. (2022) ‘An Open Letter to the Social Worker Who Wrote My Case Files’, British Association of Social Workers (BASW), 21 October 2022. Available at: https://new.basw.co.uk/articles/open-letter-social-worker-who-wrote-my-case-files (Accessed: 12 December 2025) ↩︎
  16. Barbrook, R and Cameron, A 1996, ‘The Californian Ideology’, Science as Culture, vol. 6, no. 1, pp. 44–72. ↩︎
  17. Mawson, C. 1994. Containing Anxiety in Work with Damaged Children. In Obholzer A, Zagier Roberts V, editors. The unconscious at work: a Tavistock approach to making sense of organizational life. 2nd ed. Abingdon: Routledge; 2019. p, 83 ↩︎
  18. Menzies, I. E. P. (1960). A Case-Study in the Functioning of Social Systems as a Defence against Anxiety: A Report on a Study of the Nursing Service of a General Hospital. Human Relations, 13(2), 95-121. https://doi.org/10.1177/001872676001300201 ↩︎
  19. Krantz, J. (2010), SOCIAL DEFENCES AND TWENTY-FIRST CENTURY ORGANIZATIONS. British Journal of Psychotherapy, 26: 192-201. https://doi.org/10.1111/j.1752-0118.2010.01173.x ↩︎
  20. Rothera, S. (2025) Artificial Intelligence (AI) in case recording: National Workload Action Group, reducing unnecessary social worker workload supplementary report. Department for Education, September. Available at: Reducing unnecessary social worker workload through the use of Artificial Intelligence (AI) in case recording  (Accessed: 12 December 2025). ↩︎
  21. Starmer, K. (2025) ‘PM speech on AI Opportunities Action Plan: 13 January 2025’, GOV.UK, 13 January. Available at: https://www.gov.uk/government/speeches/pm-speech-on-ai-opportunities-action-plan-13-january-2025 (Accessed: 12 December 2025).   ↩︎
  22. Taffler, R.J. and Tuckett, D.A. (2003) ‘Internet stocks as “phantastic objects”: A psychoanalytic interpretation of shareholder valuation during dot.com mania’. In: Boom or bust? The equity market crisis: Lessons for asset managers and their clients, pp. 150–162. European Asset Management Association. Available at: https://dspace.lib.cranfield.ac.uk/server/api/core/bitstreams/f428301b-4816-4679-ada1-e5134cd84a93/content (Accessed: 12 December 2025). ↩︎
  23. Macdonald, M. A. (2025) National Workload Action Group Final Report: research and recommendations on reducing social worker workload. London: Department for Education, September. Available at: https://assets.publishing.service.gov.uk/media/68d51a8030734bac9ba0fcbc/National_Workload_Action_Group_Final_Report_September_2025.pdf (Accessed: 12 December 2025).   ↩︎
  24. Microsoft (2026) Microsoft Copilot: Your AI companion, Copilot.microsoft.com, https://copilot.microsoft.com/ (Accessed 8 January 2026). ↩︎
  25. Earle, F., Fox, J., Webb, C. and Bowyer, S. (2017) Reflective supervision, Resource Pack. Edited by S. Flood. Research in Practice. Available at: https://www.researchinpractice.org.uk/media/2d2dxwrn/reflective_supervision_resource_pack_2017.pdf (Accessed: 15 December 2025).  ↩︎
  26. Featherstone, B., White, S. and Morris, K. (2014) Reimagining Child Protection: Towards Humane Social Work With Families, Bristol, Policy Press. ↩︎
  27. Froggett, L. (2002) Love, Hate and Welfare: Psychosocial Approaches to Policy and Practice, Bristol, Policy Press. ↩︎
  28. Martin Kettle, Sharon Jackson, Revisiting the Rule of Optimism, The British Journal of Social Work, Volume 47, Issue 6, September 2017, Pages 1624–1640, https://doi.org/10.1093/bjsw/bcx090 ↩︎