7 Aug 2019
In February 2019, the Law Society will hold sessions in Cardiff and London to consider the use and ethics of algorithms in the justice system in England and Wales. It will examine "what controls, if any, are needed to protect human rights and instil trust in the justice system".
Algorithms, processes or sets of rules followed by a computer, are now used by many local authorities across the world. Amid financial pressures and funding cuts, local authorities are drawing on a range of data to identify children at risk of harm or abuse, allowing them to focus their limited resources more effectively. The process is typically carried out by a computer and considers data such as school exclusions, youth offending records and incidents of domestic violence.
This data does not allow a local authority to be subjective; it does, however, allow a local authority to identify vulnerable families quickly and then provide support. Some local authorities, facing significant funding shortfalls, are already using this method to support their social workers.
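The flagging step described above might be sketched, very loosely, as a scoring rule over recorded indicators. The indicator names, weights and threshold below are invented purely for illustration; real systems used by local authorities are typically statistical models trained on historical data rather than fixed rules like these.

```python
# A hypothetical sketch of rule-based risk flagging, assuming a family's
# recorded indicators are scored against a fixed cut-off. All weights and
# the threshold are invented for illustration only.

INDICATOR_WEIGHTS = {
    "school_exclusion": 2,
    "youth_offending": 3,
    "domestic_violence_incident": 4,
}
FLAG_THRESHOLD = 5  # invented cut-off

def risk_score(indicators):
    """Sum the weights of the indicators recorded for one family."""
    return sum(INDICATOR_WEIGHTS.get(i, 0) for i in indicators)

def flag_family(indicators):
    """Return True if the family's combined score meets the threshold."""
    return risk_score(indicators) >= FLAG_THRESHOLD

# Example: two recorded incidents push a hypothetical family over the line.
print(flag_family(["school_exclusion", "domestic_violence_incident"]))  # True
```

Even this toy version shows where the legal concerns arise: the choice of indicators, weights and threshold determines who is flagged, and families matching the profile are flagged regardless of their individual circumstances.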
But there are drawbacks. Prof Ruth Gilbert and Rachel Pearson of University College London, and Prof Gene Feder of the University of Bristol, writing in The Guardian, said: "The stakes are high in terms of potential harms. The purported benefits of algorithms must be supported by transparency and robust evidence of benefit."
Algorithms typically stereotype families. The data reflects the characteristics of children currently in the care system and 'flags up' families with similar profiles as potentially vulnerable. It therefore leaves little room for anomalies, and it relies on private personal data to predict child abuse.
The Information Commissioner's Office (ICO), which regulates the use of personal data by public and private bodies, is tasked with ensuring that councils' use of algorithms complies with data protection laws.
“All organisations have a duty to look after personal information in their care but records involving children – often sensitive personal data – require particularly robust measures,” said an ICO spokesperson.
Ultimately, we have to find a balance between protecting the rights of the many and protecting the vulnerable in society. We look forward to the results of the Law Society's upcoming sessions.