Algorithms must meet ethical and professional standards to recover public trust, report recommends

LONDON (1 September 2020) — Algorithms that change people's lives, such as those used to estimate students' grades, should now meet strict standards of ethics and competence, according to a new report by the professional body for IT.


Government use of data science must achieve public service standards of openness, accountability and objectivity to avoid another 'computer says no' moment in education and other disciplines, the study by BCS, The Chartered Institute for IT found.


The new report, 'The Exam Question: how do we make algorithms do the right thing?', recommends that government endorse and support the professionalisation of data science, in line with a plan already being developed by a collaboration of the Royal Statistical Society, BCS, The Royal Society and others.


It would mean that algorithms whose creators do not meet a strict code of practice for ethics, professionalism and competence could not be used to decide issues such as exam grades, or to supply estimates of the outcomes of pandemics like COVID-19 to government.


The BCS study concluded that policy makers should ensure 'the best possible ethical and professional practice in algorithm design, development and testing is ubiquitous at information system level across government and industry.'


All algorithms and data with high-stakes consequences, such as estimating grades or triggering lockdowns, should be put through an impact assessment against widely recognised ethical standards, and opened to public scrutiny, before 'going live', the BCS report added.


Dr Bill Mitchell OBE, Director of Policy at BCS, The Chartered Institute for IT said: "The exam crisis has given algorithms an undeserved reputation as 'prejudice engines' when in fact ethically designed algorithms fed on high quality data can result in massive benefit to our everyday lives.


"Lack of public confidence in data analysis will be very damaging in the long term. Information systems that rely on algorithms can be a force for good but, as students found out to huge cost, we have been using them to make high-stakes judgements about individuals based on data that is subjective, uncertain and partial.


"We need true oversight of the way algorithms are used, including identifying unintended consequences, and the capability at a system level to remedy harm that might be caused to an individual when something goes wrong.


"That means, first of all, professionalising data science so that the UK has the most trusted, ethical and sought-after data science teams in the world."


View the algorithms report (PDF)