Throughout the coronavirus pandemic a stream of new words has made its way into common parlance, from ‘R number’ to ‘social distancing’. This week’s word, though, has to be ‘algorithm’. After the disaster of the A-level results last week and the government U-turn to revert to the original teacher estimates, it seems algorithms have become the new target of criticism for armchair experts.

The one thing most people agree on is that something went drastically wrong with the A-level algorithm. It made me wonder whether there is anything we in the financial services industry – and businesses in general – can learn from this experience to avoid making similar mistakes.

Lesson 1: Be wary of small datasets

One of the criticisms of the A-level algorithm was that students from independent schools seemed to benefit more than students from the state sector. Tempting as it may be to see a government conspiracy to favour its more privileged friends, I don’t believe for one moment this is the case. The problem, I think, stems from the size of the datasets being used. The A-level algorithm aimed to standardise results within examination centres based on three factors: previous results from that centre, the ranking of students within that centre, and teacher-predicted grades. The aim was that consistency at the local level would then translate into a consistent spread nationally.

However, independent schools traditionally have stronger results from previous years. They also have far fewer students per centre than a large sixth form college (now a popular model in the state sector). The outcome is that teacher-predicted grades and previous performance carried more weight in small centres than in larger colleges, and this is what skewed the results in favour of the independent schools. For businesses using algorithms for automation or more advanced artificial intelligence, this is probably the first lesson to learn: be wary of small datasets, as they may produce anomalous results.
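
To make the point concrete, here is a minimal sketch – my own toy illustration, not Ofqual’s actual model – in which every centre draws grades from the same underlying distribution. The statistics for small cohorts still swing far more widely, so any rule keyed to centre-level history treats small centres very differently:

```python
# A minimal sketch, assuming a toy model of grade standardisation --
# this is my own illustration, not Ofqual's actual algorithm.
import random
import statistics

random.seed(42)

def cohort_means(cohort_size: int, trials: int = 1000) -> list[float]:
    """Simulate many cohorts of the same size, all drawing grades from
    the same underlying distribution, and return each cohort's mean."""
    means = []
    for _ in range(trials):
        grades = [random.gauss(65, 12) for _ in range(cohort_size)]  # 0-100 scale
        means.append(statistics.mean(grades))
    return means

small = cohort_means(cohort_size=12)   # a typical independent-school class
large = cohort_means(cohort_size=250)  # a large sixth form college

# The centre-level statistic is far noisier for small cohorts, so any
# rule keyed to it treats small centres very differently.
print(f"spread of small-centre means: {statistics.stdev(small):.2f}")
print(f"spread of large-centre means: {statistics.stdev(large):.2f}")
```

Run it and the spread of observed means for the small centres comes out several times wider than for the large ones, despite identical underlying ability.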

Lesson 2: Test, test and then test again

The scale of public outcry in Scotland, followed by a similar reaction in the rest of the UK, suggests that the testing regime was inadequate and probably fundamentally flawed. The one thing we should always remember when automating processes is that the algorithms we use are not perfect, so we must test and test again. We must constantly refine during the testing process so that the outcomes we see move closer to our original hypothesis. We may even run the algorithm in parallel with a manual process and compare the results, so that we are comfortable that what we are getting is fair. One thing is certain: we should always analyse the anomalies and try to smooth them out. So tip number two for businesses is to test, test and then test again.
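
In practice, that parallel run can be as simple as the sketch below – the two scoring functions are placeholders for whatever your automated and manual processes produce, and the tolerance is something you would set per use case:

```python
# A hedged sketch of a parallel run: score the same cases with both the
# algorithm and the existing manual process, then pull out divergences.
from typing import Callable

def parallel_run(
    cases: list[dict],
    algo_score: Callable[[dict], float],
    manual_score: Callable[[dict], float],
    tolerance: float = 0.05,
) -> list[dict]:
    """Score every case both ways and return those where the algorithm
    diverges from the manual baseline -- the anomalies to analyse."""
    anomalies = []
    for case in cases:
        a, m = algo_score(case), manual_score(case)
        if abs(a - m) > tolerance:
            anomalies.append({**case, "algo": a, "manual": m})
    return anomalies
```

Every anomaly this surfaces is a question to answer before going live, not after.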

Lesson 3: It’s never just a maths problem

Lesson three also relates to the public backlash. When developing automation and complex algorithms, we must look beyond the maths challenge to understand the perspective of all stakeholders. Our solution will obviously have to align with business objectives, but also with legal and regulatory frameworks. The problem you thought was purely mathematical may well produce a solution that is fundamentally flawed in the real world and even falls foul of legal and regulatory rules. I have heard of a number of examples where organisations employed clever data scientists to improve their pricing algorithms, only to find later that some of the data points used fell foul of important regulations. In another case, a company produced an algorithm that cut down on the number of customer service visits in pursuit of improved efficiency, without appreciating that the company’s revenue model was based on charging per visit! The key point for businesses is that it’s never just a maths problem. It’s a business problem first and foremost, and the maths is there to help.
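
One cheap defence against the pricing-algorithm trap is to gate the model’s inputs on compliance sign-off rather than trusting whatever the data scientists found predictive. The sketch below is illustrative only – the feature names are invented – but the pattern is real: no data point reaches the maths without being on an approved list.

```python
# An illustrative input guard -- the feature names are invented. No data
# point reaches the pricing maths unless compliance has signed it off,
# however predictive the data scientists found it.
APPROVED_FEATURES = {"vehicle_age", "annual_mileage", "claims_history"}

def validate_features(features: dict) -> dict:
    """Reject any input that has not been through compliance sign-off."""
    unapproved = set(features) - APPROVED_FEATURES
    if unapproved:
        raise ValueError(f"Not approved for pricing: {sorted(unapproved)}")
    return features
```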

Lesson 4: Build in safeguards

The final point is that we need to understand the effect of automation on the people it touches and put safeguards in place to protect those who are disadvantaged. In the case of the A-level results, this could have been a provision for earlier manual intervention involving the examination centres themselves. For businesses, this might mean ensuring that an algorithm that rejects applications has a manual feedback process to sense-check the rationale behind those rejections. Constant monitoring of the results, and the addition of more datasets such as customer satisfaction and net promoter scores, will help ensure automation is giving you not only what you want but what your customers want too.
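
As a sketch of what that safeguard might look like in code – the scoring threshold and field names are hypothetical – the pattern is simply that approvals go straight through, while no rejection leaves the building without joining a manual review queue:

```python
# A minimal sketch of the safeguard described above -- the threshold and
# field names are hypothetical. Approvals pass straight through; every
# rejection is queued for a human sense check before it goes out.
from dataclasses import dataclass

@dataclass
class Decision:
    application_id: str
    score: float
    approved: bool
    needs_review: bool = False

review_queue: list[Decision] = []

def decide(application_id: str, score: float, threshold: float = 0.7) -> Decision:
    decision = Decision(application_id, score, approved=score >= threshold)
    if not decision.approved:
        decision.needs_review = True       # safeguard: no unreviewed rejections
        review_queue.append(decision)
    return decision
```

The reviewers’ verdicts then become exactly the kind of feedback dataset the monitoring described above needs.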

I’m not sure what the consequences of the A-level algorithm fiasco will be, and we may not know the full impact for quite some time. But let’s hope other sectors can learn something from the mayhem, so that we can develop our automation journeys without becoming slaves to the algorithm.

Richard Phillips is technical architect at Altus, a financial services software solutions company.

