Financial Industry Must Tackle Gender Bias in Algorithms, According to Global Fintech Leader, Finastra
– Finastra publishes five-point plan to tackle algorithmic bias in consumer finance decision-making
– New KPMG report, commissioned by Finastra, examines the size of global consumer lending markets and the potential impact of algorithmic bias on society
– Finastra urges the financial industry to address the problem and work together to help solve it
Finastra, one of the world’s largest fintechs, is calling on the global finance industry to tackle algorithmic bias, which is likely affecting millions of people every day. The fintech firm, which supplies vital technology to financial institutions of all sizes, including 90 of the world’s top 100 banks, recently commissioned consultancy firm KPMG to examine the issue across banking, lending and insurance. The research considered how decisions produced by this technology can affect outcomes for particular people and groups. In response to the findings, Finastra has published a five-point plan to identify and tackle algorithmic bias, and is urging the financial industry to come together to take action and build a fairer society.
Over the past decade, the financial world has been industrialized and digitalized through the introduction of artificial intelligence (AI), particularly machine learning, boosting efficiency and automating processes. As a result, many banking, lending and insurance decisions are now made by algorithms. The pandemic has accelerated the adoption of these technologies, and while the benefits are clear, the algorithms can only be as ‘fair’ and unbiased as the data sets used to build them. The industry must check whether the biases that exist in society are being repeated through the design and deployment of these technologies.
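As a rough illustration of what such a check can look like in practice, the sketch below computes a simple demographic parity gap (the difference in approval rates between groups) over historical lending decisions. The data, column names and choice of metric are illustrative assumptions for this article, not Finastra’s or KPMG’s methodology.

```python
# Minimal sketch (illustrative assumptions only): measure whether historical
# lending decisions show different approval rates across demographic groups.
import pandas as pd

def demographic_parity_gap(df: pd.DataFrame,
                           group_col: str = "gender",
                           outcome_col: str = "approved") -> float:
    """Largest difference in approval rates between any two groups."""
    rates = df.groupby(group_col)[outcome_col].mean()
    return float(rates.max() - rates.min())

# Toy data with hypothetical column names; a real audit would also control
# for legitimate risk factors and test the model's outputs, not just history.
loans = pd.DataFrame({
    "gender":   ["F", "F", "F", "F", "M", "M", "M", "M"],
    "approved": [0,    1,   0,   1,   1,   1,   0,   1],
})
print(f"Approval-rate gap: {demographic_parity_gap(loans):.2f}")  # 0.25 here
```

A gap near zero does not prove fairness on its own, but a persistent gap is a signal that biases in the underlying data may be carrying through into automated decisions.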
To understand the scale of the problem, Finastra commissioned KPMG to produce a report that reveals the sheer size of consumer lending markets and the potential impact of algorithmic bias. For example, in 2020, consumer lending and transactions across key financial products (credit cards, other consumer lending and mortgage/home lending) exceeded:
- $6,110bn in the U.S.
- HK$1,270bn in Hong Kong
- £440bn in the United Kingdom
- €280bn in France
- SG$110bn in Singapore
In many cases, both the provision of this credit and its cost to consumers (e.g. the interest rates charged) will be informed by the algorithms used.
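As a hedged sketch of how the pricing side could be audited, the example below compares the average rate a hypothetical pricing model offers to applicants from different groups with matched risk scores; the model, feature names and data are assumptions for illustration, not any provider’s actual system.

```python
# Minimal sketch (hypothetical model and data): compare the average interest
# rate an algorithmic pricing model offers to matched applicants by group.
import numpy as np
import pandas as pd

class ToyPricingModel:
    """Stand-in for a trained pricing model: base rate plus a risk loading."""
    def predict(self, X: pd.DataFrame) -> np.ndarray:
        return 3.0 + 2.5 * X["risk_score"].to_numpy()

def average_offered_rate(model, applicants: pd.DataFrame,
                         group_col: str = "gender") -> pd.Series:
    """Average rate offered by the model, broken down by group."""
    features = applicants.drop(columns=[group_col])
    out = applicants.assign(rate=model.predict(features))
    return out.groupby(group_col)["rate"].mean()

# Matched pairs: identical risk scores across groups, so any gap in the
# group averages would point at the model rather than the applicants.
applicants = pd.DataFrame({
    "gender":     ["F", "M", "F", "M"],
    "risk_score": [0.2, 0.2, 0.6, 0.6],
})
print(average_offered_rate(ToyPricingModel(), applicants))
```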
Simon Paris, CEO at Finastra, said, “Without this being a priority in the financial industry, AI will become a flywheel that will accelerate the negative impact on human lives. Finastra doesn’t have all the answers but we believe that the industry must first acknowledge that there is a problem with algorithmic bias – only then can we work together to find a solution. We will work with our partners and ecosystem to drive the change the industry needs to make – collectively and collaboratively we can redefine finance for good and open it up to all. Finastra’s goal is to ensure financial technology is benevolent and fair in every way to give everyone a level playing field when it comes to borrowing money.”
Dr Leanne Allen, Director at KPMG Financial Services Tech Consulting, said, “Consumer and public trust are critical success factors for Financial Services. The findings in our report for Finastra make it clear that providers need to take care when designing, building and implementing these algorithms to ensure innovation can continue to advance in a safe and ethical way. The report brings together recent thinking on algorithmic bias, with specific applications to financial services and the potential for biased decision-making. Mitigating bias is vitally important in our digital and data-led world. Not doing so could run the risk of serious financial harm to the consumers who use these services.”
To show its commitment to tackling this problem, Finastra has published a five-point plan as part of its drive to help redefine finance for good.
1.) Reforming Finastra’s developer agreement: Finastra has updated its developer terms and conditions for FusionFabric.cloud, its open platform and marketplace for developers. Developers and partners will now be expected to account for algorithmic bias, and Finastra has the right to inspect for this bias within any new application
2.) Creating new proof-of-concept technologies: such as FinEqual, a digital tool designed to enable bias-free lending, giving users the technology to tackle algorithmic bias within their own businesses. FinEqual is currently at the proof-of-concept stage, and Finastra aims to make it available to customers within the next 18 months
3.) Hacking for good: Finastra commits to giving all future hackathons a focus on inclusion. To support this, it will launch a global hacking competition as part of its Hack to the Future series, shining a light on female talent in the industry by finding and celebrating balanced, female-led teams pushing the boundaries of AI and machine learning
4.) Workplace equality: Within its own organization, Finastra is continuing its journey towards a 50:50 male-to-female ratio across all its teams. This includes increasing the proportion of women among its top 200 leaders and engineers from 30% to 40% by 2025, and to 50% by 2030
5.) Work with regulators: Finastra is fully committed to tackling AI bias. The company is working closely with regulators in multiple markets and, as a technology leader, is calling upon the financial services industry to take note of the threat algorithmic bias poses to society