Công Nghệ Thông Tin - Information Technology
News

AI bias poses danger to the financial sector, Bank of England warns

hoidabunko · Published November 6, 2023 (last updated 1:22 PM)

The implicit bias of present-day generative AI models makes their rapid adoption in the financial sector hazardous, according to a report by a financial technology expert at the Bank of England.

Kathleen Blake’s report, published Wednesday, splits AI model bias into two categories: bias based on underlying training data, and bias based on the results of model output. Although both reflect the human biases that developers and creators bring to AI models, the former cannot be counteracted simply by removing data points that indicate, for example, femininity or non-whiteness.

Blake likened data bias to the practice of redlining in mortgage lending. In a redlining system, home insurers and mortgage lenders assess non-white customers as “risky” based on their neighborhood, making credit and insurance more difficult to come by for people of color, while not directly attributing denials or higher prices to race. Similar logic is already visible in AI systems, Blake noted.

“[T]he model may make underlying correlations that lead to biased decision-making based on non-protected features,” Blake wrote. “In other words, the remaining, non-protected features could act as proxies for protected characteristics.”
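This proxy effect can be sketched with a small, purely illustrative simulation (the data, feature names, and numbers below are invented for the example, not drawn from Blake’s report): a lending model trained only on a non-protected feature such as a zip code still produces group-skewed approvals when that feature correlates with a protected attribute.

```python
# Illustrative sketch of proxy bias; all values are synthetic.
import random
from collections import defaultdict

random.seed(42)

def make_applicant():
    group = random.choice("AB")  # protected attribute (never shown to the model)
    # Neighborhood correlates with group 90% of the time: the proxy feature.
    if random.random() < 0.9:
        zip_code = "10001" if group == "A" else "10002"
    else:
        zip_code = "10002" if group == "A" else "10001"
    # Historically biased outcome: group A was approved far more often.
    approved = random.random() < (0.8 if group == "A" else 0.3)
    return group, zip_code, approved

history = [make_applicant() for _ in range(2000)]

# "Model" trained only on zip_code: approve applicants from any zip whose
# historical approval rate exceeds 50%. The protected attribute is not an input.
stats = defaultdict(lambda: [0, 0])
for _, z, ok in history:
    stats[z][0] += ok
    stats[z][1] += 1
approve_zip = {z: n_ok / n > 0.5 for z, (n_ok, n) in stats.items()}

# Measure the model's approval rate per protected group.
def rate(g):
    zips = [z for grp, z, _ in history if grp == g]
    return sum(approve_zip[z] for z in zips) / len(zips)

rate_A, rate_B = rate("A"), rate("B")
print(f"approval rate, group A: {rate_A:.2f}, group B: {rate_B:.2f}")
```

Even though the model never sees the protected attribute, its decisions reproduce the historical disparity, because the zip code carries the same information.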

Societal bias, by contrast, is “where norms and negative legacy from a society cause blind spots.” Blake cited an Amazon recruitment algorithm that recommended more male candidates than female ones, because its historical training data showed that men were hired more often.

AI bias is particularly dangerous in the financial sector, Blake warned. Trust in the banking system is at serious risk from these biases: while noticeably less discriminatory than human decision-makers, AI models still charged higher rates to Black and Latinx mortgage customers than to white ones. Moreover, the opaque “black box” nature of proprietary models means that, if a large group of important firms uses them for similar purposes, their actions could have a huge, and fairly unpredictable, effect on the economy as a whole.

Blake acknowledged that this type of destabilizing AI event hasn’t happened yet, but warned that the risks are very real, citing a 2021 case in which Apple and Goldman Sachs were investigated by the New York State Department of Financial Services for algorithmically offering smaller lines of credit to women.

“Beyond the inherent issues with bias, fairness and ethics, this could potentially lead to stability issues for financial institutions or the financial system as a whole,” Blake wrote.
