Bias in loan application decisions is a persistent issue. The Federal Trade Commission (FTC) points to two federal laws, the Equal Credit Opportunity Act (ECOA) and the Fair Housing Act (FHA), that aim to protect against lending discrimination. These laws make it illegal to offer less favorable loan terms to applicants on the basis of characteristics such as race, national origin, sex, age, or marital status. Despite these laws, cases of discrimination in home loan and credit approval persist.
The New York Times has reported on racial discrimination in loan applications. In one case, Akridge, a Black man who had all the necessary financial credentials, including a steady job, a well-paying salary, and a high FICO credit score, was denied when he tried to refinance his mortgage.
According to a CNBC report, "A majority (59%) of Black homebuyers are concerned about qualifying for a mortgage, while less than half (46%) of White buyers are, according to a recent survey by Zillow. Lenders deny mortgages for Black applicants at a rate 80% higher than that of White applicants, according to 2020 data from the Home Mortgage Disclosure Act."
The Home Mortgage Disclosure Act (HMDA) requires home mortgage lenders to publicly disclose their mortgage application decisions. A 2019 HMDA report by the Consumer Financial Protection Bureau states that ‘the denial rates for conventional home-purchase loans were 16.0 percent for Black borrowers and 10.8 percent for Hispanic White borrowers. In contrast, denial rates for such loans were 8.6 percent for Asian borrowers and 6.1 percent for non-Hispanic White borrowers.’ In 2019, the Apple Card was investigated after complaints of gender discrimination; investigators ultimately found no wrongdoing. These findings show not only that bias in loan application decisions persists, but also that it is difficult to address.
Verge will support the decision-making process for loan approvals by helping lenders double-check whether the predictions of the AI tools they use for approving loan applications meet fairness standards. It offers a ‘second opinion’ on the bank’s original algorithmic decision, making visible the cases where the bank’s model disagrees with our models, each of which is optimized for a different fairness metric. Verge will present predictions from these fairer algorithms in a transparent, trustworthy, and intuitive way, and highlight instances where human-in-the-loop review may prevent an unfair decision caused by algorithmic bias. Our product would also increase trust in the remaining AI decisions by making it clear when a decision satisfies both the bank’s algorithm’s criteria and ours, which are optimized for fairness.
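The disagreement check described above can be sketched in a few lines. This is a minimal illustration, not Verge's actual implementation: the function name `second_opinion` and the model names are hypothetical, and the inputs are assumed to be boolean approve/deny decisions already produced by the bank's model and by fairness-metric-optimized models.

```python
# Hypothetical sketch of the "second opinion" check: compare the bank
# model's decision against decisions from models optimized for different
# fairness metrics, and flag any disagreement for human-in-the-loop review.

def second_opinion(bank_decision: bool, fairness_decisions: dict) -> dict:
    """Report which fairness-optimized models disagree with the bank's model.

    bank_decision: the bank model's approve (True) / deny (False) decision.
    fairness_decisions: {model_name: decision} from fairness-optimized models.
    """
    disagreements = {
        name: decision
        for name, decision in fairness_decisions.items()
        if decision != bank_decision
    }
    return {
        "bank_decision": bank_decision,
        # Any disagreement routes the case to a human reviewer.
        "needs_human_review": bool(disagreements),
        "disagreeing_models": sorted(disagreements),
    }

# Example: the bank denies an applicant, but a model optimized for
# demographic parity would approve, so the case is surfaced for review.
report = second_opinion(
    bank_decision=False,
    fairness_decisions={
        "demographic_parity": True,
        "equalized_odds": False,
    },
)
print(report["needs_human_review"])   # True
print(report["disagreeing_models"])   # ['demographic_parity']
```

When all models agree, `needs_human_review` is False, which is the "increased trust" case: the decision satisfies the bank's criteria and every fairness-optimized criterion at once.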