New Apple/Goldman Card Under Investigation for Gender Bias in Credit Algorithm

The New York State Department of Financial Services announced last week that it would investigate the new Apple/Goldman Sachs credit card venture over claims of gender discrimination.
Apple has pitched its new card as a model of consumer friendliness and claimed it was the most successful credit card launch ever. But a pall was cast earlier this month when social media erupted with posts from married couples reporting that husbands had been given Apple Card credit limits many times higher than their wives' - even though the couples owned all of their property jointly and lived in states where both partners are jointly responsible for all debts.
Initially, Apple representatives tried to dismiss the concerns by saying that Apple Card's credit assessments are made by a computer algorithm, not humans - oblivious to the avalanche of evidence in recent years of bias in machine-learning decision-making. Goldman then released a statement saying that the Apple Card (unlike much of the industry) doesn't let households share accounts. This also failed to explain why men were given substantially higher credit limits than wives who posed the same credit risk.

To be fair, it has not been made clear how the differences in credit limits were established. What is surprising is the apparent cluelessness of Apple's and Goldman's responses, given that the credit algorithm's decision-making is a "black box" to both consumers and regulators. As long as that remains the case, the burden will fall on banks and other lenders to show that no prejudice or unfair bias occurs - regardless of the decision-making algorithms and tools they use.
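As an illustration only, here is a minimal sketch of the kind of disparate-impact check a lender or regulator might run on outcomes like credit limits. The figures are invented, and the 0.8 threshold is borrowed by analogy from the "four-fifths" rule of thumb used in employment-discrimination analysis - this is an assumption for illustration, not a description of how Apple, Goldman, or the DFS actually evaluate the card.

```python
# Hypothetical disparate-impact check on credit-limit decisions.
# All numbers below are invented for illustration.

def outcome_ratio(group_a_limits, group_b_limits):
    """Ratio of the mean credit limit granted to group A
    versus group B (applicants assumed comparable on risk)."""
    mean_a = sum(group_a_limits) / len(group_a_limits)
    mean_b = sum(group_b_limits) / len(group_b_limits)
    return mean_a / mean_b

# Illustrative limits for spouses with shared finances and
# comparable credit risk (fabricated sample data).
wives_limits = [3000, 2500, 4000, 3500]
husbands_limits = [30000, 20000, 35000, 28000]

ratio = outcome_ratio(wives_limits, husbands_limits)

# By analogy with the four-fifths rule, a ratio well below 0.8
# would flag a disparity the lender should be able to justify
# on legitimate, non-discriminatory grounds.
if ratio < 0.8:
    print(f"Potential disparate impact: ratio = {ratio:.2f}")
```

The point of such a check is not that it proves discrimination - it only surfaces a disparity that the lender, not the consumer, would then have to explain.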