Apple’s new credit card is being investigated by financial regulators after customers complained that the card’s lending algorithms discriminated against women.
The incident began when software developer David Heinemeier Hansson complained on Twitter that the credit line offered by his Apple Card was 20 times higher than the one offered to his spouse, even though the couple files joint tax returns and his credit score is the worse of the two.
“The @AppleCard is such a fucking sexist program,” wrote Hansson. “My wife and I filed joint tax returns, live in a community-property state, and have been married for a long time. Yet Apple’s black box algorithm thinks I deserve 20x the credit limit she does.”
Hansson said Apple’s customer service responded quickly to his complaints and eventually intervened to raise his wife’s credit limit as a one-off fix, but those same reps reportedly told him they were unable to change how the algorithm made its decisions.
After Hansson’s complaints went viral, other Apple Card customers reported similar incidents. Apple co-founder Steve Wozniak was among them, saying he was given ten times the credit limit offered to his wife. Wozniak called on the government to investigate the operation of such black-box algorithms, which experts say are often discriminatory.
“We don’t have transparency on how these companies set these things up and operate,” said Wozniak in an interview. “Algos obviously have flaws … A huge number of people would say, ‘We love our technology but we are no longer in control.’ I think that’s the case.”
In response to the complaints, the New York State Department of Financial Services (DFS) said it would investigate the card, which is issued by banking giant Goldman Sachs.
“Any algorithm that intentionally or not results in discriminatory treatment of women or any other protected class violates New York law,” a spokesperson for the agency told The New York Times. “DFS is troubled to learn of potential discriminatory treatment in regards to credit limit decisions reportedly made by an algorithm of Apple Card, issued by Goldman Sachs, and the Department will be conducting an investigation to determine whether New York law was violated and ensure all consumers are treated equally regardless of sex.”
In a statement, Goldman Sachs denied that its algorithms would discriminate in this way. “In all cases, we have not and will not make decisions based on factors like gender,” said company spokesperson Andrew Williams. The Verge has reached out to Apple for comment and will update this story if and when we hear back.
These complaints feed into growing public awareness of discriminatory algorithms, which are increasingly making decisions in fields like healthcare, hiring, and housing. Companies and government agencies often introduce such automated systems to cut costs and handle complex datasets, but for years experts have been warning that such algorithms are opaque and unregulated, encoding sexist and racist thinking into unaccountable software.
In the case of the Apple Card, the complaints escalated so quickly only because Hansson has a large online following. For most people subjected to discrimination by opaque algorithms, such a rapid response is unheard of.
On Twitter, Hansson noted that some people were defending Apple by saying gender discrimination was par for the course in the finance world. “How is that anything but the most damning charge upon Apple’s pitch with their card?” responded Hansson. “Did the iPhone launch pledging to please carriers and the status quo as its modus operandi? No.”
When Apple launched its credit card earlier this year it boasted it would bring the simplicity and transparency of Apple products to the confusing world of finance. Using the Apple Card would be smooth and seamless, said Apple, leading to “a healthier financial life.”