What does Apple Card have to do with B2B lending?

A tweet recently caused a big kerfuffle when Danish programmer David Heinemeier Hansson reported that his credit limit for the new Apple Card was 20 times that of his wife's, even though she has the higher credit score. News outlets quickly picked up the story as one of gender bias, claiming the card's issuer, Goldman Sachs, gives women far lower credit limits on the new Apple Card, even when they share assets and accounts with their spouses.

But it's impossible to know whether the Apple Card, or any other credit card, discriminates against women, because creditworthiness algorithms are notoriously opaque. Credit scores are only one factor in determining creditworthiness, so it's hard to jump to conclusions.

But if the future of B2B lending is more technology, just how concerned should we be that models built on AI may not be as good as one might think? If we believe in the capabilities of AI, and I do, then through self-learning mechanisms, as with self-driving cars, the intelligence should improve with experience.

But humans design software, and humans have biases. Some examples include:

  • Overconfidence: We are too confident in our own abilities.
  • Confirmation bias: We tend to listen only to information that confirms our views.
  • Clustering illusion: We see patterns in random events; if the number 7 turns up five times in a row at the craps table, we see a pattern.
  • Recency effect: We weigh the latest information more heavily than older data.
  • Ostrich effect: We bury or ignore negative information.
  • Information bias: More information is not necessarily better.

These biases can lead to design flaws in the algorithms we code to try to make us more efficient and effective in automating B2B underwriting.

There are four areas that we should recognize:

  1. Algorithm bias: For example, a model may be optimized for profit margin and could steer loans toward certain businesses. Or in medicine, some patients are more profitable than others, often depending on insurance, and a model built around that revenue may conflict with patient health. Bias can be hidden or built into the design, and design bias may even be intentional, when the designers' objectives conflict with societal values or norms.
  2. Data bias: Algorithms are only as good as the data they learn from, and bias can be embedded in that data. For example, some organizations are trying to predict dilution in order to finance invoices. That is by no means a small feat, especially when the data sets have typically been collected in a benign credit world. (See the story: Post Confirmation Dilution in an Uncertain Credit World.) Or take Tesla, which has 2 billion self-driving miles to assess. Am I confident that a healthy percentage of that driving was done in cities, or was the vast majority on highways or on tracks built by Tesla itself?
  3. Interpretation of what the algorithms mean: Algorithms are a black box. Designers may know a model's limitations and how to interpret its output, but lenders or relationship managers may not. This is where you get into what I consider the most important errors that can be made: Type I and Type II errors, also called false positives and false negatives, respectively.
  4. Who is responsible for the decisions the model makes? While AI may produce better outcomes, it can also reduce human autonomy.
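To make the Type I/Type II distinction in point 3 concrete, here is a minimal sketch in Python. In underwriting terms, a false positive is a bad loan the model approves, and a false negative is a good borrower it rejects. The scores, outcomes, and approval threshold below are entirely hypothetical, invented for illustration only.

```python
# Hypothetical applicants: (model_score, actually_repaid)
applicants = [
    (0.92, True), (0.85, True), (0.40, True),    # borrowers who repaid
    (0.75, False), (0.30, False), (0.20, False), # borrowers who defaulted
]

THRESHOLD = 0.5  # approve if score >= threshold (an assumed cutoff)

# Type I error (false positive): approved, but the borrower defaulted.
false_positives = sum(1 for s, repaid in applicants if s >= THRESHOLD and not repaid)

# Type II error (false negative): rejected, but the borrower would have repaid.
false_negatives = sum(1 for s, repaid in applicants if s < THRESHOLD and repaid)

print(f"False positives (bad loans approved): {false_positives}")      # 1
print(f"False negatives (good borrowers rejected): {false_negatives}")  # 1
```

Note that the two errors carry very different costs for a lender: a false positive is a charge-off, while a false negative is only a lost margin, which is why where you set the threshold is a business decision, not just a statistical one.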

So in building AI applications for B2B lending, it's important to bear the above points in mind. There are no doubt big advantages to AI, as it can reduce inherent individual biases. With lending applications, the ability to get smarter as more data sets are examined, helping to reduce expected losses, is quite attractive. But we must bear in mind the caveat that most models have not been trained through a full business credit cycle, and for many models built on recent data, the real concern will arise when this long benign credit cycle ends.
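The benign-cycle caveat can be sketched with simple arithmetic: a model that estimates expected losses only from recent calm years will understate risk relative to one that has seen a downturn. The default rates below are invented for illustration, not real figures.

```python
# Hypothetical annual default rates (fractions of the portfolio)
benign_years = [0.010, 0.012, 0.009, 0.011]  # recent, calm years only
downturn_years = [0.045, 0.060]              # stress years missing from recent data

# Loss estimate trained only on the benign window
recent_estimate = sum(benign_years) / len(benign_years)

# Loss estimate across a full credit cycle, downturn included
full_cycle = benign_years + downturn_years
full_cycle_estimate = sum(full_cycle) / len(full_cycle)

print(f"Expected loss, recent data only: {recent_estimate:.3%}")
print(f"Expected loss, full credit cycle: {full_cycle_estimate:.3%}")
```

With these invented numbers, the benign-window estimate is less than half the full-cycle estimate, which is the gap that surfaces when the cycle turns.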

David Gustin runs Global Business Intelligence, a research and advisory practice focused on the intersection of payments, trade finance, trade credit and working capital. He can be reached at dgustin (at) globalbanking.com.
