What’s in your Black Box?

Kimberly West FLOCK Notes

I should start by admitting that we use the AI language model ChatGPT in our office every day; some days we probably speak to ChatGPT more often than we speak to each other.

We typically use ChatGPT as a coding “help desk” (almost all of my interactions with ChatGPT start with “Why does the following code not run…”).

Most organizations – particularly in the FinTech space – are in some evolving stage of using AI to support operational efficiencies, derive more precise insights, and improve the customer experience.  Using AI to translate data into information is only going to become more common across organizations in the traditional Financial Services and FinTech sectors.

However, we get into trouble when we outsource our understanding of how data is transformed into outputs that drive (or automate) decisions that impact our customers.  This was the point that Rohit Chopra, Director of the CFPB, was making in his recent cautionary statement that AI algorithms are “black boxes behind brick walls”[1] – AI can inhibit transparency and generate results that are inconsistent with an organization’s (or a society’s) values.

As a former professor of statistics, I used to warn my students that calculators (computers) are mediums of computational expediency – NOT a substitute for the ability to do the math.  Conceptually, this should be the litmus test for our collective usage of AI – specifically, can we work backwards from the output through the algorithm to get back to the input?  Can we reproduce the output from the inputs on a piece of paper…with a pencil?  If the answer is “no”, then I would challenge any FinTech executive to ask their AI/Data Science teams to “prove” that this lack of transparency is not negatively impacting customers, clients, employees, shareholders, or investors.  Can you be sure that your “AI” black box solution is consistent with the values of your organization?
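To make the pencil-and-paper test concrete, here is a minimal, purely illustrative sketch of an interpretable credit-scoring model (the variables and coefficients are hypothetical, not FLOCK's). With a simple logistic regression, anyone can retrace a prediction by hand: multiply each input by its coefficient, add the intercept, and apply the logistic function. A deep black-box model offers no equivalent audit trail.

```python
import math

# Hypothetical, illustrative coefficients -- NOT a real scoring model.
INTERCEPT = -3.0
COEFFICIENTS = {
    "utilization_ratio": 2.5,  # fraction of available credit in use
    "late_payments": 0.8,      # late payments in the last 12 months
}

def probability_of_default(applicant: dict) -> float:
    """Every step here is reproducible with pencil and paper."""
    linear = INTERCEPT + sum(
        COEFFICIENTS[name] * applicant[name] for name in COEFFICIENTS
    )
    return 1.0 / (1.0 + math.exp(-linear))

applicant = {"utilization_ratio": 0.4, "late_payments": 2}
p = probability_of_default(applicant)
# By hand: -3.0 + 2.5*0.4 + 0.8*2 = -0.4, so p = 1/(1 + e^0.4) ~= 0.40
```

The point is not the model itself but the property: a regulator, an auditor, or a customer can verify exactly why the score came out as it did.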

For more information about Flock Specialty Finance, contact Jennifer Lewis Priestley, CDO (jpriestley@flockfinance.com)

[1] https://www.consumerfinance.gov/about-us/newsroom/director-chopra-prepared-remarks-on-interagency-enforcement-policy-statement-artificial-intelligence/