
Professor Takes on Big Data in Hearing

Dr. Christopher Gillard is a Professor of English at Macomb Community College and the Digital Pedagogy Lab Advisor. He was recently one of five witnesses to testify before the House Financial Services Committee in a hearing entitled “Banking on Your Data: The Role of Big Data in Financial Services.”

In his opening written statement, he noted, “The notion that companies like Facebook, Google, Amazon are entering into banking in order to benefit the unbanked or people who do not have access to traditional credit markets is absurd on its face. As one recent report in Bloomberg asserted regarding Google’s proposal to partner with banks to offer checking accounts through its Google Pay app: ‘For Google, the bank partnerships will give the tech behemoth a better ability to show advertisers how marketing dollars spent on its system can drive purchases.’”

Facebook

When Mark Zuckerberg, the Chair and Chief Executive Officer (CEO) of Facebook, testified before the same committee in October 2019 about Libra, Facebook’s proposed digital currency, Rep. Brad Sherman, a Democrat from California, made a similar point.

“For the richest man in the world to hide behind the poorest people and say that’s who you’re trying to help. You’re trying to help those for whom the dollar is not a good currency: drug dealers, tax evaders, terrorists,” Sherman stated.

Dr. Gillard also noted that the technology underpinning big data and other financial technologies can perpetuate stereotypes and further marginalize groups that are already marginalized.

Dr. Gillard noted in his written remarks, “There are two crucial frameworks for understanding these technologies and their impacts on marginalized communities: digital redlining and predatory inclusion. Digital redlining is the creation and maintenance of technology practices that further entrench discriminatory practices against already marginalized groups—one example (among many) being when journalists at ProPublica uncovered the fact that Facebook Ad targeting could be used to prevent Black people from seeing ads for housing, despite the Fair Housing Act prohibiting such conduct.

“Predatory inclusion is a term coined by scholars Louise Seamster and Raphaël Charron-Chénier to refer to a phenomenon whereby members of a marginalized group are offered access to a good, service, or opportunity from which they have historically been excluded, but under conditions that jeopardize the benefits of access. ‘… the processes of predatory inclusion are often presented as providing marginalized individuals with opportunities for social and economic progress. In the long term, however, predatory inclusion reproduces inequality and insecurity for some while allowing already dominant social actors to derive significant profits.’ As an example of this, we might look at a report on the cash advance app Earnin, which offers loans and lets users ‘tip’ the app. As reported in the NY Post, ‘If the service was deemed to be a loan, the $9 tip suggested by Earnin for a $100, one-week loan would amount to a 469 percent APR.’”
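The 469 percent figure in that report is consistent with treating the suggested tip as a simple finance charge and annualizing it. A minimal sketch of the arithmetic, assuming the common convention of scaling the periodic charge over a 365-day year (the article itself does not spell out the formula):

```python
# Back-of-the-envelope annualization of the Earnin example above.
# Assumption: the $9 "tip" is treated as a simple finance charge on a
# $100, 7-day advance, annualized over a 365-day year.
fee = 9.0          # suggested tip, in dollars
principal = 100.0  # one-week cash advance, in dollars
term_days = 7

apr = (fee / principal) * (365 / term_days) * 100
print(f"Implied APR: {apr:.0f}%")  # prints "Implied APR: 469%"
```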

This is not the first time the House Financial Services Committee has heard from experts who say financial technology can lead to discrimination against minorities.

In a June 2019 hearing entitled “Perspectives on Artificial Intelligence: Where We Are and the Next Frontier in Financial Services,” two experts made similar points.

Dr. Nicol Turner-Lee is a fellow in Governance Studies at the Center for Technology Innovation at the Brookings Institution, a think tank.

She noted that data engineering practices were causing minorities to be disproportionately denied credit.

“In the case of credit, we are seeing people denied credit due to the factoring of digital cognitive profiles which include their web browsing histories, social media profiles and other inferential characteristics to the factoring of credit models and these biases are systematically finding themselves with less favor to individuals in particular groups where there is no relevant difference between those groups, which justifies that harm.”

“Despite a strengthening economy, record low unemployment and higher wages for whites, African-American homeownership has decreased every year since 2004 while all other groups have made gains,” Dr. Turner-Lee further noted. “In 2017, 19.3 percent of African-American applicants were denied home loans, while only 7.9 percent of white applicants were rejected.”

Dr. Douglas Merrill is the founder and CEO of Zest Finance, a company that uses machine learning (ML) in the loan approval process.

He also testified at the June hearing, where he said his company’s machine learning tools had uncovered hidden signs of discrimination in lending models.

“Without understanding why a model made a decision, bad outcomes will occur. For example, a used-car lender we work with had two seemingly benign signals in their model. One signal was that higher mileage cars tend to yield higher risk loans. Another was that borrowers from a particular state were slightly less risky than those from other states. Neither of these signals raises redlining or other compliance concerns.

“However, our ML tools noted that, taken together, these signals predicted a borrower to be African-American and more likely to be denied. Without visibility into how seemingly fair signals interact in a model to hide bias, lenders will make decisions which tend to adversely affect minority borrowers.”
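The interaction Merrill describes can be illustrated with a stylized, entirely synthetic sketch (the feature names, data, and numbers below are hypothetical, not the lender’s): two signals that are each uninformative about group membership on their own can identify it almost perfectly once combined.

```python
import numpy as np

# Stylized synthetic illustration of Merrill's point (hypothetical data,
# not the lender's): each signal alone says nothing about protected-group
# membership, but the two taken together reveal it.
rng = np.random.default_rng(0)
n = 100_000
group = rng.integers(0, 2, n)  # 1 = member of the protected group (synthetic)

# Construct the signals so that they agree for group members and disagree
# for everyone else -- individually balanced, jointly revealing.
high_mileage = rng.integers(0, 2, n)
state_x = np.where(group == 1, high_mileage, 1 - high_mileage)

def accuracy(prediction):
    """Share of borrowers whose group membership the rule guesses correctly."""
    return (prediction == group).mean()

print("high-mileage signal alone:", accuracy(high_mileage))            # ~0.50
print("state signal alone:       ", accuracy(state_x))                 # ~0.50
print("both signals combined:    ", accuracy(high_mileage == state_x)) # 1.00 by construction
```

In this toy construction the combined rule recovers group membership exactly by design; real lending models would show a weaker but still material effect, which is the visibility problem Merrill’s testimony highlights.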