More than 2.5 quintillion bytes of data are created every single day, and lenders are collecting vast amounts of consumer information and calling it “big data”: a key asset for generating insights, supporting decision making, and enabling automation.
Access to “big data” gives financial institutions more power over their consumers’ personal information, and, in turn, financial institutions must be held accountable for using that data ethically, according to a hearing last week with the House Financial Services Committee’s Task Force on Financial Technology.
“When we are generating this much data and power, it imposes an ethical duty to use the data properly,” Congressman Tom Emmer (R-Minn.), ranking member of the Task Force, said during the hearing.
The power of big data has led to the development of new ways for lenders to underwrite consumers, the Task Force noted. For example, the increased use of big data in financial services has enabled credit reporting agencies like Experian, TransUnion, and Equifax to create new credit scores. Moreover, insurance companies are using IoT data from cars to predict risk, and some mobile lending apps track location to determine how much time their users spend at work.
There is even an app that tracks a consumer’s call history under the belief that people who regularly call their mothers are more likely to repay their loans, witness Seny Kamara, associate professor of computer science at Brown University and chief scientist at Aroki Systems, said during his testimony.
However, despite the innovative ways data can assist financiers, there is a “lack of transparency” with data, Kamara said. For example, consumers often cannot tell why or how a machine learning model reaches its decisions. “There is a bias in decision making, and we are still in the early stages of understanding these algorithms,” he added.
Another witness, Lauren Saunders, associate director at the National Consumer Law Center, called on the Consumer Financial Protection Bureau to play a “bigger role” by supervising data aggregators more heavily.
Task Force members also expressed concern about the dense language most mobile apps present to consumers when asking for permission to access their data. Venmo’s privacy agreement, for example, runs 13,196 words (14 pages) of dense legal jargon.
“I’m an attorney, and I had a hard time going through [Venmo’s] privacy agreement,” said Congressman Stephen Lynch (D-Mass.), chair of the Task Force.
However, mobile apps from lenders like USAA, Bank of America, and Citibank give consumers the option to “kill connectivity” at any time, said Don Cardinal, managing director of the Financial Data Exchange. “Consumers can see very clearly who they permission in [these apps].”
Still, the Task Force was not sold on the idea that it’s a consumer’s responsibility to understand legal terms and conditions. “Consumers don’t know to what extent their data is being used, and it shouldn’t be put on the consumers to know that,” Emmer said.
The Task Force has drafted three pieces of legislation — the No Biometric Barriers to Housing Act of 2019, Safeguarding Non-Bank Consumer Information Act, and Financial Information Data Modernization Act — to combat the issues presented during the hearing.
Members of the Task Force must submit additional questions for the witnesses after the holiday break.