The Rizzi family recently returned to the US after three years living in Italy, where a love for all things pizza was cemented. I could write a whole blog about the best Roman pizzeria (it’s Roscioli). So when a colleague told us on the data team about a recent study on pizza and data privacy – MIT undergraduates were willing to relinquish their friends’ personal information in exchange for a free pizza – we were intrigued.

Should we chalk the results up to college students’ love of late-night snacks? Not exactly. This year, the Ipsos Global Trends survey of 33 countries found that despite growing concern around big data, nearly half of respondents said they would trade their information away in return for personalized services and products – a 7 percent increase over the last time the survey was conducted, in 2013. It’s easy to conclude from this that people don’t care about data protection, but that reading ignores the often zero-sum choice presented to consumers, for whom access to vital products and services increasingly depends on digital gatekeepers. The financial inclusion field is no exception.

Unchecked Superpowers?

For the unbanked and underbanked, and those who work with them, the explosion of available data, data-crunching services and data-enabling regulatory schemes brings unprecedented potential. A BFA paper called certain artificial intelligence applications “practical superpowers,” and the possibilities for unlocking wealth and opportunity for the poor do indeed seem extraordinary – from prospecting and credit scoring for thin-file consumers to well-timed nudges for saving, from fraud management and complaint chatbots to access to e-commerce platforms and open banking regimes. At CFI, we feel the case for supporting a myriad of data-driven tools has been well articulated and will continue to be made.

Our perspective is that for these opportunities for financial inclusion to be fully realized, they must also be examined for their risks, pitfalls and unintended consequences for low-income consumers. This lens is not meant to throw a wet blanket on innovation and enthusiasm, but to keep a clear-eyed focus in a world where data is growing by the zettabyte (10^21 bytes) every month, where tens of thousands of individuals become digital immigrants every day and where innovation races ahead of traditional guardrails on data protection and privacy.

Our vision is that digitization is catalytic for the poorest and most vulnerable consumers, not just those who are already on the pathway to digital inclusion. Looking at developed markets where smartphone ownership is widespread, we still see vulnerable communities trapped in a state of data marginalization, due to gaps in technology access, fear of documentation, or failures of mainstream data collection mechanisms. As the use of alternative data to drive financial products grows and other data tracking systems, such as biometrics, expand, developing countries may see more data silos and failures in capturing a holistic picture of the lives of the poor.

We see evidence of this in the consumer base of digital credit products, which tends to skew toward younger men living in urban areas. There are many reasons why women may not be accessing these products, but failing to account for limits on women’s use of smartphones is a structural barrier to collecting accurate data for digital credit decision models. Pushing for rapid onboarding to digital finance without an understanding of these dynamics could entrench the digital divide further, unintentionally excluding those who could benefit most from new products and services because of their incomplete digital footprints.

CFI plans to build on its decade of experience in financial consumer protection to focus on key questions around the promise of data and financial inclusion. And we are raising several of these issues as part of Financial Inclusion Week 2020.

Hot Topics

Responsible Algorithms

The data-driven predictions in the financial inclusion space that have grabbed the most headlines in recent years concern consumer creditworthiness. In economies with less access to reliable financial data, providers are leveraging other sources – the bulk from mobile phones, but also social media, geospatial data and more – to make underwriting decisions. Today there is limited visibility into the types of data that financial service providers collect and how they use them, with few incentives for transparent data practices.

CFI is interested in examining how financial inclusion actors approach the use of algorithms and automated decision-making, especially with alternative data, as these tools become ever more pervasive in the industry. Within this arena are several important topics, including how to effectively measure capacity to repay with alternative data and how to mitigate biased decisions. Although algorithms and automation don’t necessarily lead to discrimination, bias can enter at various stages: data sourcing, processing, model development, or outputs. And the risk of biases that undermine financial inclusion goals is heightened when the data does not paint a full enough picture of a low-income person or customer segment.
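To make the data-sourcing concern concrete, here is a deliberately simplified sketch (all feature names, weights and thresholds are invented for illustration, not drawn from any real provider’s model). It shows how a score built on digital-footprint features can reject an applicant whose phone activity is under-recorded – say, because she shares a handset – even when her capacity to repay matches an approved applicant’s.

```python
# Hypothetical illustration: a toy credit score built on two
# "alternative data" features. The weights and threshold are invented;
# the point is that sparse recorded activity, not repayment capacity,
# drives the outcome.

def footprint_score(monthly_txn_count: int, wallet_age_months: int) -> float:
    """Score in [0, 1] from mobile-wallet activity (invented weights)."""
    activity = min(monthly_txn_count / 30, 1.0)   # capped activity signal
    tenure = min(wallet_age_months / 24, 1.0)     # capped tenure signal
    return 0.6 * activity + 0.4 * tenure

APPROVAL_THRESHOLD = 0.5

# Two applicants with identical repayment capacity. Applicant B's phone
# use is mediated through a family member, so her recorded transaction
# count is low even though her real economic activity is not.
applicant_a = {"monthly_txn_count": 25, "wallet_age_months": 20}
applicant_b = {"monthly_txn_count": 4, "wallet_age_months": 20}

for name, features in [("A", applicant_a), ("B", applicant_b)]:
    score = footprint_score(**features)
    print(f"{name}: score={score:.2f}, approved={score >= APPROVAL_THRESHOLD}")
```

Here applicant A clears the threshold and applicant B does not, illustrating bias entering at the data-sourcing stage rather than in the model logic itself.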

Compounding this, algorithms are often not designed to be interpretable or explainable, and are frequently treated as trade secrets understood by only a few within an organization. Data protection frameworks have emerged in developed markets that grant consumers rights vis-à-vis automated decision-making, and these frameworks rely heavily on consent. As their principles percolate into emerging markets, we wonder what constraints and barriers stand in the way of implementation.

Beyond Consent

While notice, choice and informed consent have been the traditional approach to data privacy, those concepts seem ill-equipped for our data-driven world. For instance, you probably already knew that the apps you use monitor your location to send you custom advertisements, but did you know that a hedge fund might also buy that geolocation data to figure out which stores you like? Or that there is a whole industry of data brokers scraping, compiling and collating bits of data about you to monetize and to make predictions about your behavior? Acxiom, one of the biggest, operates in more than 62 markets and has information on over 2.5 billion consumers. Consent-based approaches seem especially insufficient for low-income or vulnerable consumers, who are often targets of data collection efforts yet less likely to understand or use basic privacy protections. New approaches beyond consent are clearly needed for data protection in financial inclusion (and beyond), but what might those look like in practice, who bears their cost, and who determines their efficacy?

We hope you continue to engage us and challenge us on these issues, not only during this Financial Inclusion Week, but as we roll out our new strategy. There will always be tension – sometimes healthy, sometimes unhealthy – in the use of technology for inclusive finance: will it put better lives within reach? Or will technology overstep and infringe on privacy rights… or worse?

Alex Rizzi

Senior Research Director, Consumer Data Opportunities and Risks

Since joining CFI in 2012, Alex has been an advocate for consumer protection and evidence-based inclusive finance practices.

She manages the responsible data practices research portfolio at CFI which focuses on opportunities and risks for low-income consumers from data-driven financial services. Previously, she was a senior director for CFI’s Smart Campaign initiative, where she managed consumer-protection learning agendas and helped launch the Client Protection Certification Program.

Prior to joining CFI, Alex was a project development manager at Innovations for Poverty Action, where she helped create its microsavings and payments innovation initiative (now part of its global financial inclusion initiative), a research fund focused on impact evaluations of savings products and payment systems. She also worked at the Centre for Microfinance at IFMR Research (now Krea University) in Chennai, India, where she was the head of knowledge management and dissemination.

She has participated in multiple industry working groups on responsible finance, including an advisory group to GSMA’s mobile money certification program and the Microfinance Center’s social performance fund steering committee.

Alex is a graduate of Princeton University and holds a master’s degree from Georgetown’s foreign service school, as well as a certificate in digital money from the Digital Frontiers Institute and The Fletcher School at Tufts University. She speaks conversational French and needs to work on her Italian.

Alex Kessler

Former Research Manager

With a background in journalism and digital media analytics, Alex Kessler played a leading role in CFI’s research and policy work on advancing responsible data practices for inclusive finance and addressing the barriers to women’s financial inclusion. Alex joined CFI in 2017, originally focused on consumer protection and digital finance. In her previous role at Monitor 360 (now Protagonist Technologies), a digital communications consultancy, Alex advised on product development and led strategy engagements for clients in the government, foundation, and corporate sectors. From 2010 to 2013, Alex served as the international editor for The Daily Star newspaper in Beirut, Lebanon, where she also reported on Middle East politics and Lebanese civil society. More recently, Alex led projects examining the barriers to entrepreneurship in the Middle East for the Wamda Research Lab and analyzed Rwanda’s regulatory environment for financial inclusion for The Economist Intelligence Unit. Alex holds a master’s from The Fletcher School at Tufts University, and a bachelor’s in international relations from Tufts.

Explore More

Toolkits and Guides

Privacy as Product: Privacy by Design for Inclusive Finance
