Banks turn to smartphone metadata to assess lending risk

Because banks often decline to give loans to those whose "thin" credit histories make it hard to assess the associated risk, in 2015 some financial technology startups began looking at the possibility of instead performing such assessments using metadata collected by mobile phones or logged from internet activity. The algorithm under development by Brown University economist Daniel Björkegren for the credit-scoring company Entrepreneurial Finance Lab was built by examining the phone records of 3,000 people who had borrowed money from a bank in Haiti, looking at when they made calls, how long those calls lasted, and how much money they spent on their phones. Björkegren found that the bank could have reduced defaults by 43% by using the algorithm, in part because the metadata collected by phones in daily use reflects changes in people's circumstances much more quickly than traditional credit reports do. The research also found that the time of day when people make calls and the neighbourhoods they call were useful indicators.
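The article does not disclose the internals of Björkegren's model. As a rough illustration only, the kinds of signals it mentions (time of call, call duration, airtime spend) might be turned into scoring features along these lines; the feature names and thresholds below are assumptions, not the actual algorithm:

```python
from datetime import datetime

def extract_features(calls):
    """Derive illustrative credit-scoring features from raw call metadata.

    calls: list of dicts with 'timestamp' (datetime), 'duration_s' (int),
    and 'cost' (float, airtime spent on the call).
    The feature set is hypothetical, chosen to mirror the signals the
    article mentions (when calls are made, how long they last, spend).
    """
    n = len(calls)
    # Calls placed between 22:00 and 06:00 count as "night" calls.
    night_calls = sum(
        1 for c in calls
        if c["timestamp"].hour >= 22 or c["timestamp"].hour < 6
    )
    return {
        "num_calls": n,
        "avg_duration_s": sum(c["duration_s"] for c in calls) / n,
        "total_spend": sum(c["cost"] for c in calls),
        "night_call_ratio": night_calls / n,
    }

# Toy call records, not real data.
calls = [
    {"timestamp": datetime(2015, 3, 1, 23, 15), "duration_s": 120, "cost": 0.50},
    {"timestamp": datetime(2015, 3, 2, 10, 5),  "duration_s": 300, "cost": 1.25},
    {"timestamp": datetime(2015, 3, 2, 14, 40), "duration_s": 60,  "cost": 0.25},
    {"timestamp": datetime(2015, 3, 3, 2, 30),  "duration_s": 30,  "cost": 0.10},
]
features = extract_features(calls)
```

In a real system, features like these would feed a statistical model trained on repayment outcomes; the point here is only how much behavioural detail even simple aggregates of metadata can reveal.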

While such a system may help those who have been economically marginalised, there are caveats. Users who opt to protect their privacy by refusing to share data may become the targets of suspicion, and the security of the collected data is also a concern. Nonetheless, startups such as Inventure and Lenddo are using similar methods to perform borrower risk assessment in South Africa, the Philippines, and Colombia.

Writer: New Scientist
Publication: New Scientist


What is Privacy International calling for?

People must know

People must be able to know what data is generated by the devices, networks, and platforms they use, and by the infrastructure in which those devices are embedded. People should be able to know and ultimately determine how that data is processed.

Data should be protected

Data should be protected from access by anyone other than the user.

Limit data analysis by design

As nearly every human interaction now generates some form of data, systems should be designed to limit the invasiveness of data analysis by all parties to a transaction or network.

Control over intelligence

Individuals should have control over the data generated about their activities, conduct, devices, and interactions, and be able to determine who is gaining this intelligence and how it is to be used.

Identities under our control

Individuals must be able to selectively disclose their identity, generate new identities, pseudonyms, and/or remain anonymous. 

We should know all our data and profiles

Individuals need to have full insight into their profiles, including full access to derived, inferred, and predicted data about them.

We may challenge consequential decisions

Individuals should be able to know about, understand, question, and challenge consequential decisions that are made about them and their environment. This means that controllers, too, must have insight into and control over this processing.