It’s banking, Jim – but not as we know it

Trust is the word of the moment in banking: can customers trust banks? Can society? But banking has always been fundamentally a trust business: when borrowers and lenders don’t know one another well enough, they trust a bank to act as an intermediary.

The arrival of big data, social media, more access to previously private information and even artificial intelligence (AI) represents a new era in the trust business – for better and worse – although the essentials of banking won’t change.

" New advances promise [lenders] superior knowledge of customers and the environment in which they are living."
Andrew Cornell, BlueNotes managing editor

What will change – and is changing – is the focus of that trust.

Historically, borrowers trusted banks to give them funds if they satisfied certain criteria; lenders trusted a bank to repay them even if a borrower went broke – that the money they had lent by way of a deposit was safe.

For banks, then, it is hardly new that the better they know the people they lend money to – individuals, companies or countries – the more likely they are to get their money back (or at least to have charged an interest rate high enough to compensate for the risk of default).

The new dimensions in trust are private information and financial advice. These too are not new domains: customers have always trusted banks with personal details and relied on them for financial advice.

What is new is the amount of private information potentially available, especially via social media. And the broader role banks can play in wealth creation via the advice they give – along with the reputational risk when that advice is inappropriate.

That’s why the new data age and advances like AI are so important in banking. Just as computerised databases and risk-based modelling allowed banks to expand their lending beyond their immediate community (where bankers could see and understand the risk first hand), so these new advances promise superior knowledge of customers and the environment in which they are living.

But at what point will consumers fear banks have too much knowledge? And is financial advice truly in the customer’s interest?

BLUNT INSTRUMENTS

Speaking at a recent mortgage industry conference, ANZ’s head of home loans Will Ranken noted regulators were also increasing the pressure on banks to better understand their customers.

He was frank: banks have "relatively blunt data measures at the moment" on their customers' ability to repay. That data also tends to be aggregate, covering demographics or sectors. Borrowers, though, are individuals – consumers or companies.

"What we can start doing with data is not just [examining] the ability to service [a loan] but also [the customer’s] intention to service as well," he told RFi Group's Australian Mortgage Innovation Summit.

"At the moment, we have very general metrics around what is the underlying property. We capture a lot of different property types under the one classification. But as we all know, one house next to another can be very, very different," he said.

That’s true across the financial services sector: what is true for an aggregate data set gives us, at best, a good approximation of the risk posed by any individual borrower.

So the vastly richer data now available via social media and other sources – data which can be better sorted and analysed by vastly more sophisticated models and computers (and eventually AI) – promises superior and more tailored credit decisions. And less privacy.

Promising, yes, but it raises new issues for the industry and society.

HEART OF THE CHALLENGE

At the heart of this challenge is the question of social equity and the rights of individuals. Better knowledge of the risks certain borrowers face should lead to a more accurate allocation of funds. But it may also lead to financial exclusion.

Social media profiles are already used in assessing the suitability of applicants for jobs. What if that is extended to suitability for funds? Would potential borrowers be judged by the company they keep or the places they visit if it is determined these might increase risk? For example, what if an applicant were a frequent visitor to a casino?

There was a fascinating debate on this over recent weeks on ANZ’s internal communications platform MaxConnect, kicked off by software engineer Rupert Jones after BlueNotes published a weforum.org story about Google’s AI translation tool.

The tool created its own language in order to translate between Japanese and Korean without using English as an intermediary – what Google calls ‘Zero-Shot Translation’.

There are implications for banking in such creative elimination of potentially confusing steps, but another executive, Peter Smith, noted big-data analysis of people’s digital footprints was more immediately useful, citing a Motherboard article about the work of Michal Kosinski.

In 2012, Kosinski showed that on the basis of an average of 68 Facebook ‘likes’ by a user, it was possible to predict their skin colour with 95 per cent accuracy, their sexual orientation (88 per cent) and their affiliation with the US Democratic or Republican party (85 per cent).

But it didn't stop there: intelligence, religious affiliation, as well as alcohol, cigarette and drug use, could all be determined from the data.
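To make that concrete, here is a minimal, hypothetical sketch of how a ‘likes’-based prediction can work: a binary user-by-page matrix fed into a plain logistic regression. The data is synthetic and the whole setup is illustrative rather than Kosinski’s actual method, but it shows why a few hundred likes can carry enough signal to recover a private trait.

```python
# Illustrative sketch only: predicting a binary trait from Facebook-style
# 'likes'. The data is synthetic and the model deliberately simple; this is
# not the researchers' actual pipeline.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

n_users, n_pages = 5000, 300          # synthetic users and 'likeable' pages
trait = rng.integers(0, 2, n_users)   # hidden binary trait to predict

# Each page is liked a little more or less often depending on the trait,
# which is what lets a simple model recover the trait from likes alone.
page_bias = rng.normal(0, 1, n_pages)
like_prob = 1 / (1 + np.exp(-(0.4 * np.outer(2 * trait - 1, page_bias) - 1.5)))
likes = rng.random((n_users, n_pages)) < like_prob   # binary user-by-page matrix

X_train, X_test, y_train, y_test = train_test_split(
    likes, trait, test_size=0.2, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))
```

Real-world features are far messier than this toy example, but the mechanics are no more exotic.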

So, as the debate asked: “if Facebook, Google and Apple are best placed to judge personal creditworthiness (by using such data), how long until they monetise this? What form would this take? Selling scores to fintechs or banks or non-bank financiers?”

Interestingly, there has already been a spate of rumours in the US about Amazon’s interest in Capital One, a non-conforming lender, as a way of building out payments and lending.

Capital One operates at the riskier end of the lending spectrum, using what is known as ‘thin file’ lending – lending to customers with little or poor credit history – and is adept at customer acquisition and credit analytics. The argument is that Amazon’s rich customer knowledge would dovetail perfectly.

From a social perspective, before we get to a world where AI can perfectly predict who will repay loans, what the best investment advice for an individual is and how risk should be priced, there will inevitably be a period when the science isn’t perfect.

Some bad credits will still slip through but, perhaps more significantly for society, some people will receive the wrong advice or be denied credit when they would have paid it back.
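Some rough, entirely hypothetical arithmetic shows how large that second group can be. Suppose a screening model correctly flags 95 per cent of eventual defaulters and correctly passes 95 per cent of good borrowers, and that 3 per cent of applicants would actually default; the figures below are invented purely for illustration.

```python
# Hypothetical figures only: how many creditworthy applicants a fairly
# accurate screening model still turns away when defaults are rare.
applicants = 1_000_000
default_rate = 0.03      # assumed: 3% of applicants would actually default
sensitivity = 0.95       # share of eventual defaulters the model flags
specificity = 0.95       # share of good borrowers the model passes

defaulters = applicants * default_rate
good_borrowers = applicants - defaulters

declined_defaulters = defaulters * sensitivity          # rightly declined
declined_good = good_borrowers * (1 - specificity)      # wrongly declined

share_wrong = declined_good / (declined_good + declined_defaulters)
print(f"Good borrowers wrongly declined: {declined_good:,.0f}")
print(f"Share of declined applicants who would have repaid: {share_wrong:.0%}")
```

On those invented numbers, almost two-thirds of the declined applicants would in fact have repaid – which is the social-equity problem in a nutshell.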

ISSUES

There are also competitive issues. ANZ wealth analyst Matthew Nelson drew attention to a UK insurer, Admiral, which wanted to use Facebook data to assess first-time driver risks (its analysis included how often applicants used exclamation marks, which is – as far as I can see – an excellent basis for exclusion). Facebook stopped the project and the network itself holds a patent related to assessing creditworthiness.

As some have noted, this data analysis rests on guilt by association: if you demonstrate similar characteristics to a population, you are likely to behave in the same way – critically, though, only ‘likely’.

Indeed, this is the central conundrum of clinical psychology: decisions are made about individuals using probabilities extrapolated from groups. But even if 999 out of 1,000 members of a population behave a certain way, what if you are the one who doesn’t? Is it just bad luck?

Regulators too are looking at AI to monitor not just human interaction but how other robots are giving advice. IBM has bought the risk-management and compliance consultancy Promontory and intends to link it with its AI genius Watson, creating a sort of regulatory cyborg.

Ultimately, society benefits when capital goes to those who earn the best return but we are still a long way from where those decisions can be handed over completely to robots, algorithms or social media.

Andrew Cornell is managing editor at BlueNotes


The views and opinions expressed in this communication are those of the author and may not necessarily state or reflect those of ANZ.
