How is AI changing power dynamics in the world of money?
Artificial intelligence (AI) and other technologies are shifting power dynamics in financial services, and this change has significant implications for individuals' privacy and for government regulators.
According to Rob Nicholls, Associate Professor in Regulation and Governance at UNSW Business School, the growing use of automation technologies such as AI and machine learning by financial firms and their regulators, together with the rise of FinTech, has produced a notable shift in the power dynamics of financial services.
“The increased use of AI by all service providers leads to some important questions, the main one being the question of fairness. AI systems are known to reproduce human biases and introduce new ones. Can we prevent it?” A/Prof. Nicholls asked.
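By way of a purely illustrative sketch, and not something drawn from A/Prof. Nicholls’ research or the conference material: one of the simplest first checks for the kind of bias he describes is to compare an automated system’s approval rates across applicant groups. The group labels and decision data below are hypothetical.

```python
from collections import Counter

def approval_rate_gap(decisions, group_of):
    """Compare automated approval rates between two applicant groups.

    'decisions' maps applicant id -> True/False (approved or not);
    'group_of' maps applicant id -> group label ("A" or "B").
    A large gap is a crude first indicator that a model may be
    reproducing a historical bias and warrants closer review.
    """
    approved = Counter()
    total = Counter()
    for applicant, ok in decisions.items():
        group = group_of[applicant]
        total[group] += 1
        approved[group] += int(ok)
    rates = {g: approved[g] / total[g] for g in total}
    gap = abs(rates.get("A", 0.0) - rates.get("B", 0.0))
    return rates, gap

# Hypothetical example data
decisions = {1: True, 2: False, 3: True, 4: False, 5: True, 6: False}
group_of = {1: "A", 2: "A", 3: "A", 4: "B", 5: "B", 6: "B"}
print(approval_rate_gap(decisions, group_of))
```

A gap on its own does not prove unfairness, but it is the sort of signal that prompts the closer scrutiny of training data and decision rules that A/Prof. Nicholls is calling for.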
In addition to the bias problem, there is also an issue of the extent to which privacy is maintained in the context of automated decision-making. “After all, AI works in the context of significant amounts of data,” said A/Prof. Nicholls, who was speaking ahead of a UNSW conference on Money, Power and AI: From Automated Banks to Automated States which will be held in person on 29 November and online on 30 November.
AI applications and privacy protections
To preserve privacy, he suggested this data could be anonymised or pseudonymised in some AI applications. “However, in the financial services sector, this presents important challenges, for example in the context of the ‘know your customer’ requirements, where de-identification of data may not be an option.”
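As a minimal sketch of what pseudonymisation can look like in practice (an illustration only, not a description of any particular firm’s approach), direct identifiers can be replaced with keyed hashes so that analytics and AI models work on linkable but de-identified records. The field names and key handling here are hypothetical; in a real system the key would sit in a key management service, not in source code.

```python
import hmac
import hashlib

# Hypothetical secret key; in practice this is held separately
# from the analytics environment and rotated under a key policy.
SECRET_KEY = b"replace-with-managed-key"

def pseudonymise(record: dict, identifier_fields=("name", "account_number")) -> dict:
    """Replace direct identifiers with keyed hashes (pseudonyms).

    The mapping is repeatable, so the same customer always receives the
    same pseudonym: records stay linkable for analysis while the raw
    identifier is removed from the working data set.
    """
    out = dict(record)
    for field in identifier_fields:
        if field in out:
            digest = hmac.new(SECRET_KEY, str(out[field]).encode(), hashlib.sha256)
            out[field] = digest.hexdigest()[:16]
    return out

customer = {"name": "Jane Citizen", "account_number": "063-000 1234567", "balance": 1250.75}
print(pseudonymise(customer))
```

The catch A/Prof. Nicholls points to is that ‘know your customer’ checks require the real identity, so this kind of de-identification may simply not be permissible at that step of the process.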
The potential risks associated with both privacy loss and bias become even greater when the decision-making technology is used by government. A/Prof. Nicholls observed that many of the automated decision-making and AI tools governments are eagerly applying today were developed and trialled in the private sector.
“In some cases, the private sector has rejected their use,” he said. “That rejection is often based on experience of data matching and, in the government context, the experience of Robodebt is still raw. The potential use of AI as a support tool in regulatory enforcement, particularly RegTech, means that the bias issue is as important to businesses as it is to individuals.”
A/Prof. Nicholls will present recent research on support tools for regulatory enforcement at the conference, which is sponsored by UNSW centres and institutes including the CLMR, the Australian Human Rights Institute and the Allens Hub, in collaboration with the ARC Centre of Excellence ADM+S. The conference will feature speakers from academic and professional backgrounds who will address key issues associated with the increased use of AI in the financial services sector and by state actors.
What issues will AI cause in financial services?
“Technology tools, and the associated managerial culture, are often transferred from private corporations to government departments. The major consultancy firms are often the vector for this transfer,” said A/Prof. Nicholls, who noted that there is considerable opacity in how businesses develop automated decision tools.
These tools are often the basis of a unique selling proposition and are protected as trade secrets. As a result, A/Prof. Nicholls said, it is not clear how the industry uses the latest technology or what new tools are under development.
“There is certainty that these tools will soon be used by public administrations. In the context of that opacity, there is scope for a debate about accountability, better regulation, and scrutiny,” said A/Prof. Nicholls, who will also discuss issues associated with the disclosure of market-sensitive information by listed firms and the detection of the effects of financial influencers (or “finfluencers”) at the conference.
As well as formal ASX releases, businesses also use media releases and social media to interact with stakeholders. A/Prof. Nicholls said this can lead to market-sensitive information becoming available without a formal announcement. Separately, finfluencers can cause changes in the stock prices of listed entities, he said.
A/Prof. Nicholls is working on a decision support tool that can identify compliance with continuous disclosure obligations, and he explained that the same approach can also be used to identify stocks that are subject to “pump and dump” activity driven by finfluencers.
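The tool itself is not public, so purely as an illustration of the general idea: one simple signal for flagging possible “pump and dump” candidates is an unusual spike in social media mentions of a stock alongside an unusual price move. The column names, thresholds and data below are hypothetical, not A/Prof. Nicholls’ method.

```python
import pandas as pd

def flag_pump_candidates(df: pd.DataFrame, window: int = 20, z_threshold: float = 3.0) -> pd.DataFrame:
    """Flag days where social media chatter and price both jump well above
    their recent baseline for a given stock.

    Expects columns: 'date', 'mentions' (finfluencer/social posts per day)
    and 'close' (closing price). Names and thresholds are illustrative.
    """
    df = df.sort_values("date").copy()
    df["return"] = df["close"].pct_change()

    # Rolling z-score of daily mentions against the trailing window.
    mean = df["mentions"].rolling(window).mean()
    std = df["mentions"].rolling(window).std()
    df["mention_z"] = (df["mentions"] - mean) / std

    # Crude joint signal: abnormal chatter plus an abnormal price rise.
    df["flagged"] = (df["mention_z"] > z_threshold) & (df["return"] > 0.10)
    return df.loc[df["flagged"], ["date", "mentions", "mention_z", "return"]]
```

Anything flagged this way would only be a prompt for human review, much as a continuous disclosure tool would flag announcements for a compliance officer rather than make the call itself.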
“The aim of this conference is to bring together different people working on various topics in relation to law and regulation of the use of AI tools by financial firms and governments: law scholars and researchers, social scientists, professionals, technology enthusiasts and critics,” said one of the conference co-organisers, Dr Zofia Bednarz. “We rarely talk to each other, focused exclusively on our own perspectives.”
Co-organiser Dr Monika Zalnieriute echoed Dr Bednarz: “This discussion of the issues that the use of AI tools by the financial industry and governments might cause is particularly timely, because of the rising use of AI technology, for example in the pandemic context.”
UNSW’s Money, Power and AI: From Automated Banks to Automated States conference will be held in person on 29 November and online on 30 November. The conference will bring together speakers from industry and academia across multiple faculties, including Engineering (computer science), Business (regulation and governance) and Law. For more information visit the conference website, download the conference program or register here.