The challenges of implementing new technology safely – and fairly

At the launch of UNSW’s PRAxIS Lab, experts discussed how to implement emerging technologies while prioritising ethics and equity

Emerging technologies present both opportunities and risks for organisations and society at large. With significant privacy, security and equity issues at stake, transparency and ethics must remain paramount to keep innovation from outrunning broader societal goals.

The implications of technological evolution are far-reaching and extend to corporations, individuals, governments and communities. As generative artificial intelligence (AI) becomes more democratised and other technologies break new ground, risks around design and data biases threaten the fairness of systems and public trust, making the responsible implementation of emerging technologies key to an equitable future.

The speed at which change is occurring presents its own challenge, though, with emerging technology rolling out on a massive scale before governments and organisations can set clear policies. That’s especially true for the government regulatory space, where legislation tends to lag technological developments by about four years and changes can have vast impacts on people, according to Dr Kate Harrington, Director of Corporate Services at the Independent Pricing and Regulatory Tribunal and an expert in public sector digital transformation.

“It is so incredibly complex when it comes to government because most people do not have a choice in the government services they wish to consume,” she said. “We have a responsibility of care to make sure people can still consume services, but we also have to find a balance for the level of protection.”

Dr Harrington was among prominent industry voices and academics who gathered recently to share insights on these challenges at the launch of UNSW Business School’s PRAxIS (Practice x Implementation) Lab, a research lab founded to address real-world problems through applied research and technology transfer. Part of the UNSW Business Insights Institute, the PRAxIS Lab seeks to address the challenge known as the “last research mile” – how to close the gap between research and practice for the benefit of society – with the help of implementation science, according to co-founder Yenni Tim, Associate Professor in the School of Information Systems and Technology Management.

“Despite a lot of excellent research being conducted, much of it remains underutilised and very difficult for people in industry and the community to discover, pick up and use,” A/Prof. Tim said. “Academics and professionals need to collaborate more closely, co-designing research and working together in a more systemic way to continuously create solutions for real-world problems.”

PRAxIS co-founder and School of Information Systems and Technology Management Professor Barney Tan explained that the lab focuses on three themes: creating value for businesses and organisations, generating societal impact, and using digital technology to improve resilience and sustainability for people and organisations. The third theme motivated the launch event’s emphasis on the responsible implementation of emerging technologies.

In addition to showcasing the lab’s work so far with companies and government entities on challenges in areas like healthcare applications and the strategic deployment of generative AI, the launch event included a panel discussion and a ‘reverse pitch’ highlighting how leading practitioners are approaching these challenges and why deeper collaboration on research solutions is essential.

A moment of technological change

The stakes are high, according to Jay Hira, Director of Cyber Security at KPMG Australia, who joined Dr Harrington as well as Dr Manisha Amin, founder and CEO of the Inclusive Design Collective, and Charles Lee, Director of Data and AI at PwC Australia, as panellists in the roundtable event, moderated by Dr Alba V. Olivares Nadal, Senior Lecturer in the School of Information Systems and Technology Management and a member of the PRAxIS Lab.

Mr Hira sees three technology trends having a massive impact on cybersecurity. The first two are AI, with its implications for threat defence and response, and “Internet of Things” (IoT) technology, with its implications for the safety of our devices. IoT’s heightened accessibility “adds convenience to our lives, but at a cost”, Mr Hira said. “These devices open tiny little doors and windows to our households, and if they’re compromised, if they’re not protected properly, if they don’t come with a default security setting, they create access to all the data those devices have access to.”

PRAxIS Lab co-founders Professor Barney Tan and Associate Professor Yenni Tim at the launch of UNSW Business School’s PRAxIS (Practice x Implementation) Lab. Photo: UNSW Sydney

The third trend, still on the horizon, is quantum computing, which has the potential to equip bad actors with encryption-busting capabilities that threaten the security of all information on the internet.

Developments in generative AI are also changing the landscape for responsible AI, according to Mr Lee, who noted that the advent of ChatGPT “largely democratised AI”, the use of which was previously limited, for the most part, to data scientists. But this change has also democratised the risk of AI. “That’s made responsible AI much more important now; now, everyone at an organisation has access to AI,” he said. “How do you govern the use of generative AI by 5000 people in an organisation?”

Need versus problem

However, the panellists agreed that amid such rapid technological change, any responsible approach to innovation should begin with a more fundamental question around whether a given technology presents a solution to a need, or merely to a problem. As Dr Amin said, she is frequently asked about which new technologies promise to improve equity for underrepresented and marginalised communities, and she responds that there are none.

“Everyone’s always asking for the next, best thing – ‘What’s the technology that’s going to make the difference?’ The reality is, there is a technology for everything we want to do,” she said. “The questions we should be asking are, what do we need that technology for in the first place, how are we going to use it, and what is its benefit?”

The key is the context in which the technology will be used, as these solutions are “never one-size-fits-all,” she said. “It’s all context-dependent, and the more we ask for the next brightest and best thing, the more we’re going to forget the things we’ve already designed that actually work well, and the more we think about designing for the fun of designing rather than for the fun of solving something that’s important for society.”

Mr Hira agreed, pointing to the tendency to “go down the path of using all the creative juices” in tackling challenges and the need to “point that in the right direction”. “Are we trying to solve the problems where the need is the greatest, or are we solving a problem that isn’t even a priority?”

This is particularly relevant for government entities deploying emerging technologies, beginning with the fact that they’re spending taxpayer dollars. “There is, first and foremost, the challenge of taking money away from core services,” Dr Harrington said. “Anything we might spend on trialling emerging tech, we’re not spending on education or healthcare or wherever there is a need. So, how do we think about the balance of exploring new tech, continuing to deliver the services we need to deliver and actually addressing a public sector need?”

Deep-seated challenges

The importance of these questions is evidenced by the known limitations and risks of generative AI and other new technologies, including risks around bias and access, the need to balance innovation with regulation and stakeholder expectations, and ethics and privacy concerns.

These are areas where industry and academia are working to find solutions and where greater collaboration would be impactful, according to Mr Lee. A key focus should be on the design of technologies like AI and how humans interact with them. “We think about AI being bad or biased, but, as humans, we’re equally responsible in designing our world,” he said. “We can work together.”

Jay Hira, Director of Cyber Security at KPMG Australia, and Charles Lee, Director of Data and AI at PwC Australia, spoke as panellists at the launch of UNSW Business School’s PRAxIS (Practice x Implementation) Lab. Photo: UNSW Sydney

Social and cultural barriers are also significant. Dr Amin said it’s crucial to keep bias and privilege front of mind when implementing new technologies. Addressing these issues early in the development and implementation process is also essential. “Fundamentally, when we do research, when we start working on innovation, we need to think about the societal context of the people we’re thinking about, and how and when we’re bringing that into our design systems,” she said. “It’s not good enough to wait until the end; it’s not good enough to wait until we have a Robodebt-type issue and then try to fix it.

“If we haven’t thought about it properly up front, if we haven’t thought about where our escape clauses are, it’s not good enough to say, ‘We’ll design a system that’s 100% accurate and safe’, because it never will be.”

Prioritising responsible implementation

With technological advancements proceeding regardless of society’s readiness, organisations and individuals need to know the limitations around technology and take steps to ensure it doesn’t deepen inequities but instead promotes access and inclusivity, especially for underserved communities.

According to Dr Amin, the first thing to consider is who we are designing technology for and whether the design team reflects that community, a community that should itself be part of the design process. Co-design should be built on equal partnership, not mere participation. “It took someone who was blind to design Braille,” Dr Amin said. “It’s not about all of us designing something for vulnerable communities and deciding on what they want; it’s about asking them what they need.”

Another step is ensuring transparency in how technological applications use, store and protect the information we give them, Mr Hira said. Data minimisation should be a key objective. “What are we asking for from our users, and are we limiting it to only what’s needed for the benefit we’re providing them?” he said. “This is why concepts of transparency, security and privacy by design, and data minimisation are key for us to adopt.”

To help ensure responsible technology implementation, Mr Lee said it is important to embed ethics in the process. “Every organisation should have a policy on how they view AI. It shouldn’t be a fluffy point of view. It should say, ‘This is how we view AI, and this is how we’re going to execute it’, so that when people build, they know what to look at,” he said.

Closing the launch, A/Prof. Tim noted that the PRAxIS Lab is now “open for business” and encouraged industry professionals, communities and academics to get involved in co-designed research projects. Prof. Tan agreed, noting that the lab is “fixated on generating societal impact”.

“When you bring authentic problems to us and we help solve them, that’s our pathway for making impact,” he said. “Let’s work together on meaningful solutions to the wicked problems of our time.”
