As laws chase new tech, consumer safety is a risky grey area

Technology is evolving faster than regulation can keep up, so consumer protection frameworks must be all the more robust, warns IPART’s Dr Kate Harrington

Cutting-edge technologies have great potential to improve lives, but they also carry profound implications for how governments serve the public. They’re emerging faster than governments can regulate them, raising pressing ethical and legal challenges and making consumer protection an urgent priority, according to an expert in public sector digital transformation. 

Speaking recently at the UNSW Management Innovation Conference, Dr Kate Harrington, Director of Corporate Services at the Independent Pricing and Regulatory Tribunal (IPART), warned the audience of senior industry professionals, consultants and academics about the accelerating pace of technological change and its far-reaching impacts on public policy. From brainwave-reading helmets and robotic exoskeletons to virtual reality training and digital twins, Dr Harrington described a world where science fiction is becoming reality, and governments are struggling to keep pace.

Augmented reality, for example, is already in widespread use by governments today. The NSW Government – of which IPART is an independent agency – is considering it for applications such as driver testing, which would let a person experience operating a vehicle without putting them on the road, and for mimicking real-world situations in safer employee training.

Speaking at the UNSW Management Innovation Conference, IPART's Dr Kate Harrington said she is cautious about the accelerating pace of technological change and its far-reaching impacts on public policy. Photo: UNSW Sydney

The challenge is that the line of technological advancement “starts to bleed into the issue of where government regulation and policy need to go for the use of some of these technologies”, Dr Harrington said. “The technology exists; the question is how invasive it currently is,” she added, noting that implanted infrared vision already works in mice and is close to human trials, that robot dogs steered by soldiers simply thinking their movements are being trialled in the field, and that surgeons are already using headsets to examine remote patients. “It gives you a sense of how close Hollywood has come to the realities we’ve been talking about today.”

Chris Jackson, Professor of Business Psychology in the School of Management and Governance at UNSW Business School and an organiser of the conference, also noted that many emerging technologies could have an immense impact on society and people, yet remain unregulated. "The problem is that we simply cannot know how technologies will be used and whether they will be used for good or for bad. For example, here in Australia, we are only just beginning to regulate social media after pretty much an entire generation has grown up under its influence," he said.

Regulatory lags

While the risks that accompany the immense potential of these technologies are a focal point for governments, the challenge is in protecting against them in a way that stays relevant. “One of the things about government policy is that it is incredibly slow to act,” Dr Harrington said. “It takes years to make regulatory change in this space, and technology is moving much, much faster than the regulations that protect us.”

Another factor is the evolution new technologies often undergo, making them a moving target for governments, which must balance current and future applications. “If something is invented for use today and we regulate to help protect it, that’s not to say that’s how that technology will deploy in the future,” Dr Harrington said.


This regulation delay leaves consumers vulnerable and raises significant ethical concerns, particularly as technologies like wearable and implanted devices transition from the medical to the consumer space. For instance, brain implants designed to assist individuals with disabilities are increasingly accurate and moving towards mainstream adoption.

“We’re going from a medical procedure to a purchasable, wearable device,” Dr Harrington said. “A lot of regulation at the moment around the use of these technologies sits in the medical sphere, ensuring that when you’re correcting a disability or a disease as a patient, you’re actually engaging, and you have rights. But if you become a consumer and put these things on voluntarily, where do your rights lie?”

Implications for consumers

The ethical implications of these technologies extend beyond individual rights to questions about identity and autonomy, such as neurotechnology that alters users’ personalities. “At what point is an individual’s identity sufficiently altered to become a new identity, and how would you deal with that from a government perspective?” Dr Harrington asked. 

“When does neurotech move from being a medical procedure to a consumer technology? When does the use of available technology violate human rights, and is that different for government and industry? At what point do we move beyond the correction of a medical issue to the creation of a superhuman? We can apply all of these technologies right now to people with particular disabilities, but what about when a healthy person wants an enhancement – is that still medical, or is that something else?”

IPART's Dr Kate Harrington highlighted ethical concerns about neurotechnology and when it moves from being a medical procedure to a consumer technology. Photo: Adobe Stock

Liability issues further complicate the regulatory landscape, as Dr Harrington illustrated with a hypothetical scenario involving a vision-restoring implant.

“If you have an implant that allows you to see to the point where you’re able to drive a vehicle, and the technology company that’s supporting your implants goes bankrupt and your vision is turned off while you’re in the middle of an intersection and you cause an accident, who’s liable – the government, as the regulator that said your vision had been corrected to the point where you’re entitled to have a licence, or the IT company that went bankrupt?” she said. “It’s not a very straightforward problem for government to try to solve.”

Robust protections needed

Despite the regulatory challenges, governments are already deploying many of these new technologies. The NSW Government, for example, uses its Digital Twins program to provide virtual models of physical environments – an important tool in public sector planning. Initially deployed around Bathurst, these models are being expanded across New South Wales to simulate population changes and infrastructure needs.

“We’re modelling cities, modelling impact, modelling transport,” Dr Harrington said. “We’re looking at future populations, where they’re going to live, how they’re going to interact, what kinds of jobs they’re going to do.”


But while the public is generally aware of the data it provides to government, people are quicker to hand over information to companies in exchange for an immediate service, and governments have little power to regulate legally obtained data. This divide highlights the importance of developing robust ethical and privacy protection frameworks as emerging technologies increasingly collide with consumers in unregulated spaces.

“What we’re seeing in government is what we’re seeing in industry; they’re absolutely going hand in hand,” Dr Harrington said. “But the policy and regulation that are protecting our rights are nowhere near where the technology is.”

Prof. Jackson noted that it is almost impossible for government to regulate what might happen in the future. "Let's hope the entrepreneurs and business people at the forefront of these emerging technologies will understand the need for ethics and self-regulation as opposed to just the profit," he concluded.
