Same product, different price: When AI decides how much you pay
Download The Business Of podcast on your favourite podcast platform.
UNSW Business School Associate Professor Sam Kirshner unpacks the ethics of personalised pricing, where companies use customer data to tailor costs for individuals
About the episode
What if you paid a different price for the same product – not because of demand, but because of your data?
Prices change, whether it’s annual sales or seasonal variability. But what happens when AI sets the price – not just for the market, but for you?
“The scope and the scale and the pervasiveness down to the individual with personalised pricing is pretty unprecedented.”
UNSW Associate Professor Sam Kirshner explores the trade-off between personalisation and privacy, and the importance of responsible AI use as companies employ new personalised pricing models.
This episode is hosted by Dr Juliet Bourke with insights from Professor Barney Tan.
Want to know more?
For the latest news and research from UNSW Business School and AGSM @ UNSW Business School, subscribe to our industry stories at BusinessThink and follow us on LinkedIn: UNSW Business School and AGSM @ UNSW Business School.
Transcript
Dr Juliet Bourke: We're used to prices changing depending on the time of year, whether it's seasonal produce, holiday travel or annual sales. It's known as dynamic pricing, and it's something we've seen for years. But what happens when AI sets the price, not just for the market, but for you?
Dr Sam Kirshner: The scope and the scale and the pervasiveness down to the individual with personalised pricing is pretty unprecedented compared to just when the whole dynamic pricing revolution started to begin with.
Dr Juliet Bourke: Sam Kirshner is an Associate Professor at the UNSW Business School, where he teaches data analytics and the ethics of AI. His research explores how algorithms and AI shape the way people and businesses make decisions: the marvels they can lead to, as well as their darker, more dystopian aspects. Thankfully, many companies are looking to upskill in areas of AI, particularly around responsible AI. So as algorithms rewrite the rules of pricing, will it mean smarter deals or digital price gouging? And where's the line between profit and fairness?
Dr Juliet Bourke: I'm Dr Juliet Bourke, an Adjunct Professor in the School of Management and Governance at the UNSW Business School. This is The Business Of.
Dr Juliet Bourke: So Sam, can you help me understand the difference between dynamic pricing and personalised pricing?
Dr Sam Kirshner: Both of these are techniques that we're just going to see increasingly as we purchase any types of goods and services. So dynamic, very simply, is just a price that is not constant, not static. Prices are always changing. You can even think of our local pubs: we have happy hour, right? That's just a mechanism that brings the price of your favourite house wine or house beer down a little bit to encourage business during times that are a little slower. But the key thing there is that the prices are the same for everyone who walks in the door. It doesn't matter where you're from, what your age is, your gender, everyone gets the same price. Personalised pricing is, I guess, somewhat in the name: factors and characteristics about you personally, often captured through data, are used to create a more custom price. In many ways, you theoretically could have personalised pricing that is static, where they just take a snapshot of you and that generates your price. But in most cases, the characteristics they're using are also dynamic, right? So potentially where you are in the world, or even just the internet connection you're using, could change the price, even with all your other data being constant.
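The distinction can be made concrete with a small sketch. The example below is illustrative only: the feature names and multipliers are invented, not drawn from any real retailer. Dynamic pricing moves with market conditions but is the same for everyone; personalised pricing further adjusts that price using data about the individual.

```python
# Illustrative sketch only: invented features and multipliers, not any retailer's model.

def dynamic_price(base: float, demand_factor: float) -> float:
    """Dynamic pricing: the price moves with market conditions
    (happy hour, peak season), but every customer sees the same number."""
    return round(base * demand_factor, 2)

def personalised_price(base: float, demand_factor: float, customer: dict) -> float:
    """Personalised pricing: the same market price is further adjusted
    using characteristics of the individual customer."""
    price = base * demand_factor
    if customer.get("device") == "mobile":
        price *= 1.05          # hypothetical uplift for mobile shoppers
    if customer.get("repeat_searches", 0) > 3:
        price *= 1.10          # hypothetical uplift for signs of urgency
    if customer.get("abandoned_cart"):
        price *= 0.90          # hypothetical win-back discount
    return round(price, 2)

# Same product, same moment in the market, different prices per person.
print(dynamic_price(100, 1.2))                                  # 120.0
print(personalised_price(100, 1.2, {"device": "mobile"}))       # 126.0
print(personalised_price(100, 1.2, {"abandoned_cart": True}))   # 108.0
```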
Dr Juliet Bourke: We understand dynamic pricing goes up, and I love your example of a pub where it goes down, but we always seem to talk about when it goes up.
Dr Sam Kirshner: I think the reason why we talk about it a lot more when prices increase is because we just take it for granted when they actually decrease. Students often get concessions with their student cards. Same thing once people hit that nice age of 65: things start to become cheaper. But a lot of these examples are just things we've come to expect, and so newer forms of personalised pricing are usually targeted at trying to squeeze a little bit more revenue out of people who can pay for it, ideally. But we do see, obviously, lots of examples where there could be a little more nefariousness in how people are using this. For example, as we use AI more and more and our data is even more on the internet, these models will probably start to be able to tell your patterns, like how often you search for other products or other companies' offerings. Where there are benefits, though, is if you are savvy. You know: I'm really into this shirt, I really like it, so I'll just leave it in my basket. Sometimes 30 minutes later, sometimes a day later, you'll very often get that email back that says there's something in your cart, come back to us and we'll give you, you know, free shipping, or 10% off, or some kind of deal. So in many ways, the people who are more savvy and have a better understanding of this are probably going to end up with better deals. However, if you're not as literate in using tech and you only know this one website or this one app, then that can be exploited.
Dr Juliet Bourke: But I think there's another part to it, which is the lack of transparency around pricing and feeling that someone else is getting a better deal. Can you speak to the recent controversy with Delta Air Lines in the US, where a number of senators questioned the airline over its personalised pricing using AI?
Dr Sam Kirshner: Why it feels very unfair, if you look at the kind of pricing example from the airlines, is that it is just so opaque in terms of what they are actually using to sell me this flight at this price. What is it in my cookies, or my browsing behaviour, or just the country or city that I'm in, that is determining this price? But with such limited options for flights and only so many carriers, what choices do you have? They all have access to the same technology. It's not just that Delta has figured out how to do this and can charge you the higher price. So you would hope that other companies could come in and find a way to offer you something more reasonable, but, similar to gas prices, there seems to be a very cohesive collective movement in terms of everyone trying to keep prices a little higher in these times of need. So unfortunately, whether it's deliberate anti-competitive pricing or just a function of how these algorithms work, the result looks much the same. And there are some very classic examples of even simple algorithms behaving in very anti-competitive ways. A great example was this book, I think it was a biology book on flies. There was only one copy and there were two sellers, and their algorithms basically said: there's another seller with this book, so whatever they sell it at, just add $10 or $15. And then every day it just kept going up and up and up. I think it reached something like $10,000 for this book, because the plan was: if we sell it, I'll just buy it from the other seller and then sell it at a markup. So even this rudimentary algorithm, which wasn't designed to be anti-competitive in any way, shape or form, just ended up being like that. And now, with much more sophisticated AI, there's lots and lots of research showing that if you just use machine learning to come up with these prices very quickly, these algorithms figure out how to be collusive.
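A minimal sketch of the dynamic Kirshner describes, assuming (for illustration only) two simple repricing rules reacting to each other daily. Neither rule is written to collude, and the starting prices and mark-ups here are invented, yet the prices escalate without bound.

```python
# Illustrative sketch, not the actual sellers' code: two automated repricing
# rules react to each other once a day.
price_a, price_b = 40.00, 45.00  # hypothetical starting prices for one book

for day in range(1, 11):
    price_a = round(price_b * 0.99, 2)  # Seller A: slightly undercut B to win the sale
    price_b = round(price_a * 1.30, 2)  # Seller B: mark up over A, planning to buy from A and resell
    print(f"Day {day:2d}: A = ${price_a:,.2f}   B = ${price_b:,.2f}")

# Because 0.99 * 1.30 > 1, every round of updates pushes both prices higher.
```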
Dr Juliet Bourke: But help me out here, Sam, with my moral outrage around personalised pricing, which is based on a perception that I have more capacity to pay than the person sitting next to me, because data has been scraped on me, and so the product has a higher price point for me. And I don't even know that, because I'm not aware of what the person next to me is paying.
Dr Sam Kirshner: Yeah, even now, if you're on your mobile versus your computer in the same spot, you will get different prices for hotels, for airlines, for all of this. So it is just very complex, and I guess this is really where government and policy need to become more active in this space. Fortunately, in Australia there is very strong consumer protection compared to places like the US, and you can see that the way Europe treats data and AI is very different from how America does. And so we kind of adopt principles of both types of economies. If you think about it, it's this trade-off between personalisation and privacy and security. If everything is secure in terms of our data, then you can't use that data to actually target people, right? So in this case, yes, we're safer, things are fairer, but you might not get the convenience and the product recommendations that you want. So there is some value to personalisation that saves you time: Amazon has a gajillion items, and if you're trying to search for even a specific category, having this personalisation is helpful. Obviously, if it goes all the way to completely personalised, it actually puts upward pressure on prices, because they know the perfect item for you, the item that you want most. So there's actually value in picking something in the middle, because if the firms who are competing for you all think they might have some chance, based on your data, that you'll buy, then it actually makes them compete. So you can set it up in a way that does lower prices, theoretically; whether this will happen in practice really just comes back to where Australia sets these thresholds.
Dr Juliet Bourke: So let's go back to data. Where is the data coming from that helps them to be so specific? What kind of data is it scraping so that it really knows me, and then what data fits into personalised pricing? Is it about age? Is it about postcode? LinkedIn shows your university degree. Is it about frequency of travel, how I vote? I mean, what's fitting into that price point?
Dr Sam Kirshner: I think it's not just going to be: we collect this data point here, she's gone to that website, she comes from that neighbourhood, we put this into our pricing model and it comes out with this price. Without knowing for sure, because this is obviously companies' trade secrets, I think it would work a lot more with very large data sets and historical correlations. If you even just think about the way people have been using social networks for targeted advertising, it's usually a full collection of large behaviours: taking so much information, putting that through the model, and the model selecting what it thinks will be the ad that will be most effective. And I don't think they can really say what weight is actually going on which feature, but it's all those things you said, and so much more. I mean, think about it: just about everywhere we go is tracked. There are whole business models built on it. Remember the share bikes that were littered all over Sydney, probably about 2017-2018? I'm pretty sure that was just a strategy to figure out where people travel, to collect that data and to sell that data onwards. And so just that data, when combined with other pieces of information, can give very detailed portraits of people.
Dr Juliet Bourke: I think that's what we rail against, right? That there's this very detailed portrait that is now being used against me, and it's being used against me to charge me a higher price.
Dr Sam Kirshner: Yeah. And I mean, if it's just for a higher price, ultimately, it's not great, but it could be a lot worse, too.
Dr Juliet Bourke: Oh, could be a lot worse.
Dr Sam Kirshner: Well, just think about political ideologies and how much political tension there is right now. I'm sure if things continue to go down the current pathways that seem to be happening in lots of parts of the world, people are probably going to try to be a lot more private about their opinions, about who they want to vote for, right? And so in the end, a lot of this is very sensitive data.
Dr Juliet Bourke: So can we expect more personalised pricing? And how should businesses think about these new pricing models? Here's Professor Barney Tan.
Professor Barney Tan: AI pricing is a hard sell for companies, but the real backlash often stems from how hidden and opaque the process feels to customers. To win back trust, businesses need to invest in transparency and clearly communicate how their AI pricing works. This is where disclosure becomes a powerful tool, and some companies are getting this right. Microsoft, for example, has set out responsible AI principles that include defining a no-go list of areas where they refuse to apply AI, even if it's profitable. And Google has introduced model cards: documents that explain what the model does, the data it uses and how it works. Think of them like nutrition labels for AI. Now, these aren't just compliance tools, they're trust-building mechanisms. When consumers know what data is being used and what isn't, they feel safer, and that's crucial. Most businesses aren't quite there yet, especially mid-sized firms that are still waiting to see how the landscape unfolds. We see a lot of mid-tier companies that are hesitant to act. They're not investing heavily in responsible AI yet. Instead, they're watching the market, waiting to see whether the big players get away with opaque pricing models, or whether they get regulated or shamed into reform. It creates this strange diffusion of responsibility. Ethical leadership in AI, especially in pricing, needs to be proactive; it can't just be reactive to backlash. Otherwise, companies risk losing the trust not just of customers but of regulators too. In the long run, responsible AI isn't just about models. It's about values, empathy and societal outcomes. Ultimately, the companies that will thrive are the ones who align their technology with their values, the ones who see trust not as a given, but as something that has to be earned through disclosure, fairness and empathy. That's the future of pricing in an AI-driven world.
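Tan's "nutrition label" analogy can be sketched concretely. The example below is an assumed illustration only: the field names loosely echo the spirit of Google's published model cards, and every value is invented; it is not any company's real disclosure.

```python
# Illustrative "model card"-style disclosure for a pricing model.
# All names and values are hypothetical, invented for illustration.
pricing_model_card = {
    "model_details": {
        "name": "example-personalised-pricing-v1",   # hypothetical model name
        "owner": "Example Retail Pty Ltd",           # hypothetical owner
        "type": "gradient-boosted price recommender",
    },
    "intended_use": "Suggest discounts and price adjustments for logged-in customers.",
    "data_used": ["purchase history", "cart abandonment events", "device type"],
    "data_not_used": [  # the 'no-go list' idea: explicit exclusions
        "inferred health status",
        "political or religious signals",
        "precise location history",
    ],
    "limitations": "Adjustments capped at +/-10% of list price; exceptions need human review.",
    "ethical_considerations": "Audited quarterly for disparate impact across postcodes and age bands.",
}

for field, value in pricing_model_card.items():
    print(f"{field}: {value}")
```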
Dr Juliet Bourke: So Sam, what else should businesses think about as they adapt to this new world of pricing?
Dr Sam Kirshner: One of the biggest things is strategic, long-term thinking and really educating around AI. What we're seeing, even at the UNSW Business School, is that so many companies are really looking to upskill, particularly around responsible AI. The scope and the scale and the pervasiveness down to the individual is pretty unprecedented compared to when the whole dynamic pricing revolution started, right? So there are a lot of reasons why firms will actually look to use AI less responsibly, because they will talk themselves into it, often based on competitors: everybody in the industry is doing it, so how bad is it if I just do it too? It's kind of like diffusing their responsibility. And so I think competitors, and what competitors are doing, is often a bigger driver than just, oh, we can earn more profit if we do this. It's this idea of either trying to keep up or trying to be the leader of the pack, and I think really trying to understand your company's values, and being very conscious of how you treat your moral decisions with regard to your competitors, will be something that really can help a company be more ethical.
Dr Juliet Bourke: Do you have a view on how a business can use this trend towards personalised pricing ethically?
Dr Sam Kirshner: So I think that's the key element, right? There's a lot of talk about ethical AI, but often the talk is very much centred around the models themselves, and less so about the societal outcomes. So if a company wants to be responsible, which I hope in this country many do, then they really have to think more holistically about what the impact of this is on their consumers. And I think the most common-sense thing to do is: if you were the consumer, and you put your data in and you get a certain price, hold the mirror up to yourself. How do you feel about this? Do you find it fair? So just try to put yourself in other people's shoes. What's interesting is that so far, at least for big, big tech, this never seems to be the case. I remember at the end of The Social Dilemma movie they were interviewing, I think, someone very high up at Facebook, now Meta, and they were asked: so, does your teenage daughter go on social media? And the answer was: I would never let her near the stuff. I mean, they're obviously very clever people; they see the hypocrisy there. But I think that's something we need to be cautious of, and something we really try to instil, even with our students at UNSW Business School: a sense of responsibility, and really using perspective-taking. If you can imagine yourself as other types of consumers with other data, that's even better, but even just as yourself: what would that feel like?
Dr Juliet Bourke: And I think that's the dilemma for people in any business. There's an opportunity to make a profit out of this, but at what cost to how you're perceived in the market, and to whether you're suddenly perceived as a bit shady and sneaky because your algorithm is using this very personal data?
Dr Sam Kirshner: And so hopefully this also encourages other firms to see gaps in the market. I really, really hope that people will start to value privacy even more, and potentially the best example for this is to look at sustainable fashion and fast fashion. For many years, Zara and H&M and all of these companies were, obviously, doing extraordinarily well using fast fashion models. And it took a little while, I think, for people to really catch on to the environmental harm, the labour harms, the people harms and the animal harms that are caused by fast fashion. There was then fairly high demand for clothes that are more sustainable, where people are paid a living wage to actually make these clothes. And while that has definitely grown, and if you're in Sydney, in Paddington, there are so many stores that are centred around this ethos, ultimately the demand is really not there, in my opinion. Unfortunately, I was hoping that by this point in time it would be even more pervasive. But Kmart still obviously does very well, people still shop at Target, and maybe that's just a means thing: a lot of people can't afford that. And so I think, in the end, going back to AI, hopefully there will be a demand for privacy and for fairer pricing, whether that's static pricing or just limiting what types of data are used, in a more transparent and more systematically algorithmic way, so that you can actually see what weight is on which personal characteristic that is leading to this price. But I think ultimately people's convenience will potentially win out. So I think again it really just comes down to being more civic-minded and understanding the big picture of where this is going.
Dr Juliet Bourke: So by the sounds of it, we're going to have more examples of personalised pricing, AI understanding us at an even deeper level and using it for nefarious purposes?
Dr Sam Kirshner: Well, nefarious or not, at this point I don't think people are really thinking that long-term. I mean, look at the number of people who are using ChatGPT as a therapist, literally putting the most personal aspects of their lives into a website. So I think it's also about the consciousness of where this can go. And I think it's the same thing with a lot of these companies: again, there will be markets for people if they can build this demand for responsibility. I will say Apple is probably one of the few companies that, at least a while ago, somewhat ironically, took a stand on privacy, had lots of ads around privacy and tried to make that part of their brand identity, on the surface to the consumer at least, that this is part of what they do. And so trying to encourage more people to think in these terms, and it's great if a brand like Apple is doing this, is kind of what we need in our society to try and push back, because that will then create the markets for firms who are thinking more long-term: how can we do things differently that are both more fair and more transparent?
Dr Juliet Bourke: Thanks to Sam Kirshner for joining us on this episode. Learn more about how AI is reshaping our world and the way we work in our conversation with Professor Frederik Anseel.
Professor Frederik Anseel: Whenever AI makes us more productive in one area, it frees up time, but that time is never freely available as a package. It will be eaten up by new things that you need to do. And so this is also why this whole discussion about AI destroying jobs and making people unemployed is probably not accurate.
Dr Juliet Bourke: The Business Of is brought to you by the University of New South Wales Business School, produced with Deadset Studios. If you're enjoying listening, we'd love it if you left us a rating or review in your favourite app, or you might like to share it with a friend or a colleague.
To stay up-to-date with our latest podcasts as well as the latest insights and thought leadership from the Business School, subscribe to BusinessThink.