Ethical marketing: how to navigate the AI moral minefield for marketers

Link Group CMO Wendy Mak digs into the ethical side of digital marketing, explaining why marketers need to be at the forefront of the conversation and exploring how brand-new AI tools might not be as helpful as they first seem.

About the episode

What makes a great marketer? Maybe a good eye for product design, a deep understanding of market dynamics, excellent communication skills and the ability to give customers what they want. But what about a fierce devotion to ethics?

Wendy Mak is the Chief Marketing Officer at Link Group, a global financial services company driven by digital and data technology. Wendy’s excited about the potential productivity gains promised by brand-new technology like generative AI, but she’s not all-in – yet.

As a marketer, everything Wendy does needs to build trust with consumers and clients, and right now, artificial intelligence tools are threatening to do the opposite. She explains to host Dr Juliet Bourke how she’s leveraging these powerful new technologies without succumbing to their most dangerous pitfalls.  

Professor Frederik Anseel, Interim Dean at UNSW Business School, will then discuss how to develop an ethical framework that’s right for your organisation, including how to get everyone on the same page when deciding what ‘doing the right thing’ looks like in practice.  

Want to know more? 

For the latest news and research from UNSW Business School and AGSM @ UNSW Business School, subscribe to our industry stories at BusinessThink and follow us on LinkedIn: UNSW Business School and AGSM @ UNSW Business School.

Transcript

Wendy Mak: I firmly believe that if you are using Gen AI, or large language models, to create content, to create campaigns, that the byline should definitely 100% be that this was not created by a human.

Dr Juliet Bourke: Wendy Mak is the Chief Marketing Officer at Link Group. 

These days she’s thinking a lot about the tension between tech-driven productivity gains and her duty to craft honest campaigns. This balancing act can lead to a whole range of ethical conundrums for marketers.

So how are today’s top CMOs leveraging modern tech – like generative AI – without alienating their customers?  

I’m Dr Juliet Bourke, and this is The Business Of, a podcast from the University of New South Wales Business School.  

Before we dive into the ethics of modern marketing, let’s get to know Wendy and find out exactly what it is Link Group are selling.

Wendy Mak: Link Group is a financial services organisation powered by digital, data and technology. We work on behalf of a lot of the large businesses and corporates and superannuation funds in Australia. But we are a global business.

Dr Juliet Bourke: Well, let's do a little bit of 101 around marketing, just to make sure that we're all on the same page with it – B2B, B2C, let's get the language right. What does that mean? And what other languages do we need to know so that we're in your territory?

Wendy Mak: Yeah, I think the big one that we're using, particularly for the current role that I'm in, is B2B2C, which means that the people that pay us and pay our bills are business clients – so B2B in the traditional form that you might imagine it – but we perform a lot of work on behalf of these businesses that reaches out and touches their own clients or customers. Meaning, ultimately, we actually interact with and touch the consumer, but not directly. So the direct work that we do in terms of marketing communications is usually towards, initially, the business client of ours. We never directly communicate to a consumer without the permission, or without the oversight, of our business clients. But we certainly are in their universe.

Dr Juliet Bourke: Let's talk about marketing, and the challenges, the unique challenges that you might experience, and particularly in the world at the moment, what is it like for marketers today?

Wendy Mak: I think it's fun and exciting, because we're presented with a whole bunch of new technologies – you literally can't turn around anywhere at the moment without hearing the words Gen AI, or generative AI. And that seems to be the flavour of the month. I think 12-24 months ago, we were all about cybersecurity. But the current environment is all about Gen AI, and how we can apply that to our roles to make them more efficient or effective. Plus, how do we employ Gen AI in the campaigns that we're rolling out, the products that we're developing, the solutions that we're thinking about? So as marketers, we also get to play with these tools. And that's fun and exciting. But I also think it can be a little scary. As marketing professionals, I don't think we've necessarily thought about the frameworks and the guardrails to protect us, to protect our clients and to protect members of the public in how we use these tools. And that's probably where I think the next 12 months is going to be really fascinating.

Dr Juliet Bourke: So what are some examples that you've experienced where you thought, gosh, there needed to be some guardrails, from a marketing point of view? Around ChatGPT, for example?

Wendy Mak: AI and its technology have been around for many, many years. It's not like this is totally new – it's just that ChatGPT has finally put it into the hands of the general public. We are now all able to access this technology so easily, and to do all of these things with it. We have seen, though, that sometimes, as people experiment with these things, you can do things that are not so great. For example, we know that people have been able to create a lot of deepfakes. We know that people have been able to clone celebrities or their image and attach that to a background or to a product that they haven't necessarily endorsed.

Dr Juliet Bourke: I think Tom Hanks is an example.

Wendy Mak: Yes, that's a great example. So these are the things where, you know, it has gone wrong, and we're going to have to start to think about how we deal with that and how we actually prevent it from happening. Ultimately, marketing is about creating trust with whoever you're marketing to and bridging that trust gap. And if you're living in a universe where things can now be faked so easily, a clever kid could do this with the right sort of access. A lot of the tools that are out there are now starting to lock down certain things, but still, it's not that hard for someone who's reasonably smart to get around all of that. So as people start to be able to access and do these things, and essentially fake celebrity endorsement – or, you know, take it onto a scarier platform, such as political voice and that sort of thing – we really are having to be a lot more aware of how we can help our customers understand that they can still trust what they're seeing from us and what they're hearing from us.

Dr Juliet Bourke: So in this new landscape, we talked about generative AI. What else is worrying you from a marketing perspective, as to where marketing is going to go and how you're going to deal with ethical issues in marketing?

Wendy Mak: Yeah. So look, I think there's a couple of examples that I can think of straight away. The first one probably isn't so much about using Gen AI to do deepfakes, but you can use a lot of AI and associated technology and machine learning and all that good stuff to come up with new products and solutions. I'll give you an example. I'm a dog owner and a dog lover, and one of the new products that I've started to see out there, in the US, is people actually creating online consultations and solutions for pet owners and sick pets. They're using a lot of AI, a lot of technology, in diagnosing pets and giving you an outcome and prescriptions. But I think that as marketers, we need to really make it clear in that scenario that this AI and this technology and this online solution shouldn't replace an in-person vet visit. So you have to be ethical in creating your campaign. Because, you know, as a dog owner, the convenience of not having to go to a vet is just amazing to me. But I think as marketers, we should actually be creating campaigns that say, sure, this is going to be really convenient for you, you'll save a bit of money, but these are the instances where we still want you to go and have an in-person consult with a vet – making that really transparent. So that as a consumer, as a pet owner, I understand that this isn't the whole, complete solution, and that I might need to go offline to seek adequate pet care for my dog.

Dr Juliet Bourke: So is this almost like the product disclosures we hear in advertisements in the US when they're selling some medication, and you hear this really fast at the end – you know, 'may cause dizziness'?

Wendy Mak: It is a little bit, but I think we can take that one step further and actually be really clear about what this is. Rather than just a little footer at the bottom or a fast disclaimer that's read out at the end of an ad, it is actually about being very transparent – whether it's a brochure or a microsite or however you want to communicate it – and saying: here are the steps that are involved in our product; online might only play a role in this initial stage of diagnosis, but at the end of that phase, that's where the service we offer stops, and you need to go and see a vet. So I think it's about being as clear as that, and being really transparent and honest about where your product stops and starts.

Dr Juliet Bourke: What about marketing at its heart, though. So marketing campaigns I'm thinking about, where you have a marketing campaign that's been created by Generative AI. Do you put a disclaimer on that? Or do you put a byline on that and say, you know, humans were used in the making of this campaign?

Wendy Mak: I firmly believe that if you are using Gen AI, or large language models, to create content, to create campaigns, that the byline should definitely 100% be that this was not created by a human. It's a really interesting world that we live in. Because if you extrapolate where Gen AI could potentially take us as it starts to mature and evolve and get a lot more sophisticated, you can imagine a universe where Gen AI is writing books, Gen AI is writing movies, Gen AI is writing the blog posts that your favourite thought leader would allegedly once sit and write themselves – and now they don't even have to do that. So are you really reading a thought leadership post and getting learnings from your favourite mentor? Or are you learning that from a bot and a machine? I think that needs to be disclosed. Because if I'm not actually learning and reading and consuming content that's been created by the person that I've signed up with or for, I think it should be really obvious, and we should be saying this wasn't created by a human. The other thing that does is it allows people to make their own decisions about the information that they're consuming. Understanding that a marketing campaign was created by a machine, not a person, allows me to exercise increased judgement and increased questioning as to how much of it I'm going to take on board. Am I going to take it with a grain of salt? Because I know there was a machine behind it. So I think it's really important to have that disclosure. The other thing that I will say is that, as we use these technologies to help us create these campaigns, right now there is still a human at the end of everything. So even if you are sitting there using a large language model to create the content of your latest product brochure, you're the one that's making that request of ChatGPT. So you're potentially bringing your own bias into that request. And as marketers, we therefore also need to think about how we remove the bias of the individual making the request of the large language model that eventually spits out the information we use in our campaigns. So there's a lot in this whole universe of Gen AI that I think we need to work through and have to think about.

Dr Juliet Bourke: So what are the things that you worry about in terms of a potential crisis on the horizon around generative AI and marketing? What's your worst nightmare?

Wendy Mak: So one of the things that I do worry about and lose sleep over is that these tools, as I said earlier, are fun and exciting, but they need guardrails. How do you let your people be productive using large language models? How do you let them use these in their jobs to increase productivity, but not compromise the security of your business or the security of your consumers? Because to use these apps, you have to give them a certain amount of information. It's an exchange model: I'm giving the app something, and it's giving me something back that's greater, bigger, shinier, brighter. But where is that information going? How is it being housed? Is it secure? Here's an example from financial services. Depending on your sector, there are certain very stringent rules about where data can be stored – certain financial data and personal data cannot be stored offshore. This is just in Australia, but I imagine a lot of other countries have very similar views. So the choice of your provider as a marketer, in terms of third-party systems, becomes incredibly critical, because you need to know where your data is being stored. If you're using a third-party provider for a CRM, or anything like that, there's all this personal information going into your provider, and you need to do a lot of due diligence to make sure that that information still complies with whatever regulation you need to comply with, and that that data is not being offshored. But we don't know that about these large language models and the applications and the people that build them – we don't know that about those providers. Where is that data going? How is it being stored? Is it going offshore? Potentially, you might be inadvertently releasing information and breaching the restraints of your role without even realising it, because you're simply thinking, this is a bit of fun, and I'm just going to type something in and see what it punches out. And that's what worries me.

Dr Juliet Bourke: Let's talk about ethics. How do you now understand that ethical framework as a marketing leader more broadly?

Wendy Mak: I think that, fundamentally, your ethical framework as a marketing leader has to come down to where you're working and the product that you're selling. What I mean by that is, my decisions as a marketer, and what I feel is ethical or not ethical, are going to be very different if I'm selling a gaming app or tobacco, versus if I work at an NGO or for a charity. So ethics is a very subjective topic, and it is, to a degree, guided by what you're selling and what you're doing, because, as I said, what you might deem ethical in one industry is quite different to another industry. Ultimately, though, I think you have to ask – and this is again subjective, and I'm sure there's going to be lots of debate around this one – does it do harm? If you are working in an environment where the product is controversial, you can still ask, am I doing harm to the environment? You can take that into a sustainability focus. You can actually think about ethics in a broader sense of the word, and not necessarily just within the confines of who's consuming your product. So if you do have a product that's a little bit more controversial or challenging, think about whether there are ways that you can not do harm in other areas of what you're doing. And that, I guess, maybe is the redeeming force.

Dr Juliet Bourke: That sounds a bit like the Hippocratic Oath, right, do no harm? Okay, so let's buy into that as a frame that might apply across industries but be nuanced in different locations. That sounds good in theory. But what about the grey areas? Where do you see grey in the do no harm, in marketing – even within your own industry, or, if that's too sensitive, in other industries?

Wendy Mak: I think the grey often comes from commercial expectations versus do no harm. And that's a challenging one, because at the end of the day, we are all either employed by someone or we have a board and shareholders to answer to, and there are different drivers for different stakeholder groups. Ultimately, I think the biggest challenge for marketers around ethics and do no harm comes from the commercial expectations of you in your role – especially in a marketing role, where a lot of people look at marketers as the driver of revenue and growth.

Dr Juliet Bourke: Do people within the business have different understandings of do no harm, even within their own business? Does the CFO have a different understanding? Does the CPO? The CRO?

Wendy Mak: I think yes. But I think the successful organisations are the ones where they're all seemingly on the same page. That's what gels a leadership team: they all share a common view as to what do no harm means. That comes back to your good old purpose and vision as an organisation – what is your purpose? If you're all really clear on your purpose and values, then you should all be tracking and aligned towards the same outcome.

Dr Juliet Bourke: Is that your guiding philosophy around this – that it all comes back to your north star, do no harm?

Wendy Mak: Absolutely. You know, especially when you work with a large cohort of colleagues and peers, you have to have a north star, which keeps you all focused and aligned and moving in the right direction. Otherwise, it's just a free for all.

Dr Juliet Bourke: Within any team, there are new people and there are people who've been there a while, and they may have different ideas about that north star, different ideas about ethical frameworks. Do you bring them together, even within your own team?

Wendy Mak: Yep, and I can actually tell you that really easily. I have eight guiding principles within my own team as to how we work. It's a one-pager with eight principles for how we work, which are essentially my expectations of us as a team. It gets rolled out to every new person that joins – I go through it – and when we gather together as a team. It's very, very simple. There are little things like, you know, always bring quality work – don't be the fish that John West rejects, that kind of stuff. Short snippets of what my expectations are. No surprises is another one. Transparency is key. I hate surprises. I don't think the business likes surprises. I don't think my CEO likes being surprised – certainly not within a business context, anyway. So that's another guiding principle for how we operate as a team: the news might be bad, you might have done something wrong, a mistake might have happened, but do not try to hide that and let it be a surprise. Those are the things that I think really help bind us as a team and give us that connective tissue that makes sure we all work in the same way. So we're not just getting to the same outcome – we're striving to get to the same outcome, but doing it in a way that has the same moral and ethical expectations of how we do our jobs.

Dr Juliet Bourke: One of the things that's really interesting at the moment, with the sort of volume of data that's being collected, is the way that marketing can now personalise experiences for people. But I wonder if there's an ethical dilemma around personalising experiences – you know, potentially you are selling things that people don't really need, but you're tapping into their deepest wants.

Wendy Mak: I can't tell you how many bits of useless hair products and appliances I have at home, because Instagram have just shown me so many ads, and I have fallen prey to it every single time – and I actually work in the industry, so I know exactly what's happening. But I still can't help myself, and I click, and there's this new gadget that's apparently going to change how quickly I can do my hair or whatever, and then I'm handing over my US$35 and getting it shipped to me from China. I think that whole personalisation thing has been critical for us as an industry, because we really have seen a lot of commercial success from the ability to serve up such personalised information. Ethically, I don't think this is necessarily a dilemma for an individual organisation per se, but more a larger conversation about us as a society. I think that personalisation, as we get increasingly better at it and it becomes more and more tailored, has the potential to rip apart the commonalities that bind us as a society and a culture. What I mean by that is, in the past, I didn't have streaming apps serving up very specific content that they know will speak to me, that I'll continue to click on, so that I'm watching all sorts of random stuff on Netflix or whatever it is that I'm on. I had the eight o'clock broadcast of Friends in the 90s, and we all watched it in Australia at the exact same time. We only had one channel, one outlet, through which we could consume that. We then gathered the next day at the water cooler and said, ‘Did you see that episode ending?’ And we bonded over that. That's where I think, potentially, the ethical issue of personalisation will get us to: a point where we have to ask whether, as a society and culturally, this is what we really want.

Dr Juliet Bourke: And doesn't that speak to a bigger issue of ethics and marketing – that the marketer is employed by an organisation, usually to maximise the profits of that organisation? But what you're talking about is the unintended consequences, at a societal level, of very deliberate marketing in a personalised way. Do marketers talk about ethics in that way?

Wendy Mak: No, I don't think so. And that's where I think we should open up these conversations, because you're right. Ultimately, there's someone that's paying us – we have shareholders or an owner of some sort – and the commercial outcome is usually what is of utmost importance as we do our roles. There are these unintended consequences, and I can sort of envision how they're going to play out, but I can't see it stopping, because personalisation works. As I just said, I buy all these hair gadgets and that sort of thing, so it clearly works, even on a seasoned professional like myself – why would you stop that as a marketing professional? That's where I think we need to come together and actually have these conversations about, well, if we can't stop it, what can we do to help create a sense of community and society? That might come into some of the more community-based or social aspects of our role. Can we give consumers a different platform? You might run highly personalised campaigns, but can we bring people together as a community in an offline, in-person event? Can we start to create that connection again through things that aren't designed just to sell a product, but to give us a way to network and connect as consumers and lovers of that particular product and brand – while understanding that, in the background, you will still run your ad campaigns at a highly personalised level to encourage maximum ROI from that perspective?

Dr Juliet Bourke: Within you, Wendy, there is this sort of cutting-edge information as to what's coming next, and I wonder if you have a chance now to articulate that. What would be your final words to the audience listening to this, in terms of: this is what I'm sensing is a risk for the future, and this is what I'm sensing is an opportunity for the future that we just haven't leaned into yet?

Wendy Mak: I think the opportunity is there for us to get super, super, super efficient in how we create campaigns using all of this technology. The question for me then becomes, what role do we as humans play in that? And how does that change the skill set of the team that you're looking to bring in? Because if you make the assumption that the universe keeps going the way that it is, then a lot of roles, in marketing and communications especially, may potentially be superseded by a large language model. The skill set that you bring in might be just one of governance and correction, which might be very different to your team and what that looks like today. So as leaders, thinking about what's coming next, you've got to really think about what your team looks like in the future – 10 years from now, 15 years from now.

Dr Juliet Bourke: Marketing leaders of tomorrow will need to become even more nuanced in their ability to make ethical decisions. How will they build that skillset? 

Professor Frederik Anseel is Senior Deputy Dean at the University of New South Wales Business School. He’s an expert in how organisations and their leaders adapt to change.  

Frederik’s been looking at how doing the right thing naturally engenders trust, but the really tricky stuff is agreeing on what the “right thing” is in the first place – and then getting your whole team to act accordingly. It starts with something called “behavioural integrity”, which Frederik thinks is crucial for ethical decisions.

Professor Frederik Anseel: So indeed, ethical behaviour is almost a social phenomenon. And so the question is, what do leaders do? What do they say? How do they behave to make sure that everyone in the team is acting ethically? A couple of years ago, together with my team, we did research on a concept called behavioural integrity. And behavioural integrity is a fancy word for saying that you're actually walking the talk – you're doing what you're saying, right? So what we discovered in our research, when it comes to ethical behaviour, or safety behaviour for instance, is that values, and talking about values, don't matter all that much. You can talk a lot about being an ethical firm and being committed to certain values.

But what people actually do is look at your actions, look at your behaviour, and whether they are consistent and aligned with what you're saying. And so that means that for leaders, there's a tricky aspect: you need to say the right thing, but then you need to act on it, and you need to act in a visible way. And if you want to make sure that people do the right thing as you intended it, you need to talk about concrete behaviours. What does it mean to be authentic? What does it mean to be genuine and trustworthy? So you need to give examples – behavioural examples – and then you need to show it. And when do you show it? In tricky situations. Everyone can behave ethically if it's easy to do. But when it's a very difficult grey area, where you could go both ways, it is so important that leaders act on what they've said. And then it's often advisable, actually, to talk about it afterwards and reflect on it, because some people might not have noticed that there was some sort of paradox there, or a difficult choice. So you might want to reflect and explain: this is what we did, and we could also have acted differently. That sort of open climate, an open culture of talking about ethical dilemmas, how difficult they are and the considerations that come into play, will lead other people in the team to also make those sorts of calculations and try to do the right thing. Because this is about trust, right?

And this is something that we currently clearly see in the AI space. While a lot of businesses are trying to do the right thing, are very cautious, are trying to think, how do we use data, how do we use AI in a responsible way, other businesses are not doing that. And it's very difficult for a consumer to differentiate between the businesses that are doing the right thing and those that are not. So what they will do is generalise: as soon as they see some breach of trust, they will generalise that to any sort of business. So trust, in the online space, is almost a common good. As soon as one party abuses it, or creates mistrust, all the other businesses suffer as well. And so that is a bit of a paradox and a challenge for a lot of businesses these days. When you're in marketing, and when you reach out to consumers, when you talk to consumers, it is quite important to think about how you are perceived in terms of being trustworthy and ethical. In a sense, in a marketing situation, as long as a consumer is aware that this is a marketing situation, they're quite open to nudges, to being influenced, to getting information a certain way. Why? Because they know that that is the situation; these are the rules of the game. And so often, if you want to be perceived as a marketer that behaves ethically, it is just about making people aware: this is the situation we're in, right, we both know the rules of engagement, so to speak. I will not try to influence you in any sort of treacherous, implicit, subtle way, but this is how I want to inform you, and of course I have a product to sell, or a person to sell, or a service to sell. People will not mind that, as long as it's very upfront, open and clear. And so I think in ethical marketing, that is now a generally adopted position.

Dr Juliet Bourke: Although marketing and consumer behaviour have changed dramatically, the principles of great – and ethical – marketing are the same as always.

Your clients, customers and team need to know you’re going to do what you say you’re going to do. 

You need to be authentic to your principles whether you’re dealing with AI or large amounts of data – you always, always need to walk the talk. 

The Business Of podcast is brought to you by the University of New South Wales Business School, produced with Deadset Studios. To stay up-to-date with our latest podcasts, as well as the latest insights and thought leadership from the Business School, subscribe to BusinessThink.
