Our understanding of ethical boundaries has struggled to keep pace with the ever-increasing speed of technological innovation, leading to a growing number of critical questions. In this episode, we interview Dr. Susan Liautaud, a world-renowned expert in ethics in business, to explore her perspective on this complex issue.
Dr. Liautaud discusses why ethics must be integrated into the decision-making process at every stage, and how ongoing innovation can effectively coexist with a strong sense of ethics.
We cover an expansive range of topics, including social media terms of service, privacy implications, and where the ethical line lies in the present and future of technology.
Topics covered in the interview
- Social media and technology
- Scattered power
- Being cautious with Terms of Service
- Holding companies accountable for ethical practices
- From pure intent to crossing the ethical line
- AI governance council
About Dr. Susan Liautaud
Dr. Susan Liautaud is an internationally recognized ethics expert and a leading consultant on ethical matters for corporate, government, and non-profit organizations. She is the author of numerous books about ethical decision-making, including The Power of Ethics and The Little Book of Big Ethical Questions.
Additionally, Dr. Liautaud is the founder of The Ethics Incubator and Susan Liautaud & Associates Ltd., an ethics professor at Stanford University, the Chair of the Council of the London School of Economics and Political Science (LSE), and Vice Chair-elect of the Global Partnership for Education (GPE).
She has provided her expertise while serving on several boards and committees, including the UK Cabinet Office's Advisory Committee on Business Appointments (ACoBA), the Stanford Institute for Human-Centered Artificial Intelligence (HAI), SAP's AI Ethics Advisory Panel, and the Pasteur Institute.
Follow Dr. Susan Liautaud
Show Notes
Dr. Susan Liautaud 00:00
So we have, we as individuals have more power than ever before. To take a simple example, anyone with a cell phone can tutor a child on the other side of the planet, or perhaps engage in a terrorist act. So the technology has given us huge amounts of power.
Susan Sly 00:18
Welcome to Raw and Real Entrepreneurship, the show that dares to bring no-nonsense insight to those who have the courage to start, grow, and scale a business. I'm your host, Susan Sly. Well, hey, everyone, wherever you are in the world today, I just want you to take a deep breath and let it go. Because you know what, you're in the right place. It's the right time. And I know you're here for a reason. My guest today is a global leader in the area of ethics. And you might be saying, well, Susan, you know, ethics. Is that going to be really interesting? I'm just trying to get a business going. I'm trying to grow a business. And here's what I want to say to you. We must never sacrifice our soul on the altar of fame. And it's much better to build a solid foundation than it is to have to go back and clean it up. And my guest today and I are probably going to talk about some epic cleanups that have had to happen. She is a leading international expert in the area of ethics. She's the author of The Power of Ethics. She's the founder of the Ethics Incubator and of Susan Liautaud & Associates Ltd., a consultancy advising corporate, government, and nonprofit leaders that focuses on ethics. This woman is amazing. Like, I honestly need to get an autograph after this episode. She teaches cutting-edge ethics courses at Stanford University, where they have a beautiful tree as a logo, because my daughter Avery almost went to Stanford. So for those of you who are longtime followers of the show, you know I'll throw in a family plug anytime. She also serves as Chair of the Council of the London School of Economics and Political Science and Vice Chair-elect of the Global Partnership for Education. She has been appointed to the UK Cabinet Office's Advisory Committee on Business Appointments, to the Stanford HAI, the Stanford Institute for Human-Centered Artificial Intelligence, which is, this is, like, you know, getting me super powered up and excited, and to SAP's AI Ethics Advisory Panel. And so we're going to talk business, we're going to talk social media, we are going to talk about, as I said, how to avoid disaster and maybe how to clean some up. So Dr. Susan Liautaud, thank you so much for being here on Raw and Real Entrepreneurship. I'm so excited to have you.
Dr. Susan Liautaud 02:53
Oh, it's such a pleasure to be here. And I'm so excited to be here.
Susan Sly 02:56
Well, I just want to jump right in. So we are seeing some of the largest social media companies in the world sending their attorneys to Washington to defend some practices. And I want to jump in and just throw the first question out. We're seeing Facebook being called to task, Apple being called to task. In your opinion, how do you think this required new level of transparency is going to affect the average person going forward, especially people with small businesses who are hoping to monetize their business using social media?
Dr. Susan Liautaud 03:34
So first of all, I love the way you phrased the question, the average person, because indeed, companies like social media companies, and I think we need to distinguish social media from other tech companies, but companies like social media companies do affect the average person and are affected by the average person's interaction with them. I think we're still at a moment in time when the regulation is woefully behind the technology and the risks of the technology. But I should say at the outset that I'm very pro-innovation. So I don't think ethics should be a roadblock. I think ethics should be a way to monitor risk and opportunity, so minimize risk and maximize opportunity. I wish I could say that I thought having armies of lawyers repeatedly go to Washington was going to solve something, but I don't think it is. And I think it needs to be a combination of the companies, and indeed their boards, and it doesn't matter whether the company is two weeks old or whether the company is a Facebook-level company, really thinking about integrating ethics into their decision-making all the time. And it doesn't need to be a burden. And it really can be a situation where small companies and entrepreneurs decide, I'm going to pick my battles. I'm looking at what I'm trying to achieve. And there are one or two particular opportunities that I want to be able to deliver to the world, and there are one or two business opportunities I want to see for myself, but then there are a couple of risks, and I'm going to focus on those. And along with those, I'm going to take care of the basics, meaning sexual misconduct, racism, things that everybody should be making sure they're not engaging in, no matter what profession they're in. So I think we're a ways away from getting some good guidance from regulation. We're still going to be relying on companies, including entrepreneurs, to think about ethics above and beyond the law.
Susan Sly 05:28
Hmm. I love that. And I love what you said about, you know, essentially sending a legion of attorneys to Washington not being the solution, right? We know it's not. I'm gonna ask you, because that just provoked this question, with regard to what's happened with Robinhood. So we're seeing this dramatic shift in the power of the people. So things like meme stocks, meme coins, even, you know, trading platforms like Robinhood, which say, hey, if you don't have whatever the trading price of Apple is today, as we're doing the show it's like $173 or something, if you don't have $173, you can buy a little portion of it, or a little portion of Tesla. But we're seeing, Susan, masses of everyday people calling these large, multi-billion dollar companies to task. In your opinion, do you think this is the new normal? Do you think companies are starting to actually listen to their end users?
Dr. Susan Liautaud 06:37
So you ask a great question that hits on what I believe is one of the most important forces in ethics today across all sectors. And that is what I call in my book scattered power. So we have, we as individuals have more power than ever before. To take a simple example, anyone with a cell phone can tutor a child on the other side of the planet, or perhaps engage in a terrorist act. So the technology has given us huge amounts of power. Robinhood is a great example that you bring up. You know, with commission-free trading, people all of a sudden can engage in financial transactions that they would not have been able to without this technology. Robinhood claims that its mission is to sort of democratize trading. The problem is that there is an enormous amount of risk. So at the time of the big Robinhood scandal, Robinhood's Terms of Service was 33 micro-print pages. Now, I started my career as a corporate lawyer on Wall Street, and I didn't understand 95% of what I read when I read it for research. So of course, most of the people who are engaging in the trading, just like most of the people who are on social media, can't possibly understand the terms of service. So the first thing these companies need to do is to tell the average user: this is the power you have, but be careful, because these are the top three risks. And I mean in plain English, or the plain language of whatever country you may be from, really clear about what the company's limitations might be. So for example, Robinhood had this big liquidity crisis and had to stop trading. It didn't hurt the hedge funds, it didn't hurt the big investors; ultimately, they'll recover. But it hurt the individual. So they need to be much clearer, and not in 33 pages, and not even in 33 words, just, you know, the kind of financial equivalent of smoking kills. It's a really good question about whether companies are listening. There are a number of ways that this power is showing up beyond using products. One is the workforce speaking out. We saw that with Google, teams of workers speaking up against selling drone technology to the military. We saw that with Amazon, employees speaking up against selling cloud services to fossil fuel companies. So at the end of the day, I think companies are listening. But that doesn't mean that senior management teams, particularly in very large companies, have to do exactly what certain groups of employees are telling them to do. They have a whole constellation of considerations when they engage in decision-making. I think it's great that employees have a voice, a much bigger voice, and I think that will continue. But that doesn't mean that a particular group expressing a particular view will get exactly what they want from management.
Susan Sly 09:24
I love, I'm taking copious notes, as I always do. I get to learn so much, which is amazing. Scattered power. And I know for myself, even, you know, in my role at Radius, I read a lot of legal documents. I am not an attorney, we have really great ones. But sometimes, Susan, honestly, here's my confession. I have to print out these documents because, I don't know, maybe it's because I'm turning 50, and I'm like, I have to read them and highlight them. I know on my phone I can't do that. You know, I can't really review a document without printing it, personally. And the average person who doesn't have maybe a lot of business acumen, they're scrolling through on their phone, they just want to get that first trade, it's like, click. Same thing with social media Terms of Use. So let me ask you this. Are there any companies doing it right?
Dr. Susan Liautaud 10:20
I don't know of any that are really doing it right on the terms of service. And I think that's because, back to your original question about lawyers, they are all lawyered up, and the lawyers are there to protect them from liability. So if you actually do make it through all of Facebook's Terms of Service and you get to the end, and this information may be out of date, but at one point, about a year ago, when I was looking at this, the ultimate liability of the company to you was like $10, or $100, or something like that, right? So it's just impossible for the average person to understand. And I also have to print documents. It's just that I sort of grew up that way, but they're also complicated. But I think because they're all driven by protecting companies from legal liability, nobody is really quite getting this right. But what I do say is, we have to be practical with ethics, in particular for entrepreneurs who have many things to do and who don't necessarily have lots of expensive lawyers. And as individuals, and even as companies, what I say is, you can click on I agree without paying attention, which I confess to doing when it's something like renewing your iPhone subscription or your iTunes subscription; not a lot of terrible things will happen with that. I would be a little more careful when you're thinking about your financial situation, where you're agreeing to terms of service and you're actually going to be spending your money or investing your money. And then there would even be a whole other level of more careful on just a few things. And I would include in that extra-careful category things like 23andMe or Ancestry.com. You really want to be careful that you understand the implications of using, you know, home genetic testing kits. Do you really want to know what they're going to tell you? Do you really understand that maybe you're going to have to reveal to an insurance company, who might decline you as a client or raise your rates, if you learn from one of those companies that you have some sort of predisposition for a serious disease? So I just try to be very practical about this and parse things into categories. And, you know, like everybody else, I click I agree very often without reading.
Susan Sly 12:30
So you, you mentioned, you brought up so many great points. And I did my 23andMe. I did it, you know, very early on, and I had really thought about the governance, you know, where does this DNA material go? And my husband said to me, just before Christmas, I regret ever doing it. Right? And so that begs the question, how do we, the people, hold companies accountable for their ethical practices? And, you know, what is the process for doing that?
Dr. Susan Liautaud 13:02
I think it's a great question. And I certainly understand, you know, your husband's comment. I mean, really, we just don't know what's going to happen with that DNA, in part because so many other companies are popping up around the original 23andMe and Ancestry that affect what can happen to it and who can access what. And, you know, we saw, for example, the Golden State Killer case. I mean, that may be one case where we don't have a lot of sympathy that somebody's private situation was discovered and the police were able to make an arrest. But there are a lot of cases where people are finding things out about other people that, you know, nobody consented to. And I write about in the book a case where somebody finds out that they have a half-sibling. Well, that means that a parent had an affair at one point. Well, the other parent wasn't too happy to learn that. The sibling didn't necessarily want to be found, but was. I mean, there are all these kinds of complicated things that go on. But even with our own personal DNA, we really need to be able to think about what we're doing with it. And for example, with research, it's one thing to engage in research with big medical institutions. But when you start doing it with a private company that controls things, you don't know whether your DNA is being used through academically robust research processes, privacy processes, as opposed to the controllers of that stock getting to decide, you know, this particular disease is important to my family, so I'm going to use everybody's DNA samples and do research about disease X. So there are a lot of considerations. And just to say, that's one where there's certainly nothing wrong with doing it, but it's one where we do want to read the fine print first, and for some people, possibly even get some medical advice before doing it.
Susan Sly 14:57
That is such sage wisdom, Susan, and thank you so much for that. I want to go back to San Bernardino, and, you know, Apple making the decision: no, we are not going to disclose. And this was an event that happened on US soil where people were murdered. And Apple said, no, we are not going to do that. What is your opinion on that? Even though there was a team that ended up hacking it, I believe they were from Israel. But what's your thought? Did they make the right decision?
Dr. Susan Liautaud 15:39
It's interesting that you raised this. I'm working on a new book that's going to come out in April, called The Little Book of Big Ethical Questions. What it is, is a Q&A; it's a conversation starter. So I set out a scenario, and then I give guidance. I should say that I never tell people what to do. My opinion on something is not in any way telling someone else what is right for them; I try to help everyone craft their own stories and make the best decisions for them. So on this, what I can say about my opinion, there are a couple of different things. One is that these were terrorists, and we didn't know at the time whether they were connected to other people who could have committed other terrorist acts. We also know that they were dead, which means the privacy implications were not as serious as for somebody who was still alive. On the other hand, Apple has made a commitment to its customers, and it stuck by that commitment. And Apple raises a very good point that if they open it for this kind of thing, then various authoritarian regimes and others around the world can also do so for nefarious reasons. So it is a very complicated case. What I would say is that the real issue is for the customer to know what's possible. And so Apple has been sticking to its commitment to customers not to allow the phones to be opened. But now, what we saw, as you point out, and it was an Israeli company, and also an Australian company, that was able to hack, we need to tell customers: Apple may not do it, but it's hackable. And I think the final thing I would say is that when we're forming an opinion on something like this, I talked to some CEOs after this happened. And I asked them what they would do and how they would feel, for example, if something happened in one of their corporate sites, but they had decided not to help the FBI, you know, another terrorist attack that had been in the planning. And it's always very easy to theorize when you're not going to have to face the real consequences. But I think a lot of them came down to, you know, I just think I need to be sure I'm protecting immediate human life first, and worry about the future security issues later. So they didn't all agree with Apple. But there are a lot of considerations here. And we're facing this kind of company control over technology versus law enforcement in a lot of areas. Another one, also in the policing area, is facial recognition technology. You know, I know you're becoming an expert on AI, so I'm sure that's something that's come across your radar.
Susan Sly 18:18
Yeah, and the philosophy I've always had is good people create good technology, bad people create bad technology. And your point about people saying, hey, we're going to leave Google because this doesn't align with our values, creating weapons or using this technology to do things that we feel are nefarious. That piece around facial recognition. So back in, I'm going to date myself, back in '91 and '92, I was a research assistant at university, and we were working on early facial recognition technology and quantification of crime scenes. So the nobility of it, and I'm going to ask you a question on this, the pure intent at the time was to say, hey, we were very focused on atrocities against women, using the crime scene, attaching a quantifiable value to some different data points, and then helping law enforcement narrow their search. So I spent a couple of years in my university career working on this technology, and again, very noble. And we were able to help law enforcement take a very large search pool of maybe 1,000 potential perpetrators and narrow it down to a very manageable number of, you know, 80 to 100. It started out with nobility, right? Facebook started out with, let's just connect some college kids. At what point do you think a company goes from, we have this pure intent, to crossing the ethical line? Is it something that is very obvious? Or is it something that perhaps is a little less obvious?
Dr. Susan Liautaud 20:18
So I love your question, because we look at something like Cambridge Analytica, and, you know, we hear things like, well, we couldn't know that this was happening. And my answer is, you couldn't have known it in a Harvard dorm room, but certainly you could have known it before it happened. So it's kind of a, when does this happen? And so one of the things that I really advise, especially for entrepreneurs, is to always be asking one question, which is, what could happen, not just what will happen? Because if you're asking what could happen, you're thinking broadly about the risks that could arise. And again, you can't deal with all of them, but you can certainly see earlier on what may be looming on the horizon. And that risk may not have anything to do with the company; it may be that an end user is going to misuse it. So maybe what you need to do is put brakes on how users can use it. An example that was in the news this morning is about sports gambling. So if you have a sports gambling technology, what would you do to make sure you're helping prevent fraud, gambling addiction, etc.? How would you embed that in the technology? That's something that is certainly noble from the time you set it out. Some of the others, it's more complicated, but there is a sort of epidemic of could have known and should have known. Even, I'll say, willful blindness. And sometimes it's the entrepreneurs, but sometimes it's the boards. So one of the biggest responsibilities company boards have, even boards of small companies that may just be a few investors, is to be asking these questions as well. And the facial recognition is interesting; we're still not where we need to be. I think it wasn't completely obvious that it could be racially biased, because, you know, you're collecting data, and we know that we're collecting data sets that are not necessarily evenly distributed. But certainly we know now that it's racially biased, and that facial recognition doesn't do as well with certain skin tones, etc. And so now we're in a situation where we have to say, okay, now we know of these risks, what are we going to do about it? We don't want to sacrifice using that technology to find a lost child, or to find a terrorist. The question is, where do we draw the line? And right now, Microsoft and other companies have said, look, we're not making our facial rec tech available to police forces until we have better laws. That's good, except that we all see how well the regulators have done on getting effective regulation in place. So I think we need to be careful in saying, no, we don't want it randomly used, there's too much racism embedded, there's too much risk that innocent people could have their lives destroyed. But when we're out there at the extreme end, lost children, terrorism, as you say, serious crimes against women, then maybe the balance is we're willing to live with a little bit of risk, and try to reinstate some checks and balances in other parts of the system, including in the judicial system.
Susan Sly 23:28
That is, it's beautiful. And it is this whole ethical debate, because, you know, as a mother, if one of my children went missing, yes, please turn on all access to see everything, you know, turn it on, right. One of the debates we were having at MIT is this discussion around an AI council. And in the EU, they have attempted to do this. One of my thoughts, and I want to get your opinion on this, is, let's think of this AI council in a non-traditional way. Because the first tendency would be to say, let's put all of these heads of technology on there. And what I countered and said is, let's put moms on there. Let's put psychologists on there. Let's put a variety of people on there that can help to create governance laws that currently, to your point, not only don't fully exist, but are also really tough to enforce. Look at, you know, what is happening in China with the social credit score. Whether the Chinese deny it or not, it's there. It determines, based on my good behavior, whether I can get a bank account. So what is your opinion on an AI governance council?
Dr. Susan Liautaud 24:49
So I think a lot of the efforts are good efforts. I mean, people are trying to do what they can do, and they're trying to do what they can do within national legal systems. There's nothing that anybody in Europe is going to do with respect to how China decides to regulate, or use, or misuse facial recognition technology. So it's a very complicated problem, because national legal systems are spectacularly ill-equipped to deal with these cross-border challenges, by the way, like pandemics also, but certainly like AI. I think, you know, as we discussed before the show started, one of my personal missions that cuts across all of my work in different areas is democratizing ethics. So I absolutely agree with you, and I've tried to do a lot of work in this area of bringing in other voices. And that means, as I said earlier, I don't think ethics experts should be the deciders. I think we should be out there helping people think about how to think about their decision-making, but not making decisions for other people. So I'm 100% in favor of bringing in people from all walks of life. And yes, some experts on technology, experts on ethics, experts in psychology. But, you know, the more voices the better, and we have to figure out how to do that. I think, as an intermediate step, councils are better than nothing. At least they're a place for discussion, but I think they need to hold themselves to pretty stringent honesty about their limitations.
Susan Sly 26:19
Hmm. Well, perhaps you and I need to lead that charge, honestly. We'll invite Hal Gregersen. He's an MIT professor who wrote the book Questions Are the Answer. And one thing I've learned from Hal is to just keep asking questions, question after question. And speaking of this, Susan, I have a final question for you. And honestly, for all of the listeners all over the world, you'll know that I get pretty excited about these discussions, because the world we're raising our children in, the world that we're living in now, the people who are making decisions now really are going to affect so many things for decades to come. So Susan, I want to envision it's, let's say, 20 years in the future, and, you know, whether it's Forbes or whoever it is, someone is writing about you. If there was an ethical problem that you could be known for having spearheaded the solving of, what would it be?
Dr. Susan Liautaud 27:24
It would absolutely be this democratizing ethics: embedding it in how we think about everything from voting, to the use of social media, to how we think about truth in society, to gene editing; making sure that we get everybody engaged in the conversation, and also communicating that, to a large extent, it's each of our individual responsibilities. Not to say that corporate leaders don't have outsized responsibility. But, you know, if nobody posts fake news, it's not on there. And if nobody reads and spreads it, it doesn't get read or spread. So definitely this whole notion of democratizing ethics.
Susan Sly 28:05
Hmm. Well, I'm looking forward to 2042. I just can't believe I said that. Yeah, I'm going to be turning 70. Anyway, for everyone, you know, Susan is on Twitter, she's on LinkedIn, she's on Facebook. Depending on when you're listening to this show, order The Little Book of Big Ethical Questions. I'm excited to read that myself. It's a very interesting world we're living in, and a time when there are so many voices. How do we establish ourselves as ethical people building ethical companies? So you've given me so much to think about today, Susan. I cannot thank you enough.
Dr. Susan Liautaud 28:49
Oh, I thank you. And thank all the listeners who have joined us today. It's really been a pleasure.
Susan Sly 28:55
Well, thank you. So with that, I would encourage all of you to go and follow Susan and go to her website. I'm going to read it out loud: it is www.susanliautaud.com. And thank you again, Susan. And for everyone listening all over the world, go out there and rock your day, and just take that next bold step in pursuing your dreams in a very practical way. And so with that, God bless, and I will see you in the next episode of Raw and Real Entrepreneurship.