Anthropic v Trump transcript

Introduction

This is “What Just Happened?,” the podcast that looks at the biggest brand crises of our time, what they meant for organisational strategy and behaviour, and their lasting impact on our approach to crisis communication.

I’m Kate Hartley. And I’m Tamara Littleton. And together, we’ll delve into what happened, why it mattered, and whether it could happen again.

Episode 

Tamara Littleton: Welcome back to What Just Happened?, and there have been some really interesting developments in the world right now around AI. Kate, do you want to give a quick summary of what you’re seeing at the moment?

Kate Hartley: Yeah, sure. It’s been really, really fascinating. There was the news last Friday that Trump said he was going to order all US government departments to stop using Anthropic, the maker of Claude AI. And for anyone who doesn’t know the context, Anthropic said it wasn’t going to accept new terms of use from the US Department of Defence on how Claude could be used by that department.

So that included giving the US military full access to AI tools without having a say on how those tools were used. And I listened to a really interesting interview with Anthropic’s CEO, who said there were two things, basically, which were his red lines in the terms, specifically the potential for use in domestic mass surveillance and fully autonomous weapons, partly because of what he said was a sort of lag in the law, and partly because he said AI wasn’t sophisticated enough to do that. And apparently other AI providers have accepted those terms.

TL: I mean, this is such an interesting thing because, I don’t know, maybe I’m being naive, but presumably companies have the right to define their values, their ethics, who they choose to do business with, and how. I mean, so I feel like they should be allowed to turn down business.

KH: And I guess they are. But the issue there is, what’s the cost of doing that? And that’s what’s, I think, really interesting here. And we’re not going to go into the rights and wrongs of whether they should or shouldn’t accept those terms, obviously, but there are some interesting questions that are raised.

And you have to operate within the law, obviously, but beyond that you can define who you work with and who you don’t. But what does that cost you? There’s a financial penalty, fairly obviously, in this case losing a contract, but also potentially losing the trust of the US government.

They’re talking about designating Anthropic a supply chain risk because it wouldn’t accept those new terms. And then there’s the question of whether you’re ready for the potential backlash that might come from either accepting or not accepting the terms. Either way, you’re going to get backlash.

And we’re already seeing that, aren’t we, with some of the providers that have accepted those terms and are getting a bit of a backlash, and Anthropic no doubt will as well for not accepting them. So you just have to weigh it up, don’t you, I suppose.

TL: And I think it is that polarisation, because I know that they were referred to by Trump on Truth Social as a radical left AI company run by people who have no idea what the real world is all about. And so it’s been positioned as this woke, anti-woke issue.

And I think that’s the thing, it’s forcing consumers to make a decision based on their political leanings.

KH: This is the thing, isn’t it? I thought that was fascinating too, because a lot of companies now are saying that they don’t want to be political. They don’t want to get caught up in this polarisation that we’re seeing everywhere.

But this is a great example of how even if you don’t want to be political, you can still get drawn into politics. And I don’t know how you avoid that in this kind of case. I guess if you choose to work with government, then you are already political in a way.

And it is important to say that Anthropic has been working with the Department of Defence, and I think it said it would accept most of the terms; it was just two that it had concerns over. But yes, it’s been pulled into this hugely polarised debate.

And I suppose the lesson is that we all have to be prepared for that, don’t we? And then there’s the very public clash between two very high-profile individuals. It’s becoming a big public spat between Donald Trump and Dario Amodei, the CEO of Anthropic.

So then you’ve got it almost coming away from the company and becoming this kind of clash of personalities, which is really fascinating.

TL: And as you said, we’re not getting into the ins and outs of this whole subject matter, but with a bit of a lens on crisis preparation. Something that is absolutely fascinating to me is, as you said, it’s so public.

And there is now this sort of conscious uncoupling that the company has had to do. They’ve been publicly sacked, and I think they were given six months so that people can switch to a different supplier.

But it’s all done in the public glare. Usually when someone loses a contract, it happens behind the scenes over a period of time.

You end the contract, you move on to a different provider, but they’ve now got to, as I said, consciously uncouple in the public glare. And it feels like that’s something all companies should now prepare for: what happens if they have to end contracts very, very fast.

And maybe that’s a whole sort of area of crisis planning now.

KH: Yeah, God, maybe it is, and that’s something I don’t think we’ve really considered until the current environment that we’re all operating in. As you say, it would have been much less public before, wouldn’t it?

And I know that Amodei said that he would help with any kind of handover to make it easier to do that. It’s really hard because you don’t know what the outcome will be further down the line.

Will there be a reluctance to use Anthropic for companies that serve the US government, for example? Will there be any kind of fallout beyond this immediate contract?

We don’t know. But as you say, we have to prepare for that stuff. And I think it was interesting that he used the term red lines, because I do think one of the things we always say to people is know what your red lines are.

And there is an element of being brave. If you are clear what your red lines are, you shouldn’t really go over them. You can move the red lines, I guess, but if you know where they are and they’re fixed, then that is something you absolutely have to stand by.

But the penalty for that can be absolutely huge.

TL: And I suppose you were talking about the competitors, other companies like OpenAI, stepping in and picking up this contract. I guess any company that crosses over those red lines itself has to expect to be in the firing line for a potential boycott.

KH: Yeah. I suspect we’ll see that on both sides actually. I suspect we’ll see boycotts against Anthropic for doing this and boycotts against OpenAI for taking the contract.

So I think in this polarised world we’ll see boycotts on both sides. That tends to be what happens, doesn’t it?

And we’ve seen this with other organisations in different areas. So we talked on our podcast about Disney and the “Don’t Say Gay” bill, which became a very public spat between Ron DeSantis and Bob Chapek, the CEO of Disney.

And so I think we’ve seen the importance of knowing where your red lines are and sticking to them. But also you have to know that those red lines are right.

And there are all sorts of things you have to do, I think, to prepare for that. So are your employees on board with it? It would be fascinating to know what the Anthropic employees think, and equally what the OpenAI employees think of this as well.

You know, we’ve seen examples, haven’t we, where employees have had a huge say over company direction, perhaps more than ever before. So you’ve got to bring your employees with you.

You’ve got to bring your other customers with you. You’ve got to think about the long-term future of your business beyond political cycles. There are all sorts of things that companies have to consider now.

TL: And I suppose working with the US government, or rather having them as your client, comes with its own risks now anyway.

KH: I think that’s absolutely true.

TL: Fascinating.

Outro

You’ve been listening to “What Just Happened?” with Kate Hartley and Tamara Littleton. If you enjoyed the podcast, please subscribe, rate, and review.