WASHINGTON AI NETWORK PODCAST TRANSCRIPT
Host: Tammy Haddad
Guest: Michael Spence, Nobel Prize-winning economist and author of “Permacrisis: A Plan to Fix a Fractured World”
November 21, 2023
Tammy Haddad: Welcome to the Washington AI Network podcast. I'm Tammy Haddad, the founder of the Washington AI Network, where we bring together official Washington insiders and AI experts who are challenging, debating, and just trying to figure out the rules of the road for artificial intelligence. It was only a year ago that OpenAI released ChatGPT. This AI revolution has been led by industry, and governments around the world are running to catch up. Today's guest is one of the world's most respected economists. Michael Spence is a recipient of the Nobel Memorial Prize in Economic Sciences, among numerous other awards. He is sounding a big alarm, calling for government and business leaders to take action to help the world's economies get out of what he calls a permacrisis. And he says AI is a critical piece to help economies grow. He is joined by Gordon Brown, the former UK prime minister who guided the world through the 2008 economic crisis, and Mohamed El-Erian, president of Queens' College, Cambridge, and chief economic advisor at Allianz, in creating a road map for how to do it in their new book, “Permacrisis: A Plan to Fix a Fractured World.” Michael Spence, thank you so much for being here on the Washington AI Network podcast.

Tammy Haddad: Please tell us about “Permacrisis.”

Michael Spence: Well, thank you, Tammy. Thank you for having me. So Gordon Brown and Mohamed El-Erian and I are friends, and during the pandemic we started talking to each other. By this time we realized that, you know, we were probably encountering some kind of regime change in the global economy and a series of crises. You go back to the Asian financial crisis, the great financial crisis, the sovereign debt crisis here in Europe, the pandemic, a war that started, you know, before the pandemic ended, and now another war, and the geopolitical tensions. And it looked like a pretty confusing, turbulent world that was suffering in terms of economic performance pretty much everywhere. And so we started talking to each other, and at some point somebody said, well, maybe we should write something down. At that point, I think Mohamed said, well, we didn't take any notes. So we had to scramble a little bit, and we wrote this book. The core of the book basically says we need, you know, new or adjusted growth models to overcome these challenges; we need new mindsets and an understanding of a different world in terms of macroeconomic management, which would include monetary policy; and we need to try to arrest the rather startling fragmentation that's occurring in the global economy. Now, you know, we're not unrealistic about this. Gordon has spent his post-prime-ministerial years, you know, involved in international relations as high commissioner for education for the U.N. and a whole range of other things. We know that, you know, national security considerations and geopolitical tensions are not going to simply go away, but we don't want to walk blindly on autopilot down the road where most of the benefits of that postwar global order are kind of given away, with adverse effects for everybody, including especially some lower-income developing countries. So that was the task we set out to try to tackle.

Tammy Haddad: How do you think governments can stop the permacrisis? I mean, what are the actual steps? Because in the book you talk about the importance of growth.
We'll talk later about AI, but it seems to me that no matter what kind of economic issues you're talking about, it starts with government.

Michael Spence: No, that's absolutely right. I mean, when I was working on developing-country growth earlier on and people would ask, you know, what's the main cause of the big failures, it's always governance. It really, you know, overrides simple mistakes in economic policy. So you're absolutely right. Without effective governance, effective economic management, and farsighted, long-time-horizon economic policy, it's very difficult to improve performance and achieve inclusive and sustainable growth patterns. It's just the way the world is.

Tammy Haddad: But with the rise of nationalism all across the globe, what can you do?

Michael Spence: Well, I think you can do things. Take the China-U.S. relationship, for example. There are a lot of sensible people on both sides who are worried about, you know, sort of unrestricted strategic competition. And so we've started a set of visits back and forth, with very, very senior people in government on our side: you know, the Secretary of Commerce and the Secretary of the Treasury and, of course, the Secretary of State; the team varies so much. They're not doing anything unrealistic. They're not trying to pretend that we don't have strategic conflicts or potential conflicts. We're probably going to have to live with restrictions on the flow of critical technologies for the foreseeable future, but we don't have to let that contaminate the entire global trading system, the entire global investment system, and so on. And so I think they're working hard to find a way to ring-fence the strategically important things and maintain a reasonably open, multilaterally underpinned and structured global trading system. And I'm cautiously optimistic. These are very talented people and they're working hard on it. It's not easy, for sure.

Tammy Haddad: And the meetings in the last week seem to bode well for what you're talking about. Because it looks as if China's just as interested as the rest of the world in working together. Is that a good analysis? What's your analysis?

Michael Spence: No, I think that's accurate. I mean, two observations to put this in context. One, the pandemic was a disaster because, you know, all those interpersonal connections at multiple levels, from the most senior people in government through business, through academics, through think tanks, essentially got cut off. And that is a prescription for having slightly unrealistic, kind of cartoonish, you know, characterizations of the other side, which are overcome by interpersonal interaction. And secondly, the level of trust, if I can put it that way, between these two entities has fallen to a dramatic low, even before you get to the hawks on both sides of the equation. So I think the way to think about the San Francisco meeting around the APEC summit is that it's a whole lot better than nothing. You can't expect them to solve every problem, but at least working on climate change together, at least working in a cooperative fashion to deal with data and things related to technology, including their use in war, is probably a pretty good forward step, and so on. So, you know, serious people at multiple levels are trying very hard to avoid an all-out rush to a world that's dominated by only nationalist considerations.
Tammy Haddad: Well, with the war in Ukraine and what's happening in Israel today, I think that sort of added to the permacrisis.

Michael Spence: It did. I was talking to a senior minister in Dubai just last week, and he said, I'm getting used to this, we have one every three months now. And we didn't plan it that way. We were hoping maybe we wouldn't have another one for a while. But, you know, yes, I think it's real. I mean, one of the things you've already seen in the book is that the world is just subject to shocks. It's not just these wars and pandemics and geopolitical crises; climate change is delivering shocks all over the world which disrupt commerce. They've become a kind of macroeconomic factor now. And so one of the things that I think people have observed, and we tried to describe, is that the global supply networks that were constructed before on efficiency and comparative-advantage grounds are now being constructed on the basis of resilience and national security, economic security, energy security. Here in Europe, we're diversifying, you know, at a very rapid rate away from dependence on Russian fossil fuels, but that's a very expensive proposition. So all of these things are adding up to increasing supply-side blockages and constraints, even when you get past the ones that came with the pandemic and subsided, and that's a new set of challenges for growth.

Tammy Haddad: The other thing you talk about is aging populations in the workforce.

Michael Spence: Absolutely. Aging is not an overnight process, right? But it is true. I'll give you two statistics, because they're kind of strikingly different. If you look at where the global economy produces its output, over 75 percent of it comes from countries whose societies are aging. That's essentially all of the developed world. It's certainly true of China. Japan, of course, is part of the developed world, and so on. So you have over 75 percent of global GDP being produced in aging societies where the workforce is sometimes declining, you know, the dependency ratios are rising, et cetera. Now, that batch of countries only accounts for 33 percent of the world's population. The other two thirds are actually quite young, sometimes startlingly young. And when you look at them, the problem is that they could drive a fair amount of growth, but they're too early in their growth and development stages. They're not big enough yet. If India keeps growing at 7 percent, which seems very likely (they're doing awfully well), and some of these other countries get plugged into the global economy and start growing at the rates we've seen in a wide range of developing countries, then sometime down the road the world might look a lot younger in terms of the way it produces, you know, what we invest and consume. But right now we're aging. And there are all kinds of consequences, but when you combine it, Tammy, with the fact that we have labor shortages in major employment sectors in America, including, you know, government, healthcare, traditional retail, and construction, you have, you know, a set of conditions on the supply side we haven't seen in just ages. So the deflationary features of the past 30 years, with, you know, extraordinary developing-country growth
pouring mountains of incremental productive capacity into the global economy; that's not over, but it's fading as a force. And so, when you add them all together, you find yourself for the first time with supply-side constraints that we need to try to overcome.

Tammy Haddad: And then you turn to AI, and you think that AI can really help with all of these issues and build growth. How do you think it actually is going to work?

Michael Spence: Well, it's not a sure thing, but it has the potential. So one of the trends we didn't mention is that we had, especially after the great financial crisis, just a startling decline in an already declining trend in productivity, right? And if you're in a supply-constrained world, that matters. If the main constraint on growth is demand, you know, you can live for a while.

Michael Spence: That's the condition we were in after the great financial crisis. We lived in a world in which, basically, interest rates were zero, mountains of liquidity were pumped into the system, and there was no sign of inflation at all. The reason that happened was there was an immense balance sheet destruction in the great financial crisis that caused consumption to be subdued.

Michael Spence: So to go back to AI, I think the best way to see it is that we have a sequence of breakthroughs in AI. They're quite stunning: you know, language recognition and image recognition were extraordinary. But now we have generative AI, and generative AI has two very striking features, I think.

Michael Spence: One is that it's the first time we have an AI that basically switches domains and knows what you're talking about. You can talk to it about the Italian Renaissance and switch to computer coding, and then talk to it about inflation and whatnot. And while it does odd things at times, hallucinates, makes stuff up, and whatnot, all of these are, you know, not fatal problems. They have to be watched out for. So basically, you know, you can use it anywhere. And the other thing, which I think people realized right away, is that you don't need technical training to use it. I mean, you may need a little practice giving it prompts to get the kind of answers or responses that you want.

Michael Spence: And so, you know, ChatGPT, even before you got to ChatGPT-4, had a hundred million users in the first two months. It's hard to find an example of, you know, adoption picking up at that rate. So where we are is we're in a period of intense exploration and experimentation with this. And, you know, you can't see it in detail, but there's an emerging set of examples which suggest that when it's adopted, and here I want to emphasize something, in what I call the powerful digital assistant mode, as James and I call it (it's usually called augmentation in the literature, as opposed to automation, and a really good way to think about it, I think, one that will evolve, is machine-human collaboration), in that mode, I think, you know, there are just startling productivity gains to be had. You know, doctors don't have to spend, you know, anywhere near as much time writing up the reports that they have to write, because the AIs can write the first draft.

Michael Spence: They can't write the final draft yet, because they make mistakes, right?
So there are just so many examples. Erik Brynjolfsson at Stanford wrote a paper with some colleagues about a big AI trained on audio recordings of customer service interactions, you know, with performance metrics attached to those things.

Michael Spence: They were able to produce an AI that was given to half the agents and not to the other half, and it produced just startling productivity increases. And the big increase was for the least experienced agents, right? Everybody benefited, but for the ones who were very experienced, the benefit in percentage terms was the smallest.

Michael Spence: And then there was a huge increment in performance for the people who were just going down the learning curve. You know, it was a tech environment, so these are people who, you know, couldn't connect to the network or something like that, some very technical, digital-related thing. I mean, that's not the whole economy, but every time I look you'll find an AI system that has the potential either to increase the productivity of people or to increase the productivity of whole systems. They're starting to look at the use of AI basically to increase the transparency and ultimately the efficiency of global supply networks and global supply chains. We'll get a more complete picture as we go along. But what I think is starting to emerge is the awareness that we have the potential to produce a very large surge in productivity and, in some sense, you know, close this gap between demand and supply and enable sustainable and inclusive growth. There are other uses of technology that are pretty impressive. I mean, if you look at the India case, India has built a wonderful sort of digital finance and payment system now, probably the best one in the world. You know, in the rural areas they don't use banks anymore, they don't use cash anymore. They have a biometric identification system; the Jio revolution (you know, Reliance Industries) got, you know, 400 to 500 million new mobile internet users in India, a very large fraction of the population; and they have a thriving digital ecosystem, you know, 91 unicorns and counting being built there. And then they did something really startling that I think, and I hope, our own government does, maybe led by the central bank, which is they created something called the Unified Payments Interface. That means every bank and every payments processor and everything is interconnected.

Michael Spence: With standards, you know, exactly the same standards, so all of a sudden you have a level playing field and all kinds of innovation. And, you know, it's just a huge success. I think it'll enhance growth in India, but it will certainly enhance the inclusiveness of the growth patterns, because it's accessible now to the entire range of the population, including the poor in the rural areas. You know, you can go on and on, and I won't do that, but take applications in medicine from image recognition: you can detect skin cancer reasonably accurately with this. Do we want to replace dermatologists? Absolutely not. But, you know, 85 percent of the world's population doesn't live anywhere near a dermatologist. Right? So they can do the first, you know, preventive screen in a remote location by just sending the images to the AI. And if it's dangerous-looking, then they'll get on a train or, you know, a bus and get the medical care they need.
Whereas right now the alternative is they just don't do it.

Tammy Haddad: I know you deal in the really large issues, and I wonder how much you take into consideration fear of AI, in this example, fear of the haves and have-nots. Because as you talk about this technology and what's going on in India, in the U.S., as you well know, the government's trying to figure out what kind of regulations to put in place. Meanwhile, you've got a lot of people who are saying, well, that's not for me, that's only for the elites. How do you spread it out? Now, I know what you're saying, but, you know, what's the best way to roll that out in this country, and in countries all around the world, so that they feel like they're part of it and it's not something to work against?

Michael Spence: So this is a work in process, but, you know, I'm glad you asked the question because it's really important. So James Manyika and I wrote a paper; it's kind of similar in content to the book, but goes into it in greater depth. And one of the things we're worried about is that the diffusion of this technology will, you know, repeat a previous pattern. The previous pattern, in earlier rounds of digital adoption, was that the tech sector and the financial sector moved ahead rapidly, and a whole bunch of other sectors were, you know, at best slower and at worst inert. If we repeat that pattern, two things will happen: it'll make various kinds of inequality worse, and we won't get the productivity surge, because it won't appear across the entire economy. And that's a place where government has a role. The private sector by itself doesn't necessarily get this job done. There are hundreds and hundreds, thousands, of small and medium-sized businesses that don't have the resources of a JPMorgan Chase to conduct the experiments and figure out how to use this. So it's not that this job falls entirely to the government, but the government has, you know, a very big role, as it has historically, in making sure that there's widespread access, knowledgeability, et cetera, to really important, you know, new technologies. The other one I would mention is that there is a very, very strong bias; I call it the automation bias. Almost everybody, whether, you know, they're in the tech world or in economics, or just fellow citizens, sort of thinks these machines are coming for their jobs, and it's terrifying. So if a hospital administrator starts talking about the use of, you know, AI or generative AI to kind of help with performance in the hospital, people worry; you know, we have lots of examples of that, and you can read it in the press. Erik Brynjolfsson identifies this as kind of what he calls the Turing Trap. So Alan Turing was this genius who kind of gave us the early versions of computers, and he proposed that we evaluate our progress by asking the question: can we produce a machine, now a digital machine, such that when it's interacting with a human, the human thinks they're interacting with another human? And a small step from that is that AI, because it's called artificial intelligence, is generally benchmarked against humans. You know, so in image recognition, the progress was tracked against whether the AI was recognizing images or objects better than the average human. And indeed, in 2016 or '17, somewhere around there, the AI passed the average human; not all humans, but the average.
And then there's one other small step, and this is the dangerous one, where he says, well, if the AI outperforms the humans, why don't we just use the AI instead of the human? And that's where you get this, you know, jump that we think is the jump in the wrong direction. So there are elements of automation. Go back to that doctor example that we talked about a minute ago. If you define the task as writing the first draft, the AI will do it. That's automation, right? The human doesn't have to do it, and they can get on with doing something else, probably more related to their medical expertise, as opposed to writing reports. If you define the task as writing the final report, then it's machine-human collaboration, because the machines are prediction machines, and they don't get everything right all the time, and in multiple applications we can't, and should not, write the human out of the script. And when we do, we get very odd results. I mean, in Washington, I think there's a well-known story of a lawyer who produced a legal brief with ChatGPT. And, you know, it does what they do so well, which is it made up all the legal precedents, and that didn't go over very well. I'm not too worried about that, because people will learn. Well, first of all, the AIs will be equipped with, you know, fact-checking capabilities, given access to various search engines and so on, or, you know, more sophisticated ones like LexisNexis. And second, people will learn that they really shouldn't do that. So that doesn't settle all the issues. And government has an enormously important role in, you know, defining intellectual property rights, copyright. You know, these are really hard problems. So I don't regret that. But what James and I thought, and I may be wrong about this, is that the policy agenda seems very heavily weighted toward protecting ourselves from inappropriate use and various other downside risks, and that's fine. But it needs to be complemented with, you know, accelerating and making sure we take advantage of the full upside potential on the economic side as well.

Tammy Haddad: Have you been following this weekend's departure of Sam Altman, who I would say is the most famous and positive face of AI, being deposed from OpenAI? Now he's been scooped up by OpenAI's partner, Microsoft. Do you have any response to any of that?

Michael Spence: Well, yeah. I mean, first of all, I gather it was a huge surprise or shock in Silicon Valley. It was certainly a huge shock to me, and I imagine almost everybody else who saw it. Yeah, I have a couple of reactions to it. First of all, I should say, you know, I decided that when I needed to try to understand the economic impacts of this, I needed to talk to people who, you know, weren't just writing about it but were actually doing it, to be able to go ask them, what are you actually doing here? And I was fortunate to have some Stanford colleagues and my friend James Manyika, with whom I've worked before, whose background is deeply in computer science and AI, to talk to. And it's well known that in the technology community these people are divided. I mean, if you look at, you know, the volume James edited for Daedalus on AI, which is a wonderful introduction to the subject because it covers so much ground.
If you ask them how far away we are from artificial general intelligence, which means essentially an AI that's like us, the answers range from 10 years to never. I mean, serious people, you know, among the founding fathers of this technology, like Geoffrey Hinton, think there's existential risk. And so it looks like there's an element of that in the kind of, you know, conflict that underpinned this. I mean, we obviously don't know very much at this point. The second thing is, OpenAI has a very odd structure because of its founding as a nonprofit, with a board that basically controls it that doesn't consist mostly of, you know, founders and investors and entrepreneurs, but other folks, and they're going in a different direction. And, you know, I have to say, as a person who's been on boards and whatnot, the notion that you would fire a CEO of that visibility and importance and not have told your stakeholders, you know, except maybe a minute before, strikes me as, let me put it this way, a bit amateurish, to put it mildly. So I don't know what's going on there. I'm a little worried, and I think other people are, that if they have a lot of departures from OpenAI, you'll see, you know, an entity that was a pretty important driving force (not the only one, but a pretty important one) substantially weakened, all apparently in pursuit of, you know, slowing things down and avoiding what some people think of as very serious risks. That's about where I am. I don't think it'll turn into a business school case study of how to manage change at this point.

Tammy Haddad: Yeah, that's for sure. So as of this morning, as we're recording this, 500 employees of OpenAI have said that they would leave and go to Microsoft. What about if you were advising Microsoft? Can you imagine the position they're in? Because they have such a large stake in OpenAI, but yet they're going to work with Sam Altman on a new business.

Michael Spence: Yeah, I mean, it's a very awkward situation to be in. I read somewhere (you know, maybe it was just made up; maybe an AI wrote it) that Satya Nadella was livid when he heard about this. I don't have any doubt. Microsoft has the resources, so with Sam Altman and his colleagues, if enough of them go over, they've got the computing power, they can basically reproduce that. But it seems a shame. And they have contractual relations with OpenAI; I don't know what happens to that. You'll have to talk to lawyers about this, Tammy. I mean, maybe there's a legal case that, you know, if there's nobody left there, they don't have to honor the contracts. But it's all a little bit, you know, up in the air at the moment, I would guess.

Tammy Haddad: I worry about how it looks. I'm thinking about regular people who read about all these, you know, elites, and this technology doesn't really help us. And even though no one said it point blank, it's obviously a fight about commercial products versus, you know, the nonprofit piece that's somewhere in there, and it makes people distrust a technology that's still unfolding. That's my big fear, but I want to turn to national security.
I interviewed Paul Kwan and Teresa Carlson from General Catalyst last week, and they've launched this global resilience fund, and it seems to me that you have the same theme about the change in investment: going from how do we make the most money to let's create these more resilient structures that will elevate the world.

Michael Spence: Absolutely. I mean, I remember when the tsunami in Japan shut down a kind of single-source plant in Japan that made a critical piece for the automobile industry, and the automobile industry, you know, kind of ground to a halt for a while until a replacement was found. And so I thought to myself, you know, these supply chains are very efficient, but they're wound pretty tight, and there's going to be some diversification. Well, there wasn't, right? But now the shocks are so common, you know, and coming from such a wide range of sources, that it's very striking that you have a very distinct pattern of expensive diversification of supply chains led by multinational firms, the architects of these supply networks, but also reinforced increasingly by policy. Right? So when the Secretary of the Treasury says we're going to be in the business of friend-shoring, people are listening. And if there are policies to back that up, incentives and other things, they'll go with it. So it's not just business anymore deciding it's a bit risky and we'd better pay more attention to resilience (in terms of why, I mean, the incentive is still the same long-run rate of return on investment), but now governments are in the game as well. Then when you get to the real national security side, we have severe restrictions on semiconductors depending on where they go. But we also have (and this is going to be expensive too, but it's probably what we ought to do) the need to develop the domestic capacity to produce some of these critical things so that, if it's necessary, we can expand it. It comes in two steps. In other words, we need to be able to do it. And not many people can do what, you know, TSMC in Taiwan can do. So they're being encouraged to kind of come and teach us how to do that. I mean, Microsoft just announced that they had developed their own AI chip. At the bottom of the article it said, and guess who's going to make it? TSMC. It turns out this is really hard to do. This is a foundry. You know, they don't design all the chips, but they sure are good at making the most advanced ones, the ones with 3- and 2-nanometer, you know, transistor separations and stuff like that.

Tammy Haddad: The federal government put $56 billion in to try to fix that. But are you saying they're not getting as far as they should as quickly as they should?

Michael Spence: Oh, no, I think they'll get there. I mean, we've had major, major, you know, government activity in this. This is why people keep talking about the return of industrial policy. So the CHIPS and Science Act puts an enormous amount of money into this. No, I'm not saying we won't get there. It's just hard to do. And there's a reason why there's only one of them, and there's only one ASML in the Netherlands, but that's not a permanent condition. We'll get there. I mean, the CHIPS and Science Act is really interesting. There's a lot of basic investment in science and technology that goes along with it; that's, you know, something that America has done persistently and well. Then there's this: we've got to get some of this capacity onshore.
So, in pursuit of basically safety, resilience, the ability to function if we have to, more or less, on our own. And then there's the part that is just designed to hold China back. And people have varying views on that; I mean, I don't think you can permanently hold China back, but you can probably slow them down a fair amount in the short and medium run.

Tammy Haddad: Before we wrap up, I have to ask you about Elon Musk and X and media, as a longtime producer. I just wonder if you have any comments, first on X and then also on Elon Musk as such a highly visible leader, who's not, you know, a leader of government, obviously, but has such tremendous power.

Michael Spence: He does, right? I mean, he's an extraordinary individual. I mean, you know, Tesla, X, SpaceX, Starlink; you know, I mean, he's the internet for lots of people who live in relatively remote areas around the world, and he can turn it on and off. I mean, that's an enormous amount of power. Now, governments still have the ultimate power, so they can regulate that eventually. Although this is pretty international, and regulating things internationally is not quite the same thing; it's less straightforward than domestically. No, I think there's a huge agenda, and it's not really strictly economics. It has to do with what the impact is of, let's call it the social media, on our democracy, on social interaction, on kids, on other things. This was talked about for a long time. Now I think it's a serious, serious set of concerns. I don't have a huge amount of insight into that. There are economic consequences. I mean, the speed with which Silicon Valley Bank imploded was, you know, almost surely a result of digital technology on the one hand and the social media on the other. So some influential people said this is risky, and you can now do banking in nanoseconds (you know, I'm exaggerating, in a small number of seconds) using your phone. And all of a sudden the deposits started running, whereas people in that world will say, in the old days, when we had a bank run, you had to stand outside the bank. Right. No longer. So, yeah, I mean, I think in some ways these are the hardest issues to sort of get at. Is this (and people are writing about it, and they don't all agree) a serious factor in the polarization in, you know, our society? Or were we going to get polarized anyway, and this is now just the new way of expressing it? That's one where I don't feel I have a firm set of views yet.

Tammy Haddad: Well, because the next thing that happens is, of course, the U.S. election in 2024, but there are quite a few global elections. So you've got the deepfakes, you've got all this AI technology. And the question is what will be the implications, and, of course, the economic implications, of these various victories, like in Argentina yesterday.

Michael Spence: Right. Yeah. Well, Argentina is an interesting case. I grew up in Canada, and, you know, in 1900 Argentina was wealthier, with higher incomes, than Canada. And then things didn't go so well in the 20th century and on into the 21st century. It seems to be a country that has a more than normally highly developed capacity to produce crises, but this really does sound, well, maybe I'm too traditional as an economist, but the idea of getting rid of your central bank strikes me as sort of close to crazy, to use a technical term.
So I don't know what's going to happen there, but, you know, they've had an unusual number of economic and financial setbacks, and this sounds like it may be a setup for another one. I feel very badly for the people of Argentina because, you know, it's a country that is rich in human capital, natural resources, all kinds of things that should make it pretty prosperous, but it doesn't seem to be working so far.

Tammy Haddad: Well, let's hope it gets better along with the rest of the world. One more quick question. Have you talked to the UN about “Permacrisis”? Is there a role for the UN to play?

Michael Spence: Oh, yeah. No, I mean, the latter part of this book, you know, says that one of the things we really should not do, even if we're going to have a more fragmented world in terms of the way we interact in the global economy, even if we're going to have regional trade agreements and stuff, is marginalize the fully multilateral institutions, because they're the governance structure. And, you know, kind of focusing on China-U.S. tensions and kind of forgetting about the other folks, or, you know, all the workarounds; there's a more complicated world out there in the global economy. I think Gordon feels very strongly about this. The way he would put it, Tammy, is we need a global public goods bank, and that should be the World Bank. It should have governance reform so it reflects the underlying economic reality; you know, Belgium shouldn't have the same voice as China anymore. And then it needs, you know, a large amount of capital to mobilize both public and private assets in the sustainability challenge. So that'd be an example. He thinks the IMF, which is a very effective institution even now, needs an expanded mandate to kind of supervise and build a global financial system that's going to have increasingly digital foundations; it will have central bank digital currencies, as well as all these crypto assets, whatever they're called, around. And so the UN has important roles to play in multiple dimensions: conflict resolution, et cetera, protecting people in dire straits, and I won't try to list them all. So, yeah, we'd be fully on board with that. While we acknowledge we live in a more complicated world, it's going to be harder to navigate: regulatory regimes will sometimes put companies in positions where the regulatory requirements they face are actually contradictory, and we've got big challenges in making data mobile without making it dangerous. All kinds of things that we didn't really have to deal with before, and certainly not at the same level, but we still need these institutions.

Tammy Haddad: Excellent. Thank you so much, Michael, for your time today. Much appreciated. Thank you for “Permacrisis,” a new word that will no doubt be used quite a bit in this next year as we navigate all these really tough issues that come along.

Tammy Haddad: I look forward to meeting you. Thank you, sir.

Michael Spence: Thank you, Tammy. Thanks for having me. And I look forward to seeing you.