Vital Signs is a weekly economic wrap from UNSW economics professor and Harvard PhD Richard Holden (@profholden). Vital Signs aims to contextualise weekly economic events and cut through the noise of the data affecting global economies.
This week: let’s take a break from the data and analyse why the Chair of the United States Federal Reserve says another financial crisis is unlikely in our lifetimes.
Last week I wrote in this column that Reserve Bank of Australia Governor Philip Lowe was not given to making boring speeches. This week, Federal Reserve Chair Janet Yellen went him one better with her remarks to the British Academy.
Yellen said of the prospect of another financial crisis like that of 2008:
I do think we’re much safer, and I hope that it will not be in our lifetimes, and I don’t think it will be.
Her rationale was that the regulatory changes put in place since the crisis have made the system a great deal safer.
Yellen expressed her hope that these regulations would not be overturned, saying of attempts to do so:
We’re now about a decade after the crisis and memories do tend to fade, so I hope that won’t be the case, and I hope those of us who went through it will remind the public that it’s very important to have a safer, sounder financial system and that this is central to sustainable growth.
One piece of evidence supporting Yellen’s view is the so-called “stress tests” that the Federal Reserve has performed on US banks since 2011.
Just hours before this was written, the Fed approved the proposed dividend payouts of all 34 firms taking part in its annual stress tests. This is the first time since the tests began that every participating firm received a passing grade from the Fed.
So what should we make of Yellen’s prediction? This doesn’t have the ring of “confidence-boosting” rhetoric. It seems as though this is her genuine view.
At this point I should make two things clear. One: I have enormous respect for Janet Yellen, and two: I was fairly surprised by her remarks. So how does one reconcile these?
We still don’t know what caused the global financial crisis
It’s important to understand that the proximate cause of the financial crisis was a kind of “bank run on the system”. In a classic bank run – like those of the Great Depression – depositors go from believing that other depositors think their banks are safe, and are therefore willing to leave their money there, to believing that others consider the banks unsound.
Once investors believe that other investors are going to run on the bank, they want to run too (because banks typically hold less than 10% of the cash required to pay back all depositors). In the language of game theory, there is a switch from the “good equilibrium” to the “bad equilibrium”.
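The two-equilibria logic can be made concrete with a toy payoff model. This is purely an illustrative sketch, not anything from the column itself, and all the payoff numbers are assumptions chosen to show the structure:

```python
# Toy two-depositor bank-run game (all payoff numbers are illustrative).
# Each depositor chooses to "stay" or "run". If both stay, the bank
# remains solvent and both earn a return; because the bank holds only a
# fraction of deposits in cash, a depositor who stays while the other
# runs loses heavily, while the runner salvages something.
PAYOFFS = {
    ("stay", "stay"): (10, 10),   # the "good equilibrium"
    ("stay", "run"):  (-50, 5),
    ("run",  "stay"): (5, -50),
    ("run",  "run"):  (0, 0),     # the "bad equilibrium"
}

ACTIONS = ["stay", "run"]

def best_response(their_action, me):
    """My payoff-maximising action, given what the other depositor does."""
    def payoff(action):
        profile = (action, their_action) if me == 0 else (their_action, action)
        return PAYOFFS[profile][me]
    return max(ACTIONS, key=payoff)

def pure_nash_equilibria():
    """Profiles where neither depositor gains by unilaterally switching."""
    return [
        (a, b) for a in ACTIONS for b in ACTIONS
        if best_response(b, 0) == a and best_response(a, 1) == b
    ]

print(pure_nash_equilibria())  # [('stay', 'stay'), ('run', 'run')]
```

Both (stay, stay) and (run, run) are stable: each depositor’s best move depends entirely on what they believe the other will do, which is why a pure shift in beliefs, with no change in fundamentals, can flip the system from the good equilibrium to the bad one.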
In the financial crisis there wasn’t a run on traditional banks so much as on investment banks and other financial institutions. This led to credit markets (such as the commercial paper and “repo” markets) drying up.
What caused this shift in beliefs is hotly debated. The large increase in US house prices and imprudent lending by US mortgage originators played an important role. The use of collateralised debt obligations (CDOs), and of synthetic versions of CDOs, was also important. A lack of regulatory controls allowed these things to happen, and incentives within financial institutions facilitated and encouraged risk-taking and bad behaviour.
Yet for quite a long time the “good equilibrium” prevailed. What caused the switch?
That is the big question, and I’m pretty confident that not even Yellen knows the answer.
It all comes down to what we believe
What Yellen does seem to be saying is that regulatory responses to the crisis have made bad behaviour much less likely to occur. That certainly seems plausible. And if there is no – or very little – bad behaviour, then beliefs about what others believe may not be particularly important.
But if there is a view that some new form of moral hazard is taking place, and people lose faith in the regulatory regime, then perhaps 2008 could occur again. And soon.
As game theory – and the lessons of the classic Depression-era bank runs – teaches us, it all comes down to what people believe about what other people believe.
How do we account for forces and events that paved the way for the emergence of Islamic State? Our series on the jihadist group’s origins tries to address this question by looking at the interplay of historical and social forces that led to its advent.
In the penultimate article of the series, Harith Bin Ramli traces the Muslim world’s growing disaffection with its rulers through the 20th century and how it created the climate for both the genesis of Islamic State and its continuing success in recruiting followers.
Islamic State (IS) declared its re-establishment of the caliphate on June 29, 2014, almost exactly 100 years after the heir to the Austro-Hungarian Empire, Archduke Franz Ferdinand, was assassinated. Franz Ferdinand’s death set off a series of events that would lead to the first world war and the fall of three great multinational empires: Austria-Hungary (1867-1918), Russia (1721-1917) and the Ottoman Sultanate (1299-1922).
That IS’s leadership chose to declare its caliphate so close to the anniversary of Ferdinand’s assassination may not entirely be a coincidence. In a sense, the two events are connected.
In declaring the resurrection of a medieval political institution almost exactly 100 years later, IS was announcing its explicit rejection of the modern international system, which is built on the idea of sovereign nation states.
Other than the Ottoman dynasty’s very late and disputed claim to the title, no attempt had been made to re-establish a caliphate since the fall of the Abbasid dynasty at the hands of the Mongols in 1258. In other words, Sunni Islam had carried on for hundreds of years since the 13th century without the need for a central political figurehead.
The Abbasid caliphs began to lose power from the mid-ninth century, effectively becoming puppets of various warlords by the tenth. And the caliphate underwent a serious process of decentralisation at the same time.
Key contemporary texts on statecraft, such as Abu al-Hasan al-Mawardi’s (952-1058) Ordinances of Government (al-Ahkam al-sultaniyya), described the caliph as the necessary symbolic figurehead providing constitutional legitimacy for the real rulers – emirs or sultans – whose power was based on military might.
As in the case of the Shi’i Buyid dynasty (934-1048), these rulers didn’t even have to be Sunni. And they were often expected to provide legislation based on practical and functional, rather than religious, considerations.
The Muslim world, then, had arguably already experienced secularisation of sorts before the modern age. Or, at the very least, it had for quite some time existed within a political system that balanced power between religious and worldly interests.
And when the caliphate came to an end in the 13th century, both the institutions of kingship and the religious courts (run by the scholar-jurists) were able to carry on functioning without difficulty.
It was the 19th-century Muslim revivalist and anti-colonial movement known as Pan-Islamism that was responsible for reviving the Ottoman claim to the caliphate. And the idea was revived again briefly in early 20th-century British India as the anti-colonial Khilafat movement.
But anti-colonial efforts after the fall of the Ottoman Empire, even those primarily based on religious beliefs, have rarely called for a return of the caliphate.
If anything, successors of Pan-Islamism, such as the Muslim Brotherhood, have generally worked within the framework of nation states. Putting aside doubts about their actual ability to commit to democracy and secularism, such movements have generally envisioned an Islamic state along more modern lines, with room for political participation and elections.
Modern utopias and old dynasties
So why evoke the caliphate in the first place? The simple answer is that it has never been completely dismissed as an option.
In Sunni law and political theology, once consensus over an issue has been reached, it is hard for later generations to go against it. This was why Egyptian scholar Ali Abd al-Raziq was removed from his post at Al-Azhar University and attacked for introducing a deviant interpretation after he wrote an argument for a secular interpretation of the caliphate in 1925.
As many recent studies show, the idea of the caliphate and its revival has had a certain utopian appeal for a wide spectrum of modern Muslim thinkers. And not just those with authoritarian or militant inclinations.
But, in practice, the dominant tendency here too has really been to seek the liberation or revival of Muslim societies within the nation-state framework.
If anything, national aspirations and the desire to modernise society existed before the formation of the new political order after the first world war. The majority of the populations of Muslim lands welcomed the fall of the three empires, or at least didn’t feel very strongly about the survival of traditional ruling dynasties.
And, with the exception of Saudi Arabia, most dynasties that stayed in power did so by reinventing their states along modern, mainly secular, models.
But this did not always succeed. The waves of revolutions and military coups that swept the Middle East and other parts of the Muslim world throughout the 1950s and 1960s amply illustrate that popular sentiment identified traditional dynasties with the continuing influence of colonial powers.
In Egypt, under the Muhammad Ali dynasty (1805-1952), for example, foreign control of the Suez Canal epitomised the interdependent relationship between the dynasty and Western power. This was why Gamal Abdel Nasser (1918-1970) made great efforts to regain it in the name of Egyptian sovereignty after he became the country’s second president in 1956.
Dissolving political legitimacy
Either way, the success of the new Muslim nation states could be said to be predicated on two major expectations. The first was improvement of citizens’ lives – not only in terms of material progress, but also the benefits of freedom and the ability to represent the popular will through participatory politics.
The second was the ability of Muslim nations to unite against outside interference and commit to the liberation of Palestine. On both counts, the latter half of the 20th century witnessed abysmal failures and an increasing sense of frustration with Muslim leaders.
In many places, populism eventually gave way to authoritarianism. And the loss of further lands to Israel in the 1967 Six-Day War revealed the inherent weakness and lack of unity among the new Muslim nations.
Anwar Sadat’s peace treaty with Israel after the 1973 Yom Kippur War was widely seen as an act of betrayal, for breaking ranks in what should have been a united front. His decision to do so despite lacking popular support in Egypt only revealed the extent to which the country had evolved into a dictatorship.
Sadat’s consequent assassination at the hands of a small radical splinter group of religious militants acted as a warning to other Muslim leaders. Now they couldn’t simply ignore or lock away religious critics, even if the majority of the population still subscribed to the secular nation-state model.
Throughout the late 1970s and 1980s, Muslim leaders around the world increasingly made compromises with religious reactionary forces, allowing them to expand influence in the public sphere. In many cases, these leaders increasingly adopted religious rhetoric themselves.
Showing support for fellow Muslims in the Soviet-Afghan War (1979-1989) or the First Palestinian Intifada provided an opportunity to manage the threat of religious radicalism. National leaders probably also saw this as an effective way to deflect attention from the authoritarian nature of many Muslim states.
The 1990-91 Gulf War also brought non-Muslim troops to Arabian soil, inspiring Osama bin Laden’s call for jihad against the Western nations that took part. And it eventually led to the 2003 US invasion of Iraq, which set off a chain of events that created the chaotic conditions in that country that enabled the rise of Islamic State.
If IS’s leadership is really an alliance between ex-Ba’athist generals and an offshoot of al-Qaeda, as has often been depicted, then we don’t have to go far beyond the events of this war to explain how the group formed. But the rise of Islamic State and its declaration of the caliphate can also be read as part of a wider story that has unfolded since the formation of modern nation states in the Muslim world.
As some commentators have pointed out, it’s not so much the Sykes-Picot agreement and the drawing of artificial national borders by colonial powers that brought about IS.
The modern nation-state model – as much as it’s based on a kind of fiction – is still strong in most parts of the Muslim world. And, I believe, it’s still the preferred option for most Muslims today.
But the long century that has passed since the first world war has been increasingly marked by frustration. It’s littered with the broken promises of Muslim rulers to bring about a transition to more representative forms of government. And it has been marked by a sense that Western powers continue to control and manipulate events in the region, in a way that doesn’t always represent the best interests of Muslim societies.
A high point of frustration was reached in the events of the so-called Arab Spring. The wave of popular demonstrations against the autocratic regimes of the Arab world was seen as the first wind of change that would bring democracy to the region.
But, with the possible exception of Tunisia, all of these countries underwent destabilisation (Libya, Syria), a return to military rule (Egypt) or a further clampdown on civil rights (Saudi Arabia, Bahrain and other Gulf monarchies).
I would hesitate to describe IS’s declaration of a caliphate as a serious challenge to the modern nation-state model. But the small yet steady stream of followers it manages to recruit daily shows it would be wrong to take for granted that the terms of the international order can simply be dictated from above forever.
When brute force increasingly has the final say over how people live their lives, it becomes harder for them to differentiate between the lesser of two evils.
As Venezuela’s finances continue to deteriorate amid the collapse of crude oil prices, the government of President Nicolas Maduro is becoming more paranoid and vindictive.
Venezuela derives the vast majority of its export earnings from oil. With the largest endowment of crude oil reserves in the world, the oil-driven economy worked well for the late Hugo Chavez: he provided generous support for the poor, and built alliances in the western hemisphere by dispensing cash and cheap oil in exchange for political allegiance.
But the state-owned oil company PDVSA has struggled to keep production up. Rather than being reinvested to develop more fields, much of its earnings have been diverted to political and social projects. Chavez also purged PDVSA of thousands of experienced workers, leaving the company short of well-trained staff.
Chavez could paper over the decay of PDVSA’s production base because oil prices were so high in his…