Wednesday, September 9, 2009

Regulation and the Financial Crisis: Myths and Realities

Many regulatory policies were major contributors to the crisis. To proceed without examining past policies, particularly in the areas of housing and bank capital regulation, would preclude learning the lessons of history.

The role of regulatory policy in the financial crisis is sometimes presented in simplistic and misleading ways. This essay will address the following myths and misconceptions:

Myth 1: Banking regulators were in the dark as new financial instruments reshaped the financial industry.

Myth 2: Deregulation allowed the market to adopt risky practices, such as using agency ratings of mortgage securities.

Myth 3: Policy makers relied too much on market discipline to regulate financial risk taking.

Myth 4: The financial crisis was primarily a short-term panic.

Myth 5: The only way to prevent this crisis would have been to have more vigorous regulation.

The rest of this essay spells out these misconceptions. In each case, there is a contrast between the myth and reality.

Myth 1: Banking regulators were in the dark as new financial instruments reshaped the financial industry.

The decade leading up to the financial crisis saw the development and growth of many innovations in the financial industry, including collateralized debt obligations, credit-default swaps, special-purpose vehicles, and private-label mortgage securities. Without getting into the specifics, the overall net result of these innovations was to create what is now referred to as the “shadow banking system.” They allowed banks and other institutions to finance their mortgage security holdings with short-term debt. This meant that financial institutions lacked the liquid reserves to withstand a change in market perceptions of the risk of such assets. It also meant that they lacked sufficient capital to cover losses when the housing market deteriorated.

The dramatic structural changes that took place in the financial industry were not noticed by the general public, and received little coverage even in the financial press. However, it is a myth that financial regulators were unaware of these developments.

The reality is that the policy community observed and approved of the innovations that restructured the financial system. For example, in a speech in 2006, Federal Reserve Board Chairman Ben Bernanke said,

the development of new technologies for buying and selling risks has allowed many banks to move away from the traditional book-and-hold lending practice in favor of a more active strategy that seeks the best mix of assets in light of the prevailing credit environment, market conditions, and business opportunities. Much more so than in the past, banks today are able to manage and control obligor and portfolio concentrations, maturities, and loan sizes, and to address and even eliminate problem assets before they create losses. Many banks also stress-test their portfolios on a business-line basis to help inform their overall risk management.

To an important degree, banks can be more active in their management of credit risks and other portfolio risks because of the increased availability of financial instruments and activities such as loan syndications, loan trading, credit derivatives, and securitization. For example, trading in credit derivatives has grown rapidly over the last decade, reaching $18 trillion (in notional terms) in 2005. The notional value of trading in credit default swaps on many well-known corporate names now exceeds the value of trading in the primary debt securities of the same obligors.

Bernanke described these innovations as reflecting the cooperative efforts of bank supervisors and the regulated institutions. Similarly, the International Monetary Fund reported at around the same time that these developments had “helped to make the banking and overall financial system more resilient.”

The myth is that the regulators failed to focus on the systemic implications of financial innovation. The reality is that the regulators were keenly interested in systemic risk. However, like their counterparts in the financial industry, the regulators thought that the innovations had reduced systemic risk. The problem was not that regulators lacked a mandate to address systemic risk. What they lacked was judgment and insight.

Myth 2: Deregulation allowed the market to adopt risky practices, such as using agency ratings of mortgage securities.

The myth is that, as one prominent policy paper put it, “Market discipline broke down as investors relied excessively on credit rating agencies.” The reality is that it was regulatory policy, not markets, that drove the use of credit agency ratings. Bank regulators, notably through a rule that took effect on January 1, 2002, gave breaks on capital requirements to banks that held assets with AA and AAA ratings from the credit rating agencies.

The market was not nearly as obsessed with ratings as were the regulators. Many rated securities were not even traded in the market. Instead, banks obtained ratings for the sole purpose of engaging in regulatory capital arbitrage, meaning that they were able to reduce capital requirements for a given risk.
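The arithmetic behind this arbitrage can be sketched in a few lines. The sketch below uses the standard Basel-era parameters (an 8% base charge, a 50% risk weight for whole residential mortgages, and the 20% risk weight that the 2002 rule granted to AA/AAA-rated securities); the $100 million portfolio is a hypothetical figure for illustration only:

```python
# Illustrative sketch of regulatory capital arbitrage under Basel-style
# risk weights. The 8% base charge and the 50%/20% risk weights reflect
# the rules of the period; the portfolio size is a made-up example.

BASE_CHARGE = 0.08  # capital charge as a fraction of risk-weighted assets

def required_capital(exposure, risk_weight):
    """Minimum capital = base charge x risk weight x exposure."""
    return BASE_CHARGE * risk_weight * exposure

portfolio = 100_000_000  # $100 million of mortgage credit risk (hypothetical)

as_whole_loans = required_capital(portfolio, 0.50)  # mortgages held as loans
as_rated_bonds = required_capital(portfolio, 0.20)  # same risk, AAA-rated security

print(f"Held as whole loans:  ${as_whole_loans:,.0f}")  # $4,000,000
print(f"Held as AAA security: ${as_rated_bonds:,.0f}")  # $1,600,000
```

By securitizing the same mortgages and holding them back as rated bonds, a bank could cut its required capital by more than half without shedding any of the underlying credit risk, which is exactly what "reducing capital requirements for a given risk" means.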

The use of credit rating agencies for regulatory capital purposes was criticized at the time. Fannie Mae and Freddie Mac warned about regulatory capital arbitrage. A group of economists calling itself the Shadow Financial Regulatory Committee warned that ratings on non-traded securities would likely be inflated. Policy makers went ahead with their approach to risk-based capital regulation in spite of these objections.

Regulatory capital arbitrage was the motivating factor in most of the financial innovations that figured prominently in the financial crisis. For example, banks could use off-balance-sheet financing for mortgage securities in Special Purpose Vehicles (SPVs) and Structured Investment Vehicles (SIVs) to avoid capital requirements altogether. Credit default swaps on mortgage securities also served to transfer risk in ways that reduced capital requirements.

Myth 3: Policy makers relied too much on market discipline to regulate financial risk taking.

Many experts, including former Federal Reserve Board Chairman Alan Greenspan, have voiced the complaint that the market proved less rational than expected in its management of risk. The implication is that markets are too unreliable and that stronger regulation is the answer.

It is certainly true that some financial executives made serious miscalculations. They themselves greatly underestimated the risks of a housing market decline and overestimated the insulation from risk that could be obtained by using sophisticated financial models and structured finance.

However, the greater flaw was in the regulatory structure, and in particular the capital regulations for banks, investment banks, and the mortgage agencies Freddie Mac and Fannie Mae. There was an absence of market discipline at these firms in large part because such a large share of the risk that they took was borne by taxpayers rather than by shareholders and management.

The Shadow Financial Regulatory Committee criticized the capital “risk bucket” approach from the very beginning, when the 1988 Basel Accord was under discussion. The economists proposed instead that banks be required to issue unsecured debt, which would serve as a layer between the risks of their assets and the insured deposits. Recently, another group of economists reiterated support for such an approach.

Myth 4: The financial crisis was primarily a short-term panic.

The financial crisis has four components:

—Bad bets, meaning unwise decisions by developers to build too many homes, by consumers to purchase too many homes, by mortgage lenders to make unwise loans, and by financial institutions that incurred too much exposure to credit risk in housing.

—Excessive leverage, meaning that the debt-to-equity ratio was so high at some key firms, such as Freddie Mac, Fannie Mae, and Bear Stearns, that only a small drop in asset values could bankrupt the firms.

—Domino effects, meaning the ways in which problems at one firm could spill over to another firm.

—21st-century bank runs, in which institutions that were using mortgage securities as collateral for short-term borrowing from other firms found that their counterparties were reluctant to renew those loans.
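The excessive-leverage component above can be made concrete with a small arithmetic sketch; the leverage ratios below are illustrative round numbers, not figures for any particular firm:

```python
# Illustrative leverage arithmetic: at high debt-to-equity ratios, a small
# decline in asset values wipes out a firm's entire equity cushion.
# The ratios used here are hypothetical round numbers.

def loss_that_erases_equity(debt_to_equity):
    """Fractional fall in asset value that consumes all equity.

    With assets = debt + equity, equity is 1 / (debt_to_equity + 1)
    of assets, so that fraction of asset value is the break-even loss.
    """
    return 1.0 / (debt_to_equity + 1.0)

for ratio in (10, 30, 60):
    pct = 100 * loss_that_erases_equity(ratio)
    print(f"{ratio}:1 leverage -> a {pct:.1f}% asset decline erases equity")
```

At 30-to-1 leverage, a roughly 3% drop in asset values is enough to render a firm insolvent, which is the sense in which "only a small drop in asset values could bankrupt the firms."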

The first two components reflect fundamental problems that developed over a period of at least a decade. The last two components reflect a financial panic that emerged abruptly in 2008.

Too many policy makers are focused only on the financial panic. For example, when Bernanke offered a retrospective on the crisis at a conference at Jackson Hole in August of 2009, he used the word “panic” more than a dozen times, but the phrase “house prices” only twice and the phrase “mortgage defaults” just once.

The biggest myth is that regulation is a one-dimensional problem, in which the choice is either “more” or “less.”

Such thinking leads policy makers to focus on tinkering with regulatory organization and regulatory powers rather than addressing deeper policy flaws. We need to be asking more fundamental questions about the costs and benefits of government support for securitization, about the aims of housing policy, and about how to reconcile the goal of providing government protection against financial risk with the need to ensure that this does not lead to unbridled risk-taking.

Myth 5: The only way to prevent this crisis would have been to have more vigorous regulation.

There is a myth that financial firms were like teenagers who started a terrible fire because of a lack of adult supervision. In fact, Congress and regulators were doing the equivalent of handing out matches, gasoline, and newspapers.

Housing policy was obsessed with increasing home purchases. This was pushed to the point where, given the lack of any down payment, the term “home ownership” is probably a misnomer. If the goal was home ownership, then the actual result was speculation and indebtedness.

The easiest way to have prevented the crisis would have been to discourage, rather than encourage, the trend toward ever lower down payments on home purchases. Maintaining a requirement for a reasonable down payment would have dampened the speculative mania that drove house prices to unsustainable levels. It would have reduced the number of mortgage defaults.

Another way to have prevented the crisis would have been to rely on something other than risk buckets and credit agency ratings to regulate bank capital. A better approach would have been to use stress tests, in which regulators would specify hypothetical scenarios for interest rates or home prices, with bank capital adequacy measured against such stress tests. Another approach, noted earlier, would have been to require financial firms to issue unsecured debt. Such debt would help insulate deposit insurance funds from fluctuations in asset prices. Moreover, if such debt is traded, then its price can be used as a market indicator of risk, giving regulators an early-warning system for problems.
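The stress-test approach can be sketched as a toy calculation: regulators specify hypothetical house-price scenarios, estimate the resulting losses, and compare them with the bank's capital. Every number below (portfolio size, capital level, shock sizes, loss sensitivity) is invented purely for illustration:

```python
# Toy sketch of a regulatory stress test: specify hypothetical house-price
# scenarios, estimate the resulting portfolio loss, and compare it with the
# bank's capital. All figures here are invented for illustration.

def stressed_loss(mortgage_assets, price_drop, loss_sensitivity):
    """Estimated loss = exposure x price drop x loss sensitivity.

    loss_sensitivity is an assumed fraction of the price decline that
    turns into credit losses (a crude stand-in for default modeling).
    """
    return mortgage_assets * price_drop * loss_sensitivity

capital = 5_000_000_000           # $5B of capital (hypothetical bank)
mortgage_assets = 80_000_000_000  # $80B of mortgage exposure (hypothetical)

for scenario, drop in [("mild", 0.05), ("moderate", 0.15), ("severe", 0.30)]:
    loss = stressed_loss(mortgage_assets, drop, 0.5)
    verdict = "adequate" if capital > loss else "SHORTFALL"
    print(f"{scenario}: {drop:.0%} price drop -> loss ${loss / 1e9:.1f}B, capital {verdict}")
```

The point of the exercise is that capital adequacy is judged against explicit adverse scenarios rather than against risk buckets or agency ratings, so a bank that looks well capitalized under a 20% risk weight can still be flagged as undercapitalized against a severe house-price decline.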

Conclusion

The biggest myth is that regulation is a one-dimensional problem, in which the choice is either “more” or “less.” From this myth, the only reasonable inference following the financial crisis is that we need to move the dial from “less” to “more.”

The reality is that financial regulation is a complex problem. Indeed, many regulatory policies were major contributors to the crisis. To proceed without examining or questioning past policies, particularly in the areas of housing and bank capital regulation, would preclude learning the lessons of history.

Arnold Kling was an economist on the staff of the board of governors of the Federal Reserve System and was a senior economist at Freddie Mac. He is a member of the Mercatus Center financial markets working group and co-hosts EconLog, a popular economics blog.

Capitalism After the Crisis

LUIGI ZINGALES

The economic crisis of the past year, centered as it has been in the financial sector that lies at the heart of American capitalism, is bound to leave some lasting marks. Financial regulation, the role of large banks, and the relationships between the government and key players in the market will never be the same.

More important, however, are the ways in which public attitudes about our system might change. The nature of the crisis, and of the government's response, now threaten to undermine the public's sense of the fairness, justice, and legitimacy of democratic capitalism. By allowing the conditions that made the crisis possible (particularly the concentration of power in a few large institutions), and by responding to the crisis as we have (especially with massive government bailouts of banks and large corporations), the United States today risks moving in the direction of European corporatism and the crony capitalism of more statist regimes. This, in turn, endangers America's unique brand of capitalism, which has thus far avoided becoming associated in the public mind with entrenched corruption, and has therefore kept this country relatively free of populist anti-capitalist sentiment.

Are such changes now beginning? And if so, will they mark only a temporary reaction to an extreme economic downturn, or a deeper and more damaging shift in American attitudes? Some early indications are not encouraging.

SOAK THE RICH

A friend of mine worked as a consultant for the now-infamous insurance giant American International Group. To prevent him from starting his own hedge fund, AIG offered him a non-compete agreement: a sum of money meant to compensate him for the opportunity forgone. It is a perfectly standard and well-regarded practice — but unfortunately for my friend, his payment under this agreement was to be made at the end of 2008. So he spent the early months of 2009 living in terror: His contract was classified as one of the notorious AIG retention bonuses. At the height of the fury against those bonuses, he received several death threats. Though he had no legal obligation to do so, he returned the money to the company, hoping that the gesture might keep his name from being published in the papers. In case that failed to protect him, he prepared a plan to evacuate his wife and children. It was the responsible thing to do; after all, angry protestors had staked out the homes of several AIG executives whose names appeared in print — and only luck had prevented someone from getting hurt.

While such extreme episodes have, fortunately, been quite rare, they are symptomatic of a broad discontent. In one recent survey, 65% of Americans said the government should cap executive compensation at large corporations, while 60% wanted the government to intervene to improve the way corporations are run. And those views hardly reflect confidence in the government: Only 5% of Americans in the same poll said they trust the government a lot, while 30% said they do not trust it at all. It is just that, at the moment, Americans trust large corporations even less: Fewer than one out of every 30 Americans say they trust them a lot, while one of every three Americans claims not to trust large corporations at all.

These attitudes are familiar to students of public opinion in much of the world. But they are quite unusual for the United States. Until recently, Americans stood out for their acceptance of basic market principles and even for their tolerance of some of the negative side effects markets produce, such as marked income inequality.

Capitalism has long enjoyed exceptionally strong public support in the United States because America's form of capitalism has long been distinct from those found elsewhere in the world — particularly because of its uniquely open and free market system. Capitalism calls not only for freedom of enterprise, but for rules and policies that allow for freedom of entry, that facilitate access to financial resources for newcomers, and that maintain a level playing field among competitors. The United States has generally come closest to this ideal combination — which is no small feat, since economic pressures and incentives do not naturally point to such a balance of policies. While everyone benefits from a free and competitive market, no one in particular makes huge profits from keeping the system competitive and the playing field level. True capitalism lacks a strong lobby.

That assertion might appear strange in light of the billions of dollars firms spend lobbying Congress in America, but that is exactly the point. Most lobbying seeks to tilt the playing field in one direction or another, not to level it. Most lobbying is pro-business, in the sense that it promotes the interests of existing businesses, not pro-market in the sense of fostering truly free and open competition. Open competition forces established firms to prove their competence again and again; strong successful market players therefore often use their muscle to restrict such competition, and to strengthen their positions. As a result, serious tensions emerge between a pro-market agenda and a pro-business one, though American capitalism has always managed this tension far better than most.

THE AMERICAN EXCEPTION

In a recent study, Rafael Di Tella and Robert MacCulloch showed that public support for capitalism in any given country is positively associated with the perception that hard work, not luck, determines success, and is negatively correlated with the perception of corruption. These correlations go a long way toward explaining public support for America's capitalist system. According to one recent study, only 40% of Americans think that luck rather than hard work plays a major role in income differences. Compare that with the 75% of Brazilians who think that income disparities are mostly a matter of luck, or the 66% of Danes and 54% of Germans who do, and you begin to get a sense of why American attitudes toward the free-market system stand out.

Some scholars argue that this perception of capitalism's legitimacy is merely the result of a successful propaganda campaign for the American Dream — a myth embedded in American culture, but not necessarily tied to reality. And it is true that the data yield scant evidence that social mobility is higher across the board in the United States than in other developed countries. But while this difference does not show up in the aggregate statistics, it is powerfully present at the top of the distribution — which often gets the most attention, and most shapes people's attitudes. Even before the internet boom created many young billionaires, in 1996, one in four billionaires in the United States could be described as "self-made" — compared to just one out of ten in Germany. And the wealthiest self-made American billionaires — from Bill Gates and Michael Dell to Warren Buffett and Mark Zuckerberg — have made their fortunes in competitive businesses, with little or no government interference or help.

The same cannot be said for most other countries, where the wealthiest people tend to accumulate their fortunes in regulated businesses in which government connections are crucial to success. Think about the oligarchs in Russia, Silvio Berlusconi in Italy, Carlos Slim in Mexico, and even the biggest tycoons in Hong Kong. They made their fortunes in businesses that are highly dependent on governmental concessions: energy, real estate, telecommunications, mining. Success in these businesses often depends more on having the right connections than on having initiative and enterprise.

In most of the world, the best way to make money is not to come up with brilliant ideas and work hard at implementing them, but to cultivate a government connection. Such cronyism is bound to shape public attitudes about a country's economic system. When asked in a recent study to name the most important determinants of financial success, Italian managers put "knowledge of influential people" in first place (80% considered it "important" or "very important"). "Competence and experience" ranked fifth, behind characteristics such as "loyalty and obedience."

These divergent paths to prosperity reveal more than a difference of perception. American capitalism really is quite distinct from its European counterparts, for reasons that reach deep into history.

THE ROOTS OF AMERICAN CAPITALISM

In America, unlike much of the rest of the West, democracy predates industrialization. By the time of the Second Industrial Revolution in the latter part of the 19th century, the United States had already enjoyed several decades of universal (male) suffrage, and several decades of widespread education. This created a public with high expectations, unlikely to tolerate evident unfairness in economic policy. It is no coincidence that the very concept of anti-trust law — a pro-market but sometimes anti-business idea — was developed in the United States at the end of the 19th century and the beginning of the 20th. It is also no coincidence that in the early part of the 20th century, fueled by an inquisitive press and a populist (but not anti-market) political movement, the United States experienced a rise in regulation aimed at reducing the power of big business. Unlike in Europe — where the most vibrant opposition to the excesses of business came from socialist anti-market movements — in the United States this opposition was squarely pro-market. When Louis Brandeis attacked the money trust, he was not fundamentally trying to interfere with markets — only trying to make them work better. As a result, Americans have long understood that the interests of the market and the interests of business may not always be aligned.

American capitalism also developed at a time when government involvement in the economy was quite weak. At the beginning of the 20th century, when modern American capitalism was taking shape, U.S. government spending was only 6.8% of gross domestic product. After World War II, when modern capitalism really took shape in Western European countries, government spending in those countries was, on average, 30% of GDP. Until World War I, the United States had a tiny federal government compared to national governments in other countries. This was due in part to the fact that the U.S. faced no significant military threat to its existence, which allowed the government to spend a relatively small proportion of its budget on the military. The federalist nature of the American regime also did its part to limit the size of the national government.

When the government is small and relatively weak, the way to make money is to start a successful private-sector business. But the larger the size and scope of government spending, the easier it is to make money by diverting public resources. Starting a business is difficult and involves a lot of risk — but getting a government favor or contract is easier, and a much safer bet. And so in nations with large and powerful governments, the state tends to find itself at the heart of the economic system, even if that system is relatively capitalist. This tends to confound politics and economics, both in practice and in public perceptions: The larger the share of capitalists who acquire their wealth thanks to their political connections, the greater the perception that capitalism is unfair and corrupt.

Another distinguishing feature of American capitalism is that it developed relatively untouched by foreign influence. Although European (and especially British) capital played a significant role in America's 19th- and early 20th-century economic development, Europe's economies were not more developed than America's — and so while European capitalists could invest in or compete with American companies, they could not dominate the system. As a result, American capitalism developed more or less organically, and still shows the marks of those origins. The American bankruptcy code, for instance, exhibits significant pro-debtor biases, because the United States was born and developed as a nation of debtors.

The situation is very different in nations that developed capitalist economies after World War II. These countries (in non-Soviet-bloc continental Europe, parts of Asia, and much of Latin America) industrialized under the giant shadow of American power. In this development process, the local elites felt threatened by the prospect of economic colonization by American companies that were far more efficient and better capitalized. To protect themselves, they purposely built a non-transparent system in which local connections were important, because this gave them an inherent advantage. These structures have proven resilient in the decades since: Once economic and political systems are built to reward relationships instead of efficiency, it is very difficult to reform them, since the people in power are the ones who would lose most in the change.

Finally, the United States was able to develop a pro-market agenda distinct from a pro-business agenda because it was largely spared the direct influence of Marxism. It is possible that the type of capitalism the United States developed is the cause, as much as the effect, of the absence of strong Marxist movements in this country. But either way, this distinction from other Western regimes was significant in the development of American attitudes toward economics. In countries with prominent and influential Marxist parties, pro-market and pro-business forces were compelled to merge to fight the common enemy. If one faces the prospect of nationalization (i.e., the control of resources by a small political elite), even relationship capitalism (which involves control of those resources by a small business elite) becomes an appealing alternative.

As a result, many of these countries could not develop a more competitive and open form of capitalism because they could not afford to divide the opposition to Marxism. Worse, the free-market banner was completely appropriated by the pro-business forces, which were better equipped and better fed. Paradoxically, as the appeal of Marxist ideas faded, this problem in many of these countries became worse, not better. After decades of contiguity and capture, the pro-market forces could not separate themselves from the pro-business camp. Having lost the ideological opposition of Marxism and lacking any opposition from pro-market ideology, pro-business forces ruled unchecked. In no country is this more evident than in Italy, where the pro-market movement today is almost literally owned by a businessman, Prime Minister Silvio Berlusconi, who often seems to run the country in the interest of his media empire.

For all these reasons, the United States developed a system of capitalism that comes closer than any other to the ideal combination of economic freedom and open competition. The image many Americans have of capitalism is therefore that of Horatio Alger's rags-to-riches-via-hard-work stories, which have come to define the American Dream. By contrast, in most of the rest of the world, Horatio Alger is unknown — and the image of social mobility is dominated by Cinderella or Evita stories: fantasies more than plausible dreams. This understanding of opportunity has helped make capitalism popular and secure in the United States.

But because the free-market system relies on this public support, and this support depends to a certain extent on the public's impression that the system is fair, any erosion of that impression threatens the system itself. Such erosion occurs when government connections, or the power of entrenched incumbents in the market, seem to overtake genuine free and fair competition as the paths to wealth and success. Both government and big business have strong incentives to push the system in this direction, and therefore both, if left unchecked, pose a threat to America's distinctive form of capitalism.

Although the United States has the great advantage of having started from a superior model of capitalism and having developed an ideology to support it, our system is still vulnerable to these pressures — and not only in a crisis. Even the most persuasive and resilient ideology cannot long outlive the conditions and reasoning that generated it. American capitalism needs vocal defenders who understand the threats it faces — and who can make its case to the public. But in the last 30 years, as the threat of global communism has waned and disappeared, capitalism's defenders have grown fewer, while the temptations of corporatism have grown greater. This has helped set the stage for the crisis we now face — and left us less able to discern how we might recover from it.

THE DEMISE OF AMERICAN EXCEPTIONALISM

A healthy financial system is crucial to any working market economy. Widespread access to finance is essential to harnessing the best talents and allowing them to prosper and grow. It is crucial for drawing new entrants into the system, and for fostering competition. The system that allocates finance allocates power and rents; if that system is not fair, there is little hope that the rest of the economy can be. And the potential for unfairness or abuse in the financial system is always great.

Americans have long been sensitive to such abuse. While we have historically avoided general anti-capitalist biases, Americans have nonetheless nurtured something of a populist anti-finance bias. This bias has led to many political decisions throughout American history that were inefficient from an economic point of view, but helped preserve the long-term health of America's democratic capitalism. In the early 1830s, President Andrew Jackson opposed renewing the charter of the Second Bank of the United States — a move that contributed to the panic of 1837 — because he saw the bank as an instrument of political corruption and a threat to American liberties. An investigation he initiated established "beyond question that this great and powerful institution had been actively engaged in attempting to influence the elections of the public officers by means of its money."

Throughout much of American history, state bank regulations were driven by concerns about the power of New York banks over the rest of the country, and the fear that big banks drained deposits from the countryside in order to redirect them to the cities. To address these fears, states introduced a variety of restrictions: from unit banking (banks could have only one office), to limits on intrastate branching (banks from northern Illinois could not open branches in southern Illinois), to limits on interstate branching (New York banks could not open branches in other states). From a purely economic point of view, all of these restrictions were crazy. They forced a reinvestment of deposits in the same areas where they were collected, badly distorting the allocation of funds. And by preventing banks from expanding, these regulations made banks less diversified and thus more prone to failure. Nevertheless, these policies had a positive side effect: They splintered the banking sector, reducing its political power and in so doing creating the preconditions for a vibrant securities market.

Even the separation between investment banking and commercial banking introduced by the New Deal's Glass-Steagall Act was a product of this longstanding American tradition. Unlike many other banking regulations, Glass-Steagall at least had an economic rationale: to prevent commercial banks from exploiting their depositors by dumping on them the bonds of firms to which the banks had lent money, but which could not repay the loans. The Glass-Steagall Act's biggest consequence, though, was the fragmentation it caused — which helped reduce the concentration of the banking industry and, by creating divergent interests in different parts of the financial sector, helped reduce its political power.

In the last three decades, these arrangements were completely overturned, starting with the progressive deregulation of the banking sector. The restrictions imposed by state regulations were highly inefficient to begin with, but over the years technological and financial progress made them absolutely untenable. What good does it do to restrict branching when banks can set up ATMs throughout the country? How effectively can a prohibition on intrastate branching block the redistribution of deposits, when non-integrated banks can reallocate them through the interbank market?

So starting in the late 1970s, state bank regulations were relaxed or eliminated, increasing the efficiency of the banking sector and fostering economic growth. But the move also increased concentration. In 1980, there were 14,434 banks in the United States, about the same number as in 1934. By 1990, this number had dropped to 12,347; by 2000, to 8,315. In 2009, the number stands below 7,100. Most important, the concentration of deposits and lending grew significantly. In 1984, the top five U.S. banks controlled only 9% of the total deposits in the banking sector. By 2001, this percentage had increased to 21%, and by the end of 2008, close to 40%.

The apex of this process was the 1999 passage of the Gramm-Leach-Bliley Act, which repealed the restrictions imposed by Glass-Steagall. Gramm-Leach-Bliley has been wrongly accused of playing a major role in the current financial crisis; in fact, it had little or nothing to do with it. The major institutions that failed or were bailed out in the last two years were pure investment banks — such as Lehman Brothers, Bear Stearns, and Merrill Lynch — that did not take advantage of the repeal of Glass-Steagall; or they were pure commercial banks, like Wachovia and Washington Mutual. The only exception is Citigroup, which had merged its commercial and investment operations even before the Gramm-Leach-Bliley Act, thanks to a special exemption.

The real effect of Gramm-Leach-Bliley was political, not directly economic. Under the old regime, commercial banks, investment banks, and insurance companies had different agendas, and so their lobbying efforts tended to offset one another. But after the restrictions were lifted, the interests of all the major players in the financial industry became aligned, giving the industry disproportionate power in shaping the political agenda. The concentration of the banking industry only added to this power.

The last and most important source of the finance industry's growing power was its profitability, at least on the books. In the 1960s, the share of GDP produced by the finance sector amounted to a little more than 3%. By the mid-2000s, it was more than 8%. This expansion was driven by a rapid increase not only in profits, but also in wages. In 1980, the relative wage of a worker in the finance sector was roughly comparable to the wages of other workers with the same qualifications in other sectors. By 2007, a worker in the finance sector was making 70% more. Every attempt to explain this gap using differences in abilities, or the inherent demands of the work, falls short. People working in finance were simply making significantly more than everybody else.

This enormous profitability allowed the industry to spend disproportionate amounts of money lobbying the political system. In the last 20 years, the financial industry has made $2.2 billion in political contributions, more than any other industry tracked by the Center for Responsive Politics. And over the last ten years, the financial industry topped the lobbying-expenses list, spending $3.5 billion.

The explosion of wages and profits in finance also naturally attracted the best talents — with implications that extended beyond the financial sector, and deep into government. Thirty years ago, the brightest undergraduates were going into science, technology, law, and business; for the last 20 years, they have gone to finance. Having devoted themselves to this sector, these talented individuals inevitably end up working to advance its interests: A person specialized in derivative trading is likely to be terribly impressed with the importance and value of derivatives, just as a nuclear engineer is likely to think nuclear power can solve all the world's problems. And if most of the political elite were picked from among nuclear engineers, it would be only natural that the country would soon fill with nuclear plants. In fact, we have an example of precisely this scenario in France, where for complicated cultural reasons an unusually large portion of the political elite is trained in engineering at the École Polytechnique — and France derives more of its energy from nuclear power than any other nation.

A similar effect is evident with finance in America. The proportion of people with training and experience in finance working at the highest levels of every recent presidential administration is extraordinary. Four of the last six secretaries of the Treasury fit this description. In fact, all four were directly or indirectly connected to one firm: Goldman Sachs. This is hardly the historical norm; of the previous six Treasury secretaries, only one had a finance background. And finance-trained executives staff not only the Treasury but many senior White House posts and key positions in numerous other departments. President Barack Obama's chief of staff, Rahm Emanuel, once worked for an investment bank, as did his predecessor under President George W. Bush, Joshua Bolten.

There is nothing intrinsically bad about these developments. In fact, it is only natural that a government in search of the brightest people will end up poaching from the finance world, to which the best and brightest have flocked. The problem is that people who have spent their entire lives in finance have an understandable tendency to think that the interests of their industry and the interests of the country always coincide. When Treasury Secretary Henry Paulson went to Congress last fall arguing that the world as we knew it would end if Congress did not approve the $700 billion bailout, he was serious and speaking in good faith. And to an extent he was right: His world — the world he lived and worked in — would have ended had there not been a bailout. Goldman Sachs would have gone bankrupt, and the repercussions for everyone he knew would have been enormous. But Henry Paulson's world is not the world most Americans live in — or even the world in which our economy as a whole exists. Whether that world would have ended without Congress's bailout was a far more debatable proposition; unfortunately, that debate never took place.

Compounding the problem is the fact that people in government tend to rely on their networks of trusted friends to gather information "from the outside." If everyone in those networks is drawn from the same milieu, the information and ideas that flow to policymakers will be severely limited. A revealing anecdote comes from a Bush Treasury official, who noted that in the heat of the financial crisis, every time there was a phone call from Manhattan's 212 area code, the message was the same: "Buy the toxic assets." Such uniformity of advice makes it difficult for even the most intelligent or well-meaning policymakers to arrive at the right decisions.

THE VICIOUS CYCLE

The finance sector's increasing concentration and growing political muscle have undermined the traditional American understanding of the difference between free markets and big business. This means not only that the interests of finance now dominate the economic understanding of policymakers, but also — and perhaps more important — that the public's perception of the economic system's legitimacy is at risk.

If the free-market system is politically fragile, its most fragile component is precisely the financial industry. It is so fragile because it relies entirely on the sanctity of contracts and the rule of law, and that sanctity cannot be preserved without broad popular support. When people are angry to the point of threatening the lives of bankers; when the majority of Americans are demanding government intervention not only to regulate the financial industry but to control the way companies are run; when voters lose confidence in the economic system because they perceive it as fundamentally corrupt — then the sanctity of private property becomes threatened as well. And when property rights are not protected, the survival of an effective financial sector, and with it a thriving economy, is in doubt.

The government's involvement in the financial sector in the wake of the crisis — and particularly the bailouts of large banks and other institutions — has exacerbated this problem. Public mistrust of government has combined with mistrust of bankers, and concerns about the waste of taxpayer dollars have been joined to worries about rewarding those who caused the mess on Wall Street. In response, politicians have tried to save themselves by turning against the finance sector with a vengeance. That the House of Representatives approved a proposal to retroactively tax 90% of all bonuses paid by financial institutions receiving TARP money shows how dangerous this combination of backlash and demagoguery can be.

Fortunately, that particular proposal never became law. But the anti-finance climate that produced it greatly contributed, for instance, to the expropriation of Chrysler's secured creditors this spring. By singling out and publicly condemning the Chrysler creditors who demanded that their contractual rights be respected, President Obama effectively exploited public resentment to reduce the government's costs in the Chrysler bailout. But the cost-cutting came at the expense of current investors, and sent a signal to all potential future investors. While Obama's approach was convenient in the short term, it could prove devastating to the market system over time: The protection afforded to secured creditors is crucial in making credit available to firms in financial distress and even in Chapter 11. The Chrysler precedent will jeopardize access to such financing in the future, particularly for the firms most in need, and so will increase the pressure for yet more government involvement.

The pattern that has taken hold in the wake of the financial crisis thus threatens to initiate a vicious cycle. To avoid being linked in the public mind with the companies they are working to help, politicians take part in and encourage the assault on finance; this scares off legitimate investors, no longer certain they can count on contracts and the rule of law. And this, in turn, leaves little recourse for troubled businesses but to seek government assistance.

It is no coincidence that shortly after bashing Wall Street executives for their greed, the administration set up the most generous form of subsidy ever invented for Wall Street. The Public-Private Investment Program, announced in March by Treasury Secretary Timothy Geithner, provides $84 of government-subsidized loans and $7 of government equity for every $7 of private equity invested in the purchase of toxic assets. The terms are so generous that the private investors essentially receive a subsidy of $2 for every dollar they put in.
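The asymmetry these terms create can be sketched with a toy payoff calculation. This is my own illustration using the essay's round numbers; the function name and the 20% up/down scenarios are invented for the example, not taken from the program's actual term sheet:

```python
# Toy sketch of PPIP-style payoffs, using the essay's round numbers:
# for every $7 of private equity, the government adds $7 of equity and
# $84 of non-recourse loans, so $98 of toxic assets are purchased.
def ppip_private_payoff(asset_value, private_eq=7.0, gov_eq=7.0, gov_loan=84.0):
    """Private investors' payoff when the purchased assets are later
    worth `asset_value`. The loan is repaid first; because it is
    non-recourse, equity holders cannot lose more than their stake."""
    equity_value = max(asset_value - gov_loan, 0.0)
    return equity_value * private_eq / (private_eq + gov_eq)

# Upside: assets rise 20% (98 -> 117.6); the private stake more than doubles.
print(round(ppip_private_payoff(117.6), 1))  # 16.8 on a $7 stake
# Downside: assets fall 20% (98 -> 78.4); private investors lose only their
# $7, while the government absorbs the remaining $12.6 of the $19.6 loss.
print(ppip_private_payoff(78.4))  # 0.0
```

Under these assumptions the private investor keeps half of any upside but bears barely a third of the downside, which is the sense in which the structure works as a subsidy.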

If these terms are "justified" by the uncertainty stemming from the populist backlash, they also exacerbate the conditions that generated the backlash in the first place — confirming the sense that government and large market players are cooperating at the expense of the taxpayer and the small investor. If the Public-Private Investment Program works, the very people who created the problem stand to grow fabulously rich with government help — which will surely do no good for the public's impression of American capitalism.

This is just the unhealthy cycle in which capitalism is trapped in most countries around the world. On one hand, entrepreneurs and financiers feel threatened by public hostility, and thus justified in seeking special privileges from the government. On the other hand, ordinary citizens feel outraged by the privileges the entrepreneurs and financiers receive, inflaming that very hostility. For anyone acquainted with the character of capitalism around the world, this moment in America feels eerily familiar.

THE FUTURE OF AMERICAN CAPITALISM

We thus stand at a crossroads for American capitalism. One path would channel popular rage into political support for some genuinely pro-market reforms, even if they do not serve the interests of large financial firms. By appealing to the best of the populist tradition, we can introduce limits to the power of the financial industry — or any business, for that matter — and restore those fundamental principles that give an ethical dimension to capitalism: freedom, meritocracy, a direct link between reward and effort, and a sense of responsibility that ensures that those who reap the gains also bear the losses. This would mean abandoning the notion that any firm is too big to fail, and putting rules in place that keep large financial firms from manipulating government connections to the detriment of markets. It would mean adopting a pro-market, rather than pro-business, approach to the economy.

The alternative path is to soothe the popular rage with measures like limits on executive bonuses while shoring up the position of the largest financial players, making them dependent on government and making the larger economy dependent on them. Such measures play to the crowd in the moment, but threaten the financial system and the public standing of American capitalism in the long run. They also reinforce the very practices that caused the crisis. This is the path to big-business capitalism: a path that blurs the distinction between pro-market and pro-business policies, and so imperils the unique faith the American people have long displayed in the legitimacy of democratic capitalism.

Unfortunately, it looks for now like the Obama administration has chosen this latter path. It is a choice that threatens to launch us on that vicious spiral of more public resentment and more corporatist crony capitalism so common abroad — trampling in the process the economic exceptionalism that has been so crucial for American prosperity. When the dust has cleared and the panic has abated, this may well turn out to be the most serious and damaging consequence of the financial crisis for American capitalism.

Luigi Zingales is the Robert C. McCormack Professor of Entrepreneurship and Finance at the University of Chicago Booth School of Business, and co-author of Saving Capitalism from the Capitalists.

Animal Planet Vs. Economic Reasoning

Thomas F. Cooley

Do "behavioralists" really offer a solution to the crisis?

Forget about rugby: Excoriating economists should be the new Olympic sport. We economists are an easy target because, whether we like it or not, we're in the prediction business. It's not news when we get it right. But when we get it wrong, especially this wrong, well, any idiot could have, should have, ought to have foreseen the up, the down, the collapse, the recovery ... You get the general idea.

Interestingly, the harshest critics are some of our colleagues who haven't themselves been involved in serious research for many years, and who are now taking it upon themselves to anoint the worthy and demean the unworthy. But it is very odd to see a periodical like The Economist--which has generally shown more sense--publish a terribly one-sided and thoughtless critique of contemporary economics. It is especially unfortunate because it distracts attention from a serious discussion of the lessons to be learned from the financial crisis.

Did the economics profession really blow it by not foreseeing and preventing the financial crisis of 2007-2008? There is no question that it did. The financial system had evolved into a very precarious state, and most economists (along with most bankers, journalists, politicians and policy makers) didn't realize it--and certainly didn't realize the severity of it--when it began to unfold.

Should economists, in particular, have known better? Well, they did know that a financial crisis would occur. Financial crises are a recurring phenomenon just as business cycles are. It also has to be said that, like the business cycle, they are a mystery. Nobody would "choose" to have a recession. Nobody benefits. Similarly nobody benefits from a crisis--from a meltdown in the financial system. But we can learn from these events. We can try to understand the market failures and institutional weaknesses that led to them. Or, we can satisfy rapacious egos by heaping scorn on everyone who could have "let this happen."

In the 20th century, the U.S. alone experienced at least five major crises: the Panic of 1907; the severe contraction in 1921; the Great Depression; the failure of Continental Illinois Bank in 1984--a potentially systemic failure; and the Savings and Loan crisis of the 1980s. There have also been several global crises, including the Latin American Debt crisis of the 1980s, the Asian Financial Crisis of the 1990s, the Swedish Banking crisis of the 1990s, the Japanese Banking crisis and the subsequent "lost decade," and the Mexican Peso Crisis of 1994.

There are important lessons to be learned by looking at which responses to crises have been most successful and which have failed. In that sense, positive analysis can be usefully prescriptive in a way that mere descriptions cannot. Positive analyses lead to useful innovations. Descriptions generally do not. Examples of successful institutions that have emerged as a consequence of previous financial crises include the Federal Reserve, created in response to the Panic of 1907, the FDIC and the SEC. They were successful because they were each based on a correct analysis of a market failure. And they addressed the market failure in a way that did not unduly stifle innovation.

A common theme in discussions of the financial crisis of the past several years has been that it illustrates the failure of the market-driven view of economic activity. On this view, the past several decades of opening up markets, removing regulatory restrictions and trusting markets to discipline themselves have destabilized the financial system.

A companion view is that we should dispense with our belief in markets. Rather, we can best understand what happens in markets as behavioral phenomena--like herd behavior--where market participants all move in the same direction in waves of pessimism and optimism. Certainly, herd behavior is a useful analogy for thinking about market bubbles and collapses. We've all seen the Animal Planet documentaries that feature huge herds of impala stampeded by a hungry lion or two. Hence the description of this market behavior as being driven by "animal spirits." But it is an analogy, not an analysis.

But there is an important distinction to be made between description and explanation. The notion of herd behavior or animal spirits carries with it no positive prescription for policy. Nor does it help to understand the market failures (the concentration of unpriced systemic and liquidity risk) that led to this crisis.

In a recent book, Animal Spirits: How Human Psychology Drives the Economy, and Why It Matters for Global Capitalism, two fine and respected economists, George Akerlof and Robert Shiller, argue that a psychologically enriched version of standard models of economics is needed. It is an interesting and thoughtful book, but I'm not sure how it moves the analytical process forward.

Behavioral economics is based on the observation that people have an emotional, irrational side that is not well-captured by mainstream economic models. And mainstream economic models have been greatly improved by considering these aspects of behavior. They now routinely take into account habits, learning, and myopia, to name just a few. But behavioralists, like psychologists, tend to be interested in the behavior of individuals for the most part, while economists are interested in explaining the behavior of large groups of people interacting in markets. Understanding market failures requires framing problems that way.

There is an interesting object lesson from the early days of the financial crisis that is documented by my colleague Lasse Pedersen in a paper called "When Everyone Runs for the Exit." Beginning in early 2007, liquidity in financial markets began to dry up as it became clear that a large number of subprime borrowers would end up in default on their mortgage obligations. The liquidity shortage quickly spilled over to equity markets, to other credit markets, to currency carry-trade markets, and so on. It occurred because many levered liquidity-providing traders had common features in their portfolios.

Was this because of herd behavior? The answer is: not necessarily. Many of these traders were pursuing very different investment strategies with very different objectives that would appear uncorrelated in normal times. Suddenly they all began to liquidate portfolios. Did "animal spirits" turn them all pessimistic? No. What changed for them in 2007 is that liquidity became scarce. This put pressure on them to liquidate what they could. As they tried to liquidate, prices fell, requiring them to liquidate more--and this led to a downward spiral. It wasn't herd mentality; it was a market failure. The market failure was that they had not priced the liquidity risk in these levered markets, and investors had not been rewarded for bearing that risk.
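The forced-deleveraging loop described here can be illustrated with a toy simulation. To be clear, this is my own sketch with invented parameters, not the model in Pedersen's paper: a levered trader must keep equity above a margin fraction of position value, selling to meet the requirement depresses the price, and the lower price reopens the shortfall.

```python
# Toy fire-sale spiral (invented parameters): a levered trader must
# keep equity >= margin * position value, and selling moves the price.
def liquidity_spiral(price=100.0, holdings=1.0, debt=92.0,
                     margin=0.10, impact=0.5, rounds=5):
    """Each round the trader sells just enough, at the current price,
    to meet the margin requirement; the sale itself depresses the
    price, which reopens the shortfall and forces further sales."""
    history = [price]
    for _ in range(rounds):
        equity = price * holdings - debt
        if equity <= 0 or holdings <= 0:
            break  # trader is wiped out
        target = equity / (margin * price)  # largest position margin allows
        if target >= holdings:
            break  # requirement already met; spiral stops
        sold = holdings - target
        debt -= sold * price        # sale proceeds repay debt
        holdings = target
        price -= impact * sold      # crude linear price-impact assumption
        history.append(price)
    return history

prices = liquidity_spiral()
print(round(prices[1], 2))  # 99.9 -- each forced sale pushes the price lower
```

No pessimism enters the simulation anywhere: every step is a mechanical response to a binding liquidity constraint, which is the distinction the text draws between a market failure and "animal spirits."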

Now we have a problem described in a manner that lends itself to a useful, broader policy discussion: how liquidity risk--and indeed risk management as a whole--needs to be part of a broad view of market dynamics.

While the behavioralists have enriched the conversation, they have a long way to go before "animal spirits" turns into a set of policies to ward off similar debacles in the future.

Thomas F. Cooley, the Paganelli-Bull professor of economics and Richard R. West dean of the NYU Stern School of Business, writes a weekly column for Forbes. He is a contributor to a Stern School book on the financial crisis, Restoring Financial Stability (Wiley, 2009).

Tax-and-Offend Charlie Rangel

By Steven Malanga

Charlie Rangel waited more than a decade to ascend to the chairmanship of the powerful House Ways and Means Committee, which shapes our country's tax laws, but in less than two years in that position he's fared poorly in the glare of the national spotlight.

Rangel's woes, ranging from unpaid taxes to undisclosed assets, shouldn't be shocking to anyone who has followed his four decades in office. For much of that time Rangel operated with impunity a powerful patronage machine in Harlem, centered on several controversial, government-funded groups that spent millions of dollars with no records to back up the expenditures, failed to pay withholding taxes on some employees, billed the government for a host of questionable expenses, and "pumped millions of dollars into New York's underground economy through off-the-books payments," according to investigators.

In a sane world, Rangel's sad history with organizations like the Harlem Urban Development Corp. and the Apollo Theater Foundation would have raised serious questions about whether he should assume so important a position as head of the House Ways and Means Committee. That it didn't, and that the transgressions at these groups were treated as merely a local matter that passed quietly away, exemplifies how, since the War on Poverty, we have tolerated politicians who create or take over community organizations financed by tax dollars and operate them as little more than patronage mills. Sometimes these groups are merely ineffective and waste government money, but at other times they work in blatantly questionable ways, as when HUDC ignored tax laws. Is it any surprise that a politician who maneuvers freely for decades in such an environment would be so slipshod about paying his taxes and filing financial disclosure forms?

The history and evolution of the Harlem development group HUDC illustrates how the good intentions of the War on Poverty were quickly subverted by local pols like Rangel. Created in 1971 as a subsidiary of the New York Urban Development Corp., the HUDC was supposed to help jumpstart the economy of Harlem with government dollars handed over to community leaders, a dubious proposition at best. Rangel joined the board in 1976, several years after being elected to Congress, according to an investigation of the group, and soon thereafter the organization's board asked for the special privilege of operating with more "autonomy" and independence from the state. This unusual request, granted to no other local development group, was racial in nature.

"There was and continues to be a feeling at HUDC that it is necessary to demonstrate to...the white power structure in general that ‘uptown' [that is, HUDC] could do as good a job as ‘downtown' [state officials] in developing large projects," wrote an attorney for the state in a memorandum uncovered by investigators about the special status of HUDC. Under pressure, the state acceded to the group's request for autonomy and eventually allowed the Harlem group to run completely on its own, electing and appointing its own board members and lobbying legislators for its own funding. In 1983, when former computer company executive Bill Stern took over as head of the state's economic development agency under Governor Mario Cuomo, Stern was surprised to learn the HUDC was beyond his purview.

"I met with HUDC executive director Donald Cogsville to find out what his public-benefit corporation was doing," Stern wrote in 1997 in City Journal. "Mr. Cogsville didn't have much patience with me. When I asked him why he had a state car and driver, he asked me if I was a racist."

Stern made some inquiries. "I called Cuomo's chief of staff, Michael DelGiudice. I asked him for the story on HUDC. He said, ‘Bill, HUDC is Charlie'--Charles Rangel, that is, the longtime congressman from Harlem. HUDC was a patronage machine for Rangel's local cronies."

The long, sweet deal came to an end when a little-known Republican state senator, George Pataki, unexpectedly upset Mario Cuomo in the 1994 gubernatorial elections and subsequently decided to challenge Rangel and rein in HUDC. A 1997 audit found that the agency had received nearly $100 million over its lifetime but had not completed a single new economic project in Harlem, hardly an endorsement of the development prowess of the boys from "uptown." Meant to invest in Harlem, the group had an operating budget, which included among other items $1,650 in car washes billed to taxpayers, three times the size of its investment budget. The group's biggest contribution seemed to be the injection of millions of dollars into the area's "underground economy" via off-the-books payments to employees listed as consultants. The residents of Harlem would have been better off if the HUDC had merely distributed the $100 million directly to them.

Shortly thereafter another Rangel-headed institution came unwound: the Apollo Theater Foundation, a nonprofit group which ran the Harlem theater with a heavy dose of government money. The group's board had given a deal to Inner City Broadcasting, run by former Rangel political ally Percy Sutton, to produce the television series "It's Showtime at the Apollo," but collected practically nothing from Sutton in payments over four years for the privilege, in the process straining the theater's budget. As a result, despite some $16 million in public money invested to restore the theater, under the Rangel-led foundation the Apollo had fallen into disrepair and was dark on most nights.

Eventually Rangel agreed to step down from the foundation's board and Time Warner agreed to take over the theater and help Sutton pay back fees he owed for rights to the show. Conveniently for both Rangel and Sutton, the state attorney general who originally pressed the case against them and sought more than $4 million from Sutton, Republican Dennis Vacco, lost his post in the November 1998 elections to Democrat Eliot Spitzer, who upon assuming office promptly reduced the amount Sutton and Time Warner needed to pay to $1 million and then closed the case.

Rangel and his supporters explained away these affairs by claiming that he was just too kind-hearted and easygoing with those who worked for him. "He trusted others who have not served him well," was how one off-the-record source put it in a largely adulatory 2000 profile of Rangel in the New Yorker, which also noted that, "No one has ever thought he lined his own pockets."

Fortunately for Rangel, while all of these investigations were going on in the mid-1990s the Democrats were out of power in the House of Representatives. And so not many people cared when Dan Rostenkowski, the Illinois representative who was the senior Democrat on Ways and Means, retired, putting Rangel next in line for the chairmanship of the powerful committee whenever his party regained control of the House. That took another 11 years, by which time the indiscretions of the mid-1990s seemed to have faded away.

But Rangel hasn't been able to leave his past behind so easily, because the HUDC and Apollo scandals were part of a pattern, it seems. In July of last year the New York Times disclosed that for years a developer who was a political supporter of Rangel's had allowed him to lease four rent-controlled apartments in a single building at far below market rents (including one apartment he used for a campaign office), in violation of New York State law, which says that a rent-controlled apartment can be used only as one's primary residence. Three of the rent-controlled apartments are actually adjacent to one another--something that's nearly impossible to find in New York--allowing Rangel to create a super-spacious residence with "custom moldings and dramatic archways," Benin Bronze statues and antique carved walnut Italian chairs, according to the book "Style and Grace: African Americans at Home."

Soon after these disclosures, the New York Post reported that Rangel failed to list rental income from a villa he owned in the Dominican Republic on his tax forms. Then, earlier this year a private watchdog group reported that over the last 30 years Rangel had failed to disclose numerous assets he'd acquired, and Rangel himself later admitted those assets included real estate and a checking account with somewhere between a $250,000 and half a million dollar balance. All of these circumstances are now under investigation by the House's ethics committee.

Although patronage is nothing new in American politics, once upon a time it meant handing out a few jobs to workers at a local political club to cement their loyalty and demanding payoffs from businessmen looking for government contracts. But starting in the 1960s government changed the game when it started funding a wide range of community groups, a process which quickly became politicized. Dysfunctional programs like the Community Development Block Grant--scandal-ridden for decades--grew and thrived because so many of the groups that got funding were tied to local politicians who used them as patronage mills operating under the pretense that they were helping their community. The new kind of official patronage turned out to be a great racket which encouraged in some pols a notion that they could run their affairs with impunity because, after all, they are doing good. As part of his presidential agenda, candidate Obama pledged to dramatically increase the size of programs like CDBG because, after all, they do so much good.

Rangel has done so much good for himself and his friends that he helped an organization squander almost $100 million in Harlem. He grabbed four rent-controlled apartments for himself that might have gone to families needing housing, and gave a friend a sweetheart deal that contributed to the decay of a Harlem landmark. Next to all of this, a little bit of unreported income seems like small potatoes, although in the end that's probably what will sink Charlie Rangel. If he had just stuck to the legalized patronage of wasting millions of taxpayer dollars on ineffective groups run by political friends and allies, he wouldn't be in this mess.

Democrats Losing Seniors

By Dick Morris

Nowhere is the fallout from Obama's healthcare proposals more evident than among the elderly, and nothing is more dangerous permanently for the Democratic Party than their increasing disaffection.

A Wall Street Journal poll taken last week reflects a gain by Republicans in party identification, closing the gap from 40-33 in April in favor of the Democrats to a Democratic margin of only 35-34. The data reflects that one-third of this six-point closure of the partisan gap comes from a major shift among the elderly - the only demographic group to have moved dramatically.

In April, the elderly broke evenly on their party identification, with 37 percent supporting each political party. Now the Republicans hold a lead, at 46-33. This 13-point closure among the 14 percent of the vote that is cast by those over 65 represents two of the six points of closure nationally.

No other group changed nearly as much. Neither liberals nor minorities nor any other age group moved nearly as dramatically as did the elderly. The Journal's pollsters noted that "perhaps the most striking movement is with senior citizens."

The Democratic Party, led by Obama, is systematically converting the elderly vote into a Republican bastion. The work of FDR in passing Social Security in 1935 and of LBJ in enacting Medicare in 1965 is being undone by the president's healthcare program. The elderly see his proposals for what they are: a massive redistribution of healthcare away from the elderly and toward a population that is younger, healthier and richer but happens, at the moment, to lack insurance. (Remember that the uninsured are, by definition, not elderly, not young and not in poverty - and if they are, they are currently eligible for Medicare, Medicaid or SCHIP and do not need the Obama program.) The elderly see the $500 billion projected cut in Medicare through the same lens as they viewed Gingrich's efforts to slice the growth in the program in the mid-1990s.

When the president addresses Congress and the nation on Wednesday night, he will likely indicate a willingness to compromise on aspects of his program. He might attenuate his support of the public insurance option and could soften other aspects of his proposal as it is embodied in the House bill.

But the fundamental equation will not change: He is cutting Medicare spending and using the money to subsidize coverage of those who are now uninsured but cannot afford to pay full premiums. It is this equation that has the elderly up in arms.

And our seniors correctly understand that you cannot extend full health benefits to some portion of the 50 million who live here and lack insurance without causing rationing of existing health services unless you expand the number of doctors and nurses and the amount of medical equipment.

When President Harry Truman first proposed compulsory health insurance in 1949, he coupled his proposal with a big increase in federal aid to medical education. He grasped the fundamental reality that you cannot expand coverage without expanding the number of people who provide the service - unless you are prepared to resort to wholesale rationing.

If Democratic senators and congressmen believe that the elderly will recover from their Republican tendencies by Election Day 2010 - or even by 2012 or 2014 - they misjudge their senior constituents. The elderly are the group most dependent on government services, and they follow politics with an attention that only the needy can give.

They will not forget if the Democrats push through cuts in Medicare and then ask for their support in the next election. Their memories are long and they turn out in huge numbers. Until now, these traits have worked to the advantage of the Democrats. Now they are increasingly likely to deliver Congress and the White House to the Republicans.

Cass Sunstein - Pres Obama's Regulatory Czar - views on 2nd Amendment

Part 4: 09/09/09 Freedom Watch 29 w/ Ron Paul, Peter Schiff, Nick Gillespie, more

Freedom Watch 9-9-09-Ron Paul-Peter Schiff-Plus More Part II

Freedom Watch 9-9-09-Ron Paul-Peter Schiff-Plus More Part V

Freedom Watch 9-9-09-Ron Paul-Peter Schiff-Plus More Part III

Too late for Obama to turn it around?

Camille Paglia
What a difference a month makes! When my last controversial column posted on Salon in the second week of August, most Democrats seemed frozen in suspended animation, not daring to criticize the Obama administration's bungling of healthcare reform lest it give aid and comfort to the GOP. Well, that ice dam sure broke with a roar. Dissident Democrats found their voices, and by late August even the liberal lemmings of the mainstream media, from CBS to CNN, had drastically altered their tone of reportage, from priggish disdain of the town hall insurgency to frank admission of serious problems in the healthcare bills as well as of Obama's declining national support.

But this tonic dose of truth-telling may be too little too late. As an Obama supporter and contributor, I am outraged at the slowness with which the standing army of Democratic consultants and commentators publicly expressed discontent with the administration's strategic missteps this year. I suspect there had been private grumbling all along, but the media warhorses failed to speak out when they should have -- from week one after the inauguration, when Obama went flat as a rug in letting Congress pass that obscenely bloated stimulus package. Had more Democrats protested, the administration would have felt less arrogantly emboldened to jam through a cap-and-trade bill whose costs have made it virtually impossible for an alarmed public to accept the gargantuan expenses of national healthcare reform. (Who is naive enough to believe that Obama's plan would be deficit-neutral? Or that major cuts could be achieved without drastic rationing?)


By foolishly trying to reduce all objections to healthcare reform to the malevolence of obstructionist Republicans, Democrats have managed to destroy the national coalition that elected Obama and that is unlikely to be repaired. If Obama fails to win reelection, let the blame be first laid at the door of Speaker of the House Nancy Pelosi, who at a pivotal point threw gasoline on the flames by comparing angry American citizens to Nazis. It is theoretically possible that Obama could turn the situation around with a strong speech on healthcare to Congress this week, but after a summer of grisly hemorrhaging, too much damage has been done. At this point, Democrats' main hope for the 2012 presidential election is that Republicans nominate another hopelessly feeble candidate. Given the GOP's facility for shooting itself in the foot, that may well happen.

This column has been calling for heads to roll at the White House from the get-go. Thankfully, they do seem to be falling faster -- as witness the middle-of-the-night bum's rush given to "green jobs" czar Van Jones last week -- but there's a long way to go. An example of the provincial amateurism of current White House operations was the way the president's innocuous back-to-school pep talk got sandbagged by imbecilic support materials soliciting students to write fantasy letters to "help" the president (a coercive directive quickly withdrawn under pressure). Even worse, the entire project was stupidly scheduled to conflict with the busy opening days of class this week, when harried teachers already have their hands full. Comically, some major school districts, including New York City, were not even open yet. And this is the gang who wants to revamp national healthcare?

Why did it take so long for Democrats to realize that this year's tea party and town hall uprisings were a genuine barometer of widespread public discontent and not simply a staged scenario by kooks and conspirators? First of all, too many political analysts still think that network and cable TV chat shows are the central forums of national debate. But the truly transformative political energy is coming from talk radio and the Web -- both of which Democrat-sponsored proposals have threatened to stifle, in defiance of freedom of speech guarantees in the Bill of Rights. I rarely watch TV anymore except for cooking shows, history and science documentaries, old movies and football. Hence I was blissfully free from the retching overkill that followed the deaths of Michael Jackson and Ted Kennedy -- I never saw a single minute of any of it. It was on talk radio, which I have resumed monitoring around the clock because of the healthcare fiasco, that I heard the passionate voices of callers coming directly from the town hall meetings. Hence I was alerted to the depth and intensity of national sentiment long before others who were simply watching staged, manipulated TV shows.



Why has the Democratic Party become so arrogantly detached from ordinary Americans? Though they claim to speak for the poor and dispossessed, Democrats have increasingly become the party of an upper-middle-class professional elite, top-heavy with journalists, academics and lawyers (one reason for the hypocritical absence of tort reform in the healthcare bills). Weirdly, given their worship of highly individualistic, secularized self-actualization, such professionals are as a whole amazingly credulous these days about big-government solutions to every social problem. They see no danger in expanding government authority and intrusive, wasteful bureaucracy. This is, I submit, a stunning turn away from the anti-authority and anti-establishment principles of authentic 1960s leftism.

How has "liberty" become the inspirational code word of conservatives rather than liberals? (A prominent example is radio host Mark Levin's book "Liberty and Tyranny: A Conservative Manifesto," which was No. 1 on the New York Times bestseller list for nearly three months without receiving major reviews, including in the Times.) I always thought that the Democratic Party is the freedom party -- but I must be living in the nostalgic past. Remember Bob Dylan's 1964 song "Chimes of Freedom," made famous by the Byrds? And here's Richie Havens electrifying the audience at Woodstock with "Freedom! Freedom!" Even Linda Ronstadt, in the 1967 song "Different Drum," with the Stone Poneys, provided a soaring motto for that decade: "All I'm saying is I'm not ready/ For any person, place or thing/ To try and pull the reins in on me."

But affluent middle-class Democrats now seem to be complacently servile toward authority and automatically believe everything party leaders tell them. Why? Is it because the new professional class is a glossy product of generically institutionalized learning? Independent thought and logical analysis of argument are no longer taught. Elite education in the U.S. has become a frenetic assembly line of competitive college application to schools where ideological brainwashing is so pandemic that it's invisible. The top schools, from the Ivy League on down, promote "critical thinking," which sounds good but is in fact just a style of rote regurgitation of hackneyed approved terms ("racism, sexism, homophobia") when confronted with any social issue. The Democratic brain has been marinating so long in those clichés that it's positively pickled.

Throughout this fractious summer, I was dismayed not just at the self-defeating silence of Democrats at the gaping holes or evasions in the healthcare bills but also at the fogginess or insipidity of articles and Op-Eds about the controversy emanating from liberal mainstream media and Web sources. By a proportion of something like 10-to-1, negative articles by conservatives were vastly more detailed, specific and practical about the proposals than were supportive articles by Democrats, which often made gestures rather than arguments and brimmed with emotion and sneers. There was a glaring inability in most Democratic commentary to think ahead and forecast what would or could be the actual snarled consequences -- in terms of delays, denial of services, errors, miscommunications and gross invasions of privacy -- of a massive single-payer overhaul of the healthcare system in a nation as large and populous as ours. It was as if Democrats live in a utopian dream world, divorced from the daily demands and realities of organization and management.


But dreaming in the 1960s and '70s had a spiritual dimension that is long gone in our crassly materialistic and status-driven time. Here's a gorgeous example: Bob Welch's song "Hypnotized," which appears on Fleetwood Mac's 1973 album "Mystery to Me." (The contemplative young man in this recent video is not Welch.) It's a peyote dream inspired by Carlos Castaneda's fictionalized books: "They say there's a place down in Mexico/ Where a man can fly over mountains and hills/ And he don't need an airplane or some kind of engine/ And he never will." This exhilarating shamanistic vision (wonderfully enhanced by Christine McVie's hymnlike backing vocal) captures the truth-seeking pilgrimages of my generation but also demonstrates the dangerous veering away from mundane social responsibilities. If the left is an incoherent shambles in the U.S., it's partly because the visionaries lost their bearings on drugs, and only the myopic apparatchiks and feather-preening bourgeois liberals are left. (I addressed the drugs cataclysm in "Cults and Cosmic Consciousness: Religious Vision in the American 1960s" in the Winter 2003 issue of Arion.)

Having said all that about the failures of my own party, I am not about to let Republicans off the hook. What a backbiting mess the GOP is! It lacks even one credible voice of traditional moral values on the national stage and is addicted to sonorous pieties of pharisaical emptiness. Republican politicians sermonize about the sanctity of marriage while racking up divorces and sexual escapades by the truckload. They assail government overreach and yet support interference in women's control of their own bodies. Advanced whack-a-mole is clearly needed for that yammering smarty-pants Newt Gingrich, who is always so very, very pleased with himself but has yet to produce a single enduring thought. The still inexplicably revered George W. Bush ballooned our national deficits like a drunken sailor and clumsily exacerbated the illegal immigration debate. And bizarrely, the hallucinatory Dick Cheney, a fake-testosterone addict who spooked Bush into a pointless war, continues to be lauded as presidential material.



Which brings us to Afghanistan: Let's get the hell out! While I vociferously opposed the incursion into Iraq, I was always strongly in favor of bombing the mountains of Afghanistan to smithereens in our search for Osama bin Laden and al-Qaida training camps. But committing our land forces to a long, open-ended mission to reshape the political future of that country has been a fool's errand from the start. Every invader has been frustrated and eventually defeated by that maze-like mountain terrain, from Alexander the Great to the Soviet Union. In a larger sense, outsiders will never be able to fix the fate of the roiling peoples of the Near East and Greater Middle East, who have been disputing territorial borderlines and slaughtering each other for 5,000 years. There is too much lingering ethnic and sectarian acrimony for a tranquil solution to be possible for generations to come. The presence of Western military forces merely inflames and prolongs the process and creates new militias of patriotic young radicals who hate us and want to take the war into our own cities. The technological West is too infatuated with easy fixes. But tribally based peoples think in terms of centuries and millennia. They know how to wait us out. Our presence in Afghanistan is not worth the price of any more American lives or treasure.

In response to persistent queries, I must repeat: No, I do not have a Facebook page, nor am I a "friend" on anyone else's Facebook. Nor do I Twitter. This Salon column is my sole Web presence. Whatever doppelgänger Camille Paglias are tripping the light fantastic out there (as in the haunted bus-station episode of "The Twilight Zone"), they aren't me!

Camille Paglia's column appears on the second Wednesday of each month. Every third column is devoted to reader letters. Please send questions for her next letters column to this mailbox. Your name and town will be published unless you request anonymity.

Rush Limbaugh: The Meaning of Barack Obama's ObamaCare Speech to Congress

Merit Pay for Central Bankers?

If only government employees received performance-based salaries.

So let's get this straight: The finance ministers and central bankers of the G-20 sat down in London this weekend and came away with a set of "further steps to strengthen the financial system." Among these steps is an accord on "global standards on pay structure . . . to ensure compensation policies are aligned with long-term value creation and financial stability."

There's no arguing with those goals, but one could ask some questions about the source. We can think of few large employers anywhere that do less to ensure that pay is tied to value creation than government bureaucracies. OK, we can't think of any. And it's hard to imagine any measures more likely to damage economic performance and "value creation" than putting the government in charge of what private employees get paid.

In the public sector, performance is often difficult or politically impossible to measure (never mind encourage or enforce), seniority rules, and even the hint of job cuts or attrition is met with outrage by public-sector unions that have mastered the art of holding the public purse hostage to their special interests.

No one is proposing to pay bankers like Social Security clerks or schoolteachers. But count us skeptical that a political class that can't see its way to pay for performance in its own backyard will get the incentives right when dictating pay policies for others. Maybe if central bankers and finance ministers could figure out how to tie their own pay to "value creation and financial stability," the idea would deserve another look. Banks may not get all the incentives right, but at least their contracts reflect the pressures of a competitive market, rather than the politicians' universal mandate to find someone, other than themselves, to blame for a financial panic.

The Fed Can't Monitor 'Systemic Risk'

That's like asking a thief to police himself.

Using the financial crisis as a pretext, the Obama administration is determined to enact massive financial regulatory reforms this year. But the centerpiece of its proposal—putting the Fed in charge of regulating or monitoring systemic risk—is a serious error.

The problem is the Fed itself can create systemic risk. Many scholars, for example, have argued that by keeping interest rates too low for too long the Fed created the housing bubble that gave us the current mortgage meltdown, financial crisis and recession.

Regardless of whether one believes this analysis, it is not difficult to see that a Fed focused on preventing deflation in the wake of the dot-com bubble's collapse in the early 2000s might ignore the sharp rise in housing prices that later gave us a bubble.

There is also the so-called Greenspan put. That's a term that refers to investors taking greater risks than they otherwise would because they believed the Fed would protect them by flooding the financial system with liquidity in the event of a downturn. If there really was a Greenspan put, it has now been supplanted by a "Bernanke put."

These puts may or may not be real, but there is no doubt that the Fed has the power to create incentives for greater risk taking. In other words, simply by doing its job to stabilize the economy, the Fed can create the risk-taking mindset that many blame for the current crisis.

And finally, there are those—including some at the Fed itself—who argue that the Fed does not have, and will never have, sufficient information to recognize a real bubble. As a result, the Fed is just as likely to stifle economic growth as it is to sit idly by while a serious asset bubble develops.

All of this means just one thing: If we are to have a mechanism to prevent systemic risk it should be independent of the Fed. That is probably one reason why creating a systemic-risk council made up of all of the federal government's financial regulatory agencies, including the Fed, has the support of Senate Banking Committee Chairman Christopher Dodd (D., Conn.) and others on the committee.

The current administration isn't the only one that has been willing to hand too much power to the Fed. The idea that the Fed should have some responsibility to detect systemic risk originated with the Bush Treasury Department's "Blueprint for a Modern Financial Regulatory Structure," issued in March 2008. In that plan, the regulation of bank holding companies would be transferred from the Fed to the comptroller of the currency and the Federal Deposit Insurance Corporation. The Fed would be charged with detecting the development of systemic risk in the economy.

The idea was that the Fed's authority would be pared back in those areas where it is actually supervising specific financial institutions but expanded where its responsibilities dealt with the economy as a whole. This is a plausible idea. There is every reason to remove from the Fed's plate the supervision of specific financial institutions as well as the regulation of businesses such as mortgage brokers. As a matter of government organization, it makes a tidy package for the Fed to handle issues that affect the economy as a whole.

But piling yet more responsibilities on the Fed raises the question of whether we are serious about discovering incipient systemic risk. If we are, then an agency outside of the Fed should be tasked with that responsibility. Tasking the Fed with that responsibility would bury it among many other inconsistent roles and give the agency incentives to ignore warning signals that an independent body would be likely to spot.

Unlike balancing its current competing assignments—price stability and promoting full employment—detecting systemic risk would require the Fed to see the subtle flaws in its own policies. Errors that are small at first could grow into major problems. It is simply too much to expect any human institution to step outside of itself and see the error of its ways when it can plausibly ignore those errors in the short run. If we are going to have a systemic-risk monitor, it should be an independent council of regulators.

It is one thing to set a thief to catch a thief—as President Franklin Delano Roosevelt is said to have done when he put Joe Kennedy in charge of the newly created Securities and Exchange Commission in the 1930s. But to set a thief to catch himself is quite a different matter.

Mr. Wallison is a senior fellow at the American Enterprise Institute.
