Wednesday, June 24, 2009

Killing in War

Mises Daily

Jeff McMahan has written a genuinely revolutionary book. He has uncovered a flaw in standard just-war theory. The standard view sharply separates the morality of going to war, jus ad bellum, from the morality of warfare, jus in bello. Whether or not a war is just does not affect the morality of how war is to be conducted. Soldiers are forbidden to violate the laws of war; but no greater restrictions are imposed on those who fight in an unjust cause than on those whose cause meets the requirements of jus ad bellum. This is exactly what McMahan rejects. Soldiers in an unjust cause have, for the most part, no right at all to engage in violent action against soldiers in a just cause. Not only do they lack moral standing to engage in aggressive warfare; they cannot legitimately even engage in defensive war, in most circumstances.

McMahan states his basic thesis in this way:

The contention of this book is that common sense beliefs about the morality of killing in war are deeply mistaken. The prevailing view is that in a state of war, the practice of killing is governed by different moral principles from those that govern acts of killing in other contexts. This presupposes that it can make a difference to the moral permissibility of killing another person whether one's political leaders have declared a state of war with that person's country. According to the prevailing view, therefore, political leaders can sometimes cause other people's moral rights to disappear simply by commanding their armies to attack them. When stated in this way, the received view seems obviously absurd. (p. vii)

Once advanced, McMahan's thesis seems obvious, and it is his considerable philosophical merit to make us realize how obvious it is. Those who fight in an unjust war are, by hypothesis, directing force against people whom they have no right to attack. If, e.g., the United States had no right to invade Iraq in 2003, then American soldiers wrongly used force against Iraqi soldiers. If so, how can one contend that they are morally permitted to do so? Further, do not defenders against such aggression have the right to resist? If they do have this right, then the aggressors may not fight back, even in self-defense. If a policeman legitimately shoots at a suspect, the suspect cannot claim the right to shoot back in self-defense. McMahan holds that matters in this respect do not change in warfare.

McMahan contends further that his view is of more than merely theoretical importance. Because people accept the incorrect view that soldiers who fight in an unjust war do no wrong, so long as they obey the laws of war, they are more likely to participate in such wars. This makes wars more likely.

[Although] the idea that no one does wrong, or acts impermissibly, merely by fighting in a war that turns out to be unjust … is intended to have a restraining effect on the conduct of war, the widespread acceptance of this idea also makes it easier … to fight in war without qualms about whether the war is unjust. (p. 3)

As mentioned, it seems obvious, once stated, that those engaged in an unjust war have no right to attack others. But is it too severe a doctrine to claim that they have no right to defend themselves, if attacked by just combatants? Quite the contrary, McMahan notes that his view applies a standard position in interpersonal morality to the ethics of war:

For many centuries there has been general agreement that, as a matter of both morality and law, "where attack is justified there can be no lawful defence." These words were written by Pierino Belli in 1563 and were echoed a little over a century later by John Locke, who claimed that "Force is to be opposed to nothing, but to unjust and unlawful force." (p. 14)

McMahan is a very careful philosopher; as soon as he states a thesis, he thinks of qualifications, objections, and rebuttals. He notes an instance where unjust combatants can permissibly use force:

The exception to the claim that just combatants are illegitimate targets in war is when they pursue their just cause by impermissible means. If, for example, just combatants attempt to achieve their just cause by using terrorist tactics — that is, by intentionally killing and attacking innocent people, as the Allies did when they bombed German and Japanese cities in World War II — they make themselves morally liable to defensive attack and become legitimate targets even for unjust combatants. (p. 16)

If McMahan contends that unjust combatants are not morally permitted, in most cases, to use force, has he not placed unreasonable demands on them? They are in many cases conscripted into the armed forces and serve against their will: in fighting, they simply obey the orders of their government. If they refuse to serve, they may face severe criminal penalties. And once enemy troops fire on them, is it not unrealistic to demand that they not fire back?

But these considerations at best give unjust combatants an excuse for their conduct: they do not serve to show that what they do is morally right. Further, not all unjust combatants are conscripts; and, as to those who are, one sometimes has a moral duty to disobey unjust commands, even if doing so leads to harsh penalties.

"I was just following orders" is not always a convincing defense. And the situation for the soldiers who wish to act in accord with moral duty is not always so bleak. McMahan calls attention to the work of S.L.A. Marshall, who claimed that during World War II, "only about 15–20% of combatants had fired their weapons at all" (p. 133). Though not everyone accepts Marshall's figures, it is not in dispute that many soldiers in battle did not fight. But of course the majority of combatants were not jailed for resistance. Soldiers, then, who wish to disobey unjust orders may be able to escape penalties.

McMahan considers an objection to his thesis advanced by David Estlund. Do not soldiers in a democratic country act reasonably in relying on their government's claim that a war is just? After all, the government is likely to have much more relevant information than the soldiers and, Estlund contends, democratic decision-making has "epistemic value"; given the government's democratic bona fides, soldiers act reasonably in not attempting to assess for themselves the justice of a war.

Not so, McMahan responds.

Among democratic countries, the US stands out in two respects: it has carefully designed and robust democratic institutions and also goes to war more often than any other democratic country. What procedural guarantees are there that the wars it fights will be just? The answer is: none. The only constraint is a requirement of Congressional authorization — a requirement that can be fudged… (p. 69)[1]

McMahan is an appropriately severe critic of American foreign policy:

The Pentagon Papers revealed an assortment of lies told to rally support for the war in Vietnam; Reagan lied about the nature of the Contras and the sources of their funding in order to make war against Nicaragua; and members of the George W. Bush administration lied repeatedly about weapons of mass destruction in Iraq in order to justify the invasion and occupation of that country to the UN, the Congress, and the American public. (p. 152)[2]

Although he considers American participation in World War II justifiable, he nevertheless remarks:

It is revealing about our attitudes in general that we sometimes do take combatants who have committed war crimes to be fully excused, or even justified… Perhaps the most notorious case of this sort is that of General Paul Tibbets, who was the commander and pilot of the Enola Gay, the plane … from which the atomic bomb was dropped on the Japanese city of Hiroshima in August of 1945… This single act by Tibbets, with contributions by the other members of his small crew, had as an immediate physical effect the killing of more people, the vast majority of whom were civilians, than any other single act ever done … all plausible moral theories, including even the most radical forms of consequentialism, prohibit the killing of that many innocent people in virtually all practically possible circumstances. Tibbets's act is therefore the most egregious war crime, and the most destructive single terrorist act, ever committed, even though it was committed in the course of a just war. Yet he was congratulated for it by President Truman… (pp. 128–29)

McMahan's drastic revision of just-war theory, however cogent, appears to carry with it an unfortunate consequence. If unjust combatants use force illegitimately, are they not then subject to criminal penalties for their conduct? If we accept this consequence, the result will be large scale Nuremberg Trial proceedings: do we really want this? Further, as F.J.P. Veale long ago pointed out in Advance to Barbarism, the prospect of being tried for war crimes encourages leaders of governments to refuse to surrender.


Fortunately, McMahan's view does not have this consequence. What he is concerned to argue is that unjust combatants do not have the moral right to use force. It does not follow from this that they ought to be subject to criminal penalties for doing so. That is a prudential matter, and, as McMahan notes, strong considerations tell against resorting to criminal law. For one thing,

there is no impartial international court that could conduct trials of combatants who have fought in an unjust war. Because no government could try its own soldiers for fighting in a war in which it has commanded them to fight, the idea that unjust combatants are liable to punishment could lead to trials by victorious powers of individual soldiers of their defeated adversary … the victorious power that would prosecute allegedly unjust combatants would be more likely to be a vengeful aggressor prosecuting just combatants who had opposed it. (p. 190)

Further, McMahan grasps Veale's point:

Unjust combatants who feared punishment at the end of the war might be more reluctant to surrender, preferring to continue to fight with a low probability of victory than to surrender with a high probability of being punished. (pp. 190–91)

Readers unaccustomed to analytic moral philosophy may find McMahan's book hard going. He does not operate from a general theory but proceeds from case to case, weaving an intricate web of subtle distinctions.[3] The effort required to read the book, though, is well worth it: Killing in War is a distinguished contribution to moral theory.

The Great Depression

Mises Daily

Although the Great Depression engulfed the world economy many years ago, it lives on as a nightmare for individuals old enough to remember and as a frightening specter in the textbooks of our youth.

Some 13 million Americans were unemployed, "not wanted" in the production process. One worker out of every four was walking the streets in want and despair. Thousands of banks, hundreds of thousands of businesses, and millions of farmers fell into bankruptcy or ceased operations entirely.

Nearly everyone suffered painful losses of wealth and income.

Many Americans are convinced that the Great Depression reflected the breakdown of an old economic order built on unhampered markets, unbridled competition, speculation, property rights, and the profit motive. According to them, the Great Depression proved the inevitability of a new order built on government intervention, political and bureaucratic control, human rights, and government welfare. Such persons, under the influence of Keynes, blame businessmen for precipitating depressions by their selfish refusal to spend enough money to maintain or improve the people's purchasing power. This is why they advocate vast governmental expenditures and deficit spending — resulting in an age of money inflation and credit expansion.

Classical economists learned a different lesson. In their view, the Great Depression consisted of four consecutive depressions rolled into one. The causes of each phase differed, but the consequences were all the same: business stagnation and unemployment.

The Business Cycle

The first phase was a period of boom and bust, like the business cycles that had plagued the American economy in 1819–1820, 1839–1843, 1857–1860, 1873–1878, 1893–1897, and 1920–1921. In each case, government had generated a boom through easy money and credit, which was soon followed by the inevitable bust.

The spectacular crash of 1929 followed five years of reckless credit expansion by the Federal Reserve System under the Coolidge administration. In 1924, after a sharp decline in business, the Reserve banks suddenly created some $500 million in new credit, which led to a bank credit expansion of over $4 billion in less than one year. While the immediate effects of this new powerful expansion of the nation's money and credit were seemingly beneficial, initiating a new economic boom and effacing the 1924 decline, the ultimate outcome was most disastrous. It was the beginning of a monetary policy that led to the stock-market crash in 1929 and the following depression. In fact, the expansion of Federal Reserve credit in 1924 constituted what Benjamin Anderson in his great treatise on recent economic history (Economics and the Public Welfare, D. Van Nostrand, 1949) called "the beginning of the New Deal."

The Federal Reserve credit expansion in 1924 also was designed to assist the Bank of England in its professed desire to maintain prewar exchange rates. The strong US dollar and the weak British pound were to be readjusted to prewar conditions through a policy of inflation in the United States and deflation in Great Britain.

The Federal Reserve System launched a further burst of inflation in 1927, the result being that total currency outside banks plus demand and time deposits in the United States increased from $44.51 billion at the end of June 1924 to $55.17 billion in 1929. The volume of farm and urban mortgages expanded from $16.8 billion in 1921 to $27.1 billion in 1929. Similar increases occurred in industrial, financial, and state and local government indebtedness. This expansion of money and credit was accompanied by rapidly rising real-estate and stock prices. Prices for industrial securities, according to Standard & Poor's common stock index, rose from 59.4 in June of 1922 to 195.2 in September of 1929. Railroad stock climbed from 189.2 to 446.0, while public utilities rose from 82.0 to 375.1.

A Series of False Signals

The vast money and credit expansion by the Coolidge administration made 1929 inevitable. Inflation and credit expansion always precipitate business maladjustments and malinvestments that must later be liquidated. The expansion artificially reduces and thus falsifies interest rates, and thereby misguides businessmen in their investment decisions. In the belief that declining rates indicate growing supplies of capital savings, they embark upon new production projects. The creation of money gives rise to an economic boom. It causes prices to rise, especially prices of capital goods used for business expansion. But these prices constitute business costs. They soar until business is no longer profitable, at which time the decline begins. In order to prolong the boom, the monetary authorities may continue to inject new money until finally frightened by the prospects of a runaway inflation. The boom that was built on the quicksand of inflation then comes to a sudden end.

The ensuing recession is a period of repair and readjustment. Prices and costs adjust anew to consumer choices and preferences.

And above all, interest rates readjust to reflect once more the actual supply of and demand for genuine savings. Poor business investments are abandoned or written down. Business costs, especially labor costs, are reduced through greater labor productivity and managerial efficiency, until business can once more be profitably conducted, capital investments earn interest, and the market economy function smoothly again.

After an abortive attempt at stabilization in the first half of 1928, the Federal Reserve System finally abandoned its easy-money policy at the beginning of 1929. It sold government securities and thereby halted the bank credit expansion. It raised its discount rate to 6 percent in August 1929. Time-money rates rose to 8 percent, commercial paper rates to 6 percent, and call rates to the panic figures of 15 percent and 20 percent. The American economy was beginning to readjust. In June 1929, business activity began to recede. Commodity prices began their retreat in July.

The security market reached its high on September 19 and then, under the pressure of early selling, slowly began to decline. For five more weeks, the public nevertheless bought heavily on the way down. More than 100 million shares were traded at the New York Stock Exchange in September. Finally it dawned upon more and more stockholders that the trend had changed. Beginning with October 24, 1929, thousands stampeded to sell their holdings immediately and at any price. Avalanches of selling by the public swamped the ticker tape. Prices broke spectacularly.

Liquidation and Adjustment

The stock market break signaled the beginning of a readjustment long overdue. It should have been an orderly liquidation and adjustment followed by a normal revival. After all, the financial structure of business was very strong. Fixed costs were low as business had refunded a good many bond issues and had reduced debts to banks with the proceeds of the sale of stock. In the following months, most business earnings made a reasonable showing. Unemployment in 1930 averaged under 4 million, or 7.8 percent of the labor force.

In modern terminology, the American economy of 1930 had fallen into a mild recession. In the absence of any new causes for depression, the following year should have brought recovery as in previous depressions. In 1921–1922, the American economy recovered fully in less than a year. What, then, precipitated the abysmal collapse after 1929? What prevented the price and cost adjustments and thus led to the second phase of the Great Depression?

Disintegration of the World Economy

The Hoover administration opposed any readjustment. Under the influence of "the new economics" of government planning, the president urged businessmen not to cut prices and reduce wages, but rather to increase capital outlay, wages, and other spending in order to maintain purchasing power. He embarked upon deficit spending and called upon municipalities to increase their borrowing for more public works. Through the Farm Board, which Hoover had organized in the autumn of 1929, the federal government tried strenuously to uphold the prices of wheat, cotton, and other farm products. The GOP tradition was further invoked to curtail foreign imports.

The Smoot-Hawley Tariff Act of June 1930 raised American tariffs to unprecedented levels, which practically closed our borders to foreign goods. According to most economic historians, this was the crowning folly of the whole period from 1920 to 1933 and the beginning of the real depression. "Once we raised our tariffs," wrote Benjamin Anderson,

an irresistible movement all over the world to raise tariffs and to erect other trade barriers, including quotas, began. Protectionism ran wild over the world. Markets were cut off. Trade lines were narrowed. Unemployment in the export industries all over the world grew with great rapidity. Farm prices in the United States dropped sharply through the whole of 1930, but the most rapid rate of decline came following the passage of the tariff bill.

When President Hoover announced he would sign the bill into law, industrial stocks broke 20 points in one day. The stock market correctly anticipated the depression.

The protectionists have never learned that curtailment of imports inevitably hampers exports. Even if foreign countries do not immediately retaliate for trade restrictions injuring them, their foreign purchases are circumscribed by their ability to sell abroad. This is why the Smoot-Hawley Tariff Act which closed our borders to foreign products also closed foreign markets to our products. American exports fell from $5.5 billion in 1929 to $1.7 billion in 1932. American agriculture customarily had exported over 20 percent of its wheat, 55 percent of its cotton, 40 percent of its tobacco and lard, and many other products. When international trade and commerce were disrupted, American farming collapsed. In fact, the rapidly growing trade restrictions, including tariffs, quotas, foreign-exchange controls, and other devices were generating a worldwide depression.

Agricultural commodity prices, which had been well above the 1926 base before the crisis, dropped to a low of 47 in the summer of 1932. Such prices as $2.50 a hundredweight for hogs, $3.28 for beef cattle, and 32¢ a bushel for wheat plunged hundreds of thousands of farmers into bankruptcy. Farm mortgages were foreclosed until various states passed moratoria laws, thus shifting the bankruptcy to countless creditors.

Rural Banks in Trouble

The main creditors of American farmers were, of course, the rural banks. When agriculture collapsed, the banks closed their doors. Some 2,000 banks, with deposit liabilities of over $1.5 billion, suspended between August 1931 and February 1932. Those banks that remained open were forced to curtail their operations sharply. They liquidated customers' loans on securities, contracted real-estate loans, pressed for the payment of old loans, and refused to make new ones. Finally, they dumped their most marketable bond holdings on an already depressed market. The panic that had engulfed American agriculture also gripped the banking system and its millions of customers.

The American banking crisis was aggravated by a series of events involving Europe. When the world economy began to disintegrate and economic nationalism ran rampant, European debtor countries were cast in precarious payment situations. Austria and Germany ceased to make foreign payments and froze large English and American credits; when England finally suspended gold payments in September 1931, the crisis spread to the United States. The fall in foreign bond values set off a collapse of the general bond market, which hit American banks at their weakest point — their investment portfolios.

Depression Compounded

Nineteen Thirty-One was a tragic year. The whole nation, in fact, the whole world, fell into the cataclysm of despair and depression. American unemployment jumped to more than 8 million and continued to rise. The Hoover administration, summarily rejecting the thought that it had caused the disaster, labored diligently to place the blame on American businessmen and speculators. President Hoover called together the nation's industrial leaders and pledged them to adopt his program to maintain wage rates and expand construction. He sent a telegram to all the governors, urging cooperative expansion of all public-works programs. He expanded federal public works and granted subsidies to ship construction. And for the benefit of the suffering farmers, a host of federal agencies embarked upon price-stabilization policies that generated ever larger crops and surpluses, which in turn depressed product prices even further. Economic conditions went from bad to worse, and unemployment in 1932 averaged 12.4 million.

In this dark hour of human want and suffering, the federal government struck a final blow. The Revenue Act of 1932 doubled the income tax, the sharpest increase in the federal tax burden in American history. Exemptions were lowered, and the "earned income credit" was eliminated. Normal tax rates were raised from a range of 1½ to 5 percent to a range of 4 to 8 percent, surtax rates from 20 percent to a maximum of 55 percent. Corporation tax rates were boosted from 12 percent to 13¾ and 14½ percent. Estate taxes were raised. Gift taxes were imposed with rates from ¾ to 33½ percent. A 10 percent gasoline tax was imposed, a 3 percent automobile tax, a telegraph and telephone tax, a 2¢ check tax, and many other excise taxes. And finally, postal rates were increased substantially.

When state and local governments faced shrinking revenues, they, too, joined the federal government in imposing new levies. The rate schedules of existing taxes on income and business were increased and new taxes imposed on business income, property, sales, tobacco, liquor, and other products.

Murray Rothbard, in his authoritative work on America's Great Depression (Van Nostrand, 1963), estimates that the fiscal burden of federal, state, and local governments nearly doubled during the period, rising from 16 percent of net private product to 29 percent. This blow alone would bring any economy to its knees; it shatters the silly contention that the Great Depression was a consequence of economic freedom.

The New Deal of NRA and AAA

One of the great attributes of the private-property market system is its inherent ability to overcome almost any obstacle. Through price and cost readjustment, managerial efficiency and labor productivity, new savings and investments, the market economy tends to regain its equilibrium and resume its service to consumers. It doubtless would have recovered in short order from the Hoover interventions had there been no further tampering.

However, when Franklin Delano Roosevelt assumed the presidency, he, too, fought the economy all the way. In his first 100 days, he swung hard at the profit order. Instead of clearing away the prosperity barriers erected by his predecessor, he built new ones of his own. He struck in every known way at the integrity of the US dollar through quantitative increases and qualitative deterioration. He seized the people's gold holdings and subsequently devalued the dollar by 40 percent.

With some one-third of industrial workers unemployed, President Roosevelt embarked upon sweeping industrial reorganization. He persuaded Congress to pass the National Industrial Recovery Act (NIRA), which set up the National Recovery Administration (NRA). Its purpose was to get business to regulate itself, ignoring the antitrust laws and developing fair codes of prices, wages, hours, and working conditions. The president's Re-employment Agreement called for a minimum wage of 40¢ an hour ($12 to $15 a week in smaller communities), a 35-hour work week for industrial workers and 40 hours for white-collar workers, and a ban on all youth labor.

This was a naive attempt at "increasing purchasing power" by increasing payrolls. But the immense increase in business costs through shorter hours and higher wage rates worked naturally as an antirevival measure. After passage of the act, unemployment rose to nearly 13 million. The South, especially, suffered severely from the minimum-wage provisions. The act forced 500,000 Negroes out of work.

Nor did President Roosevelt ignore the disaster that had befallen American agriculture. He attacked the problem by passage of the Farm Relief and Inflation Act, popularly known as the First Agricultural Adjustment Act. The objective was to raise farm income by cutting the acreages planted or destroying the crops in the field, paying the farmers not to plant anything, and organizing marketing agreements to improve distribution. The program soon covered not only cotton, but also all basic cereal and meat production as well as principal cash crops. The expenses of the program were to be covered by a new "processing tax" levied on an already depressed industry.

NRA codes and AAA processing taxes came in July and August of 1933. Again, economic production, which had flurried briefly before the deadlines, turned sharply downward. The Federal Reserve index dropped from 100 in July to 72 in November of 1933.

Pump-Priming Measures

When the economic planners saw their plans go wrong, they simply prescribed additional doses of federal pump priming. In his January 1934 budget message, Mr. Roosevelt promised expenditures of $10 billion while revenues were at $3 billion. Yet the economy failed to revive; the business index rose to 86 in May of 1934, and then turned down again to 71 by September. Furthermore, the spending program caused a panic in the bond market, which cast new doubts on American money and banking.

Revenue legislation in 1933 sharply raised income-tax rates in the higher brackets and imposed a 5 percent withholding tax on corporate dividends. Tax rates were raised again in 1934. Federal estate taxes were brought to the highest levels in the world. In 1935, federal estate and income taxes were raised once more, although the additional revenue yield was insignificant. The rates seemed clearly aimed at the redistribution of wealth.

According to Benjamin Anderson,

the impact of all these multitudinous measures — industrial, agricultural, financial, monetary and other — upon a bewildered industrial and financial community was extraordinarily heavy. We must add the effect of continuing disquieting utterances by the president. He had castigated the bankers in his inaugural speech. He had made a slurring comparison of British and American bankers in a speech in the summer of 1934…. That private enterprise could survive and rally in the midst of so great a disorder is an amazing demonstration of the vitality of private enterprise.

Then came relief from unexpected quarters. The "nine old men" of the Supreme Court, by unanimous decision, outlawed NRA in 1935 and AAA in 1936. The Court maintained that the federal legislative power had been unconstitutionally delegated and states' rights violated.

These two decisions removed some fearful handicaps under which the economy was laboring. NRA, in particular, was a nightmare with continuously changing rules and regulations by a host of government bureaus. Above all, voidance of the act immediately reduced labor costs and raised productivity as it permitted labor markets to adjust. The death of AAA reduced the tax burden of agriculture and halted the shocking destruction of crops. Unemployment began to decline. In 1935 it dropped to 9.5 million, or 18.4 percent of the labor force, and in 1936 to only 7.6 million, or 14.5 percent.

A New Deal for Labor

The third phase of the Great Depression was thus drawing to a close. But there was little time to rejoice, for the scene was being set for another collapse in 1937 and a lingering depression that lasted until the day of Pearl Harbor. More than 10 million Americans were unemployed in 1938, and more than 9 million in 1939.

The relief granted by the Supreme Court was merely temporary. The Washington planners could not leave the economy alone; they had to earn the support of organized labor, which was vital for reelection.

The Wagner Act of July 5, 1935, earned the lasting gratitude of labor. This law revolutionized American labor relations. It took labor disputes out of the courts of law and brought them under a newly created federal agency, the National Labor Relations Board, which became prosecutor, judge, and jury, all in one. Labor-union sympathizers on the board further perverted the law that already afforded legal immunities and privileges to labor unions. The United States thereby abandoned a great achievement of Western civilization: equality under the law.


The Wagner Act, or National Labor Relations Act, was passed in reaction to the Supreme Court's voidance of NRA and its labor codes. It aimed at crushing all employer resistance to labor unions. Anything an employer might do in self-defense became an "unfair labor practice" punishable by the board. The law not only obliged employers to deal and bargain with the unions designated as the employees' representative; later board decisions also made it unlawful to resist the demands of labor-union leaders.

Following the election of 1936, the labor unions began to make ample use of their new powers. Through threats, boycotts, strikes, seizures of plants, and outright violence committed in legal sanctity, they forced millions of workers into membership. Consequently, labor productivity declined and wages were forced upward. Labor strife and disturbance ran wild. Ugly sit-down strikes idled hundreds of plants. In the ensuing months, economic activity began to decline and unemployment again rose above the ten-million mark.

But the Wagner Act was not the only source of crisis in 1937. President Roosevelt's shocking attempt at packing the Supreme Court, had it been successful, would have subordinated the judiciary to the executive. In the US Congress, the president's power was unchallenged. Heavy Democratic majorities in both houses, perplexed and frightened by the Great Depression, blindly followed their leader. But when the president strove to assume control over the judiciary, the American nation rallied against him, and he lost his first political fight in the halls of Congress.

There was also his attempt at controlling the stock market through an ever-increasing number of regulations and investigations by the Securities and Exchange Commission. "Insider" trading was barred, high and inflexible margin requirements were imposed, and short selling was restricted, mainly to prevent a repetition of the 1929 stock-market crash. Nevertheless, the market fell nearly 50 percent from August 1937 to March 1938. The American economy again underwent dreadful punishment.

Other Taxes and Controls

Yet other factors contributed to this new slump, the fastest in US history. The Undistributed Profits Tax of 1936 struck a heavy blow at profits retained for use in business. Not content with destroying the wealth of the rich through confiscatory income and estate taxation, the administration meant to force the distribution of corporate savings as dividends subject to the high income-tax rates. Though the top rate finally imposed on undistributed profits was "only" 27 percent, the new tax succeeded in diverting corporate savings from employment and production to dividend income.

"When Franklin Delano Roosevelt assumed the presidency, he, too, fought the economy all the way…"

Amidst the new stagnation and unemployment, the president and Congress adopted yet another dangerous piece of New Deal legislation: the Wages and Hours Act or Fair Labor Standards Act of 1938. The law raised minimum wages and reduced the work week in stages to 44, 42, and 40 hours. It provided for time-and-a-half pay for all work over 40 hours per week and regulated other labor conditions. Again, the federal government thus reduced labor productivity and increased labor costs — ample grounds for further depression and unemployment.

Throughout this period, the federal government, through its monetary arm, the Federal Reserve System, endeavored to reinflate the economy. Monetary expansion from 1934 to 1941 reached astonishing proportions. The monetary gold of Europe sought refuge from the gathering clouds of political upheaval, boosting American bank reserves to unaccustomed levels. Reserve balances rose from $2.9 billion in January 1934 to $14.4 billion in January 1941. And with this growth of member-bank reserves, interest rates declined to fantastically low levels. Commercial paper often yielded less than 1 percent, bankers' acceptances from 1/8 percent to 1/4 percent. Treasury-bill rates fell to 1/10 of 1 percent and Treasury bonds to some 2 percent. Call loans were pegged at 1 percent and prime customers' loans at 1½ percent. The money market was flooded and interest rates could hardly go lower.

Deep-Rooted Causes

The American economy simply could not recover from these successive onslaughts by first the Republican and then the Democratic administrations. Individual enterprise, the mainspring of unprecedented income and wealth, didn't have a chance.

The calamity of the Great Depression finally gave way to the holocaust of World War II. When more than 10 million able-bodied men had been drafted into the armed services, unemployment ceased to be an economic problem. And when the purchasing power of the dollar had been cut in half through vast budget deficits and currency inflation, American business managed to adjust to the oppressive costs of the Hoover-Roosevelt "Deals." The radical inflation in fact reduced the real costs of labor and thus generated new employment in the postwar period.

"The professors of earlier years were as guilty as the political leaders of the 1930s."

Nothing would be more foolish than to single out the men who led us in those baleful years and condemn them for all the evil that befell us. The ultimate roots of the Great Depression were growing in the hearts and minds of the American people. It is true, they abhorred the painful symptoms of the great dilemma. But the large majority favored and voted for the very policies that made the disaster inevitable: inflation and credit expansion, protective tariffs, labor laws that raised wages and farm laws that raised prices, ever higher taxes on the rich and distribution of their wealth. The seeds for the Great Depression were sown by scholars and teachers during the 1920s and earlier when social and economic ideologies that were hostile toward our traditional order of private property and individual enterprise conquered our colleges and universities. The professors of earlier years were as guilty as the political leaders of the 1930s.

Social and economic decline is facilitated by moral decay. Surely, the Great Depression would be inconceivable without the growth of covetousness and envy of great personal wealth and income, the mounting desire for public assistance and favors. It would be inconceivable without an ominous decline of individual independence and self-reliance, and above all, the burning desire to be free from man's bondage and to be responsible to God alone.

Can it happen again? Inexorable economic law dictates that it must happen again whenever we repeat the dreadful errors that generated the Great Depression.

Return of the Dead Hand

Mises Daily

For much of the world, the period from the 1960s through the early 1980s was an era of ever-greater government intervention in the economy. This expansion of policy, built on an intellectual foundation of Marxism, socialism, corporatism, and progressivism, culminated in economic stagnation and inflation in market-oriented mercantilist-interventionist countries such as the "Great Society" United States, and in economic collapse in the Marxist-dominated Soviet bloc.[1]

Policy was successfully reoriented in country after country toward a framework more consistent with the ideals of capitalism, with outstanding results. The long period of stagnation and, in many cases, decline was, within several years and to the surprise of many critics, turned into a sustained period of growth and development accompanied by new or renewed prosperity and increasing economic liberty.

According to Brink Lindsey, this return to markets was not something forced on policy makers but a deliberate response to the "wide failures of central planning and top down control."[2] Lindsey went on to caution against undue optimism about the continuation of this trend toward greater prosperity:

The world is just beginning to overcome a century-long infatuation with state-dominated economic development; market competition continues to be hindered by a wretched excess of top-down controls, and at the same time undermined by a lack of supporting institutional infrastructure. The invisible hand of the market may be on the rise, but the dead hand of the old collectivist dream still exerts a powerful influence.

A significant failure of the resurgence of market-oriented policies was the failure to reform the monetary system. It left control and direction of money, credit, and financial flows under central-bank planning. The resulting monetary and financial crisis is therefore not a market failure but a central-planning failure: an artificial boom is a centrally controlled misdirection of production.

The resulting crisis is a "mini" calculation failure. The failure to build economic success and freedom on a solid foundation — a market-determined, sound money[3] — has produced two Fed-originated boom-bust cycles in the last ten years. The most recent and most severe was a direct result of the Fed's overstimulation of the economy in 2002–2004 in an attempt to postpone a necessary correction. By keeping the federal-funds rate too low (as viewed ex post even by mainstream critics[4]), policy masked the misdirections of production from the previous boom, prevented or postponed the needed corrections, and predictably misdirected production into other channels — housing and commercial real estate rather than the earlier dot-com sectors — making the necessary redirections more severe.

The most recent crisis has been used by opponents of capitalism to undermine the intellectual foundations of the reforms of the 1980s and '90s. Critics incorrectly blame the end of the latest boom and the current collapse — frequently labeled the worst financial crisis since the Great Depression[5] — on laissez-faire capitalism and greed run amok. The response of voters and policy makers to this crisis has, in the United States, accelerated a return of the dead-hand philosophy, with the result that the proven failed policies of the past have returned, masked as hope and change.[6]

Some in the press have begun to recognize the correctness of the Austrian arguments on the ultimate cause of the current crisis, credit creation, and monetary expansion made possible by a central bank.[7] Some may even accept the notion that, once the recession is over, we might discuss monetary reforms to prevent the next boom and bust. But most continue to labor under the false impression that Austrian policy is too harsh.

Why is what works effectively and efficiently in normal times too harsh in times of a bust?

Ultimately, any good policy, whether for good times or "bad" times, should be based on the same sound principles. Policy that promotes growth also promotes recovery, since both growth and recovery are really one and the same thing: entrepreneurs hiring resources for new or expanded ventures as they seek profits by providing valuable goods and services to consumers or other business using economic calculation.

Keep in mind that in the dynamic US economy, even in good years, 15% of jobs disappear each year. But their places are quickly taken, and in a growing economy augmented, by new jobs created by startups and expansions.[8] Entrepreneurial planning based on economic calculation corrects errors in good times and, if allowed, would correct them in bad.

The essentials for prosperity are everywhere and always the same: economic calculation and entrepreneurial planning augmented by a sound monetary institutional framework, a predictable legal framework that supports property rights and contract enforcement, and highly competitive resource markets.

To better understand why the same policy that generates wealth in the first place is also the best policy in a crisis, one should begin with an understanding of the boom-bust process and a sounder normative interpretation of the boom and the bust.

The standard normative judgment about the phases of the cycle is that a period of booming economic activity is typically considered "good" while the downturn/recession/bust is "bad." But it is the boom times that play host to the plague of malinvestments, overconsumption, and misdirected production. The bust brings readjustment and reestablishes the potential for sustainable growth. The standard normative judgment is backwards: it is the boom that is "bad," and thus should be prevented or halted before it proceeds too far. It is the bust that, from a long-run perspective, is "good." Past errors and misallocations of resources are discovered and, if markets are allowed to work, corrected.

The traditional normative interpretation of the cycle's phases, being backwards, supports the erroneous view that a central bank brings stability and promotes growth. The correct theory and normative ordering lead to the conclusion that placing the money supply under government control — control by a central bank — is not good, whether judged by the standard of efficiency or that of stability. A central bank is one of the economy's main destabilizing forces, and the growth it promotes through money and credit creation is inefficient and unsustainable; a boom will eventually be followed by a bust.

Given that we have a central bank, and that said bank has created a crisis, what is the appropriate response?

A good response requires a proper understanding of the correction process — the recession and the recovery, which are intricately intertwined. Recession is the process of discovering the errors of the preceding boom: allocations of resources inconsistent with underlying preferences and resource availability.

Recovery (and normal, sustainable growth) is the process of directing and redirecting resources more efficiently, i.e., more consistently in line with preferences and inherent scarcity. Now, if (as assumed in many economic models) all resources were perfectly homogeneous, and all prices, wages, and interest rates perfectly flexible, then recession and recovery would be a single, quick, and practically painless process. As resources were released from declining activities they would be immediately absorbed into expanding industries.

But in the real world, with nonhomogeneous resources — including both capital goods and labor — and rigidities in the adjustment process (and, more importantly, state-induced impediments, both existing and new, to competitive markets), the second phase of the process — recovery, the redirection of available resources to more productive uses — significantly lags the first phase, recession.

What then sets the stage for recovery, job creation, and ultimately growth? What is common knowledge to most business people, and should be evident to all students of Austrian capital theory, is that a significant portion of current activity is directed not to current but to future consumption. Planning, which includes decisions on reinvestment to maintain current levels of production into the future, new investment for expansion, and investment in new enterprises, is future oriented. What we do know from theory and history is that sound money, easy and predictable taxes, a stable legal environment built on the rule of law and contract enforcement, and broadly competitive markets encourage successful long-run planning.

What then is the Austrian answer to today's problems? First, per Hayek and supported by Rothbard, stop the credit creation and inflation. Second, per Hayek, prevent a secondary deflation, a deflation that may provide no steering function. Third, remove all government impediments to effective entrepreneurial planning: avoid protectionist measures internationally; allow prices and wages to adjust as needed to restore market equilibrium. Fourth, not only cut tax rates, as was done in the incomplete reforms of the 1980s and early in this century, but, per Rothbard, drastically reduce the government budget, both taxes and expenditures.[9]

Instead, we now — unlike the last major crisis period of the early 1980s, where a tighter monetary policy was accompanied by market liberalizations in other areas, including tax reform — see a policy response that is a combination of the earlier failed and discredited policies that brought on the calamities of the 1930s, the 1970s, and the collapse of the economies of the former Soviet states.

Instead of a monetary-policy response that seeks to end monetary expansion while preventing a contraction of the money-spending stream à la Hayek, we see a Fed balance sheet at $2 trillion and growing. There is pressure from some to target not price stability but a 5–6% inflation rate in the CPI and, since some economists believe the correct target for the federal-funds rate is significantly negative while the effective lower bound is zero, for the Fed to undertake operations over and beyond the traditional targeting of the federal-funds rate.[10] Instead of fiscal constraint and tax decreases, we see massive expansion of government spending; a guaranteed massive tax increase when the 2001–2003 tax cuts expire automatically in 2010; threatened (and, in some states, actual) tax increases on the rich (those making over $250,000); and a proposed cap-and-trade policy to fight global warming, which is in fact a massive tax increase on production and on any consumption activity that uses fossil-fuel-based energy.

Instead of privatization, we see major government takeovers of private business in the automotive and financial sectors, often conducted in ways that violate contracts and supersede the rule of law. We see wasteful government misdirection of production through subsidies and directives, such as an energy policy that promises "green jobs" but is missing one key ingredient: the delivery of affordable, reliable energy. The jobs created would be created in the same way that substituting spoons for shovels would increase employment for grave diggers or, to paraphrase Frédéric Bastiat's "Petition of the Candlemakers," in the same way that mandatory restrictions on sunlight would enhance employment for candle makers and other producers of artificial light. What we have is a significant assault on freedom and the market, which should have predictable long-run negative impacts on the economy.

The period from 1980 to 2000 illustrates how well markets can perform when freed from even some of the collectivist constraints of the past. Those same twenty years illustrate the ultimate destructive power of money and credit creation to misdirect production and falsify calculation, even in a period of relatively stable prices.

Without a foundation of sound money, cycles are inevitable and destructive. If the crisis is used as an excuse to bring back the dead hand of collectivist policies, it is not only destructive of short-term economic well-being but also of long-term freedom and prosperity.

Rothbard saw the alternatives for the American economy in the 1980s, absent liberal reforms, as a choice between a 1929-type depression and an inflationary depression of massive proportions.

Today there are other likely alternatives, the most likely of which is a return to a 1970s-style, decade-long period of increasing unemployment and inflation. Others resemble the decade-long stagnation of the Japanese economy, or a permanent Eurosclerosis.

There is, however, still time to change course and follow the Austrian prescription for future sustainable prosperity: end government intervention in the economy. Such a policy has been dubbed draconian, but the pain of a short, severe recession followed by renewed, sustained growth and prosperity may actually be "comfortable and moderate compared to the economic hell of permanent inflation, stagnation, high unemployment, and inflationary depression"[11] that is the likely outcome of continuing on our current policy path.

Obama and 'Regulatory Capture'

It's time to take the quality of our watchdogs seriously.

The reason those who see economic regulations as akin to tyranny often win policy debates is that they have a fiery argument with visceral appeal. Those who try to sell the virtues of the supervisory state tend to favor the passive voice. They don't do fire. They do law review.

The situation ought to be the reverse today. We have just come through the most wrenching financial disaster in decades, brought about in no small part by either the absence of federal regulation or the amazing indifference of the regulators.

This is the moment for a ringing reclamation of the regulatory project. President Barack Obama is clearly the sort of man who could do it. But in a white paper his administration released on the subject last week, the bureaucratic mindset prevails.

The report uses bland, impersonal explanations for the current crisis. Regulatory agencies were ill-designed, we are told. Their jurisdictions overlapped. They had blind spots. They had been obsolete for years.

All of which is true enough.

What the report leaves largely unaddressed, however, is the political problem.

It was not merely structural problems that led certain regulators to nap through the crisis. The people who filled regulatory jobs in the past administration were asleep at the switch because they were supposed to be. It was as though they had been hired for their extraordinary powers of drowsiness.

The reason for that is simple: There are powerful institutions that don't like being regulated. Regulation sometimes cuts into their profits and interferes with their business. So they have used the political process to sabotage, redirect, defund, undo or hijack the regulatory state since the regulatory state was first invented.

The first federal regulatory agency, the Interstate Commerce Commission, was set up to regulate railroad freight rates in the 1880s. Soon thereafter, Richard Olney, a prominent railroad lawyer, came to Washington to serve as Grover Cleveland's attorney general. Olney's former boss asked him if he would help kill off the hated ICC. Olney's reply, handed down at the very dawn of Big Government, should be regarded as an urtext of the regulatory state:

"The Commission . . . is, or can be made, of great use to the railroads. It satisfies the popular clamor for a government supervision of the railroads, at the same time that that supervision is almost entirely nominal. Further, the older such a commission gets to be, the more inclined it will be found to take the business and railroad view of things. . . . The part of wisdom is not to destroy the Commission, but to utilize it."

The George W. Bush administration elevated this strategy to a snickering, sarcastic art form. It gave us a Food and Drug Administration that sometimes looked as though it was taking orders from Big Pharma, an Environmental Protection Agency that could never rouse itself from the recliner, an energy policy that might well have been dictated by Enron, and a Consumer Product Safety Commission that moved like a rusty wind-up toy.

And it created a situation where banking regulators posed for pictures with banking lobbyists while putting a chainsaw to a pile of regulations. Smiles all around. Let the fellows at IndyMac do whatever they want.

Misgovernment of this kind is not a partisan phenomenon, of course. Democrats have been guilty of it as well as Republicans. Conservatives have written about it as well as liberals. The most famous essay on industry's power over the regulatory state was penned by George Stigler, a Nobel Prize-winning, Chicago-school economist.

Yet today we talk around this problem, with its nose-on-your-face obviousness, as though it didn't exist. It's not until page 29 of the Obama administration's densely worded white paper that you find a reference to "regulatory capture," and then it is buried in a list of items to be considered by a future Treasury working group.

Maybe the administration downplays bad or bought regulators because it believes organizational tweaking can solve the problem. If the new missions of the regulatory agencies are defined clearly and their operations made transparent, that will limit the ability of some future regulator to mess things up.

But the administration must go further. Calling this infernal species of misgovernment by its true name would allow the president both to vindicate the regulatory state and address the problems of recent years. After all, the Bush team was only able to install the dreadful regulators it did because the governance of federal agencies was rarely a topic of public debate in those days.

Mr. Obama should make it an unavoidable subject, something that future politicians will be required to address. The issue cries out for it. And the nation, for once, is listening.

All Eyes on the Fed

Stocks Rally Ahead of Fed

Stocks perked up Wednesday, helped by some positive economic signals and better-than-expected results from a technology bellwether.

Investors are also anticipating two key events slated for Wednesday afternoon: The Federal Reserve's interest-rate committee will emerge from a two-day meeting and issue its latest policy statement at 2:15 Eastern. And the Treasury Department is set to auction off $37 billion in five-year notes at 1 p.m.

The Dow Jones Industrial Average was up 89.4 points at 8412.31 at 11:07 a.m., recouping some of the 217 points it has shed this week coming into Wednesday.

The S&P 500 was up 1.6% at 908.94, boosted by a jump of 3.2% in its basic-materials sector after the Organisation for Economic Cooperation and Development said it revised upward its economic forecasts for this year and next. The Dow Jones Transportation Index, which is sensitive to shifts in the economic outlook, leapt 3.5%.

The technology-focused Nasdaq Composite Index gained 2.2%, helped by a 7.4% jump for Oracle after the software maker beat estimates despite a 7% slide in last quarter's earnings and issued better-than-expected guidance.

Many traders and analysts say the market could remain stuck in a range until more companies report earnings growth. However, such a scenario looks unlikely until sometime next year, when aggregate profits for the S&P 500 are expected to grow about 20%.

"No one really has great conviction in that number, but people also aren't really focusing on it yet," looking to adjust their portfolios accordingly, said Citigroup strategist Tobias Levkovich. "We probably won't get into that process until sometime in the fall," when market volatility could build.

The Commerce Department reported that durable-goods orders rose 1.8% in May, contrary to analysts' expectations for a slight retreat. A gauge of capital spending in the report also jumped, and orders for non-defense capital goods excluding aircraft rose by 4.8%, after decreasing 2.9% in April. It was the largest increase since an 8.2% gain in September 2004.

In other economic news, new-home sales sank 0.6% last month as prices fell 3.4% year over year and inventories fell slightly. Builder stocks were mostly higher.

Later in the day, all eyes will turn to the Fed. The central bank is expected to leave interest rates unchanged and is unlikely to announce any other major changes in policy, though traders will be watching for any changes in the Fed's economic outlook or signals that it's beginning to consider an exit strategy from its quantitative-easing program.

Treasury prices edged lower in regular trading recently. The two-year note was off 3/32 to yield 1.2%. The 10-year note slipped 9/32 to yield 3.66%. An auction of $40 billion in two-year notes attracted robust demand on Tuesday, a hopeful sign for bidding on the five-year notes.

Overseas, European stocks posted small gains as metals and bank stocks rose. Asia markets ended mixed.

The dollar was stronger. Oil futures were down ahead of a weekly report on energy inventories.

The Mideast's New Spring of Freedom

Iran is only part of a trend.

The hotly contested presidential election in Iran between Mir Hossein Mousavi and Mahmoud Ahmadinejad is still unfolding, with uncertain results. But regardless of the outcome, the events in Iran are symptomatic of a larger change in the political landscape of the Middle East -- the revival of a regional freedom movement, which stalled in 2006 after the election of Hamas in Palestine.

The results of the recent parliamentary elections in Lebanon and Kuwait clearly indicate that Islamist parties have lost significant ground to their moderate counterparts. By Middle Eastern standards, these two countries, along with Turkey, have well-established democratic traditions.

Young Iranians show inspiring determination to achieve similar gains in their own country. Scholars maintain that societies that manage to have four or more consecutive elections will usually achieve an irreversible democratic transition. Without direct visible foreign intervention, Turkey, Lebanon and Kuwait may have such a transition well under way. The fear that Islamists might somehow impede the process has not yet been realized. Leaders of competing Islamic forces in both Lebanon and Kuwait have conceded defeat. That includes the much-demonized Hassan Nasrallah of Hezbollah.

Along with the dimming influence of Islamists, President Barack Obama's Cairo speech seems to have energized the democratic spirit in the Middle East. In Lebanon and Iran, voters turned out in record numbers. In his speech, Mr. Obama cited the imperative of upholding minority rights, singling out the Christian Maronites of Lebanon and the Copts of Egypt. He also emphasized the rights of women to education and full inclusion in public life. At least in the case of Lebanon, both Maronites and women responded by voting at an unprecedented rate (60%).

Similar results were announced last month in Kuwait, where for the first time high voter turnout elected four women to parliament despite fierce resistance from tribal and Islamic elements. And in Iran, women and youth are leading a mass democratic uprising in cities from Tehran to Esfahan. This could be Iran's own Green Revolution, reminiscent of the Velvet, Rose and Orange revolutions.

Beyond the Obama effect, it seems clear that, with a high degree of sociopolitical mobilization, Islamic parties can be cut down to size. It's an encouraging sign that Islamists have suffered repeated electoral defeats despite efforts to capitalize on widespread voter apathy and the fragmentation of secular parties.

The defeat of the Hezbollah coalition in Lebanon was a major blow to its hard-line supporters in Iran led by Mr. Ahmadinejad. If he loses to his challenger, Mr. Mousavi, then moderates could return to power in Iran and strengthen regional democratic forces at large. If the forthcoming elections in Iraq proceed without any major setback, then the entire belt from Iran to Turkey, including Lebanon and Kuwait, would be on the democratic path.

Regardless of the gains of the Middle Eastern moderates, Islamists will continue to be an integral part of the region's political landscape. But they should neither be pathologically feared nor cavalierly excluded. Rather, they should be actively engaged and encouraged to evolve into Muslim democratic parties akin to the Christian Democrats in Europe. By implicitly recognizing Hamas, President Obama may be leaning in this direction.

The next major test for democracy will be the upcoming elections in Egypt, the most populous Arab country and a strategic U.S. ally. Egyptian bloggers have made their Web sites and Twitter accounts available to their Iranian counterparts after the mullahs disrupted Iran's Internet. The youth's use of information technology has proven to be a surprising match for the brutal autocrats and rigid theocrats they oppose. The Egyptians' display of solidarity with the Iranians proves their commitment to the fundamental principles of democracy.

Mr. Obama should insist that the Egyptian regime allow free and fair elections. Given the elections in Lebanon, Kuwait and Iran, he and his advisers should resist overreacting to the mistakes of the Bush administration by backtracking on democracy promotion. A win for democracy in Egypt will consolidate what's already a trend in the Middle East: the flowering of a Spring of Freedom.

Mr. Ibrahim, an Egyptian sociologist and human-rights advocate, was imprisoned by the Mubarak regime. He has lived in exile since 2007 and is currently a visiting professor at Harvard.
