Continuing my diary series on the history of corporations--this is a draft book manuscript that I am posting one piece at a time as it is written. This part covers the Reagan Revolution.
Links to the rest of the series, if you missed it, are here at my profile page:
http://www.dailykos.com/...
As always, I welcome any comments or criticisms.
Nine: The Second Gilded Age
The 1960’s had produced significant gains for the people of the United States. African-Americans finally won civil rights and access to full economic and political power. The women’s liberation and gay rights movements marked the beginning of efforts to win equality for these groups. The consumer and environmental protection movements had put firm limits on the unchecked corporate pursuit of profit. Antitrust laws were being rigorously enforced; a proposed merger between the Brown Shoe Company and Kinney Shoes, for instance, was blocked by antitrust action even though the new company would have held a mere 5.5% of the US market.
Corporate power was at its lowest point since the Progressive Era. In 1960, the largest 500 American corporations had controlled 62% of the nation’s total sales, worth $245 billion, and the wealthiest 1% of American families owned 40% of the total wealth. By 1971, that share had fallen to 20%. At a gathering of corporate CEOs in 1975, one speaker glumly complained that “The American capitalist system is confronting its darkest hour.” Another wryly noted, “At this rate, business can soon expect support from the environmentalists. We can get them to put the corporation on the endangered species list.”
But another threat to American domination was quietly growing, and this time it did not come from within the US.
The first crack in US power came in 1965, when the United States decided to use its military power to “defend itself from world communism” in Vietnam—and ended up being beaten. In Vietnam, the lesson to be learned was that it is simply impossible to use military methods to win a political war. It was a lesson the US would forget over and over again for the next 45 years.
But one of the deepest effects of the Vietnam War was economic, when President Lyndon Johnson decided to wage a War on Poverty and a war in Vietnam at the same time. Unfortunately, since he didn’t want to raise taxes, the only way he could pay for both at once was to print more money.
The result was inflation. In 1970, the US experienced its first trade deficit in modern history. By 1971, the European governments were getting nervous that the large amounts of American dollars they were holding were, as inflation increased, becoming worth less and less, and were pushing down the value of their own currencies. In May 1971, in a surprise move, West Germany withdrew from the Bretton Woods agreement, stabilizing its economy by revaluing the deutschmark. Shortly afterward, Switzerland withdrew the Swiss franc from the Bretton Woods framework and also demanded that the US redeem, in gold, all the American dollars that Switzerland held, a move quickly echoed by France. The Americans’ store of gold reserves was depleted, further dropping the value of the currency.
The US was flummoxed—up until now, it had been the unquestioned leader in the world economic structure. Now, as the Bretton Woods agreement collapsed, President Nixon was forced to act—in August 1971, he announced that the US dollar would no longer be convertible into gold, imposed a ten percent surcharge on imports entering the US, and declared a 90-day freeze on wages and prices. The gold standard was now a thing of the past—henceforth, all of the world’s currencies would be “floating”, their values set only against one another. With the American dollar no longer the de facto international currency, finance capital from other nations became freer to flow all across the world—an effect that would profoundly alter the global economy.
Shortly after the collapse of the Bretton Woods structure, an even more powerful shock informed the world that the US was no longer the sole economic superpower.
Nearly the entire postwar economic expansion, in Europe as well as the US, was based on just one thing—cheap oil. Not only did the entire automobile and air transportation system depend on a ready supply of oil, but so too did the long list of industries that were dependent upon petrochemicals, including such vital things as fertilizers (which had produced the Green Revolution and allowed a substantially increased food supply), the huge plastics industry (upon which so many other industries depended for materials), the artificial textile industry, and the pharmaceutical industry (many medicines are made using petrochemicals). By 1970, oil had become the single most important product in the world—the entire industrialized economy depended, not only on oil, but on cheap oil.
Increasingly, however, cheap oil was available from only one place—the Middle East. The huge oilfields in Saudi Arabia, Iraq, Iran, and the rest of the area looked like an inexhaustible source. Other oil sources, by contrast, were already beginning to dry up as the rapidly increasing demand for oil outpaced the ability of any industrialized nation to produce its own supply. Even the US, which had itself begun the oil economy by tapping its oilfields in Pennsylvania, California and Texas, had become a net importer of oil by the late 1940s, and by the early 1970s was importing over one-third of its requirements, much of it from the Middle East.
Western oil companies, most of them American, dominated the Middle East through a series of friendly governments, and, as foreign oil companies carried away as much as 65% of Middle East oil profits, the price of oil increased by barely 2% between 1945 and 1965. As a wave of decolonization swept across the world in the 50’s and 60’s, however, that situation changed. The Middle Eastern states began to assert themselves and to protect their own economic interests against the oil corporations. Most Middle Eastern countries nationalized their oil industry, and demanded larger cuts of the pie. And in 1960, the Organization of Petroleum-Exporting Countries (OPEC) was formed as a committee to protect the oil-producing nations (mostly the Middle East, but also countries such as Venezuela and Indonesia) from the “Seven Sisters” (the seven large oil corporations that monopolized the world energy trade). In essence, OPEC was an international price-fixing cartel.
By 1973, it was apparent that the price of oil was going to increase. As the Shah of Iran, one of the more dependable American client rulers, put it, “You increased the price of wheat you sell us by 300% and the same for sugar and cement. . . You buy our crude oil and sell it back to us, refined as petrochemicals, at a hundred times the price you’ve paid to us. It’s only fair that from now on you should pay more for oil. Let’s say ten times more.”
Soon, OPEC’s economic interests were compounded by political interests. In October 1973, Egypt and Syria launched coordinated attacks on Israel, and the US quickly rushed military supplies to help Israel beat back the attack. In retaliation, a week later the Arab members of OPEC announced a 70% increase in the price of oil to Europe—and a complete embargo on oil shipments to the United States.
The effect of the five-month embargo was catastrophic. The price of gasoline and heating oil quadrupled. Nixon requested a system of voluntary rationing, asking gas stations to limit themselves to ten gallons per customer, and when that wasn’t enough, gasoline sales were banned on Sundays. To save energy, Daylight Saving Time was extended into the winter months. By the end of 1974, the Dow Jones industrial average had fallen 45% from its 1973 peak. In desperation, the US undertook a massive project to build the Alaska Pipeline to bring oil from the Arctic oilfields to the lower 48 states. It was just the first of many efforts over the years to develop domestic energy sources and “end our dependence on foreign oil”. None of them worked, and oil politics would play a central role in the world for the next forty years.
The oil embargo exacerbated the inflation that was already going on, and the American economy soon spiraled into the worst economic downturn since the Great Crash of 1929. The phenomenon of “stagflation” appeared, in which prices spiraled up, but production remained flat. To hedge against inflation, people began buying as many things as they could before the latest round of price increases took effect. This continuing demand fueled still more inflation. President Jimmy Carter was forced to fund the government budget with massive borrowing, which drove interest rates up. In desperation, Carter began deregulating key industries by removing government price controls, hoping that competition would reduce prices. Problems became even worse in 1979, when the Iranian Revolution overthrew the Shah and created an “Islamic Republic” run by fundamentalist Muslim extremists. Iranian oil production plummeted, and the US once again saw huge gas price increases and rationing measures. The economic “malaise” continued until the early 1980’s, when actions by the Federal Reserve to raise interest rates even further and cut back the money supply (already begun by Fed Chairman Paul Volcker in the last year of the Carter Administration), along with falling OPEC oil prices in the early 80’s, finally ended the crisis.
The OPEC embargo and the Iranian Revolution had profound effects across the world. The US was no longer viewed as an invulnerable superpower. It could be beaten—other nations could successfully assert their own national interests against those of the US, and they did not have to accept dependence on the USSR to do it.
Given the drastic and very public decline of US global power, it is no surprise that the corporations gave their support in 1980 to a Presidential candidate who ran on a campaign wholeheartedly embracing American corporate interests—who vowed to “make America great again” by an unabashed program of militarism, American nationalism, and free market laissez-faire economics. Ronald Reagan’s attempt to return the country to the corporate glory days of the 1950’s, by restoring unquestioned American global dominance, ultimately failed, but it set the tone of American politics for the next three decades.
Reaganomics
Reagan’s economic doctrine became known to both friend and foe as “Reaganomics”. It had two basic components—“smaller government” and “lower taxes”, and centered on the idea of “trickle-down”, also known as “supply-side”, economics—the idea that allowing the wealthy to amass ever more wealth would entice them to invest more of their money, thereby creating more jobs and better wages for everyone else. “A rising tide lifts all boats,” Reagan declared.
The truly original part of Reaganomics, however, was his attempt to undo the New Deal and remove most of the social protections that had benefited Americans for half a century, removing government from the realm of beneficial social programs and trusting instead to religious groups and private charities to carry out those roles. Not even Nixon had dared to undermine the very core of the “welfare state”. Indeed, some of Reagan’s biggest targets were the environmental and consumer protections that Nixon himself had passed. “In this present crisis,” Reagan declared, “government is not the solution to our problem; government is the problem.” Despite all his talk about “getting government off our backs”, however, the fact is that the size of government did not go down much during the Reagan Revolution; government spending in 1981 was 22.9 percent of GDP; it was still 22.1 percent in 1989. Of course, it was not the government itself that the corporations really wanted to remove—they simply wanted to shift it from a supporter of people’s interests to a supporter of corporate interests. And in that, they were extraordinarily successful.
Reagan’s advocacy of the interests of the corporate rich was ruthless. Under his Administration, the top income tax rate plunged from 70% to 28%, and taxes were cut on inheritances and on capital gains made through stock sales. The 50 largest corporations in the US managed to avoid, through loopholes and tax credits, paying any income taxes at all. Protectionist actions (called “import relief”), which defended the interests of American corporations against European and Japanese corporations, increased from covering just 12% of US imports to covering 23%, including such crucial industries as steel and automobiles. While most government functions remained untouched (and even grew) under Reagan, those areas that most directly hurt corporate economic interests (labor laws, OSHA rules, antitrust laws, environmental restrictions, regulation of banks and financial traders) were specifically targeted for gutting.
Deregulation
The deregulation campaign began with the airlines, which had been regulated by the Civil Aeronautics Board. The Board set safety standards, assigned particular airlines to certain routes, set pricing standards for passenger fares and cargo rates, and provided subsidies for shorter or less profitable routes.
Jimmy Carter had begun the process before the 1980 election—he hoped that withdrawing government subsidies and oversight would reduce the Federal budget deficit and at the same time allow more airline competition, which would reduce prices. At the same time, US airlines were beginning to face competition from European carriers; in 1977, the British company Laker Airways began running its Skytrain service, which offered cheap flights between the US and England, undercutting the regulated airline prices. The airlines appealed to Washington for help, and the Airline Deregulation Act of 1978, written by Senators Ted Kennedy and Howard Cannon, allowed airlines to choose their own routes and set their own fares. The deregulation process was completed by Reagan, who abolished the now-powerless Civil Aeronautics Board in 1984.
The first steps taken by the airlines after deregulation were to remove unprofitable routes by dropping service to smaller destinations and focusing on flights between large cities. Most airlines adopted a “hub” system, in which flights went to a central hub, like Atlanta’s airport, and then outwards from there to their destinations. Savage price wars drove fares down, while rising fuel costs drove expenses up—the large increase in revenue during the 80’s, therefore, only produced lower profits. Since deregulation made it easier for new smaller airlines to enter the market, the Big Six—TWA, Pan Am, Delta, Eastern, United, and American Airlines—faced enormous competition from smaller, cheaper “no-frills” carriers like People Express. By 1991, Eastern and Pan Am had both disappeared into bankruptcy, and TWA soon followed.
For the most part, however, Reagan did not flatly eliminate regulatory agencies. His favorite tactic was to simply appoint anti-regulation administrators to head them, then cut their budgets to the point where they could no longer regulate anything even if they wanted to. The best examples were the Environmental Protection Agency, the Occupational Safety and Health Administration, and the Federal Reserve Board.
The environmental restrictions that had been placed upon American corporations by the Clean Air Act, the Clean Water Act, and the Endangered Species Act, were primary targets for the deregulators, and the efforts to gut them were spearheaded by two Reagan appointees, James Watt and Anne Gorsuch. Watt was a fundamentalist Christian who declared to Congress that he didn’t know if it was necessary to protect the environment anymore since Jesus would be returning soon and the world would end in Rapture. Gorsuch was a corporate lawyer for the telephone industry.
Under Watt, the Department of the Interior slashed its enforcement efforts, decreased funding for purchasing environmentally-sensitive lands (and resisted efforts by private environmental groups to donate land), and opened large areas of national wilderness to exploration for mining and oil drilling, more than tripling the number of coal and gas leases on Federal lands. Under Gorsuch, the Environmental Protection Agency’s total budget actually went up, but its enforcement and inspection budget was slashed severely.
When Gorsuch was accused by Congress of mismanaging the $1.6 billion Superfund that was intended to pay for toxic waste cleanups, she refused (at the insistence of the Justice Department) to hand over her internal documents, claiming they were protected by “executive privilege”. After an unsuccessful court fight and a contempt of Congress citation, Gorsuch resigned as head of the EPA, then promptly accepted a job as Chair of Reagan’s National Advisory Committee on Oceans and Atmosphere, where she continued to search for ways to remove environmental regulations.
Watt, meanwhile, also resigned after several gaffes (including his remark, while discussing the lack of need for affirmative action, that his staff included “a black, a woman, two Jews and a cripple”). He became a corporate lobbyist to the Department of Housing and Urban Development, and in 1996 he pleaded guilty to charges of influence-peddling at HUD—one of a long string of Reagan officials who would be indicted over the years for corruption.
Another target of deregulation was the Occupational Safety and Health Administration (OSHA), which had responsibility for making and enforcing workplace safety regulations. It found its staff of inspectors and enforcement investigators slashed, existing safety regulations reduced, and the agency’s ability to introduce new safety regulations curtailed. In particular, the Reagan and George HW Bush administrations bitterly fought against “Right to Know” legislation, which would have provided information to workers regarding all the potential hazards of the chemicals used in their workplace. Many workplace safety programs now became “voluntary”.
In 1984, Reagan’s OSHA administrator, Thorne Auchter, left to take a job with the BB Anderson Construction Companies—the same company against which he had dropped a $12,000 fine for OSHA violations.
The deregulation effort that ultimately had the greatest effect, however, was the loosening by the Federal Reserve Board and the Securities and Exchange Commission of the financial regulations governing banks. In 1982, the Reagan Administration won passage of the Garn-St Germain Depository Institutions Act, which allowed certain banks, particularly the Savings and Loans (S&Ls), to expand their activities. The economy at this time was booming, and the financial institutions wanted a bigger piece of the action. Up to now, S&Ls had been conservative institutions that made their living by financing relatively low-risk home and commercial mortgages. The new law allowed S&Ls to put up to one-third of their assets into commercial loans, including credit cards. In 1987, the new head of the Federal Reserve Board, Alan Greenspan, began efforts to undo the Glass-Steagall Act, the New Deal legislation that had responded to the 1929 stock market crash by firmly separating commercial banking from investment banking and preventing commercial banks from placing assets in stock securities. In 1999, the Clinton Administration, which had largely continued the deregulation campaign introduced by Reagan, virtually eliminated the Glass-Steagall regulations with the Gramm-Leach-Bliley Act.
The results of bank deregulation under the Reagan, George HW Bush and Clinton Administrations would be catastrophic. The S&L industry collapsed in 1989, necessitating a $600 billion government bailout. And in 2008 the entire financial system suffered a massive meltdown, leading to the worst economic collapse since the Great Depression.
The theoretical justification for the deregulation campaigns of the 80’s and 90’s was that the free market was self-correcting, and that if it became maladjusted or out of balance, ordinary market forces would, without government intervention, move the economy back to the norm. In reality, though, “deregulation” was simply an effort to set the corporations free to do as they pleased. And when you allow rich and powerful people to do whatever they want, they will inevitably do . . . well . . . whatever they want.
The One-Sided War Against Labor
By the time Reagan took office, the American trade union movement had been docile for decades, and was already declining in numbers and influence. Only a handful of industries (including steel and automobiles) still had large active unions, and it is indeed apparent that one of the goals of Reagan’s anti-union crusade was precisely to help save those particular corporations, who were facing serious competition from foreign companies and who were desperate to cut their expenses, including their payrolls.
The first clear signal that the corporations were back in charge was the 1981 strike of the Professional Air Traffic Controllers Organization (PATCO), the trade union that represented the air traffic controllers who were employed by the Federal Aviation Administration to direct traffic at the nation’s airports.
The recession had hit the air traffic controllers hard. At the same time, air traffic was expanding rapidly, and there were not enough controllers to cover it all, leading to longer hours and overwork. In 1969 and 1970, PATCO carried out a number of “sick-ins” to protest work conditions—since government workers were forbidden by law from striking, the union had all its members call in sick instead, thereby shutting down the operation without striking.
In 1970, however, the postal workers had staged an illegal strike and won, and in 1981, shortly after Reagan took over, PATCO decided to strike too. They were, perhaps, encouraged by the fact that they had endorsed Reagan during the election—Reagan was, after all, the only President who had ever headed a union (the Screen Actors Guild).
Instead of dealing with the PATCO union, however, Reagan declared the strike to be illegal, gave the strikers 48 hours to return to work and, when they refused, fired the more than 11,000 who stayed out, replacing them with supervisors and military controllers until new civilians could be trained. PATCO was decertified and ceased to exist.
Not long afterwards, the National Labor Relations Board embraced the new hostility to unions by announcing that, while Federal labor law prohibited employers from firing employees who went out on strike, it did not prohibit employers from “permanently replacing” them. The NLRB never really explained what the difference was. But the effect was drastic—the number of strikes in the US plummeted by 90%, and by the year 2000 union membership had declined to just ten percent of the US workforce, the lowest percentage of any industrialized nation.
Even the strongest unions now were virtually powerless to fight effectively, while at the same time, the corporations turned more and more to “runaway plants”, relocating entire factories in low-wage havens like Mexico. The history of the next decade was a sad litany of unions “negotiating” one series of give-backs after another in exchange for keeping their jobs. The corporations in turn happily accepted all the give-backs—then moved the plants to Mexico anyway.
It wasn’t long before the top union officials began actively moderating worker demands, doing the corporation’s job for it by preaching “unity” and “common interests” between workers and management to “save our jobs”. When steelworkers at the Wheeling-Pittsburgh Steel Corporation went on strike, they were cajoled by their own union leaders into accepting wage levels $5 an hour below the prevailing industry levels, to prevent the whole plant from running to Mexico.
The best-known example of a workers strike being broken by the unions themselves was the campaign of the United Food and Commercial Workers Union (UFCW) Local P-9 against the Hormel Corporation.
In August 1985, the Hormel meatpacking company demanded a 23% wage cut from the workers at its plant in Austin, Minnesota. The national leadership of the UFCW recommended that the workers accept the cut, but after a vote, the Local went out on strike instead. In January 1986, the company brought in strikebreakers and, when street battles broke out, the Governor sent in the National Guard, then, under public pressure, withdrew them.
The P-9 Local organized a nationwide boycott of Hormel products, and also began something they called a “corporate campaign” to organize pickets at other corporations who were stockholders in Hormel. The corporate campaign generated a lot of press and bad publicity for Hormel, but it did not do the one thing that could hurt the corporation—cost it money by shutting down its production. The company made several offers, all of which included wage cuts, and all of which were rejected by the striking workers.
The UFCW national union, meanwhile, was more and more frustrated with its militant local. In his press statements, union spokesman Lewie Anderson sometimes sounded more like a Hormel executive than a union official, telling National Public Radio that the Hormel workers were being paid too much money and had to take a wage cut to keep the company profitable. UFCW President William Wynn declared that the P-9 local was on a “suicide mission” and again asked the workers to return to their jobs.
In March, the UFCW withdrew its sanction for the strike and cut off strike benefits. When that didn’t work, the union began the process of taking the Local into receivership. In June, P-9’s local officers were replaced by the national union, and a give-back deal was cut with Hormel to end the strike. The union was now a full partner in defending the corporation’s interests.
The corporations soon found another way to entice workers to defend the company’s interests, by targeting Federal environmental regulations like the Endangered Species Act. The most infamous instance of the “jobs vs the environment” strategy was the fight over the spotted owl.
In the Pacific Northwest, logging companies like Louisiana Pacific were making fortunes by clear-cutting the old-growth forests. By 1980, fewer than 10% of the old-growth forests remained, and these were habitat for a number of threatened species, including the northern spotted owl.
In 1986, the US Fish and Wildlife Service, which enforces the Endangered Species Act, proposed protecting the owl, thereby preventing the logging companies from cutting the remaining old-growth stands. For the next decade, war broke out between the environmentalists, the Federal Government, and the logging companies. The corporations declared that the Endangered Species Act regulations would force them to close their sawmills, costing 150,000 jobs and devastating the local economy. Bumper stickers began appearing on workers’ cars that read “Save a job—eat an owl”. The environmentalist groups asked for “sustainable logging”, in which selected trees would be taken at a rate that would allow the forest to continually replenish itself, instead of wholesale clearcutting. A few environmentalists were even able to win the support of logging workers by asking the most basic question of all—what happens to their jobs after the company has cut down all the trees?
In 1990, the spotted owl was listed as a threatened species, and in 1994, after years of lawsuits, the Northwest Forest Plan settled the issue—the remaining old growth was largely protected from clearcutting, and the predicted economic collapse never happened.
But the “jobs vs the environment” fight still goes on. Today, it exists in the form of the “Wise Use” movement, an alliance of logging, mining, and oil companies who oppose environmental regulations (on everything from endangered species to global warming) and demand a “balance” between “protecting jobs” and “protecting the environment”. And the corporations are still able to entice their workers into protecting the company’s financial interests in the name of “saving our jobs”.
The Second Gilded Age
The wave of deregulation and laissez-faire economics that began with the “Reagan Revolution” continued through the administration of George HW Bush, who had been Reagan’s Vice President, and then under the administration of Bill Clinton, whose “Third Way New Democrat” program was largely just a continuation of Reagan-era deregulation. By the mid-80’s, the corporations were firmly in charge once more and entered a new Gilded Age, an unabashed orgy of corporate power that was summed up in one of the most famous slogans of the era–“Greed is good.”
The new Wall Street barons utilized a number of techniques for lining their pockets. One of the favorites was the “leveraged buyout” (LBO), in which an investor would buy up a controlling share of a corporation’s stock in a “hostile takeover”. To finance the stock purchase, the corporate raider would borrow money from a bank, using as collateral the very assets of the company he wanted to buy. Once he gained control, he would proceed to break the corporation into pieces and sell them off, using that money to pay off the borrowed money and pocketing the rest. Some of the pioneers of the leveraged buyout were Warren Buffett of the Berkshire Hathaway investment firm, and Victor Posner of the DWG Corporation, who were already using the tactic in the 1960’s. It wasn’t until the free-wheeling 1980’s, however, that the LBO became a favorite tactic of Wall Street sharks. In 1982, for instance, former US Treasury Secretary William Simon joined a small partnership which bought the Gibson Greeting Card company, a division of RCA, by putting up $1 million of their own money (Simon’s share was $330,000) and borrowing another $79 million against Gibson’s own assets. A year and a half later, they took Gibson public with an IPO (Initial Public Offering), paid off the debt—and Simon’s share of the resulting profit was a cool $66 million. Between 1980 and 1989, there were over 2,000 LBO’s in the US, culminating in the 1989 buyout of RJR Nabisco by the investment firm Kohlberg Kravis Roberts (KKR) for $31.1 billion.
A large number of the LBO’s were at least partially financed by Drexel Burnham Lambert, where Michael Milken made his reputation as the “Junk Bond King”. Junk bonds were low-rated bonds issued by smaller or less financially-stable companies at a high interest rate. Although the potential payoff for a junk bond was high, the risk was also very high, and more often than not the bond-holder was left with nothing but a worthless piece of paper. But Milken, along with the arbitrageur Ivan Boesky, soon began selling enormous amounts of junk bonds, which corporate raiders, especially KKR, used to finance their leveraged buyouts. KKR used Milken’s junk bonds to acquire Beatrice, Safeway Food Stores, Duracell batteries, and the biggest coup of them all, RJR Nabisco.
In November 1986, Ivan Boesky pleaded guilty to a felony count of manipulating securities and, as part of the plea bargain, testified against Milken. Both Milken and the Drexel Burnham Lambert firm pleaded guilty to various financial felonies, and together they paid well over a billion dollars in fines and restitution. Drexel declared bankruptcy and folded in 1990, only to have many of its partners reappear later as the New Street Capital Corporation. Milken was sentenced to ten years in jail, but got out on good behavior in less than two years. In 2007 he was still listed by Forbes Magazine as the 458th wealthiest person in the world.
Another tool that became a corporate favorite, particularly among CEOs, was the stock option. In 1980, the average CEO pay was 42 times as much as the average worker in his own company. By 2001, this had shot up to 411 times as much, largely due to the use of stock options.
The stock option is a form of non-salary compensation in which the CEO is granted the right to buy a number of shares in the company at a fixed “strike” price set by the Board of Directors; after a specified time, the CEO is free to exercise the option, buying the shares at the strike price, selling them at the current market price, and keeping the difference. Originally, the stock option was designed as a way to make CEOs more responsible to the Board of Directors by tying their pay to the performance of the corporation—if the CEO did a good job and the corporation’s stock went up, the CEO would be rewarded, and if he did a bad job and the stock value went down, he lost money. The incestuous management network system, however, in which CEOs of one company sat on the Boards of Directors of several other companies, sabotaged the whole idea—in effect, corporate CEOs were, as a group, now able to set each other’s pay. Not only did Boards begin giving away stock options on generous terms, but if the stock value fell and the CEOs lost money, the Boards were often willing to renegotiate the terms of the stock options to let the CEO off the hook. As a result, CEO pay skyrocketed—whether the company was performing well or not. In 1980, only one-third of CEOs had stock options—by 1997 nearly all of them did, and over the next four years, the average value of options held by individual CEOs went from $32 million to $50 million. In some cases, the value given by stock options allowed the CEO to buy enough stock to gain a controlling share of the corporation.
Squashing the Opposition
Like their predecessors, the new Robber Barons took steps to ensure their continuing political and economic domination.
The “Think Tank”
One of the most powerful weapons adopted by the corporate sector was the “think tank”, which pretends to be an academic or educational institution or foundation, but is in reality a key tool for influencing the political and social debate on policies and issues. Financed by corporations, by wealthy right-wing industrialists like Joseph Coors and Richard Scaife, and by a handful of private foundations, the conservative think tanks serve several purposes for the corporations.
Their most basic purpose is to become what one organizer called a “counter-intelligentsia”, a counterbalance to the research that comes from academic institutes and government departments (most of which does not support many conservative goals).
While the various corporate think tanks pretend to do “scientific and academic research” on various issues, such as global warming, endangered species and economics, in reality they do not do any real research at all. In academia and in scientific research carried out by government agencies (like NASA, the EPA, NOAA or the USFWS), research is done using the standard methods of science, and all findings are peer-reviewed before being published. In the corporate public-policy institutions, however, there are no such methods; research papers are based solely on political points—their aim is not to inform, but to convince. As a result, conservative think tank “research” often reaches conclusions about economics or biology or climate that are not a part of the scientific consensus and would never pass peer review, but which are nevertheless released and treated by the corporations as if they were solid science. The classic example is the “scientific research” published for decades by the tobacco industry, which supported the industry’s claims that smoking is not addictive and does not cause cancer. Like the tobacco industry’s reams of “research”, many corporate studies on issues such as climate change or the social sciences are not science at all, but simply self-serving propaganda designed to advance a particular point of view in which the corporate sponsors have an economic interest. The corporate institutions, of course, never face peer review for any of their studies, nor does any of their “research” ever face any serious evaluation or questioning, since everybody in the institute is there precisely because they already agree with the conclusions these “studies” are meant to reach.
Nevertheless, the corporations are usually able to get their bogus “research” widely disseminated and accepted, because they have massive power with the media and with politicians, and therefore can get their message out far more effectively than most independent researchers, or even government agencies, can (the corporate conservatives, for instance, have their own privately-owned TV news network, in the form of Rupert Murdoch’s Fox News, which is little more than a 24-hour corporate propaganda outlet). This is usually enough to convince the public, which generally has little knowledge or understanding of the underlying issues, that there must be a legitimate “debate” or “disagreement” among researchers, when in fact there is not. In the area of global warming, for instance, the industries that would be economically affected by a serious effort to curb greenhouse gas emissions have managed to convince a large portion of Americans that the science behind climate change is uncertain or even fabricated, despite the fact that the science has been fully reviewed, tested and accepted by virtually all the world’s climate scientists. The effect of all the disinformation is to convince uninformed people to see the issue as “controversial” and “uncertain” and therefore to delay action or deny the need for it—which is of course exactly what the corporations want to accomplish.
The corporate think tanks are also important as stepping stones for future political figures. By churning out prepackaged “policy papers” and “research”, corporate-supported politicians are able to build up the semblance of a publication record and an “expertise” on a particular subject, which often leads later on to a political appointment. The neoconservative Project for a New American Century, for instance, was able to present its members as “defense policy experts”, when in fact almost none of them had ever even been in the military—they were not “military experts”; they were simply ideological evangelists. Yet many of them found positions high in the American military establishment, mostly because they said all the right ideological things that the political leadership wanted to hear. They then demonstrated their “expertise” in Iraq and Afghanistan.
Election Financing
The primary method through which the corporations influence the American political process is, of course, through political donations. While elections are ostensibly based on the principle of “one person, one vote”, the reality has always been more like “one dollar, one vote”. When the US Supreme Court in 2010 embraced the idea that political donations are a form of speech that cannot be constitutionally limited, they simply ratified a situation that already existed on the ground. In the real world, the corporations had already dominated both political parties for decades.
The amount of money it takes to run for political office in the US is truly massive, and grows steadily with each election. In 1990, a campaign for the US House of Representatives cost a winning candidate on average $407,500. By 2000, the combined totals spent by both candidates usually exceeded $2 million, and some races reached a combined total of $7-8 million. By 2008, it was not unusual for the average combined expenses of both candidates to top $5 million, and some races managed to double that. By 2000, candidates for Senate routinely spent a combined total of over $12 million; in 2008 $16 million was not uncommon, and the biggest spenders topped combined totals of $40 million. In 1996, both Presidential candidates spent a combined total of $239.9 million—by 2008, the combined total was slightly over $1 billion, and Obama’s winning Presidential campaign all by itself cost roughly $730 million—three times as much as both candidates combined had spent twelve years earlier.
It is little wonder, then, that elected officials can spend as much as a third of their time in office simply raising enough money to enable them to run for re-election.
In the US, political campaign money is divided into two categories. “Hard money” refers to donations that are given directly to a candidate who is running for office. In the United States, corporations and other organizations are not allowed to make direct contributions to candidates, but they are allowed to set up “Political Action Committees”, or PACs, which are then allowed to raise money from their individual members and make campaign contributions on their behalf. Independent PACs may also be established by a group of individuals who can then seek contributions from individual members of the public.
“Soft money” refers to donations that go to the political party itself rather than to any particular candidate. This money was intended to be used for “party-building” activities such as research or funding local party organizations. Soft money is often donated to political parties by “527 groups”, which are fundraising organizations set up under section 527 of the tax code. One of the most effective uses of soft money is “issue advertising”, which can publicize the party’s positions on issues but cannot endorse a particular candidate. Many 527s produce and run their own issue-oriented ads. Such ads often skirt the very edge of legality—they may say, for instance, “Candidate A stands up for our issues, but Candidate B does not”, and as long as they do not specifically say “Vote for Candidate A” or “Vote against Candidate B”, they stay within the law.
Campaigns must spend massive amounts on political advertising, since in a national or statewide race, this is the only way that they can reach large numbers of voters with their message. To raise such huge amounts of money, of course, candidates must go where the money is—to the corporations. In 2008, according to the nonpartisan Center for Responsive Politics, 70.8 percent of all campaign contributions came from corporations, while 2.7 percent came from labor unions; and 69.5 percent of PAC contributions came from corporations, with 15.7 percent from unions. Corporate money, in short, dominates the electoral process.
It is a popular myth that the corporations consistently favor the Republican Party (the traditional supporter of big business interests) over the Democratic Party (which is viewed as more populist), but in reality this is not true—the corporations are in fact ruthlessly nonpartisan. The history of corporate contributions shows that they have no loyalty to either party—they simply tend to support whichever party happens to be in power and is in a position to give them what they want. In 2006, for instance, when the Republicans were in power, the health care industry gave 62% of its support to Republicans; two years later in 2008, when the Democrats were on the rise, health care industry contributions to Democratic candidates shot up to 53% of the total. Contributions by lobbyists, who make it their business to stay close to whoever is in power, jumped with ease from party to party; in 1994 the Democrats got 74% of lobbyist contributions, in 2006 the Republicans got 56%, and in 2008 the situation had reversed again, with Democrats getting 56%.
Many corporations, of course, simply play it safe by giving cash to both sides. In the period 2009-2010, commercial banks ranked thirteenth on the list of Congressional Republican contributions, and at the same time ranked twentieth on the Democratic list. Pharmaceutical interests ranked tenth for Republicans and fourteenth for Democrats; investment banks and securities corporations ranked number six for Republicans and number three for Democrats. Other industries that appeared in the top 20 contributors to both parties were real estate companies, law firms, lobbyists, electrical utilities, and entertainment corporations.
Increasingly, even the lobbyists’ own firms have become bipartisan conglomerates, allowing their corporate clients to gain simultaneous access to both party leaderships through the same lobbyist—a sort of one-stop shopping for corporate political influence. Democrat Jack Quinn, for instance, once Chief of Staff to Al Gore and Counsel for Bill Clinton, formed the Quinn Gillespie and Associates lobbying firm in January 2000 with Republican Ed Gillespie, former Chairman of the Republican National Committee. The firm has been among the top ten richest lobbying companies every year since 2003. And in 2008, former Senators John Breaux (Democrat) and Trent Lott (Republican) formed the Breaux-Lott Leadership Group as a “bipartisan” lobbying firm.
The corporations, of course, care far more for their own economic interests than they do for partisan politics. Their basic mode of operation is to identify those political figures who can be most influential in the areas that matter to them, and then flood those politicians with money. A good example of this highly targeted corporate influence is the recent health care reform battle, in which the Democratic Party (which had total control of the White House and both chambers of Congress) managed to take a proposal (public-option health care) which had two-thirds of the public behind it, and turn it into a giveaway for the health insurance industry.
The health care proposal was of huge financial interest for the health insurance companies, and they poured a ton of money into ensuring that they got their own way. The 20 largest insurance and pharmaceutical companies spent over $35 million on lobbying in just the first three months of 2009, a 41% increase. The Pfizer Corporation alone spent $6.1 million in 2009, and the Merck Corporation spent $1.5 million. The industry’s trade group, the Pharmaceutical Research and Manufacturers of America, spent $6.9 million.
Most of the corporate effort was focused on a handful of legislators. In the House of Representatives, the Chairman of the Ways and Means Committee, Democrat Charles Rangel, received $1.6 million in donations from the health industry in two years, and the committee’s senior Republican, Dave Camp, got $1 million. In the Senate, where the real fight would be, Finance Committee Chairman Max Baucus got $1.5 million (making a total of $3 million since 2003). Committee member Ben Nelson received $2 million during his last three re-election campaigns, placing him fourth on the list of largest health-care contribution recipients (behind John McCain, John Kerry and Chris Dodd). Mary Landrieu, another member of the Finance Committee, had received $102,000 since 2007. And Democrat-turned-independent Senator Joe Lieberman had received $2.1 million since 1989. All of these Senators played key roles in killing the public option, which the corporations opposed, and substituting mandated private insurance, which would deliver 30 million new paying customers to the insurance companies.
It is no wonder, then, that Americans view corporate campaign donations as little more than legalized bribery.
Attempts have been made to remove the overpowering influence that corporate money has on the political system. In 1907, the Tillman Act outlawed direct campaign contributions to candidates by corporations. In 1974, in the wake of the Watergate scandals, amendments to the Federal Election Campaign Act spelled out a number of disclosure requirements and contribution limits, and established the Federal Election Commission (FEC) to enforce them.
The 1974 act was taken to court on a First Amendment challenge, and in the 1976 Buckley v Valeo case, the Supreme Court upheld most of the law, but, in its decision, introduced a line of reasoning that would have a huge impact later. The Court accepted the argument that in the political process it is necessary to spend money to gain access to the media and that, therefore, limiting corporate contributions was a restriction on the First Amendment right of free speech. However, while acknowledging (for the first time) that corporations were entitled to First Amendment protection under the theory that “money equals speech”, the Court then ruled that the state had an over-riding interest that negated that particular free speech right. The right of free speech is not, after all, absolute—it can be over-ridden if there is a compelling public interest in doing so. A person’s free-speech right to yell “fire” in a crowded theater, or to make false reports to the police, for instance, is over-ridden by the compelling public interest in a functional public safety system. In the case of elections, the Court concluded that the corporate right of free speech was over-ridden by the public’s compelling interest in having elections that were free from “corruption or the appearance of corruption”, and therefore corporate speech could properly be limited by electoral laws.
The Court also ruled that some restrictions could only apply to direct campaigning speech, but not apply to speech that was issue-oriented rather than candidate-oriented. It was this distinction that led to the establishment of “soft money” as distinct from “hard money”.
In 2002, the Bipartisan Campaign Reform Act (known as the McCain-Feingold Act) was passed, which regulated the ways that soft money could be used. Once again, the corporations sued, and in 2003 the Supreme Court issued its ruling in McConnell v FEC, which echoed the earlier Buckley case—the corporations had the First Amendment right to free speech, but the state’s interest in fair elections over-rode that right.
One of the restrictions imposed by the McCain-Feingold Act was a ban on third-party “electioneering communications” within 30 days of a primary or 60 days of a general election. This restriction became the subject of a lawsuit, and in 2010, in the Citizens United v FEC case, the Supreme Court reversed a portion of its earlier opinions, and now ruled that a corporation’s First Amendment right of free speech (through political contributions) was not, under certain circumstances, over-ridden by any compelling public interest in elections, and therefore could not be legitimately restricted by the FEC. Although the Court upheld the ban on direct corporate contributions to candidates, it ruled that certain soft money restrictions were invalid.
The real-world effect of the Citizens United case was far less than its public-relations effect. It resulted in a flurry of public outrage, most of it based on the mistaken view that corporations had received a new right to “spend as much as they want on political candidates” (it was only a small number of soft-money restrictions that were lifted), that the case had “turned corporations into persons” (that had already been done by the Santa Clara case in 1886), and that the case had “given corporations First Amendment rights” (that had already been done by the Buckley case in 1976). In the real world, all the decision did in effect was legitimize what was already reality—corporate money already dominated the entire political process, including both parties, and corporations already could, by one means or another, spend as much as they wanted to win political favor. By forcing everyone to openly acknowledge that fact, however, the corporations may have awakened their own worst enemy—the public. As Sen. John McCain, one of the co-authors of the campaign finance bill, noted, “I think there’s going to be, over time, a backlash.”
By the end of the 1980’s and into the 21st century, however, the corporations were back on top, and once again dominated the US social and political system.
But the rest of the world was now ready to fight back.