“They only call it ‘class warfare’ when we fight back.”
--Anon.
“There is class warfare in America, and my class is winning.”
--Warren Buffett
America likes to think of itself as a “classless society.” One of our founding myths was that America was created so that all men would be free and equal before the law. (Women, of course, were a different matter. As were the slaves, Native Americans and those still working as indentured servants.) The way the story is told today, our country started out as and continues to be a place of unbounded opportunity where any person, by sheer grit and talent and hard work, can pull himself up by his own bootstraps and amass a fortune. For very similar reasons, we still tell our children that – in this country – “anybody can grow up to be President.”
It is a comforting story, but it is also a lot of nonsense. And it is a fairly recent invention, one that I think was created for the express purpose of papering over the real economic differences between U.S. citizens.
Indeed, while it may be one of our “founding myths” it is not a myth that has been around since our founding. Throughout the 18th and 19th centuries every American citizen would have been keenly aware that one’s status – and legal rights – turned largely on how much wealth one had. In many – perhaps all – of the original states, even the right to vote, the most fundamental right in a democracy, was limited to those men who owned a certain minimum amount of land or had a certain minimum annual income. The poor were explicitly and literally disenfranchised.
And this state of affairs, in which rights welled up not from innate ability or merit but from sheer wealth, continued well into the 20th century. Nevertheless, the early part of the last century saw great strides made by those who fought passionately on behalf of American laborers, and who recognized that the interests of laborers and employers did not coincide. This progress was made despite the fact that class conflicts often turned bloody and violent, as employers (very often with the help of the public authorities) took action to force laborers to work under terrible conditions for little pay. But the story of this progress is mostly swept under the rug and, less than 100 years later, is largely forgotten.
The Ludlow Massacre
One story about America that is never taught in our public schools is the Ludlow Massacre, which took place in 1914. It arose out of a dispute between coal miners and their employers. Essentially, the miners (who were working under conditions that resulted in about twice the number of deaths as other, non-unionized mines) made demands for increased mine safety, better pay, and 8-hour working days. When their employers refused these demands, the miners went on strike. They were promptly evicted from their homes – Ludlow was a “company town” – and so they and their families moved into a “tent city” that had been created on land leased by the United Mine Workers of America (“UMWA”) in anticipation of the strike.
The mine owners hired a detective agency known for taking aggressive action against striking workers. “Aggressive action,” in this case, included randomly shooting into the miners’ tent city. The miners took to digging trenches under their tents in which they could hide when the bullets started flying. Eventually, the Colorado National Guard was called in to quell the violence, but by the spring of 1914 Colorado had run out of money to maintain the Guard and most were recalled.
The Guard, under the command of a man named John Chase, essentially sided with management, and ultimately decided that the tent colony should be destroyed. To that end, a machine gun was installed on a ridge where it could shoot into the tent city, and it opened fire. Most of the miners and their families were able to escape: a passing freight train stopped in the line of fire and allowed them to board and ride away to safety. But 19 people were killed, including 4 women and 11 children, either by gunfire or by asphyxiation; the tent city was set ablaze, and those who were unable to get away suffocated while lying in the trenches they had dug to avoid the bullets.
The massacre so inflamed the country that much of Colorado descended into full-scale guerrilla war. Workers from neighboring states took to arming themselves and catching trains to the fighting, to take part. A coalition consisting of the Colorado National Guard, private mine guards, and hastily created “militia units” fought against the workers. Eventually, Woodrow Wilson was forced to send in Federal troops to disarm both sides.
It was the bloodiest labor conflict in United States history, and almost nobody who grew up after WWII and got a public education has ever heard of it.
The Battle of Blair Mountain
Something very similar would occur only seven years later, in West Virginia. For decades, coal operators – who were de facto feudal lords in the area – had used the local law enforcement agencies and private “detectives” – in particular, the Baldwin-Felts agency, which was the same agency that had been hired by the Colorado mine operators in Ludlow – to prevent coal miners from unionizing. (Similar to what happened in Ludlow, in 1913 the Baldwin-Felts agency and a Kanawha County, WV mine operator drove an armored train called “The Bull Moose Special” through a tent colony of striking workers; while doing so, they opened fire on the striking miners and their families with machine guns and high-powered rifles.)
Despite these efforts, by 1920 most of West Virginia’s coal mines – but not the southern coalfields – had been unionized. Accordingly, the United Mine Workers of America decided to attempt unionization in Mingo County, WV. Cabell Testerman was the mayor of the town of Matewan in Mingo County, and was sympathetic to the miners’ cause; so was his chief of police, a man named Sid Hatfield.
Once again, striking miners were evicted from their company-owned homes, and once again a tent city was set up by the UMWA. Once again, Baldwin-Felts agents attempted to mount machine guns, this time on the roofs of buildings in Matewan, and offered Mayor Testerman a $500 bribe to allow them to do so, which Testerman refused.
A firefight that came to be known as the “Matewan Massacre” broke out on May 19, 1920. Essentially, Albert and Lee Felts (brothers of the agency’s founder) and some other men went with guns to violently evict more miners’ families from their homes. Word spread to Sid Hatfield, who waited (along with a number of deputized miners) for these men to return to the train station. When they confronted each other, Chief Hatfield stated that they were all under arrest, to which Albert Felts replied that he had a warrant for Hatfield’s arrest. Mayor Testerman was alerted to the situation, and came out to review the warrant. When he exclaimed, “This is a bogus warrant,” a firefight broke out. Mayor Testerman was killed, as were Albert and Lee Felts and 8 other men.
Hatfield was tried for the killing of Albert Felts in January 1921, but he and his co-defendants were acquitted on all counts. However, on August 1st, 1921, when Hatfield had to appear in McDowell County to face other charges, he and his friend Ed Chambers were attacked by Baldwin-Felts agents standing at the top of the courthouse stairs. Hatfield was killed immediately. Chambers was shot and rolled down the stairs. Then a Baldwin-Felts agent walked down and put two bullets in the back of his skull.
The killing of Hatfield enraged the miners, who began to pour out of the mountains and talk about militarizing. On August 7th a union meeting was held in Charleston, and a list of demands was presented to the WV governor, who promptly rejected them. The miners then started to talk about marching to Mingo County to free some jailed miners, end the martial law that had been imposed there, and organize the county. However, doing so would entail marching through notoriously anti-union Logan County, and many labor leaders (including Mary Harris “Mother” Jones, once called “the most dangerous woman in America” for her ability to unionize coal miners) urged restraint.
Nevertheless, by August 16th, 13,000 men had gathered in Kanawha County to begin marching toward Logan. Armed men began gathering at Lens Creek Mountain, not far away, and miners near St. Albans commandeered a freight train that they renamed the “Blue Steel Special” to meet up with another advance column of marchers at Danville. While this was going on, Sheriff Don Chafin of Logan County began setting up defenses on Blair Mountain. He was supported by the Logan County Coal Operators’ Association, and had a private armed force of about 2,000 men.
Skirmishes between the two forces began on August 25th. However, a meeting in Madison, WV seemed to resolve the issue and the miners agreed to return home. But Sheriff Chafin was not to be denied his chance to use his private army, and his forces began deliberately shooting union sympathizers in the town of Sharples. When this news reached the miners, they turned back and again began marching on Blair Mountain, joining battle with Chafin’s forces on August 29th. During this conflict, the United States Army sent planes up to conduct aerial surveillance of the miners’ position, and then private planes were used to drop gas and explosive bombs on the miners. Sporadic gun battles between the miners and Chafin’s private army continued for about a week, until federal troops arrived to support Chafin and the mine owners. Faced with the threat of taking on the United States military, the union workers stopped fighting and went home.
It was the largest armed insurrection in United States history other than the Civil War, the United States military took up arms against United States citizens, and almost nobody who grew up after WWII and got a public education has ever heard of it.
The Bonus Army
In 1924, Congress authorized a bonus to be paid to veterans of WWI based on the number of days the veteran served. The veterans were given interest-bearing certificates that could not be redeemed until 1945.
However, by 1932 the country was three years deep into the Great Depression, and the WWI veterans began agitating for the immediate redemption of their promised bonuses. While there was some Congressional support for the immediate redemption of these bonuses, President Herbert Hoover and Congressional Republicans refused to do so because it would have required the government to increase taxes to pay the veterans’ bonuses. Hoover and the Republicans were determined not to take any action to alleviate the Depression, but to allow the market to “work things out” for itself.
By the summer of 1932, nearly 43,000 people had gathered to march on the capital and demand the payment of their WWI bonuses (the marchers included not only the WWI veterans themselves, but also their families and other sympathizers). But it wasn’t simply a march; in an attempt to embarrass the government into paying their bonuses, the marchers created a shanty town named “Hooverville” across the Anacostia River from Washington. Although a shantytown, Hooverville had streets and sanitation facilities, and it was tightly controlled by the veterans.
Although they had taken no violent action, and in fact their protest consisted of nothing more than existing and holding daily parades, on July 28th the U.S. Attorney General ordered the police to remove the Bonus Army (as it was called) from Hooverville. The veterans did leave, but then immediately returned. When they did they found two policemen cornered on the second floor of a building. The policemen opened fire on the returning veterans, killing two of them.
Because the police had killed two of the protesting veterans, President Hoover decided to order the United States Army to remove the veterans from Hooverville by force. Accordingly, General Douglas MacArthur (remember that guy with the corncob pipe, back in Korea? Same guy) brought out an infantry regiment, a cavalry regiment, and six battle tanks (commanded by George S. Patton; yes, that Patton) to take out the protesting WWI veterans.
The cavalry charged the veterans, and then the infantry forces marched on them with fixed bayonets and tear gas, gassing our own citizens. They evicted the veterans and their families, who fled, and President Hoover called off the assault. However, MacArthur was sure that the Bonus Army was really a Communist attempt to overthrow the government – despite the lack of any evidence to support this theory – and ordered a new attack anyway. In all, 55 veterans were wounded, a veteran’s wife miscarried, and a 12-week-old baby died in the hospital after being caught in the tear gas attack.
Once again, an example of the United States military taking up arms against its own citizens, gassing its own citizens, who were only exercising their right to peaceably assemble and protest. The military did so on the orders of the United States President just to save him from embarrassment, and almost nobody who grew up after WWII and got a public education has ever heard of it. (In fact, a close friend of mine whom I consider to be very intelligent and extremely well-educated called me up about this incident in American history only a year ago. He had just heard of it and was dumbfounded. “Why haven’t I heard about this before?” he wondered. “Because,” I explained to him, “these facts don’t fit the narrative we now like to tell ourselves.”)
* * *
What each of these three forgotten chapters in American history has to tell us is clear: America is not exceptional, at least not when it comes to what those in power are willing to do to ensure that their power is not taken away, or even what they are willing to do to make sure that their power is not curtailed in any way. If that means opening up on striking workers, women and children with machine guns in order to maintain as high a profit as possible, so be it; the government will even help you. And if that means attacking U.S. military veterans and their families with cavalry, infantry, tanks and tear gas to save you from political embarrassment – well, what the hell, go right ahead. You are the government.
This is how the word Communism became such a filthy word in the American language. Remember, Communism didn’t start out synonymous with the Soviet Union and the worst type of totalitarianism. Communism didn’t even really start out as a political philosophy; originally concerned only with economics, Communists saw politics merely as a tool by which the promised economic utopia of Communism could be achieved.
Now, don’t get me wrong. I’m not advocating on behalf of Communism, and I do think there were a lot of shortcomings in its philosophy – shortcomings that I would argue almost necessarily lead to the totalitarian states we see wherever Communism is still considered the organizing principle of the state (China, Cuba, etc.). But it is important to realize that at the time Communism was sweeping across Europe and America, i.e., the first three decades of the 20th Century, it wasn’t perceived that way. It was perceived to be a very rational means by which the great mass of humanity might achieve a little bit more economic parity, and with that a little bit more personal autonomy, a little bit more freedom.
And this is why it was considered such a threat to our way of life in the United States. Because if the common man’s personal power increases, then necessarily the personal power of the elite few who already wield power will have to decrease. What is power, after all, but the ability to do as you wish and to make others do as you wish? If laborers can have a say in their working conditions, in their rates of pay, in the hours worked . . . well, then that means the employers can no longer establish these things by fiat. That is a direct decrease in the employer’s power.
And if all of these things that laborers are demanding are achieved, this necessarily will result in greater costs to the employer. Greater costs means less profit, because not all of those additional costs can be passed on to the customer. And less profit means less money, and less money means less power.
Remember the Medici family motto in 16th Century Florence? “Money to get power, and power to guard the money.” That is precisely how the mine operators and other employers saw things when dealing with the labor movement sweeping the country in the early part of the last century. Enough money had been amassed to acquire political power, and that political power was sufficient to bring violent force to bear when the source of the money was threatened. Nothing ever really changes.
* * *
But it seemed to, for a while. The Great Depression was sufficiently calamitous that a large enough number of people became convinced that America’s previous way of doing things was not, y’know, infallible. By the time of the Army’s attack on Hooverville, the country had been wallowing in the Depression for three long years and things did not look to be improving much when the economy was left to its own devices. Enter FDR and the embrace of Keynesian Economics.
In stark contrast to Hoover’s economic austerity policy, FDR decided that the way to get the country moving again was for the government to run up deficits and start spending money putting people back to work. And beginning in 1933, when he finally took office, he did just that. Vast public projects and public workforces were put together, huge investments were made not only in practical infrastructure (roads, dams, bridges, etc.) but also in things like artwork, murals, national parks, anything and everything that would put people back to work, jump start the economy and get the country moving again.
And it worked, at least for a little while. By 1937, unemployment was still at about 14%, but that was much much better than the 25% unemployment Roosevelt had inherited. With the economy looking to be on the upswing, Roosevelt was persuaded by those economic advisers worried about the increasing U.S. debt to enact tough spending cuts to balance the budget. He did, and the economy immediately began spluttering to a halt. Unemployment jumped by nearly 33% almost immediately, and manufacturing dropped significantly as the country once again limped into a recession.
It would take the largest federal spending program the country had ever seen to finally pull us out of our economic malaise: World War II.
And let’s be clear about that, too. World War II jumpstarted the economy not because it was a war but because it forced the government to spend the vast amounts of money necessary to get the country moving again. The political problem with Keynesian fiscal policy is that it always looks like “out of control” spending. People tend to worry when it looks like the government is taking on too much debt; there is even a whiff of immorality about it (as if, by the same reasoning, a business availing itself of a line of credit were somehow immoral).
But people don’t bother too much about that stuff when it looks like the alternative is utter destruction by evil forces. All in all, people understood why we were fighting World War II, they understood the necessity for it, and they were willing to go all in to win the fight. That meant racking up enormous government debt to pay for the new military (prior to WWII, it should be remembered, the United States didn’t really maintain a large standing military force), to pay for manufacturing munitions, to pay for fuel and transports, and new planes, ships, jeeps. It meant that women – half of the nation’s potential labor force, a huge asset that until then had been underutilized – were now welcome in the work force. And to support this spending it meant paying new taxes, too.
(Perhaps a war of this type, of this moral certitude, is the only thing that can convince enough Americans to pay the taxes and rack up the debt necessary to climb out of an economic hole as vast as the Great Depression was, but that doesn’t mean that “war spending” is somehow magical. If World War II hadn’t presented itself, the same effect could have been achieved by having the government invest the same resources to totally overhaul our infrastructure. The only problem is that overhauling infrastructure isn’t nearly as sexy as going to war, and it almost certainly does not command the kind of passion that makes most of us disregard tax increases and massive government deficit spending. Accordingly, while this alternative spending program would have worked, it would never have attained sufficient popular support to be enacted.)
* * *
Of course, there were additional benefits gained from the successful conclusion of World War II. First, much of that “spending” the government did wasn’t really “spending” at all – at least not from an economist’s point of view. A great deal of it could more properly be classified as “investment.” Gearing up for and winning World War II necessarily meant a huge investment in America’s manufacturing capabilities. That meant that when the War was over, we had the means to produce a whole lot of goods. Coincidentally, we had a whole lot of customers.
With the exception of Pearl Harbor and some U-Boat activity off of the U.S. coast, America was spared the destruction that Europe and Japan endured during the War. All of these war-ravaged countries needed to rebuild, but few had the means to do so quickly. This meant that America’s newly expanded manufacturing industry had a steady demand for its goods.
It also meant that this manufacturing industry needed a large work force and, now that the War was over, it had one. A work force that was largely unionized, which meant that employees were bringing home decent paychecks and good benefits, and amassing retirement pensions that they could count upon in their later years. So immediately following World War II, Americans were for the most part fully employed, and employed with good paying jobs that could support the emerging middle class.
Oh, and one other thing came out of World War II that was immediately to the American people’s benefit. The GI Bill. This bill provided a college education for nearly 8 million returning veterans, it provided a year’s worth of unemployment benefits (which were largely unnecessary), and it made home loans available to returning veterans. Significantly, one of the motives for enacting the bill was to prevent another Bonus Army from being formed (so perhaps something good did come out of the Army’s attack on Hooverville thirteen years earlier). Making it possible for millions of Americans to obtain higher education, better jobs, and home ownership helped to cement the emergence of the new middle class.
And the middle class was new. Prior to World War II, there wasn’t really such a thing as the middle class, not as we conceive of it today. Generally speaking, there had existed in America a small number of wealthy people, a large number of poor people, and a small number of slightly less poor people who tended to own small shops or restaurants, or to provide professional services. The “middle class” was not a term much in use.
But America’s emergence as an economic powerhouse, coupled with a strong union presence to ensure fair wages for laborers, a progressive income tax that heavily taxed the most stratospheric incomes, the estate tax that prevented (largely untaxed) wealth from being passed from generation to generation, the provision of higher education and home loans, and the social safety net that had been created as part of the New Deal, all worked to flatten the income disparities that the country historically had been saddled with. There were still wealthy people, but the measure of truly obscene wealth was scaled back. And while there were still a great many poor people, their numbers were also reduced. The number of elderly living in poverty was cut almost in half, and by and large America began to resemble the America we wish we still lived in today: an America largely populated by people who are economically secure, if not in fact wealthy.
But it wouldn’t last.
* * *
By the time the late 1960s and the 1970s rolled around, America had produced its first generation in which a large percentage had grown up neither knowing nor fearing serious material want. And with this generation arose a new focus on social – not economic – justice. The 1960s, of course, are famous for the Civil Rights movement, and the late 1960s through the 1970s are famous for the Women’s Liberation movement. The Stonewall Riots that took place in 1969 were perhaps the first emergence of what is now recognized as the Gay Rights movement. In all, this period brought an emerging focus on expanding social enfranchisement to those members of our society who perhaps had been seen as “equal but not as equal as straight, white men.” And, y’know, it’s hard to fault the people who fought that fight in the 60s and 70s for focusing on social rights.
But as ably demonstrated in Rick Perlstein’s Nixonland, this shifting of priorities from economic to social interests had some rather severe and unanticipated repercussions, many of which are still reverberating today.
First, it must be remembered that in addition to all of the clamor for equal civil rights for all people, there was a large and growing mobilization against the Vietnam War. The same people who protested the war on moral grounds were also those most likely to be protesting against discrimination against others on moral grounds. But in protesting the war, these people were also protesting against the government – the same government that through its policies had created the middle class, had provided the GI Bill, had made possible the type of economic security for which the older generation, watching, felt the younger generation, protesting, should be grateful. This set the stage for an enormous popular backlash against these “young punks.”
Second, the fact that the Vietnam War was a Democratic war when it started meant that most of this protest was initially directed against the Democratic Party. However, it was also the Democratic Party, under LBJ, that got the Civil Rights Act passed. Which meant that this was largely an internal protest pitting those already aligned with the Democratic Party against the party’s own leadership.
Third, the Democratic Party had been known for decades as the party most in touch with and reliant upon unions and the labor movement, both for financial support and for mobilizing its members to get out the vote. Which meant that the Democratic leadership, against which these new members were protesting, was aligned with union leadership. Which meant that those protesting against the leadership were also protesting against the union machines that produced that leadership. Which meant that a wedge was being driven slowly, surely, unintentionally but inevitably between the labor movement and the political party that backed the labor movement. This was the beginning of the “Democrats in Disarray” storyline that has now been a perennial favorite among the lazy Washington media for decades.
Of course, Richard Nixon would famously ride the crest of populist backlash against these protests and plow through the gap that had erupted within the Democratic Party to easily win election in 1968, and he would repeat the trick in 1972. But it was the weakening of ties between the Democratic Party and the labor unions that began in the late 60s that would have the longest lasting effect on America.
* * *
This weakening arguably dealt private sector unions a mortal blow in 1978. During the early 1970s, corporate management had begun to recognize the need to organize itself and to lobby Washington directly for what it wanted. In 1972 the National Association of Manufacturers moved its main offices from New York – the national center of business and industry – to Washington, DC, announcing: “The interrelationship of business with business is no longer so important as the interrelationship of business with government.” Similarly, future Supreme Court Justice Lewis Powell circulated a memo in 1971 advising that: “Business must learn the lesson . . . that political power is necessary; that such power must be assiduously cultivated; and that when necessary, it must be used aggressively and with determination.” (emphasis added, in both cases).
The business community’s willingness to lobby Congress in pursuit of its own ends is reflected in the numbers. As Jacob Hacker and Paul Pierson point out in their Winner-Take-All Politics, in 1971, only 175 firms had registered lobbyists in DC; by 1982, 2,500 did. The number of corporations with public affairs offices in DC went from about 100 in 1968 to over 500 in 1978. The Chamber of Commerce’s membership doubled during the 1970s; so did membership in the National Federation of Independent Business. And in 1977, the March Group included 113 of the Forbes Top 200 companies – nearly one-half of the nation’s economy.
When Jimmy Carter assumed the Presidency in 1977, Democrats – long the party of the workers – controlled both houses of Congress and the White House. Organized labor no doubt thought the time was auspicious to introduce some much needed legislative reform. However, organized business now had become a powerful matching force, and as noted above, labor no longer was as tightly entwined with the Democratic Party as it had been. Labor was in for a rude shock.
* * *
The National Labor Relations Board is responsible for reviewing claims that companies have acted illegally to prevent unionization (you know, like shooting workers with machine guns). The NLRB is also responsible for setting penalties for companies found to have violated federal labor laws. But by 1978 the NLRB increasingly had been taking longer and longer to resolve such claims and, when companies were found liable, the penalties imposed had become increasingly light. This had opened the door for management to continue to engage in illegal, anti-union activity, eventually paying (maybe) a (small) penalty that could be considered just “another cost of doing business.” Accordingly, a labor bill was introduced that would have streamlined NLRB procedures and accelerated both the Board’s decision-making and the penalties it imposed. The bill passed the House in October, 1977.
However, in the Senate the bill faced a filibuster. For nearly five weeks, the labor bill was the only piece of legislation that the Senate considered. Nevertheless, and despite the Democrats holding sixty-one seats, the most votes the bill ever got in favor of cloture was fifty-eight. As Hacker and Pierson put it, “On June 22, after a sixth failure [to achieve cloture], [Sen. Robert] Byrd surrendered. The Senate voted to recommit the bill to the Committee on Human Resources. It would never return.”
One of the many people who understood what this meant now for American workers, and the American middle class, was United Auto Workers leader Douglas Fraser, who served on President Carter’s Labor-Management Group. Shortly after the labor bill was scuttled, Fraser resigned from that organization, explaining:
I believe leaders of the business community, with few exceptions, have chosen to wage a one-sided class war . . . against working people . . . and even many in the middle class of our society. The leaders of industry, commerce and finance in the United States have broken and discarded the fragile, unwritten compact previously existing during a past period of growth and progress . . . . The latest breakdown in our relationship is also perhaps the most serious. The fight waged by the business community against that Labor Law Reform bill stands as the most vicious, unfair attack upon the labor movement in more than 30 years . . . . It became an extremely moderate, fair piece of legislation that only corporate outlaws would have had need to fear. . . . At virtually every level, I discern a demand by business for docile government and unrestrained corporate individualism. Where industry once yearned for subservient unions, it now wants no unions at all . . . . Our tax laws are a scandal, yet corporate America wants even wider inequities. . . . The wealthy seek not to close loopholes, but to widen them by advocating the capital gains tax rollback that will bring them a huge bonanza . . . . For all these reasons, I have concluded that there is no point to continue sitting down at Labor-Management Group meetings and philosophizing about the future of the country and the world . . . . I cannot sit there seeking unity with the leaders of American industry, while they try to destroy us and ruin the lives of the people I represent.
(emphasis added.)
In many respects, Fraser’s resignation letter would prove to be prophetic.
* * *
Over the course of the last 30 years, there has been a marked division between the ‘have-mores’ and the ‘have-nots.’ Note that I did not say “between the ‘haves’ and the ‘have-nots.’” Increasingly, the “haves” are becoming less and less distinguishable from the “have-nots.” It hasn’t fully settled in yet, and it will take some decades more, but the very upper, upper-most among us are surely pulling away. Millionaires may be considered the “haves,” but they will not be among the anointed “have-mores” that the modern American economy is designed to serve. Increasingly, they are being shunted into the “have-nots” category.
And this is because, increasingly, the country is being run for the benefit only of the very upper, upper class of people. Of course, it is considered impolite to say this in public, and so you might not believe me when I make this statement. But (to quote The West Wing’s Jed Bartlet) if you want to convince me of something, show me the Numbers. Well . . . here’s some Numbers:
Number the First
In 1976, the richest 1% of Americans took home about 9% of the entire American income.
In 2010, the richest 1% of Americans took home about 24% of the entire American income.
Clearly, while the American Pie is getting bigger, a bigger and bigger slice of that pie is going to a small number of people.
(But . . . Hey! At least they’re kicking in their bit too, right? At least they’re paying into the system that is making them so rich! Right . . . . .?)
Number the Second
Here’s another Number, courtesy of the Commonweal Institute. As of 2007, the combined net worth of the 400 wealthiest Americans was $1.5 trillion. The combined net worth of the poorest 50% of the nation – some 155 million people – was only $1.6 trillion. 400 people in America are nearly as wealthy as 155,000,000 other U.S. citizens combined.
(But . . . you know. I keep hearing about how the wealthy are overtaxed. So, sure, I mean . . . they’ve got all the money, but at least they’re kicking it back to America, right? The country that gave them the opportunity to get rich in the first place?)
Number the Third
Here’s another Number. As of 2007, the top 1% of households owned 34.6% of all privately held wealth; the next 19% owned 50.5%. Which means, of course, that the top 20% of the American people owned 85% of all the wealth in the country, leaving only 15% of the country’s wealth to be scrabbled over by the bottom 80%, the wage and salary workers.
(Yeah, yeah . . . okay, I get it. The rich are getting very, very rich. And everybody else is being left behind. Still . . . the rich pay all the taxes. Surely that is sufficient, right? I mean, they’re not just taking from the rest of us are they? Right?)
Number the Fourth
(1) In 1979, the overall tax rate (payroll taxes, state taxes, sales taxes, federal income taxes, property taxes) for the average schmoe was 22.2% of income. By 2007, that had fallen to only 20.4% of income.
An 8% decrease in taxes for the average schmoe! Yay!
(2) In 1979, the overall tax rate (payroll taxes, state taxes, sales taxes, federal income taxes, property taxes) for the top 1% of households was 37.0%. By 2007, that had fallen to only 29.5% of income.
A 20% decrease in taxes for the top 1%! Yay! (Wait. What?)
(3) In 1979, the overall tax rate (payroll taxes, state taxes, sales taxes, federal income taxes, property taxes) for the top 400 richest households (the ones that own half of the nation’s wealth) was 26.4%. By 2007, that had fallen to only 16.6% of income.
A 37% decrease in taxes! Ya – wait a minute. What the fuck?
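(If you want to check that arithmetic yourself, here is a minimal sketch – plain Python, using only the rates quoted above – of where those “decrease” figures come from. They are the relative drop in each group’s overall rate, not a drop in percentage points.)

    # Sanity-checking the "decrease" percentages cited above.
    # The before/after overall tax rates are the figures quoted in the text;
    # the decrease is the relative drop: (before - after) / before.

    rates = {
        "average household": (22.2, 20.4),
        "top 1% of households": (37.0, 29.5),
        "top 400 households": (26.4, 16.6),
    }

    for group, (rate_1979, rate_2007) in rates.items():
        drop = (rate_1979 - rate_2007) / rate_1979 * 100
        print(f"{group}: {rate_1979}% -> {rate_2007}% ({drop:.0f}% decrease)")

    # average household: 22.2% -> 20.4% (8% decrease)
    # top 1% of households: 37.0% -> 29.5% (20% decrease)
    # top 400 households: 26.4% -> 16.6% (37% decrease)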
* * *
So, let me get this straight. The already rich have been getting richer and richer over the past thirty years, and the richest among them have seen their tax rates go down over that same time?
Short answer: Yep.
* * *
George Carlin (RIP) used to tell this bit: “You know how I describe the economic and social classes in this country? The upper class keeps all of the money, pays none of the taxes. The middle class pays all of the taxes, does all of the work. The poor are there just to scare the shit out of the middle class. Keep ‘em showing up at those jobs.”
* * *
There are two main reasons the taxes on the wealthiest among us have gone down, while their income has soared. (Well, really, only one reason: “Money to get power, and power to guard the money.”)
But, okay . . . there are proximately two reasons this has occurred. Most recently, we’ve been eliminating the Estate Tax. The Estate Tax was first promoted by Republican icon (and all-around badass) Teddy Roosevelt, and it is designed to promote meritocracy and egalitarianism, and to prevent the rise of a hereditary aristocracy. As Teddy himself put it:
We grudge no man a fortune in civil life if it is honorably obtained and well used. It is not even enough that it should have been gained without doing damage to the community. We should permit it to be gained only so long as the gaining represents benefit to the community . . . . The really big fortune, the swollen fortune, by the mere fact of its size, acquires qualities which differentiate it in kind as well as in degree from what is possessed by men of relatively small means. Therefore, I believe in a graduated income tax on big fortunes, and . . . a graduated inheritance tax on big fortunes, properly safeguarded against evasion, and increasing rapidly in amount with the size of the estate.
(In other words . . . just because your name is ‘Paris Hilton’ doesn’t mean you should receive $400,000,000 in income when your parents die. Some of that will have to be taxed, and go back into the system that gave your parents the opportunity to accumulate it in the first place.)
* * *
The second major factor that has resulted in the richest among us paying the least amount of taxes (by tax rate) is the gutting of the capital gains tax. A “capital gain” is the profit earned when an investment is sold for more than was paid for it. For example, if I buy a share of Google stock at $100, and I sell it three years later for $150, then I’ve made a profit of $50. The capital gains tax rate is what I pay on that $50 of investment income.
Now . . . watch how the ball bounces.
If I am – say – a fireman in New York, making $80,000 a year, I have to pay the standard income tax on all of that income, the top marginal rate of which (currently about 35%) applies to the last bit of my money.
But . . . if I am rich enough to live in New York and have enough money to invest, such that I make $80,000 a year on my investments alone (because, say, my stock portfolio went up) then I don’t pay regular income tax on that money. I pay the capital gains tax . . . which is only 15%.
D’you get that? If you are so poor that you actually have to work for a living – y’know, plunge into burning buildings and whatnot to save a bunch of people’s lives – then you get taxed the full amount of the income tax. But if you are rich enough that you can just invest the money you already have and you no longer have to actually work and contribute to our society. . . well, fuck! You should be getting a tax break, yah? Why should you have to pay what the proles do to keep this system working? Y’know . . . the system that let you get rich?
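(To put dollar figures on that comparison, here is a deliberately over-simplified sketch. The 15% capital gains rate is the one discussed above; the single 28% rate standing in for the fireman’s income tax is an illustrative assumption, not the real bracket-by-bracket math.)

    # Toy comparison of the two tax treatments described above.
    # ASSUMPTION: a flat 28% rate stands in for the fireman's income tax;
    # the actual calculation runs through progressive brackets.

    income = 80_000            # $80,000, earned either way

    ordinary_rate = 0.28       # illustrative stand-in for the wage income tax
    capital_gains_rate = 0.15  # long-term capital gains rate cited above

    tax_on_wages = income * ordinary_rate
    tax_on_gains = income * capital_gains_rate

    print(f"Tax on $80,000 of wages:         ${tax_on_wages:,.0f}")
    print(f"Tax on $80,000 of capital gains: ${tax_on_gains:,.0f}")
    # Tax on $80,000 of wages:         $22,400
    # Tax on $80,000 of capital gains: $12,000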
* * *
While we’re on the subject of the Capital Gains Tax, let me take a moment to clear up a few misconceptions about it: (i) that capital gains are being taxed twice, and (ii) that the reduced rate is necessary to fuel investment in the economy.
As to the first, I covered this argument years ago with a friend of mine:
ME: Dude, how can you possibly justify taxing a firefighter, or a cop, or a teacher at full income tax levels, and yet give a cut-rate to some guy only because he has enough money such that his money is earning money for him without him doing anything?
FRIEND: Because, Sean, that money that is being taxed already has been taxed once. How can you justify taxing it a second time?
ME: Dude, what the hell are you talking about? Listen, suppose you come to the end of the year and you have paid all of your taxes, and you’ve salted away all of your savings, and now you still have $10,000 left over. Okay? That $10,000 is yours, free and clear, you will never be taxed on it ever again. You with me so far?
FRIEND: Yes.
ME: Okay. Now, you put that $10,000 into a stock mutual fund, and the fund does pretty well. You’ve got a 10% return on your money at the end of the year, you’ve made $1,000. You still following me?
FRIEND: Yeah.
ME: Great. Now, you take that money out because you want to take your wife on a nice weekend getaway. You now have $1,000 that you didn’t have before, that has never been taxed. Now . . . you explain to me why you should only pay 15% tax on this brand-new income, while a firefighter, or a teacher, or a cop who had to work to earn that $1,000 would have to pay, say, 28%. Explain that to me, ‘cause I ain’t getting it.
FRIEND: . . . . . . . . .
ME: Yeah, that’s what I thought.
As to the second . . . sigh. Look, as a lawyer I am more than aware of the vagaries of the English language. Here’s the deal: the word “investment” doesn’t always mean the same thing.
For example, when people generally talk about their “investments” they are referring to money they’ve plowed into various ventures (CDs, mutual funds, 401(k)s, etc.) that they anticipate will bring them a return on investment at least greater than the interest rate, such that after a period of time they will have greater purchasing power. This is what individuals generally mean when they use the term “investment.”
But macroeconomists don’t use the term in the same way. Macroeconomically speaking, an “investment” means money spent on something new and productive – a bridge, a factory, a machine – such that the return on the money provides an economic boost overall.
For example, suppose there are two towns separated by the Mighty Mississippi. The two towns might want to trade with each other, but the barge/ferry is inefficient and the nearest bridge is an hour’s drive south. That means that any trading between the two towns entails at least a 4 hour round trip.
Now, suppose in the grips of the Great Depression, the government came in and built a bridge between these two towns such that the trading time was cut from a 4 hour round trip to a 40 minute round trip. Suddenly goods would flow and trade would prosper and both towns could expect to see an increase in their standard of living. And that is not to mention the increase in the standard of living that came about by the government hiring the people to build the bridge in the first place. And the government could expect to get its expenditures back, because the increase in trade would mean an increase in GDP, and that would drive up tax revenues.
This is the type of “investment” macroeconomists talk about when they talk about investment. Investment in actual goods or services: a new bridge, or a new factory, or a new highway, or something that can be expected to return a profit on its expenditure. And don’t get me wrong . . . this is precisely the type of investment that the tax code should be used to try to incentivize. If we can get more people to invest in this sort of activity, the kind of activity that actually grows the economy, then I am all for it.
But that is not what our current capital gains tax laws do.
You have to understand the difference between the “primary market” and the “secondary market” for “security instruments” (i.e., stocks or bonds). For example, suppose General Motors wanted to open up a new factory in Danville, Virginia. This would require quite a bit of capital expenditure and GM wouldn’t want to pay for it all itself. Accordingly, GM could do at least one of two things: it could issue a new round of stock (a new public offering of shares) or it could issue corporate bonds to be repaid with the profits generated by the new plant. [Obviously, it could do a bunch of other things, or a combination of a bunch of other things, but I am trying to make this example simple; it is presented merely as a model.]
Whether GM issued new stock or issued new debt (the corporate bonds), both would be issued on what is known as “the primary market.” And all of the proceeds generated from the sale of stock and/or bonds would be an actual, honest-to-God investment in the economy of the type that macroeconomists talk about.
And because all of this would be an investment in creating a new plant, to make new things, that would hire new workers . . . it would be an investment in the real economy of the United States. So, if we want to subsidize such things (and I think we do) I don’t have a problem with levying a smaller tax on the gains made by such investments. If we are going to use the tax code to incentivize people to do things we want them to do – like invest in the economy – well, okay, I can see that.
But that is not what the current capital gains tax actually does.
Y’see, in addition to the primary market, there is also the secondary market, and this is where most individuals who “invest” their money actually put it. The secondary market consists of every stock, every bond, every derivative or other security instrument that already has been sold at least one time. That is why it is “secondary.”
When the average schlub, like my friend from the dialogue above, invests his $10,000 in “the market” he isn’t buying new stock, and he isn’t helping to finance a new factory. He is buying an asset the previous purchase of which already did that. He is gambling that the asset will increase in value more than the interest rate, and the person he purchased it from is gambling that it will not. One of them will turn out to be right, and one of them will turn out to be wrong.
But nothing – absolutely nothing – about this trade does anything to grow the economy.
So . . . tell me again . . . why should I give this little shit a tax break, just because he bet on “black” when “black” came up?
* * *
I saw Obama give a speech last Wednesday, and there was something he said that nobody else has commented on yet (at least that I’ve seen), but it was the first thing that I picked up on. He spoke of how pessimistic the Conservative view of America is, how the Conservatives don’t see much of a future for us. How they are determined to gut spending on education, and infrastructure, and new energy. How they don’t want to do anything about global climate change, or even acknowledge there might be a threat. How they want to gut the small social safety net we have left, and turn it into tax breaks and giveaways for the billionaires.
I’ll tell ya . . . . I’m getting a bad feeling about all of this. I need to assume that the Republicans are just idiots, because the alternative – that they actually know what they are talking about – is too scary to contemplate. I know what Obama was talking about, I’ve felt it myself. There is an entire half of the American political divide that I don’t feel is actually invested in the future. I swear, Man, you listen to these guys – let’s suck up all the oil we can, screw the environment, Jeebus will save us – . . . they don’t expect to be here too long.
They don’t expect us to be here too long. The sense I get from the modern Republican/Conservative Party is that they don’t have to work with the rest of us, they don’t have to play nice with labor, they don’t have to be friends with anyone because Great Day A Comin’ Jeebus Has Arrived, and it’s all gonna be okay so long as you are rich. So you better get what you can now, hollow out the country as much as you can, and tuck it away in a nice safe place where you can wait out the Apocalypse.
I know, I know . . . it doesn’t make a lot of sense to me either.
But it is the sense I get from these nimrods.
* * *
Thank you, thank you ladies and gentlemen . . .
I don’t know how long I’ll be here, but I hope to entertain you until they kill me.