Category Archives: The Working Class and the Economy

We Are Worth More

Last month a few hundred retail and fast-food workers, from places like Sears, Dunkin’ Donuts, and McDonald’s, walked off their jobs for a rally in downtown Chicago.   Carrying signs saying “Fight for 15” (or “Lucha Por 15”) and “We Are Worth More,” these workers make $9 or $10 an hour, at best, and they figure they’re worth at least $15.

A one-shift walk-out and protest by a few hundred out of the thousands of such workers in the Chicago Loop and along Michigan Avenue’s Magnificent Mile cannot have the economic impact of a traditional strike – one that shuts down an entire workplace or industry for an extended period of time and, therefore, can bend an employer’s will.   And these workers’ chances of getting $15 an hour any time soon are worse than slim.   This “job action,” bolstered by community supporters organized by Action Now and with help from Service Employees International Union organizers, is more in the nature of a public protest than a “real strike.”   You could even call it “a public relations stunt,” but you’d be wrong to dismiss it as inconsequential.

“Public relations,” ironically, has a bad image.  But think of it as workers witnessing their own plight, calling for others in similar situations to join them and appealing to those of us with decent incomes to support them.  Witnessing, with its religious overtones, is not intended as an immediately practical action.  It’s first about individuals summoning the courage to put themselves forward to make a public claim that they are one of thousands (millions nationally) who are being treated unjustly.  In this case, it means taking the risk that they may be fired or otherwise disciplined for leaving work and going into the streets to proclaim “We are worth more.”

Witnessing is meant to make us think about justice as the witnesses simultaneously inspire and shame us with the courage of their individual actions.  I was at one of the first draft-card burnings that protested the Vietnam War in 1965, and I remember saying something like, “I’d do that if I thought it would do any good,” while knowing in my heart of hearts that I didn’t have the guts to take that kind of risk then.  But it inspired and shamed me – and thousands and then hundreds of thousands of others — to do many other things to fight against that war as we inspired and bolstered (and exerted peer pressure on) each other.

For the broader public, these initial job actions – in New York and Chicago among retail and fast-food workers; in California and Illinois among workers at Walmart warehouses; and all over the place among Walmart retail workers – are “public relations” that raise awareness and pluck consciences.   But for workers who watched workmates walk off the job to witness for them, there may be some of that inspiration and/or shame that is a particularly powerful call to action. That’s what organizers are counting on, in the hope that the numbers of such workers will grow helter-skelter across the retail industry, eventually initiating a contagion of worker direct action that can put these workers in a position to negotiate for “labor peace,” with or without the blessing of the National Labor Relations Board.

There’s another determined witness who couldn’t be more unlike these striking workers.  He’s a retired law professor from the University of Texas, Charles Morris, who is a leading expert on the legislative and early administrative history of the National Labor Relations Act and the Board that enforces it.  In a 2005 book, The Blue Eagle at Work, Morris makes the legal case that the Act defined a labor union as any group of two or more workers who act together (“in concert”) to seek redress of grievances from their employer.   According to Morris, the “concerted activity protection” articulated in the Act means that employers cannot legally fire workers for forming a non-majority  or “members-only” union (as few as two workers acting together), and what’s more, an employer is legally bound to “bargain in good faith” with that union.

Through meticulous legal research, Morris has shown that these worker rights were in the Act from the beginning but have been forgotten by the subsequent customary practice of defining a union as only that group of workers who have formally voted to be represented by a petitioning union. What’s more, other legal scholars have now signed on to Morris’s legal interpretation and are ready to bolster it before an NLRB that is willing to hear their case.  There would be such an NLRB, what Morris calls “a friendly Board,” if Republican Senators would allow a vote on President Obama’s nominees for the Board.

A favorable NLRB ruling would be important for a variety of legally technical reasons that workers and organizers could use to their tactical and strategic advantage – none of which includes the expectation that employers will voluntarily obey the law just because it is the law. But equally important is that Morris’s reading of the Act’s history restores the original meaning of a labor union that is based on workers’ decisions to act together “in concert” with one another.  That is, a labor union is not just an institution with a bureaucracy and a marble palace in Washington, D.C., though it may be that as well.  It is any group of workers in any workplace, no matter how big or small, who decide to and then do act in concert to advance their own interests in their workplace.

In March Chicago Working-Class Studies helped organize a public forum that brought Charles Morris together with workers and organizers from Fight for 15, the Walmart retail and warehouse strikers, and two other groups who are already acting as unions under this definition.  Though there were some disagreements between the elderly legal scholar and the mostly young workers and organizers — one emphasizing the importance of politics and administrative case law in the long run, the others focused on the potential of direct action in the here and now – they agreed that if and when the two come together, the possibilities for a worker-led upsurge of union organizing are great.

Nonetheless, through their actions these workers have already changed what a labor union is and is thought to be.   It is now, and really always has been — even a century before the National Labor Relations Act was passed in 1935, even when it was an illegal “conspiracy” — simply a group of two or more workers acting in concert with one another.   To be really effective there will need, of course, to be many, many more than the hundreds and thousands who have begun this process.  But it starts with a few brave witnesses who take a risk and ask others to join them.  The peer pressure is now on the rest of us.

Jack Metzgar, Chicago Working-Class Studies

Thatcher and the Working Class: Why History Matters

A kind of class war has broken out on the streets of the UK over the last week or so since the death of former Prime Minister Margaret Thatcher. Since her death was announced, the media has been full of people either paying tribute to her for ‘saving the country’ or condemning her for presiding over unprecedented deindustrialisation. Among these sound bites, the one that has become a constant refrain from those on the right has been that she ‘saved us from the unions.’ One particularly depressing manifestation of this was on a TV political panel show when a young male audience member – he looked about 16 – said ‘well, imagine where we would be if we still had the unions.’ I can’t be certain, but given his accent – still one of the best ways in the UK to tell someone’s social origins – he was almost certainly working-class himself.  I started to think, yes, just imagine if we did have a stronger union movement . . . but maybe that’s for another blog.

Essentially what has been occurring here over the last week or so is a rewriting of history by the right – one where class is never far from the surface. Britain of the 1970s was portrayed as industrially backward with a terminal industrial relations problem. The right argue that the election of Margaret Thatcher in 1979 turned back this economic and social decline and created a brave new world.

Britain in the 1970s was, however, a complex place, not one-dimensional as it’s being portrayed by the right. Although far from perfect, Britain was in this period a far more egalitarian society, in part due to near full employment, of course, but also because of a collective sense of fairness shared by both political left and right.  This is encapsulated for me in British media writer Andrew Collins’s memoir of the period, Where Did It All Go Right? Growing Up Normal in the 70s.  Collins spent his youth in the English midlands, and while he was undoubtedly middle class, he wasn’t that different socially, culturally, or economically from his working-class peers. They would have attended the same schools, lived on the same streets or at least nearby, and so on. In part because of the kind of egalitarianism that Collins describes, 1976 was recently identified as the year when the British people were statistically about as equal as they had ever been – and possibly ever will be. They were also the happiest. After this period, the post-war consensus began to be eroded, most notably by Thatcherism, as director Ken Loach has recently shown in a moving and thoughtful film on the social and economic reforms of the post-war Labour Government and the later breakdown of the consensus.

The Tories were elected in part because they tapped into worries about unemployment, using an image of a long dole queue with the tag line ‘Labour isn’t working.’ But instead of ending unemployment, they drove it up.  Almost one million people were unemployed in 1979, but that figure rose rapidly in the early 1980s to 3 million and has never since fallen below one million.  And who has experienced the most job loss from the 1980s onward? Yes, you guessed it: the working class, who lost jobs in coal mines, factories, shipyards, and steel mills.  These industries were closed as a result of either disastrous neo-liberal industrial policies or, as was the case with the coal industry, simple political spite.  But the right wants us to remember Thatcher for ‘saving us from the unions.’

As I watched the state funeral for Mrs. Thatcher on TV, the BBC’s helpful live internet feed of the tickertape scrolling at the bottom of the screen highlighted the latest labor market statistics:  a 70,000 increase in joblessness this month and over 900,000 unemployed for over a year out of a total of 2.5 million. It was a fitting reminder of Thatcher’s gift to the working class.

But right-wing commentators have not been the only ones talking about Thatcher over the last week.  Many on the left have celebrated her death, though much of the opposition has been dismissed in some quarters as either left-wing political extremism or simply distasteful. The T-shirt maker Philosophy Football produced a souvenir shirt with ‘Rejoice – 08.04.2013’ emblazoned on the front and urged would-be purchasers to order quickly to ensure delivery in time for the day of the funeral. Others celebrated musically, organizing an attempt to place the song ‘Ding Dong The Witch is Dead’ at number one in the download charts.  It narrowly missed, reaching number two! Impromptu street parties broke out in the centres of a number of British cities. In the Celtic fringes of the UK, Scotland and Wales especially, there has been a great deal of celebration at the news.  But nowhere has the bitter, visceral hatred of Thatcher and her governments of the 1980s been more pronounced than in the former coal mining villages of the North of England. While 3,000 of the great and good of the British establishment were attending the lavish £10 million funeral service in St Paul’s Cathedral in the City of London, the places decimated by Thatcherism celebrated in a different style. In former colliery villages such as Easington in County Durham and Goldthorpe in South Yorkshire, effigies of the former Prime Minister were burnt with gusto.

The industrial and social changes that Britain suffered during the 1980s have left a lasting legacy that continues to impact the nation 23 years after she left office. Above all it is working-class communities that have paid the price of Thatcherism.  The true story of Thatcher’s influence, in the 70s and beyond, must be heard.  As one banner in the City of London proclaimed on the day of Thatcher’s funeral, ‘Rest in Shame.’

Tim Strangleman

Is Education the Answer to Economic Inequality?

One of the most common solutions offered to reverse America’s growing economic inequality is increased access to education.  President Obama may have started the trend with his call for universal, high-quality preschool, but others have joined the fray.  In March, Ronald Brownstein argued in National Journal that “Education remains critical to reversing the erosion in upward mobility that has made it harder for kids born near the bottom to reach the top in the United States than in many European nations.” On The Century Foundation’s website just last week, Benjamin Landy posted a blog entitled “To Battle Income Inequality, Focus on Educational Mobility.”

According to Brownstein, colleges and universities are failing to make those opportunities available, because higher education has become too expensive and doesn’t do enough to help lower-income students succeed. In their 2009 study of college completion rates, Crossing the Finish Line: Completing College at America’s Public Universities, William G. Bowen, Matthew M. Chingos, and Michael S. McPherson showed that lower-income students were less likely to graduate than their wealthier counterparts regardless of where they went to school.

Their study also showed, however, that working-class students did better when they enrolled in more selective colleges rather than choosing a more accessible public institution, but many working-class students choose less-selective schools.  Many don’t even apply to more elite colleges, for any number of reasons.  In a recent study, Caroline Hoxby and Sarah Turner suggest that working-class students believe, mistakenly, that less-selective schools will cost less.  In fact, financial aid programs aimed at increasing economic diversity at elite schools often make those schools more affordable than public ones.  That may be increasingly true as state legislatures dramatically cut support for public higher education, making public colleges even more expensive.

How worried should we be about that?  On the basis of justice, we should be outraged.  We should, as Hoxby and Turner suggest, push elite schools to work harder to recruit working-class students.  We should join the thousands of college students who have organized protests against cuts to public education.  And those of us who are educators should heed Mike Rose’s prescription for addressing the needs of working-class students: “If we want more students to succeed in college, then colleges have to turn full attention to teaching.”

Still, the idea that more or better college education will “solve” the problem of economic inequality is just silly.  While a college education still provides economic advantages, increasing lifetime income, achieving that benefit is harder than it used to be.  These days, getting a college degree doesn’t guarantee better middle-class job prospects, but it does often bring a lifetime of debt.  A year ago, The Atlantic reported that 53% of recent graduates were unemployed or underemployed, and many have taken low-wage, hourly jobs that don’t require a college degree.  Meanwhile, student loan debt has increased to an average of $26,600.  For too many, higher education has become a trap door rather than an elevator.

I’m not suggesting that education isn’t worthwhile.  Far from it.  A good education brings many advantages, only some of which have to do with employment or income. Martha Nussbaum is just one of many scholars arguing that education has value for society. But education simply won’t address the root causes of today’s economic inequality.

First, while state legislatures and business organizations pressure public universities to focus on preparing students for jobs in specific fields, like health care or fracking, the widely-touted “skills gap” turns out to be a myth.  The American economy is not being stymied by a lack of appropriately trained workers.  Wharton School management professor Peter Cappelli suggests that we should “Blame It on the Employer.”  He suggests that employers ask themselves a few key questions starting with this zinger: “Have you tried raising wages? If you could get what you want by paying more, the problem is just that you are cheap.”

Second, even when we talk about increasing access or establishing “universal” programs, education addresses the individual, not the system.  Even at its best, education helps some working-class young people prepare to move into the middle class, an outcome that might improve the economic opportunities of those individuals but doesn’t address the broader economic structure.  A thousand well-trained nurses might earn a decent living, but they will work alongside aides, janitors, and clerical workers who don’t. Simply put, moving some people into better paying jobs doesn’t eliminate the low-wage jobs they left behind.

Moreover, we should expect to see more low-wage jobs over time, not fewer, and education won’t change that.  Indeed, as Jack Metzgar and I have both written here, multiple times, the Bureau of Labor Statistics predicts that most job growth will come in jobs that don’t require a college degree.  Regardless of how many people get college degrees, too many jobs in the U.S. will continue to pay low wages, offer few or no benefits, and provide almost no job security. The only difference will be that workers will have more education and, in most cases, more debt.

If we want to improve the lives of low-wage workers and their families, we need public policies that will create more jobs, increase wages (see Metzgar’s suggestion earlier this year for a law requiring productivity sharing), and protect people from the financial ravages that often accompany illness, natural disasters, and other devastating and expensive events.   But how likely do you think it is that our state or federal legislators will create such policies?

The only possibilities for change lie in activism and organizing.  And what does it take to foster resistance and build solidarity?  As our labor studies colleagues might remind us, learning about economic, political, and social processes as well as the history of activism, theories of class, and narratives of oppression and resistance can prepare people to articulate and advocate for their own interests and for the common good.

Hmm, so maybe education is the answer, after all?

Sherry Linkon

Raising the Minimum Wage — The Right Way

Ever since President Obama took office I’ve periodically wished I had the ability to call the White House, get him on the phone, and say, “Hey, you’re not doing it right!”  Let me be clear—I don’t mean he hasn’t done the right thing—just that he’s too often done the right thing the wrong way.

For example, like many economists and advocates for working families, including Paul Krugman and Robert Reich, I thought the President’s economic stimulus package was way long on help for the “Too Big to Fail” banks and other Wall Street institutions and way short on dollars for infrastructure projects, support for education and job training, and other programs that would have helped the working families who inhabit the nation’s Main Streets.

Same thing with health care reform.  Yes, it needed to be done. But he did it wrong.  Instead of a system that guarantees health insurers millions of new customers, does little to rein in costs, and gives anti-reform advocates the ammo they need to scare small- and medium-sized businesses into opposing the plan, he could have done something simple: Medicare for all, or at least for all of us over the age of 55. Not only would this approach have made Medicare solvent by bringing younger, healthier people into the system, it would have given the government immense power to negotiate lower costs with providers.

Unfortunately, the same principle applies to the President’s proposed minimum wage increase.  It’s the right thing to do, but nine dollars an hour? Really?  At least in this instance Mr. Obama’s not alone in being wrong.  Predictably, the Republicans and their bosses in the business community, led by the U.S. Chamber of Commerce, the National Federation of Independent Business, and the National Restaurant Association, became apoplectic seconds after the words “raise the minimum wage” rolled off the President’s tongue during the State of the Union address.

Their reaction was as predictable as the specious claims they make about the cataclysmic effect giving the folks at the bottom of the economic ladder a boost will have on the economy.  Business leaders wail and gnash their teeth any and every time raising the minimum wage is proposed, including back in 2006 when labor led the successful effort to win voter approval of an Ohio Constitutional amendment that both raised the wage and indexed it to inflation.

The fact that their dire prognostications have never come to pass—last time I checked, people are still doing business in Ohio despite the onerous burden of having to pay workers a whopping $7.85 an hour, and Costco is thriving despite paying starting employees more than $10 an hour—apparently doesn’t matter to them or to the members of the media who give their ludicrous contentions credence by repeating them.

Unfortunately, President Obama’s proposal was every bit as predictable in nature and scope as the arguments against it.  Raising the minimum wage $1.75 is fine as far as it goes—which isn’t far enough.  Had the President and his advisors really given the issue some thought they could have crafted a plan that would have both insulated the administration from criticism and gone a long way toward addressing the income inequality that plagues the working class and the middle class and has the U.S. economy stuck in neutral.

Here’s the deal.  Part A:  raise the minimum wage to $9 per hour for the vast majority of employers.  Their protestations to the contrary, it won’t bankrupt them or force them to cut jobs.  Especially when they begin ringing up the additional sales Part B of the plan generates: raising the wage to $15 per hour for full time and $11 per hour for part-time workers employed by the nation’s largest companies.

There’s little doubt that companies like Wal-Mart will attempt to avoid paying the higher wage by eliminating full-time employees or turning to temp services for workers.  To stop them, the law must include provisions that prevent employers from moving workers from full-time to part-time status and that classify temps as employees of the corporation using them.  Those two provisions would help ensure that companies comply with the letter and spirit of the law.

Which firms would qualify as “large” under the plan?  Those that directly employ 100,000 or more workers and average 80 workers per location.  Under this definition, franchise operations like McDonald’s and most other fast-food chains would be exempt.  In addition, the increase would have little or no impact on large employers like IBM, UPS, FedEx, and others that already pay above the proposed new minimum.

The plan would affect retailers like Wal-Mart, Target, Macy’s, Kroger, Home Depot, and Lowe’s, along with America’s biggest banks—the folks responsible for the 2008 economic implosion.  (Here’s an important side note: although most people equate low wages with retail, bank employees are grossly underpaid, and financial institutions are infamous for aggressively opposing union organizing drives.)

Using Wal-Mart as an example and assuming that 40% of the company’s 2.1 million workers are full time, their aggregate annual wages would climb from $17.4 billion to $26.2 billion.  Annual wages for the retail giant’s 1.2 million part-timers would jump by nearly $5 billion, from $12.5 billion to $17.2 billion.  In all, the increases would pump an additional $14 billion into the hands of Wal-Mart workers every year.
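These figures can be sanity-checked with quick arithmetic. A minimal sketch, assuming 2,080 paid hours a year for full-timers and about 1,300 a year (25 hours a week) for part-timers; both hours figures are my assumptions, not the article’s:

```python
# Rough check of the Wal-Mart wage arithmetic above.
# Assumed (not from the article): full-timers work 2,080 hours/year
# (40 hrs x 52 wks); part-timers about 1,300 hours/year (25 hrs/wk).
total_workers = 2_100_000
full_time = round(total_workers * 0.40)   # 840,000 full-timers
part_time = 1_200_000                     # per the article

ft_new = full_time * 15 * 2_080           # proposed $15/hr, full time
pt_new = part_time * 11 * 1_300           # proposed $11/hr, part time

ft_now = 17.4e9                           # article's current aggregates
pt_now = 12.5e9

gain = (ft_new - ft_now) + (pt_new - pt_now)
print(f"full-time wages: ${ft_new / 1e9:.1f}B")   # ~ $26.2B
print(f"part-time wages: ${pt_new / 1e9:.1f}B")   # ~ $17.2B
print(f"annual gain:     ${gain / 1e9:.1f}B")     # ~ $13.5B, "nearly $14B"
```

The totals land within rounding of the article’s $26.2 billion, $17.2 billion, and roughly $14 billion figures.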

Imagine the staggeringly positive impact this one policy would have on the economy when the wages of the more than 3.5 million people who work for America’s other large corporations are factored in.  Billions of dollars will be spent on homes, cars, clothes, food, dining out, movies, electronics, and other goods.  People who now live paycheck to paycheck will actually be able to save and plan for the future. In short, millions of working families will have an opportunity to grab a piece of what has become a fading American Dream.

The increases would also help reduce the deficit—something Republicans should love.  Higher wages will generate more income and sales tax revenue.  As salaries rise, so will the flow of dollars into Social Security and Medicare, especially when tens of thousands of workers are no longer eligible for the Earned Income Tax Credit because they’re earning a living wage.  Finally, government spending will fall because the legions of low-wage workers at Wal-Mart and other firms that now receive government benefits will no longer need them.

But instead of a bold plan that could end decades of wage stagnation, we get a small across-the-board increase that simply won’t get the job done.  Someone get me the number for the White House.  I need to call Barack and tell him he’s doing the right thing the wrong way.

Again.

Leo Jennings

Productivity Sharing: There Ought to Be a Law

What if I told you that there is a way to increase wages and profits at the same time without any need to raise prices?  Most people would, I think, say that’s impossible because increased wages must always be paid for either by increasing prices or by reducing profits or both.  They’re wrong.  Most economists (including conservative ones) would say that not only is it possible, it should just naturally occur.  Economists can show with mathematical certainty that productivity growth makes it possible for both wages and profits to increase while prices remain stable. But just because it’s possible doesn’t mean it’s inevitable. Turns out, the connection between productivity and higher wages isn’t natural.

It did seem that way for a while.  Productivity and wages grew in tandem in the U.S. for the 30 years after World War II.  But for the past four decades, wage increases have not kept pace with productivity growth.

We’ve had plenty of growth in productivity across the U.S. economy (it’s doubled since 1973), but only piddling increases in real wages and family incomes.    That is the single biggest reason for the growth of income inequality in the U.S. since the 1970s – not the Reagan and Bush II tax cuts for the wealthy.  New York Times labor reporter Steven Greenhouse calculates that if productivity sharing had continued the way it was in the three decades preceding 1973, each full-time worker would now be earning some $20,000 a year more.  The median annual wage would be about $60,000 instead of $40,000.

We need a law requiring employers to share the fruits of productivity growth with their workers.  Put aside, for the moment, that there is probably no chance of passing such a law in the next four years or longer.  Debate over a proposed law would generate the broad public discussion of productivity growth and productivity sharing that we need to make clear the causes and consequences of our growing inequality of income and to help us figure out how to reverse it.

Productivity growth is one of only a handful of ways to increase a nation’s wealth.  Historically, the other most important way is plunder – raise an army and take wealth from somebody else.  As Adam Smith argued in 1776, productivity growth is the key to peace and prosperity because it is a way to increase “the wealth of nations” without going to war.  Capitalism, the factory system with its division of labor, the industrial revolution, and steadily increasing technological change continuously improve productivity, which simultaneously increases total wealth and reduces the need for human toil.  But productivity growth only produces widespread economic benefit if the wealth is shared, and if that ever occurred “naturally,” it was because workers’ organizations – labor unions and political parties – functioned as “forces of nature” once upon a time.

I have not given up on American unions figuring out a way to revive themselves and return to some semblance of their former power.  But even under the most optimistic scenario, the return of society-wide union power to the U.S. will take decades.  And without powerful unions, labor parties (or labor-influenced parties, as the Democrats once were) are impossible.  We need a shorter term alternative.  Here’s mine.

Pass an amendment to the Fair Labor Standards Act that requires employers to share productivity gains with their employees.  It should be modeled on the so-called Treaty of Detroit that for decades was a standard feature of United Auto Workers’ contracts with the auto companies – and came to be included in many other union contracts during the most prosperous decades in American history (the 1940s into the 1970s).   Such a law would require wage increases to match productivity increases, so a 2% increase in productivity would require the employer to increase real wages by at least 2%.   This will not require an increase in prices, and because labor is only a portion of total costs, there will typically be money left for profits to increase as well.  Since productivity can be measured in different ways, even in manufacturing and mining, the law should require employers to facilitate the election of a workers’ council, with a small budget to hire experts and with the power to negotiate how a company calculates productivity gains.
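The arithmetic behind that claim can be sketched with a toy firm. The numbers here (labor at 60% of revenue, a 2% productivity gain) are hypothetical, chosen only to show that a productivity-matched raise leaves room for profits to grow at stable prices:

```python
# Toy firm, year 0: sells 100 units at $1 each.
revenue0 = 100.0
labor0, other0 = 60.0, 30.0              # labor is only part of total cost
profit0 = revenue0 - labor0 - other0     # $10

# Year 1: productivity rises 2%, so the same workforce makes 102 units.
# Prices hold at $1; real wages rise 2% to match the productivity gain;
# materials and other costs scale with output.
revenue1 = 102.0
labor1 = labor0 * 1.02                   # the productivity-matched raise
other1 = other0 * 1.02
profit1 = revenue1 - labor1 - other1

print(f"profit: ${profit0:.2f} -> ${profit1:.2f}")  # $10.00 -> $10.20
```

Wages, profits, and output all rise 2% while the price per unit stays at $1, which is the “mathematical certainty” economists point to.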

Figuring out the details and winning support for such a law would require more economic, legal, and political expertise than I can muster.  Workers and some unions might resist, because they think (mistakenly) that speed-up and brute force are the only or the primary means of achieving productivity growth.  Capitalists and their managers are probably not going to like it much either! But advocating for a specific form of legally required productivity sharing could bring some key points into public view.

  • Workers collectively make a very large contribution to productivity growth, and thus should share in its benefits.  Though “investment in new technology” also makes a substantial contribution, it is not as large as the roles played by workers’ tacit skills and on-the-job ingenuity as well as their technical education and formal on-the-job training.  (See Chapters 2 & 3 of Barry Bluestone and Bennett Harrison, Growing Prosperity.)
  • If workers do not get their share of the new wealth created by productivity growth, someone else gets it. This, in turn, contributes to levels of income inequality that will eventually mean there is insufficient consumer demand to keep the economy growing.  A good case can be made that “eventually” has already arrived.
  • Workers’ councils, even very narrowly defined as merely negotiating how productivity is measured in a single workplace, would increase workplace democracy and likely increase workers’ appetites for more.
  • And, oh, did I mention that real wage increases based on productivity growth do not require increased prices or the elimination of growth in company profits?  As such, they may be the one best way to both create and share prosperity – and maybe even peace.


Jack Metzgar, Chicago Working-Class Studies

More (Bad) Jobs: The Unexpected Consequences of the ACA

During Congressional debates over the Affordable Care Act (ACA), Republicans and business leaders warned that it would result in the loss of jobs. In fact, their first attempt at repealing the ACA was called “Repealing the Job-Killing Health Care Law Act.” But as it turns out, the ACA may have just the opposite effect: creating more jobs by splitting the work of the already underemployed. As a result, the growth of underemployment could be an unexpected consequence of the ACA.

The most notable example is Wal-Mart, the world’s largest private employer. After being a cheerleader for the law, Wal-Mart announced in December 2012 that it would join other employers in evading both the spirit and letter of the law. Earlier in the year, it had already announced that part-time workers hired within the last 12 months would be subject to an “Annual Benefits Eligibility Check” each August. Now, Forbes Magazine reports that “Employees hired after Feb. 1, 2012, who fail to average the magic 30-hours per week requiring a company to provide a healthcare benefit, will lose their healthcare benefits on the following January. Part-time workers hired after Jan. 15, 2011, but before Feb. 1, 2012, will be able to hang onto their Wal-Mart health care benefit if they work at least 24 hours a week.” Wal-Mart has insisted that it can’t afford to pay benefits, and it will avoid the ACA’s requirements by scheduling current workers for fewer hours and hiring more part-timers.

But the expansion of underemployment resulting from the ACA is not confined to the service and retail sectors. For example, my former employer, Youngstown State University (YSU), announced in November 2012 that it would limit the hours of all non-union part-time employees, including the part-time faculty who teach about 60% of YSU’s courses. Department chairs were instructed to cut part-timers to less than 24 hours per week. To make matters worse, all part-timers must now sign a form acknowledging that they are not classified as public employees and that the University will therefore not pay into a state pension for them. Of course, part-time faculty and staff are among the lowest-paid workers at YSU, and any hope they had of supporting themselves by taking on more teaching is now gone. The shift to increased reliance on part-time and adjunct faculty is old news in higher education, but actions like those taken at YSU and elsewhere are making the problem even worse.  Like part-time retail workers who scrape by through multiple jobs at different stores, part-time faculty must take on multiple teaching assignments, becoming academic nomads, always moving from one teaching assignment to another.

Some may not even have that option. A new policy in Virginia will expand academic underemployment even further by claiming that the State is the employer, not separate campuses. This means that adjunct faculty can’t cobble together courses at various state schools without breaking the part-time threshold of 29 hours per week. Adjunct faculty will have to find work outside the state system.

Once seen as paragons of moral and ethical behavior, universities have embraced the market values of the retail and service industries. No doubt, the ACA provides additional incentive to expand underemployment in the academic community, contributing to the Wal-Martization of America’s public universities.  The effect might well move in the other direction, too.  As one of my former students suggested, it’s easy to imagine commercial companies justifying their limited-hours policies by pointing out that universities are doing it.

Cutting hours and creating more precarious jobs is only one of the tools employers are using to cover the expected costs of the ACA. Some companies are allowing workers more hours but requiring them to pay higher premiums and co-pays. We’re likely to see more of this kind of cost shifting, particularly in the low-wage service and retail industries where employers are not willing to reduce hours and avoid paying penalties under the ACA.  Some employers might also reduce wages to cover increased costs and avoid penalties for not providing insurance. That is, until the minimum wage is reached, which could be another incentive to cut hours.

Other companies will simply stop providing health care coverage. The Wall Street Journal recently reported that the Congressional Budget Office has raised its prediction of how many people will lose employer-provided insurance from four million to seven million. The Congressional Budget Office has suggested that the ACA will cost the public more than originally expected, because more workers will be covered by the new health-care exchanges. The result will be large increases in government subsidies.

Employers’ efforts to avoid paying for health care contribute to another kind of cost shifting: from private business to taxpayers.  When companies increase the number of part-time, low-wage workers and refuse to provide health care benefits, states end up subsidizing businesses through the social safety net. For example, in 2010, the citizens of Ohio paid $64 million in welfare payments to Wal-Mart employees. This will likely increase starting next year, because the ACA expands eligibility for Medicaid to anyone with income under 133% of the federal poverty level. Most Wal-Mart workers will fall under that income level.

It seems clear that many employers, especially in low-wage sectors, will do everything they can to limit workers’ hours or shift costs to workers or the public in order to avoid the requirements of the ACA. And I bet they’ll brag about how many new jobs they’ve created and how they’re contributing to economic recovery, without acknowledging the low wages and limited hours of those jobs. They’ll shift the burden to the growing working class, many of them part-time workers, who will have more insecurity, lower wages, and increased out-of-pocket health care costs.

Given the regulatory evasions, a single-payer system, like Medicare for all, would be better for both workers and taxpayers. After all, conservative business and political leaders said the ACA was a first step toward universal coverage. Their evasions of the bill could make that prediction come true. Let’s hope.

John Russo

Union Density: What’s Literature Got to Do with It?

So union density in the United States has declined yet again. According to the Bureau of Labor Statistics, only 11.3% of American workers now belong to unions.   This compares to 11.8% in 2011, and it’s a long way from the all-time high of 35% in the early 1940s.   The “right to work” campaign is expanding – even to Michigan, of all states – along with “austerity” policies that target working people.   Since Ronald Reagan launched his attack on labor in 1980, when union density was at 20%, real wages have declined along with union membership to a point where we now have a “gilded age” level of income inequality.

In times like these, it is useful to be reminded of what unions can be good for.  A labor history like From the Folks Who Brought You the Weekend (2001) explains in readable style what it took to establish unions in the first place, while New York Times reporter Steven Greenhouse makes clear in The Big Squeeze: Hard Times for the American Worker (2008) why we need them now more than ever.  Novels, too, can make the case for working people’s rights, through compelling fictional narratives that engage us with characters we care about.  Two Depression-era novels from the Pittsburgh steel district, Thomas Bell’s Out of This Furnace and William Attaway’s Blood on the Forge, both published in 1941, do this particularly well, though in very different ways.

Bell’s book – subtitled “a novel of immigrant labor in America” when it was republished by the University of Pittsburgh Press in 1976 – follows three generations of a Slovak-American family from arrival in the 1880s up to the unionization of the steel mills in the New Deal era.  Along the way it addresses the Homestead battle of 1892, as well as the great strike of 1919 and the struggles of 1934-37.  Attaway’s is a novel of the Great Migration, tracing the experience of three African-American brothers who are lured north from Kentucky to work the mills during the 1919 conflict.  By this time, eastern European laborers have been admitted into some union lodges, while blacks are excluded and demonized as strikebreakers.

Although Bell’s novel is a family saga spanning fifty years of steel-town history, while Attaway’s focuses on one pivotal year, they have several points of contact.  Both address the dislocations of [im]migration, the hazards of steelmaking, racial/ethnic subjugation, labor strife – and the strength of the human spirit in response to these conditions.  And there are telling coincidences of detail between the two: fourteen men die in the blast furnace “accident” that kills Joe Dubik in 1895 in Furnace and fourteen in the explosion that blinds Chinatown Moss in 1919 in Blood.

But there are equally telling points of divergence.  Bell takes the family as a social ideal and traces its process of Americanization within a known community.  The health of family and community depends on strong representation in the workplaces that dominate life in the steel towns.   Although they are discriminated against as “Hunkies,” assigned the worst housing and the worst tasks in the mill, Bell’s characters grow into a sense of citizenship and belonging.  And they are recognizably white in relation to the lowest stratum of migrants to the steel towns.  Looking back on how Braddock has changed in fifty years, Bell’s aging Slovaks lament the arrival of the “shines” in the First Ward, “brought here to break the strike” in 1919.

Attaway’s characters are rootless single men, focused on survival and what pleasure they can find in the present moment, with the aid of corn whisky, dog fights, and the prostitutes in Mex town.   Only the eldest brother, Big Mat, who has left his wife behind in Kentucky, sees any future for himself in the mill town.  Working steel, “His body was happy.  This was a good place for a big black man to be.”  When the strike starts, however, he lends his strength to the company’s campaign to crush it; as a sheriff’s deputy, he becomes a “black riding boss,” trampling those who have mocked him, including the Hunkies.

From contrasting standpoints, both novels demonstrate how racial division was as much a product of industrial management as steel from the mills, and how this division, reinforced by craft union prejudices and racial exclusion, bedeviled any attempt at industry-wide organization – that is, until the CIO swung into town in 1935.   Dobie, Bell’s third-generation protagonist, understands the racial system: “Once it was the Irish looking down on the Hunkies and now it’s the Hunkies looking down on the niggers.  The very things the Irish used to say about the Hunkies the Hunkies now say about the niggers.  And for no better reason.”  Whereas Bell does not criticize the steel unions for their part in maintaining this cycle of racism, its destructive power is central to Attaway’s story.

Differences in narrative style make it a pleasure to read the two novels alongside each other.  Bell writes with a naturalistic matter-of-factness, leavened with gentle irony, and sometimes with finely pointed commentary.  Of the death of Joe Dubik and his workmates, Bell writes:  “Officially it was put down as an accident, impossible to foresee or prevent . . ..  In a larger sense it was the result of greed, and part of the education of the American steel industry.”  His style is also capable of great tenderness, especially in his scenes of courtship, married love, and family losses.  Attaway’s writing, by contrast, crackles and hums with a dark music.  The novel’s first sentence reads: “He never had a craving in him that he couldn’t slick away on his guitar.”  But Melody’s healing blues cannot survive the move to the steel towns, nor can it save his brothers Chinatown and Big Mat, who used to love to hear him play in the red-clay hills of Kentucky.

The two novels’ titles suggest not only this contrast in style but also in narrative outcomes.  “Out of this furnace” comes a vindication of the steelworkers’ aspirations and the possibility of a better life for their families.  At the end of Bell’s novel, Dobie, having helped to build what became the United Steel Workers, engages in a nighttime reverie about issues the union could address in the future:  technological unemployment, environmental destruction, anti-worker politics, bosses and “bossism,” and the degradation of work itself.  As he spins this web at the bedroom window, his sleeping wife is pregnant with their first child – completing the picture of productive and reproductive futurity.

The “blood [spilled] on the forge” in Attaway’s novel is not redeemed by any such optimistic conclusion.  The book itself becomes a kind of blues, and any uplift it provides comes from Attaway’s ability to sing it.   Big Mat, Melody, and Chinatown do not recover from the combined violence of cultural dislocation, deadly working conditions, and racist labor politics – and they do not understand what has happened to them.  But we, as readers, are invited to develop the consciousness they can’t.  The novel offers us the insight and empathy out of which to draw our own conclusions about the industrial system and the need for racial solidarity in labor.

For me, novels like these suggest that unions can be good for much more than better hours, wages, and working conditions. What they achieved, on the evidence of Bell’s novel at any rate, included a sense of personal dignity and collective strength in the present, and a hopeful vision pulling one forward.   When Bell wrote that in 1937 “the fifty-year struggle to free the steel town was nearly over,” he was claiming that the fight to organize, to be recognized, to bargain implied more than “labor rights” alone; it was a struggle for what came to be called civil and human rights. Conversely, Attaway shows us, in visceral scenes, the damage done, not only when companies and their henchmen engage in violent suppression of those rights but also when unions play into a company’s hands by excluding the unorganized and the “other.”

Most unions today seem to get this – though, for now, they are still on the losing end of the most concerted legal and political assault since the robber barons ruled the roost.  But we would be much worse off without them, and they may be due for a revival.  Read any good labor novels lately?

Nick Coles

Home Health Workers: In Demand But Not Protected

In the nearly 20 years I’ve spent organizing long-term care workers, I hadn’t really personally experienced the difficulty of being a caregiver.   I worked the policy, political, advocacy, organizing, and bargaining pieces in the union for home care workers.  The women I organized were strong and bold, and everyone had a story to tell. We told their stories of caregiving in the hope that the workforce would no longer remain invisible and would begin to be seen as the emerging face of the labor movement along with immigrants and service workers.

I have a story to tell as well now.  My mom and dad are in their 80s and in poor health.  Caring for them is the most difficult work I have performed in my life, both mentally and physically.  I moved back home two years ago to care for them.  Ten years ago I used to fear that they would die.  Now I fear that they will live. Each day brings its own lessons in compassion, like when I wake up in the morning and there is no hot water to shower because Mom got up in the middle of the night and left the water running, or when I am ready to walk out the door to take my son to pre-school and Dad’s colostomy bag breaks and I have a mess to clean up.  Then Dad begins to cry, I try to comfort him, and my son is late for school.   I think back to the women I’ve organized and look to them for strength. I do this for free, which prevents me from working full-time elsewhere, but the workers who do this for a living, mostly women and people of color, really aren’t doing much better financially.

Home health workers are among the most in-demand but lowest-paid workers in America.  There are 2.5 million caregivers in the workforce, and that number will grow over the next decade because of aging baby boomers, many of whom seem to prefer to receive care at home. Employment in caregiving is expected to grow by 70% from 2010 to 2020, much faster than the average for all occupations. Over one million workers in this industry have no health insurance. Ninety percent of direct care workers are women, and many are primary breadwinners in their families. Caregivers are paid minimum wage or, if they’re lucky, just slightly above.  Earning such low wages with no health insurance means that 46% of direct care workers rely on some type of government program, such as food stamps, Medicaid, housing, child care, energy assistance, or transportation assistance.

Over one million direct care workers are consigned to near-poverty because of the structure of their employment. The home care workers bathe, change, dress, and feed their clients.  They also perform home-making duties, such as cooking, cleaning, and shopping. These workers face whatever they have to, depending on the kind of day their clients may be having.  Even if the home care agency tells them that they have one hour to get a client dressed, fed, and settled in his/her chair for the day, it may take longer.  But workers do not leave their clients.  Instead, they work “off the clock.”  A home care worker may have four clients for the day but does not get paid for mileage or travel time between clients, much less any benefits for themselves. If the worker’s client becomes ill and is admitted to the hospital, admitted to the nursing home for further care, or dies, or if the family takes the client to their home for the holidays, the worker simply loses that job and does not get paid.  There are no sick days and no vacation days.

Home care workers may be employed by an agency or be independent providers. In either case, the work environment includes a number of safety and health hazards: blood-borne pathogens and biological hazards, latex sensitivity, ergonomic hazards from client lifting, violence, hostile animals, and unhygienic and dangerous conditions.  They may also face hazards on the road as they drive from client to client.

Unfortunately, these workers have been denied the right to organize and bargain in some states, like Ohio. Home care workers are also excluded from the Fair Labor Standards Act, making them ineligible for overtime, including overnight stays at a client’s home. President Obama spent a day working as a home care worker in California not long after announcing his candidacy in 2007.  Last year, the President proposed revising a Labor Department rule to provide FLSA protections to home care workers, and the final rule is still being deliberated.  Guess who opposes the rule change?  The home care agencies.  Agencies receive at least $15 billion of Medicaid money annually for personal care services and are happy to have government money, which fueled a 9% average yearly increase in revenue between 2001 and 2009.  Government becomes harmful, it seems, only when setting a floor under workers’ wages.  The fight isn’t about raising the minimum wage or getting overtime legalized – that would still leave home care workers poor.  It’s about winning some labor standards, rights, and security after decades of losing them.

What happens to this growing element of the working class matters for the shape of our economy, the fate of unionism, and the establishment of a decent standard of living for all.

Debra Timko

Debra Timko was a leading health care organizer for 20 years and is now an independent health care researcher studying the lives of health care workers in Northeast Ohio.

Jobs and the “Fiscal Cliff”

My relief that Mitt Romney was not going to be our president, with a Republican Senate along with the House of Representatives, barely lasted through Tuesday night.  By my lights, a lot of terrible stuff and a completely wrong direction in our policy and politics have just been avoided.  Whew!  But after months of both Republicans and Democrats talking about the lack of jobs being produced by our lackluster economy, with political reporters, operatives, and pundits hanging on the next unemployment report as if it might be vitally important for the future of the republic — by the end of the week our 8% official unemployment rate (and Romney’s oft-repeated “23 million unemployed and underemployed workers”) was again just one of those inconvenient realities that we’re going to have to live with.

President Obama’s victory speech Tuesday night looked ahead to his second term and promised to focus on “reducing our deficit” along with some other things (“reforming our tax code, fixing our immigration system, freeing ourselves from foreign oil”), with no mention of getting our economy growing at a rate that can reduce our debilitating unemployment and the damage it is doing to all of our lives, some of us much more than others.  Then, as the Wall Street Journal headlined two days later, “Political Focus Shifts to ‘Fiscal Cliff’,” and it’s all about budgets and the need for “shared sacrifice” and a “balanced approach” to easing the economy farther downhill rather than going off a cliff.

The fiscal cliff is not just about deficits.  It’s about jobs – jobs in an economy where there are not nearly enough of them for everybody who wants and needs them.  If the spending cuts and tax increases currently scheduled to take effect January 1st actually took effect and remained in place for all of 2013, they would reduce the federal deficit, at least at first, by more than $600 billion – that is, “cutting the deficit in half,” as the President once promised.  But they would also throw us back into recession and eliminate more than 4 million jobs.

Jobs and deficits are related.  Federal budget deficits (spending more than you take in in taxes) fuel economic growth and create jobs.  Likewise, stronger, faster economic growth creates jobs and, thereby, reduces federal budget deficits.  Without the annual $1 trillion deficits the federal government has been running since President Obama took office, we would still be in the Great Recession – or worse.  Right now, we need those deficits.  Cut them substantially, and you reduce economic growth and kill jobs.

Even though some Republicans deny it and almost no Democrats will say it, all the political players, including what is euphemistically called “the business community,” know these basic principles of macroeconomics.  They know that rapidly cutting the deficit in half – whether by cutting spending, increasing taxes, or some combination of the two in a so-called “balanced approach” – will send the economy back into its 2008-09 tailspin.  That’s why all the political players fear the fiscal cliff, and that widespread fear is also why we will not go over it.

But how we avoid the fiscal cliff matters.  And just avoiding it will at best leave us where we are, with a stagnant economy growing at 2% a year and with an official unemployment rate near 7% as far as the eye can see.  We also need to stimulate the economy immediately, get it growing fast enough so that it is creating 300,000 or 400,000 jobs a month (versus the recent trend of 150,000 a month).  This will increase the deficit in 2013, but it will also do more to cut the deficit in the long run than any spending cut or tax increase could do.

For a guide on how to avoid the fiscal cliff, see the Economic Policy Institute’s September policy brief, “A fiscal obstacle course, not a cliff” by Josh Bivens and Andrew Fieldhouse.  It’s very wonky and can be hard to follow in spots, but it’s only 13 pages of text.  And it has a wonderful table on page 7 that lists the various laws that expire at the end of this year and shows both how much each expiration will reduce the deficit AND how many jobs it will kill.  The fiscal cliff is not just the Bush tax cuts – which will lop $64 billion off the 2013 deficit if only the high-income cuts expire, but will also cut 102,000 jobs.  The cliff also includes Obama’s special recession-fighting unemployment compensation program (whose expiration will reduce the deficit by only $39 billion but will eliminate 448,000 jobs) and the expiration of the payroll tax cut (which will reduce the deficit by $115 billion but at the cost of killing more than one million jobs).

Bivens and Fieldhouse use these calculations to show how a jobs-sensitive strict cost-benefit analysis would lead to renewing (and even enhancing) federal government spending programs rather than renewing any of the tax cuts, while also showing that tax cuts targeted to lower- and middle-income workers create more jobs than those going to the wealthy and other high-income earners.  Tax increases do kill jobs, just as Republicans always say, but cuts in government social spending kill many, many more.
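That cost-benefit logic can be made concrete with some quick division, using only the figures quoted above from the EPI brief (a rough back-of-the-envelope sketch, not Bivens and Fieldhouse’s actual methodology):

```python
# Jobs lost per billion dollars of 2013 deficit reduction, for three of
# the expiring provisions cited above (deficit figures in $ billions,
# jobs figures as quoted from the EPI brief).
provisions = {
    "High-income Bush tax cuts": (64, 102_000),
    "Emergency unemployment compensation": (39, 448_000),
    "Payroll tax cut": (115, 1_000_000),
}

jobs_per_billion = {
    name: jobs / billions for name, (billions, jobs) in provisions.items()
}

# Print from most to least costly per dollar of deficit reduction.
for name, ratio in sorted(jobs_per_billion.items(), key=lambda kv: -kv[1]):
    print(f"{name}: about {ratio:,.0f} jobs lost per $1 billion of deficit reduction")
```

By this crude measure, letting unemployment compensation expire destroys roughly seven times as many jobs per dollar of deficit reduction as letting the high-income tax cuts expire – which is the heart of the case for renewing spending programs and middle-income tax cuts before tax cuts for the wealthy.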

There is a lot of complicated economics here, and the politics of avoiding the fiscal cliff may be even more complicated.  I sympathize with the President and Congressional Democrats for having to work through these daunting problems while dealing with House Republicans.  But the President started on the wrong foot Tuesday night by focusing on “reducing our deficit.”  That’s a job killer if you do it now and if you do it the wrong way.  With an economy growing at 2% (at best) and an unemployment rate hovering around 8%, the very last thing we need now is to reduce our deficit.  Rather, first we need to preserve our deficit and the jobs it is supporting.  And then we need the President’s American Jobs Act with its increased spending for infrastructure and for state and local governments to hire and rehire “teachers, cops, and firefighters” – namely, the stuff he campaigned on, the promise of “Forward.” Reducing our deficit and a long-term plan for managing our accumulating national debt are for later, not now.  Right now we need jobs, millions of them.  And we need our newly elected President paying attention to that in a way he has not since February of 2009.

Jack Metzgar

Chicago Working-Class Studies

The New Precariat and Electoral Politics

During the Presidential campaign, Americans have heard endless discussions about unemployment. But neither candidate has said much, at least not directly, about precarious employment or about the new precariat – that growing group (some would even say the growing class) of workers in temporary, part-time, and/or contingent work that often doesn’t pay a living wage.

Who is the precariat? According to Guy Standing, the author of The Precariat: The New Dangerous Class, all of us could be.  For now, the precariat consists largely of women, the young, the disabled, retirees forced back to work, former prisoners, and migrants. It also includes large numbers of formerly middle-class professionals, skilled and semi-skilled people who have been displaced by economic change. While each of these groups has gotten some attention, Standing argues that as a group, the precariat is still “a class in the making,” united by an overwhelming sense of insecurity and vulnerability.

The growth of the precariat has its roots in globalization and technological change, which flooded flexible labor markets and advanced international divisions of labor.  These conditions coincided with changes in government regulation, corporate restructuring, reduced access to and distribution of social programs, and the creation of coercive social policies such as workfare, mass incarceration, and means testing.

Historically, precarious employment was associated with the informal economy.  But with economic changes in the last several decades, informality has moved beyond traditional practices of black market exchanges or services such as day care or tutoring. As workers have been displaced from the formal economy, many are turning to consulting, internships, and subcontracting to find contingent and intermittent work. In general, more and more people are involved in unregulated work characterized by irregular employment, short job ladders, substandard wages and working conditions, and increased stigmatization. During the current economic crisis, with declining standards of living and loss of public assistance, the new precariat – like the old precariat — survives by working longer hours, holding multiple jobs, and when possible relying on the kindness and generosity of friends and family.

While the growth of the precariat creates real social and economic challenges for workers in the informal economy, in places like Youngstown, where the cost of living is low, some mostly younger adults are making a virtue of the situation. As cultural anthropologist Hannah Woodroofe has argued, Youngstown is becoming home to increasing numbers of highly individualistic, anti-materialistic, entrepreneurial adults with episodic employment in largely deregulated work environments. While some define themselves as entrepreneurs, many also see their rejection of materialism as providing a measure of freedom and dignity that challenges capitalist and “older parental” values surrounding work.

Their economic conditions are anemic and often do not reflect their education and experience (many have college and even graduate degrees). They don’t earn much and have little savings, health care, or pension benefits. Their work experiences and the difficulties they’ve had in finding jobs in the formal economy have reduced their expectations about the future.  They have internalized their economic insecurity, and their personal lives tend to mirror their work lives, with contingent and episodic relationships and living situations. Many embrace sustainability and green values, starting urban farms or homesteading in abandoned houses.  Others are part of a contingent creative class, doing freelance work in the arts, web development, and education, but because of the precarity of their work, they don’t make the kinds of stabilizing contributions to the local economy that Richard Florida predicted.  Some just want to be left alone, comfortable with their inexpensive lifestyles.

Just how big is the new precariat? It’s difficult to measure, but the Federal Reserve Bank of Cleveland suggests that the “Great Recession” has resulted in increases in self-employment, and the Bureau of Labor Statistics reports that 35 million people work part time.  While the data on how many people have precarious employment is far from definitive, the precariat clearly seems to be large and growing.

That suggests that the new precariat could have a significant impact on the election. Most of them don’t believe that the government or other institutions can do much to ameliorate their situation.  Many consider themselves to be small business people. As Arun Gupta and Michelle Fawcett have suggested, “Republicans have turned small business into a catch-all group the way ‘working class’ once served that function for the left.” That suggests that the precariat may be persuaded by campaign rhetoric about taxes and economic development.  On the other hand, many see themselves as anti-capitalist, committed to green values and social justice. So will they vote like those who share their educational backgrounds, who are more likely to be politically independent and have socially progressive leanings, thus revealing themselves to be the fallen faction of the middle class?  Or do they, like much of the old white working class, vote on the basis of economic aspiration?  Or does the precariat now include so many Americans, from diverse backgrounds and in varied situations, that their political views can’t be easily predicted?  In 2012 in states like Ohio, the new precariat could determine the presidential election and America’s future.

John Russo, Center for Working-Class Studies