Category Archives: Jack Metzgar

Graduating College is Highly Overrated

That’s the headline I propose for the Bureau of Labor Statistics (BLS) to attract public attention to its most recent projection of job growth in the next decade.  Though it would be a tendentious conclusion from the BLS study, such a headline could draw the kind of bipartisan outrage that might lead to a more honest and accurate discussion of the relation among education, jobs, and income in these United States.

The BLS does its study of U.S. occupations every two years, showing the number of jobs in each occupation, its educational requirements, and how much it pays.   Though the specifics change, every two years the study shows that a large majority of jobs now and in the future require no education beyond high school.  And every two years the carefully compiled BLS data is ignored, leaving the field clear for everybody from the editorial pages of The Wall Street Journal to President Obama to proclaim that “education is the answer” to economic inequality, poverty, and low wages.

“Graduating college is highly overrated” is about as half-true, and therefore false, as “education is the answer.”  But each claim has some evidence to support it.

According to the BLS, in 2012 only 22% of all jobs required a bachelor’s degree or more, and of the more than 50 million job openings the BLS projects by 2022, only 22% will require a bachelor’s or more.  (In fact, only 17% of current jobs, and 17% of the job openings projected by 2022, require a bachelor’s degree and no more.)  Problem is that about 32% of the population over the age of 25 has a bachelor’s, and among young people ages 25 to 34, the share is a bit higher at 34%.  In other words, there are only two jobs for every three people who have a bachelor’s degree, and the number of people getting bachelor’s degrees is growing faster than the number of jobs that require that degree – or anything close to it.
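
The arithmetic behind that two-for-three claim, using only the figures just cited:

\[
\frac{\text{jobs requiring a bachelor's or more}}{\text{adults 25 and over holding one}} \approx \frac{22}{32} \approx \frac{2}{3}.
\]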

Indeed, 26% of jobs in 2012 did not even require a high school diploma, and another 40% required only a high school diploma.  And the BLS projects that it will get worse by 2022, when nearly a third of all job openings will require “less than high school.”

There is a more ambiguous category of jobs that require some “postsecondary education,” whether an associate’s degree, some kind of specialized training certificate, or simply “some college.”  But such credentials are required for only about 11% of jobs now and are projected to account for about 12% of job openings going forward.

The table below summarizes how overeducated our population is for the jobs we actually have.

Level of education                        % of people over 25      % of jobs that
                                          with this level          require this level

Less than high school                               12                       26
High school diploma                                 30                       40
Some college, A.A., or postsecondary                26                       11
Bachelor’s or higher                                32                       22

We have an oversupply of jobs that require high school or less (66%) compared to the 42% of people whose education fits those jobs.  And conversely, we have an oversupply of people with some postsecondary education (58%) for the 33% of jobs that require something like that level of education.
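
Those two mismatch figures come straight from the table, summing the first two rows and the last two rows, respectively:

\[
\underbrace{26 + 40}_{66\%\ \text{of jobs}} \;\text{vs.}\; \underbrace{12 + 30}_{42\%\ \text{of people}}, \qquad \underbrace{26 + 32}_{58\%\ \text{of people}} \;\text{vs.}\; \underbrace{11 + 22}_{33\%\ \text{of jobs}}.
\]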

Just looking at what jobs are now and will be available in the U.S. economy, graduating college seems highly overrated – and it might even be that “going to college is for suckers.”  If all you need for most jobs is a high school education, why bother with college?  That’s simple: wages.

A recent Pew Research Center study, The Rising Cost of NOT Going to College, looks at how education correlates with earnings.  As previous studies have found, high school graduates make $7,000 more a year than those who do not graduate.  Those with “some college” make an additional $2,000, and those who get bachelor’s degrees make $13,000 more on top of that.  The gradient could not be clearer: those with bachelor’s degrees have average incomes twice those of people without high school diplomas ($45,000 vs. $23,000).  What’s more, unemployment rates, poverty rates, and other outcomes follow a similar gradient: the more education, the lower the unemployment rate, the lower the poverty rate, and the more likely you are to have full-time employment and employer-paid benefits.  Conversely, though there are and will be plenty of jobs for people who do not graduate from high school and for those whose education ends with a high school diploma, these jobs generally pay miserable wages – almost uniformly less than $30,000 a year, and most much less.
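
Stacking those Pew increments (simple addition on the figures just cited) shows where the two-to-one ratio comes from:

\[
\$23{,}000 + \$7{,}000 + \$2{,}000 + \$13{,}000 = \$45{,}000.
\]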

So, “education is the answer” has some evidence to support it, too.   But both statements are half-truths – not much education is required for most American jobs (now and in the future) and more education leads to higher pay and steadier employment.   It is only when you put the two half-truths together that you can see the whole picture.

If you are an individual 18-year-old, your only chance for a decent income is to go to college or to get some other form of postsecondary education.  Statistically, it will give you a 2 to 1 shot at a decent standard of living vs. a thousand to one for high school graduates and a million to one for those who never graduate from high school.   But if all 18-year-olds – or even most of them – play these odds by going to college, it will do nothing to remedy economic inequality, low wages, and poverty.   In fact, it would probably make all these things worse.

The increasing imbalance of supply and demand — more college graduates than jobs that require them — puts downward pressure on the wages of jobs that require higher education and ensures that more college graduates will be forced to take jobs that do not require college.  Pew found that more than one-third of the recent college graduates it surveyed were currently working in jobs that do not require any college.  Likewise, as more college graduates take jobs that require only high school, more high school graduates are forced to take jobs that do not require a high school diploma, and those who did not graduate from high school have great difficulty finding and keeping any job.   It’s a perfect formula for cheapening all labor.  More and more education is required to attain a decent standard of living, but as more and more people gain higher levels of education, they further flood those higher-paying job markets, leading to lower average wages and living standards for everybody.

The Pew study emphasizes the growing gap between the incomes of college graduates and non-graduates, but it also shows that the real wages of recent college graduates have basically stagnated since 1986.  The growing premium paid to people with bachelor’s degrees is almost entirely the result of 13% and 18% declines in real wages for high school graduates and those with “some college.”

[Figure: Earnings]

More formal education may be an answer for individuals – and I do all I can to convince my grandsons of that.   But it is not and cannot be any part of the solution to economic inequality, poverty, and low wages.   The remedy for all three is the same: higher wages, starting at the low end and reaching up to frontline supervisors.  To get higher wages, workers with and without college degrees are going to need the kind of organized, disciplined collective action that we are beginning to see the first glimmers of among fast-food, Walmart, warehouse, and many other workers.

Those of us in higher education can help by developing a curriculum that will be relevant to the one out of three of our graduates who will not be getting jobs that require college educations.  They need courses in the history of American social movements and courses that teach organizing tactics and strategies for workplace, community, and political organizing, complete with “service learning” internships.  Those are the skills that are needed to raise wages and reduce poverty for the vast majority of American workers.  If we taught those skills, then graduating college might be a bit less overrated than it is today.

Jack Metzgar

Jobs and Safety Nets

Teaching macroeconomics with a group of union stewards and local leaders last month, I had just finished explaining the enormous economic stimulus the combination of “food stamps” and unemployment compensation is providing to our struggling economy.  When you include the “macroeconomic multiplier effects” of these “automatic stabilizers,” it was about $260 billion in 2012, and that was enough to create or save some 3 million jobs – “possibly including yours.”
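
As a back-of-the-envelope check on those two numbers (simple division, not a figure from any study):

\[
\frac{\$260\ \text{billion}}{3\ \text{million jobs}} \approx \$87{,}000\ \text{per job created or saved},
\]

a plausible order of magnitude once the multiplier-driven output behind each job is counted.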

Having taught this subject many times before, I was ready for somebody to complain about having seen a “food stamp” recipient use their SNAP card to buy caviar or lobster at the Jewel.  When nobody did, I smirkily recounted my past experience with students anyway, because that experience has caused me to wonder if this widely reported occurrence might be an urban legend.  Over the years most students, when questioned, hadn’t seen such an incident themselves but had been told about it by a relative or friend.  Once I asked, “Do they even sell caviar at the Jewel?” – and nobody knew.

At break David, a youngish Teamster truck driver, told me about a SNAP recipient he had seen buying steak, which troubled him because “we have to stretch every dollar” buying groceries.  He asked: “So you’re saying this shouldn’t bother me because it’s stimulating the economy and creating jobs?”

With what seemed like half the class gathered to hear my answer, I pulled out my standard response: “Yeah, basically that is what I’m saying.  But it depends on the magnitude.  If it was widespread, it might be a problem, but there is no evidence that it is.  Besides how would the government enforce something like that?  It could cost $1,000 to catch someone buying a $20 steak!”

My answer, with its rough-and-ready cost-benefit analysis, satisfied a large group of students, who walked away to go on break, but not David: “But it’s still wrong.  It costs a lot to investigate murder too.”  At which point an older Teamster driver from the same local intervened: “What do you give a shit if somebody has a little steak?  It’s not murder!  More like speeding on the tollway.”  Being the professor, I went all Socratic on David: “What kind of steak was it anyway – London Broil or filet mignon?” (inadvertently implying that London Broil was like speeding while buying filet mignon could be like murder!).  This shut David up, as it was clear from his facial expression that he didn’t know the difference between high-priced and low-priced steak, and it occurred to me that he and his family might never themselves have enjoyed a steak.  The older Teamster put his arm around David and jibed as they started out for break, “Don’t be an asshole.  That steak was probably delivered in a truck.”

On reflection I regret opening this door about what SNAP recipients should be allowed to purchase (they are and always have been allowed to buy steak and lobster) and what they actually do purchase (which nobody knows, but I’m guessing is almost never steak and lobster).  I’m trying to teach about how deficit-spending and the high-powered multiplier effects of social-safety-net measures are crucial for reducing unemployment in a depressed economy.   And I end up in a discussion about whether some poor soul buying a slice of London Broil is “taking advantage” of us hard-working taxpayers.

As a middle-class professional who often eats steak (including filet), I don’t get how anybody might see a  SNAP recipient being a bit extravagant (or even a lot extravagant) as a moral challenge to a policy which creates or saves 3 million jobs – that is, 3 million “livelihoods,” such as they are.   We could argue about the size of the multipliers – or about the potential downsides of deficit-spending.  But the fact is that the $135 average monthly SNAP allotment, especially when combined with an average $300 weekly unemployment compensation check, is nearly as valuable to the rest of us as it is to those who receive them.  These meager individual amounts spread across millions of recipients inject consumer spending power into an economy that greatly needs it because we live in a society where most income flows to the top 10% or 11%.  Without these welfare-state “transfer payments,” unemployment would be much worse and, as a result, real wages and median household incomes would be declining even more than they have been.  And as Benjamin Friedman’s classic The Moral Consequences of Economic Growth exhaustively documents, these minor individual ameliorations correlate with more peace, less war and less crime, as well as with more prosperity for everybody.

I have learned, however, not to be indifferent to the moral concern some people have about cheaters and slackers “taking advantage.”  Even a handful of people gaming the system challenges the hyper-vigilant work ethic of many settled-living working-class people, especially white men, but not only them.  Like David, who is Latino, they tend to be a certain social type: They live frustratingly on the edge of a cliff doing work they often hate and struggling every day to keep themselves together, incessantly delaying gratification, “stretching every dollar,” never “letting themselves go” lest they fall off that cliff, taking their families with them.

There are a lot of people on that cliff, and a large group of them seem to think that a stern prejudice against “the poor” helps sustain the integrity of their own characters.  Along with a superstition that “bad things do not happen to good people,” they hope their good characters will protect them from falling into poverty.   A recent Hamilton Project study documents that at least one-third of working-age families with children (with incomes of up to $60,000) are “one major setback” away from “economic chaos.”   Neither poor nor comfortably middle class, this group is dubbed by the study as “America’s struggling lower-middle class,” what many of us would call “working class.”  It is also roughly the same income group ($30,000 to $75,000, in this case) that is most likely to blame the poor for being poor.

It is something like this complex social psychology that Republicans play into when they seek to justify cutting “food stamps” and extended unemployment compensation.   Their latest efforts are especially hypocritical.  A recent Fox News video of a surfer lad exuberantly buying lobster with his SNAP card evoked the cheaters-and-slackers meme.   But the actual cuts Republicans have achieved and additional ones they are now seeking just cut benefits across the board and lop people from the program without any attempt to distinguish the cheaters from the frugally hungry.  Surfer lad will still be able to buy lobster while bad things will happen to good people.

Worse to my mind, however, is the recent argument Republicans make for not extending unemployment compensation for the long-term unemployed.  On the one hand, they insist that the $25 billion it would cost to fund for all of 2014 “be paid for” with cuts from somewhere else, thereby undermining the stimulative job-creating impact it would have.  On the other, they insist on the need to address “the real problem: to find these people jobs.”   Though some Tea Party crazies may not be aware of it, the GOP leadership group surely knows that, according to both the Congressional Budget Office and one of their own economic advisers (Moody’s Analytics), unemployment compensation has a multiplier effect that creates five times more jobs than corporate tax cuts do.  The only federal government job-creation proposal from either party that has a larger macroeconomic multiplier effect, and thus creates more jobs, is the Supplemental Nutrition Assistance Program (SNAP).

I have about as much contempt for welfare cheaters as anybody on a cliff, but even surfer lad is a moral paragon compared to these guys.

Jack Metzgar

Chicago Working-Class Studies

The Lunch Bucket Award

One of my grandsons won the Lunch Bucket Award on his high school football team a couple weeks ago.   And his father’s reaction to it and mine surprised me, especially for what it showed about class differences across generations.

The Lunch Bucket Award is given each week to the player who made the greatest contribution during practice in the week leading up to the Friday night game.   My grandson is a third-string running back on a state-ranked top 20 team, and he seldom gets into the game unless his team is way ahead – and sometimes, not even then.    He was proud to get the award and, as required, to carry a somewhat rusty lunch bucket to all his classes for the week after the game.   His father, my son, was dismissive of it, calling it “the tackling dummy award” and suggesting that it should have been humiliating to lug an old-time lunch bucket around for a week – signaling to all his classmates that he was not first-team.

My grandson is an exuberant, talkative, sort-of-flashy 16-year-old whom both teachers and coaches have designated as “very coachable.”   He’s not that interested in academics, and school has never come easy to him, but he works hard and brings home good grades and is diligently prepping for his ACT test so he can “get into a really good college.”   As an athlete he has some natural ability, and he’s a really good wrestler, but his main assets even there are self-discipline, the ability to learn and improve, and his willingness to work hard so he can do a good job.

His job as a practice-squad running back is to learn the offensive scheme of each week’s opponent, and then run it as well as he can to prepare the first-team defense for what they will face Friday night.   It is potentially a highly confusing intellectual assignment, learning a new set of plays each week, followed by running hard and being tackled by the hardest-hitting players at his school.   It seems like highly honorable work to me, where the “dummy” part of “tackling dummy” is clearly not appropriate.   But even more honorable is the grit it takes to do it at all, let alone to do it well “when nobody is watching” (except, of course, his teammates and coaches, who gave him an award for it).

Sports iconography is, of course, full of working-class imagery about “blue-collar” players who “simply show up for work and do a good job,” and who get little or no recognition for what they do – unless, of course, they do it badly.   The old-time barn-shaped lunch bucket is a particularly powerful symbol of this steady, reliable, just-doing-your-part work ethic — especially when your part is dirty, distasteful, or dangerous, or maybe just monotonous in a way that middle-class people sometimes call “mind-numbing” or “soul-deadening.”  Most work that needs to be done in our society is like this.  Even though what is often called “unskilled work” almost always requires a wide variety of skills to actually do a good job, these jobs also require a daily kind of self-sacrifice that is hard, very hard, to do day in and day out – and that is actively disrespected in our mainstream culture with its celebration of the best and the brightest, the entrepreneurs and the innovators.  Sports is just about the only place in America that ever recognizes and celebrates the value of those who “simply show up every day and do a good job” at the kind of work upon which everything else, including all of us, depends.

My wife and I were raised in families that carried those kinds of lunch buckets to those kinds of jobs, and though once upon a time we did, too, for a while, we’re both glad we never had to find out whether we could have summoned the everyday courage, the true grit it takes to do it for a lifetime.

We were well on our way to becoming thoroughly middle class by the time our son was our grandson’s age, but even as well-educated grown-ups we didn’t know how to properly raise middle-class children —  in what sociologist Annette Lareau calls “concerted cultivation.”  Our son knows that and, while he’s very forgiving of us, he’s bound and determined to raise his children in that way – to make sure they have the education and skills they’ll need to avoid lunch-bucket jobs and to cultivate that never-settle-for-second-best achievement-orientation that so many middle-class people think is essential to living a good life.  He has a middle-class job at which he earns a very good living, but just as our fathers did, he hates both the work itself and the kind of work he does.   He wants better than that for his kids, and for him the Lunch Bucket Award somehow seemed to challenge that aspiration.

Our grandson needs no help from us in pushing back against his father.  When asked if he was demoralized at not getting much playing time, he said, “No, I’m a big part of this team.  On the practice squad I help the first team get better – and that puts me out there on the field even when I’m not actually out there.”   I got a little too emotional in trying to congratulate him for his Lunch Bucket Award by referencing my grandfather (his great-great-grandfather) who, as he knows from family legend, walked out of a steel mill in 1916 “on his own” right after losing both arms in a rolling mill.  I said something like, “That’s an award for character, buddy, and that will be with you long after you can’t juke and jive anymore.”  He said, “Huh?”   Followed by a polite, though possibly comprehending, “Thanks, Pap.”

I understand that sky-high, you-can-do-anything aspirations — even when palpably illusory – can spur young people onward and upward in healthy ways.   I also understand why parents often fear low expectations for their kids.  But finding out what are realistic aspirations and expectations for ourselves and our children is a tricky business, and it will not help to believe that “you can never aim too high.”   Most of us are going to need some lunch-bucket mentality for some or all of our lives.  We’ll need the steady will to do what we have to do to earn a living and to have the personal integrity to do a good job even when we don’t feel like it and nobody is watching.  I loved my job as a teacher, but even on my best days at work I brought that mentality with me just in case I needed it — and because I couldn’t shake it if I wanted to.    My son is a maniac helicopter parent who hates his job, but he does it conscientiously and well more than five days a week.   His son undoubtedly has noticed that.

Sometimes, for both good and ill, parents teach their children less with what they say than with what they do.  For parents, somebody is always watching.  Congratulations, Max, for finally getting our family a Lunch Bucket Award.

Jack Metzgar

Chicago Working-Class Studies

Working-Class Renegades and Loyalists

I never had much time or sympathy for working-class renegades until I read Allison Hurst’s College and the Working Class this summer.

Hurst, a sociologist at Furman University, classifies first-generation college students as renegades if they “have learned to value what the greater society values, academic success, social prestige, and high class position.  They believe that moving away from families and assimilating into the mainstream are necessary for achievement.”   Loyalists, in contrast, are college students from working-class families whose “first priority is to their home communities and [they] are sometimes willing to forgo success if this is predicated on assimilation [to middle-class values and norms].”

As a lifelong (if sometimes unfaithful) loyalist, married to another loyalist, I’ve usually seen renegades’ headlong pursuit of middle-class life and culture as too often leading them to adopt extreme versions of what I see as the worst aspects of middle-class culture – a single-minded focus on personal achievement and publicly recognized accomplishments, which often leads to unreliable serial friendships (if any friendships at all) and a never-far-from-the-surface status anxiety.

Though in the early stages, renegades’ determined rejection of and flight from working-class ways can seem heroically self-actualizing, all too often it turns into phony resume-building, bitterness at being “left behind” no matter where they end up, and a guilt-ridden substitution of show for substance.  In late middle age this pattern can get particularly distasteful, regardless of class background, when the exercise of arbitrary power over others begins to compensate for real accomplishment, thereby spreading the misery.

Hurst’s exploration of today’s “first-generation college students,” however, reveals a historical situation so very different from the one I experienced that I have changed my mind.  Expressing a broad range of empathy for the variety of complex situations working-class college students face today, Hurst captures both the loyalists’ and the renegades’ worlds during a time when “finding yourself” can severely undermine the development of your “competitive advantages.”

Though also insightful about the problems of applying and paying for higher education today, Hurst’s analysis focuses on the clash of cultures working-class students experience in a variety of forms, a clash Barbara Jensen has so poignantly revealed at all levels of education (and life) in Reading Classes: On Culture and Classism in America.  Hurst draws on recent university research on “retaining first-generation college students” as well as on the more nuanced efforts of her colleagues in the Association of Working-Class Academics to argue for a wide array of practices that universities could adopt to help working-class students negotiate the nexus between everyday practical problems and more deeply rooted cultural issues.

What I found most insightful, however, was her creation of five composite characters – three loyalists and two renegades of diverse races/ethnicities – whose progress through the same state university is followed throughout the book.  These characters illustrate both common problems and complexly different ways of handling them.  Collectively, Hurst says of them, “This generation of working-class college students . . . shares some things in common with past generations of ‘scholarship boys and girls,’ but they are also unique in that they are pushed, not just pulled, into college.”

As someone who went to college in the 1960s at four different undergraduate institutions off and on for seven years and who was anything but a “scholarship boy,” I realize how much easier it was then.  College was an option, not a necessity, for one thing – no push, all pull.  It also cost a lot less, and many community colleges, university extensions, and universities themselves had a vital sense of mission about expanding democratic values as well as economic opportunities.  And as the sixties progressed, more and more middle-class (and especially upper-middle-class) students were challenging the middle-class manners, mores, and values of the time.  Working-class life then exerted its own considerable pull, making the culture clash possibly more difficult in some ways than it is today, but there was also so much more space to mix and match, consciously adopting some middle-class ways while rejecting others – more ways to be a “straddler” and not go “all in.”   On both sides of college, working-class life is much less attractive today – more punishing at work, more insecure at home, and weaker as a proud and independent culture that can unselfconsciously scoff at middle-class ways.

My natural sympathies were with Hurst’s three loyalists, but fearing the economic consequences of their loyalties, I found myself hoping they would go “more all in” than they did – before realizing that there really is no more or less to “all in.”  As loyalists, however, they face culturally richer but more economically insecure futures in a job market that has only two jobs for every three college graduates.  One of Hurst’s two renegades, on the other hand, is fleeing a family that abandoned her in her mid-teens, and the other is motivated to be all in culturally because she sees it as a necessity for single-handedly lifting her mother and siblings from economic poverty.

As more and more working-class kids are pushed and pulled into higher education, Hurst is optimistic that universities will become more welcoming – as an administrative “retention strategy,” if nothing else.  But because she values working-class culture as much as middle-class cultural capital, she sets a pretty high standard: “Whether college responds by losing its middle-class character so as to better welcome these students or whether working-class college students will continue to be forced to assimilate to middle-class norms in order to succeed, is a question only future events can answer.”

If colleges and universities want to become more welcoming to working-class students, Hurst has checklist upon checklist for both easy and difficult things they can do.  But “losing its middle-class character” is not something we can expect until there is a stronger, more collectively active working class in the workforce and in the streets — as well as more working-class college students who organize on campus to undermine the narrow-minded self-confidence of a hyper-middle-class culture that, unchallenged, cannot imagine that theirs is not the only “right” way.

There’s much to admire in our middle class’s aspirational individualism and achievement-orientation, but unchecked by a more rooted communitarian culture, it can turn toxic within individuals and seems to foster deadening institutions run by career-calculating conformists who like to make speeches about “innovation,” “transformation,” and “empowerment” while working to ensure that it’s hard for any of those things to actually occur.  Higher education in America needs a stronger, more vital working-class presence to save it from its own cocky cultural hegemony and its growing attraction to pleasing an increasingly crass Big Money ruling class.

Jack Metzgar

Chicago Working-Class Studies

The Incredibly Shrinking Working Class? The View from the “Professional” Bubble

In a semi-sympathetic article about unions organizing professional workers, a Chicago Tribune/Los Angeles Times reporter last month provided the following, colossally wrong, picture of American workers: “Professionals account for 62 percent of the U.S. workforce, up from 15 percent in 1977.”

It’s true that “professional and related occupations” have grown a lot over the past 35 years; in 1977 they were, as reported, about 15% of the workforce.  But today they are about 22% of the entire workforce (including part-time workers) and 24% of full-time workers – not 62% or anywhere close to that!

If nearly 2/3rds of all U.S. jobs were “professional” – with its connotations of well-paid autonomy at work, requiring high levels of education – then the median worker would be a professional, and the median annual salary of American workers would be in the $50,000 range instead of the $30,000 range.  And that would mean that income inequality would be dramatically reduced – from the top 10% getting half of all adjusted gross income now to them getting maybe only a quarter.  It would also be likely that 2/3rds of the adult population would have bachelor’s degrees vs. less than 1/3rd now, and it would mean that many more entry-level jobs would require that degree.  Now only 20% of jobs require a bachelor’s and, according to the Bureau of Labor Statistics, that isn’t going to change much in the next decade.

In other words, this report turns the American job structure upside down.  Michael Zweig’s most recent analysis of occupations, for example, finds that The Working Class Majority is now 63%, slightly larger than a decade ago.

This is a huge reporting error, and it’s clear in the context that it was not a typo.  I emailed the reporter, calling attention to the error, but haven’t heard back, and there has been no printed correction.   Factual misreporting like this occurs all the time in American newspapers, especially at second-tier outfits like the Tribune. Economist Dean Baker provides a delightfully smart-ass (and clear) daily blog, Beat the Press, that calls attention to errors of fact and reasoning in the top tier of newspapers – and he is never at a loss for material.  But there is often a pattern to these errors, one that reflects the limited worldview and social experience of both reporters and the “upscale” audiences advertisers encourage them to address.

Though I have rarely seen numerical misreporting of this sort, most mainstream and elite discussion of “the knowledge economy,” its “knowledge workers,” and “the creative class” clearly assumes this kind of disproportionate misunderstanding of the jobs most Americans actually do.  Likewise, President Obama’s repetitive (and uncontested) insistence on the need for everybody to go to college so they can do “the jobs of the 21st Century” must be based on a similar misunderstanding.  (For more detail on this see previous Working-Class Perspectives blogs by Sherry Linkon and me.)

The conspiracy-minded could make a good argument, I think, that our elite opinion-makers and leading politicians are deliberately lying to us in order to flood the labor market with college-educated workers who can then be paid less and bossed around more because their supply is so much greater than the demand for them.   But the scope and scale of such a conspiracy make this hypothesis highly unlikely.   My guess is that the spectacular magnitude of this particular reporting error reflects the increasingly extreme class segregation of American life – not only in residential life, as dramatically documented in Bill Bishop’s The Big Sort, but in social interaction and experience.  Besides, it is almost comforting to think that our ruling class and its elite professional middle-class opinion-makers actually know the truth and are hiding it from us — rather than to realize that the captains and crew of the ship of state are navigating with such a faulty map of the actually existing American people and the work we do.

How could they, the “data-driven” best and brightest, be so woefully misguided?  Here’s my guess:

Imagine the children of two professional workers – a doctor and a lawyer, for example, or a university professor and an accountant – who go to one of the many excellent public schools in the dozens of affluent (not rich-richy, just comfortably “middle class”) suburbs around most American cities.  Their highly dedicated parents schedule them for a wide variety of activities that cultivate social and cultural skills while insisting on their getting good grades in school.  These children, both the “over-achievers” and the just-plain-achievers, then go on to one of the better colleges and universities, which are populated for the most part by the offspring of professional workers from affluent suburbs like theirs.   Assuming they have done well in college, upon graduation these young people get entry-level professional jobs from which they launch careers that, like their parents’, are both high stress and high reward.   After some years enjoying life in the city, they marry, have children, and move to a suburb with an excellent public school.

This may be a bit of a caricature, but it is by no means uncommon.  Even adding some complexity, it will be very difficult for such people, particularly the high-achievers among them, to understand that America is mostly populated with people who are very unlike them.  Yes, there may have been working-class and even poor kids in their high school or at college, but they are a relatively small minority.  Likewise, at work they are aware of clerical workers and maybe even the janitorial staff as they leave work in the evening, but that’s not where their focus is as they go about their daily work routine.   At restaurants and in other leisure activities, they interact with non-professional workers, but they hardly notice the ones who are not directly serving them.   Everything in their lives fosters the illusion that their lives are “typical” or “normal” and that poorly paid nonprofessional workers who get bossed around are a small and declining group.

These professionals may be conservative Republicans or progressive Democrats.  They may be arrogant, self-absorbed, status-anxious climbers or large-spirited, generous and even nurturing leaders and mentors who do volunteer work among “the less fortunate.”  But what is there in their lives – in their direct observation and experience – that would challenge the idea that we are a “knowledge economy” full of well-educated knowledge workers?   And if they were a reporter, a copy editor, or a well-educated reader of the daily press, what would make them slap their heads in disbelief at the idea that a substantial majority of American workers are “professionals” like them?   Not much – and especially when our elite institutions of cultural production and reproduction (media, universities, politicians and their staffs) are peopled by folks with similar life trajectories who naturally recycle and confirm these professional notions of their own disproportionality.

Zweig’s The Working Class Majority is subtitled America’s Best Kept Secret, and despite the substantial attention the book received more than a decade ago, its recent new edition justifiably retained that subtitle.   But it and all the other work of Working-Class Studies are up against formidable cultural odds.  If the captains and crew of our ship of state are navigating with a terribly faulty map of who we are and what we do, only a large-scale and sustained mutiny can break through the professional bubble.  Hopefully, the newly protesting Walmart retail and warehouse workers and the spreading intermittent strikes of fast-food workers may be the beginnings of such a mutiny.

Jack Metzgar

Chicago Working-Class Studies


We Are Worth More

Last month a few hundred retail and fast-food workers, from places like Sears, Dunkin’ Donuts, and McDonald’s, walked off their jobs for a rally in downtown Chicago.   Carrying signs saying “Fight for 15” (or “Lucha Por 15”) and “We Are Worth More,” these workers make $9 or $10 an hour, at best, and they figure they’re worth at least $15.

A one-shift walk-out and protest by a few hundred out of the thousands of such workers in the Chicago Loop and along Michigan Avenue’s Magnificent Mile cannot have the economic impact of a traditional strike – one that shuts down an entire workplace or industry for an extended period of time and, therefore, can bend an employer’s will.   And these workers’ chances of getting $15 an hour any time soon are worse than slim.   This “job action,” bolstered by community supporters organized by Action Now and with help from Service Employees International Union organizers, is more in the nature of a public protest than a “real strike.”   You could even call it “a public relations stunt,” but you’d be wrong to dismiss it as inconsequential.

“Public relations,” ironically, has a bad image.  But think of it as workers witnessing their own plight, calling for others in similar situations to join them and appealing to those of us with decent incomes to support them.  Witnessing, with its religious overtones, is not intended as an immediately practical action.  It’s first about individuals summoning the courage to put themselves forward to make a public claim that they are one of thousands (millions nationally) who are being treated unjustly.  In this case, it means taking the risk that they may be fired or otherwise disciplined for leaving work and going into the streets to proclaim “We are worth more.”

Witnessing is meant to make us think about justice as the witnesses simultaneously inspire and shame us with the courage of their individual actions.  I was at one of the first draft-card burnings that protested the Vietnam War in 1965, and I remember saying something like, “I’d do that if I thought it would do any good,” while knowing in my heart of hearts that I didn’t have the guts to take that kind of risk then.  But it inspired and shamed me – and thousands and then hundreds of thousands of others — to do many other things to fight against that war as we inspired and bolstered (and exerted peer pressure on) each other.

For the broader public, these initial job actions – in New York and Chicago among retail and fast-food workers; in California and Illinois among workers at Walmart warehouses; and all over the place among Walmart retail workers – are “public relations” that raise awareness and pluck consciences.   But for workers who watched workmates walk off the job to witness for them, there may be some of that inspiration and/or shame that is a particularly powerful call to action. That’s what organizers are counting on, in the hope that the numbers of such workers will grow helter-skelter across the retail industry, eventually initiating a contagion of worker direct action that can put these workers in a position to negotiate for “labor peace,” with or without the blessing of the National Labor Relations Board.

There’s another determined witness who couldn’t be more unlike these striking workers.  He’s a retired law professor from the University of Texas, Charles Morris, who is a leading expert on the legislative and early administrative history of the National Labor Relations Act and the Board that enforces it.  In a 2005 book, The Blue Eagle at Work, Morris makes the legal case that the Act defined a labor union as any group of two or more workers who act together (“in concert”) to seek redress of grievances from their employer.   According to Morris, the “concerted activity protection” articulated in the Act means that employers cannot legally fire workers for forming a non-majority  or “members-only” union (as few as two workers acting together), and what’s more, an employer is legally bound to “bargain in good faith” with that union.

Through meticulous legal research, Morris has shown that these worker rights were in the Act from the beginning but have been forgotten by the subsequent customary practice of defining a union as only that group of workers who have formally voted to be represented by a petitioning union. What’s more, other legal scholars have now signed on to Morris’s legal interpretation and are ready to bolster it before an NLRB that is willing to hear their case.  There would be such an NLRB, what Morris calls “a friendly Board,” if Republican Senators would allow a vote on President Obama’s nominees for the Board.

A favorable NLRB ruling would be important for a variety of legally technical reasons that workers and organizers could use to their tactical and strategic advantage – none of which includes the expectation that employers will voluntarily obey the law just because it is the law. But equally important is that Morris’s reading of the Act’s history restores the original meaning of a labor union that is based on workers’ decisions to act together “in concert” with one another.  That is, a labor union is not just an institution with a bureaucracy and a marble palace in Washington, D.C., though it may be that as well.  It is any group of workers in any workplace, no matter how big or small, who decide to and then do act in concert to advance their own interests in their workplace.

In March Chicago Working-Class Studies helped organize a public forum that brought Charles Morris together with workers and organizers from Fight for 15, the Walmart retail and warehouse strikers, and two other groups who are already acting as unions under this definition.  Though there were some disagreements between the elderly legal scholar and the mostly young workers and organizers — one emphasizing the importance of politics and administrative case law in the long run, the others focused on the potential of direct action in the here and now – they agreed that if and when the two come together, the possibilities for a worker-led upsurge of union organizing are great.

Nonetheless, through their actions these workers have already changed what a labor union is and is thought to be.   It is now, and really always has been — even a century before the National Labor Relations Act was passed in 1935, even when it was an illegal “conspiracy” — simply a group of two or more workers acting in concert with one another.   To be really effective there will need, of course, to be many, many more than the hundreds and thousands who have begun this process.  But it starts with a few brave witnesses who take a risk and ask others to join them.  The peer pressure is now on the rest of us.

Jack Metzgar, Chicago Working-Class Studies

Productivity Sharing: There Ought to Be a Law

What if I told you that there is a way to increase wages and profits at the same time without any need to raise prices?  Most people would, I think, say that’s impossible because increased wages must always be paid for either by increasing prices or by reducing profits or both.  They’re wrong.  Most economists (including conservative ones) would say that not only is it possible, it should just naturally occur.  Economists can show with mathematical certainty that productivity growth makes it possible for both wages and profits to increase while prices remain stable. But just because it’s possible doesn’t mean it’s inevitable. Turns out, the connection between productivity and higher wages isn’t natural.
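
A minimal sketch of that result, using standard unit-cost accounting rather than anything specific to the sources cited here: write price as unit labor cost plus per-unit profit,

\[
p = \frac{w}{q} + \pi,
\]

where w is the hourly wage, q is output per hour (productivity), and π is profit per unit.  If w and q grow at the same rate, the ratio w/q is unchanged, so p can stay flat with π intact – and total profit, π times a now-larger output, rises anyway.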

It did seem that way for a while.  Productivity and wages grew in tandem in the U.S. for the 30 years after World War II.  But for the past four decades, wage increases have not kept pace with productivity growth.

We’ve had plenty of growth in productivity across the U.S. economy (it’s doubled since 1973), but only piddling increases in real wages and family incomes.    That is the single biggest reason for the growth of income inequality in the U.S. since the 1970s – not the Reagan and Bush II tax cuts for the wealthy.  New York Times labor reporter Steven Greenhouse calculates that if productivity sharing had continued the way it was in the three decades preceding 1973, each full-time worker would now be earning some $20,000 a year more.  The median annual wage would be about $60,000 instead of $40,000.

We need a law requiring employers to share the fruits of productivity growth with their workers.  Put aside, for the moment, that there is probably no chance of passing such a law in the next four years or longer.  Debate over a proposed law would generate the broad public discussion of productivity growth and productivity sharing that we need to make clear the causes and consequences of our growing inequality of income and to help us figure out how to reverse it.

Productivity growth is one of only a handful of ways to increase a nation’s wealth.  Historically, the other most important way is plunder – raise an army and take wealth from somebody else.  As Adam Smith argued in 1776, productivity growth is the key to peace and prosperity because it is a way to increase “the wealth of nations” without going to war.  Capitalism, the factory system with its division of labor, the industrial revolution, and steadily increasing technological change continuously improve productivity, which simultaneously increases total wealth and reduces the need for human toil.  But productivity growth only produces widespread economic benefit if the wealth is shared, and if that ever occurred “naturally,” it was because workers’ organizations – labor unions and political parties – functioned as “forces of nature” once upon a time.

I have not given up on American unions figuring out a way to revive themselves and return to some semblance of their former power.  But even under the most optimistic scenario, the return of society-wide union power to the U.S. will take decades.  And without powerful unions, labor parties (or labor-influenced parties, as the Democrats once were) are impossible.  We need a shorter term alternative.  Here’s mine.

Pass an amendment to the Fair Labor Standards Act that requires employers to share productivity gains with their employees.  It should be modeled on the so-called Treaty of Detroit that for decades was a standard feature of United Auto Workers’ contracts with the auto companies – and came to be included in many other union contracts during the most prosperous decades in American history (the 1940s into the 1970s).   Such a law would require wage increases to match productivity increases, so a 2% increase in productivity would require the employer to increase real wages by at least 2%.   This will not require an increase in prices, and because labor is only a portion of total costs, there will typically be money left for profits to increase as well.  Since productivity can be measured in different ways, even in manufacturing and mining, the law should require employers to facilitate the election of a workers’ council, with a small budget to hire experts and with the power to negotiate how a company calculates productivity gains.
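
A worked example with invented round numbers (illustrative only, not from any contract or study): suppose a worker produces 100 units an hour at a $20.00 wage, so unit labor cost is 20 cents.  A 2% productivity gain (102 units an hour) matched by a 2% raise (to $20.40) leaves unit labor cost exactly where it was:

\[
\frac{\$20.00}{100} = \$0.20 = \frac{\$20.40}{102}.
\]

Prices need not move, and with 2% more units sold at the same per-unit margin, total profit rises about 2% as well.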

Figuring out the details and winning support for such a law would require more economic, legal, and political expertise than I can muster.  Workers and some unions might resist, because they think (mistakenly) that speed-up and brute force are the only or the primary means of achieving productivity growth.  Capitalists and their managers are probably not going to like it much either! But advocating for a specific form of legally required productivity sharing could bring some key points into public view.

  • Workers collectively make a very large contribution to productivity growth, and thus should share in its benefits.  Though “investment in new technology” also makes a substantial contribution, it is not as large as the roles played by workers’ tacit skills and on-the-job ingenuity as well as their technical education and formal on-the-job training.  (See Chapters 2 & 3 of Barry Bluestone and Bennett Harrison, Growing Prosperity.)
  • If workers do not get their share of the new wealth created by productivity growth, someone else gets it. This, in turn, contributes to levels of income inequality that will eventually mean there is insufficient consumer demand to keep the economy growing.  A good case can be made that “eventually” has already arrived.
  • Workers’ councils, even very narrowly defined as merely negotiating how productivity is measured in a single workplace, would increase workplace democracy and likely increase workers’ appetites for more.
  • And, oh, did I mention that real wage increases based on productivity growth do not require increased prices or the elimination of growth in company profits?  As such, they may be the one best way to both create and share prosperity – and maybe even peace.


Jack Metzgar, Chicago Working-Class Studies

The White Vote in 2012 & the Obama Coalition

I’ve had it with “the white working class.”  Not the actually existing part of the working class that is white, which is composed of complex and interesting people most of whom don’t vote like I think they should, but rather the fictional character who got so much attention during this year’s election campaign.

The fictional character is a white guy who works in a decrepit factory or drives a truck.  He drinks boilermakers (not wine and never a latte) and is good at bowling rather than golf.  Depending on political point of view, he is a “culturally confused but good-hearted racist” or a “salt-of-the-earth real American who loves God and guns and hates both gays and Wall-Street bankers.”

As a demographic category that divides white voters without bachelor’s degrees from those who have that “middle-class” credential, the “white working class” concept makes sense to me, but only if its use fulfills two conditions that the political media apparently cannot manage:

  • First, that we always keep in mind that “white working class” is a demographic category that clumps together more than 45 million voters who share two characteristics and only two – race, as conventionally defined, and the absence of a bachelor’s degree.  The category includes women and men of all religions (and varying levels of religious commitment) and regions. They come from big cities, suburbs, small towns, and isolated shacks in all parts of the country.  It includes Bill Gates and other fabulously rich people who never completed bachelor’s degrees, and it leaves out the many factory workers, truck drivers, waitresses, and retail clerks who did. That is, like all concepts, “white working class” is a convenience for getting a hold on the big picture, but it grossly simplifies a much more complex and varied social reality.  We need to constantly remind ourselves that there is not now, never has been, and never could be a “typical” white working-class person.
  • Second, that as a demographic category for the purposes of electoral analysis, “white working class” is valuable only as part of a comprehensive discussion of the white vote in U.S. elections.

I’ve made the first point before, more than once.  Here let me concentrate on the second by detailing my conclusions about how the concept has played out in the 2012 presidential election.

After much pre-election discussion of how the “white working class” would vote, the major news media who commissioned the massive election-day exit poll have not reported on their websites how this group actually voted.  In fact, the websites listing that information — voter-category by voter-category, state by state — in 2012 have less than 1/10th the information that CNN had (and still has) on its website for 2008.   But here’s what I can report based on what is available on Fox News, CNN, and the New York Times, plus some numbers from reporters who have access to the poll’s internals – most importantly, “The Obama Coalition in the 2012 Election and Beyond” by Ruy Teixeira and John Halpin.

  • Class in itself had almost no impact on how people voted for president in 2012.  The middle class (folks of all shades and colors with at least a bachelor’s degree) voted 50/48 for President Obama, and the somewhat larger group of voters with no bachelor’s degree, the working class, voted 51/47 for the President.  Thus, because the middle and working classes voted basically the same, class by itself did not matter.
  • Race, on the other hand, makes a huge difference in how people vote.  Nonwhites (Black, Hispanic/Latino, Asian and Other) voted a little more than 80% for Obama while only 39% of whites did that – a difference of more than 40 percentage points.  Both the middle class and the working class gave Obama slight majorities based primarily on nonwhite voters who offset his 20-point loss among whites.
  • Among whites, the white working class is far from unique in giving Mitt Romney substantial majorities.  Nationally, working-class whites gave Obama only 36% of their vote, but middle-class whites, though slightly more favorable at 42%, also gave Romney a large majority.  Other demographics within the white vote show similar patterns.  Though there are important differences among white voters, most white demographics vote strongly Republican.  For example:
    • Women gave Obama a 55% majority, but not white women, who voted 56/42 for Romney.  White men, on the other hand, were even more strongly for Romney (62/35).  The gender gap is actually bigger among Blacks and Latinos than it is among whites.  Black women voted 9 points more for Obama than their male counterparts; Latino women, 11 points more; and white women, 7 points more.
    • Obama won a bare majority among Catholics (50/48), but lost white Catholics by 19 points – which, however, is a lot better than he did among white Protestants, whom he lost by 39 points.  On the other hand, Obama won substantial majorities among whites who self-identified as non-Christian or as having no religion.
    • Obama also famously won big (60/37) among young people aged 18-29, but the majority of whites in this age group voted for Romney (51/44).  On the other hand, no other white age group gave Obama more than 39% of their vote.
    • Where whites live matters a lot.  There were no exit polls in some states this year, and so far there is no breakdown of voters by both race and education (as there was in previous years).  From what we have, however, it is clear that the national white vote of 39% for the President hides a lot of variation – whites in Vermont and Alabama vote very differently (66% vs. 15% for Obama in 2012), as do whites in Iowa and Missouri (51% vs. 32% for Obama).  Likewise, whites in large and medium-sized metropolitan areas (250,000 and above) vote more Democratic than whites in the small-town and rural areas of the same states.

Though shrinking as a proportion of the population and thus of the electorate, whites are still a very large majority (72% of the 2012 electorate), and the 39% of us who voted for President Obama provided the bulk of his votes in 2012 (36 million vs. 29 million from nonwhites). But our voices would not have been heard without strong turnouts (against formidable efforts at voter suppression) and lopsided votes for Obama among nonwhites.  On the other hand, their voices would have been drowned out – and worse – without us.  That’s what a multiracial coalition looks like.  Though its weakest link, the white working class is a significant portion of the coalition, and not just in the Midwest battlegrounds.  Of Obama’s 65 million votes in 2012, 30% came from whites with bachelor’s degrees and 25% (more than 16 million) came from those without them.
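
For readers who want to check this coalition arithmetic, here is a minimal back-of-the-envelope sketch in Python.  The total turnout figure (about 129 million) is my approximation, inferred from the percentages above rather than taken from the exit polls themselves, and the 80% nonwhite figure is the rounded share cited earlier:

```python
# Rough check of the coalition arithmetic above.
# total_voters is an assumed approximation of 2012 turnout,
# not a figure reported in the exit polls.
total_voters = 129_000_000

white_share = 0.72         # whites as a share of the 2012 electorate
white_for_obama = 0.39     # share of white voters for Obama
nonwhite_for_obama = 0.80  # rough share of nonwhite voters for Obama

whites = total_voters * white_share * white_for_obama
nonwhites = total_voters * (1 - white_share) * nonwhite_for_obama

print(f"Obama votes from whites:    {whites / 1e6:.0f} million")     # ~36 million
print(f"Obama votes from nonwhites: {nonwhites / 1e6:.0f} million")  # ~29 million
print(f"Total Obama votes:          {(whites + nonwhites) / 1e6:.0f} million")  # ~65 million
```

The sketch comes out to roughly 36 million, 29 million, and 65 million votes, consistent with the figures cited above.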

Part of the reason progressive Democrats have focused on the white working class over the past decade is that, among whites, working-class people are much more likely than middle-class people to benefit from progressive economic programs – programs like universal health care, enhancements of the earned income and child tax credits, infrastructure spending, green manufacturing, and unemployment benefits and food stamps.  This has not yet produced more white working-class votes for Democrats, at least not at a national level, but the logic is sound because all these programs disproportionately benefit working-class Blacks, Latinos, and Asians as well.  And that basic approach, as qualified and compromised as it has been in practice, is working so far politically, if not economically.  As Teixeira and Halpin conclude:

President Obama and his progressive allies have successfully stitched together a new coalition in American politics, not by gravitating toward the right or downplaying the party’s diversity in favor of white voters.  Rather, they did it by uniting disparate constituencies – including an important segment of the white working class – behind a populist, progressive vision of middle-class economics and social advancement for all people regardless of race, gender, ethnicity, religion, or sexual orientation.

I find the Democrats’ obsessive use of “middle class” irritating, and I’m not sure they’ve articulated anything I want to call “a populist, progressive vision” (as opposed to some of their actual programs), but it is worth appreciating the enormous accomplishment, however fragile and flawed, of what Teixeira and Halpin call “a multiracial, multiethnic, cross-class coalition” that put Barack Obama in the White House for a second term.

Jack Metzgar

Chicago Working-Class Studies

Jobs and the “Fiscal Cliff”

My relief that Mitt Romney was not going to be our president, with a Republican Senate along with the House of Representatives, barely lasted through Tuesday night.  By my lights, a lot of terrible stuff and a completely wrong direction in our policy and politics have just been avoided.  Whew!  But after months of both Republicans and Democrats talking about the lack of jobs being produced by our lackluster economy, with political reporters, operatives, and pundits hanging on each new unemployment report as if it might be vitally important for the future of the republic, by the end of the week our 8% official unemployment rate (and Romney’s oft-repeated “23 million unemployed and underemployed workers”) was once again just one of those inconvenient realities that we’re going to have to live with.

President Obama’s victory speech Tuesday night looked ahead to his second term and promised to focus on “reducing our deficit” along with some other things (“reforming our tax code, fixing our immigration system, freeing ourselves from foreign oil”), with no mention of getting our economy growing at a rate that can reduce our debilitating unemployment and the damage it is doing to all of our lives, some of us much more than others.  Then, as the Wall Street Journal headlined two days later, “Political Focus Shifts to ‘Fiscal Cliff’,” and it’s all about budgets and the need for “shared sacrifice” and a “balanced approach” to easing the economy farther downhill rather than going off a cliff.

The fiscal cliff is not just about deficits.  It’s about jobs – jobs in an economy where there are not nearly enough of them for everybody who wants and needs one.  If the spending cuts and tax increases currently scheduled to take effect January 1st actually took effect and remained in place for all of 2013, they would reduce the federal deficit, at least at first, by more than $600 billion – that is, “cutting the deficit in half,” as the President once promised.  But they would also throw us back into recession and eliminate more than 4 million jobs.

Jobs and deficits are related.  Federal budget deficits (spending more than you take in in taxes) fuel economic growth and create jobs.  Stronger, faster economic growth, in turn, creates jobs and thereby reduces federal budget deficits.  Without the annual $1 trillion deficits the federal government has been running since President Obama took office, we would still be in the Great Recession – or worse.  Right now, we need those deficits.  Cut them substantially, and you reduce economic growth and kill jobs.

Even though some Republicans deny it and almost no Democrats will say it, all the political players, including what is euphemistically called “the business community,” know these basic principles of macroeconomics.  They know that rapidly cutting the deficit in half – whether by cutting spending, increasing taxes, or some combination of the two in a so-called “balanced approach” – will send the economy back into its 2008-09 tailspin.  That’s why all the political players fear the fiscal cliff, and that widespread fear is also why we will not go over it.

But how we avoid the fiscal cliff matters.  Just avoiding it will at best leave us where we are, with a stagnant economy growing at 2% a year and with an official unemployment rate near 7% as far as the eye can see.  We also need to stimulate the economy immediately and get it growing fast enough to create 300,000 or 400,000 jobs a month (versus the recent trend of about 150,000 a month).  This will increase the deficit in 2013, but it will also do more to cut the deficit in the long run than any spending cut or tax increase could.

For a guide on how to avoid the fiscal cliff, see the Economic Policy Institute’s September policy brief, “A fiscal obstacle course, not a cliff,” by Josh Bivens and Andrew Fieldhouse.  It’s very wonky and can be hard to follow in spots, but it’s only 13 pages of text.  And it has a wonderful table on page 7 that lists the various laws expiring at the end of this year and shows both how much each expiration will reduce the deficit AND how many jobs it will kill.  The fiscal cliff is not just the Bush tax cuts, whose high-income portion alone, if allowed to expire, would lop $64 billion off the 2013 deficit but would also cut 102,000 jobs.  The cliff also includes the expiration of Obama’s special recession-fighting unemployment compensation program (which would reduce the deficit by only $39 billion but eliminate 448,000 jobs) and of the payroll tax cut (which would reduce the deficit by $115 billion but at the cost of killing more than one million jobs).

Bivens and Fieldhouse use these calculations to show that a strict, jobs-sensitive cost-benefit analysis would lead to renewing (and even enhancing) federal spending programs rather than renewing any of the tax cuts, and that tax cuts targeted to lower- and middle-income workers create more jobs than those going to the wealthy and other high-income earners.  Tax increases do kill jobs, just as Republicans always say, but cuts in government social spending kill many, many more.
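
Their jobs-per-dollar logic is easy to reproduce.  Here is a minimal sketch using only the three examples cited above; the deficit and jobs figures are from the Bivens/Fieldhouse table as quoted in this post, while the per-billion arithmetic and the rounding of “more than one million” to 1,000,000 are mine:

```python
# Jobs lost per $1 billion of deficit reduction, for three expiring
# provisions.  Each entry is ($B of deficit reduction, jobs eliminated),
# taken from the EPI brief as cited above.
provisions = {
    "High-income Bush tax cuts":       (64, 102_000),
    "Emergency unemployment benefits": (39, 448_000),
    "Payroll tax cut":                 (115, 1_000_000),
}

for name, (billions, jobs) in provisions.items():
    print(f"{name}: about {jobs / billions:,.0f} jobs lost per $1B of deficit reduction")
```

By this measure, letting the high-income tax cuts expire costs roughly 1,600 jobs per billion dollars of deficit reduction, versus about 11,500 for the unemployment program and 8,700 for the payroll tax cut, which is exactly the asymmetry Bivens and Fieldhouse highlight.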

There is a lot of complicated economics here, and the politics of avoiding the fiscal cliff may be even more complicated.  I sympathize with the President and Congressional Democrats for having to work through these daunting problems while dealing with House Republicans.  But the President started on the wrong foot Tuesday night by focusing on “reducing our deficit.”  That’s a job killer if you do it now and if you do it the wrong way.  With an economy growing at 2% (at best) and an unemployment rate hovering around 8%, the very last thing we need now is to reduce our deficit.  Rather, first we need to preserve our deficit and the jobs it is supporting.  And then we need the President’s American Jobs Act with its increased spending for infrastructure and for state and local governments to hire and rehire “teachers, cops, and firefighters” – namely, the stuff he campaigned on, the promise of “Forward.” Reducing our deficit and a long-term plan for managing our accumulating national debt are for later, not now.  Right now we need jobs, millions of them.  And we need our newly elected President paying attention to that in a way he has not since February of 2009.

Jack Metzgar

Chicago Working-Class Studies

“By My Lights” and “Studies Have Shown”

Recently while writing an article, I found myself using an old-time expression I don’t think I have ever used in writing before: “by my lights,” which means something like “in my view.”  It’s an expression I heard a lot growing up in a working-class family decades ago and still hear among the old-timers of my generation.  Though I sometimes use it in conversation, I thought it might be obscure and/or too colloquial for readers, but the meticulous editor of the piece let it pass without comment.

Then as I read Barbara Jensen’s new book Reading Classes: On Culture and Classism in America, I thought about notions I’ve had for some time about a distinct working-class epistemology that is often more complex and sophisticated than the standard educated middle-class one.   Reading Classes lays out in detail what Jensen sees as competing class cultures, with special emphasis on how middle-class cultural imperialism in schools (from kindergarten to graduate school) makes life and learning more difficult for working-class students.

Though the book is rich in showing oppositions between categorically distinct working-class and middle-class cultures, Jensen’s effort is to put the cultures into dialogue with each other so that they can benefit from each other’s strengths and compensate for their contrary weaknesses.  Firmly based in a memoir of her own experience as a working-class girl who became (somewhat accidentally) middle class, Jensen draws on a wide range of social science studies to supplement her own direct observation as a counseling psychologist, especially of mixed-class couples and high school students.  In doing that, she brings together what I take to be contrary but potentially complementary epistemologies, captured perhaps by the expressions “by my lights” and “studies have shown.”

In my undergraduate classes, I have long warred against the usage “studies have shown” because of the way it effaces human agents and exaggerates the certainty of conclusions drawn from social science studies.  I read a fair number of such studies, and I have yet to come upon one whose data would not support more than one interpretation, no matter how rigorous the research methodology.  I encourage students to use somewhat more awkward phrasing that acknowledges that fallible human beings are actively drawing conclusions from their study – e.g., “researchers [or even “experts”] who have made systematic studies of X have concluded that . . .”  Studies do not “find” things or “show” things.  People do.

Systematic studies by people who are knowledgeable about what has been thought and said in their discipline or field of study should be given greater weight than my or my students’ off-hand impressions based on our direct observation and experience.  But, like our off-hand impressions, studies are products of creative human thought.  And one of my off-hand impressions is that one out of three times when the expression “studies have shown” is used it actually means “shut the fuck up.”  That is, it is an educated middle-class bullying tactic to close off discussion by an appeal to authority.

At least as it is reported in both mainstream and, especially, progressive media, this often seems to be the case with disputes about teaching climate change and evolution in public schools.  Without discounting the ideological power politics of local school boards, I don’t see why popular skepticism about scientific findings (even in the natural sciences) does not present opportunities for educating students about the values and procedures of scientific methods, let alone for the exercise and development of critical thinking.   In any case, dismissing and thereby disrespecting popular skepticism strengthens that skepticism – or, rather, has a tendency to turn skepticism into ideologically rigid resistance.   Thus, my war on “studies have shown” in undergraduate general education courses is part of gaining students’ respect for such studies by requiring them to think about the conclusions experts have derived from them – and not simply learn to repeat “what studies have shown.”

On the other hand, in my experience working-class adults have a strong tendency to give too much weight to their own direct observation and experience.  There is a clear strength to this, as they are often very complex interpreters of what they have seen and lived.  But it can cause them to discount the value of “book-learning” and “abstractions,” and it can make it difficult for them to articulate their interpretations of their direct observation and experience in a mixed-class, mixed-race, mixed-everything public setting.  On the plus side, though, “by my lights” is one of several expressions whereby people acknowledge not only that their own observation and experience are necessarily limited – that is, that they know they’re seeing or feeling only one small part of a massive elephant – but also that they are bringing their own unique framework, their way of seeing and thinking, to their report and interpretation of that experience.  And, in most cases, the expression invites others to share how they see things by their lights while firmly asserting the value of one’s own.  That is, I fancy that there is a grassroots working-class relativism that thinks and lives within an experientially based subjectivity, one that claims a large space (often too large, in my view) for belief and faith but that also sees a path to truth in intersubjective dialogue – usually looking for confirmation, but existentially open to correction and refinement by how others read their different experiences.

The educated middle class, on the other hand, while officially recognizing a thoroughgoing epistemological relativism (“observation interferes” even in physics), has a strong tendency to overestimate the number and certainty of “known facts,” to confuse “evidence” with “proof,” and to try to “escape” from belief through rigorous methodologies that can overcome or get beyond “subjective biases.”  The whole project of the sciences (social as well as natural) is to design and implement methods that free researchers not only of their own subjectivity but of all subjectivity, so that they can “find” objective truth.  These efforts can sometimes be quixotic and are often highly disingenuous, but over the past several centuries they have compiled an impressive array of “known facts” that could not have been derived from undisciplined sharing of beliefs and experiences.  Though the arts and humanities operate very differently, placing much more emphasis on the interpretation of direct experience, interior as well as exterior, we generally respect and defer to “scientific truth” without thinking it is all there is.  But we too tend to overestimate how much is known and the degree of certainty with which it is known.

If I had my way, there would be more experimentation with putting these two contrary but potentially complementary epistemologies together.  Barbara Jensen’s Reading Classes is not the first to do that within Working-Class Studies, but it is the most thorough and comprehensive (and admirably risky) attempt so far.  There are more such efforts in progress.  Christine Walley, for example, who spoke at last year’s How Class Works conference, will soon publish Exit Zero: Family and Class in Postindustrial Chicago.  Walley calls it an “autoethnography.”  The book begins with her childhood recollections of the day her father lost his job when Wisconsin Steel shut down forever, and Walley uses anthropological methods to trace the long arm of consequences that deindustrialization continues to visit not only on her family and its neighborhood but on a whole world of meanings and relationships extending well beyond them.

By my lights, these and other working-class studies have shown that there is a lot more to life and learning than is dreamt of in an exclusively middle-class philosophy.  But that’s true of a working-class one as well.  Cross-class coalitions, besides being crucial to our politics going forward, have a vast, nearly untapped potential for cultural sharing — not just of information and ideas, but of different ways of knowing.   With Reading Classes and Exit Zero we are better able to tap some of that potential.

Jack Metzgar

Chicago Working-Class Studies