
05 September 2013

Robert Jensen : Truce at the UT Factory

East Mall fountain, University of Texas at Austin. Photo by Frank Jaquier / Flickr.
With truce at the UT factory,
time to face tough choices
More than ever we need a university that refuses to serve power and instead focuses its resources on the compelling questions of social justice and ecological sustainability.
By Robert Jensen / The Rag Blog / September 5, 2013

AUSTIN -- A truce seems to have been negotiated in the long-running skirmish between the University of Texas and its conservative critics. The Board of Regents’ new chairman has toned down the rhetoric and signaled he wants to reduce tensions that have built over the past two years, which suggests that UT president Bill Powers may keep his job, at least for now.

The start of a new school year, along with this lull in the public squabbling (though Lord only knows what is going on behind the scenes), is a good time to step back and evaluate both sides of the debate.

On one side are Gov. Rick Perry and the regents he has appointed. Their basic complaint is that UT isn’t efficient enough in pursuing what they seem to believe is the primary purpose of the modern university: Churning out technologically competent and politically compliant graduates who will take their slots in the corporate capitalist hierarchy without complaining or questioning.

From that perspective, too many resources are being wasted on irrelevant research by a self-indulgent faculty, and the campus needs a president who can crack some professorial skulls.

On the other side are UT officials and supporters in the state Legislature who defend the quality of the instruction and research at the university, invoking the tradition of academic freedom and an intellectually diverse university, and/or Longhorn loyalty.

As a longtime UT faculty member (with tenure, and hence wide latitude to say what I really think), I don’t hesitate to condemn the anti-intellectual attacks coming from right-wing forces that want to undermine genuine critical thinking. But I also recognize that some of the conservative critique is on target -- there is a lot of irrelevant research being done by a lot of self-indulgent professors, though conservatives misunderstand the problem that creates.


Not enough oxygen for critical thinking

Because of both forces -- attacks pushing the university to the right, and faculty complacency -- there’s not enough genuine critical thinking going on at UT, at a time in the world when multiple cascading crises -- economic and ecological -- demand a critical thinking that is tougher than ever.

Stated bluntly: In 21 years of teaching at UT, I have seen how the reactionary politics of the conservatives and self-serving reactions of the faculty have not served students or society very well. The solution isn’t to force the university to become more factory-like or to defend the existing system of evaluating professors. Instead, we should ask: What is real critical thinking, and on what should it be focused?

Let’s start with the roots of the public squabble: Right-wing forces run the United States, and most of the world, but are never satisfied. Corporate profits are healthy and democracy is ailing; the increasing concentration of wealth undermines the radicalizing potential of democratic processes.

But that level of domination is never enough for the masters, and the right-wing has long wanted to shut down spaces where even token resistance is still possible, especially in journalism and education.

Challenges to corporate values are possible in the university, but that doesn’t mean they are widespread. Take one look at the UT catalog -- pay special attention to the economics and business courses -- and you will see the university isn’t exactly on the front lines of the revolution. The University of Texas is a corporately-run institution largely supportive of corporate values.

Where do faculty members fit in all this? The vast majority coexist with that corporate structure and value system, either because they agree with it or because they have decided not to fight. For the past three decades -- after the threat to an “orderly” society that broke out on college campuses in the 1960s was largely contained -- most faculty have been willing to keep their heads down and let individual career interests be their guide.

In science and technology fields, the result has been increased capitulation of research agendas to corporate demands. University research increasingly is valued when it can be turned into profit, the sooner the better, regardless of the effects on society or ecosystems. Basic science that has no immediate profit-potential is allowed, in part because it provides a necessary foundation for more applied work.

In the humanities and social sciences, the result has been a trend not only toward research that serves the master, but toward research that just doesn’t much matter. The most glaring example is the faddishness of so-called “postmodern” approaches to society, in which marginally coherent “theorizing” that is detached from the real world is not only accepted but celebrated.

When I ask students how they react to this allegedly sophisticated material, they usually roll their eyes. To them, it’s just one more part of college that must be endured to get a degree, like standing in line to get forms signed.

In the social sciences, researchers can easily advance careers not by asking important questions about how systems of power work, but by constructing complex models and methodologies that are, again, so allegedly sophisticated that they have to be important. Students also find most of this kind of work annoying, especially when faculty members have a hard time explaining why the articles being assigned are worth plodding through.


Finding our focus

I’m painting with a broad brush, of course. The University of Texas has many outstanding faculty members who care -- both about students and about the state of the world. I have colleagues I respect and from whom I learn. But the mediocrity and mendacity that I am describing are routine, and the system not only allows but rewards them.

If an individual professor breaks out of the system and spends too much time writing in plain language about subjects that potentially threaten the powerful, the career path gets rocky. As a result, most faculty members take the path of least resistance, accepting the conventional politics of the university and their academic disciplines.

I’ve been lucky in my own career, entering academic life more than two decades ago when it was easier to chart an alternative path; being white and male with the accompanying privileges; and getting some lucky breaks from sympathetic colleagues. As a result, I’ve been able to spend my career writing and teaching from a sharply critical perspective, and keep my job.

My focus has been on the human and ecological crises that existing systems of power -- both corporate and governmental -- have created and the problems those systems cannot honestly face, let alone solve.

That’s what I mean by critical thinking: Focusing first on power, and how concentrations of power undermine decent human communities. That focus is more important than ever, as the human species faces new and unique threats to a sustainable future. Climate change, soil erosion, fresh water shortages, chemical contamination, species extinction -- pick a topic in ecology, and the news is bad and getting worse, and our economic system is compounding the problems.

More than ever we need a university that refuses to serve power and instead focuses its resources on the compelling questions of social justice and ecological sustainability. Instead, the University of Texas has been caught up in a struggle with right-wing forces that want to eliminate what little space for critical thinking still exists. Given the siege mentality that this attack produces, critical self-reflection by faculty members is more difficult than ever.

I believe in the power of people to collectively face these problems and turn away from the death cult of contemporary consumer capitalism shaped by corporate values, and I believe that education is an important part of that struggle. I do not believe the University of Texas, as it exists today, is likely to contribute much to that struggle unless it not only fights the right-wing forces but recognizes that it is failing students and society.

To my faculty colleagues who scoff at this analysis, I would say: You are smart people, probably smarter than I, but being smart isn’t everything. Instead of investing time in building status in academic cliques -- where you spend a lot of energy reminding each other how smart you are -- wade out into the world and let your work be guided by a simple question: How are we humans going to save ourselves and save the planet from ruin? We live in an unsustainable world created by systems that concentrate wealth and power. Do we care?

America is burning, and professors have a choice to fiddle or fight.

This article was also published at the Austin Post.

[Robert Jensen is a professor in the School of Journalism at the University of Texas at Austin and board member of the Third Coast Activist Resource Center in Austin. His latest books are Arguing for Our Lives: A User’s Guide to Constructive Dialogue and We Are All Apocalyptic Now: On the Responsibilities of Teaching, Preaching, Reporting, Writing, and Speaking Out. His writing is published extensively in mainstream and alternative media. Robert Jensen can be reached at rjensen@austin.utexas.edu. Read more articles by Robert Jensen on The Rag Blog.]

The Rag Blog


19 June 2013

Robert Jensen : The Craziest Person in the Room


The craziest person in the room:
Reflections on how a mediocre
white guy can try to be useful
It’s the job of people with critical sensibilities -- those who consistently speak out for justice and sustainability, even when it’s difficult -- not to back away just because the world has grown more ominous.
By Robert Jensen / The Rag Blog / June 18, 2013

[This is an edited version of a talk given at the annual National Conference on Race and Ethnicity in American Higher Education (NCORE) in New Orleans on June 1, 2013.]

I recognize that the title for this presentation -- “The Craziest Person in the Room: Reflections on How a Mediocre White Guy Can Try to Be Useful” -- is not particularly elegant or enticing, maybe not very clear or even coherent. So, let me begin by explaining what I mean by some of these terms.

First, the “white guy”: For some years now, I’ve begun talks on injustice and inequality by acknowledging my status: White, male, educated, comfortably middle class, and born in the United States -- in short, a privileged citizen of a predatory imperial nation-state within a pathological capitalist economic system. Borrowing a line from a friend with the same profile, I observe that, “If I had been born good-looking, I would have had it all.”

That approach communicates to people in this room who don’t occupy these categories that I recognize my unearned privilege and the unjust systems and structures of power from which that privilege flows. (It also indicates that I am not afraid to look in a mirror.)

But today I won’t offer much more of that reflexive white liberal/progressive/radical genuflecting, which, while appropriate in many situations, increasingly feels to me like a highly choreographed dance that happens in what we might call “social-justice spaces.” In rooms such as this, such a performance feels like that -- just a performance.

So, yes, there are some things I don’t know and can’t know because I’m a white guy, and that demands real humility, a recognition that people on the other end of those hierarchies have different, and typically deeper, insights than mine. But after 25 years of work to understand the world in which I live, there are some things I am confident that I do know and that are more vitally important than ever.

This confidence flows from an awareness that I am mediocre.

About “mediocre”: Don’t worry, I don’t have a self-esteem problem. I am a tenured full professor at a major state research university, a job that I work hard at with some success. This is not false modesty; I believe I’m an above-average teacher who is particularly good at expressing serious ideas in plain language.

I describe myself as mediocre because I think that, whatever skills I have developed, I’m pretty ordinary and I think that most of us ordinary people are pretty mediocre -- good enough to get by, but nothing special. If we put some effort into our work and catch a few breaks (and I’ve had more than my share of lucky breaks), we’ll do OK. Too many bad breaks, and things fall apart quickly. I think this is an honest, and healthy, way to understand ourselves.

So, for me, “coming out” as mediocre is a way of reminding myself of my limits, to help me use whatever abilities I do have as effectively as possible. I’ve spent a quarter-century in academic and political life, during which time I’ve met some really smart people, and I can tell the difference between them and me. I have never broken new theoretical ground in any field, and I never will. I probably have never had a truly original idea. I’m a competent, hard-working second-tier intellectual and organizer.

As a result, I’ve focused on trying to get clear about basic issues: Why is it so difficult for U.S. society to transcend the white-supremacist ideas of its founding, even decades after the end of the country’s formal apartheid system? Why do patriarchal ideas dominate everywhere, even in the face of the compelling arguments of feminists?

Why do we continue to describe the United States as a democratic society when most ordinary people feel shut out of politics and the country operates on the world stage as a rogue state outside of international law? Why do we celebrate capitalism when it produces a world of unspeakable deprivation alongside indefensible affluence?

And why, in the face of multiple cascading ecological crises, do we collectively pretend that prosperity is just around the corner when what seems more likely to be around the corner is the cliff that we are about to go over?

Those are some really heavy questions, but people don’t have to pretend to be something special to deal with these challenges. We can be ordinary, average -- mediocre, in the sense I mean it -- and still do useful things to confront all this. Instead of trying to prove how special and smart we are, it’s fine to dig in and do the ordinary work of the world.

But people like me -- those of us with identities that come with all that unearned privilege -- do have one opportunity to do at least one thing that can be special: We don’t have to pretend to be the smartest, but we can strive to be the craziest person in the room.

Third, and final, clarification, about “crazy”: In this context, I mean crazy not in a pejorative but in an aspirational sense. I want to be as crazy as I can, in the sense of being unafraid of the radical implications of the radical analysis necessary to understand the world.

When such analysis is honest, the implications are challenging, even frightening. It is helpful to be a bit crazy, in this sense, to help us accept the responsibility of pushing as far and as hard as is possible and productive, in every space.

I take that to be my job, to leverage that unearned privilege to create as much space as possible for the most radical analysis possible, precisely because in some settings I am taken more seriously than those without that status.

If it’s true that white people tend to take me more seriously than a non-white person when talking about race, then I should be pushing those white folk. If I can get away with talking not just about the need for diversity but also about the enduring reality of racism -- and in the process, explain why the United States remains a white-supremacist society -- then I should talk “crazy” in that way, to make sure that analysis is part of the conversation, and to make it easier for non-white people to push in whatever direction they choose.

Once I’ve used the term “white supremacy,” it’s on the table for others who might be dismissed as “angry” if they had introduced it into the conversation.

If it’s true that men tend to take me more seriously than a woman when talking about gender, then I should be pushing the envelope. If I can get away with talking not just about the importance of respecting women but also about the enduring reality of sexism, then I should talk “crazy” about how rape is not deviant but normalized in a patriarchal culture, about how the buying and selling of women’s bodies for the sexual pleasure of men in prostitution, pornography, and stripping is a predictable consequence of the eroticizing of domination and subordination.

I should talk about the violent reality of imperialism, not just questioning the wisdom of a particular war but critiquing the sick structure of U.S. militarism. I should talk not just about the destructive nature of the worst corporations but also about the fundamental depravity of capitalism itself.

As someone with status and protection, I should always be thinking: What is the most radical formulation of the relevant analysis that will be effective in a particular time and place? Then I should probably take a chance and push it a half-step past that. I should do all this without resorting to jargon, either from the diversity world or the dogmatic left. I should say it as clearly as possible, even when that clarity makes people -- including me -- uncomfortable.

This isn’t always as difficult or risky as it seems. Outside of overtly reactionary political spaces, most people’s philosophical and theological systems are rooted in basic concepts of fairness, equality, and the inherent dignity of all people. Most of us endorse values that -- if we took them seriously -- should lead to an ethics and politics that reject the violence, exploitation, and oppression that defines the modern world.

If only a small percentage of people in any given society are truly sociopaths -- incapable of empathy, those who for some reason enjoy cruel and oppressive behavior -- then a radical analysis should make sense to lots of people.

But it is not, of course, that easy, because of the rewards available to us when we are willing to subordinate our stated principles in service of oppressive systems. I think that process works something like this:
  • The systems and structures in which we live are hierarchical.
  • Hierarchical systems and structures deliver to those in the dominant class certain privileges, pleasures, and material benefits, and some limited number of people in subordinated classes will be allowed access to most of those same rewards.
  • People are typically hesitant to give up privileges, pleasures, and benefits that make us feel good.
  • But, those benefits clearly come at the expense of the vast majority of those in the subordinated classes.
  • Given the widespread acceptance of basic notions of equality and human rights, the existence of hierarchy has to be justified in some way other than crass self-interest.
  • One of the most persuasive arguments for systems of domination and subordination is that they are “natural” and therefore inevitable, immutable. There’s no point getting all worked up about this -- it’s just the way things are.
If this analysis is accurate, that’s actually good news. I would rather believe that people take pains to rationalize a situation they understand to be morally problematic than to celebrate injustice. When people know they have to rationalize, it means they at least understand the problems of the systems, even if they won’t confront them.

So, our task is to take seriously that claim: Is this domination/subordination dynamic natural? Yes and no. Everything humans do is “natural,” in the tautological sense that since we do it, human nature obviously includes those particular characteristics. In that sense, a pacifist intentional community based on the collective good and a slave society based on exploitation are both natural.

We all know from our own experience that our individual nature includes varied capacities; we are capable of greedy, self-interested behavior, and we also can act out of solidarity and compassion. We make choices -- sometimes consciously, though more often without much deliberation -- within systems that encourage some aspects of our nature and suppress other parts.

Maybe there is a pecking order to these various aspects of human beings -- a ranking of the relative strength of these various parts of our nature -- but if that is the case, we know virtually nothing about it, and aren’t likely to know anytime soon, given the limits of our ability to understand our own psychology.

What we do understand is that the aspect of our nature that emerges as primary depends on the nature of the systems in which we live. Our focus should be on collective decisions we make about social structure, which is why it’s crucial to never let out of our sights the systems that do so much damage: white supremacy, patriarchy, imperialism, capitalism.

There are serious implications to that statement. For example, I do not think that meaningful social justice is possible within capitalism. My employer, the University of Texas at Austin, doesn’t agree. In fact, some units of the university -- most notably the departments of business, advertising, and economics -- are dedicated to entrenching capitalism. That means I will always be in a state of tension with my employer, if I’m true to my own stated beliefs.

Education and organizing efforts that stray too far from this focus will never be able to do more than smooth the rough edges off of systems that will continue to produce violence, exploitation, and oppression -- because that’s what those systems are designed to do.

If we are serious about resisting injustice, that list of systems we must challenge is daunting enough. But it is incomplete, and perhaps irrelevant, if we don’t confront what in some ways is the ultimate hierarchy, the central domination/subordination dynamic: the human belief in our right to control the planet.

Let me put this in plain terms: We live in a dead world. Not a world that is dying, but a world that is dead -- beyond repair, beyond reclamation, perhaps beyond redemption. The modern industrial high-energy/high-technology world is dead. I do not know how long life-as-we-know-it in the First World can continue, but the future of our so-called “lifestyle” likely will be measured in decades not centuries.

Whatever the time frame for collapse, the contraction has begun. I was born in 1958 and grew up in a world that promised endless expansion of everything -- of energy and material goods, of democracy and freedom. That bounty was never equitably distributed, of course, and those promises were mostly rhetorical cover for power. The good old days were never as good as we imagined, and they are now gone for good.

If that seems crazy, let me try again: The central illusion of the industrial world’s extractive economy -- propped up by a technological fundamentalism that is as irrational as all fundamentalisms -- is that we can maintain indefinitely a large-scale human presence on the earth at something like current First-World levels of consumption.

The task for those with critical sensibilities is not just to resist oppressive social arrangements, but to speak a simple truth that almost no one wants to acknowledge: This high-energy/high-technology life of affluent societies is a dead end. We can’t predict with precision how resource competition and ecological degradation will play out in the coming decades, but it is ecocidal to treat the planet as nothing more than a mine from which we extract and a landfill into which we dump. We cannot know for sure what time the party will end, but the party’s over.

Does that still sound crazy? Look at any crucial measure of the health of the ecosphere in which we live -- groundwater depletion, topsoil loss, chemical contamination, increased toxicity in our own bodies, the number and size of dead zones in the oceans, accelerating extinction of species, and reduction of biodiversity -- and ask a simple question: Where are we heading?

Remember also that we live in an oil-based world that is rapidly depleting the cheap and easily accessible oil, which means we face a major reconfiguration of the infrastructure that undergirds daily life. Meanwhile, the desperation to avoid that reconfiguration has brought us to the era of “extreme energy,” using more dangerous and destructive technologies (hydrofracturing, deep-water drilling, mountaintop coal removal, tar sands extraction). Instead of gently putting our foot on the brakes and powering down, we are slamming into overdrive.

And there is the undeniable trajectory of global warming/global weirding, climate change/climate disruption -- the end of a stable planet.

Scientists these days are talking about tipping points (June 7, 2012, issue of Nature) and planetary boundaries (September 23, 2009, issue of Nature), about how human activity is pushing Earth beyond its limits. Recently 22 top scientists warned that humans likely are forcing a planetary-scale critical transition “with the potential to transform Earth rapidly and irreversibly into a state unknown in human experience,” which means that “the biological resources we take for granted at present may be subject to rapid and unpredictable transformations within a few human generations.” (Anthony Barnosky et al., “Approaching a state shift in Earth’s biosphere,” Nature, June 7, 2012.)

That conclusion is the product of science and common sense, not supernatural beliefs or conspiracy theories. The political/social implications are clear: There are no solutions to our problems if we insist on maintaining the high-energy/high-technology existence lived in much of the industrialized world (and desired by many currently excluded from it).

Many tough-minded folk who are willing to challenge other oppressive systems hold on tightly to this lifestyle. The critic Fredric Jameson wrote that, “It is easier to imagine the end of the world than to imagine the end of capitalism,” but that’s only part of the problem -- for some, it may be easier to imagine the end of the world than to imagine the end of air conditioning.

I’m not moving into rapture talk, but we do live in end-times, of a sort. Not the end of the world -- the planet will carry on with or without us -- but the end of the human systems that structure our politics, economics, and social life.

All this matters for anyone concerned not only about the larger living world but also the state of the human family. Ecological sustainability and social justice are not separate projects. One obvious reason is that ecological crises do not affect everyone equally -- as those in the environmental justice movement say, the poor and oppressed of the planet tend to be hit “first and worst, hardest and longest” by ecological degradation.

These ecological realities also affect the landscape on which we organize, and progressive and radical movements on the whole have not spent enough time thinking about this.

First, let me be clear, even though there is no guarantee we can change the disastrous course of contemporary society, we should affirm the value of our work for justice and sustainability. We take on projects that we realize may fail because it’s the right thing to do, and by doing so we create new possibilities for ourselves and the world. Just as we all know that someday we will die and yet still get out of bed every day, an honest account of planetary reality need not paralyze us.

Then let’s abandon worn-out clichés such as, “The American people will do the right thing if they know the truth,” or “Past social movements prove the impossible can happen.” There is no evidence that awareness of injustice will automatically lead U.S. citizens, or anyone else, to correct it. When people believe injustice is necessary to maintain their material comfort, some accept those conditions without complaint.

Social movements around race, gender, and sexuality have been successful in changing oppressive laws and practices, and to a lesser degree in shifting deeply held beliefs. But the movements we most often celebrate, such as the post-World War II civil rights struggle, operated in a culture that assumed continuing economic expansion.

We now live in a time of permanent contraction -- there will be less, not more, of everything. Pressuring a dominant group to surrender some privileges when there is an expectation of endless bounty is a very different project than when there is intensified competition for increasingly scarce resources. That doesn’t mean nothing can be done to advance justice and sustainability, only that we should not be glib about the inevitability of it.

If all this seems like more than one can bear, it’s because it is. We are facing new, more expansive challenges. Never in human history have potential catastrophes been so global; never have social and ecological crises of this scale threatened at the same time; never have we had so much information about the threats we must come to terms with.

It’s easy to cover up our inability to face this by projecting it onto others. When someone tells me “I agree with your assessment, but people can’t handle it,” I assume what that person really means is, “I can’t handle it.” But handling it is, in the end, the only sensible choice. To handle it is to be a moral agent, responsible for oneself and one’s place in a community.

Mainstream politicians will continue to protect existing systems of power, corporate executives will continue to maximize profit without concern, and the majority of people will continue to avoid these questions. It’s the job of people with critical sensibilities -- those who consistently speak out for justice and sustainability, even when it’s difficult -- not to back away just because the world has grown more ominous.

Facing this doesn’t demand that we separate from mainstream society or give up ongoing projects that seek a more just world within existing systems. I am a professor at a university that does not share my values or analysis, yet I continue to teach.

In my community, I am part of a group that helps people create worker-cooperatives that will operate within a capitalist system that I believe to be a dead end. I belong to a congregation that struggles to radicalize Christianity while remaining part of a cautious, often cowardly, denomination. We do what we can, where we can, based on our best assessment of what will move us forward.

That may not be compelling to everyone. So, just in case I have dug myself into a hole with some people, I’ll deploy a strategy well known to white people talking about social justice: When you get in trouble, quote an icon from the civil-rights movement. In this case, I’ll choose James Baldwin, from a 1962 essay about the struggles of artists to help a society, such as white-supremacist America, face the depth of its pathology.

On this question of dealing honestly with hard truths, Baldwin reminds us, “Not everything that is faced can be changed; but nothing can be changed until it is faced.” In that essay, titled “As Much Truth as One Can Bear,” Baldwin suggested that a great writer attempts “to tell as much of the truth as one can bear, and then a little more.” (James Baldwin, “As Much Truth As One Can Bear,” in Randall Kenan, ed., The Cross of Redemption: Uncollected Writings [New York: Pantheon, 2010], pp. 28-34.)

He was speaking about the struggle for justice within the human family, but if we extend that spirit to the state of the larger living world, the necessary formulation today would be “to tell as much of the truth as one can bear, and then all the rest of the truth, whether we can bear it or not.”

By avoiding the stark reality of our moment in history we don’t make ourselves safe. All we do is undermine the potential of struggles for justice and sustainability and guarantee that the end of the human evolutionary experiment will be ugly beyond our imagination. We must remember, as Baldwin said, “that life is the only touchstone and that life is dangerous, and that without the joyful acceptance of this danger, there can never be any safety for anyone, ever, anywhere.”

This article was also published at Racism Review.

[Robert Jensen is a professor in the School of Journalism at the University of Texas at Austin and board member of the Third Coast Activist Resource Center in Austin. His latest books are Arguing for Our Lives: A User’s Guide to Constructive Dialogue and We Are All Apocalyptic Now: On the Responsibilities of Teaching, Preaching, Reporting, Writing, and Speaking Out. His writing is published extensively in mainstream and alternative media. Robert Jensen can be reached at rjensen@austin.utexas.edu. Read more articles by Robert Jensen on The Rag Blog.]

The Rag Blog


16 May 2013

Robert Jensen : The Collapse of Journalism

Graphic treatment by James Retherford / The Rag Blog.
The collapse of journalism and 
the journalism of collapse:
From royal, to prophetic, to apocalyptic
When we strip away supernatural claims and delusions of grandeur, we can understand the prophetic as the calling out of injustice, the willingness not only to confront the abuses of the powerful but to acknowledge our own complicity.
By Robert Jensen / The Rag Blog / May 16, 2013
Listen to the podcast of Thorne Dreyer's May 10, 2013, Rag Radio interview with Bob Jensen at the Internet Archive. Rag Radio, a syndicated radio show, is first broadcast -- and streamed live -- Fridays from 2-3 p.m. on KOOP 91.7-FM in Austin, Texas.
For those who believe that a robust public-affairs journalism is essential for a society striving to be democratic, the 21st century has been characterized by bad news that keeps getting worse.

Whatever one’s evaluation of traditional advertising-supported news media (and I have been among its critics; more on that later), the unraveling of that business model has left us with fewer professional journalists who are being paid a living wage to do original reporting. It’s unrealistic to imagine that journalism can flourish without journalists who have the time and resources to do journalism.

For those who care about a robust human presence on the planet, the 21st century has been characterized by really bad news that keeps getting really, really worse.

Whatever one’s evaluation of high-energy/high-technology civilization (and I have been among its critics; more on that later), it’s now clear that we are hitting physical limits; we cannot expect to maintain contemporary levels of consumption that draw down the ecological capital of the planet at rates dramatically beyond replacement levels. It's unrealistic to imagine that we can go on treating the planet as nothing more than a mine from which we extract and a landfill into which we dump.

We have no choice but to deal with the collapse of journalism, but we also should recognize the need for a journalism of collapse. Everyone understands that economic changes are forcing a refashioning of the journalism profession. It’s long past time for everyone to pay attention to how multiple, cascading ecological crises should be changing professional journalism’s mission in even more dramatic fashion.

It’s time for an apocalyptic journalism (that takes some explaining; a lot more on that later).


The basics of journalism: Ideals and limitations

With the rapid expansion of journalistic-like material on the Internet, it’s especially crucial to define “real” journalism. In a democratic system, ideally journalism is a critical, independent source of information, analysis, and the varied opinions needed by citizens who want to play a meaningful role in the formation of public policy.

The key terms are “critical” and “independent” -- to fulfill the promise of a free press, journalists must be willing to critique not only specific people and policies, but the systems out of which they emerge, and they must be as free as possible from constraining influences, both overt and subtle.

Also included in that definition of journalism is an understanding of democracy -- “a meaningful role in the formation of public policy” -- as more than just lining up to vote in elections that offer competing sets of elites who represent roughly similar programs. Meaningful democracy involves meaningful participation.

This discussion will focus on what is typically called mainstream journalism, the corporate-commercial news media. These are the journalists who work for daily newspapers, broadcast and cable television, and the corporately owned platforms on the Internet and other digital devices.

Although there are many types of independent and alternative journalism of varying quality, the vast majority of Americans continue to receive the vast majority of their news from these mainstream sources, which are almost always organized as large corporations and funded primarily by advertising.

Right-wing politicians and commentators sometimes refer to the mainstream media as the “lamestream,” implying that journalists are comically incompetent and incapable of providing an accurate account of the world, likely due to a lack of understanding of conservative people and their ideas. While many elite journalists may be dismissive of the cultural values of conservatives, this critique ignores the key questions about journalism’s relationship to power.

Focusing on the cultural politics of individual reporters and editors -- pointing out that they tend to be less religious and more supportive of gay and women’s rights than the general public, for example -- diverts attention from more crucial questions about how the institutional politics of corporate owners and managers shapes the news and keeps mainstream journalism within a centrist/right conventional wisdom.

The managers of commercial news organizations in the United States typically reject that claim by citing the unbreachable “firewall” between the journalistic and the business sides of the operation, which is supposed to allow journalists to pursue any story without interference from the corporate front office.

This exchange I had with a newspaper editor captures the ideology: After listening to my summary of this critique of the U.S. commercial news media system, this editor (let’s call him Joe) told me proudly: “No one from corporate headquarters has ever called me to tell me what to run in my paper.” I asked Joe if it were possible that he simply had internalized the value system of the folks who run the corporation (and, by extension, the folks who run most of the world), and therefore they never needed to give him direct instructions.

He rejected that, reasserting his independence from any force outside his newsroom.

I countered: “Let’s say, for the purposes of discussion, that you and I were equally capable journalists in terms of professional skills, that we were both reasonable candidates for the job of editor-in-chief that you hold. If we had both applied for the job, do you think your corporate bosses would have ever considered me for the position, given my politics? Would I, for even a second, have been seen by them to be a viable candidate for the job?”

Joe’s politics are pretty conventional, well within the range of mainstream Republicans and Democrats -- he supports big business and U.S. supremacy in global politics and economics. I’m a critic of capitalism and U.S. foreign policy. On some political issues, Joe and I would agree, but we diverge sharply on these core questions of the nature of the economy and the state.

Joe pondered my question and conceded that I was right, that his bosses would never hire someone with my politics, no matter how qualified, to run one of their newspapers. The conversation trailed off, and we parted without resolving our differences.

I would like to think my critique at least got Joe to question his platitudes, but I never saw any evidence of that. In his subsequent writing and public comments that I read and heard, Joe continued to assert that a news media system dominated by for-profit corporations was the best way to produce the critical, independent journalism that citizens in a democracy needed.

Because he was in a position of some privilege and status, nothing compelled Joe to respond to my challenge.

Partly as a result of many such unproductive conversations, I continue to search for new ways to present a critique of mainstream journalism that might break through that ideological wall. In addition to thinking about alternatives to this traditional business model, we should confront the limitations of the corresponding professional model, with its status-quo-supportive ideology of neutrality, balance, and objectivity.

Can we create conditions under which journalism -- deeply critical and truly independent -- can flourish in these trying times?

In this essay I want to try out theological concepts of the royal, prophetic, and apocalyptic traditions. Though journalism is a secular institution, religion can provide a helpful vocabulary. The use of these terms is not meant to imply support for any particular religious tradition, or for religion more generally, but only recognizes that the fundamental struggles of human history play out in religious and secular settings, and we can learn from all of that history.

With a focus on the United States, I’ll draw on the concepts as they are understood in the dominant U.S. tradition of Judaism and Christianity.


Royal journalism

Most of today’s mainstream corporate-commercial journalism -- the work done by people such as Joe -- is royal journalism, using the term “royal” not to describe a specific form of executive power but as a description of a system that centralizes authority and marginalizes the needs of ordinary people.

The royal tradition describes ancient Israel, the Roman empire, European monarchs, or contemporary America -- societies in which those with concentrated wealth and power can ignore the needs of the bulk of the population, societies where the wealthy and powerful offer platitudes about their beneficence as they pursue policies to enrich themselves.

In his books The Prophetic Imagination and The Practice of Prophetic Imagination, theologian Walter Brueggemann points out that this royal consciousness took hold after ancient Israel sank into disarray, when Solomon overturned Moses -- affluence, oppressive social policy, and static religion replaced a God of liberation with one used to serve an empire.

This consciousness develops not only in top leaders but throughout the privileged sectors, often filtering down to a wider public that accepts royal power. Brueggemann labels this a false consciousness: “The royal consciousness leads people to numbness, especially to numbness about death.”

The inclusion of the United States in a list of royalist societies may seem odd, given the democratic traditions of the country, but consider a nation that has been at war for more than a decade, in which economic inequality and the resulting suffering have dramatically deepened for the past four decades, in which climate change denial has increased as the evidence of the threat becomes undeniable. Brueggemann describes such a culture as one that is “competent to implement almost anything and to imagine almost nothing.”

Almost all mainstream corporate-commercial journalism is, in this sense, royal journalism. It is journalism without the imagination needed to move outside the framework created by the dominant systems of power. CNN, MSNBC, and FOX News all practice royal journalism. The New York Times is ground zero for royal journalism.

Marking these institutions as royalist doesn’t mean that no good journalism ever emerges from them, or that they employ no journalists who are capable of challenging royal arrangements. Instead, the term recognizes that these institutions lack the imagination necessary to step outside of the royal consciousness on a regular basis. Over time, they add to the numbness rather than jolt people out of it.

The royal consciousness of our day is defined by unchallengeable commitments to a high-energy/high-technology worldview, within a hierarchical economy, run by an imperial nation-state. These technological, economic, and national fundamentalisms produce a certain kind of story about ourselves, which encourages the belief that we can have anything we want without obligations to other peoples or other living things, and that we deserve this.

Brueggemann argues that this bolsters notions of “U.S. exceptionalism that gives warrant to the usurpatious pursuit of commodities in the name of freedom, at the expense of the neighbor.”

If one believes royal arrangements are just and sustainable, then royal journalism could be defended. If the royal tradition is illegitimate, then a different journalism is necessary.


Prophetic journalism 

Given the multiple crises that existing political, economic, and social systems have generated, the ideals of journalism call for a prophetic journalism. The first step in defending that claim is to remember what real prophets are not: They are not people who predict the future or demand that others follow them in lockstep.

In the Hebrew Bible and Christian New Testament, prophets are the figures who remind the people of the best of the tradition and point out how the people have strayed. In those traditions, using our prophetic imagination and speaking in a prophetic voice requires no special status in society, and no sense of being special. Claiming the prophetic tradition requires only honesty and courage.

When we strip away supernatural claims and delusions of grandeur, we can understand the prophetic as the calling out of injustice, the willingness not only to confront the abuses of the powerful but to acknowledge our own complicity. To speak prophetically requires us first to see honestly -- both how our world is structured by systems that create unjust and unsustainable conditions, and how we who live in the privileged parts of the world are implicated in those systems.

To speak prophetically is to refuse to shrink from what we discover or from our own place in these systems. We must confront the powers that be, and ourselves.

The Hebrew Bible offers us many models. Amos and Hosea, Jeremiah and Isaiah -- all rejected the pursuit of wealth or power and argued for the centrality of kindness and justice. The prophets condemned corrupt leaders but also called out all those privileged people in society who had turned from the demands of justice, which the faith makes central to human life.

In his analysis of these prophets, the scholar and activist Rabbi Abraham Joshua Heschel concluded:
Above all, the prophets remind us of the moral state of a people: Few are guilty, but all are responsible. If we admit that the individual is in some measure conditioned or affected by the spirit of society, an individual’s crime discloses society’s corruption.
Critical of royal consciousness, Brueggemann argues that the task of those speaking prophetically is to “penetrate the numbness in order to face the body of death in which we are caught” and “penetrate despair so that new futures can be believed in and embraced by us.” He encourages preachers to think of themselves as “handler[s] of the prophetic tradition,” a job description that also applies to other intellectual professions, including journalism.

Brueggemann argues that this isn’t about intellectuals imposing their views and values on others, but about being willing to “connect the dots”:
Prophetic preaching does not put people in crisis. Rather it names and makes palpable the crisis already pulsing among us. When the dots are connected, it will require naming the defining sins among us of environmental abuse, neighborly disregard, long-term racism, self-indulgent consumerism, all the staples from those ancient truthtellers translated into our time and place.
None of this requires journalists to advocate for specific politicians, parties, or political programs; we don’t need journalists to become propagandists. Journalists should strive for real independence but not confuse that with an illusory neutrality that traps mainstream journalists within ideological boundaries defined by the powerful.

Again, real independence means the ability to critique not just the worst abuses by the powerful within the systems, but to critique the systems themselves.

This prophetic calling is consistent with the aphorism many journalists claim as a shorthand mission statement: The purpose of journalism is to comfort the afflicted and afflict the comfortable. That phrase focuses on injustice within human societies, but what of the relationship of human beings to the larger living world? How should journalists understand their mission in that arena?


Ecological realities

Let’s put analysis of journalism on hold and think about the larger world in which journalism operates. Journalistic ideals and norms should change as historical conditions change, and today that means facing tough questions about ecological sustainability.

There is considerable evidence to help us evaluate the health of the ecosphere on which our own lives depend, and an honest evaluation of that evidence leads to a disturbing conclusion: Life as we know it is almost over. That is, the high-energy/high-technology life that we in the affluent societies live is a dead-end.

There is a growing realization that we have disrupted planetary forces in ways we cannot control and do not fully understand. We cannot predict the specific times and places where dramatic breakdowns will occur, but we can know that the living system on which we depend is breaking down.

Does that seem histrionic? Excessively alarmist? Look at any crucial measure of the health of the ecosphere in which we live -- groundwater depletion, topsoil loss, chemical contamination, increased toxicity in our own bodies, the number and size of “dead zones” in the oceans, accelerating extinction of species and reduction of biodiversity -- and the news is bad.

Add to that the mother of all ecological crises -- global warming, climate change, climate disruption -- and it’s clear that we are creating a planet that cannot indefinitely support a large-scale human presence living this culture’s idea of the good life.

We also live in an oil-based world that is rapidly depleting the cheap and easily accessible oil, which means we face a huge reconfiguration of the infrastructure that undergirds our lives. Meanwhile, the desperation to avoid that reconfiguration has brought us to the era of “extreme energy” using even more dangerous and destructive technologies (hydrofracturing, deep-water drilling, mountain-top removal, tar sands extraction) to get at the remaining hydrocarbons.

Where are we heading? Off the rails? Into the wall? Over the cliff? Pick your favorite metaphor. Scientists these days are talking about tipping points and planetary boundaries, about how human activity is pushing the planet beyond its limits.

Recently, 22 top scientists writing in the prestigious journal Nature warned that humans likely are forcing a planetary-scale critical transition “with the potential to transform Earth rapidly and irreversibly into a state unknown in human experience.” That means that “the biological resources we take for granted at present may be subject to rapid and unpredictable transformations within a few human generations.”

That means that we’re in trouble, not in some imaginary science-fiction future, but in our present reality. We can’t pretend all that’s needed is tinkering with existing systems to fix a few environmental problems; significant changes in how we live are required. No matter where any one of us sits in the social and economic hierarchies, there is no escape from the dislocations that will come with such changes.

Money and power might insulate some from the most wrenching consequences of these shifts, but there is no permanent escape. We do not live in stable societies and no longer live on a stable planet. We may feel safe and secure in specific places at specific times, but it’s hard to believe in any safety and security in a collective sense.

In short, we live in apocalyptic times.


Apocalypse

To be clear: Speaking apocalyptically need not be limited to claims that the world will end on a guru’s timetable or according to some allegedly divine plan. Lots of apocalyptic visions -- religious and secular -- offer such certainty, imagining the replacement of a corrupt society by one structured on principles that will redeem humanity (or at least redeem those who sign onto the principles). But this need not be our only understanding of the term.

Most discussions of revelation and apocalypse in contemporary America focus on the Book of Revelation, also known as The Apocalypse of John, the final book of the Christian New Testament. The two terms are synonymous in their original meaning; “revelation” from Latin and “apocalypse” from Greek both mean a lifting of the veil, a disclosure of something hidden from most people, a coming to clarity.

Many scholars interpret the Book of Revelation not as a set of predictions about the future but as a critique of the oppression of the empire of that day, Rome.

To speak apocalyptically, in this tradition, is first and foremost about deepening our understanding of the world, seeing through the obfuscations of people in power. In our propaganda-saturated world (think about the amount of advertising, public relations, and marketing that we are bombarded with daily), coming to that kind of clarity about the nature of the empires of our day is always a struggle, and that notion of revelation is more crucial than ever.

Thinking apocalyptically, coming to this clarity, will force us to confront crises that concentrated wealth and power create, and reflect on our role in these systems. Given the severity of the human assault on the ecosphere, compounded by the suffering and strife within the human family, honest apocalyptic thinking that is firmly grounded in a systematic evaluation of the state of the world is not only sensible but a moral obligation.

Rather than thinking of revelation as divine delivery of a clear message about some fantastic future above, we can engage in an ongoing process of revelation that results from an honest struggle to understand, a process that requires a lot of effort.

Things are bad, systems are failing, and the status quo won’t last forever. Thinking apocalyptically in this fashion demands of us considerable courage and commitment. This process will not produce definitive answers but rather help us identify new directions.

Again, to be very clear: “Apocalypse” in this context does not mean lakes of fire, rivers of blood, or bodies lifted up to heaven. The shift from the prophetic to the apocalyptic can instead mark the point when hope in the viability of existing systems is no longer possible and we must think in dramatically new ways.

Invoking the apocalyptic recognizes the end of something. It’s not about rapture but a rupture severe enough to change the nature of the whole game.


Apocalyptic journalism

The prophetic imagination helps us analyze the historical moment we’re in, but it’s based on an implicit faith that the systems in which we live can be reshaped to stop the worst consequences of the royal consciousness, to shake off that numbness of death in time.

What if that is no longer possible? Then it is time to think about what’s on the other side. “The arc of the moral universe is long, but it bends toward justice,” said Martin Luther King, Jr., one of the better-known voices in the prophetic tradition. But if the arc is now bending toward a quite different future, a different approach is needed.

Because no one can predict the future, these two approaches are not mutually exclusive; people should not be afraid to think prophetically and apocalyptically at the same time. We can simultaneously explore immediate changes in the existing systems and think about new systems.

Invoking the prophetic in the face of royal consciousness does not promise quick change and a carefree future, but it implies that a disastrous course can be corrected. But what if the justification for such hope evaporates? When prophetic warnings have not been heeded, what comes next? This is the time when an apocalyptic sensibility is needed.

Fred Guterl, the executive editor of Scientific American, models that spirit in his book The Fate of the Species. Though he places himself on the “techno-optimistic side of the spectrum,” he does not shy away from a blunt discussion of the challenges humans face:
There’s no going back on our reliance on computers and high-tech medicine, agriculture, power generation, and so forth without causing vast human suffering -- unless you want to contemplate reducing the world population by many billions of people. We have climbed out on a technological limb, and turning back is a disturbing option. We are dependent on our technology, yet our technology now presents the seeds of our own destruction. It’s a dilemma. I don’t pretend to have a way out. We should start by being aware of the problem.
I don’t share Guterl’s techno-optimism, but it strikes me as different from a technological fundamentalism (the quasi-religious belief that the use of advanced technology is always a good thing and that any problems caused by the unintended consequences of such technology can be remedied by more technology) that assumes that humans can invent themselves out of any problem. Guterl doesn’t deny the magnitude of the problems and recognizes the real possibility, perhaps even the inevitability, of massive social dislocation:
[W]e’re going to need the spirit with which these ideas were hatched to solve the problems we have created. Tossing aside technological optimism is not a realistic option. This doesn’t mean technology is going to save us. We may still be doomed. But without it, we are surely doomed.
Closer to my own assessment is James Lovelock, a Fellow of the Royal Society, whose work led to the detection of the widespread presence of CFCs in the atmosphere. Most famous for his “Gaia hypothesis” that understands both the living and non-living parts of the earth as a complex system that can be thought of as a single organism, he suggests that we face these stark realities immediately:
The great party of the twentieth century is coming to an end, and unless we now start preparing our survival kit we will soon be just another species eking out an existence in the few remaining habitable regions. ... We should be the heart and mind of the Earth, not its malady. So let us be brave and cease thinking of human needs and rights alone and see that we have harmed the living Earth and need to make our peace with Gaia.
Anything that blocks us from looking honestly at reality, no matter how harsh the reality, must be rejected. It’s a lot to ask, of people and of journalists, to not only think about this, but put it at the center of our lives. What choice do we have? To borrow from one of 20th century America’s most honest writers, James Baldwin, “Not everything that is faced can be changed; but nothing can be changed until it is faced.”

That line is from an essay titled “As Much Truth as One Can Bear,” about the struggles of artists to help a society, such as white-supremacist America, face the depth of its pathology. Baldwin suggested that a great writer attempts “to tell as much of the truth as one can bear, and then a little more.” If we think of Baldwin as sounding a prophetic call, an apocalyptic invocation would be “to tell as much of the truth as one can bear, and then all the rest of the truth, whether we can bear it or not.”

That task is difficult enough when people are relatively free to pursue inquiry without external constraints. Are the dominant corporate-commercial/advertising-supported media outlets likely to encourage journalists to pursue the projects that might lead to such questions? If not, the apocalyptic journalism we need is more likely to emerge from the margins, where people are not trapped by illusions of neutrality or concerned about professional status.


[INSERT HOPEFUL ENDING HERE] 

That subhead is not an editing oversight. I wish there were an easy solution, an upbeat conclusion. I don’t have one. I’ve never heard anyone else articulate one. To face the world honestly at this moment in human history likely means giving up on easy and upbeat.

The apocalyptic tradition reminds us that the absence of hope does not have to leave us completely hopeless, that life is always at the same time about death, and then rejuvenation. If we don’t have easy, upbeat solutions and conclusions, we have the ability to keep telling stories of struggle. Our stories do not change the physical world, but they have the potential to change us. In that sense, the poet Muriel Rukeyser was right when she said, “The universe is made of stories, not of atoms.”

To think apocalyptically is not to give up on ourselves, but only to give up on the arrogant stories that we modern humans have been telling about ourselves. The royal must give way to the prophetic and the apocalyptic. The central story that power likes to tell -- that the domination/subordination dynamic that structures so much of modern life is natural and inevitable -- must give way to stories of dignity, solidarity, equality. We must resist not only the cruelty of repression but the seduction of comfort.

The best journalists in our tradition have seen themselves as responsible for telling stories about the struggle for social justice. Today, we can add stories about the struggle for ecological sustainability to that mission. Our hope for a decent future -- indeed, any hope for even the idea of a future -- depends on our ability to tell stories not of how humans have ruled the world but how we can live in the world.

Whether or not we like it, we are all apocalyptic now.

This article was also published at AlterNet.




25 April 2013

Norman Pagett and Josephine Smit : Can We 'Downsize' and Survive?

Sewers under construction, north bank of the Thames looking west. Image from End of More.
The end of more:
Can we 'downsize' and survive?
We continue to delude ourselves that 'downsizing' will somehow allow us to carry on with our current lifestyle with perhaps only minor inconveniences.
By Norman Pagett and Josephine Smit / The End of More / April 25, 2013
"Healthy citizens are the greatest asset any country can have.” -- Winston Churchill
LONDON -- Faced with inevitable decline in our access to hydrocarbon resources, we read of numerous ways in which we will have to downsize, use less, work less, grow our own food, use goods and services close to home, consume only what we can manufacture within our own personal environment, or within walking distance.

If we are to survive, we must "live local" because the means to exist in any other context is likely to become very difficult. There is rarely, if ever, any mention of the healthcare we currently enjoy, which has given us a reasonably fit and healthy 80-year average lifespan.

There seems to be a strange expectation that we will remain as healthy as we are now, or become healthier still through a less stressful lifestyle of bucolic bliss, tending our vegetable gardens and chicken coops, irrespective of any other problems we face.

And while "downsizing" -- a somewhat bizarre concept in itself -- might affect every other aspect of our lives, it will not apply to doctors, medical staff, hospitals and the vast power-hungry pharmaceutical factories and supply chains that give them round the clock backup.

Nor does downsizing appear to apply to the other emergency services we can call on if our home is on fire or those of criminal intent wish to relieve us of what is rightfully ours. Alternative lifestylers seem to have blanked out the detail that fire engines, ambulances and police cars need fuel, and the people who man them need to get paid, fed, and moved around quickly.

In other words "we" can reduce our imprint on the environment, as long as those who support our way of life do not. Humanity, at least our "Western" developed segment of it, is enjoying a phase of good health and longevity that is an anomaly in historical terms. There is a refusal to recognize that our health and well-being will only last as long as we have cheap hydrocarbon energy available to support it.

Only 150 years ago average life expectancy was around 40 years and medical care was primitive, basic, and dangerous. Children had only a 50/50 chance of reaching their fifth birthday. Death was accepted as unfortunate and inevitable, but big families ultimately allowed survival of a few offspring to maturity, which gave some insurance against the inevitable privations of old age.

The causes of disease, many of which we know to be the result of the filth and chaos of crowded living, contaminated water, and sewage, were merely guessed at. The overpowering smell of this waste was generally accepted as a cause of a great deal of otherwise unexplained sickness.

Even the ancient Romans built their sewers to contain the smells they considered dangerous; getting rid of sewage was a bonus. Malaria literally meant "bad air," and the name of the disease has stayed with us even though we now know its true cause.


Prevailing winds

As cities developed, particularly in Europe, the more prosperous quarters were, and still are, built in the south and west, to take advantage of the general prevailing winds blowing the smells of the city eastwards. Thus the east side of many cities had to endure the industrialization that created the prosperity of the western suburbs.

In many respects the populations of European cities of the eighteenth and nineteenth centuries reflected the problems of our own times: they were growing faster than any means could be found to sustain them. Cities were seen as sources of wealth and prosperity, so people crowded together in them, but in so doing they created the seedbeds for the diseases that were making the cities ultimately untenable.

To quote from Samuel Pepys’ Diary:
This morning one came to me to advise with me where to make me a window into my cellar in lieu of one that Sir W. Batten had stopped up; and going down into my cellar to look, I put my foot into a great heap of turds, by which I find that Mr. Turner's house of office is full and comes into my cellar, which doth trouble me. -- October 20, 1660
People were being debilitated and killed by the toxicity of their own wastes and that of the animals used for muscle power and food. By 1810 the million inhabitants of London (by then the biggest city in the world) used 200,000 cesspits; their contents could only be cleared out manually and so were usually neglected. Waste simply accumulated because no authority took final responsibility for doing anything about it, and any laws on the matter were widely flouted.

By the 1840s, water closets were coming into general use in more affluent homes through the availability of pumped water. While these were seen as an improvement on the chamber pots of previous eras, the water closets resulted in greater quantities of water flowing into the cesspits.

This water in turn overflowed into street drains that had only been created to take rainwater into ditches and tributaries of the River Thames. Improvements in personal hygiene, allowing the upper classes to "flush and forget," had unwittingly created an even bigger danger to public health for everyone else.

Cities and towns were expanding under the pressure of industrialization, but by continuing to use a pre-industrial infrastructure of waste disposal they were being constantly hit by outbreaks of diseases that swept through huddled tenements and luxury homes alike.

Draw off points for public drinking water were often carelessly close to sewage discharges, or the water came from town wells that were contaminated by overflowing cesspits. Cholera and typhoid fever became the scourge of Victorian London.

The Thames as it ran through the city became an open sewer, as tidal flows washed effluent back and forth twice a day. It was a problem that grew throughout the early part of the nineteenth century, culminating in the unusually hot summer of 1858 when bacteria thriving in the fetid water created what became known as the "great stink."

Even the business of government itself was overcome, and plans were made to evacuate parliament to Oxford or St Albans, such was the overpowering stench of the river. Curtains soaked in chloride of lime could not counteract the smell of raw sewage rising from the Thames outside, but the crisis at least focused minds and money on the problem.

Numerous proposals were made to deal with it, but only Joseph Bazalgette, chief engineer of the London Metropolitan Board of Works, came up with a workable solution. This was a truly stupendous undertaking that involved building 82 miles of intercepting sewers on the north and south banks of the Thames serving 450 miles of main sewers, linking to 13,000 miles of minor street drains. The completed system could deal with a daily waste output of half a million gallons of sewage.

The sewers were designed to take the raw effluent out to the coast to the north and south of London by gravity, terminating in giant pumping stations driven by Cornish beam engines each needing 5,000 tons of coal a year to keep them running. They lifted the sewage into giant reservoirs that discharged it out to sea on ebb tides. No attempt was made to treat the sewage, merely to get rid of it.

To build those sewers required 315 million bricks, and almost a million tons of mortar and cement. You can’t make bricks and mortar without heat, and lots of it. The only source of heat on that scale was coal, which could only be got in quantity by deep mining. With the heat energy from coal, Victorian engineers could manufacture top quality bricks by the million in enormous new kilns, rather than on the relatively small scale previously allowed by using wood as a heat source.

London embankment sewer brickwork under construction. Image from End of More.

A marvel of Victorian engineering

The entire scheme was completed between 1856 and 1870 and was a marvel of Victorian engineering, but it was only made feasible by fossil fuel energy. Coal from deep mines had only become widely available in the late 1700s, when the invention of the viable steam engine allowed miners to pump out flood water from deep shafts (the same type of steam engines that pumped the sewage to the sea).

Bazalgette’s enterprise was the biggest undertaking of civil works in the world at that time, and from firing the bricks to discharging waste into the open sea it depended entirely on the availability of cheap energy from coal. Even the delivery of the bricks and materials into the heart of the city could only have been done by the recently constructed steam powered railways.

The sewer system is out of sight and largely out of mind but remains a stark example of how we need continual energy inputs at the most basic level to sustain our health. The same sewers still keep London healthy today, and they discharge a hundred times the volume anticipated by Bazalgette’s original design.

It was ironic that burning cheap coal would save thousands of lives in the capital city by providing the means to build its sewers, while simultaneously causing thousands of deaths over the following century by poisoning its air, until the introduction of the Clean Air Act in 1956.

Every developed town and city across the world now safeguards the health of its citizens in the same way, by pumping away wastes to a safe distance before treatment. But to do it there must be constant availability of hydrocarbon energy. Electricity will enable you to pump water and sewage but it cannot provide all the infrastructure needed to build or maintain a fresh water or waste treatment plant; for that you need oil, coal, and gas.

Modern domestic plumbing systems are now made largely of plastic, which is manufactured exclusively from oil feedstock, while concrete main sewer pipes are produced using processes that are equally energy intensive. The safe discharge of human waste and the input of fresh water have been critical to health and prosperity across the developed world, yet we continue to delude ourselves that "downsizing" will somehow allow us to carry on with our current lifestyle with perhaps only minor inconveniences.

But we are even more deluded when it comes to the medical profession and all of the advanced treatments and technologies it can provide to keep us in good health for ever longer lifespans and make our lives as pain-free as possible. We have a blind faith that we can continue to benefit from a highly complex, energy-intensive healthcare system, irrespective of what happens to our energy supplies.

We read of the conditions endured by our not-so-distant forebears, and recoil in horror at the prevalence of the dirt and diseases they had to accept as part of their lives. We should perhaps stop to consider that they did not have the means to make it otherwise. In the absence of any real medical help, people who could afford them carried a pomander, a small container of scented herbs held to the nose as some kind of protection against disease and the worst of the city odours.

We think of ourselves as somehow different, but our modern health system will survive only as long as the modern day pomander of our hydrocarbon shield is there to protect it.

The last century saw massive advances in healthcare, driven by both fossil fuel and world war. The new technology and energy sources available at the start of the First World War allowed killing on an industrial scale, but they also drove innovation and the industrialization of medical care. The war saw the development of the triage system for prioritizing treatment of the wounded, and new means of quickly transporting patients away from the dangers of the battlefield.

In 1914 Marie Curie adapted her X-ray equipment into mobile units specifically designed for battlefield conditions. At the same time, disease was being contained with the help of mobile laboratories, tetanus antitoxin, and vaccination against typhoid. All this was no defence against the virus of the so-called Spanish flu, which broke out and spread among troops and civilians alike, killing more people than the previous four years of conflict in a pandemic that ran from 1918 to 1920.

The war had caused some 37 million casualties, and estimates put the death toll of the flu pandemic at up to another 50 million, but even those enormous numbers show as barely a blip against the inexorable rise in population over the last century.


Laying a foundation for modern medical care

The skills that had been employed to create the sewage disposal and fresh water pumping works of the nineteenth century now provided the foundations for making medical care and childbirth cleaner and safer in the twentieth.

But every innovation demanded energy input. Even the production of chlorine-based bleach, which kills the bacteria of tetanus, cholera, typhus, carbuncle, hepatitis, enterovirus, streptococcus, and staphylococcus, and which we now take for granted, would not have been possible without the industrial backup to manufacture and distribute it.

Incorrectly handled, chlorine will kill almost anything, including us. Progress in healthcare might have appeared slow to those involved, but in historical terms it began to move rapidly. Fossil fuel energy provided a cleaner environment for humanity to breed, and we began to make up the numbers lost between 1914 and 1920.

While human ingenuity was critical to such rapid progress, none of it would have been possible without the driving force of oil, coal, and gas. Our collective health today still hangs by that thread of hydrocarbon.

As the industrial power of nations forced technology ahead at an ever increasing pace after World War One, the underlying energy driving our factory production systems increased general prosperity, and that in turn financed research into unknown areas of disease.

Alexander Fleming, professor of bacteriology at St Mary's Hospital in London, first identified Penicillium mould in a petri dish in his laboratory in 1928, and began to recognize its potential for preventing post-surgical wound infections. But its full potential was not brought into play until World War Two, just over a decade later.

The drug had been created on the laboratory bench, but it needed the power of energy-driven industry to make it available in quantity. Constraints in Britain’s wartime manufacturing capacity meant that production had to be carried out in the U.S., and even there it proved difficult to refine the process to produce penicillin on an industrial scale.

John L. Smith, who was to become president and chairman of Pfizer, and who worked on the deep-tank fermentation process that provided a successful solution to large-scale production, said of penicillin:
The mold is as temperamental as an opera singer, the yields are low, the isolation is difficult, the extraction is murder, the purification invites disaster, and the assay is unsatisfactory.
Even with the power of American industry behind it, penicillin became available for limited use on war wounds only in 1944-45, and was not released for general use until after the war.

For little more than a century, developments in safe drinking water supply, sanitation, and medical science have allowed us progressively to tackle many once-fatal diseases and illnesses. We minimized the risk of infection and created vaccines, cures, or life-prolonging treatments for everything from measles to cancers.

Western affluence and medical technologies support lives that would not otherwise be viable, for those who are born prematurely or who suffer serious injury, disability or illness. Medical treatment now incorporates preventative measures to extend lives and keep people in "perfect" health for as long as possible. As a result, average life expectancy across the global population has grown from just under 50 years in the 1950s to 67 years today.

So-called "miracle" drugs gave man a sense of omnipotence that tipped into hubris when, in 1969, U.S. Surgeon General William Stewart, was reported to have said it was time to “close the book on infectious disease.”


Fighting a losing battle

But we have not closed that book, nor are we likely to. Sir Alexander Fleming himself forecast that bacteria exposed to his new wonder drug would eventually develop resistance to it. Within decades the effectiveness of antibiotics against Staphylococcus aureus was diminishing, and methicillin-resistant Staphylococcus aureus, the MRSA "superbug," was taking hold.

It is easy to forget that before the development of antibiotics the medical profession could offer no effective cure for infections such as pneumonia, and a slight scratch from a rose thorn could be enough to cause death from blood poisoning.

We are fighting a losing battle against nature; bacteria will always win the war of numbers. No matter what medication we add to our arsenal, bacteria will always mutate to resist it. Since the emergence of MRSA, hospitals have had to deal with constantly mutating new strains, each one more virulent than the last, testing our ingenuity in dealing with them, and killing patients we thought could be protected from such infection.

In some regions of the world the malaria parasite is becoming resistant to the anti-malarial drug artemisinin, while drug-resistant tuberculosis has been reported in 77 countries, according to research by the U.S. Centers for Disease Control and Prevention.

In our arrogance we have failed to take account of nature’s resilience, and have also neglected to consider human nature and our instinct to put self-interest above the common good, even if contagion is spread in the process. The behavior of the human race is less easily controlled than bacteria in a petri dish.

In less developed parts of the world, notably Africa, HIV/AIDS and other infectious diseases continue to claim nearly 10 million lives a year. Global political directives and programmes to prevent and tackle disease commonly fall short of their objectives for a variety of reasons, including localised corruption, lack of financial support from the wealthy West, and misinformation propagated through local superstition or by religious groups.


Tending to the rich

In spite of the good intentions of global leaders, there continues to be a huge disparity between the health risks and care of rich and poor within cities, nations, and regions of the world. The U.S. has more than a third of the world's health workers, tending the diseases of the affluent: heart disease, stroke, and cancer.

Many of the consuming world’s ills are being caused by people’s excesses, eating too much of the wrong foods, drinking too much alcohol, smoking, or sunbathing. A billion of the world’s people are overweight, a figure that is balanced in the cruelest of ironies by the billion who cannot find enough to eat.

At the same time, the poor of the world often lack access to medical facilities, doctors, and drugs, and also to the basics of safe drinking water, sanitation, and waste disposal. It is estimated that almost half of the developing world’s population live without sanitation, and as increasing numbers of people are living in overcrowded, urban conditions the potential for transmission of infectious disease grows.

The consuming nations had the geological good fortune to be sitting on the resources -- coal and iron -- needed to build water and waste disposal systems, but others have been far less fortunate. We now see megacities like Lagos, with populations of 10 million or more and little or no water or sewage infrastructure, in tropical heat.

For them, the energy to build a modern health infrastructure is a dream that will never materialize: there is too little energy left and it has all become too expensive.

It is also becoming too expensive for the consuming countries of the West, as can be seen in the cuts now being made to government health service budgets. We have developed extremely successful and innovative medical technologies, a pill for every ill, and a physical infrastructure of surgeries, clinics, and hospital buildings: all highly sophisticated luxuries that consume vast amounts of energy and that we can no longer afford.

The U.S. Environmental Protection Agency estimates that hospitals use twice as much energy per square foot as a comparable office block, to keep the lights, heating, ventilation, and air conditioning on 24/7 and run an array of equipment from refrigerators to MRI scanners.

But don’t take our word for it. Dan Bednarz, PhD, health-care consultant and editor of the Health after Oil blog, presented his view of the future at a nurses’ conference in Pennsylvania, USA:
Fossil fuel costs will continue to rise and eventually the healthcare system will be forced to downsize -- just as the baby boomers and (possibly) climate change effects inundate the system.
Without energy input our hospitals and medical systems cannot be maintained at their present levels, and concepts of health and care become very different.

We are already seeing a resurgence of alternative medical therapies, often using herbs similar to those in the historic pomander. This foreshadows what will happen in our post-industrial future as well-fed health and wellbeing give way to weakness and disease, accentuated by poor nutrition, and the energy-driven skills of modern medicine are no longer readily available.

A doctor may still know what ails you, but that knowledge might be almost his only advantage over his medieval counterpart. Knowing that you need an antibiotic to stop a raging infection will be of little use if there's no means of getting hold of it.

Just contemplate the "innovative" methods of the surgeons in northern Italy’s medieval universities in the 1400s:
"They washed the wound with wine, scrupulously removing every foreign particle; then they brought the edges together, not allowing wine nor anything to remain within -- dry adhesive surfaces were their desire. Nature, they said, produce the means of union in a viscous exudation, or natural as it was afterwards called by Paracelsus, Pare, and Wurtz. in older wounds they did their best to obtain union by desiccation, and refreshing of the edges. Upon the outer surface they laid only lint steeped in wine.” -- Sir Clifford Allbutt, regius professor of physic, University of Cambridge
The modern health system has replaced our need to take responsibility for our own bodies. It cannot give us immortality, but it has given us the next best thing: long, safe, and comfortable lives. We built our good health on hydrocarbon energy, but in the future a wealth of factors will make it progressively more difficult for us to exert control over disease as that energy source slips from our grasp.

Disease will become more prevalent, not only in localized outbreaks but at epidemic and even pandemic levels. A healthcare system cannot downsize; it is either there or it isn't.

[Norman Pagett is a UK-based professional technical writer and communicator, working in the engineering, building, transport, environmental, health, and food industries. Josephine Smit is a UK-based journalist specializing in architecture and environmental issues and policy who has freelanced for British newspapers including the Sunday Times. Together they edit and write The End of More.]

The Rag Blog

