Keywords for the Age of Austerity 11: Civility (at NYU and the University of Illinois)

Earlier this month, the University of Illinois at Urbana-Champaign took the unprecedented step of rescinding a job offer to the Palestinian-born scholar Steven Salaita, who was set to begin classes there this week. It was a unilateral move by the upper administration, apparently taken in response to a series of tweets in which Salaita condemned the Israeli bombardment of Gaza. Others have already written on the case and its implications for academic freedom—see especially Corey Robin’s blog and this op-ed by many Illinois faculty. (Also check out @FakeCaryNelson on Twitter, for all the latest from a fictional version of the former advocate of academic freedom.)

In the spirit of this blog, I want to focus on the two official statements on the case from Illinois’ Chancellor, Phyllis Wise, and its Board of Trustees. As efforts at damage control, they are on the one hand singular in their ineloquence and ineptitude. On the other hand, they are familiar in their abuse of notions like “civility,” “debate,” and “discourse”—especially when the latter are “robust,” a keyword forthcoming on this blog.

As others have already observed, the letters from the Chancellor and the Board make a mockery of important scholarly concepts like academic freedom, constitutionality, and English syntax. In a key section of her letter, published as a blog post on her office’s website, Chancellor Wise reaches a cannot-and-will-not crescendo that is meant to signal to you that this is a Robust Leader speaking. It ends with an illogical mess that signals to me that this is instead a rather desperate manager (without a copy editor) grasping at rhetorical straws:

What we cannot and will not tolerate at the University of Illinois are personal and disrespectful words or actions that demean and abuse either viewpoints themselves or those who express them.

Viewpoints, of course, can’t be demeaned—nor is there any attempt to explain what constitutes “personal,” “disrespectful,” demeaning, or abusive words, much less the combination of all four, much less still the relationship between viewpoints and those who express them.

Among these other sins, though, Wise’s short letter is also rather redundant: it uses “diverse” or “diversity” four times, “discourse” three times, and “civil” or “civility” three times. To quote her again at length:

Some of our faculty are critical of Israel, while others are strong supporters. These debates make us stronger as an institution and force advocates of all viewpoints to confront the arguments and perspectives offered by others. We are a university built on precisely this type of dialogue, discourse and debate.

Note the redundant use of “dialogue, discourse and debate” here, in which all three are treated as identical concepts, their differences elided in the banal, alliterative evocation of intellectual life as imagined by bureaucrats—a sing-songy pantomime of actual thinking.

The follow-up letter from the Board of Trustees doubles down on Wise’s careless invocation of “civility” as the highest virtue of intellectual life. They use it as part of a grander claim about the university’s social and political mission:

Our campuses must be safe harbors where students and faculty from all backgrounds and cultures feel valued, respected and comfortable expressing their views…The University of Illinois must shape men and women who will contribute as citizens in a diverse and multi­cultural democracy. To succeed in this mission, we must constantly reinforce our expectation of a university community that values civility as much as scholarship.

Disrespectful and demeaning speech that promotes malice is not an acceptable form of civil argument if we wish to ensure that students, faculty and staff are comfortable in a place of scholarship and education. If we educate a generation of students to believe otherwise, we will have jeopardized the very system that so many have made such great sacrifices to defend.

(Please note, just as an aside, the allusion to American military casualties, and the consequent suggestion that the war dead gave all for the Illinois Board of Trustees.)

The Board’s combination of scholarly “civility” and democratic citizenship brings together two threads in the use of this vague, popular term. Besides the above, think of the “Civility Caucus” in Congress, or the regular lamentations in the press at election time that inter-party squabbling is too “coarse” and hostile. In all these cases, the celebration of “civility” conflates the tone of disagreement with disagreement itself, and ultimately suppresses both. As I wrote in a longer essay on the subject in Guernica:

The desire for civil discourse in mainstream politics conceals a deeper desire for a politics of consensus, with no major points of either ideological or practical disagreement. In this view, politics becomes simply a process of managing government bureaucracy; fundamental social conflicts do not exist, only rhetorical ones do.

The other trouble with “civility” is that it is unclear what it means, or if it means anything. In the Salaita case, if his offense is anti-Semitism—a demonstrably untrue charge—then it should be enough for Wise to denounce him for that alone. Instead, as Brian Leiter writes in a piece on the Salaita affair, “incivility” seems here to simply mean bad manners—something nobody should want university administrators adjudicating, nor people losing their livelihoods over.

Of course, these notions of civility (and again, Wise’s related four D’s—debate, discourse, diversity, and dialogue) as the glue holding campuses together are always summoned by administrators as rhetorical weapons against particularly troublesome campus dissenters. So on the simplest level, “civility” is merely an invention to discredit your opponent’s point of view as irrational. Given the word’s etymological links with “civilize” and “civilization,” this is a mode of attack with which Palestinians like Salaita are likely quite familiar.


A photo of Cary Nelson (at left), uncivilly blocking traffic at the NYU library in 2005, during the graduate assistant strike (via Mondoweiss)

When I was a graduate student at NYU during the 2005–06 strike by the graduate employee union, we heard a lot of civility talk from university administrators who were hostile to graduate assistant unionization but were unwilling to honestly say why. NYU loved to intimate that our parent union, the UAW, would try to rewrite syllabi, that unionization would forever sully ties between faculty and students, that it was hostile to undergraduates.

As with so many keywords beloved by university administrators— “innovation,” “entrepreneurship,” and so on—there is an opportunistic element of the sacred, or at least the sacrosanct, in these treatments of the university. Once administrators feel threatened, campuses become halls of peaceful contemplation, “safe harbors,” as the Illinois Board of Trustees puts it, from the tumult of the world outside.

For academic workers, via Corey Robin: If you want to join a specific pledge from a discipline or wish to sign the general statement, here are the critical links:

  1. General, non-discipline-specific, boycott statement: 1402 and counting!
  2. Philosophy: 340. Email John Protevi at or add your name in a comment at this link.
  3. Political Science: 174. Email Joe Lowndes at
  4. Sociology: 248.
  5. History: 66.
  6. Chicano/a and Latino/a Studies: 74.
  7. Communications: 94.
  8. Rhetoric/Composition: 32.
  9. English: 266. Email Elaine Freedgood at
  10. Contingent academic workers: 210.
  11. Anthropology: 134.
  12. Women’s/Gender/Feminist Studies: 54. Email Barbara Winslow

And if you’re not an academic but want to tell the UI to reinstate Salaita, you can sign this petition. More than 15,000 have.


Stakeholders in Ferguson

As the militarized police occupation of Ferguson, MO, drew comparisons between the midwestern suburb and a “foreign authoritarian country,” the town’s police chief affected a different sort of vocabulary in one of his press conferences. [Put aside, for a moment, the deep naivete of a writer, like this one for, so stymied by violent repression in the United States, God’s country and freest land on earth, that he must invoke “Middle East dictatorships” as the only available comparison for the images on his TV screen.] The Ferguson PD released the name of the uniformed killer of young Mike Brown, the Boston Globe reported, after consultation with “stakeholders”:


Obviously the decision was taken at the highest levels of the local police brass; likely Missouri’s governor and the Department of Justice had a role in the decision. Nothing this police department has done yet smacks of consultation or transparency, so the well-trained recourse to the discourse of “stakeholders” is laughable here. Stakeholder, as I argued in an earlier post, is an austerity keyword that started in business schools and has migrated into the world of municipal government, non-profits, and organizations of all types. The word has financial origins, but it aims to reassure audiences that what they are witnessing is an egalitarian partnership, not a hierarchical enterprise, at work. As I wrote then:

Like other phrases derived from gambling and finance that have migrated into democratic politics—the appropriately gruesome phrase “skin in the game” comes to mind—stakeholder conflates access with rights, obscuring hierarchies of power under the veneer of cooperation.

A determined group of citizens in Ferguson seem undeceived by the laughably thin veneer of cooperation on display there, however.

Keywords for the Age of Austerity 10: Sustainability

 “Sustainable” is an old word, which once referred negatively to an emotional burden one could endure; it also enjoyed popularity as a synonym of “provable,” in a legal sense. These now-obsolete usages gave way to the more general modern meaning, as “capable of being maintained or continued at a certain rate or level.”

For this contemporary definition the Oxford English Dictionary gives mostly economic examples, and indeed “sustainable” was until quite recently used to refer to “steady” growth, with none of the ethical or environmental meanings we now associate with the term. “The Big Three’s first-quarter production plans look more sustainable now than they did a month ago,” wrote the Wall Street Journal in 1986, referring only to car sales projections, not gas mileage or carbon footprints.

Since the turn of the last century, the word has been used to mean “capable of being maintained” with the implied adverb “environmentally.” As a marketing term [do not click on this link, I am warning you]—and it is ubiquitous as a marketing term— “sustainable” is roughly synonymous with “smart,” suggestive of technological innovation along with a sense of moral conscientiousness and forward thinking. (Moral improvement is deeply embedded in the ideology of “innovation,” as well, as we saw in that keyword essay). “Sustainable” is the cornerstone of what a wince-inducing urbanist blog calls the “New Artisan Economy”: “By producing small quantities of artisanal products in an environmentally friendly way,” this author writes, “the overall economy becomes more sustainable which is a benefit for everyone” [sic].

The contemporary ethical-conservationist meaning of the word “sustainable” tracks with the rise of the noun form “sustainability,” a word almost unknown before the 1980s. BYU’s Corpus of Historical American English, which tracks word usage in popular written media, shows no uses of the term before that decade, and Google’s Ngram Viewer offers just a handful, mostly Defense Department memos and other bureaucratic documents lacking public circulation. The coinage of “sustainability” correlates with the rise of “sustainable development,” a conservationist critique of development economics that emphasizes the frailty of nature—which the World Bank lovingly calls “natural capital.” Where mid-20th-century development theory once advanced economic growth as its ideal, sustainable development offers “sustainability.” Interestingly, this move from “growth” to “sustainability” can be seen in the changes in popular uses of the word “sustainable” itself, from the Wall Street Journal’s 1986 usage to today.

The United Nations has helped define and popularize the concept in various summits and proclamations: the 1987 Brundtland Report defined “sustainable development” as “development that meets the needs of the present without compromising the ability of future generations to meet their own needs.” The word came into broader circulation in the 1990s, when it was the focus of the 1992 Rio Earth Summit, which made “sustainable” a byword of developmentalist ethics and official environmental policy-making. From the Summit’s report, called Agenda 21: 

Principle 1: Human beings are at the centre of concerns for sustainable development.  They are entitled to a healthy and productive life in harmony with nature.


Principle 8: To achieve sustainable development and a higher quality of life for all people, States should reduce and eliminate unsustainable patterns of production and consumption and promote appropriate demographic policies.

“Sustainable” has the advantage of being unambiguously good—who wants to be exhaustible?—and invitingly vague. It can accommodate Marxist critics of capitalism and neo-Malthusian doomsday cranks. Mining companies love it. And as you might expect, BP is totally committed to “sustainability,” and has a website to prove it. (In a happy coincidence, sustaining Earth’s ecology and sustaining BP’s shareholder dividends are two sides of the same sustainable coin: “The best way for BP to achieve sustainable success as a company,” their website cheers, “is to act in the long-term interests of our shareholders, our partners and society.”) This combination of ethical straightforwardness in theory—we must be responsible stewards of natural resources for future generations, yes, yes, we all agree—and subjective imprecision in practice is the source of much of its popularity, as scholars have pointed out. And then there is also the temporal lag of counter-evidence: the final proof that our current practices are in fact unsustainable will not come until after we are dead. 

So “sustainability,” like “innovation,” combines literal vagueness with moral certainty. As Keith Douglass Warner and David DeCosse point out in a blog post, “sustainability,” much like “efficiency,” “does not have an intrinsic meaning.” The question, as they argue, is sustainable for whom, and for how long? One will not get a clear answer by surveying the uses of the word. Duke Energy loves to tweet about “sustainability,” as does McDonald’s; McMansions can be “stunning and sustainable.” The Sotheby’s primer on “sustainable eco-mansions” reassures buyers that “making a home sustainable is a scalable effort.” What this means is that the imprimatur of “sustainability” can be bought cheaply or dearly, as one wishes: Energy Star appliances and native-plant gardens at the low end, solar panels and reclaimed barn-lumber siding at the higher price point.

The marketing of “sustainability” exemplifies the framing of structural problems as individual ones, and of practices of citizenship as ones of consumption. Thus the inevitable “sustainability apps.” As used by the self-described “urban sustainability consultant” Warren Karlenzig, writing on the website Sustainable Cities Collective, “sustainability” is a libertarian notion of social change, but one in which the anti-social nihilism of “disruption” is softened by a green touch:

Open data will reduce urban traffic congestion: no longer must cars circle downtown blocks as real-time parking rates and open spaces become transparent. Even more sustainable are those who are deciding to telecommute or use public transit on days when they know that parking costs are spiking or when spaces are unavailable.

Built around a labored, confusing metaphor of cities as beehives, and developers and end users as “swarms” of bees, Karlenzig’s thesis is that a “sustainable” city will be spawned by technological expertise and venture capital: “Our pollen dance,” he writes, “will be our testimonials, use patterns, geo-location, and referrals.” 

As a lifestyle and marketing term, “sustainable” can paradoxically express the same capitalist triumphalism—of an ever-expanding horizon of goods and services, of “growth” without consequences—that the conservationist concept was once meant to critique. “Sustainable development,” fuzzy as it is, was intended to remind us of the limited supply and unequal exploitation of natural resources. But if “sustainable” most literally means an ability to keep on doing something, its popularity as a consumerist value suggests that there is a fine line between “sustainable” and “complacent.” We can “sustain” grossly unequal cities—that is, they won’t fall apart utterly—with Lyft and Airbnb, rather than mass transit and affordable housing. For a while, anyway. Whether we will sustain our desire to live in them is another question.

Detroit on $1 Million a Day

Joshua Akers and I published this response in Guernica to Ben Austen’s hagiographic profile of Dan Gilbert in the New York Times Magazine. 

The new Detroit—the one surveyed by what Austen calls the “new prospector class”—resides in a few neighborhoods around downtown and Wayne State University. Perhaps it also lives in the fantasies of tech entrepreneurs whose most substantial decision thus far has likely been choosing what color bean bag chairs to put in their offices. It is built, though, on the backs of mostly black workers cleaning offices, staffing cafeterias, washing dishes, cleaning casino floors, and occasionally finding their way to the front of the house in new downtown bars and restaurants. Such Detroiters are nowhere to be seen in Austen’s account of male entrepreneurial heroes. They are bystanders to some free-market experiment in which the only consequences, apparently, are whether or not speculative investments result in profit.


Keywords for the Age of Austerity 9: Content

In a press release announcing its acquisition of the much-loved TV comedy South Park and the yet-to-be-loved comedy The Hotwives of Orlando, Hulu trumpeted its expanding “library of exclusive, current and library content.” Hulu’s Senior Vice President and “Head of Content” Craig Erwich wrote:

I could not be more thrilled to announce that we are continuing the momentum this year by bringing new seasons of our beloved Originals, as well as the premiere of our brand new title ‘The Hotwives of Orlando’ and new library deals that will make Hulu’s content offering more robust and diverse than ever before.

In volume 1 of Capital, Marx famously explained commodity fetishism under capitalism as an alienating social world in which, he wrote, 

the relations connecting the labour of one individual with that of the rest appear, not as direct social relations between individuals at work, but as what they really are, material relations between persons and social relations between things.

The reversal contained in Marx’s phrase—manufactured things take on the dynamic richness of people, while people themselves are reduced to mere objects—is part of what bothers Benjamin Hart, in a perceptive article in Salon, about the use of the term “content” in the contemporary culture industry. Hart objects to the degradation of art—what a TV executive might call “quality content”—by its confusion with fluff. But Breaking Bad is a commodity, of course, a product sold to advertisers and viewers in exchange for money, no different in this fundamental respect from Who’s the Boss or The Hotwives of Orlando. The problem with “content,” therefore, runs deeper than the boundary between high art and low culture, to the privatization of the desires, knowledge, and experiences we gain from the stories we read, watch, and remember. “Content” names artistic and narrative creativity, and therefore creators themselves, as things like any other. 

“Content” is ubiquitous in entertainment journalism and in industry discourse—television, film, and music industries in particular use the term regularly, while book publishers, perhaps conscious of the antiquity and prestige of their medium, seem to use it less (please correct me in the comments or on Twitter if I’m wrong here).

The rise of “content” in its current form can be traced to the broadband web. In 2000, Time Magazine reported the merger of AOL and Time Warner by explaining that the new technology of “broadband” originates in “the fat, fast pipes of cable television that could carry vast amounts of Internet content.” The anachronistic materiality of this description (the Internet as a series of tubes, or fat pipes) points out how “content” as a term underscores literary and visual media’s dissolution into digital immateriality. This is not to wring hands about the rise of e-books and small screens and the decline of print and cinema but to emphasize, rather, how digitization is an intensification of the commodification of all forms of culture.

As I found in some preliminary research on BYU’s Corpus of Historical American English, pre-2000 uses of the term “content” mostly follow the Oxford English Dictionary’s definition, even with its outdated print bias: “the things contained or treated of in a writing or document; the various subdivisions of its subject matter,” as in the table of contents.

Elaine Green, assistant principal of Detroit’s Mumford High School, told Time Magazine in June 1989 that teachers and students at her school were “pleased with the quality and content” of Channel 1, the TV news program distributed to schools that, as I recall it from my own high school days, was a beachhead in commercial advertising’s invasion of the school day. Green’s use was once typical—“content” was simply the stuff in Channel 1’s programming, not the programming itself.

The term also thrived in the 1990s in calls for government regulation of music, movies, and video games. Tipper Gore’s hilarious anecdote about her encounter with Prince’s “Darling Nikki” in 1985 and Congress’ sanction of the National Endowment for the Arts in the late 1990s focused attention on the “graphic content” of music and art.

Now, an intermission (parental guidance suggested):

Its popularity among executives, politicians, and advertisers gives “content” a drearily bureaucratic ring. In common phrases like “violent content,” “sexual content,” or “inappropriate content,” the word refers to knowledge and information that should be policed. In this context, it is a purposely bloodless euphemism for any controversial narrative, visual, or verbal elements of a work of art (and it refers, in music, only to lyrics, almost never to tone, melody, or rhythm). This usage of “content,” as the raw material by which an artistic work could be judged and condemned, without any attempt at interpretation, presaged the contemporary ubiquity of the term. Today, as Hart observes, “content” is just a “substance” made of digital words, which is how Merriam Webster’s pleasingly cheeky definition now describes it: “the principal substance (as written matter, illustrations, or music) offered by a World Wide Web site.”

This digital substance is the basis of so-called “content farms,” websites that cheaply and quickly produce articles meant to optimize search results. Outfits like and the defunct Associated Content have used low-paid writers (reportedly earning as little as $3.50 per story) to produce articles intended to game Google results and thereby build a stockpile of “content” used to sell targeted ads. See the example, described by Farhad Manjoo, of an Associated Content article that used the phrase “Tiger Woods mistress pictures” 8 times.

Content-substance is undistinguished by genre, subject matter, level of specialization, or style. It is a marketer’s term, used to describe anything that generates views, subscriptions, or ticket sales. But its popularity is less a symptom of the fragmentation of the media market—the multiplication of genres and the web-enabled devices where we consume them—than it is of the widespread privatization and privation of the educational, editorial, and journalistic professions, which has been encouraged, but not invented, by the Internet. The stories of journalism school graduates and newsroom veterans, like one laid-off Miami Herald reporter who turned to content farms to make ends meet in something resembling their chosen profession, are distressing cases in point.

“Content” in educational reform discourse refers to everything that is contained in a curriculum; it’s a usage that reflects the uniformity that reform critics like Diane Ravitch have criticized in the push for school “accountability.” California’s Common Core standards informational sheet, for example, refers throughout to “content areas”—what I might call a discipline or a subject, like history or math. And “content standards” are “curricular and instructional strategies that best deliver the content to their students.” The implicitly quantitative presumption that “content” reveals here—curricular knowledge that can be measured, repeated, and reliably delivered—is especially clear in the popularity of the construction “content delivery,” beloved by media managers and tech firms.

As things we are accustomed to thinking of as “culture” that we care about—novels, cinema, “prestige” television shows, investigative journalism, Purple Rain—are understood ever more bluntly by advertisers, producers, and others as mere commodities to be “delivered” to buyers or policed for their most literal meaning, what are more obviously “mere” commodities—brand names, commercials, and other objects sold by commercials—are imbued with the aesthetic character of the work we care about. Thus, advertising site (get it?) on “brand storytelling,” and the American Marketing Association’s  seminar on “how to map content to personas and stages of the buyers’ journey.” Here, “content” is treated as a direct path to the dreams, aspirations, doubts, and fears of individuals.

Borrowing a New Agey vernacular of life as a “journey,” a consumer’s potential purchase of a brand-new plasma TV or season 6 of Who’s the Boss? on DVD is graced with spiritual consequences. New Age spiritualism was its own kind of commodified spirituality, of course, making the brave new world of content ownership and ownership “journeys” alienation of the second order.






Keywords for the Age of Austerity 8: Accountability

Accountability, n; accountable, adj.

Like most of these austerity keywords, “accountability” is a term that has exploded in popularity in the last three decades after lying relatively dormant for centuries. “Innovation,” “entrepreneurship,” and “nimble” all frame acquisitiveness and sacrifice as art and virtue. With its combination of moral responsibility and the task-based “counting” embedded in the word itself, accountability goes further, and captures the popular fantasy of quantifying virtue.


Count the potential Keywords for the Age of Austerity in the preamble to the No Child Left Behind Act, above

 “Accountability” is popular as a term of art on both the left and the right, in calls for “corporate accountability” and “government accountability.” Its greatest influence, however, has come in the field of U.S. public education, especially since the 2002 No Child Left Behind Act. An unsatisfying explanation of the term appears on the website of the Department of Education:

Under the act’s accountability provisions, states must describe how they will close the achievement gap and make sure all students, including those who are disadvantaged, achieve academic proficiency. They must produce annual state and school district report cards that inform parents and communities about state and school progress. Schools that do not make progress must provide supplemental services, such as free tutoring or after-school assistance; take corrective actions; and, if still not making adequate yearly progress after five years, make dramatic changes to the way the school is run.

The definition is very nearly tautological, since it explains “accountability” by means of the mechanisms for being “held accountable.”

“Accountability” is popular, as well, in the management literature where “leaders” justify themselves to each other. What, for example, do the sages at the Harvard Business Review see as the biggest fault among “leaders” today? “No matter how tough a game they may talk about performance,” they write, “when it comes to holding people’s feet to the fire, leaders step back from the heat.” In other words: not blaming other people for enough things. Indeed, “accountability” is a word that, unlike its relative “responsibility,” assumes retribution. That is, while one can generally be responsible—for your friends, relatives, students, goldfish, etc.—you are only held accountable, by someone else, when you have failed. (Of course, Forbes Magazine says accountability is “not about punishment,” which sounds suspiciously like “this is gonna hurt me more than it hurts you.”)

It is also a concept that, like stakeholder, aids a firm’s public projection of responsibility: don’t regulate us, the term announces, we’re holding ourselves accountable just fine. See, for example, AccountAbility, a consulting firm to the “Financial Services, Pharmaceutical, and Energy and Extractives” industries—are there any more irresponsible industries than these three?—which identifies its goal as helping clients “embed ethical, environmental, social, and governance accountability into their organisational DNA.”

The term’s popularity represents a shift in official political discourse in (at least) the post-Reagan era, as the “taxpayer” has replaced the “citizen” as the subject of democratic politics. Take this description of the NCLB’s “accountability mechanisms” in an education journal, Technology and Engineering Teacher, which makes free public education sound like a commercial transaction between taxpayers and teachers. The mechanisms are intended, the authors write, to “[hold] educators accountable to public taxpayers for the learning occurring in their classrooms.” (The vagueness of the antecedent of “their” is telling.) And in another sign of the times, the General Accounting Office, founded in 1921 under the Harding administration, changed its name in 2004 to the Government Accountability Office. The GAO’s original mission was to seek “greater economy or efficiency in public expenditures.” Now, however, “the GAO investigates how the federal government spends taxpayer dollars,” a subtle shift that underscores a basic hostility to the “public expenditures” taken for granted in the original.


Catechism on the Doctrines, Usages, and Holy Days of the Protestant Episcopal Church (1879)

Education scholars and journalists often describe the present moment in education policy as “the Age of Accountability,” an unintentional but ominous reference to the term’s last burst of popularity in the middle of the 19th century. Then, “accountability” was used to assess children, but in an explicitly theological sense. Protestant theologians debated the salvation of children who died before the “age of accountability,” a Calvinist concept taken up later in evangelical Protestantism that identifies an age at which a person’s own agency is subject to God’s judgment. Jacobus Arminius, a 16th-century Dutch theologian, argued that if children die before reaching the age when they could knowingly receive Christ, they go to heaven. Afterwards—tough luck. Accountability is a personal, moral category, and one’s judge is God, in his infinite wisdom or, if you are a Calvinist, his inscrutable arbitrariness. Teachers these days may recognize the latter type. 

The OED defines “accountability” as

liability to account for and answer for one’s conduct, performance of duties, etc. (in modern use often with regard to parliamentary, corporate, or financial liability to the public, shareholders, etc.); responsibility.

Although the definition identifies “responsibility” as a synonym, it emphasizes “liability,” a more limited form of financial or legal duty, rather than the moral obligation present in the Calvinist usage and still assumed in the word. Implicit in the idea of educational “accountability,” after all, is the very moral sense of responsibility for the welfare of all children that the No Child Left Behind law’s title announces. Regardless of its pedagogical worth, this is what gives the concept of “accountability” its political force, in education, government, and elsewhere: who would be against it?

Measurement is key in enforcing the notion of accountability in schools, and it is what many critics of NCLB fixate on: the high-stakes testing regimes, teacher evaluations, school grades, and so on. And yet there is something persistently vague about its usage. In my cursory reading of the text of NCLB, the term is never defined more clearly than it is above, except to specify that it refers to common standards and enforcement provisions. The law at times also seems to conflate the sanctions for failure—that is, being “held accountable,” or punished—with meeting the standard itself, or “being accountable,” a big difference.

Perhaps it is not explicitly defined because it is taken for common sense. Regardless of the sticks attached to school accountability, which vary by state—whether a school is closed or a teacher dismissed—accountability takes knowledge quantification as a fundamental, underlying principle, as NCLB critic Anthony Cody points out in a clever reading of a Common Core promotional video (image below).


As the video’s voice-over says:

Like it or not, life is full of measuring sticks: How smart we are, how fast we are, how we can, you know, compete. But up until now, it’s been pretty hard to tell how well kids are competing in school, and how well they’re going to do when they get out of school. We like to think that our education system does that. But when it comes to learning what they really need to be successful after graduation, is a girl in your neighborhood being taught as much as her friend over in the next one? Is a graduating senior in, say, St. Louis, as prepared to get a job as a graduate in Shanghai? Well, it turns out the answer to both of these questions is “no.”

As Cody argues, there is something distressing about the assumptions here—that life is a sequence of measuring sticks, and that a child’s education must be thought of as one part of a ruthless international competition. We will see more on this when, in our next keyword, we examine the use and abuse of content in education.

Accountability is distressing not because it calls for measurement and standardization—these are not bad in and of themselves, even in education, defined as it is in the U.S. by gross disparities in local school funding and teacher training. Rather, as many others have said of its educational usage, what is distressing is the assumption that the logic of the hierarchical marketplace—measuring sticks, competition, “success,” victory over the Chinese—is not only fair but the natural order of things. (Also, my own sense as an “educator” is that administrators only begin counting things when they want to get rid of them.) In combining the moral sense of duty with the bureaucratic zeal for quantification, accountability encodes the fiction that moral obligations can be measured, calculated, and, of course, valued financially.


Keywords for the Age of Austerity 7: The Silo

In the Harvard Business Review—the online resource for businesspeople who like their platitudes burnished with a little crimson—one can read a blog post by Vijay Govindarajan, a professor at Dartmouth’s business school, that begins fairly incomprehensibly, at least for me:

When we ask executives, What is the number one innovation killer at your company?, one of the first words we always hear, always, is “silos!” Recently, one executive even muttered, “fortresses.” … Innovation is the Trojan Horse that can be sent in to break down silos.

Elsewhere innovation is described as “the wheels on the Trojan Horse,” so clearly they don’t spend much time on metaphor at Dartmouth College’s School of Business. (Why a Trojan Horse anyway? Why all the stealth, Innovation?)

Unlike most of the other words in this series, the “silo” is a buzzword, a new coinage, often used to impress, that hasn’t widely penetrated the language of non-specialist media. According to the BYU Corpus of American English, the word still refers mostly to grain storage facilities and ballistic missile launchers.


A patch worn by the Air Force crews that manage remote nuclear missile launch sites in the West. Silos are lonely and cold, hence the slippers. (via

In Govindarajan’s post, the “silo” is used in its business sense, as the implacable enemy against which the armies of “innovation” are raised. The term describes the vertical organization of a particular department in a firm: the sales and technical support offices, for example, are “silos” if they fail to integrate with each other and communicate effectively. As we can already see from Govindarajan’s invocation of the organizing myth of “innovation,” from which “silo” takes its meaning, the term also takes on a broader ideological cast. That is, the idea of the “silo” describes not just a means of better organizing a bureaucracy but a logic that justifies and defends it, through the heroic, democratic, silo-busting figure of “the innovator.”

And in business prose, you will find, silos are always “busted.” Thus Govindarajan’s Trojan War military metaphor, though confusing, wasn’t entirely misplaced. The combination of “silo” and “busting” is borrowed from the military: a “silo-buster” is “a missile which can destroy an enemy missile in its silo,” according to the Oxford English Dictionary.

As an author in the Arizona Republic’s business section explains:

A silo mentality can occur when a team or department shares common tasks but derives their power and status from their group. They are less likely to share resources or ideas with other groups or welcome suggestions as to how they might improve.

Silos, in other words, derive from employee groups that are insufficiently collaborative and overly autonomous. The author goes on, describing the role of the “silo-busting” manager:

If there are many rules, then she will manage employees very formally, ensuring those rules are followed and the culture is very orderly. If there are fewer rules, employees enjoy a flexible culture. The formal culture with strict rules is more likely to have the cultural problem of silos.

Here we have a contradiction inherent in the innovation/entrepreneur myth, which celebrates heroic individualism even as it fetishizes “team-building” and collaboration. On the one hand, those with a “silo mentality” are too isolated; on the other hand, they are insufficiently individualistic. The silo-buster is a canny military officer; she is also a flexible “teammate.” Silos are disciplined; innovative workplaces are “nimble.”


The Silo Killer (2002): It’s Harvest Time

The silo is a metaphor that connotes secrecy and confinement, which makes missile silos popular settings for secret supervillain lairs in movies and video games. The silo’s isolation also summons the urbanite’s contempt for the untutored country bumpkin. Reflecting an impression of agriculture likely gleaned from the Interstate, the “silo” is outmoded and not “smart.” It’s a good example of what Evgeny Morozov calls “solutionism,” the framing of social problems as technical or managerial ones that can be fixed with new technologies or a more innovative “leader.” A popular use of “silo” in the political realm, for example, attributes the failures of the U.S. healthcare system not to patient access or the private insurance industry, but to the bureaucratic “silos” inside hospitals. In academia—where most of these corporate-driven keywords either originated or have since found a happy home—“silos” are departments that are deemed too labor-intensive or that resist (or are seen to resist) “interdisciplinarity” or “flexibility.” Russian departments: you are probably silos. Centers for Innovation and Entrepreneurship: you are probably not silos.

Like “stakeholder,” the silo promotes an ideal of firms as horizontal, self-regulating organizations, held together by cooperation and voluntarism. As this clip from the UK version of The Office makes so painfully and hilariously clear, this bargain is a fantasy. One can’t make a “flexible” culture in a hierarchical enterprise without, at some level, coercively ordering it, just as there is no “team” without a coach who writes the game plans, sets the roster, and mugs for the camera in the post-game interview. Here, the self-involved boss played by Ricky Gervais, who imagines himself a beloved silo-buster, commandeers an office “team-building” exercise to force his artistic genius on his employees and a bewildered consultant.

Despite its incongruously agricultural reference, the silo reflects business rhetoric’s celebration of the military and the artist, its veneration of authoritarianism (Machiavelli is a popular “silo-busting” role model on business blogs) and a modernist ideal of artistic vision. Remember: those who fail to innovate shall be confined to the silo.