After contemplating a proposal to ban its editors from blogging, the International Studies Association promotes an initiative to explore how online media can actually benefit scholars in the field.
Anthropologists discuss student debt and other concerns about the "commodification" of higher education, and debate the role of faculty members in reform efforts.
Three people in the United States have contracted the Middle East Respiratory Syndrome coronavirus, so far -- two while traveling abroad, the third through contact with one of them. Another 600 or so cases have been diagnosed elsewhere in the world since MERS first appeared in early fall of 2012, according to the World Health Organization.
Or rather, that many cases are now confirmed. It could well be that more people have had MERS (wherever in the world they may be) and endured it as if it were a terrible flu; it’s also possible to be exposed to it and develop antibodies without showing any of the symptoms. With a new disease, solid information tends to spread more slowly than the vectors carrying it. Some of the online news coverage calls the disease “highly contagious.” But that doesn’t really count as solid information: while MERS has proven fatal about a third of the time, it seems not to be readily transmissible in public settings.
No travel advisory has been issued, nor are special precautions being recommended to the general public, though health care workers are vulnerable. The Centers for Disease Control and Prevention suggests washing your hands regularly and keeping them away from eyes, nose, and mouth as much as possible -- hygiene recommendations of the most generic sort.
But the fearsome label “highly contagious” became almost inevitable when MERS was branded with a name so close to that of Severe Acute Respiratory Syndrome. For SARS was highly contagious; that’s what made it so terrifying. I use the past tense because no new cases have been reported in 10 years. The rapid spread of SARS was halted, and in its wake international efforts to monitor and exchange information about emerging diseases have improved.
MERS ≠ SARS. Even so, its very name calls up the specter of a quick-moving, lethal, and global pandemic. And those connotations insinuate themselves into discourse on the new disease -- as if to ready us for panic.
Well, don’t. That would be premature. (Try not to lick doorknobs or French-kiss anyone with a wracking cough, and you’ll probably be just fine.) The start of the 21st century may well be what CDC director Thomas Frieden has called the “perfect storm of vulnerability”: unknown new diseases can continent-hop by airplane and test their strength against antibiotics that have become ever less effective, thanks to overuse. But humans can think while viruses cannot, and it seems at least possible that this could prove the decisive advantage.
Consider a new book from Southern Illinois University Press called Rhetoric of a Global Epidemic: Transcultural Communication about SARS by Huiling Ding, an assistant professor of professional and technical communication at North Carolina State University. It is a work of considerable factual and conceptual density, but I suspect it will play a role in how information about disease outbreaks is organized and delivered in the future.
Ding has not set out to write the history of SARS, but she does reconstruct and scrutinize how bureaucracies and mass media, both east and west, communicated among themselves and with their publics as the disease emerged in China in November 2002 and began spreading to other countries in the new year. Her analytical tool kit includes elements of classical (even Aristotelian) rhetoric as well as a taxonomy of kinds of cultural flow based on Arjun Appadurai’s anthropology of globalization.
The author prefers to identify her approach as “critical contextualized methodology,” but for the purpose of making introductions we might do better to dwell on a single guiding distinction. Ding is wary of a number of assumptions built into the term “intercultural communication,” which implies two or more distinct cultures, standing at a certain distance from one another, exchanging messages. When things are so configured, “culture” will sooner or later turn out to mean, or to imply, “nation” -- whereupon “state” is sure to follow.
By contrast, “transcultural communication” drags no such metonymic chain behind it. It has a venerable history, with roots in Latin American cultural studies. “Transculturation,” writes Ding, “can be used to describe a wide range of global phenomena, including exile, immigration, multicultural contact, ethnic conflicts, interracial marriages, overseas sojourns, and transnational tourism.” A transcultural perspective focuses on layers and processes that constitute different societies without being specific to any one of them, and that can themselves be in flux.
So, to choose a SARS-related example, referring to "Chinese mass media” will, for most Americans, evoke a relatively simple-seeming concept -- one that involves messages in a single language, circulated through certain well-established forms of transmission (newspapers, radio, television) among a population of citizens living within the borders of a nation-state (presumably the PRC). I dare say “American mass media” has analogous implications for people in China, or wherever.
But whatever sense that outlook once might have made, it now distorts far more than it clarifies. The range and the audience of mass media are in constant flux; the messages they transmit do not respect national borders.
“My research,” Ding said in an email interview, "shows different values and practices of traditional newspapers housed in Beijing and Guangzhou (mainstream and commercial ones) despite the exertion of censorship during the early stage of SARS.” The People’s Daily, official mouthpiece of the Chinese leadership, remained silent on the health crisis until as late as March 2003. But by January 2003, regional newspapers in small cities began reporting on the panic-buying of antiviral drugs and surgical masks -- information that then became known elsewhere in the country, via the Internet, as well as to “overseas Chinese” around the world, well before the crisis was international news.
Ding also discusses the “ad hoc civic infrastructure” that sprang up during the outbreak, such as the website Sosick.org, which engineers in Hong Kong created to circulate information about local SARS cases and encourage voluntary quarantines. "Concerned citizens can learn from coping strategies from other cultures,” she said by email, "be it communities, regions, or countries, and adapt such strategies to cope with local problems. For instance, I am working on another project on quarantine policies and practices during SARS in Singapore, mainland China, Hong Kong, Taiwan, and Canada…. Such bottom-up efforts often carry persuasive power, and in the case of Hong Kong, did help to introduce policy changes.”
Her reference to “persuasive power” is a reminder that Ding’s book belongs to the tradition of rhetorical scholarship. She devotes part of the book to an analysis of enthymemes in official Chinese commentaries on the crisis, for example. (An enthymeme is a deductive argument in which one of the assumptions goes unstated.) That a grassroots quarantine movement on two continents proved more successful and persuasive than state-sanctioned efforts to maintain social order is easy to believe.
What we need, Ding told me, are analyses of the "communication practices of global and/or flexible citizens, or multi-passport holders who regularly travel across continents in search of fame, wealth, or influence. Their familiarity with multiple cultures certainly introduce interesting transcultural communication strategies.” That bottom-up appeals for quarantine proved effective in a number of countries suggests she could be right: cultivating new skills in communication and persuasion might well be crucial for dealing with other public health crises, down the line.
Last week, Nicholas Kristof revived the old canard that academics have removed themselves from the public sphere through obscure prose and interests. Among the problems we might identify in Kristof’s essay -- there are, obviously, many -- is the irony of a writer with the resources of The New York Times supporting him chiding the rest of us for not writing in outlets such as The New York Times.
But who gets to write in The New York Times -- and to whom is The New York Times accessible? If we’re talking about accessibility and insularity, it’s worth looking at The New York Times’s own content generation cycle and the relationship between press junkets and patronage.
So, instead of confusing intellectual meritocracy with access to outlets, let’s look at how The New York Times itself generates content about something that matters greatly to professors: higher education.
What I learned at the Times’s Schools for Tomorrow conference -- besides how weird corporate-sponsored conferences are, right down to the commercials looped on screens between talks -- is that there is a system of content generation that feeds thinkpieces and thinkfluencers with greater speed and sound-bite concision than most professors can offer.
It’s important to note that the only professors on stage at this conference on the future of higher education had left faculty teaching and research for upper administration or to launch their own MOOC companies. While Kristof might see this absence as more evidence of academic self-cloistering, I see it for the closed system that it is: “influence” comes mainly from those who might be in a position to take out full-page ads in the Times.
I saw the Schools for Tomorrow conference advertised in the Times’ Sunday Magazine, and looked into registration online. It cost $795 for a one-day event. For reference, I just registered for a four-day conference in my humanities field for $150.
I wrote to the Schools for Tomorrow registration office and asked if they could lower the cost for actual professors, bringing it in line with typical registration fees between $75 and $200. They said they could bring it down to $495. I found some institutional money for online teaching development and paid the “reduced” fee.
The $495 did not, however, guarantee me a seat when I got to the conference. A fee in the mid-three figures is a lot for humanities faculty and their limited (if existent) travel support, but in this world it just got me through the first door. It turned out that the plenary talk by Sal Khan (of Khan Academy) on globalizing access to education was overbooked, so while corporate sponsors like Bank of America and Blackberry enjoyed reserved seats in the auditorium, a lot of self- or university-sponsored folks like me ended up watching on screens in the basement.
This was the first lesson in sponsored access to influence and content creation. Since Khan’s talk led into the panel discussion “Has the University as an Institution Had Its Day?” a lot of professors sat out the Q&A in the cheap seats. Not even that, really: we were in a different arena altogether.
The second lesson came from the presentations themselves. These weren’t conversations; these weren’t arguments. Mainly, they were rehearsed pitches for products, policies, and industries in which the presenters had a considerable financial or political stake. Some featured speakers, like former Senator Bob Kerrey, had a foot in several categories: he had served in the Senate, he had been president of the New School, and he is now starting a for-profit university.
At various points it became clear that the speakers were used to talking to one another “on the circuit” as one said to another, suggesting that they’d been on the online education junket a lot together that year. And some cycle back through the Times meetings. Having missed Sal Khan at the education conference, I could have caught him the next month at the DealBook business conference.
The third lesson of the conference, however, came when I picked up my New York Times at home. The November 1 "Education Life" section titled “The Disrupters” is almost entirely drawn from or inspired by the conference. One conference reviewer quipped that “so many Times newsroom staff members are participating in the conference, they might not be able to put out the paper on Wednesday.”
To the contrary, such events seem to be built into the paper’s content generation strategy. The lead article of “The Disrupters,” “Innovation Imperative: Change Everything, Online Education as an Agent of Transformation,” was written by Michael Horn and Clayton M. Christensen, both of the non-university-affiliated Clayton Christensen Institute for Disruptive Innovation. Christensen is a business professor at Harvard; Horn was a panelist at the conference, where he did a 25-minute one-on-one with David Leonhardt, the Times’s Washington bureau chief, advocating “The Disruption of Higher Education.”
According to his conference bio, Horn studied for an M.B.A. at Harvard (presumably with his co-author Christensen), then gained a platform as an educational innovation consultant at Arizona State, the editorship of a “journal of opinion and research about education policy,” and invitations to testify on issues relating to education. He does so not from a university but from an institute that operates in the world between academia and lobbying. He does not balance his time among teaching, service, and peer-reviewed research and publishing. Yet by the Times’s invitation he is a recognized authority on higher education, and his work is immediately funneled into and amplified by featured space in the Sunday Times.
Even if the Times itself might be forgiven for seeking out breathless think tankers over professors who lack their own Center for Thinkfluencer Excellence, we might be more critical of the blurry line between content and advertisement.
Elsewhere in the issue you’ll find Bob Kerrey’s Minerva University, a for-profit liberal arts venture, featured prominently. It is mentioned in the Horn article and is the focus of a piece on “affordable elitism.” And then there’s the major conference sponsor, Capella University. Its “credit for competencies, not credit hours” model is the subject of another article. It was also a major topic of conversation at the conference, discussed at length by Capella’s president, Scott Kinney. Days before the conference, every registrant received an email promoting Capella and bearing its logo.
How much money did Capella pay for this multiplatform marketing strategy? And where did its marketing end and the ideas at the conference begin? Its promotions were in the email of all registrants. Its logo was all over the conference and in full color on the back page of the Times Sunday Magazine. Policy changes crucial to its success were discussed favorably at a conference with Education Secretary Arne Duncan in attendance, and it got an article focused on it in the Times just below an editorial praising its sort of educational “disruption.”
When Kristof's article began raising questions about professors’ ideas and public influence, I was reminded of the way influence moved from the $795/$495 per person corporate-sponsored conference to the pages of the newspaper of record.
Professors, we need you! Who, then, is the “we”? As lots of people have pointed out, if the “we” is the American public, then you’ve already got us as teachers, popular and specialist writers, activists and more.
If the “we” is pageview ad-metric revenue-hungry online content providers and writers, then that’s another question. Do you really want us? And if we come to you, how much will it cost to get in?
Jonathan Senchyne is an assistant professor of library and information studies at the University of Wisconsin-Madison. A version of this essay originally appeared on Avidly.
As epiphanies go, it was hallucinatory and a little disconcerting… I had been reading about human evolution for a couple of weeks, off and on, trying to wrap my mind around the sheer span of the time involved -- the hundreds of thousands of generations, running back (the current estimate goes) some four million years.
Arguably the story begins a million or two years earlier still, when some kind of proto-hominid emerged from the line that led to the chimpanzees and the bonobos. Humans share more than 99 percent of our DNA with them. We’ve done a lot with the upright posture and those opposable thumbs. The past two million years -- the period between Australopithecus and Homo sapiens digitalis -- looks positively frenzied by contrast with the usual pace of evolution. And yet we are still distant cousins of the chimps, despite our gift for exalting humankind as existing above nature, or outside it.
One day, while reading these facts and thinking these thoughts, I looked up to see that a very strange thing had happened to everyone around me. They were, all of them, tangibly and unmistakably primates. (Or rather, we were, since my own hand suddenly looked like a well-articulated variety of paw.)
It is one thing to understand evolution at a conceptual level -- a fairly difficult thing in itself. Experiencing the continuity between human beings and other species is something else altogether: something like a waking dream. Perhaps especially when seated in a bakery frequented by lawyers, lobbyists, and media people from nearby offices -- wearing clothes and mostly fur-less, but still recognizable as mammals distantly akin to monkeys or apes, despite obvious differences in carriage and demeanor.
They (we, rather) were eating scones for breakfast, not chunks of raw antelope. But for a few dizzying moments there, this did not seem as large a difference as it ordinarily might.
Fair warning, then: Reading Travis Rayne Pickering’s Rough and Tumble: Aggression, Hunting, and Human Evolution (University of California Press) may well leave the bright line between nature and culture looking thinner and blurrier than usual. (Pickering is a professor of anthropology at the University of Wisconsin-Madison.) I should also warn vegans against reading the book, unless they are in a particularly argumentative mood.
More on that in due course. First, a look at the perennial dispute that Pickering has joined about the source of mankind’s history of violence. One familiar bull-session or editorial-page option is to understand the penchant for violence as an intrinsic and inescapable human disposition, something for which we are genetically programmed, even. In support of this idea, one can cite Jane Goodall’s discovery about the chimps she observed in the wild. While sociable amongst themselves, members of one band were capable not just of killing outsiders but of teaming up to wipe out the young males in another group.
And remember, we share 99 percent of our DNA with the chimps. Case closed! Well, perhaps not, since we have the same margin of genetic overlap with the bonobos. “In general,” Pickering writes, “bonobos engage in sexual contact frequently and casually, in many cases to seemingly allay what could otherwise turn into aggressive interaction.” Besides making love, not war, bonobos in the wild “hunt less frequently than do wild chimpanzees.”
Bunch of hippies. Anyway, the old Hobbes vs. Rousseau dichotomy cannot be decided by consulting the genome -- and the fossil record suggests a more blended perspective on what human beings are and how we got here.
Pickering moves through the evidence and hypotheses about our prehistory with an eye on the disputes they have inspired among paleoarcheologists. Some of them sound quite nasty. (The disputes, that is, though a couple of the researchers also come across as petty and vicious.) I’ll sketch the author’s own conclusions here briefly, but what makes the book especially interesting is its tour of the disciplinary battlefield.
The brains and teeth of our distant ancestors give reason to think they were hunters: regular, successful hunters, at that. The brain consumes a lot of energy. Once proto-humans left the jungles to roam the savanna, their brains grew considerably and at a relatively rapid pace – something a steady diet of meat could only have helped. (And vice versa, since “acquiring meat presumably requires a smarter brain than does picking stationary nuts or grubbing for fixed roots.”)
Besides being available “in the form of large herds of grazing ungulates,” meat “is also soft and does not require that its consumers have a massive dental battery to break it down in the mouth.” Furthermore, the teeth in hominid fossils do not show the kinds and quantity of abrasion found in mammals that normally consume seeds and nuts (also sources of protein) in quantity.
Meat “adheres to bones, comes in large packages, and is stubbornly encased in hairy, elastic hides, so a cutting technology would have been most useful for a blunt-toothed [primate] that had begun to exploit this resource.” Pickering considers the fossil evidence not just of tools and weapons but of animal bones scored with knife-marks left by prehistoric butchers.
But our ancestors’ diet is only part of the story, since the author is less interested in what they ate than in how they developed the capacity to do so regularly. Hunting big game required more than spears and courage. In addition, Pickering stresses the need for emotional control: coolness, grace under pressure, the proper combination of strategy and stealth. He also suggests that the capacity to organize and manage aggression played a role within pre-human society, by obliging members to keep the group’s well-being in mind. An excessively greedy or violent leader might not last for long: the skills required to sneak up on and kill a buffalo could be turned to political uses.
Given how contentious the field seems to be (the title Rough and Tumble clearly refers to paleontologists as much as to prehistoric society) it will be interesting to see how Pickering’s colleagues respond to the book. As a layman, I can only say that it was fascinating and thought-provoking. And that, all things considered, I’d rather be a bonobo.