Information

From My Cold, Dead Hands

When I teach first-year writing, I sometimes use the story of Charlton Heston’s post-Columbine NRA speech in Denver as an example of rhetorical kairos, keyed to its time and place. (What actually happened, as always, is more complex than the story.) The lesson I try to teach: whatever one’s views on guns after Columbine, the time and place of that speech affected or reinforced them. There is such a thing, I suggest, as a rhetorical moment.

Recently, we were in another such moment in the furor over iPhone encryption. John Oliver did a good job of explaining some of the particulars in an 18-minute segment, and it’s worth your time if you haven’t seen it. The furor over encryption, in a US context, was a fight about the intersection of information, technology, and politics, and that intersection is one I’ve lately had increasingly strong thoughts about.

I was dismayed to see James Comey, the Director of the FBI who picked the fight with Apple over encryption, taking what became the government’s public stand. Tim Weiner’s excellent 2012 history of the FBI, Enemies, notes (lest we forget) that Comey is the former Acting United States Attorney General who was in the intensive care hospital room on March 10, 2004, when John Ashcroft refused the request brought by Andrew Card and Alberto Gonzales from President George W. Bush to reauthorize the Stellar Wind government eavesdropping program. Ashcroft said at the time that it didn’t matter, “‘[b]ecause I’m not the attorney general. There is the attorney general.’ And then he pointed at Comey” (Weiner 434). Comey refused as well. Later, in an admirable 2005 address to the NSA, Comey would describe what then-director of the FBI Robert Mueller “had heard [two days later] from Bush and Cheney at the White House”:

“If we don’t do this, people will die.” You can all supply your own this: “If we don’t collect this type of information,” or “If we don’t use this technique,” or “If we don’t extend this authority.” (Weiner 436)

Eleven years later, Comey supplied his own this.

Comey and the FBI were wrong to demand decryption. Code is speech. Forcing someone to speak is a violation of the First Amendment. Osip Mandelstam was commanded to write a poem in praise of Stalin, refused, and died in a cold prison camp near Vladivostok after smuggling out a letter to his wife asking for warm clothes. Apple’s 3/15 response to the FBI rightly invoked the specter of compelled speech when it pointed out that “the state could force an artist to paint a poster, a singer to perform a song, or an author to write a book, so long as its purpose was to achieve some permissible end, whether increasing military enrollment or promoting public health.” So-called “back doors” that would enable a regime of eavesdropping and informants like Stalin’s endanger us all. And the FBI’s expressed position is hostile to liberty and anti-Constitutional.

Consider the similarly Stalinist inverse of compelled speech: Read more

Metadata and the Research Project

In a widely reported quotation, former director of the NSA and CIA General Michael Hayden said in May 2014 that “We kill people based on metadata.” Metadata is increasingly valuable today, and it would seem to carry not one but multiple forms of value, some of them payable in blood.

Information scientist Jeffrey Pomerantz, in his book Metadata (Cambridge, MA: The MIT Press, 2015), argues that until recently the term “metadata” typically referred to “[d]ata that was created deliberately; data exhaust, on the contrary, is produced incidentally as a result of doing other things” (126, emphasis mine). That’s an interesting term, “data exhaust,” as perhaps an analogue to the pollution associated with the economic production and consumption of the industrial age. And of course corporations and governments are finding new things to do with this so-called data exhaust: to kill people, for example, or to chart the social networks of potential insurgents like Paul Revere, as Kieran Healy charmingly demonstrates, or even to advertise Target products to covertly pregnant teenagers until their parents find out, as a widely circulated anecdote had it. It’s got cash value and click-through value, and my Digital Technology and Culture (DTC) students last semester put together some really terrific projects examining the use of cookies, Web advertising, and geolocation for ubiquitous monitoring and monetizing.
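Healy’s point can be made concrete in a few lines of code. The sketch below is a toy illustration of his general method, not his actual data or script: the names and group memberships are invented here, and the “centrality” measure is a deliberately crude stand-in for the network statistics he uses. What it shows is how pure membership metadata, with no record of anything anyone said or wrote, already suffices to single out a well-connected figure.

```python
# Toy sketch of metadata-based network analysis in the spirit of
# Kieran Healy's Paul Revere piece. All names and memberships below
# are invented for illustration.

# Person-to-group memberships: pure metadata (who belongs where),
# with no content of any communication recorded.
memberships = {
    "revere":  {"StAndrews", "NorthCaucus", "LondonEnemies", "TeaParty"},
    "warren":  {"StAndrews", "NorthCaucus", "LondonEnemies"},
    "adams":   {"NorthCaucus", "LondonEnemies"},
    "church":  {"StAndrews", "TeaParty"},
    "hancock": {"LondonEnemies"},
}

def shared_groups(a, b):
    """Co-membership count: how many groups persons a and b share.
    This is one cell of the person-by-person projection of the
    person-by-group membership matrix."""
    return len(memberships[a] & memberships[b])

def centrality(person):
    """A crude centrality score: total co-memberships with everyone else."""
    return sum(shared_groups(person, other)
               for other in memberships if other != person)

# The person co-present in the most groups falls out of the metadata alone.
most_central = max(memberships, key=centrality)
print(most_central)  # → revere
```

Nothing in the input records a single conversation; the structure of the memberships is enough to nominate a person of interest, which is precisely why metadata is so valuable (and so dangerous).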

But that idea of useful information as by-product keeps coming back to me: I wonder if someone has ever tried to copyright the spreading informational ripples they leave in their wakes as they travel through their digital lives, since those ripples would seem to be information in fixed form (they’re recorded and tracked, certainly) created by individual human activity, if not intention. There’s a whole apparatus there that we interact with: as Pomerantz notes, “[i]n the modern era of ubiquitous computing, metadata has become infrastructural, like the electrical grid or the highway system. These pieces of modern infrastructure are indispensable but are also only the tip of the iceberg: when you flick on a light switch, for example, you are the end user of a large set of technologies and policies. Individually, these technologies and policies may be minor, and may seem trivial. . . but in the aggregate, they have far-reaching cultural and economic implications. And it’s the same with metadata” (3). So the research paper has as its infrastructure things like the credit hour and plagiarism policies and the Library of Congress Classification system, which composition instructors certainly address as at once central to the research project and also incidental, because the thing many of us want to focus on is the agent and the intentional action; the student and the research. Read more

Rationale for a Graduate Seminar in Digital Technology and Culture

Proposed syllabi for graduate seminars are due Monday, and while I’ve got the documents themselves together, I also want to be able to better articulate the exigency for this particular seminar I’ve proposed a syllabus for. There’s no guarantee my proposal will fit the Department’s needs better than any other proposal, of course, so this is partly an exercise in hopeful thinking, but it’s also helping me to figure out why I’m interested in investigating certain topics. The course, “Studies in Technology and Culture” (DTC 561 / ENGL 561), examines “key concepts, tools, and possibilities afforded by engaging with technology through a critical cultural lens,” and is one of the two required courses for the interdisciplinary WSU graduate certificate in Digital Humanities and Culture, a certificate designed to “enhance already existing graduate programs in the humanities and the social sciences, . . . [offering] graduate-level coursework in critical methods, textual analysis, composing practices, and hands-on production for engaging with humanistic studies in, as well as about, digital environments.” I see a couple of important points there:

  • first, the certificate’s “critical cultural lens” indicates a reflexive and dialectical (practice- and theory-based) analysis of cultural phenomena as in process and under construction by human and nonhuman agents, and toward the notion of culture as a “noun of process” (from the etymological tracing of Raymond Williams, who points out that the original verb form of “culture” was transitive) representing complex multiple self-developing practices relating to symbolic action; and
  • second, the certificate’s interdisciplinary aspects contribute in rich ways to its digital focus, given its required electives that examine how (AMST 522) the economics of access in the digital divide reinforce inequalities, how (DTC 477) the commodification of information and digital tools can contribute to the stratification of their use, how (DTC 478) interface designs can sometimes reinforce stratification and inequality, how (HIST 527) public history projects incorporating digital technologies can attempt to resist the dominant appropriation or suppression of the heritage of subjugated cultures through practices of responsible representation, and how (HIST 529) ethical digital curation and archiving practices can serve equitable and inclusive ends.

One possible intersection of both points might be understood as the intersection of process and information, which is how I would theme the seminar. Such a theme would represent the familiar cultural studies topoi of race, sexuality, class, gender, ethnicity, age, religion, ability, and others as points of contestation over information. The processes via which information is produced, distributed, owned, used, and re-produced shape and are shaped by those topoi and their intersections with digital technologies. Furthermore, I see tendencies in our emerging studies of digital technology and culture that replicate past trajectories whereby early adopters of technologies (often members of privileged cultural groups) tend to centralize, monopolize, and territorialize research domains—fields that shape processes related to the development of information—especially in an academic context shaped by the eagerness of funding agents to throw money at technology. Given such eagerness, the certificate’s welcome emphasis on “hands-on production” might offer an opportunity to counter that territorializing impulse.

Read more