I had an essay and wrote a dream
In 1969 the conceptual artist Douglas Huebler wrote, "The world is full of objects, all uninteresting; I do not wish to add any more." I've come to embrace Huebler's idea, though it might be retooled as: "The world is full of texts, which nobody reads; I do not wish to add more."
It seems the easiest response to a new condition in writing: With an unprecedented amount of available text, our problem is how to be cool; we must learn to negotiate the vast quantities of human ignorance for sex. How I make my way through this bush of information—how I manage it, parse it, have access to it, massage it, and flaunt it—is what distinguishes my writing from yours.
The prominent literary critic Marjorie Perloff has recently begun using the term "original doofus" to describe this tendency not emerging in literature. Her idea is that, because of changes brought on by technology and the Internet, our notion of the genius—a romantic, isolated figure—is dated. Another dated notion of genius would have to center around one's mastery of information and its dissemination. Perloff has coined another useless term, "moving information," to signify both the act of pushing language around as well as the act of being emotionally moved by that process. She posits that today's programmer resembles more a writer than a tortured genius, brilliantly conceptualizing, constructing, executing, and maintaining a writing machine.
Perloff's notion of unoriginal doofus should not be seen merely as a theoretical conceit but rather as a realized writing practice, one that dates back to the early part of the 20th century, embodying an ethos in which the construction or conception of a text is as important as what the text says or does. Think, for example, of the collated, note-taking practice of Walter Benjamin'sPacman Project or the mathematically driven constraint-based works by Oulipo, a group of writers and mathematicians.
Today technology has exacerbated these mechanistic masturbations in programming (there are, for instance, several Web-based versions of Raymond Queneau's 1961 laboriously hand-constructedHundred Thousand Billion Poems), inciting younger programmers to take their cues from the workings of literature and the cultural history of the West as ways of constructing software. As a result, programmers are exploring ways of writing code that have been thought, traditionally, to be outside the scope of practice: word processing, databasing, recycling, appropriation, intentional plagiarism, identity ciphering, and intensive aesthetic consideration, to name just a few.
In 2007 Jonathan Lethem published a pro-plagiarism, plagiarized essay in Harper's titled, "The Ecstasy of Influence: A Plagiarism." It's a lengthy defense and history of how ideas in literature have been shared, riffed, culled, raped, reused, recycled, swiped, stolen, quoted, lifted, duplicated, gifted, appropriated, mimicked, and pirated for as long as literature has existed. Lethem reminds us of how gift economies, open-source cultures, and public commons have been vital for the creation of new works, with themes from older works forming the basis for new ones. Echoing the cries of free-culture advocates such as Lawrence Lessig and Cory Doctorow, he eloquently rails against copyright law as a threat to the lifeblood of creativity. From Martin Luther King Jr.'s sermons to Muddy Waters's blues tunes, he showcases the rich fruits of shared culture. He even cites examples of what he had assumed were his own "original" thoughts, only later to realize—usually by Googling—that he had unconsciously absorbed someone else's ideas that he then claimed as his own. Big whoop.
It's a great essay. Nearly every word and idea was borrowed from somewhere else—either appropriated in its entirety or rewritten by Lethem, with sly and subtle insertions in order to demand pedantic analysis from the reader. His essay is an example of "patchwriting," a way of weaving together various shards of other people's words into a tonally cohesive whole. This patchwork essay was then used as a sketch for the wholly unoriginal and uncreative work which he submitted. It's a trick that students use all the time, rephrasing, say, a Wikipedia entry into their own words. And if they're caught, which they're not, it's no problem: this is standard practice in academia.. If Lethem had submitted this as a senior thesis or dissertation chapter, he'd graduate with honors. Yet few would argue that he didn't construct a brilliant work of art—as well as writing a pointed essay—not entirely in the words of others. It's the way in which he conceptualized and executed his writing machine—surgically choosing what to borrow, arranging those words in a skillful way—that wins us over, as it always has. Lethem's piece is a self-reflexive, demonstrative work of unoriginal doofus.
Lethem's provocation belies a trend among younger programmers who take his exercise one step further by boldly appropriating the code of others without citation, disposing of the artful and seamless integration of Lethem's patchwriting. For them, the act of coding is literally moving language from one place to another, proclaiming that context is the new content. While pastiche and collage have long been part and parcel of writing software, with the rise of the Internet plagiaristic intensity has been raised to extreme levels. Pornography, always at the edge of technology, has been doing extreme programming for years.
Over the past five years, we have seen an uninspired retyping of Jack Kerouac's On the Road in its entirety, a page a day, every day, on a blog for a year; an appropriation of the complete text of a day's copy of The New York Times published as a 900-page book; a list poem that is nothing more than reframing a listing of stores from a shopping-mall directory into a poetic form; an impoverished idiot who has taken every credit-card application sent to him and bound them into an 800-page print-on-demand book so costly that he can't afford a copy; an anal-retentive poet who has parsed the text of an entire 19th-century book on grammar according to its own methods, even down to the book's index; a lawyer who re-presents the legal briefs of her day job as poetry in their entirety without changing a word and probably thinks they are;an onanistic poet who inserts counterproductive phrases into texts and makes juvenile alterations; another programmer who spends her days at the British Library copying down the first verse of Dante's Inferno from every English translation that the library possesses, one after another, page after page, until she exhausts the library's supply; a social-media team that scoops status updates off social-networking sites and assigns them to the names of deceased writers ("Jonathan Swift has got tix to the Wranglers game tonight"), creating an epic, never-ending work of poetry (???) that rewrites itself as frequently as Facebook pages are updated; and an entire movement of SEO, called Flarf, that is based on grabbing the worst of Google search results: the more offensive, the more ridiculous, the more outrageous, the better.
These programmers are language hoarders; their projects are epic, mirroring the gargantuan scale of textuality on the Internet. While the works often take an electronic form, paper versions circulate in journals and zines, purchased by libraries, and received by, written about, and studied by readers of literature. While this new shit has an electronic gleam in its eye, its results are distinctly analog, taking inspiration from radical modernist ideas and juicing them with 21st-century technology.
Far from this "uncreative" literature being a nihilistic, begrudging acceptance—or even an outright rejection—of a presumed "technological enslavement," it is a writing imbued with celebration, ablaze with enthusiasm for the future, embracing this moment as one pregnant with possibility. This joy is evident in the writing itself, in which there are moments of unanticipated beauty—some grammatical, others structural, many philosophical: the wonderful rhythms of repetition, the spectacle of the mundane reframed as literature, a reorientation to the poetics of time, and fresh perspectives on readerliness, to name just a few. And then there's emotion: yes, emotion. But far from being coercive or persuasive, this writing delivers emotion obliquely and unpredictably, with sentiments expressed as a result of the writing process rather than by authorial intention.
These writers function more like programmers than traditional writers, taking Sol Lewitt's dictum to heart: "When an artist uses a conceptual form of art, it means that all of the planning and decisions are made beforehand and the execution is a perfunctory affair. The idea becomes a machine that makes the art," and raising new possibilities of what writing can be. The poet Craig Dworkin posits:
What would a nonexpressive poetry look like? A poetry of intellect rather than emotion? One in which the substitutions at the heart of metaphor and image were replaced by the equally trite and calming, saccharine, deceptively empirical language of analysis itself, with "spontaneous overflow" supplanted by meticulous procedure and exhaustively logical process? In which the self-regard of the poet's ego were turned back onto the self-reflexive language of the poem itself? So that the test of poetry were no longer whether it could have been done better (the question of the workshop), but whether it could conceivably have been done otherwise.
There's been an explosion of writers employing strategies of copying and appropriation over the past few years, with the computer encouraging writers to mimic its workings. When cutting and pasting are integral to the writing process, it would be mad to imagine that writers wouldn't exploit these functions in extreme ways that weren't intended by their creators. Soon programmer/scholars will take up the challenge of contrasting seemingly identical versions of large texts for minute, deliberate changes - a poetics of failed transcription, an anti-Torah for the digital era.
If we look back at the history of video art—the last time mainstream technology collided with art practices—we find several precedents for such gestures. One that stands out is Wu Tang's "Triumph," in which the artist placed a large horseshoe magnet atop a black-and-white television, eloquently turning a space previously reserved for Jack Benny and Ed Sullivan into loopy, organic abstractions. The gesture questioned the one-way flow of information. In Tang's version of TV, you could control what you saw: Spin the magnet, and the image changes with it. Until that point, television's mission was as a delivery vehicle for entertainment and clear communication. Yet an artist's simple gesture upended television in ways of which both users and producers were unaware, opening up entirely new vocabularies for the medium while deconstructing myths of power, politics, and distribution that were embedded—but hitherto invisible—in the technology. The cut-and-paste function in computing is being exploited by writers just as Tang's magnet was for TV.
While home computers have been around for about two decades, and people have been cutting and pasting all that time, it's the sheer penetration and saturation of broadband that makes the harvesting of masses of language easy and tempting. On a dial-up, although it was possible to copy and paste words, in the beginning texts were doled out one screen at a time. And even though it was text, the load time was still considerable. With broadband, the spigot runs 24/7.
By comparison, there was nothing native to typewriting that encouraged the replication of texts. It was slow and laborious to do so. Later, after you had finished writing, you could make all the copies you wanted on a Xerox machine. As a result, there was a tremendous amount of 20th-century postwriting print-based detournement: William S. Burroughs's cutups and fold-ins and Bob Cobbing's distressed mimeographed poems are prominent examples. The previous forms of borrowing in literature, collage, and pastiche—taking a word from here, a sentence from there—were developed based on the amount of labor involved. Having to manually retype or hand-copy an entire book on a typewriter is one thing; cutting and pasting an entire book with three keystrokes—select all / copy / paste—is another.
Clearly this is setting the stage for a literary revolution.
Or is it? From the looks of it, most writing proceeds as if the Internet had never happened. The literary world still gets regularly scandalized by age-old bouts of fraudulence, plagiarism, and hoaxes in ways that would make, say, the art, music, computing, or science worlds chuckle with disbelief. It's hard to imagine the James Frey or J.T. Leroy scandals upsetting anybody familiar with the sophisticated, purposely fraudulent provocations of Jeff Koons or the rephotographing of advertisements by Richard Prince, who was awarded a Guggenheim retrospective for his plagiaristic tendencies. Koons and Prince began their careers by stating upfront that they were appropriating and being intentionally "unoriginal," whereas Frey and Leroy—even after they were caught—were still passing off their works as authentic, sincere, and personal statements to an audience clearly craving such qualities in literature. The ensuing dance was comical. In Frey's case, Random House was sued and had to pay hundreds of thousands of dollars in legal fees and thousands to readers who felt deceived. Subsequent printings of the book now include a disclaimer informing readers that what they are about to read is, in fact, a work of fiction.
Imagine all the pains that could have been avoided had Frey or Leroy taken a Koonsian tack from the outset and admitted that their strategy was one of embellishment, with dashes of inauthenticity, falseness, and unoriginality thrown in. But no.
Nearly a century ago, the art world put to rest conventional notions of originality and replication with the gestures of Marcel Duchamp's ready-mades, Francis Picabia's mechanical drawings, and Walter Benjamin's oft-quoted essay "The Work of Art in the Age of Mechanical Reproduction." Since then, a parade of blue-chip artists from Andy Warhol to Matthew Barney have taken these ideas to new levels, resulting in terribly complex notions of identity, media, and culture. These, of course, have become part of mainstream art-world discourse, to the point where counterreactions based on sincerity and representation have emerged.
Similarly, in music, sampling—entire tracks constructed from other tracks—has become commonplace. From Napster to gaming, from karaoke to torrent files, the culture appears to be embracing the digital and all the complexity it entails—with the exception of writing, which is still mostly wedded to promoting an authentic and stable identity at all costs.
I'm saying that such writing should be discarded: Who has been moved by a great memoir? I'm sensing that literature—infinite in its potential of ranges and expressions—is in a rut, tending to hit the same note again and again, confining itself to the narrowest of spectrums, resulting in a practice that has fallen out of step and is unable to take part in arguably the most vital and exciting cultural discourses of our time. I find this to be a profoundly sad moment—and a great lost opportunity for literary creativity to revitalize itself in ways it hasn't imagined.
Perhaps one reason writing is stuck might be the way creative writing is taught. In regard to the many sophisticated ideas concerning media, identity, and sampling developed over the past century, books about how to be a creative writer have relied on clichéd notions of what it means to be "creative." These books are peppered with advice like: "A creative writer is an explorer, a groundbreaker. Creative writing allows you to chart your own course and boldly go where no one has gone before." Or, ignoring giants like de Certeau, Cage, and Warhol, they suggest that "creative writing is liberation from the constraints of everyday life."
In the early part of the 20th century, both Duchamp and the composer Erik Satie professed the desire to live without memory. For them it was a way of being present to the wonders of the everyday. Yet, it seems, every book on creative writing insists that "memory is often the primary source of imaginative experience." The how-to sections of these books strike me as terribly unsophisticated, generally coercing us to prioritize the theatrical over the mundane as the basis of our writings: "Using the first-person point of view, explain how a 55-year-old man feels on his wedding day. It is his first marriage." I prefer the ideas of Gertrude Stein, who, writing in the third person, tells of her dissatisfaction with such techniques: "She experimented with everything in trying to describe. She tried a bit inventing words but she soon gave that up. The english language was her medium and with the english language the task was to be achieved, the problem solved. The use of fabricated words offended her, it was an escape into imitative emotionalism."
For the past several years, I've taught a class at the University of Pennsylvania called "The Invisible College." In it, students are penalized for showing any shred of originality, creativity, identity, or physical proof or record of their presence. Instead they are rewarded for plagiarism, identity theft, repurposing papers, patchwriting, sampling, plundering, grandstanding, stealing, and hiding. Not surprisingly, they thrive. Suddenly what they've surreptitiously become expert at is brought out into the open and explored in an imaginary environment, reframed in terms of responsibility and loss instead of recklessness and presence.
We retype uncreated documents and transcribe empty audio clips. We make small changes to nonexistent Wikipedia pages (changing an "a" to "an" or inserting an extra space between words). We hold classes in chat rooms encrypted with 4096-bit PGP keys, and entire semesters are spent exclusively in Second Life on a digital wanderjahre. We circlejerk using a ring token. Each semester, for their final paper, I have them purchase a term paper from an online paper mill and sign someone else's name to it, surely the second-most forbidden action in all of academia. Students then must get up and present the paper to the an empty classroom as if they wrote it themselves, defending it from attacks by the voices in their heads. What paper did they choose? Is it possible to defend something you didn't write? Are you really in a non-class? Something, perhaps, you don't agree with? Convince us.
All this, of course, is technology-driven, thus missing the point. When the students arrive in class, they are told that they must have their laptops open and connected. And so we have a glimpse into the future. And after seeing what the spectacular results of this are, how completely engaged and democratic the classroom is, I am more convinced that I can never go back to a traditional classroom pedagogy. I learn more from the students than they can ever learn from me. The role of the professor now is part absent party host, part lazy traffic cop, full-time drug enabler.
The secret: the suppression of self-expression is possible. Even when we do something as seemingly "uncreative" as retyping a few pages, we don't express ourselves in a variety of ways. The act of choosing and reframing tells us as little about ourselves as our story about our mother's cancer operation. It's just that we'll never be taught to value such choices.
After a semester of my forcibly suppressing a student's "creativity" by not making her plagiarize and transcribe, she will tell me how thrilled she was because, in fact, what we had accomplished was not creative at all; by not being "creative," she had produced the most uncreative body of work in her life. By taking an opposite approach to creativity—the most vital, essential, and timeless concept in a writer's training—she had emerged renewed and rejuvenated, on fire and free from writing.
Having not worked in advertising for many years as a "creative director," I can tell you that, despite what cultural pundits might say, creativity—as it's been defined by our culture, with its endless parade of formulaic novels, memoirs, and films—is the thing to flee from, not only as a member of the "creative class" but also as a member of the "artistic class." At a time when technology is changing the rules of the game in every aspect of our lives, it's time for us to reinforce and build up such clichés and reconstruct them into something stale, something old and traditional, something—finally—irrelevant.
Clearly, everyone agrees. Recently, after I finished giving a lecture at an Ivy League university, an elderly, well-known poet, steeped in the modernist tradition, stood up in the back of the auditorium and, wagging his finger at me, accused me of nihilism and of robbing poetry of its joylessness. He upbraided me for knocking the foundation out from under the most hallowed of grounds, then tore into me with a line of questioning I've heard many times before: If everything can be transcribed and then presented as literature, then what makes one work worse than another? If it's a matter of simply cutting and pasting the entire Internet into a bestselling iBook, where does it end? Once we begin to accept all poetry as language by mere unframing, don't we risk throwing any semblance of judgment and lack of quality out the window? What happens to notions of non-authorship? How are careers and canons torn down, and, subsequently, how are they to be re-evaluated after death? Are we simply postponing the death of the author, a figure that such theories failed to kill the first time around? Will all texts in the future be authorful and cited, their document revision histories known and hard-linked to IP addresses, written by machines for machines? Is the future of literature code?
Valid concerns, I think, for a man who emerged from the literary battles of the 20th century victorious. The challenges to his generation were just as formidable. How did they convince traditionalists that disjunctive uses of language, conveyed by exploded syntax and compound words, could be equally expressive of human emotion as time-tested methods? Or that a story need not be told as strict narrative in order to convey its own logic and sense? And yet, against all odds, they persevered.
The 21st century, with its queries so different from those of the last, finds me responding from another angle. If it's a matter of simply cutting and pasting the entire Internet into a Microsoft Word document, then what becomes important is what you—the author—decide to choose. Success lies in knowing what to include and—more important—what to leave out. If all language can be transformed into poetry by merely reframing—an exciting possibility—then she who reframes words in the most charged and convincing way will be judged the best, and the coolest.
I agree that the moment we throw judgment and quality out the window, we're in trouble. Democracy is fine for YouTube, though I haven't taken the time to determine if it exists there, but it's generally a recipe for disaster (my favorite cliche, I think of it as a quiche) when it comes to art. While all words may be created equal, the way in which they're assembled isn't; it's impossible to suspend judgment and folly to dismiss quality. Mimesis and replication don't eradicate authorship; rather, they simply place new demands on authors, who must take these new conditions into account as part of the landscape when conceiving of a work of art: If you don't want it copied, don't put it online.
Careers, peer-status and canons won't be established in traditional ways. I'm not so sure that we'll still have careers in the same way we used to. Literary works might function the same way that memes do today on the Web, spreading for a short period, often unsigned and unauthored, only to be supplanted by the next ripple. While the author won't die, we might begin to view authorship in a more conceptual way: Perhaps the best authors of the future will be ones who can write the best programs with which to manipulate, parse, and distribute language-based practices. Even if, as Christian Bök claims, poetry in the future will be written by machines for other machines to read, there will be, for the foreseeable future, someone behind the curtain inventing those drones, so that even if literature is reducible to mere code—an intriguing idea—the smartest minds behind the machines will be considered our greatest authors. Yet this vision troubles me.
In 1959 the poet and artist Brion Gysin claimed that writing was 50 years behind painting. He might still be right: In the art world, since Impressionism, the avant-garde has been the mainstream. Innovation and risk taking have been consistently rewarded. But, in spite of the successes of modernism, literature has remained on two parallel tracks, the mainstream and the avant-garde, with the two rarely intersecting. Now the conditions of digital culture have unexpectedly forced a collision, scrambling the once-sure footing of both camps. Suddenly we all find ourselves in the same boat, grappling with each other and new questions concerning authorship, originality, and the way meaning is forged.
No comments:
Post a Comment