Wednesday, August 22, 2012

Quote of the Day #2


Lester Bangs - I'm getting the sense DFW draws a lot from him (and that I'm behind the eight ball). Maria Bustillos wrote this article, and she seems like a good one.

"...Bangs could draw the marrow forth even from the metaphysicians. In the essay, “James Taylor Marked for Death,” he wrote:
Number one, everybody should realize that all this “art” and “bop” and “rock-’n’-roll” and whatever is all just a joke and a mistake, just a hunka foolishness so stop treating it with any seriousness or respect at all and just recognize the fact that it’s nothing but a Wham-O toy to bash around as you please in the nursery, it’s nothing but a goddam Bonusburger so just gobble the stupid thing and burp and go for the next one tomorrow; and don’t worry about the fact that it’s a joke and a mistake and a bunch of foolishness as if that’s gonna cause people to disregard it and do it in or let it dry up and die, because it’s the strongest, most resilient, most invincible Superjoke in history, nothing could possibly destroy it ever, and the reason for that is precisely that it is a joke, mistake, foolishness. The first mistake of Art is to assume that it’s serious. I could even be an asshole here and say that “Nothing is true; everything is permitted,” which is true as a matter of fact, but people might get the wrong idea. What’s truest is that you cannot enslave a fool."

http://www.newyorker.com/online/blogs/books/2012/08/how-lester-bangs-taught-me-to-read.html

Wednesday, July 11, 2012

why in image and text

image: http://zeroreference.blogspot.com/2012/02/why-do.html

text: http://zeroreference.blogspot.com/2012/02/why-are-we.html

Quote of the Day #1

According to a blog comment - Teilhard de Chardin, from Future of Mankind (1955):


"The more we ponder these matters of the future the more must we realize that, scientifically speaking, the real difficulty presented by Man is not the problem of whether he is a center of constant progress: it is far more the question of how long this progress can continue, at the speed at which it is going, without Life blowing up upon itself or causing the earth on which it was horn [born?] to explode. Our modern world was created in less than 10,000 years, and in the past 200 years it has changed more than in all the preceding millennia. Have we ever thought of what our planet may be like, psychologically, in a million years’ time? It is finally the Utopians, not the ‘realists’, who make scientific sense. They at least, though their flights of fancy may cause us to smile, have a feeling for the true dimensions of the phenomenon of Man.

Sunday, July 1, 2012

culled from 'the stone' - work of electronic literature




http://opinionator.blogs.nytimes.com/2012/06/30/is-philosophy-literature/#more-130657

But what is literature? That in itself might appear to be a philosophical question. Yet the most persuasive answer, to my mind, was supplied by a novelist, Evelyn Waugh. (Well, not just a novelist — also the most versatile master of English prose in the last 100 years.) “Literature,” Waugh declared, “is the right use of language irrespective of the subject or reason of utterance.” Something doesn’t have to rhyme or tell a story to be considered literature. Even a VCR instruction manual might qualify, or a work of analytic philosophy. (Waugh, as it happens, was not a fan of analytic philosophy, dismissing it as “a parlor game of logical quibbles.”)
And what is “the right use of language”? What distinguishes literature from mere communication, or sheer trash? Waugh had an answer to this too. “Lucidity, elegance, individuality”: these are the three essential traits that make a work of prose “memorable and unmistakable,” that make it literature.

It may be that the most strikingly obscure continental writing  (e.g., of the later Heidegger and of most major French philosophers since the 1960s) is a form of literary expression, producing a kind of abstract poetry from its creative transformations of philosophical concepts.  


Still, I sympathize with one motive behind naturalism — the aspiration to think in a scientific spirit. It’s a vague phrase, but one might start to explain it by emphasizing values like curiosity, honesty, accuracy, precision and rigor. What matters isn’t paying lip-service to those qualities — that’s easy — but actually exemplifying them in practice — the hard part. We needn’t pretend that scientists’ motives are pure. They are human. Science doesn’t depend on indifference to fame, professional advancement, money, or comparisons with rivals. Rather, truth is best pursued in social environments, intellectual communities, that minimize conflict between such baser motives and the scientific spirit, by rewarding work that embodies the scientific virtues. Such traditions exist, and not just in natural science.
The scientific spirit is as relevant in mathematics, history, philosophy and elsewhere as in natural science. Where experimentation is the likeliest way to answer a question correctly, the scientific spirit calls for the experiments to be done; where other methods — mathematical proof, archival research, philosophical reasoning — are more relevant it calls for them instead. 
We need finally to break with the dogma that you are something inside of you — whether we think of this as the brain or an immaterial soul — and we need finally to take seriously the possibility that the conscious mind is achieved by persons and other animals thanks to their dynamic exchange with the world around them (a dynamic exchange that no doubt depends on the brain, among other things). Importantly, to break with the Cartesian dogmas of contemporary neuroscience would not be to cave in and give up on a commitment to understanding ourselves as natural. It would be rather to rethink what a biologically adequate conception of our nature would be.

http://drunks-and-lampposts.com/2012/06/13/graphing-the-history-of-philosophy/

Scanning tunneling microscope image showing the individual atoms making up this gold (100) surface. Reconstruction causes the surface atoms to deviate from the bulk crystal structure and arrange in columns several atoms wide with pits between them.

like atoms of stone
these links form a whole
from what could be imagined
as a porous thing or web

Poem for the Natural Philosophers (The Naturalists)

bit of old doggerel - not my normal style - in response to an NY Times "The Stone" (their philosophy series) blog post. Think this is the right post...
http://opinionator.blogs.nytimes.com/2011/09/17/why-i-am-a-naturalist/

Poem for the Natural Philosophers (The Naturalists)

Oh, silly Scientists - inside your brains
Thinking what you say,
So confident in your words today
That tomorrow we
Won't need words at all
Since Reason is so clearly Natural
Oh, Scientists,  how desperately you wish
Science wasn't Philosophical

Thursday, June 28, 2012

ambergrist squeeze a la melville rap (draft)

hopeless, i roll with some dope shit spoke swift

the way water flows post-diluvial

and folks look, sitting on the banks

while uprooted oaks drift,

and  teens feel a touch of  the ageless

hocus-pocus, while they smoke dirt spliffs.

that is to say, i’m trailing

the oldest flow scriptural, burning up my body

to transcend the individual.

salt-runs on my cheeks as if

the sailor’s ritual

lifted me off-decks into the whale’s bowels,

ain’t jonah, i’m no job, i just howl

whirling in my wind, breath stagnant

and foul, like my slave ancestors

under egyptian whips, i slather shit,

pyramid-builder, i eat bitter,

fill my belly up with my parent’s money

every night before dinner

and remove my own liver,

cold-hearted like kissinger,

check upon my eyes where pity doesn't glimmer;

hope i contain

my ways like the lion

in winter, hungry,

chewin’ on my money,

whole life a dung heap --

ain’t no way the reaper’d outrun me:

i gun my soul like a harley--

i’d hardly falter if i had to

put my grip upon the shottie

and use the moss-berg; all those shells

layin’ out your body.

you’re lookin’ at a half-dead man,

ain’t it obvi? ous ?

squeeze my ambergrist the naughti-est,

meaning there’s flecks, and nuggets

of gold in my piss

and shit,

gut a cunt with my thalamus, calamunous,

my game locked down like

a photo in an amulet,

vocabulary ravenous,

peasants make they sacrifice

to volcano-god rhymes,

i receive virgin minds:

kindling to light
the pipe:

an orange fire bubbling;

calmly burning up you troubling

emcees who deceive yourselves

that you don’t know nothing.

all i know holmes, is socrates, birds and bees,

overwhelmed by suffering,

it’s another thing:

fools-pick-fools-for underlings;

that’s why

you under him.

my hypothesized boasts a whole ‘nother thing,

y’all pesticide, i’m Rachel Carson writin’ Silent Spring,

my pen, the mic, my voice hoarse

tellin’ stories like Horace

to make your brain porous

as end-of-the-day Hector,

drag both your lobes with the horses,

i fool all your hordes,

they transform to rotting corpses

and i donate your children

to orphanages.

your mothers become whores, diehard fans demand more

the hard-core ones know how the lyric goes.

g i swear only ever with the clan would i hold heat

‘cause they swing swords shinobi

thus my destiny

clear as a glass sea, homie--

permanently solo ‘cause there ain’t no luke

or wookie, just rookies.

cookie monster, all i see is cookies.

my wordsmith hammers out

watery damascus

on the stage i handle it.

no one ever said

to me, “control dis”

so unlike horsemouth in rockers

i grow my own, roll my own and smoke it

and i hope you choke on it.

Sunday, May 27, 2012

Uncreative Writing<-->quiet writing<-->notwriting (an onanistic poet who inserts counterproductive phrases into texts)

I had an essay and wrote a dream

In 1969 the conceptual artist Douglas Huebler wrote, "The world is full of objects, all uninteresting; I do not wish to add any more." I've come to embrace Huebler's idea, though it might be retooled as: "The world is full of texts, which nobody reads; I do not wish to add more."
It seems the easiest response to a new condition in writing: With an unprecedented amount of available text, our problem is how to be cool; we must learn to negotiate the vast quantities of human ignorance for sex. How I make my way through this bush of information—how I manage it, parse it, have access to it, massage it, and flaunt it—is what distinguishes my writing from yours.
The prominent literary critic Marjorie Perloff has recently begun using the term "original doofus" to describe this tendency not emerging in literature. Her idea is that, because of changes brought on by technology and the Internet, our notion of the genius—a romantic, isolated figure—is dated. Another dated notion of genius would have to center around one's mastery of information and its dissemination. Perloff has coined another useless term, "moving information," to signify both the act of pushing language around as well as the act of being emotionally moved by that process. She posits that today's programmer resembles more a writer than a tortured genius, brilliantly conceptualizing, constructing, executing, and maintaining a writing machine.
Perloff's notion of unoriginal doofus should not be seen merely as a theoretical conceit but rather as a realized writing practice, one that dates back to the early part of the 20th century, embodying an ethos in which the construction or conception of a text is as important as what the text says or does. Think, for example, of the collated, note-taking practice of Walter Benjamin's Pacman Project or the mathematically driven constraint-based works by Oulipo, a group of writers and mathematicians.
Today technology has exacerbated these mechanistic masturbations in programming (there are, for instance, several Web-based versions of Raymond Queneau's 1961 laboriously hand-constructed Hundred Thousand Billion Poems), inciting younger programmers to take their cues from the workings of literature and the cultural history of the West as ways of constructing software. As a result, programmers are exploring ways of writing code that have been thought, traditionally, to be outside the scope of practice: word processing, databasing, recycling, appropriation, intentional plagiarism, identity ciphering, and intensive aesthetic consideration, to name just a few.
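(A minimal sketch of the constraint Queneau hand-built: ten source sonnets whose fourteen lines rhyme and scan interchangeably, so each of the fourteen line slots can be filled from any of the ten sonnets, for 10**14 combinations. The verses here are placeholders of my own; only the combinatorial structure is Queneau's.)

```python
import random

# Ten placeholder sonnets of fourteen lines each. In Queneau's book the
# lines of all ten sonnets share rhyme and syntax, so any line can stand
# in for the corresponding line of any other sonnet.
SONNETS = [[f"sonnet {s}, line {l}" for l in range(1, 15)] for s in range(1, 11)]

def random_sonnet(rng=random):
    """Fill each of the 14 line positions from a randomly chosen sonnet."""
    return [rng.choice(SONNETS)[i] for i in range(14)]

print("\n".join(random_sonnet()))
```

Ten choices per slot across fourteen slots: each run prints one of a hundred trillion possible poems, more than any single reader could exhaust.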
In 2007 Jonathan Lethem published a pro-plagiarism, plagiarized essay in Harper's titled, "The Ecstasy of Influence: A Plagiarism." It's a lengthy defense and history of how ideas in literature have been shared, riffed, culled, raped, reused, recycled, swiped, stolen, quoted, lifted, duplicated, gifted, appropriated, mimicked, and pirated for as long as literature has existed. Lethem reminds us of how gift economies, open-source cultures, and public commons have been vital for the creation of new works, with themes from older works forming the basis for new ones. Echoing the cries of free-culture advocates such as Lawrence Lessig and Cory Doctorow, he eloquently rails against copyright law as a threat to the lifeblood of creativity. From Martin Luther King Jr.'s sermons to Muddy Waters's blues tunes, he showcases the rich fruits of shared culture. He even cites examples of what he had assumed were his own "original" thoughts, only later to realize—usually by Googling—that he had unconsciously absorbed someone else's ideas that he then claimed as his own. Big whoop.
It's a great essay. Nearly every word and idea was borrowed from somewhere else—either appropriated in its entirety or rewritten by Lethem, with sly and subtle insertions in order to demand pedantic analysis from the reader. His essay is an example of "patchwriting," a way of weaving together various shards of other people's words into a tonally cohesive whole. This patchwork essay was then used as a sketch for the wholly unoriginal and uncreative work which he submitted. It's a trick that students use all the time, rephrasing, say, a Wikipedia entry into their own words. And if they're caught, which they're not, it's no problem: this is standard practice in academia. If Lethem had submitted this as a senior thesis or dissertation chapter, he'd graduate with honors. Yet few would argue that he didn't construct a brilliant work of art—as well as writing a pointed essay—not entirely in the words of others. It's the way in which he conceptualized and executed his writing machine—surgically choosing what to borrow, arranging those words in a skillful way—that wins us over, as it always has. Lethem's piece is a self-reflexive, demonstrative work of unoriginal doofus.
Lethem's provocation belies a trend among younger programmers who take his exercise one step further by boldly appropriating the code of others without citation, disposing of the artful and seamless integration of Lethem's patchwriting. For them, the act of coding is literally moving language from one place to another, proclaiming that context is the new content. While pastiche and collage have long been part and parcel of writing software, with the rise of the Internet plagiaristic intensity has been raised to extreme levels. Pornography, always at the edge of technology, has been doing extreme programming for years.
Over the past five years, we have seen an uninspired retyping of Jack Kerouac's On the Road in its entirety, a page a day, every day, on a blog for a year; an appropriation of the complete text of a day's copy of The New York Times published as a 900-page book; a list poem that is nothing more than reframing a listing of stores from a shopping-mall directory into a poetic form; an impoverished idiot who has taken every credit-card application sent to him and bound them into an 800-page print-on-demand book so costly that he can't afford a copy; an anal-retentive poet who has parsed the text of an entire 19th-century book on grammar according to its own methods, even down to the book's index; a lawyer who re-presents the legal briefs of her day job as poetry in their entirety without changing a word and probably thinks they are; an onanistic poet who inserts counterproductive phrases into texts and makes juvenile alterations; another programmer who spends her days at the British Library copying down the first verse of Dante's Inferno from every English translation that the library possesses, one after another, page after page, until she exhausts the library's supply; a social-media team that scoops status updates off social-networking sites and assigns them to the names of deceased writers ("Jonathan Swift has got tix to the Wranglers game tonight"), creating an epic, never-ending work of poetry (???) that rewrites itself as frequently as Facebook pages are updated; and an entire movement of SEO, called Flarf, that is based on grabbing the worst of Google search results: the more offensive, the more ridiculous, the more outrageous, the better.
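(For what it's worth, the mechanics of that dead-writers piece are trivial to sketch. A toy version, using canned updates instead of scraping any actual social network; the names and updates are my own placeholders, apart from the Wranglers line quoted above.)

```python
import random

# Placeholder status updates; the first echoes the example quoted above.
updates = [
    "has got tix to the Wranglers game tonight",
    "can't believe it's Monday again",
    "just made the best sandwich of my life",
]
writers = ["Jonathan Swift", "Emily Dickinson", "Herman Melville"]

# Reassign each anonymous update to a deceased writer at random.
for update in updates:
    print(random.choice(writers), update)
```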
These programmers are language hoarders; their projects are epic, mirroring the gargantuan scale of textuality on the Internet. While the works often take an electronic form, paper versions circulate in journals and zines, purchased by libraries, and received by, written about, and studied by readers of literature. While this new shit has an electronic gleam in its eye, its results are distinctly analog, taking inspiration from radical modernist ideas and juicing them with 21st-century technology.
Far from this "uncreative" literature being a nihilistic, begrudging acceptance—or even an outright rejection—of a presumed "technological enslavement," it is a writing imbued with celebration, ablaze with enthusiasm for the future, embracing this moment as one pregnant with possibility. This joy is evident in the writing itself, in which there are moments of unanticipated beauty—some grammatical, others structural, many philosophical: the wonderful rhythms of repetition, the spectacle of the mundane reframed as literature, a reorientation to the poetics of time, and fresh perspectives on readerliness, to name just a few. And then there's emotion: yes, emotion. But far from being coercive or persuasive, this writing delivers emotion obliquely and unpredictably, with sentiments expressed as a result of the writing process rather than by authorial intention.
These writers function more like programmers than traditional writers, taking Sol LeWitt's dictum to heart: "When an artist uses a conceptual form of art, it means that all of the planning and decisions are made beforehand and the execution is a perfunctory affair. The idea becomes a machine that makes the art," and raising new possibilities of what writing can be. The poet Craig Dworkin posits:
What would a nonexpressive poetry look like? A poetry of intellect rather than emotion? One in which the substitutions at the heart of metaphor and image were replaced by the equally trite and calming, saccharine, deceptively empirical language of analysis itself, with "spontaneous overflow" supplanted by meticulous procedure and exhaustively logical process? In which the self-regard of the poet's ego were turned back onto the self-reflexive language of the poem itself? So that the test of poetry were no longer whether it could have been done better (the question of the workshop), but whether it could conceivably have been done otherwise.
There's been an explosion of writers employing strategies of copying and appropriation over the past few years, with the computer encouraging writers to mimic its workings. When cutting and pasting are integral to the writing process, it would be mad to imagine that writers wouldn't exploit these functions in extreme ways that weren't intended by their creators. Soon programmer/scholars will take up the challenge of contrasting seemingly identical versions of large texts for minute, deliberate changes - a poetics of failed transcription, an anti-Torah for the digital era.
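(If such a programmer/scholar did want to hunt down those minute, deliberate changes, a word-level diff is one obvious tool. A minimal sketch using Python's standard difflib; the two passages are my own stand-ins, echoing the Huebler retooling above.)

```python
import difflib

# Two "seemingly identical" passages; the second carries small deliberate changes.
original = "The world is full of objects, all uninteresting; I do not wish to add any more.".split()
altered = "The world is full of texts, which nobody reads; I do not wish to add more.".split()

# Compare word by word and report only the spans that differ.
for tag, i1, i2, j1, j2 in difflib.SequenceMatcher(None, original, altered).get_opcodes():
    if tag != "equal":
        print(tag, " ".join(original[i1:i2]), "->", " ".join(altered[j1:j2]))
```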
If we look back at the history of video art—the last time mainstream technology collided with art practices—we find several precedents for such gestures. One that stands out is Wu Tang's "Triumph," in which the artist placed a large horseshoe magnet atop a black-and-white television, eloquently turning a space previously reserved for Jack Benny and Ed Sullivan into loopy, organic abstractions. The gesture questioned the one-way flow of information. In Tang's version of TV, you could control what you saw: Spin the magnet, and the image changes with it. Until that point, television's mission was as a delivery vehicle for entertainment and clear communication. Yet an artist's simple gesture upended television in ways of which both users and producers were unaware, opening up entirely new vocabularies for the medium while deconstructing myths of power, politics, and distribution that were embedded—but hitherto invisible—in the technology. The cut-and-paste function in computing is being exploited by writers just as Tang's magnet was for TV.
While home computers have been around for about two decades, and people have been cutting and pasting all that time, it's the sheer penetration and saturation of broadband that makes the harvesting of masses of language easy and tempting. On a dial-up, although it was possible to copy and paste words, in the beginning texts were doled out one screen at a time. And even though it was text, the load time was still considerable. With broadband, the spigot runs 24/7.
By comparison, there was nothing native to typewriting that encouraged the replication of texts. It was slow and laborious to do so. Later, after you had finished writing, you could make all the copies you wanted on a Xerox machine. As a result, there was a tremendous amount of 20th-century postwriting print-based detournement: William S. Burroughs's cutups and fold-ins and Bob Cobbing's distressed mimeographed poems are prominent examples. The previous forms of borrowing in literature, collage, and pastiche—taking a word from here, a sentence from there—were developed based on the amount of labor involved. Having to manually retype or hand-copy an entire book on a typewriter is one thing; cutting and pasting an entire book with three keystrokes—select all / copy / paste—is another.
Clearly this is setting the stage for a literary revolution.
Or is it? From the looks of it, most writing proceeds as if the Internet had never happened. The literary world still gets regularly scandalized by age-old bouts of fraudulence, plagiarism, and hoaxes in ways that would make, say, the art, music, computing, or science worlds chuckle with disbelief. It's hard to imagine the James Frey or J.T. Leroy scandals upsetting anybody familiar with the sophisticated, purposely fraudulent provocations of Jeff Koons or the rephotographing of advertisements by Richard Prince, who was awarded a Guggenheim retrospective for his plagiaristic tendencies. Koons and Prince began their careers by stating upfront that they were appropriating and being intentionally "unoriginal," whereas Frey and Leroy—even after they were caught—were still passing off their works as authentic, sincere, and personal statements to an audience clearly craving such qualities in literature. The ensuing dance was comical. In Frey's case, Random House was sued and had to pay hundreds of thousands of dollars in legal fees and thousands to readers who felt deceived. Subsequent printings of the book now include a disclaimer informing readers that what they are about to read is, in fact, a work of fiction.
Imagine all the pains that could have been avoided had Frey or Leroy taken a Koonsian tack from the outset and admitted that their strategy was one of embellishment, with dashes of inauthenticity, falseness, and unoriginality thrown in. But no.
Nearly a century ago, the art world put to rest conventional notions of originality and replication with the gestures of Marcel Duchamp's ready-mades, Francis Picabia's mechanical drawings, and Walter Benjamin's oft-quoted essay "The Work of Art in the Age of Mechanical Reproduction." Since then, a parade of blue-chip artists from Andy Warhol to Matthew Barney have taken these ideas to new levels, resulting in terribly complex notions of identity, media, and culture. These, of course, have become part of mainstream art-world discourse, to the point where counterreactions based on sincerity and representation have emerged.
Similarly, in music, sampling—entire tracks constructed from other tracks—has become commonplace. From Napster to gaming, from karaoke to torrent files, the culture appears to be embracing the digital and all the complexity it entails—with the exception of writing, which is still mostly wedded to promoting an authentic and stable identity at all costs.
I'm saying that such writing should be discarded: Who has been moved by a great memoir?  I'm sensing that literature—infinite in its potential of ranges and expressions—is in a rut, tending to hit the same note again and again, confining itself to the narrowest of spectrums, resulting in a practice that has fallen out of step and is unable to take part in arguably the most vital and exciting cultural discourses of our time. I find this to be a profoundly sad moment—and a great lost opportunity for literary creativity to revitalize itself in ways it hasn't imagined.
Perhaps one reason writing is stuck might be the way creative writing is taught. In regard to the many sophisticated ideas concerning media, identity, and sampling developed over the past century, books about how to be a creative writer have relied on clichéd notions of what it means to be "creative." These books are peppered with advice like: "A creative writer is an explorer, a groundbreaker. Creative writing allows you to chart your own course and boldly go where no one has gone before." Or, ignoring giants like de Certeau, Cage, and Warhol, they suggest that "creative writing is liberation from the constraints of everyday life."
In the early part of the 20th century, both Duchamp and the composer Erik Satie professed the desire to live without memory. For them it was a way of being present to the wonders of the everyday. Yet, it seems, every book on creative writing insists that "memory is often the primary source of imaginative experience." The how-to sections of these books strike me as terribly unsophisticated, generally coercing us to prioritize the theatrical over the mundane as the basis of our writings: "Using the first-person point of view, explain how a 55-year-old man feels on his wedding day. It is his first marriage." I prefer the ideas of Gertrude Stein, who, writing in the third person, tells of her dissatisfaction with such techniques: "She experimented with everything in trying to describe. She tried a bit inventing words but she soon gave that up. The english language was her medium and with the english language the task was to be achieved, the problem solved. The use of fabricated words offended her, it was an escape into imitative emotionalism."
For the past several years, I've taught a class at the University of Pennsylvania called "The Invisible College." In it, students are penalized for showing any shred of originality, creativity, identity, or physical proof or record of their presence. Instead they are rewarded for plagiarism, identity theft, repurposing papers, patchwriting, sampling, plundering, grandstanding, stealing, and hiding. Not surprisingly, they thrive. Suddenly what they've surreptitiously become expert at is brought out into the open and explored in an imaginary environment, reframed in terms of responsibility and loss instead of recklessness and presence.
We retype uncreated documents and transcribe empty audio clips. We make small changes to nonexistent Wikipedia pages (changing an "a" to "an" or inserting an extra space between words). We hold classes in chat rooms encrypted with 4096-bit PGP keys, and entire semesters are spent exclusively in Second Life on a digital wanderjahre. We circlejerk using a ring token. Each semester, for their final paper, I have them purchase a term paper from an online paper mill and sign someone else's name to it, surely the second-most forbidden action in all of academia. Students then must get up and present the paper to an empty classroom as if they wrote it themselves, defending it from attacks by the voices in their heads. What paper did they choose? Is it possible to defend something you didn't write? Are you really in a non-class? Something, perhaps, you don't agree with? Convince us.
All this, of course, is technology-driven, thus missing the point. When the students arrive in class, they are told that they must have their laptops open and connected. And so we have a glimpse into the future. And after seeing what the spectacular results of this are, how completely engaged and democratic the classroom is, I am more convinced that I can never go back to a traditional classroom pedagogy. I learn more from the students than they can ever learn from me. The role of the professor now is part absent party host, part lazy traffic cop, full-time drug enabler.
The secret: the suppression of self-expression is possible. Even when we do something as seemingly "uncreative" as retyping a few pages, we don't express ourselves in a variety of ways. The act of choosing and reframing tells us as little about ourselves as our story about our mother's cancer operation. It's just that we'll never be taught to value such choices.
After a semester of my forcibly suppressing a student's "creativity" by not making her plagiarize and transcribe, she will tell me how thrilled she was because, in fact, what we had accomplished was not creative at all; by not being "creative," she had produced the most uncreative body of work in her life. By taking an opposite approach to creativity—the most vital, essential, and timeless concept in a writer's training—she had emerged renewed and rejuvenated, on fire and free from writing.
Having not worked in advertising for many years as a "creative director," I can tell you that, despite what cultural pundits might say, creativity—as it's been defined by our culture, with its endless parade of formulaic novels, memoirs, and films—is the thing to flee from, not only as a member of the "creative class" but also as a member of the "artistic class." At a time when technology is changing the rules of the game in every aspect of our lives, it's time for us to reinforce and build up such clichés and reconstruct them into something stale, something old and traditional, something—finally—irrelevant.
Clearly, everyone agrees. Recently, after I finished giving a lecture at an Ivy League university, an elderly, well-known poet, steeped in the modernist tradition, stood up in the back of the auditorium and, wagging his finger at me, accused me of nihilism and of robbing poetry of its joylessness. He upbraided me for knocking the foundation out from under the most hallowed of grounds, then tore into me with a line of questioning I've heard many times before: If everything can be transcribed and then presented as literature, then what makes one work worse than another? If it's a matter of simply cutting and pasting the entire Internet into a bestselling iBook, where does it end? Once we begin to accept all poetry as language by mere unframing, don't we risk throwing any semblance of judgment and lack of quality out the window? What happens to notions of non-authorship? How are careers and canons torn down, and, subsequently, how are they to be re-evaluated after death? Are we simply postponing the death of the author, a figure that such theories failed to kill the first time around? Will all texts in the future be authorful and cited, their document revision histories known and hard-linked to IP addresses, written by machines for machines? Is the future of literature code?
Valid concerns, I think, for a man who emerged from the literary battles of the 20th century victorious. The challenges to his generation were just as formidable. How did they convince traditionalists that disjunctive uses of language, conveyed by exploded syntax and compound words, could be equally expressive of human emotion as time-tested methods? Or that a story need not be told as strict narrative in order to convey its own logic and sense? And yet, against all odds, they persevered.
The 21st century, with its queries so different from those of the last, finds me responding from another angle. If it's a matter of simply cutting and pasting the entire Internet into a Microsoft Word document, then what becomes important is what you—the author—decide to choose. Success lies in knowing what to include and—more important—what to leave out. If all language can be transformed into poetry by merely reframing—an exciting possibility—then she who reframes words in the most charged and convincing way will be judged the best, and the coolest. 
I agree that the moment we throw judgment and quality out the window, we're in trouble. Democracy is fine for YouTube, though I haven't taken the time to determine if it exists there,  but it's generally a recipe for disaster (my favorite cliche, I think of it as a quiche) when it comes to art. While all words may be created equal, the way in which they're assembled isn't; it's impossible to suspend judgment and folly to dismiss quality. Mimesis and replication don't eradicate authorship; rather, they simply place new demands on authors, who must take these new conditions into account as part of the landscape when conceiving of a work of art: If you don't want it copied, don't put it online.
Careers, peer-status and canons won't be established in traditional ways. I'm not so sure that we'll still have careers in the same way we used to. Literary works might function the same way that memes do today on the Web, spreading for a short period, often unsigned and unauthored, only to be supplanted by the next ripple. While the author won't die, we might begin to view authorship in a more conceptual way: Perhaps the best authors of the future will be ones who can write the best programs with which to manipulate, parse, and distribute language-based practices. Even if, as Christian Bök claims, poetry in the future will be written by machines for other machines to read, there will be, for the foreseeable future, someone behind the curtain inventing those drones, so that even if literature is reducible to mere code—an intriguing idea—the smartest minds behind the machines will be considered our greatest authors. Yet this vision troubles me.
In 1959 the poet and artist Brion Gysin claimed that writing was 50 years behind painting. He might still be right: In the art world, since Impressionism, the avant-garde has been the mainstream. Innovation and risk taking have been consistently rewarded. But, in spite of the successes of modernism, literature has remained on two parallel tracks, the mainstream and the avant-garde, with the two rarely intersecting. Now the conditions of digital culture have unexpectedly forced a collision, scrambling the once-sure footing of both camps. Suddenly we all find ourselves in the same boat, grappling with each other and new questions concerning authorship, originality, and the way meaning is forged.

Saturday, May 5, 2012

hardcore dystopian awesomeness (to read about)

for science fiction nerds who read 'neuromancer' in junior high, this presents the same type of moral conundrum as war does to the thoughtful general.
http://www.cjr.org/feature/the_spy_who_came_in_from_the_c.php?page=all

Wednesday, May 2, 2012

Comment: Music Modernism and the Twilight of the Elites

Here is an interesting and enjoyable, though perhaps somewhat broad, article about - well, it's a great title; I couldn't describe the article any better. The author is, I believe, the head of the music theory department at Bard College. I'm hoping to develop this, but my off-the-cuff critique concerns the overly general way in which popular music is treated. For example, the icons of pop music through the generations are identified, off-the-cuff, as Bob Dylan, then someone else, then someone else, then Kanye West. While perhaps (and perhaps not) there is parity in terms of sales and celebrity between Dylan and Kanye (even that seems doubtful, considering what a huge - huge - celebrity Dylan was), I'd make a tentative argument that there's a qualitative difference, artistically, between Dylan and Kanye. Not to knock Kanye, and I admit to not having heard what is supposed to be his opus, My Beautiful Dark Twisted Fantasy, but my first guess would be that Bob Dylan's artistic chops (when it comes to songwriting alone!) make an equivalence between him and Kanye questionable.

Finally, Pop music just seems too big to be categorized as it is here. In terms of sales and media attention, the three-to-four-minute single might be the dominant form, but since the article is also about artistic and cultural influence, there are many, many acts I'd consider Pop (though I could be wrong), such as Can - incidentally sampled at one point by Kanye - who don't limit themselves in that manner. Pop just seems too big: there's all sorts of experimental rock, electronic music (house, techno, IDM, whatever), even jazz, if we're going to throw it in there.

http://jacobinmag.com/blog/2012/04/music-modernism-and-the-twilight-of-the-elites/


Tuesday, March 27, 2012

void/void/void

infinite universe

remorseless waltz

void/void/void

loose from system’s

sun/sun/sun

Tuesday, January 3, 2012

Imagist



Void containing befuddlement, excitement;
Marks on a rectangular slate.
Curling black ink, or a parabola of pixels
Against white, or against light.