A Superfluous Man


At first, the rumors had just seemed like sensational click-bait trash, but as the Algorithm steadily improved, my initial scoffing was replaced by a wavering shrug that at least it would never happen in my lifetime. When, of course, it did happen in my lifetime, my colleagues and I had assured each other that the proliferation of “algorithmically-enhanced” novels might be able to put the genre hacks and ghost writers out of business, but that when it came to high literature, to “art,” there was no replacing the human mind. That’s how it played out for a while. Hack-job mystery thrillers were pumped out with their usual mechanical regularity, only now some of them were the product of the Algorithm to some extent. No doubt there were some earth-shaking changes to that particular part of the literary economy — for example, the fact that with the assistance of the technology, established writers could produce material at an accelerated rate, crowding out aspiring unknowns — but those changes didn’t affect us self-defined high-brow types.

A federal law was passed, stating that algorithmically-enhanced books had to carry a written label informing readers that the work was not the product of human effort alone. At first, this was a selling point; customers would buy up computer-generated pulp simply out of curiosity. Later, when algorithmic enhancement had become an accepted part of the literary landscape, the label had begun simply to blend into the background, just like the lists of poisons we all shrug at when we purchase baskets full of chemical goodies from our local big-box chains.

In short order, the price of the computer program, initially only affordable to the biggest commercial publishers, had been reduced to such an extent that the production of a consumer version of the Algorithm became a viable line of business, creating a fad among a certain class of well-heeled yuppies who thought of themselves as “artsy” without ever actually having produced anything artistic. They’d pony up the cash for the kit, feed their half-baked outlines into the computer, strap on the electrodes, and let the machine finish their work. They’d get a little keepsake out of it, a professional-looking vanity publication to put on their coffee table. An ego-boosting conversation starter. I’d felt vindicated at the time, thinking that algorithmic enhancement would never be anything more than an ego-stroke for the perpetually insecure bourgeoisie, or a cheat code for hacks and memoirists.

Then had come the publication of Grimly Gray, Prairie Sky. Unlike the usual semi-cultured yuppies with pretensions of creativity, the “author” of Grimly Gray had been a Kansas Instagram influencer with a surgically-enhanced ass and enough unabashed laziness to have simply hooked up the electrode set to her temples and gone to sleep without providing the Algorithm with any sort of outline, notes, or creative guidance at all. She’d awakened to a finished work which, though I’ve little doubt she was unable to appreciate it herself, drew universal acclaim for its literary genius. Thematically, this story of a lonely, insecure country girl, morally corrupted by the burden of her unearned physical beauty, was unmistakably the product of its originator’s mind and heart, though not of her talent, craft, or effort. Grimly Gray had captured the hidden inner turmoil of this seemingly empty-headed Barbie-doll in a much more convincing and profound manner than her own efforts could have ever done.

When I first read the book, I must confess to hoping desperately that I would find it shallow and unconvincing, its artificiality somehow manifest. If only the accolades had been unearned! But when I turned the final page, with weepy eyes and fluttering heart, I knew that I had borne witness to what would have been unambiguous literary genius, were it not for the fact that Grimly Gray, Prairie Sky was the work of the Algorithm alone, with not a moment of human effort invested in its production.

After reading Grimly Gray, I proceeded to some of the works that followed in its wake: the story of a small-time accountant in Kansas City whose myopic pursuit of his insignificant career rendered him insensible to the elapsing of his life’s scarce days; the harrowing tale of an alcoholic Navajo child molester. And lest you think that the Algorithm is only capable of producing mere memoirs, think again! Each of these accounts was fictionalized to the point of thematic congruence, and suffused with unique style and artistry appropriate to its content. The novel generated from the consciousness of a professor of economic history, for instance, had been set in a dystopian alterverse, thick with Joycean references, a true representation of the man’s labyrinthine brilliance, and generated in less than 24 hours simply by hooking up the electrodes before he went to sleep. Each artificial, computer-generated novel was more brilliant than the last, as the chilling black box of machine learning refined the Algorithm ever further.

We all tried our best to be Luddites. There were some cynical careerists in the literary world willing to go out and lie in the press, to pretend that the new novels were obviously distinguishable as artificial, but when the Deasy test[1] was administered and passed in a number of double-blind studies, it was shown, at least to the satisfaction of the public, that such emperors in fact wore no clothes.

I think it was clear to all of us that the cultural landscape had forever shifted, and that we minor purveyors of literature would see our meager livings dissipate quite swiftly indeed, though no doubt some of the big names with devoted readerships would be able to keep selling books for the rest of their careers. The initial media response was jubilant, presenting the usurpation of literature as an emancipation from the tyranny of a white, male artistic elite that was not representative of the society to which it purported to hold a mirror up. The capacity of journalists to fail to see beyond the tips of their own noses has never failed to amaze me: Somehow, they failed to see that this technological development would make history of their meager talents, and in short order at that. Within two years of the publication of Deasy (2035), an estimated 75% of media content in the English-speaking sphere was being artificially generated.[2] Soon, the various academic literatures will be permeated by it, though for now algorithmic enhancement is banned on the pretext of “professional standards” and “academic honesty.” The automation of scholarship will start with the humanities, whose journals have already displayed on several occasions — notably in the case of the Sokal Affair[3] — a complete inability even to distinguish actual gibberish from genuine “scholarship.” From there it will only spread. I can imagine certain logistical hurdles when it comes to the experimental and observational sciences, but with humans acting as mere subalterns, performing the rote motions of experimental procedure and data entry, one can imagine innumerable studies being algorithmically produced in no time at all. Perhaps I flatter myself and my literary brethren in my belief that the terse, technical prose of scientific journals will present much less of a challenge to the Algorithm than that which we call art.

Of course, given the precipitous decline of written media already in evidence since the spread of television, the consequences of cultural automation have so far been confined to a minor, dying branch of creative activity. In time, however, the Algorithm will master image and, perhaps more importantly, voice generation, at which point one imagines that It will become capable of producing something more to the taste of the folks at home. I can imagine something analogous to animated film, or perhaps, if it chooses to take a more interactive tack, akin to video games. At this point, the act of cultural creation will have passed entirely from human hands.

The anti-white, anti-male nastiness of the press aside, I will confess to perceiving a glimmer of truth in their effluvia, given that the former “creative class” possessed such an undeniable predilection for degeneracy and vice, or even just a tendency to deviate wildly from the mean. Perhaps it is for the best that our outsized influence has been telescoped down to nothing. If there’s one thing more foolhardy than democracy, it is the privileging of a society’s deviants via the cult of Art, a regrettable trend that has haunted Europe since at least the Renaissance. But you must forgive me. Indulging in such prolix musings as these is an old habit of mine, a holdover from the days when there were some few who hung onto my every printed word; you see, as I’ve already implied though not yet stated, I was once a novelist. A novelist in a dying literary culture, to be sure, born far too late for there to exist anyone like a Dickens or a Tolstoy in terms of contemporary cultural preeminence, but a novelist nonetheless, one of the lucky few able to pay his meager bills with his scribblings — though no longer.

I suppose it was mere vanity that allowed us to believe replacement would never come for us.

Humans have always been the greatest source of mayhem and inefficiency in human systems; from an engineering perspective, there is no better solution than to eliminate us from society entirely. What a source of inefficiency it was, for instance, to organize ourselves along such arbitrary lines as those of family and community, which inevitably produced such sub-optimal effects as clannishness and provinciality. Given such anachronistic modes of organization as these, a society’s worker-units are nowhere near fungible enough to be shunted here and there with the speed required in the large-scale commercial enterprise that is modern civilization! But an artist’s vanity is nigh boundless, and I suppose we just assumed there was some qualitative difference between our sort of labor and that of the dum-dums who were put out to pasture in the decades before us, left to swallow opium and corn syrup until their days finally and mercifully elapsed. We liked to believe we were irreplaceable.[4]

In any case, when Ikari published his famous op-ed, “We Are Defeated,” in the Gray Lady’s weekend art feature, it had been but a formal admission of what had clearly been the case ever since that Kansas bimbo’s literary tour de force had been discovered. It turns out that narrative art is not so subtle or complex after all. It turns out that the aggregate of any given human’s life experiences forms a raw material from which can be refined, by the application of a fairly simple mathematical method, an essence that is bracing and pure, containing within it that odd and elusive sense of universality that emerges from the inspired encapsulation of some curated set of particulars, which is the paramount achievement of literary endeavor, if you ask me (which you did not).

The real humdinger of the whole situation, to come back to Grimly Gray, Prairie Sky, had been that what set “her” work apart from the rest of the algorithmically-assisted hack-jobs published up to that point was her complete lack of conscious input. There had followed a great deal of A/B testing, comparing the pure output of the Algorithm to the algorithmic enhancement of outlines, notes, and finished works written by our society’s leading literary lights. The results had been unambiguous: any constraint placed upon the workings of the Algorithm by the writer’s conscious vision of the work amounted to nothing more than a corrupting factor, even when the “corruption” originated in the mind of one of our decaying society’s foremost authors. It had been the Pulitzer Prize-winning Ikari’s experience of comparing the Algorithm’s output to his own organic and unmistakably inferior prose that had inspired him to write the aforementioned op-ed.

God rest him, Shinji was a good man, or seemed like it at least from the couple of times I met him. If I’d come up with my masterstroke in time to share it with him, maybe that would have kept the noose from around his neck. Or perhaps it wouldn’t have helped him after all to know that I have discovered a meager place for man in the workings of algorithmic art, for even in my grand design, man is reduced to a mere cog, a conduit, though (I believe) an irreplaceable one in art’s perfection. But yes, let us get to the central purpose of this little essay: the description of my minor insight, and the possible artistic revolution that it heralds.

One not insignificant consequence of my replacement has been the sudden bounty of time I have for reading. I’d always read just enough to provide grist to the proverbial mental mill, keeping abreast of the latest and greatest experiments in form and diction, content and character, which my forebears and contemporaries had brought to the press. I had read, that is, only insofar as that input would facilitate my literary output. Not since I was a child have I been able to read without ulterior motive, allowing myself to be transported by beauty. And with the recent profusion of top-flight, artificial literature, each work of a quality that would, in other epochs, have been hailed as transcendent genius, there sure isn’t any lack of reading material. The only reason I can imagine that no one has made the same modest breakthrough as I have is that reading simply no longer occurs to people as a potential activity. The fact that ingenious literary production is now the province of every idiot with a computer has diminished the patina of rarefaction that once motivated many readers, I suspect, to consume literature. In other words, reading “high literature” no longer places one in particularly distinguished company. I think it may have been old Kundera who first envisioned the society in which everyone is a writer, a speaker, a creator, with no beholders and no awestruck, reverent acolytes, or even readers; there is no discourse, no collective processing of an epoch’s common reality. There is no “public” anymore. There is only a palaver of unheard voices, droning endlessly into the void and, in time, falling silent when it becomes clear that to speak, unheard, is for naught.[5]

But I’ve drifted off from the point again. Forgive me. Enough of this indulgence in what was once my heart’s true love (scribbling away in a bright, quiet room). As I said, it’s a bad habit I acquired in my former career, this going on and on, as if anyone is out there listening. There’s no time for that; I’ve work to do, to preserve a little corner for superfluous man in the great machine that will generate our art. My insight sprang from a question that haunted me in the days immediately following my first reading of Grimly Gray, Prairie Sky. Grimly Gray, like all previous works of literature, found in the life experiences of its author the raw material from which art could be refined.

What, I asked myself, might an art consist of, whose raw material (i.e. the author’s life) was the mere consumption of media of one sort or another? What might the Algorithm make of a gristless mill, so to speak? What might this superhuman Algorithm make of a life consisting entirely of the monklike consumption of media, of mediated reality? What might It make of a life spent consuming only that media which is itself the product of lives spent ensconced in mediated reality: a doubly, triply mediated art? Imagine, if you can, an art of the n-th order of mediation! Paper can be folded seven times and no more; how many times can our dear Algorithm be folded over upon itself before spitting out an unbroken line of zeroes, infinite and perfect in its emptiness?

Is it the Algorithm being folded over, or is it, perhaps, reality itself?

I’ve resolved to perform the first folding. I’ll spend my remaining days in perfect solitude, consuming only the machine-generated literary distillations of the lives of others. In the first year after the publication of Grimly Gray alone, there was enough material generated to see me through this fleeting life. And when at last my time is through, in lieu of a priest I’ll have brought to me the sacred electrodes, and the machine on which my testament will be laid down, a finely-spun web, a filigree of air. From there, I suppose it will be down to the winds of fate: whether others will follow me, whether others them, whether the next fold will be effected — but if so, my God, what beauty might occur!

I can see it now: monasteries full of us sacred readers, generations of lives spent in the contemplation of lives mechanically distilled into literature, those lives themselves spent in the contemplation of further and further mediated lives, building layer upon layer of perfect abstraction, with each fold a further refinement, each iteration of the algorithm a purer manifestation of form. In the limit, as time approaches infinity, we will reach perfection, and the nasty, dirty business of living will be eliminated in its totality.

*  *  *


Notes

[1] Essentially a more exacting version of the classic Turing test, in which critics and academics are given the task of evaluating the quality of a piece of computer-generated writing, which the experimenters falsely attribute to various sources: e.g. an undergraduate creative writing student, or the recently discovered papers of a deceased man of letters. If the writing samples successfully “pass” as the product of human endeavor, the Deasy test is passed. First published by Deasy (2035), and later replicated by Khan et al. (2035) and Sigler and Chen (2036), among many others. Capelli and Knowles (2036) were the first to publish A/B tests comparing the quality of artificial literature to that of the work of published novelists. Further studies too numerous to name have been published in the wake of literary automation; the interested reader may access them with the touch of a button.

[2] Bahdajan, Kremer, & Stojakavich, “Quantifying the Rise of the Machines,” Media Studies Quarterly 10, no. 4 (2038).

[3] A scholarly hoax perpetrated in 1996 wherein Alan Sokal, a physicist, successfully published a satirical piece declaring quantum gravity to be a social and linguistic construct in a heretofore respectable journal of postmodern cultural studies.

[4] Though previous decades’ public discussion of automation centered on the plight of the industrial worker, the menial laborer, and the trucker in a world of self-driving automobiles, in truth it was women who were eliminated first. Framed as an emancipation, the mechanization of the crafts that defined their unique role as wives and mothers made them superfluous. The institutionalization of childcare, of “education,” finished the job, and homemaking as a trade was eliminated, an archaism I was born three generations too late to have witnessed. Indeed, I’ve never met a master craftswoman among my people. No doubt they still exist in the pre-modern (but steadily “developing,” never fear) corners of the world, as well as in the deliberately anti-modern religious sects that hermetically seal themselves off from corruption. But in my lifetime here in the decadent modern West, I’ve only ever known what I might call, with apologies, “non-men.” The glad-handed, false-faced destruction of woman’s raison d’être created a great void of idleness that, in hindsight, could only have been filled with destructive lashings-out. Look at the moral decrepitude of the narcotized unemployables of Obsoletistan and tell me that it’s any surprise that women have turned vulgar and wanton as a consequence of their replacement.

[5] “Once the writer in every individual comes to life (and that time is not far off), we are in for an age of universal deafness and lack of understanding” (Milan Kundera, The Book of Laughter and Forgetting, 1980, cited in Ruth Ozeki, A Tale for the Time Being, 2013).