Oops, I did it again. By which I mean spawned another robot child. This one’s a little different from its elder sibling, though. First off, it tweets about The Waves, not Mrs Dalloway. But more importantly, it doesn’t just spew out whatever I tell it to. Rather, the bot writes its own material. Kinda.

That’s only a half-truth. What I actually did was write a program in Python (with a great deal of help from this tutorial by Robin Camille Davis and Mark Eaton for CUNY’s Journal of Interactive Technology and Pedagogy) that uses a package called Markovify to spit out Markov chain-generated remixes of The Waves.
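The guts of the program are surprisingly small. Here’s a minimal sketch of the idea – the file name is my stand-in, and the real bot’s settings may well differ:

```python
import markovify

# Load the full text of The Waves (file name is a placeholder)
with open("the_waves.txt") as f:
    text = f.read()

# Build a Markov model of the text. state_size=2 means each
# 'state' the chain moves between is a pair of consecutive words.
model = markovify.Text(text, state_size=2)

# Spit out a remixed sentence short enough to tweet
print(model.make_short_sentence(280))
```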

A Markov chain is a type of mathematical model that passes from one state to another. These states can be anything – weather patterns, football scores, or the words in a novel – it doesn’t matter. What matters is that the data pass from state to state, and that each transition depends only on the current state, not on anything that came before it. A Markov chain models each possible transition in a set of data, based on what the last data point was. Or, more simply put, if you had a set of data that was a single sentence, ‘the cat sat on the mat, the bug sat on the rug’, then a diagram of that Markov chain would look like this:

created using AGL: see the code at https://rise4fun.com/Agl/BZVm
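If the diagram is hard to parse, here’s the same model as a plain Python dictionary, built by counting which word follows which (splitting naively on whitespace keeps the comma attached to ‘mat’):

```python
from collections import defaultdict

words = "the cat sat on the mat, the bug sat on the rug".split()

# Map each word to a count of every word that follows it
transitions = defaultdict(lambda: defaultdict(int))
for current, following in zip(words, words[1:]):
    transitions[current][following] += 1

print(dict(transitions["the"]))
# {'cat': 1, 'mat,': 1, 'bug': 1, 'rug': 1}
```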

If you were to run Markovify on this teeny-tiny sample dataset, you might well get remixes that look like these:

the cat sat on the bug

or

the rug

Or even

the mat the mat the mat the mat the mat the mat the mat

(That last one continues ad infinitum, but you’ll have to imagine it.)
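You can try this at home, with one wrinkle: by default Markovify rejects any output that overlaps too heavily with its input, which on a corpus this tiny is everything. The sketch below turns that check off, and uses single-word states to match the diagram:

```python
import markovify

corpus = "the cat sat on the mat, the bug sat on the rug"

# state_size=1: each state is a single word, as in the diagram.
# test_output=False: don't reject remixes for resembling the
# original too closely (inevitable with twelve words of input).
model = markovify.Text(corpus, state_size=1)
for _ in range(3):
    print(model.make_sentence(test_output=False))
```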

But what happens when you run Markovify on a dataset that starts ‘The sun had not yet risen,’ and ends ‘The waves broke on the shore,’ and which has 77,462 words between these two sentences? You get a bot that tweets stuff like this:

All the right words, but not necessarily in the right order. And one of those words is ‘pimple.’ Who knew?
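If you’re curious how the remixes end up on Twitter, the whole pipeline is pleasingly short. Here’s a rough sketch of the posting side – the tweepy library, the credentials and the file name are all my stand-ins, not the bot’s actual code:

```python
import markovify
import tweepy

# Build the model as before (file name is a placeholder)
with open("the_waves.txt") as f:
    model = markovify.Text(f.read())

# Authenticate with Twitter (keys are placeholders)
auth = tweepy.OAuthHandler("CONSUMER_KEY", "CONSUMER_SECRET")
auth.set_access_token("ACCESS_TOKEN", "ACCESS_TOKEN_SECRET")
api = tweepy.API(auth)

# Generate a remix that fits in a tweet, and post it
tweet = model.make_short_sentence(280)
if tweet:  # make_short_sentence can return None if it gives up
    api.update_status(tweet)
```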

So, why The Waves, he asks rhetorically? Well, The Waves is unlike anything else Woolf wrote, and quite possibly unlike anything anyone else wrote. Woolf didn’t call The Waves – a set of interwoven monologues from six speakers, punctuated by nine italicised interludes where no one seems to narrate – a novel. Instead, she called it ‘a new kind of play [. . .] prose yet poetry; a novel & a play.’

The Waves is linguistically interesting as well. While it has its fair share of striking, sui generis phrases – Jinny’s ‘fulvous dress’ springs to mind – it also repeats itself a fair bit. Certain details, like Bernard’s ‘butterfly powder’ and Louis’ father, who is ‘a banker at Brisbane’, crop up again and again like leitmotifs, while each monologue is marked by the formula ‘said X’: ‘said Bernard,’ ‘said Jinny,’ ‘said Susan’ and so on. Take another look at the diagram: the repeated words ‘sat,’ ‘on,’ and ‘the’ each appear only once, but each has multiple arrows coming off it: they act like nodes connecting all the other words. The Waves’ repeated formulas and motifs act in much the same way, becoming richly generative points in a reconfigured landscape.

Which brings me to a broader point about using Markov chains as a way of reconfiguring texts. As human readers with human eyes and human brains, it’s hard for us to look at a text in the same way a Markov chain does. We read sequentially, from left to right (in English, anyway), page after page. But my Markov chain bot reads The Waves like a network, one in which every occurrence of a word collapses into a single node – so a phrase from the first page can flow straight into one from the last, with no regard for where either sits in the text.

Rather than reading like a human, my bot reads rhizomically. For someone who’s read more Deleuze and Guattari than can be considered healthy (so, any Deleuze and Guattari…), that’s a terribly exciting prospect. Deleuze and Guattari open A Thousand Plateaus by loudly announcing the inadequacy of the book, which, they argue, embodies a logic they call ‘arborescent’ – tree-like. It’s a logic that is rigid, governed by temporality and cause and effect. It only moves in one direction, and that direction is predetermined, governed by the author of the book.

They contrast this with the rhizome, which is more akin to the roots of a plant, or a mycelium, the underground part of a fungus. This is a network which moves horizontally, along many lines at once, without privileging any set path or teleology.

Now, something has always bugged me about this. First off, trees don’t work like that: Deleuze and Guattari weren’t very good botanists. Second, they write about the inadequacy of arborescent thought in a book, printed on dead tree matter. While they encourage their readers to skip around from one chapter to another, you’re still reading a book written in characters that go from left to right, one page after the other.

(Even hypertext doesn’t quite cut it – you can move around in hypertexts far more easily than you can a physical book, but you’re still stuck putting one word after the other…)

But my Markov chain bot doesn’t read like that. It acts more like Deleuze and Guattari’s rhizomic reader than a human reader can. As readers, we can’t very easily get a handle on how Markovify does this – if you want to take a look at the data model that the bot generates, you can do so here, but it’s utterly unreadable. I can’t even begin to tell you how it works. But the tweets it generates give us an insight into what it’s like to read rhizomically.
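If you want to generate that unreadable blob yourself, Markovify will serialise the model to JSON – in essence a giant mapping from each state to the words that can follow it, with counts. A sketch, again with stand-in file names:

```python
import markovify

with open("the_waves.txt") as f:
    model = markovify.Text(f.read())

# Serialise the underlying chain to JSON: every two-word state,
# every word that can follow it, and how often it does
with open("waves_model.json", "w") as f:
    f.write(model.to_json())
```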

I’m just about done, so I’ll leave you with my bot’s last words on the matter: