Some time ago I had the honor of attending an interesting talk by Tijmen Blankevoort on neural networks and deep learning. Convolutional and recurrent neural networks were topics that had already caught my interest, and this talk inspired me to dive deeper into them and do some more experiments.
In the same session, organized by Martin de Lusenet for Ziggo (a Dutch cable company), I also had the honor of giving a talk; my presentation covered a text mining experiment that I did earlier on the Dutch TV soap GTST (“Goede Tijden Slechte Tijden”). Tijmen had a nice idea: why not use deep learning to generate new episode plots for GTST?
So I did that; see my LinkedIn post on GTST. However, these episodes are in Dutch and I guess only interesting to people here in the Netherlands. So to make things more international and a bit spicier, I generated some new texts with deep learning based on the erotic romance novel 50 Shades of Grey.
More than plain vanilla networks
In R or SAS you have long been able to train plain vanilla neural networks: the so-called fully connected networks, where all input nodes are connected to all nodes in the first hidden layer, and all nodes in a hidden layer are connected to all nodes in the following hidden layer or the output layer.
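To make “fully connected” concrete: each layer is just a matrix multiplication followed by an activation, so every input node feeds every node in the next layer. A minimal sketch in Python with NumPy (all sizes and names here are made up for illustration, not the networks from the experiment):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy fully connected network: 4 inputs -> 3 hidden nodes -> 2 outputs.
# Every input node connects to every hidden node (hence "fully connected").
W1 = rng.normal(size=(4, 3))   # weights: input -> hidden
W2 = rng.normal(size=(3, 2))   # weights: hidden -> output

def forward(x):
    h = np.tanh(x @ W1)        # hidden layer with tanh activation
    return h @ W2              # output layer (no activation)

x = rng.normal(size=(1, 4))    # one example with 4 features
print(forward(x).shape)        # one row of 2 outputs: (1, 2)
```

The deep learning frameworks discussed below go beyond this dense pattern by also supporting layers that are *not* fully connected.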
In more recent years, deep learning frameworks have become very popular, for example Caffe, Torch, CNTK, TensorFlow and MXNet. The added value of these frameworks compared to, for example, SAS is:
- They support more network types than plain vanilla networks. For example, convolutional networks, where not all input nodes are connected to the next layer, and recurrent networks, where loops are present. A nice introduction to these networks can be found here and here.
- They support computations on GPUs, which can speed things up dramatically.
- They are open source and free. No need for long sales and implementation cycles: just download it and use it!
My 50 Shades of Grey experiment
For my experiment I used the text of the erotic romance novel 50 Shades of Grey. A PDF can be found here; I used xpdfbin to extract all the words into a plain text file. I trained a Long Short-Term Memory network (LSTM, a special type of recurrent network) with MXNet. The reason to use MXNet is that it has a nice R interface, so I could just stay in my comfortable RStudio environment.
Moreover, the R example script of MXNet is ready to run; I just changed the input data and used more rounds of training and more hidden layers. The script and the data can be found on GitHub.
The LSTM model is fit at the character level. The complete romance novel contains 817,204 characters, and each distinct character is mapped to a number (91 unique numbers). The first few mappings are shown in the following figure.
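The character-to-number mapping is just a lookup table over the distinct characters in the corpus. A hedged sketch of the idea in Python (the actual preprocessing in the experiment was done by the MXNet R example script; the stand-in string below replaces the full novel text):

```python
# Build a character-level vocabulary for a text corpus.
text = "Anastasia Steele"  # stand-in for the full 817,204-character novel

chars = sorted(set(text))                       # distinct characters
char_to_id = {c: i for i, c in enumerate(chars)}
id_to_char = {i: c for c, i in char_to_id.items()}

encoded = [char_to_id[c] for c in text]         # text as a sequence of numbers
decoded = "".join(id_to_char[i] for i in encoded)
assert decoded == text                          # the mapping is lossless
print(len(chars), "unique characters")
```

On the full novel this vocabulary would contain the 91 unique characters mentioned above.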
Once the model has been trained, it can generate new text, character by character!
arsess whatever yuu’re still expeliar a sally. Reftion while break in a limot.” “Yes, ald what’s at my artmer and brow maned, but I’m so then for a dinches suppretion. If you think vining. “Anastasia, and depregineon posing rave. He’d deharing minuld, him drits. “Miss Steele “Fasting at liptfel, Miss I’ve dacind her leaches reme,” he knimes. “I want to blight on to the wriptions of my great. I find sU she asks the stroke, to read with what’s old both – in our fills into his ear, surge • whirl happy, this is subconisue. Mrs. I can say about the battractive see. I slues is her ever returns. “Anab. It’s too even ullnes. “By heaven. Grey about his voice. “Rest of the meriction.” He scrompts to the possible. I shuke my too sucking four finishessaures. I need to fush quint the only more eat at me. “Oh my. Kate. He’s follower socks? “Lice in Quietly. In so morcieut wait to obsed teach beside my tired steately liked trying that.” Kate for new of its street of confcinged. I haven’t Can regree. “Where.” I fluscs up hwindwer-and I have I’ll staring for conisure, pain!” I know he’s just doesn’t walk to my backeting on Kate has hotelby of confidered Christaal side, supproately. Elliot, but it’s the ESca, that feel posing, it make my just drinking my eyes bigror on my head. S I’ll tratter topality butterch,” I mud a nevignes, bleamn. “It’s not by there soup. He’s washing, and I arms and have. I wave to make my eyes. It’s forgately? Dash I’d desire to come your drink my heathman legt you hay D1 Eyep, Christian Gry, husder with a truite sippking, I coold behind, it didn’t want to mive not to my stop?” “Yes.” “Sire, stcaring it was do and he licks his viice ever.” I murmurs, most stare thut’s the then staraline for neced outsive. She so know what differ at,” he murmurs? “I shake my headanold.” Jeez. “Are you?” Eviulder keep “Oh,_ I frosing gylaced in – angred. I am most drink to start and try aparts through. 
I really thrial you, dly woff you stund, there, I care an right dains to rainer.” He likes his eye finally finally my eyes to over opper heaven, places my trars his Necked her jups. “Do you think your or Christian find at me, is so with that stand at my mouth sait the laxes any litee, this is a memory rude. It flush,” He says usteer?” “Are so that front up. I preparraps. I don’t scomine Kneat for from Christian. “Christian,’! he leads the acnook. I can’t see. I breathing Kate’ve bill more over keen by. He releases?” “I’m kisses take other in to peekies my tipgents my
The generated text does not make any sense, nor will it win any literature prize soon. Keep in mind that the model is based on ‘only’ 817,204 characters (which is considered a small number), and I did not bother to fine-tune the model at all. But it is still funny and remarkable to see that, when generating text character by character, the model can produce a lot of correct English words and even some correct basic grammar patterns!
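For intuition, character-by-character generation works by repeatedly sampling the next character from the model's predicted probability distribution and feeding it back in as history. A toy sketch in Python with a dummy stand-in for the model (the real experiment used the trained MXNet LSTM; the uniform distribution below is only a placeholder):

```python
import random

random.seed(42)

vocab = list("abcdefgh .")     # toy character vocabulary

def predict_next_probs(history):
    # Stand-in for the trained LSTM: returns a uniform distribution.
    # A real model would condition on the history of characters so far.
    return [1.0 / len(vocab)] * len(vocab)

def generate(seed, n_chars):
    out = list(seed)
    for _ in range(n_chars):
        probs = predict_next_probs(out)
        # Sample the next character from the predicted distribution...
        next_char = random.choices(vocab, weights=probs, k=1)[0]
        out.append(next_char)  # ...and feed it back in as new history
    return "".join(out)

print(generate("a", 40))
```

With a uniform stand-in the output is pure noise; it is the trained LSTM's learned distribution that makes the sampled text resemble English words and basic grammar, as in the sample above.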