A robot attempts to be like Dave Chappelle
On November 7, 2020, Dave Chappelle hosted SNL for the second time, in the week following the presidential election. This time around, the winner was not declared until Saturday, the fifth day after counting began. That left Chappelle mere hours to rewrite and patch together a monologue relevant to the show and to the extraordinary times we're living in.
The time crunch made me wonder … what if an algorithm could help? Could we generate an artificial Chappelle?
Would it have the same biting insight?
Would it have the same bluntness and provocative social commentary?
Would it speak to the mood of the nation, and capture the dual feelings of relief and devastation sweeping the country?
No, No, Absolutely Not.
But is it funny? Well, yes … in a weird sort of a way.
(Starter text from the actual monologue)
This morning after the results came in, I got a text from a friend of mine in London and she said the world feels like a safer place now that America has a new president and I said that’s great, but America doesn’t. Do you guys remember what life was like before …
(Robot monologue starts now. Censored, because robots use naughty words, apparently)
that? Feels like, I know that, he was like this [moans].
He made a shotgun … the poor black person screaming at meetings. I got attacked online by me, and then I had to admit, that’s kind of f# hilarious. I got in the circus, the f# is sending and it was a Newport.
Be like “Bro, Mr, about that, I am as strong, so hard, about that.”
Sh# literally, from all those Filipino. Them is buckshot. This is the least threatening mo# I know. What I mean, I am, I know that is not a joke that would pass around. Then he left a bar in Chicago, and and uh weeks all right. Short of rape, in the where I said it, I trust you mo#. The first one was the victim of me, now I might?
Then I was like, I mean I said “No, I was supposed to be a hero”.
Heroes die, friends. Thought about mmm hmm cold, much?
I was in a comedy club in New York, but then they just missed your radio down. Nobody wants to be the Clippers and then he walks right automatic. Chappelle right, you can kiss all this engineering homework goodbye.
Full disclosure — I added punctuation, capitalization, and one word. All the rest is pure robot.
Ok, it’s definitely more Drunk Uncle than vintage Chappelle. My favorite line is:
“I got attacked online by me, and then I had to admit, that’s kind of f# hilarious”
I’ll leave the last word to Dave Chappelle himself (the human one). No matter which political side you’re on, the conclusion of his monologue had wisdom and words of reconciliation for us all.
“And I don’t hate anybody. I just hate that feeling. That’s what I fight through. That’s what I suggest you fight through. You got to find a way to live your life. Got to find a way to forgive each other. Got to find a way to find joy in your existence in spite of that feeling”
The Technical Part
Here are some tips on how to create your own Drunk Uncle Dave.
Start by loading the text you're working with. As an example, my text was approximately 60,000 words long, which is on the smaller side for a dataset.
Preprocess the text by splitting it into words, or tokens. Convert these to lowercase and remove punctuation. Then group the tokens into sequences of the same length.
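A minimal sketch of those cleaning steps, using only the Python standard library (the function and variable names here are my own, not from the original project):

```python
import string

def preprocess(raw_text, seq_length=50):
    # split into tokens on whitespace
    tokens = raw_text.split()
    # strip punctuation and lowercase each token
    table = str.maketrans('', '', string.punctuation)
    tokens = [t.translate(table).lower() for t in tokens]
    # keep only purely alphabetic tokens
    tokens = [t for t in tokens if t.isalpha()]
    # group into overlapping sequences of seq_length + 1 tokens:
    # seq_length input words plus the next word as the target
    sequences = []
    for i in range(seq_length + 1, len(tokens) + 1):
        sequences.append(tokens[i - seq_length - 1:i])
    return sequences
```

Each sequence overlaps the previous one by all but one word, so a 60,000-word text still yields nearly 60,000 training examples.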
We can now encode the sequences as integers and fit the model. In my case, I used an LSTM model. The output layer of the model calculates the probability of what the next word will be. Usually, we'd want our model to achieve high accuracy, but language generation is an interesting case where we don't want maximum accuracy. We want the model to be accurate enough to get the gist of the text, but not so accurate that it parrots the input text back to us.
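As a sketch, the integer-encoding step might look like this, assuming Keras (the toy `sequences` here stands in for the real token sequences from the preprocessing step):

```python
import numpy as np
from tensorflow.keras.preprocessing.text import Tokenizer
from tensorflow.keras.utils import to_categorical

# toy stand-in for the real token sequences
sequences = [['the', 'quick', 'brown', 'fox'],
             ['quick', 'brown', 'fox', 'jumps']]

# map each word to an integer index
tokenizer = Tokenizer()
tokenizer.fit_on_texts(sequences)
encoded = np.array(tokenizer.texts_to_sequences(sequences))

# Keras reserves index 0, so the vocabulary is one larger than word_index
vocab_size = len(tokenizer.word_index) + 1

# inputs are all words but the last; the target is the final word,
# one-hot encoded so the model can predict a probability per word
X, y = encoded[:, :-1], encoded[:, -1]
y = to_categorical(y, num_classes=vocab_size)
seq_length = X.shape[1]
```

`vocab_size` and `seq_length` are exactly the values the model definition below expects.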
Here’s the model I used.
# LSTM model
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Embedding, LSTM, Dense
model = Sequential()
model.add(Embedding(vocab_size, 50, input_length=seq_length))
model.add(LSTM(100, return_sequences=True))
model.add(LSTM(100))
model.add(Dense(100, activation='relu'))
# output layer: one probability per word in the vocabulary
model.add(Dense(vocab_size, activation='softmax'))
# compile model
model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])
# fit model
model.fit(X, y, batch_size=128, epochs=100)
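Once the model is trained, the monologue is produced one word at a time: encode the seed text, predict a distribution over the next word, pick a word, append it, and repeat. A rough sketch of that loop (my own helper, assuming the Keras objects above; sampling from the predicted probabilities rather than always taking the argmax helps keep the output from looping):

```python
import numpy as np
from tensorflow.keras.preprocessing.sequence import pad_sequences

def generate(model, tokenizer, seq_length, seed_text, n_words):
    result = []
    text = seed_text
    for _ in range(n_words):
        # encode the current text and keep only the last seq_length words
        encoded = tokenizer.texts_to_sequences([text])[0]
        encoded = pad_sequences([encoded], maxlen=seq_length, truncating='pre')
        # predicted probability for every word in the vocabulary
        probs = model.predict(encoded, verbose=0)[0]
        # sample an index in proportion to its probability
        idx = np.random.choice(len(probs), p=probs)
        word = tokenizer.index_word.get(idx, '')
        text += ' ' + word
        result.append(word)
    return ' '.join(result)
```

The starter text from the actual monologue is the `seed_text` here; everything the function returns is the robot's own.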
This project was highly influenced by a great tutorial from Jason Brownlee's Machine Learning Mastery.
Next steps: I plan to use a pre-trained model or transformer in an attempt to make this text less drunk and more Dave.