Companion to “Are Emojis Predictable?”

Welcome to the companion to

Are Emojis Predictable?

by Francesco Barbieri, Miguel Ballesteros, and Horacio Saggion.

This is where I’ve attempted to provide some semblance of an explanation for the methods of the study. Look, I tried my best with this, so don’t judge. I ordered it by the difficulty I had with each term instead of alphabetically. References at the end for thirsty bishes who just can’t get enough.

Difficulty · NLP Model or Term
😀 Sentiment Analysis

A way of determining and categorizing opinions and attitudes in a text using computational methods. Also opinion mining.
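To make that concrete, here’s a toy lexicon-based scorer (the word lists are made up for illustration; real systems use much bigger lexicons or trained models):

```python
# Toy lexicon-based sentiment analysis: count positive vs. negative words.
POSITIVE = {"love", "great", "happy", "awesome"}
NEGATIVE = {"hate", "awful", "sad", "terrible"}

def sentiment(text):
    words = text.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

print(sentiment("I love this awesome movie"))  # positive
```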

☺️ Neural Network

A computational model loosely inspired by how the human brain works: layers of simple connected units (“neurons”) that learn by adjusting the weights of their connections from examples.
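If you squint, a single “neuron” is just a weighted sum plus a squashing function. A minimal sketch (the weights here are arbitrary numbers I picked, not learned ones):

```python
import math

# One artificial neuron: weighted sum of inputs plus a bias,
# squashed into (0, 1) by a sigmoid activation.
def neuron(inputs, weights, bias):
    total = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1 / (1 + math.exp(-total))

print(neuron([1.0, 0.5], [0.4, -0.2], 0.1))
```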

🙂 Recurrent Neural Network

A type of neural network that can be trained by algorithms and that keeps an internal state as it reads a sequence, letting it make context-based predictions. Also RNN.

🙂 Bag of Words

A text representation that basically counts up the number of instances of words in a text. It’s good at classifying texts by word frequencies, but because it splits words on the white space surrounding them and disregards grammar and word order, phrases lose their meaning. Also BoW.
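In code, a bag of words really is just a counter. This sketch splits on white space, which is exactly the limitation described above:

```python
from collections import Counter

def bag_of_words(text):
    # Split on whitespace and count occurrences;
    # grammar and word order are thrown away.
    return Counter(text.lower().split())

counts = bag_of_words("the cat sat on the mat")
print(counts["the"])  # 2
print(counts["cat"])  # 1
```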

😐 Skip Gram

A neural network model that does the opposite of BoW. Instead of looking at the whole context at once, the skip gram considers word pairs separately: it tries to predict the context from a word, weighing closer words more heavily than distant ones, so the order of words actually matters. It’s one of the architectures behind Word2Vec.
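Here’s the pair-generation half of skip gram as a sketch — real Word2Vec then trains a network on these (word, context) pairs, which this toy skips entirely:

```python
def skipgram_pairs(words, window=2):
    # For each target word, pair it with every neighbor
    # up to `window` positions away on either side.
    pairs = []
    for i, target in enumerate(words):
        for j in range(max(0, i - window), min(len(words), i + window + 1)):
            if j != i:
                pairs.append((target, words[j]))
    return pairs

print(skipgram_pairs(["emojis", "are", "predictable"], window=1))
# [('emojis', 'are'), ('are', 'emojis'), ('are', 'predictable'), ('predictable', 'are')]
```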

😐 Long Short-term Memory Network

A recurrent neural network that can learn the order of items in long sequences and so can predict them. Also LSTM.
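A single LSTM step boils down to a few “gates” deciding what to remember and what to forget. A drastically simplified scalar sketch (one shared toy weight standing in for the separate learned weight matrices a real LSTM has):

```python
import math

def sigmoid(x):
    return 1 / (1 + math.exp(-x))

# One LSTM step with scalar state and made-up toy weights.
def lstm_step(x, h_prev, c_prev, w=0.5, u=0.5, b=0.0):
    f = sigmoid(w * x + u * h_prev + b)          # forget gate
    i = sigmoid(w * x + u * h_prev + b)          # input gate
    o = sigmoid(w * x + u * h_prev + b)          # output gate
    c_tilde = math.tanh(w * x + u * h_prev + b)  # candidate memory
    c = f * c_prev + i * c_tilde                 # updated cell state
    h = o * math.tanh(c)                         # new hidden state
    return h, c

h, c = 0.0, 0.0
for x in [1.0, -1.0, 0.5]:  # feed a short "sequence"
    h, c = lstm_step(x, h, c)
print(h, c)
```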

😑 Bidirectional Long Short-term Memory Network

The same as above, but it’s basically time travel: half the neurons read the sequence forwards and half read it backwards, so every prediction gets to peek at both past and future context. Also BLSTM.
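The time-travel trick in miniature — at each position, combine a left-to-right summary with a right-to-left one, so every position sees both directions (strings standing in for hidden states here):

```python
def left_context(seq):
    # Accumulate everything read so far, left to right.
    out, acc = [], ""
    for tok in seq:
        acc += tok
        out.append(acc)
    return out

def bidirectional_context(seq):
    left = left_context(seq)
    right = left_context(seq[::-1])[::-1]  # same pass, reading backwards
    return list(zip(left, right))

print(bidirectional_context(["a", "b", "c"]))
# [('a', 'cba'), ('ab', 'cb'), ('abc', 'c')]
```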

😓 Char-BLSTM

A character-based approach that learns representations for words that look similar, so it can handle variants of the same word (typos, repeated letters, and so on). More accurate than the word-based variety.

😖 Word-BLSTM

The word-based counterpart of the above: a BLSTM that reads word-level representations (word vectors) instead of characters. Probably. I’m mostly sure.

🤮 Word Vector

Ya, this one is umm… well, you see, it’s a list of numbers representing a word, so it has magnitude and direction, and words with similar meanings end up with similar vectors. And like, you have to pre-train it on a huge pile of text. So… “Fuel your lifestyle with .”
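For what it’s worth, a word vector really is just a list of numbers, and “similar” is measured by the angle between two of them. Toy 3-dimensional vectors with made-up values (real pretrained ones have hundreds of dimensions):

```python
import math

# Made-up word vectors for illustration only.
vectors = {
    "coffee": [0.9, 0.1, 0.3],
    "tea":    [0.8, 0.2, 0.3],
    "rocket": [0.1, 0.9, 0.7],
}

def norm(v):
    return math.sqrt(sum(x * x for x in v))

def cosine(a, b):
    # Cosine similarity: 1 means same direction, 0 means unrelated.
    return sum(x * y for x, y in zip(a, b)) / (norm(a) * norm(b))

print(cosine(vectors["coffee"], vectors["tea"]))     # close to 1: similar words
print(cosine(vectors["coffee"], vectors["rocket"]))  # smaller: unrelated words
```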

Congratulations if you’ve made it this far! You probably already know more than me. Scream it out. I know I did 🙂




References

“Bag of Words (BoW) – Natural Language Processing,”

Britz, Denny. “Recurrent Neural Networks Tutorial, Part 1 – Introduction to RNNs.” WildML, 8 July 2016,

Brownlee, Jason. “A Gentle Introduction to Long Short-Term Memory Networks by the Experts.” Machine Learning Mastery, 19 July 2017,

Brownlee, Jason. “A Gentle Introduction to the Bag-of-Words Model.” Machine Learning Mastery, 21 Nov. 2017,

Chablani, Manish. “Word2Vec (Skip-Gram Model): PART 1 – Intuition.” Towards Data Science, 14 June 2017,

Verwimp, et al. “Character-Word LSTM Language Models.” arXiv, Cornell University Library, 10 Apr. 2017,

Olah, Christopher. “Understanding LSTM Networks.” Colah’s Blog,

Nielsen, Michael. “Neural Networks and Deep Learning.” Determination Press,

“Sentiment Analysis: Concept, Analysis and Applications.” Towards Data Science, 7 Jan. 2018,

gk_. “Text Classification Using Neural Networks.” Machine Learnings, 26 Jan. 2017,

Thireou, T., and M. Reczko. “Bidirectional Long Short-Term Memory Networks for Predicting the Subcellular Localization of Eukaryotic Proteins.” IEEE/ACM Transactions on Computational Biology and Bioinformatics, vol. 4, no. 3, 2007, pp. 441–446., doi:10.1109/tcbb.2007.1015.

“Vector Representations of Words.” TensorFlow,

“Word2Vec Tutorial – The Skip-Gram Model.” Chris McCormick,
