Despite the computer’s historical success as a communication tool, machines have yet to master even the most basic forms of nonverbal communication that humans use daily. Gender, ethnicity, age, and emotional state are often perceived immediately by most humans engaged in conversation; however, training a classifier to perform this kind of behavioral observation is a rather difficult task. In this exploratory review, we apply object recognition and deep learning techniques in a convolutional neural network, ultimately training a model to distinguish the universal human emotions in the FER2013 facial expression dataset (Kaggle, 2013).