Face perception is a ubiquitous perceptual task that most people easily perform many times a day, beginning in early childhood. The process of extracting meaningful information from a face for tasks such as face identification, gender discrimination, or emotion discrimination involves making eye movements to different parts of the face. It is known that most of the information for such tasks can be extracted after just a single initial eye movement. However, the efficiency with which that information can be used may be modulated by the internal representations of faces stored in our brains for specific tasks. This dissertation explores several aspects of the interaction between the initial eye movement to a face and internal face representations. One aspect is the evaluation of configural processing of faces in the context of a foveated visual system. Another is the effect of the natural statistics of facial expressions on the availability of task-relevant information that can be extracted with an initial eye movement to a face. A third is how individual differences in the initial eye movement may shape the development of internal, fixation-specific face representations. All of these aspects are investigated using a combination of human psychophysics studies and computational modeling of face perception and eye movements. The data obtained support a view of the initial eye movement to a face as a highly practiced and consistent behavior that depends on the statistics of the faces humans are exposed to during different face discrimination tasks. In turn, this behavior may also shape the internal representations of faces in our brains.