Detecting social information in a dense database of infants' natural visual experience
Abstract
The faces and hands of caregivers and other social partners offer a rich source of social and causal information that may be critical for infants' cognitive and linguistic development. Previous work using manual annotation strategies and cross-sectional data has found systematic changes in the proportion of faces and hands in the egocentric perspective of young infants. Here, we examine the prevalence of faces and hands in a longitudinal collection of nearly 1700 headcam videos collected from three children across a span of 6 to 32 months of age: the SAYCam dataset (Sullivan, Mei, Perfors, Wojcik, & Frank, under review). To analyze these naturalistic infant egocentric videos, we first validated the use of a modern convolutional neural network for pose detection (OpenPose) for the detection of faces and hands. We then applied this model to the entire dataset, and found a higher proportion of hands in view than previously reported and a moderate decrease in the proportion of faces in children's view across age. In addition, we found variability in the proportion of faces/hands viewed by different children in different locations (e.g., living room vs. kitchen), suggesting that individual activity contexts may shape the social information that infants experience.
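To illustrate the kind of analysis described above, the sketch below shows one way to estimate the proportion of frames containing a detected face or hand from OpenPose's per-frame JSON output (produced when OpenPose is run with --face --hand --write_json). This is not the authors' pipeline; the output directory path and the 0.5 keypoint-confidence threshold are illustrative assumptions.

```python
# Minimal sketch (assumptions: OpenPose per-frame JSON output with
# face_keypoints_2d / hand_left_keypoints_2d / hand_right_keypoints_2d fields,
# a hypothetical output directory, and an assumed 0.5 confidence cutoff).
import json
from pathlib import Path

CONF_THRESHOLD = 0.5  # assumed cutoff on keypoint confidence


def has_detection(keypoints, threshold=CONF_THRESHOLD):
    """True if any keypoint in a flat [x, y, conf, ...] list exceeds the threshold."""
    return any(c >= threshold for c in keypoints[2::3])


def frame_detections(frame_json, threshold=CONF_THRESHOLD):
    """Return (face_present, hand_present) for one OpenPose frame JSON."""
    face = hand = False
    for person in frame_json.get("people", []):
        face = face or has_detection(person.get("face_keypoints_2d", []), threshold)
        hand = (hand
                or has_detection(person.get("hand_left_keypoints_2d", []), threshold)
                or has_detection(person.get("hand_right_keypoints_2d", []), threshold))
    return face, hand


def proportions(json_dir):
    """Proportion of frames with at least one face / at least one hand detected."""
    faces = hands = total = 0
    for path in sorted(Path(json_dir).glob("*_keypoints.json")):
        face, hand = frame_detections(json.loads(path.read_text()))
        total += 1
        faces += face
        hands += hand
    return (faces / total, hands / total) if total else (0.0, 0.0)


if __name__ == "__main__":
    # "openpose_output/video_001" is a hypothetical path, not from the paper.
    face_prop, hand_prop = proportions("openpose_output/video_001")
    print(f"faces in view: {face_prop:.2%}, hands in view: {hand_prop:.2%}")
```

Per-video proportions computed this way could then be aggregated by child age or by recording location to examine the developmental and contextual differences the abstract describes.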