Generating Functions in Neural Learning of Sequential Structures

Abstract

A cornerstone of human statistical learning is the ability to extract abstract regularities from sequential events. Here we present a unique method to derive the generating functions for the waiting time of sequential patterns, and then compare these functions with the neural mechanisms for learning sequential structures. We show that the way the neocortex integrates information over time bears a striking resemblance to the way these normative functions operate: both organize combinatorial objects into meaningful groups and then compress the representations by discarding irrelevant information. As a result, discrete-time signals are converted into frequency signals, and similarity-based structures are converted into abstract relational structures. Our analyses not only reveal surprisingly rich statistical structure embedded in seemingly random sequences, but also offer an explanation for how higher-order cognitive biases may have emerged as a consequence of temporal integration.
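
The abstract refers to generating functions for the waiting time of sequential patterns. As a rough illustration only (not the paper's derivation), the sketch below computes the waiting-time distribution P(T = n) for a pattern in an i.i.d. symbol stream, i.e. the coefficients of the probability generating function G(s) = Σ_n P(T = n) s^n, by treating the set of matched pattern prefixes as an absorbing Markov chain. The prefix-automaton construction, function names, and parameters are assumptions introduced for this example; it reproduces the classical means E[T] = 6 for HH and E[T] = 4 for HT under fair coin flips.

```python
# Illustrative sketch (not the paper's method): waiting-time distribution of a
# pattern in an i.i.d. symbol stream via an absorbing Markov chain whose states
# are the lengths of the pattern prefix matched so far (KMP-style automaton).
import numpy as np

def prefix_automaton(pattern, alphabet):
    """Transition table: state = length of the longest matched prefix."""
    m = len(pattern)
    delta = {}
    for state in range(m):            # state m (full match) is absorbing
        for a in alphabet:
            # longest suffix of (matched prefix + a) that is a prefix of pattern
            s = pattern[:state] + a
            k = min(len(s), m)
            while k > 0 and s[-k:] != pattern[:k]:
                k -= 1
            delta[(state, a)] = k
    return delta

def waiting_time(pattern, probs, n_max=200):
    """Return P(T = n) for n = 0..n_max and the exact mean waiting time."""
    alphabet = list(probs)
    m = len(pattern)
    delta = prefix_automaton(pattern, alphabet)
    # Q: transitions among transient states 0..m-1; r: absorption probabilities
    Q = np.zeros((m, m))
    r = np.zeros(m)
    for state in range(m):
        for a in alphabet:
            nxt = delta[(state, a)]
            if nxt == m:
                r[state] += probs[a]
            else:
                Q[state, nxt] += probs[a]
    # P(T = n) = e0 Q^(n-1) r, starting from the empty prefix (state 0)
    dist = np.zeros(n_max + 1)
    v = np.zeros(m)
    v[0] = 1.0
    for n in range(1, n_max + 1):
        dist[n] = v @ r
        v = v @ Q
    # mean waiting time from the fundamental matrix: (I - Q)^(-1) 1
    mean = np.linalg.solve(np.eye(m) - Q, np.ones(m))[0]
    return dist, mean

if __name__ == "__main__":
    fair = {"H": 0.5, "T": 0.5}
    for pat in ("HH", "HT"):
        dist, mean = waiting_time(pat, fair)
        # G(1) = sum of P(T = n) should be close to 1 after truncation
        print(pat, "E[T] =", mean, " sum P(T=n) ~", dist.sum())
```

Evaluating the truncated sum Σ_n P(T = n) s^n at a chosen s gives a numerical approximation to the generating function itself; the exact closed forms discussed in the paper would instead follow from the structure of the pattern's self-overlaps.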
