Compression: A Lossless Mechanism for Learning Complex Structured Relational Representations
Abstract
People learn by both decomposing and combining concepts; most accounts of combination are either compositional or conjunctive. We augment the DORA model of representation learning to build new predicate representations by combining (or compressing) existing predicate representations (e.g., building a predicate a_b by combining predicates a and b). The resulting model learns structured relational representations from experience and then combines these relational concepts to form more complex, compressed concepts. We show that the resulting model provides an account of a category learning experiment in which categories are defined as novel combinations of relational concepts.
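To make the compression idea concrete, the sketch below treats predicates as distributed feature vectors and forms a combined predicate a_b by blending the vectors for a and b. This is a minimal, hypothetical illustration of the general idea only; the vector encoding, the blending rule, and all names are assumptions for illustration, not the DORA implementation described in the paper.

```python
# Toy sketch of predicate "compression": combine two predicate feature
# vectors into a new, single predicate representing their conjunction.
# (Assumed encoding and combination rule; not the paper's actual model.)
import numpy as np

def compress(pred_a: np.ndarray, pred_b: np.ndarray) -> np.ndarray:
    """Return a compressed predicate blending the features of both inputs."""
    return (pred_a + pred_b) / 2.0

# Hypothetical distributed feature vectors for predicates a and b.
a = np.array([1.0, 0.0, 0.5, 0.0])
b = np.array([0.0, 1.0, 0.5, 1.0])

a_b = compress(a, b)  # new compressed predicate a_b
print(a_b)            # [0.5 0.5 0.5 0.5]
```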