Using a Sequential SOM to Parse Long-term Dependencies
Abstract
Simple Recurrent Networks (SRNs) have been widely used in natural language processing tasks, but their ability to handle long-term dependencies between sentence constituents is limited. NARX networks have recently been shown to outperform SRNs by preserving past information in explicit delays of the network's prior output; however, it is unclear how the number of delays should be chosen. In this study, on a shift-reduce parsing task, we demonstrate that comparable performance can be achieved more elegantly by using a SARDNET self-organizing map. The resulting architecture can represent arbitrarily long sequences and is cognitively more plausible.
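As a rough illustration of the idea, the sketch below shows how a SARDNET-style map (Sequential Activation Retention and Decay Network) can encode a sequence of arbitrary length as a single activation pattern: each input activates its closest still-unused map unit at full strength while earlier activations decay. This is a minimal sketch under stated assumptions, not the paper's implementation; the decay rate, map size, function name sardnet_encode, and the use of pre-trained map weights are all illustrative choices.

import numpy as np

def sardnet_encode(sequence, weights, decay=0.9):
    # Encode a sequence of input vectors as one activation pattern over a map.
    # sequence : list of input vectors (one per word/symbol)
    # weights  : (n_units, input_dim) map weight vectors, assumed already trained
    # decay    : multiplicative decay applied to earlier activations (assumed value)
    n_units = weights.shape[0]
    activation = np.zeros(n_units)
    used = set()

    for x in sequence:
        # Distance from the current input to every map unit's weight vector.
        dists = np.linalg.norm(weights - x, axis=1)
        # Exclude units already assigned to earlier inputs so that repeated
        # symbols remain distinguishable in the resulting pattern.
        dists[list(used)] = np.inf
        winner = int(np.argmin(dists))

        # Decay all earlier activations, then activate the winner fully.
        activation *= decay
        activation[winner] = 1.0
        used.add(winner)

    return activation

# Example: encode a toy five-word sequence on a 64-unit map with
# 10-dimensional (hypothetical) word vectors.
rng = np.random.default_rng(0)
weights = rng.random((64, 10))          # stands in for a trained map
sentence = [rng.random(10) for _ in range(5)]
pattern = sardnet_encode(sentence, weights)
print(pattern.nonzero()[0])             # the five winning units, one per word

In an architecture like the one the abstract describes, such a pattern could serve as the sequence memory presented to the parser network alongside the current input, in place of a fixed bank of NARX output delays.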