Neural discontinuous constituency parsing
One of the most pressing issues in discontinuous constituency transition-based parsing is that the information relevant to a parsing decision may be located anywhere in the stack or the buffer. In this paper, we propose a solution to this problem: we replace the structured perceptron model with a recursive neural model that computes a global representation of the configuration, thereby allowing even the most remote parts of the configuration to influence parsing decisions. We also provide a detailed analysis of how this representation should be built out of sub-representations of its core elements (words, trees, and the stack). Additionally, we investigate how different types of swap oracles influence the results. Our model is the first neural discontinuous constituency parser; it outperforms all previously published models on three of four datasets and places second on the fourth by a small margin.
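To make the core idea concrete, the sketch below shows a toy transition-based configuration (stack and buffer) with a shift and a swap transition, where the representation used for parsing decisions is computed over the *entire* configuration rather than a fixed window. This is only an illustration of the concept: the vector dimensionality, the hash-style word embeddings, and the elementwise-mean composition are all stand-in assumptions, not the recursive neural architecture of the paper.

```python
# Toy sketch: a transition-based parser configuration whose decision
# representation covers the whole stack and buffer. The mean-based
# composition below is a placeholder for the paper's recursive neural
# composition (assumption for illustration only).
from dataclasses import dataclass, field
from typing import List, Union

DIM = 4  # toy embedding dimensionality (assumption)


@dataclass
class Tree:
    label: str
    children: List["Node"]


Node = Union[str, "Tree"]


def word_vec(word: str) -> List[float]:
    # Deterministic toy "embedding" from character codes (not learned).
    h = sum(ord(c) for c in word)
    return [((h >> i) & 0xF) / 15.0 for i in range(DIM)]


def compose(vecs: List[List[float]]) -> List[float]:
    # Stand-in for recursive neural composition: elementwise mean.
    if not vecs:
        return [0.0] * DIM
    return [sum(v[i] for v in vecs) / len(vecs) for i in range(DIM)]


def node_vec(node: Node) -> List[float]:
    # Recursively compose a constituent's representation from its children.
    if isinstance(node, str):
        return word_vec(node)
    return compose([node_vec(c) for c in node.children])


@dataclass
class Configuration:
    stack: List[Node] = field(default_factory=list)
    buffer: List[str] = field(default_factory=list)

    def global_repr(self) -> List[float]:
        # Every stack element and buffer word contributes, so even the
        # most remote parts of the configuration can influence decisions.
        parts = [node_vec(n) for n in self.stack]
        parts += [word_vec(w) for w in self.buffer]
        return compose(parts)

    def shift(self) -> None:
        # Move the front buffer word onto the stack.
        self.stack.append(self.buffer.pop(0))

    def swap(self) -> None:
        # Move the second stack item back to the buffer front -- the
        # transition whose ordering the swap oracles control.
        self.buffer.insert(0, self.stack.pop(-2))
```

For example, after two shifts and one swap on the buffer `["a", "b", "c"]`, the stack holds `["b"]` and the buffer `["a", "c"]`, and `global_repr()` still summarizes all three words.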
Additional information
http://aclweb.org/anthology/D17-1174