Learning recursion: Multiple nested and crossed dependencies
Language acquisition, in both natural and artificial language learning settings, crucially depends on extracting information from sequential input. A shared sequence learning mechanism is therefore assumed to underlie both natural and artificial language learning, and a growing body of empirical evidence is consistent with this hypothesis. Artificial language learning experiments may thus offer further insight into this shared mechanism. In this paper, we review empirical evidence from artificial language learning and computational modelling studies, as well as natural language data, and suggest that two key factors help determine processing complexity in sequence learning, and thus in natural language processing. We propose that the specific ordering of non-adjacent dependencies (i.e., nested or crossed) and the number of non-adjacent dependencies to be resolved simultaneously (i.e., two or three) are important factors for delineating the boundaries of human sequence learning, and hence of natural language processing. The implications for theories of linguistic competence are discussed.
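As an illustrative sketch (not part of the paper itself), the contrast between nested and crossed orderings of non-adjacent dependencies can be made concrete: given dependent pairs A1–B1, A2–B2, A3–B3, a nested (centre-embedded) sequence orders them A1 A2 A3 B3 B2 B1, whereas a crossed sequence orders them A1 A2 A3 B1 B2 B3. The short Python sketch below generates both orderings; the function names and the pair labels are assumptions chosen purely for illustration.

    # Illustrative sketch only: generate nested vs. crossed orderings of
    # non-adjacent dependencies (labels and function names are assumptions).

    def nested(pairs):
        """Centre-embedded ordering: A1 A2 ... An Bn ... B2 B1."""
        a_items = [a for a, _ in pairs]
        b_items = [b for _, b in reversed(pairs)]
        return a_items + b_items

    def crossed(pairs):
        """Crossed ordering: A1 A2 ... An B1 B2 ... Bn."""
        a_items = [a for a, _ in pairs]
        b_items = [b for _, b in pairs]
        return a_items + b_items

    if __name__ == "__main__":
        # Three dependent pairs, e.g. nouns and their matching verbs.
        pairs = [("A1", "B1"), ("A2", "B2"), ("A3", "B3")]
        print("nested: ", " ".join(nested(pairs)))   # A1 A2 A3 B3 B2 B1
        print("crossed:", " ".join(crossed(pairs)))  # A1 A2 A3 B1 B2 B3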