Tracy K. Teal
Artificial Life (2000) 6 (2): 129–143.
Published: 01 April 2000
Abstract
For many adaptive complex systems, information about the environment is not simply recorded in a look-up table, but is instead encoded in a theory, schema, or model, which compresses information. The grammar of a language can be viewed as such a schema or theory. In a prior study [Teal et al., 1999] we proposed several conjectures about the learning and evolution of language that should follow from these observations: (C1) compression aids in generalization; (C2) compression occurs more easily in a “smooth”, as opposed to a “rugged”, problem space; and (C3) constraints from compression make it likely that natural languages evolve towards smooth string spaces. This previous work found general, if not complete, support for these three conjectures. Here we build on that study to clarify the relationship between Minimum Description Length (MDL) and error in our model and to examine the evolution of certain languages in more detail. Our results suggest a fourth conjecture: that, all else being equal, (C4) more complex languages change more rapidly during evolution.
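The abstract invokes Minimum Description Length (MDL) as the quantity relating compression to error. As a minimal sketch of the general two-part MDL idea only (not the paper's actual model or encoding), the Python fragment below scores a toy grammar by adding the bits needed to describe the grammar to the bits needed to encode the observed strings under it; the function names, the byte-per-symbol grammar code, and the toy probabilities are all assumptions made for this illustration.

```python
import math

def grammar_cost(rules):
    """Bits to describe the grammar itself: a crude length-based code
    (8 bits per symbol in each rule). This encoding is an assumption."""
    return sum(8 * (len(lhs) + len(rhs)) for lhs, rhs in rules)

def data_cost(strings, string_probs):
    """Bits to encode the data given the grammar: -log2 of each string's
    probability as assigned by the grammar (assumed precomputed here)."""
    return sum(-math.log2(string_probs[s]) for s in strings)

def mdl_score(rules, strings, string_probs):
    """Two-part description length: grammar bits + data-given-grammar bits.
    Smaller totals correspond to greater compression."""
    return grammar_cost(rules) + data_cost(strings, string_probs)

# Toy usage: a grammar that assigns uniform probability to four strings.
rules = [("S", "ab"), ("S", "ba"), ("S", "aa"), ("S", "bb")]
probs = {"ab": 0.25, "ba": 0.25, "aa": 0.25, "bb": 0.25}
print(mdl_score(rules, ["ab", "ba", "ab"], probs))  # 96 grammar bits + 3 * 2 data bits
```

Under this reading, a grammar that generalizes well keeps the total small: a richer grammar lowers the data term but raises the grammar term, which is the trade-off the compression conjectures above turn on.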