On the von Neumann entropy of language networks: Applications to cross-linguistic comparisons

Javier Vera, Diego Fuentealba, Mario Lopez, Hector Ponce, Roberto Zariquiey

Scientific output: Contribution to a journal › Article › peer review

3 Citations (Scopus)

Abstract

Words are not isolated entities within a language. In this paper, we measure the number of choices transmitted in natural language by means of the von Neumann entropy of language networks. This quantity, originally introduced in quantum information theory, provides a detailed characterization of network complexity. The simulations are based on a large parallel corpus of 362 languages across 55 linguistic families (focusing on a sub-sample of 85 languages from the Americas). From this corpus, we constructed language networks as a simple way to describe word-connectivity patterns for each language. We studied several aspects of the von Neumann entropy of these networks. First, we found large groups of languages with low average degree and high von Neumann entropy. The results also suggested that high von Neumann entropy is associated with word entropy (a proxy for morphological complexity) and is inversely related to degree regularity. This means that there are pressures at play that balance word morphological complexity against the patterns of connections between words. We also suggested a strong influence of function words on languages with low von Neumann entropy. Our approach is thus a simple network-based contribution to cross-linguistic comparison from textual data.
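The article itself does not include code, but the central quantity is straightforward to sketch. The following is a minimal illustration, not the authors' pipeline: it assumes the common definition of the von Neumann entropy of a graph, S = -Tr(rho log rho) with rho = L / Tr(L) for the combinatorial Laplacian L, and a hypothetical co-occurrence construction that links consecutive words; the names von_neumann_entropy and word_network are illustrative.

    import numpy as np
    import networkx as nx

    def von_neumann_entropy(G):
        # Rescale the combinatorial Laplacian into a density-matrix analogue:
        # rho = L / Tr(L), so Tr(rho) = 1 and rho is symmetric positive semidefinite.
        L = nx.laplacian_matrix(G).toarray().astype(float)
        rho = L / np.trace(L)
        eigs = np.linalg.eigvalsh(rho)
        eigs = eigs[eigs > 1e-12]                      # convention: 0 * log 0 = 0
        return float(-np.sum(eigs * np.log2(eigs)))    # entropy in bits

    def word_network(text):
        # Hypothetical construction: undirected edges between consecutive words.
        words = text.lower().split()
        G = nx.Graph()
        G.add_edges_from(zip(words, words[1:]))
        return G

    G = word_network("words are not isolated entities within a language")
    print(round(von_neumann_entropy(G), 3))

How the word network is built (windowed co-occurrence, dependency links, etc.) changes the Laplacian spectrum and hence the entropy, so the consecutive-word rule above is only one simple choice among several.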

Original language: English
Article number: 68003
Journal: EPL
Volume: 136
Issue: 6
DOI
Status: Published - Dec. 2021
