Natural Information Processing
| Authors | |
|---|---|
| Publication date | 2024 |
| Host editors | |
| Book title | Foundational Papers in Complexity Science, Volume 4 |
| Book subtitle | 1989-2000 |
| ISBN | |
| Chapter | 78 |
| Pages (from-to) | 2447-2532 |
| Publisher | Santa Fe, NM: SFI Press |
| Organisations | |
| Abstract | Measuring the emergent complexity of a complex system has itself become a complex process, one that is still ongoing. Over the past few decades, an ever-expanding community of researchers from various disciplines has devised a wide variety of metrics, starting from different viewpoints and answering different questions that can often be related to one another. A root cause of this expansion is the difficulty of pinning down the exact problem. Or, as Seth Lloyd (2001) aptly put it: “A historical analog to the problem of measuring complexity is the problem of describing electromagnetism before Maxwell’s equations.” Initially, many researchers were in pursuit of *the* complexity measure: one formula or algorithm that quantifies the amount of complexity in any given program or pattern. The sheer variety of measures that resulted has shifted the focus to looking for *a* complexity measure: a choice that depends on the context, the research question, and the assumptions one is willing to make. In this light, James Crutchfield’s 1994 paper can be seen as a novel approach to the statistical description of complexity measures, but with a key distinction that has crucial consequences. |
| Document type | Chapter |
| Note | With reference to, and including the text of: J. P. Crutchfield, “The Calculi of Emergence: Computation, Dynamics, and Induction,” Physica D 75: 1–3, 11–54 (1994). |
| Language | English |
| Published at | https://doi.org/10.37911/9781947864559.78 |
| Downloads | 78_Crutchfield_1994_Sloot_DRAFT_112524 (Final published version) |
| Permalink to this page | |
