Simplicity and complexity. Complex adaptive systems

Murray Gell-Mann
Scientist



What people usually mean by complexity, in ordinary conversation or in most scientific discourse, is what I call effective complexity, which in simple lay terms would be the length of a very compact description of the regularities of the entity under consideration; not the length of a description of the whole entity, but of its regularities. That's effective complexity. You can then make it more technical by introducing the concept of algorithmic information content: the length of the shortest program that would cause a standard universal computer to print out a bit-coded description of the entity and then halt and stop computing. That's what you usually mean when you say that the plot of a novel is complex, or a culture is complex, or a big conglomerate business firm is complex: you mean it would take a long time to describe all the regularities. So that's certainly one kind of complexity.

Logical depth, as defined by Charlie Bennett, is another very important one, and that's, for a short program, how long the calculation would have to run before you printed out a description of the entity. And sometimes you can't tell the difference; it's hard to know which you're dealing with. You see something that's apparently complicated, and you don't know whether it's something that's really very simple but with a huge amount of computation to go from the simple program to the description, or whether it actually is effectively complex and has many laws. If you're given the energy levels of nuclei, you might think that the rules for them were very, very complicated, lots and lots and lots of rules needed in order to explain the energy levels of nuclei, but we believe that quantum chromodynamics and quantum electrodynamics, both very simple theories, if combined together would predict all the energy levels of nuclei; but the computation is so long that it hasn't actually been done yet.
But we hope that when it is done it will confirm our conviction that QED and QCD, simple theories, underlie the nuclear energy levels. But without knowing about QED and QCD, one could easily imagine that the energy levels were effectively complex. So there's always this trade-off possible.

Likewise there's a possible trade-off between randomness and simplicity. Suppose you're given a long, long, long number: it looks random, and you would at first assign it no regularity; you'd say it was a random string, or maybe you would guess that it was a random string, and therefore one with very little effective complexity. But then you might learn that it was actually a very simple string, also with very little effective complexity but at the opposite end of the algorithmic information content scale. For instance, suppose you were given the second million bits of pi. You might at first think that was a random sequence, but then after a while you might realize, especially if somebody hinted to you that that's what it might be, that it was the second million bits of pi, in which case of course it's very simple, because you can describe it very quickly. So there are all these trade-offs based on knowledge.
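Gell-Mann's pi example can be made concrete with a small sketch (an editorial illustration, not part of the interview): a short program generates digits of pi, yet a general-purpose compressor, standing in as a crude proxy for a description-length estimate, cannot tell those digits apart from genuinely random ones. The algorithmic information content is tiny, but that simplicity is invisible without the hint that a short generating program exists.

```python
import random
import zlib

def pi_digits(n):
    """First n decimal digits of pi via Machin's formula,
    pi = 16*arctan(1/5) - 4*arctan(1/239), in fixed-point integers."""
    one = 10 ** (n + 10)  # 10 guard digits absorb truncation error

    def arctan_inv(x):
        # arctan(1/x) * one, from the alternating series 1/x - 1/(3x^3) + ...
        power = one // x
        total = power
        x2 = x * x
        divisor = 3
        sign = -1
        while power:
            power //= x2
            total += sign * (power // divisor)
            sign = -sign
            divisor += 2
        return total

    pi = 4 * (4 * arctan_inv(5) - arctan_inv(239))
    return str(pi)[:n]

digits = pi_digits(2000)  # output of a tiny program, so low AIC
rand = "".join(random.choice("0123456789") for _ in range(2000))

# The compressor finds no regularity in either string: both compress to
# roughly the same size, near the entropy of random decimal digits.
print(len(zlib.compress(digits.encode())),
      len(zlib.compress(rand.encode())))
```

The two compressed sizes come out nearly equal, which is the point: without outside knowledge, the digits of pi and a random string are indistinguishable, even though one of them has a very short description.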

New York-born physicist Murray Gell-Mann (1929-2019) was known for his creation of the eightfold way, an ordering system for subatomic particles comparable to the periodic table. His discovery of the omega-minus particle filled a gap in the system, brought the theory wide acceptance, and led to Gell-Mann winning the Nobel Prize in Physics in 1969.

**Title:** Simplicity and complexity. Complex adaptive systems

**Listeners:**
Geoffrey West

Geoffrey West is a Staff Member, Fellow, and Program Manager for High Energy Physics at Los Alamos National Laboratory. He is also a member of The Santa Fe Institute. He is a native of England and was educated at Cambridge University (B.A. 1961). He received his Ph.D. from Stanford University in 1966 followed by post-doctoral appointments at Cornell and Harvard Universities. He returned to Stanford as a faculty member in 1970. He left to build and lead the Theoretical High Energy Physics Group at Los Alamos. He has numerous scientific publications including the editing of three books. His primary interest has been in fundamental questions in Physics, especially those concerning the elementary particles and their interactions. His long-term fascination in general scaling phenomena grew out of his work on scaling in quantum chromodynamics and the unification of all forces of nature. In 1996 this evolved into the highly productive collaboration with James Brown and Brian Enquist on the origin of allometric scaling laws in biology and the development of realistic quantitative models that analyse the influence of size on the structural and functional design of organisms.

**Tags:**
Charlie Bennett

**Duration:**
3 minutes, 35 seconds

**Date story recorded:**
October 1997

**Date story went live:**
29 September 2010