Wednesday, September 12, 2007

Hunting for the fundamental laws of physics

Stephen Wolfram has posted here some interesting news about one of his hobbies - hunting for the fundamental laws of physics. He outlines how he is both developing and using Mathematica to search for simple rules that generate simulated universes whose properties resemble those of our own real universe.

The power of his approach lies not only in his use of Mathematica, but more fundamentally in his use of very few axioms to define what the fundamental laws of physics are in the first place. His basic object is a network (i.e. nodes and links between nodes), and his basic operation is the mutation of a piece of the network via the application of a set of rewrite rules, which gives the network structure a dynamical behaviour. In this approach the fundamental laws of physics are determined by the choice of the set of network update rules.
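
To make the flavour of this concrete, here is a minimal sketch in Python (the rewrite rule is my own invention, purely for illustration - it is not one of Wolfram's actual rules): the network is an undirected edge list, and one update step applies the rule wherever it matches, so the structure evolves under iteration.

    # A toy network: an undirected edge list over integer-labelled nodes.
    # The (invented) rewrite rule subdivides every edge by inserting a fresh
    # node, so the network grows and its structure evolves under iteration.

    def update(edges, fresh):
        """Apply the rule to every edge: a-b becomes a-m-b, with m a new node."""
        new_edges = []
        for a, b in edges:
            m = fresh              # fresh label for the inserted node
            fresh += 1
            new_edges.extend([(a, m), (m, b)])
        return new_edges, fresh

    edges, fresh = [(0, 1), (1, 2), (2, 0)], 3   # start from a triangle
    for step in range(4):
        edges, fresh = update(edges, fresh)
        print("step", step + 1, "->", len(edges), "edges")

The important point is that nothing here is continuous: both the "space" and its dynamics emerge from discrete rewrites, and a different rule gives a universe with completely different behaviour.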

It turns out that various simple consistency criteria cause this approach to give rise to both special relativity and general relativity. That is an impressive result to emerge from a rule-based approach!

One of the challenges is to determine the consequences of a particular choice of network update rules, and to ascertain whether they correspond to the behaviour of our known real universe. In general, the network's behaviour in response to its update rules can be extremely complicated, so working out what is going on can be very difficult and time-consuming. The development of Mathematica itself is partly driven by the need to create tools for addressing problems such as this.
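
The scale of that problem can be seen even in a far simpler setting. Wolfram's searches run over network update rules, but the same enumerate-simulate-inspect loop applies to the 256 elementary cellular automata that he has studied extensively. The Python sketch below enumerates all of them, using compressed size as a crude proxy (my own, not anything Wolfram uses) for how complicated each rule's behaviour is:

    import zlib

    def evolve(rule, width=64, steps=64):
        """Run one elementary cellular automaton from a single central 1."""
        row = [0] * width
        row[width // 2] = 1
        rows = [row]
        for _ in range(steps):
            # Each cell's new value is the rule-table entry indexed by the
            # old (left, centre, right) neighbourhood, read as a 3-bit number.
            row = [(rule >> (4 * row[(i - 1) % width]
                             + 2 * row[i]
                             + row[(i + 1) % width])) & 1
                   for i in range(width)]
            rows.append(row)
        return rows

    def complexity(rows):
        """Compressed size of the whole evolution: bigger means less regular."""
        return len(zlib.compress(bytes(c for r in rows for c in r)))

    scores = sorted((complexity(evolve(rule)), rule) for rule in range(256))
    print("most regular rules: ", [rule for _, rule in scores[:5]])
    print("least regular rules:", [rule for _, rule in scores[-5:]])

Even here, with only 256 candidates, there is no shortcut: the only way to find out what a rule does is to run it and look.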

Wolfram says that he has not yet found a viable candidate for the fundamental laws of physics using this approach, but that he is hard at work both developing and using Mathematica to achieve this goal. As he says:

I certainly think it'll be an interesting - almost metaphysical - moment if we finally have a simple rule which we can tell is our universe. And we'll be able to know that our particular universe is number such-and-such in the enumeration of all possible universes.

I wish him luck in this venture. It would be very impressive if he found that a 3-line Mathematica program was all that was needed to generate the behaviour of our known real universe. Even if he is destined not to discover the fundamental laws of physics using this approach, he will nevertheless have created along the way a very useful toolbox for doing lots of other things, i.e. Mathematica.

4 comments:

Anonymous said...

Steve,

I'm surprised at the assumptions present in his approach - for example adopting a network as a basic object. I do think that the generators of physical "law" need to be a lot more abstract than that.

Another approach is to create a "mathematical wrapper", i.e. a mathematical system that is rich enough to contain enough "information" to generate the "system(s)" under observation. This is a common trick of wily mathematicians but suffers from being, generally, an inefficient coding scheme. String theory is such an example (and also requires a canvas upon which to draw "reality", whatever that is).

However, all that we observe and all the tools we use to observe the observable are presumably emergent properties of the fundamental generators of what we (loosely) refer to as reality. We are cursed with representing all concepts accessible to us with a countable infinity of ones and zeros (or derivatives thereof). We also have a pathological tendency to ignore a significant body of inductive evidence (such as pointers to a lack of objective reality) and to confuse concepts (such as the various descriptions of randomness with one another, or randomness with uncomputability).

My guess is that one needs to look for something very simple, very abstract and most likely counter-intuitive. Examples:

Very simple - abstract notions of symmetry

Very abstract - question fundamental concept of information in context of above

Counter-intuitive - something that automatically gets over the bootstrap problem (replace the notion of "set" with an entity that naturally handles Russell's paradox, halting problem etc.)

I ramble...

Stephen Luttrell said...

My understanding of Wolfram's network approach is that he has a bunch of little pieces of algorithm (algon? algino?) that link together in a reconfigurable way to form an overall algorithm. The reconfigurability would itself be implemented using an algorithm, so everything is an algorithm with its internal communication links being visualised (by us) as a reconfigurable network.

So I think Wolfram's basic construct is a collection of little pieces of algorithm, all working in parallel and talking to each other via a "network" which would itself be implemented using pieces of algorithm. The problem is to work out what algorithms are used to implement "physics".
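
To make that picture concrete, here is a minimal sketch (mine, in Python, with an invented update rule - it is not Wolfram's construction): a handful of little machines wired into a network, each updating its one-bit state in parallel from its neighbours.

    network = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2]}   # node -> neighbours
    state = {0: 1, 1: 0, 2: 0, 3: 1}                          # one bit per machine

    def tick(network, state):
        """All machines update in parallel: each takes the XOR (parity) of its neighbours."""
        return {node: sum(state[n] for n in network[node]) % 2
                for node in network}

    for step in range(5):
        state = tick(network, state)
        print("tick", step + 1, "->", state)

In the full scheme the wiring itself would be rewritten by the same kind of rule, so that the "network" is just more algorithm.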

Conventional physics explores a small subset of this set of all algorithms, namely the part that can be accessed using standard mathematical techniques. But there are a lot more algorithms out there, most of which operate in ways that are completely mysterious to us, and my understanding is that Wolfram has been "mining" the universe of algorithms for good candidates. Good luck to him!

It may turn out that there is a "small" algorithm (let's call it "algorithm 42") that has emergent properties that are indistinguishable from "life, the universe, and all that". We and everything else are implemented using this algorithm, and all observations made by us are implemented as interactions using this same algorithm.

Anonymous said...

I should imagine that one could gain quite a lot by examining the concept of "algorithm".

A good start would be the trade-off between the "algorithm", the "processor", the "data transport processes" and the "data". There are many arguments for combining (at least) pairs of these four concepts, but it's rather more difficult to see how a single, primitive entity splits into the others that one would like to declare as fundamental.

To take "computing" as an example, that's not a bad place to start as long as one discards the baggage acquired from the way that we tend to conceive of, and use, the fundamental concept.

Even the few physicists who understand such things in reasonably fundamental terms, however, do not tend to approach the subject in the right way. Partially it is due to (other types of) baggage and partially it is due to having to use concepts that differ markedly from previous experience.

In terms of baggage, they are used to dealing with conventional constructs. Often, these contain many hidden assumptions. In terms of new concepts, they will frequently have difficulty in dealing with the results of their own experiments (quantum mechanics is awash with difficult conundrums). Usually, they will find comfort in the nearest mathematical edifice that makes consistent sense to them.

To take a relevant example, a good way to begin with "computing" is to understand where complexity enters, and leaves, the universe. The most strongly relevant clues in this instance are to remember that one must choose the correct definition of complexity, and to remember one of the most tantalising lessons learned from quantum mechanics.

Stephen Luttrell said...

I apologise for my glib use of the word "algorithm"; Wolfram's use of this word is a composite of algorithm/processor/data/data transport process. Maybe I should have described his whole approach as a set of flexibly-networked state machines, but then "state machine" is yet another piece of terminology.

There is no way around the problem that all of our conceptions carry our baggage, because we are part of the universe that we are trying to model, but that should not stop us from trying to formulate simple and universal models. One of the main points of Wolfram's approach is that he is not artificially constraining himself to continuum mathematics, which gives him access to a whole new universe of low-complexity models. Science is an empirical venture, and the success of our models is judged by their ability to summarise our observations in a highly compressed form, i.e. with a short description length. One could use compression as a selection criterion for discriminating between candidate models, as in the sketch below.
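
Here is a toy illustration (mine, in Python; the candidate models and the "observations" are invented) of that selection criterion: each candidate is scored by a two-part description length, the bytes needed to state the model plus the compressed size of whatever it fails to predict, and the shortest total wins.

    import random
    import zlib

    random.seed(0)
    # The "observations": a noisy linear signal, reduced modulo 256 so that
    # everything fits in bytes.
    data = [(3 * i + 7 + random.randint(-1, 1)) % 256 for i in range(512)]

    def description_length(params, predict):
        """Two-part code: bytes to state the model + compressed residuals."""
        residuals = bytes((d - predict(i)) % 256 for i, d in enumerate(data))
        return len(params) + len(zlib.compress(residuals))

    candidates = {
        "constant 0":  (b"",         lambda i: 0),
        "linear 2i+7": (b"\x02\x07", lambda i: (2 * i + 7) % 256),
        "linear 3i+7": (b"\x03\x07", lambda i: (3 * i + 7) % 256),
    }
    for name, (params, predict) in candidates.items():
        print(name, "->", description_length(params, predict), "bytes")
    # The model that actually generated the data leaves near-constant
    # residuals, which compress to almost nothing, so it scores best.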

The standard model of physics is an extremely compressed version of (part of) the universe of observations, but we demand more because there is still a residual arbitrariness in this model. I think that if someone discovered an algorithm (of any type) that reproduced all of the standard model, yet had a much shorter description length than the standard model (e.g. it might simply be a compressed version of the standard model), then that person would soon be visiting Stockholm. An example of where this has happened before is the quark model of hadrons, which explained (some aspects of) the world of hadronic resonances in terms of symmetry multiplets of multi-quark states.