One, two, many: The meaning of the name

We used to think that if we knew one, we knew two, because one and one are two.  We are finding that we must learn a great deal more about ‘and’.

– Arthur Eddington (1882-1944)

 

More is different

– Philip W. Anderson (1923-2020)

 

Most of the people who have ended up on this page will be aware of two of the great revolutions in physics that took place at the beginning of the 20th century.  Both pushed the limits of measurement beyond what we as humans are used to, and showed that Newton’s laws, which had stood solid for centuries, were incomplete:

  • If things move very fast, close to the speed of light, then we need the theory of relativity, as developed by Einstein.
  • If things are very small, more or less on an atomic scale, then we need quantum mechanics, as developed by Heisenberg, Born, Pauli, Schrödinger and many others.

Both of these theories are very general, and reproduce Newton’s classical mechanics if you are in fact a normal size moving at a normal speed – more precisely, they show that corrections to Newton’s theory are extremely (unmeasurably) small for most everyday phenomena.  It is also no problem to put quantum mechanics and relativity together, as Paul Dirac did in 1928, if you really feel the need to describe something very small moving very fast (for example, if you are a particle physicist).

But there was another revolution in physics taking place around the same time, one that is much less talked about in popular literature.  It is often considered much less fundamental because it does not imply any corrections to Newton’s laws in certain limits.  Rather, it implies they are not useful.  This revolution was statistical physics, best epitomised by the name of Ludwig Boltzmann, and it relates to complexity.  It is most beautifully summarised in the quotes at the top of this page, and provides the link between the microscopic and the macroscopic; between the fundamental and the emergent.

Statistical mechanics starts with constituent particles.  Each of the particles has (in principle) some known simple dynamical equations governing its motion, e.g. Newton’s laws – quantum statistical mechanics or even relativistic quantum statistical mechanics isn’t fundamentally different.  We then ask the question: what happens when there are many such particles interacting together?

A favourite example of mine is to consider the air molecules in the room around you.  As a very simple model, you can imagine each molecule is a sphere, and if two spheres get too close, they bounce off each other – otherwise the molecules/spheres obey Newton’s laws of motion.  This is highly simplified from the true dynamics of molecules, but it turns out not to matter too much!  Now, suppose you are speaking to somebody.  In doing so, you aren’t sending molecules of air from your mouth to the listener’s ear to send information (that would usually be called ‘licking’, and would typically be considered an unusual form of communication, unless you happen to be a dog).  Rather, you are making a pressure wave in the molecules of air near you – and it is this pressure wave that propagates, not the individual molecules, which typically don’t move very far at all at normal pressure and temperature.  The question is:

if all you knew was Newton’s laws, would you be able to predict sound waves?

Ultimately, the answer is yes – because the model of many spheres/molecules bouncing off each other via Newton’s laws does indeed exhibit sound waves.  However, this is a far from trivial step – the sound waves are a new emergent behaviour that only occurs when there are many interacting constituents, and that you wouldn’t see by analysing the motion of one or two balls.
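To make this concrete, here is a minimal numerical sketch – my own toy model, with all parameters (chain length, stiffness, time step) chosen arbitrarily for illustration.  Instead of hard spheres in 3D, it uses a 1D chain of particles that push on their neighbours like springs, evolved with nothing but Newton’s second law.  Pushing the left end in launches a compression pulse that travels down the chain, even though each individual particle barely moves:

```python
import numpy as np

# Toy model: a 1D chain of particles with spring-like nearest-neighbour
# forces, integrated with Newton's second law only. All units arbitrary.
N, k, m, dt = 400, 1.0, 1.0, 0.02   # particles, stiffness, mass, time step

u = np.zeros(N)        # displacement of each particle from its lattice site
v = np.zeros(N)        # velocities
u[0] = 0.5             # push the left end in and hold it there (a tiny piston)

def accel(u):
    """F = ma for each interior particle, from its two neighbours."""
    a = np.zeros_like(u)
    a[1:-1] = k * (u[2:] - 2.0 * u[1:-1] + u[:-2]) / m
    return a           # both ends held fixed: a[0] = a[-1] = 0

a = accel(u)
for step in range(1, 12001):
    # velocity-Verlet: a standard, stable integrator for Newton's laws
    u += v * dt + 0.5 * a * dt**2
    a_new = accel(u)
    v += 0.5 * (a + a_new) * dt
    a = a_new
    if step % 4000 == 0:
        # wavefront = rightmost particle that has been visibly displaced
        front = np.max(np.nonzero(np.abs(u) > 0.01)[0])
        print(f"t = {step * dt:5.1f}: wavefront near particle {front}")
```

The printed wavefront advances by roughly the same number of particles in each time interval – the pulse propagates at a well-defined speed (here about √(k/m) lattice spacings per unit time).  That is a speed of sound, and it appears nowhere in the single-particle equations.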

There is another element to this story, and for this, let’s start to build up our system of many balls starting with one; and just for a nicer historical context, let’s consider that they interact via gravity rather than a contact interaction.  Suppose we have one body only – then we can apply Newton’s 1st law and conclude that it will move at constant velocity, as there are no other particles to put a force on it.  Let’s move on to two spheres (e.g. the earth and the sun) interacting via gravity.  A formal solution may not come up until more advanced classical mechanics courses, but we know that the two bodies will orbit each other around the centre of mass, according to the laws empirically derived by Kepler in the early 17th century and explained by Newton later that century.  So we can solve the motion of one or two bodies – so far so good.
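Although the pencil-and-paper solution takes some machinery, checking the two-body result numerically is easy.  Below is a hedged sketch – the masses, the units (G = 1) and the initial conditions are my own arbitrary choices – that integrates Newton’s law of gravitation for two bodies and confirms the Kepler picture: on a circular orbit the separation stays fixed, and the centre of mass never accelerates.

```python
import numpy as np

# Two bodies under Newtonian gravity, in units where G = 1.
# Masses and starting conditions are arbitrary illustrative choices.
G, m1, m2 = 1.0, 1.0, 0.5
r1, r2 = np.array([0.0, 0.0]), np.array([1.0, 0.0])

# Relative speed for a circular orbit, v = sqrt(G*(m1+m2)/d), split so
# that the total momentum (and hence the centre of mass) is at rest.
v_rel = np.sqrt(G * (m1 + m2) / 1.0)
v1 = np.array([0.0, -v_rel * m2 / (m1 + m2)])
v2 = np.array([0.0,  v_rel * m1 / (m1 + m2)])

dt = 1e-3
for step in range(20001):
    if step % 5000 == 0:
        com = (m1 * r1 + m2 * r2) / (m1 + m2)
        sep = np.linalg.norm(r2 - r1)
        print(f"t = {step * dt:4.1f}: separation = {sep:.4f}, "
              f"centre of mass = {com.round(4)}")
    d = r2 - r1
    f = G * m1 * m2 * d / np.linalg.norm(d) ** 3   # force on body 1 from body 2
    v1 += (f / m1) * dt                            # semi-implicit Euler:
    v2 += (-f / m2) * dt                           # update v first, then r
    r1 += v1 * dt
    r2 += v2 * dt
```

The separation holds steady and the centre of mass stays put – the stable clockwork that Kepler described and Newton explained.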

But let us now move on to three bodies interacting via gravity, e.g. the sun, the earth and the moon.  Well, we all know what happens on the typical timescales over which we observe it – the moon goes round the earth with a period of about a month, while the earth goes round the sun with a period of about a year.  But Newton and the many great mathematicians and physicists who followed him failed to find nice equations governing the motion on longer time scales, and it became known in elite mathematical circles as the three-body problem.  In fact, so little progress had been made in the 200 years after Newton solved the two-body problem that in the late 19th century, King Oscar II of Sweden offered a prize to anybody who could answer the simplest possible question: is this three-body system stable?  In other words, in the absence of anything else happening, would the moon continue to orbit the earth and the earth orbit the sun for eternity?  Just to be clear – it is fairly obvious that this situation is meta-stable, i.e. it will continue like this for a long time.  But would it continue forever?  Or would the earth somehow move to a slightly different orbit and eject the moon?  Or something else?  Maybe not a question of practical significance in this case, given the timescales involved, but certainly one which you feel you should be able to answer if you truly understand Newton’s laws.

The prize was won around the turn of the 20th century by a young French mathematician named Henri Poincaré.  But he didn’t actually answer the question – instead he proved that you can’t answer the question.  Basically, he showed that the system may be deterministic (i.e. we know the equations of motion, so in principle can compute anything), but it is chaotic – meaning that the answer depends so sensitively on initial conditions that if you measured a position or velocity to anything less than infinite accuracy, you might come to the wrong conclusion.  See this post for more on the difference between deterministic and predictable.
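Here is a hedged illustration of that sensitivity – not Poincaré’s argument, just a toy numerical experiment with three equal masses and made-up initial conditions.  The same system is integrated twice, with one starting coordinate shifted by a part in a billion, and the distance between the two runs is tracked:

```python
import numpy as np

G = 1.0

def accelerations(r):
    """Newtonian gravity between three unit masses; r has shape (3, 2).
    A small softening term keeps this toy integrator stable during
    close encounters (a deliberate approximation)."""
    a = np.zeros_like(r)
    for i in range(3):
        for j in range(3):
            if i != j:
                d = r[j] - r[i]
                a[i] += G * d / (d @ d + 1e-4) ** 1.5
    return a

def run(nudge):
    # Arbitrary illustrative initial conditions: a bound triple system
    # with zero total momentum.
    r = np.array([[1.0, 0.0], [-0.5, 0.6], [-0.5, -0.6]])
    v = np.array([[0.0, 0.4], [-0.3, -0.2], [0.3, -0.2]])
    r[0, 0] += nudge                    # the only difference between runs
    dt, snapshots = 1e-3, []
    for step in range(50001):
        if step % 10000 == 0:
            snapshots.append(r.copy())
        v += accelerations(r) * dt      # semi-implicit Euler
        r += v * dt
    return snapshots

for t, (ra, rb) in zip(range(0, 51, 10), zip(run(0.0), run(1e-9))):
    print(f"t = {t:2d}: distance between the two runs = "
          f"{np.linalg.norm(ra - rb):.2e}")
```

With my arbitrary starting conditions the gap between the two runs grows by orders of magnitude; the exact numbers change with any tweak of the setup, which is rather the point – deterministic, but not predictable.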

So we can solve for one body, and we can solve for two bodies.  We can’t solve for three, or four, or five.  But we can solve for many – why?  It’s because if you have many, many (e.g. Avogadro’s number of) bodies, then we are asking different questions.  I would posit that you have rarely (if ever) wondered about the exact position and speed of every air molecule in the room.  Even if you could store all of this information, what would you do with it?  The motion of individual air molecules is chaotic – unpredictable even with a big enough computer.  However, with enough particles, you can start to take averages – and these averages are predictable.  But they are predictable based on emergent laws – laws of thermodynamics or laws of fluid dynamics, which in principle come from Newton’s laws for each small part of the system, but in a very non-obvious way.  I refer once again to the quotes I began with.
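A quick sketch of what ‘averages are predictable’ means in practice – again a toy example of my own, drawing velocities from the textbook Maxwell–Boltzmann (Gaussian) distribution in arbitrary units where kT/m = 1.  Any one particle’s kinetic energy is random, but the mean over N particles converges to the textbook value of 3/2 kT, with fluctuations shrinking like 1/√N:

```python
import numpy as np

# Velocities drawn from a Maxwell-Boltzmann (Gaussian) distribution,
# in arbitrary units where kT/m = 1, so the prediction is <KE> = 3/2.
rng = np.random.default_rng(0)

for N in [10, 1_000, 100_000, 10_000_000]:
    v = rng.normal(0.0, 1.0, size=(N, 3))   # 3 velocity components per particle
    ke = 0.5 * (v ** 2).sum(axis=1)         # kinetic energy of each particle
    mean = ke.mean()
    err = ke.std() / np.sqrt(N)             # statistical error of the mean
    print(f"N = {N:>10,}: <KE> = {mean:.4f} +/- {err:.4f}  (prediction: 1.5)")
```

A single particle’s energy is anyone’s guess; the average over Avogadro-scale numbers is, for all practical purposes, exact.  That is the loophole statistical mechanics exploits.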

Hence the name of this blog – one, two, many.  A physicist’s way of counting, as these are the only numbers of particles we can deal with.  This is also my research area.  In my case, the constituent particles I deal with are usually electrons, for example in a metal.  To describe a single electron in a metal, I need quantum mechanics, but typically not relativity, so my starting point is the Schrödinger equation.  Then I wonder what happens when many of these electrons get together to party.  From this, I try to understand all sorts of phases of matter, such as electrical conduction in ultra-thin wires, magnetic phase transitions, or even more exotic phase transitions where the electrons organise themselves in some unusual, collective, emergent way – like the way a conga line develops at a party out of everybody doing their own thing, only cooler.
