Logistic Map and Lyapunov Exponent


Lyapunov Exponent for the Spreading of Two Close Points on a Logistic Map

LogisticMap is a C# Windows application that I developed for homework in my Statistical Mechanics class.  The application plots the spread between two points on a logistic map and calculates the Lyapunov exponent for the spreading.  LogisticMap is a solution to Exercise 5.9 in Dr. James Sethna’s Statistical Mechanics: Entropy, Order Parameters, and Complexity.

You can get the LogisticMap application and its source code on GitHub.

Here’s some background on the logistic map, chaos, and the Lyapunov exponent.  In the application, we are evaluating the following function:

f(x) = 4μx(1-x)

This function is called a logistic map because it takes a point between 0 and 1 and returns another point that is also between 0 and 1.  It maps the unit interval (0,1) into itself.  We can think of the trajectory of an initial point, x0, on the map as the successive results of plugging the previous result back into the function: f(x0), f(f(x0)), f(f(f(x0))), …
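
To make the iteration concrete, here is a minimal C# sketch of a trajectory (my own illustration, not the LogisticMap source): it simply feeds each result back into f.

using System;

class TrajectorySketch
{
    // The logistic map f(x) = 4*mu*x*(1-x).
    static double F(double mu, double x) => 4.0 * mu * x * (1.0 - x);

    static void Main()
    {
        double mu = 0.9;   // try 0, 0.5, 0.9, and 1 to see the behaviors described below
        double x = 0.3;    // the initial point x0
        for (int i = 0; i < 20; i++)
        {
            Console.WriteLine($"f^{i}(x0) = {x}");
            x = F(mu, x);   // feed the previous result back into the map
        }
    }
}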

Trajectories on the Logistic Map (μ = 0.9)

The trajectory depends on the value of the constant μ.  When μ = 0, all trajectories immediately converge to 0.  When μ = 0.5, all trajectories converge on 0.5, though not immediately.  When μ = 0.9, the trajectories do not converge on one value; instead they wind up within a certain range of values (roughly between 0.3 and 0.6 and between 0.8 and 0.9).

Trajectories on the Logistic Map (μ = 1)

When μ = 1, the trajectories still do not converge on one value, and they range over the entire unit interval, the same interval we chose the initial point from.  As Sethna states, “for μ = 1, it precisely folds the unit interval in half, and stretches it (non-uniformly) to cover the original domain”.  Furthermore, two very close initial points will have dramatically different trajectories.  This sensitive dependence on the initial point is what makes the logistic map with μ = 1 a chaotic system.

It turns out that the separation between the trajectories of two very close initial points grows exponentially, and the growth rate λ is called the Lyapunov exponent: after n iterations, the two points x0 and x0 + E are separated by roughly E·e^(λn).  This exponential drift eventually stops because the two trajectories are still pegged between 0 and 1, so there is a limit to how far apart they can get.  From that point on, the separation fluctuates erratically.

If you would like to compute the Lyapunov exponent for a chaotic system, try using the following values (a sketch of the calculation follows the list):

  • X: 0.9 (X can be any number between 0 and 1, but not 0, 0.5, or 1)
  • E: 0.00000001 (E is the small difference between the two initial points, x0 and x0+E, so it should be very small)
  • N: 50 (N is the number of computed trajectory points, so it should be large enough that the exponential spreading of the two points can be computed)
  • Mu: 1 (this is what makes the logistic map a chaotic system)
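
For illustration, here is a hedged C# sketch of the calculation using those values (the names and the rough estimate are mine and are not necessarily how LogisticMap does it): it iterates both x0 and x0 + E and tracks the growth of their separation.

using System;

class LyapunovSketch
{
    static double F(double mu, double x) => 4.0 * mu * x * (1.0 - x);

    static void Main()
    {
        double mu = 1.0;          // Mu = 1: the chaotic case
        double x0 = 0.9;          // X
        double eps = 0.00000001;  // E, the tiny gap between the two initial points
        int n = 50;               // N, the number of trajectory points

        double a = x0, b = x0 + eps;
        for (int i = 1; i <= n; i++)
        {
            a = F(mu, a);
            b = F(mu, b);
            double separation = Math.Abs(b - a);
            // While the gap is still small it grows roughly like eps * e^(lambda * i),
            // so lambda is roughly ln(separation / eps) / i; once the gap saturates
            // near the width of the interval, this estimate is no longer meaningful.
            double lambdaEstimate = Math.Log(separation / eps) / i;
            Console.WriteLine($"step {i}: separation = {separation:E2}, lambda ~= {lambdaEstimate:F3}");
        }
    }
}

For μ = 1, the early estimates should hover around ln 2 ≈ 0.69, the known Lyapunov exponent of the fully chaotic logistic map.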

Random Walks and Emergent Properties


Histogram of the Endpoints of One Hundred Thousand Random Walks

RandomWalk is a C# Windows application that I developed for homework in my Statistical Mechanics class.  The application graphs 1-D and 2-D random walks.  It also graphs the histogram of endpoints for an ensemble of 1-D random walks.  RandomWalk is a solution to Exercise 2.5 in Dr. James Sethna’s Statistical Mechanics: Entropy, Order Parameters, and Complexity.

You can get the RandomWalk application and its source code on GitHub.

Here’s some background on random walks.  A random walk is a series of steps, each having the same length and going in a random direction.  In one dimension, the random steps can go left or right.  In two dimensions, the random steps can go left, right, up, or down.
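
Here is a minimal C# sketch of a single walk (an illustration, not the RandomWalk source): each step picks one of the available directions at random.

using System;

class WalkSketch
{
    static readonly Random Rng = new Random();

    // One random walk of 'steps' unit steps; pass dimensions = 1 or 2.
    static (int x, int y) Walk(int steps, int dimensions)
    {
        int x = 0, y = 0;
        for (int i = 0; i < steps; i++)
        {
            switch (Rng.Next(2 * dimensions))  // 0..1 in 1-D, 0..3 in 2-D
            {
                case 0: x++; break;  // right
                case 1: x--; break;  // left
                case 2: y++; break;  // up (2-D only)
                case 3: y--; break;  // down (2-D only)
            }
        }
        return (x, y);
    }

    static void Main()
    {
        Console.WriteLine(Walk(100, 2));  // endpoint of a single 100-step 2-D walk
    }
}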

Now it turns out that if you have many walkers all starting at the same location and you record where each one ends up, the most likely end point is at (or as close as possible to) the starting location, and the least likely end points are the furthest possible distance away.  The histogram of these endpoints approximates a bell curve (also called a normal distribution or a Gaussian distribution).  In mathematics, the explanation of this phenomenon is the Central Limit Theorem.
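
In one dimension, if each of the N steps is +1 or -1 with equal probability, the endpoint has mean 0 and standard deviation √N, so for large N the histogram of endpoints approaches the Gaussian ρ(x) ≈ exp(-x²/2N) / √(2πN).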

If you would like to see the bell curve in the RandomWalk application, try using the following values (a sketch of the experiment follows the list):

  • Number of Steps: 3 (if the walkers move only one step, they cannot return to the starting location)
  • Number of Walkers: 100000 (this is the key to getting the bell curve shape; there need to be enough random walks for the histogram of their endpoints to “smooth” out)
  • Number of Dimensions: 1 (the RandomWalk program only graphs the histogram for 1-D walks)
  • Number of Bins: 100 (this parameter just lumps the random walks into a discrete number of bins – so instead of 100000 points on your graph, you only have 100 points)
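
For illustration, here is a hedged C# sketch of the experiment using those values (the names and the binning are mine and may not match RandomWalk exactly): it runs the walkers, records each endpoint, and lumps the endpoints into bins.

using System;

class HistogramSketch
{
    static readonly Random Rng = new Random();

    // Endpoint of a single 1-D walk: each step goes left or right with equal probability.
    static int Walk1D(int steps)
    {
        int x = 0;
        for (int i = 0; i < steps; i++)
            x += Rng.Next(2) == 0 ? 1 : -1;
        return x;
    }

    static void Main()
    {
        int steps = 3, walkers = 100000, bins = 100;
        var counts = new int[bins];

        for (int w = 0; w < walkers; w++)
        {
            int endpoint = Walk1D(steps);
            // Lump the endpoint range [-steps, +steps] into 'bins' buckets.
            int bin = (int)((endpoint + steps) / (2.0 * steps + 1.0) * bins);
            counts[bin]++;
        }

        for (int b = 0; b < bins; b++)
            Console.WriteLine($"bin {b}: {counts[b]}");
    }
}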