
Turbulence renormalization and the Euler equation: 2

In the early 1970s, my former PhD supervisor Sam Edwards asked me to be the external examiner for one of his current students. It was only a few years since I had been on the receiving end of this process, so naturally I approached the task in a merciful way! Anyway, if memory serves, the thesis was about a statistical theory of surface roughness, and it cited various papers that applied the methods of theoretical physics to practical engineering problems such as the properties of polymer solutions, the stochastic behaviour of structures and (of course) turbulence. To me this crystallized a problem that was then troubling me. If you regarded yourself as belonging to this approach (and I did), what would you call it? The absence of a recognisable generic title when filling in research grant applications or other statements about one’s research seemed to be a handicap.

Ultimately I decided on the term renormalization methods, but the term renormalization did not really come into general use, even in physics, until the success of the renormalization group (or RG) in the early 1980s. Actually, the common element in these problems is that one is dealing with systems in which the degrees of freedom interact with each other, so another possible title would be many-body theory. We can also expect to observe collective behaviour, which offers yet another possible label. We will begin by looking briefly at two pioneering theories in condensed matter physics, as comparing and contrasting these will be helpful when we go on to the theory of turbulence.

We begin with the Weiss theory of ferromagnetism, which dates from 1907 (see Section 3.2 of [1]) and in which a piece of magnetic material was pictured as being made up from tiny magnets at the molecular level. This predates quantum theory and nowadays we would think in terms of lattice spins. There are two steps in the theory. First, there is the mean-field approximation. Weiss considered the effect of an applied magnetic field B producing a magnetization M in the specimen, and argued that the tendency of the spins to line up spontaneously would give rise to a molecular field B_m, so that any one spin experiences an effective field B_E given by:

    \[B_E = B + B_m.\]

This is the mean-field approximation.

Then Weiss made the assumption

    \[B_m\propto M.\]

This is the self-consistent approximation. Combining the two, and writing the magnetization as a fraction of its saturation value M_\infty, an updated treatment gives:

    \[\frac{M}{M_\infty}= \tanh\left[\frac{JZ}{kT}\frac{M}{M_\infty}\right],\]

where J is the strength of interaction between spins, Z is the number of nearest neighbours of any one spin, k is the Boltzmann constant and T is the absolute temperature. This expression can be solved graphically for the value of the critical temperature T_C: see [1].
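Since [1] solves this graphically, it may also help to see a minimal numerical sketch (not taken from [1]; the values of J, Z and T below are purely illustrative assumptions), in which the self-consistency equation is solved by simple fixed-point iteration:

```python
# Minimal sketch: solving the Weiss self-consistency relation
#   m = tanh[(J*Z/(k*T)) * m],   where m = M/M_inf,
# by fixed-point iteration. All parameter values are illustrative only.

import numpy as np

def reduced_magnetization(J, Z, kT, tol=1e-12, max_iter=10_000):
    """Iterate m -> tanh[(J*Z/kT) * m] from the saturated state m = 1."""
    x = J * Z / kT        # coupling parameter; only m = 0 solves the equation if x <= 1
    m = 1.0
    for _ in range(max_iter):
        m_new = np.tanh(x * m)
        if abs(m_new - m) < tol:
            break
        m = m_new
    return m_new

# The graphical construction in [1] gives a non-zero solution only for T < T_C,
# where k*T_C = J*Z (the slope of tanh at the origin equals one).
J, Z, k = 1.0, 6.0, 1.0        # illustrative units with k = 1
T_C = J * Z / k
for T in (0.5 * T_C, 0.9 * T_C, 1.1 * T_C):
    print(f"T/T_C = {T/T_C:.2f}:  M/M_inf = {reduced_magnetization(J, Z, k*T):.4f}")
```

As expected, the iteration returns a non-zero reduced magnetization below T_C and (to numerical precision) zero above it.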

Our second theory dates from 1922; it considers electrons (in an electrolyte, say) and evaluates the effect of all the other electrons on the potential due to any one electron. For any one electron in isolation, we have the Coulomb potential, thus:

    \[V(r)\sim \frac{e}{r}\]

where e is the electronic charge and r is the distance from the electron. This theory too has mean-field and self-consistent steps (see [1] for details) and leads to the so-called screened potential,

    \[V_s(r) \sim \frac{e \exp[-r/l_D]}{r},\]

where l_D is the Debye length and depends on the electronic charge and the number density of electrons. This potential falls off much faster than the Coulomb form and is interpreted in terms of the screening effect of the cloud of electrons round the one that we are considering.

Alternatively, we can interpret this as a form of charge renormalization, in which the free-field charge e is replaced by a charge that has been renormalized by the interactions with the other electrons, or:

    \[e \rightarrow e \times \exp[-r/l_D].\]
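As a minimal sketch (again not from [1]; the units and the value of the Debye length are illustrative assumptions only), we can compare the bare Coulomb potential with the screened form and watch the renormalized charge fall off with distance:

```python
# Minimal sketch: bare Coulomb potential vs the Debye screened form, and the
# scale-dependent (renormalized) charge e*exp(-r/l_D). Units are arbitrary.

import numpy as np

e = 1.0       # electronic charge in arbitrary units
l_D = 1.0     # Debye length in the same arbitrary length units

r = np.array([0.1, 0.5, 1.0, 2.0, 5.0]) * l_D

V_coulomb  = e / r                        # bare potential   V(r)   ~ e/r
V_screened = e * np.exp(-r / l_D) / r     # screened form    V_s(r) ~ e*exp(-r/l_D)/r
e_renorm   = e * np.exp(-r / l_D)         # renormalized charge e(r)

for ri, vc, vs, er in zip(r, V_coulomb, V_screened, e_renorm):
    print(f"r/l_D = {ri/l_D:4.1f}   V = {vc:8.3f}   V_s = {vs:8.3f}   e(r) = {er:6.3f}")
```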

Note that the renormalized charge depends on r, and this type of scale dependence is absolutely characteristic of renormalized quantities. In the next blog post we will discuss statistical theories of turbulence in terms of what we have learned here. For the sake of completeness, we should also mention that the idea of an ‘effective’ or ‘apparent’ or ‘turbulence’ viscosity was introduced in 1877 by Boussinesq. For details, see the book by Hinze [2]. This may possibly be the first recognition of a renormalization process.

[1] W. D. McComb. Renormalization Methods: A Guide for Beginners. Oxford University Press, 2004.
[2] J. O. Hinze. Turbulence. McGraw-Hill, New York, 1st edition, 1959. (2nd edition, 1975).


2 Replies to “Turbulence renormalization and the Euler equation: 2”

  1. I am a Ph.D. student and a beginner. I don’t understand many of the things related to turbulence, but I want to learn. Can you please suggest a good book to grasp the fundamental physics of turbulence?

    1. If you look under the Publications tab in my blog, you will find a section on my books. The Physics of Fluid Turbulence was published in 1990 and in paperback form in 1991. The first three chapters give a general introduction to the subject and I believe that some people have found it helpful. Number 5 in my list is a more up-to-date treatment of the theory and might be helpful in time. Thank you for your interest.
