Monday 14 January 2013

Energy is a concept to which laymen often attach either a mysterious or, on the contrary, an excessively corpuscular meaning. What do I mean?

Some laymen think that energy is an ill-defined, science-transcending form of soul or happiness, something that can't be quantified. In that case, they associate it with health, with tree-hugging, and similar things.

Other laymen realize that energy is a fully quantitative concept, something that can be moved from one place to another. But to make this assertion compatible with their imagination, they think of energy as a kind of marbles or material, one that is excessively similar to water. This leads them to believe that energy has to remain equally visible at all times.

Neither of these two ideas is right. The truth is different and, to some extent, it lies between the two positions above. Energy is a quantity that may be quantified but its units resemble neither marbles nor molecules of water. Energy doesn't have any molecules or other indivisible minimal units. Its values are continuous in general and its units are abstract, different from any "object" we know. Energy may change its form in a huge number of ways; it may hide and become invisible to the human eye (but visible to other ways of measuring it). In this sense, it resembles the spirit or ghosts envisioned by the first group of laymen.

What is the most universal definition of energy?




Energy is the quantity that is conserved – whose total value is unchanged – whenever the laws of physics that govern the evolution of the physical system in time have one particular property. This property is an example of a symmetry. And the symmetry associated with energy is the time-translational symmetry. It essentially requires that the laws of physics don't change with time. If you repeat the same experiment with the same initial conditions tomorrow, it will yield the same results – or, in quantum mechanics, Nature will predict the same probabilities of various outcomes as it did today.

Why is it so? This relationship between energy and the symmetry is known as Noether's theorem, after Emmy Noether, one of the best female mathematicians of all time. The proof in classical physics looks contrived and Noether's papers were almost unreadable for a modern physicist. However, the relationship becomes crystal clear in quantum mechanics. In quantum mechanics, the time derivative of a quantity \(L\) is determined, via Heisenberg's equations of motion, as a multiple of its commutator with the Hamiltonian:\[

\ddfrac{L}{t} = \frac{1}{i\hbar} [L,H].

\] Now, a conserved quantity is obviously one (\(L\)) for which both sides are equal to zero i.e. a quantity that commutes with the Hamiltonian. But when \(L\) commutes with the Hamiltonian, we may say another thing about \(L\). We may say that if we first transform the initial state vector by a transformation generated by \(L\) and then "wait" (i.e. evolve it in time), we get the same final state as if we first "wait" and then transform it by a symmetry:\[

\exp(i \alpha L)\exp(Ht/i\hbar) \ket\psi = \exp(Ht/i\hbar) \exp(i\alpha L)\ket \psi.

\] This identity follows from \(HL=LH\) because \(HL=LH\) also implies \(f(H)g(L)=g(L)f(H)\) for any functions \(f,g\) of these two operators. It means that every conserved quantity is a generator of a symmetry and vice versa. \(L=H\) is a particular example of a conserved quantity because \([H,H]=0\). Well, that's obviously true. And we may see that \(H\) is the generator of translations in time; that's a direct interpretation of Schrödinger's equation (or even Heisenberg's equations of motion, for that matter).
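The commutation argument above can be checked numerically. Below is a minimal sketch (with invented toy matrices, not anything from the text) showing that when \(L\) commutes with \(H\), the symmetry transformation and the time evolution may be applied in either order:

```python
# Numerical check (toy example) that HL = LH implies f(H) g(L) = g(L) f(H):
# "transform then wait" agrees with "wait then transform".
import numpy as np

def mat_exp(A, c):
    """exp(c*A) for a Hermitian matrix A and a complex scalar c,
    computed via the eigendecomposition A = V diag(w) V^dagger."""
    w, V = np.linalg.eigh(A)
    return V @ np.diag(np.exp(c * w)) @ V.conj().T

hbar = 1.0
H = np.array([[2.0, 1.0], [1.0, 2.0]])   # an invented toy Hamiltonian
L = H @ H                                # any polynomial in H commutes with H

t, alpha = 0.7, 1.3
U = mat_exp(H, t / (1j * hbar))          # time evolution exp(Ht / i*hbar)
S = mat_exp(L, 1j * alpha)               # symmetry transform exp(i*alpha*L)

psi = np.array([1.0, 0.5]) / np.sqrt(1.25)   # some initial state vector

print(np.allclose(S @ U @ psi, U @ S @ psi))  # True: the order doesn't matter
```

The same check would fail for a generic \(L\) that doesn't commute with \(H\), which is exactly the point of the theorem.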

The classical proofs of the relationship between conserved quantities and symmetries are actually a bit messier. For example, you must replace the "simple" commutators by Poisson brackets which are given by rather convoluted formulae. This messiness is a hint that quantum mechanics is more fundamental than classical physics.

I hope that many readers who didn't understand the text above continued to read, anyway. It was a test of some discipline. Now, things will get easier.

Energy in classical mechanics

Some years after Isaac Newton wrote down his equations that govern the motion of the planets in the Solar System and other things, people such as Newton's foe Gottfried Leibniz noticed that one may define an expression – a mathematical function of the masses, positions, and velocities – that doesn't change with time. In this way, a specific formula was given to the ancient visions of Thales of Miletus (550 BC) and other philosophers. An important term in the energy is the kinetic energy\[

E_{\rm kin} = \sum_j \frac{m_j v_j^2}{2}.

\] Each object \(j\) contributes a term that increases with the velocity – quadratically – and that is proportional to the mass. The factor of one-half and the second power are "natural" because if you calculate the time derivative of this energy, you will get\[

\ddfrac{E_{\rm kin}}{t} = \sum_j m_j \vec a_j\cdot \vec v_j

\] where \(\vec a_j\), the acceleration of the object \(j\), arises from differentiation and the factor of \(1/2\) cancels due to Leibniz's rule for the derivative of the product. But \(m_j\vec a_j=\vec F_j\) is nothing else than the force that must act on the object \(j\) and \(\vec v_j = \dd \vec x/\dd t\). And \(\vec F_j\cdot \dd\vec x\) is the infinitesimal work. So the change of the kinetic energy of some object per unit time is the rate of work that others do on this object. All these formulae are consistent with each other but the main claim one may actually check is that if you calculate the derivative of the total energy with respect to time and use the equations of motion, you get zero.
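As a sanity check, here is a small numerical sketch (with an invented mass and initial data) verifying that the change of the kinetic energy per unit time equals the rate of work \(\vec F\cdot \vec v\) for a ball in free fall:

```python
# Check numerically (toy parameters) that dE_kin/dt = F * v for free fall.
m, g = 0.5, 9.81            # mass [kg], gravitational acceleration [m/s^2]
v, dt = 4.0, 1e-6           # downward velocity [m/s], a tiny time step [s]

E_before = 0.5 * m * v * v
F = m * g                   # force acting on the falling ball
power = F * v               # rate of work done on the ball

v += g * dt                 # evolve the velocity by one tiny step
E_after = 0.5 * m * v * v

rate = (E_after - E_before) / dt   # numerical dE_kin/dt
print(abs(rate - power) < 1e-4)    # True: the two rates agree
```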

In the SI units, the kinetic energy has units of \[

[E]= {\rm kg}\cdot {\rm m}^2/{\rm s}^2.

\] This may look abstract but there's nothing wrong about considering powers of well-known units and their products. These days, we also use the name "joule" to describe the unit above. This name celebrates James Joule who discovered the equivalence between work and heat; this will be discussed later. Note that all forms of energy have the same unit, otherwise they couldn't be added. But they must be added all the time because it's the total value (sum) of energy in all of its forms that is conserved.

An individual planet's kinetic energy isn't conserved because the speed is changing. However, what is constant is the planet's total energy that also contains the potential energy\[

E_{\rm pot} = -\frac{G M_{\rm Sun} m_j}{R}.

\] For an approximately uniform gravitational field, e.g. one above the surface of the Earth, we have \[

E_{\rm pot} = m_j gh_j

\] where \(h_j\) is the height of the object \(j\). If you consider balls moving in the gravitational field of the Earth, the total sum\[

E = \frac{mv^2}{2} + mgh

\] is conserved. The ball may reach higher altitudes but it's slower over there, and when it drops, its velocity is increasing so that \(E\) is constant. By summing over all objects and their pairwise interaction energies, this formula is easily generalized to the case of many particles. Also, these formulae assume that the objects are "point masses" and their internal motion (spin) is neglected. However, an object that is extended and that may be rotated may also be decomposed into small particles (imagine atoms). When you sum their kinetic energies, the kinetic energy of the internal motion (spin) of the extended object is easily shown to be\[

E_{\rm rot} = \frac{I\omega^2}{2}

\] where \(I\) is the moment of inertia with respect to the axis of rotation\[

I = \int \dd m\cdot \rho_{\dd m}^2

\] and \(\omega\) is the angular frequency of the rotation. Note that the energy of a spinning object is again proportional to "something like the mass", the moment of inertia \(I\), and it increases with the second power of "something like the velocity", namely the angular velocity.
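The definition of \(I\) can be illustrated by a direct numerical integration. A sketch (with a made-up mass and length) for a thin uniform rod spinning about its center, where the textbook answer is \(I = Ma^2/12\):

```python
# Compute I = sum of dm * rho^2 for a thin uniform rod of mass M, length a,
# rotating about its center, and compare with the known result M a^2 / 12.
# (Mass and length are invented illustrative numbers.)
M, a, N = 2.0, 3.0, 100000
dm = M / N                               # mass of each small piece
I = 0.0
for k in range(N):
    rho = -a / 2 + (k + 0.5) * a / N     # distance of piece k from the axis
    I += dm * rho * rho

print(abs(I - M * a * a / 12) < 1e-6)    # True: the midpoint sum converges
```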

However, balls made of overly light materials quickly slow down due to friction. It seems that the energy is lost. However, it isn't lost without a trace. It actually heats up the balls or the air. If you carefully measure the temperature change of the ball and other objects and multiply the temperature changes by the heat capacity \(C\),\[

\Delta E_{\rm thermal} = \Delta T\cdot C

\] you will obtain the thermal energy hiding in the materials. This linearized formula is analogous to \(mgh\). In reality, the heat capacity \(C\) depends on \(T\) as well so the exact thermal energy stored in a piece of material is nonlinear, much like \(-GMm/r\) was nonlinear (the formula for the thermal energy of a material is different and more messy than the gravitational potential energy in general).
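A toy calculation (all numbers invented) showing how small the temperature rise is when a ball's kinetic energy is fully converted to heat via \(\Delta T = \Delta E / C\):

```python
# Temperature rise when kinetic energy becomes heat (invented numbers,
# and the heat capacity is assumed constant over this small range).
m, v = 0.2, 10.0          # ball: 0.2 kg moving at 10 m/s
C = 100.0                 # heat capacity of the ball [J/K]

E_kin = 0.5 * m * v * v   # 10 J of kinetic energy
dT = E_kin / C            # temperature rise if all of it becomes heat
print(dT)                 # 0.1 (kelvins) - a tiny, hard-to-notice warming
```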

At any rate, if you observe the velocity, position, and temperature of the moving balls (and the surrounding air) and add the relevant pieces of energy up, you will get a quantity that is conserved i.e. whose value isn't changing with time. James Joule was the guy who clarified how the mechanical forms of energy – potential or kinetic energy – may be converted to heat (energy of thermal form that manifests itself by increasing the temperature of objects) and how much heat you actually get. Before Joule, heat would be measured by different units – calories (we still use them for nutrition values of foods) – but after Joule, it was clear that one may use the same unit for both, much like we use the same unit for horizontal and vertical distances. To celebrate Joule, the modern unit of energy/work/heat is named after him.

I have mentioned several forms of energy. Point masses have kinetic and gravitational potential energy, rotating objects have the kinetic energy of spin, warmer objects have a higher thermal energy which may be transferred as heat when they're in contact (or created by friction). We need to add many more terms in the overall energy for our discussion to be sufficiently comprehensive.

When you consider charged objects, they have an electrostatic potential energy,\[

E_{\rm elst} = \frac{Q_1 Q_2}{4\pi\epsilon_0 R}.

\] Similar formulae exist for magnetostatic potential energy and so on. The electromagnetic field itself carries energy whose density per unit volume is\[

\ddfrac{E_\text{elmg. field}}{V} = \rho_\text{elmg. field} = \frac 12\zav{ \epsilon_0 E^2+ \frac{1}{\mu_0} B^2}.

\] Electromagnetic waves carry nonzero energy even if you can't account for the sources that created them. These waves may also be interpreted, in quantum theory, as a flow of photons whose energy is \(E=\hbar \omega\) per photon where \(\omega\) is the angular frequency of the photon.
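A quick sketch computing the energy of one photon of visible light from \(E=\hbar\omega\); the 530 nm wavelength of green light is my illustrative choice:

```python
# Energy of a single photon of green light via E = hbar * omega.
import math

hbar = 1.054571817e-34         # reduced Planck constant [J*s]
c = 299792458.0                # speed of light [m/s]

lam = 530e-9                   # wavelength of green light, 530 nm (my choice)
omega = 2 * math.pi * c / lam  # angular frequency [rad/s]
E = hbar * omega               # energy per photon [J]
print(E)                       # about 3.7e-19 J, i.e. roughly 2.3 eV
```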

In special relativity, the mechanical energy \(mv^2/2\) is replaced by\[

E_{\rm rel,kin} = \frac{mc^2}{\sqrt{1-v^2/c^2}}.

\] If you Taylor expand this formula around \(v/c=0\) which is useful for \(v\ll c\), you get\[

E_{\rm rel, kin} = mc^2+\frac{mv^2}{2} +\frac{3mv^4}{8c^2}+\dots

\] Aside from increasingly negligible terms such as one whose coefficient is \(3/8\), you also see the usual non-relativistic energy \(mv^2/2\) in between – it's what we need to reduce relativity to the Newtonian physics. But there's also the huge leading term \(E=mc^2\), Einstein's famous addition to the energy of the world. This latent energy hiding in every piece of mass may be released by annihilation or by evaporation of a black hole that devoured the mass.
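The expansion can be verified numerically. In the sketch below (a 1 kg mass moving at \(v=0.1c\), my own choice of numbers), including the term with the \(3/8\) coefficient shrinks the error of the approximation by roughly two orders of magnitude:

```python
# Sanity check of E = m c^2 / sqrt(1 - v^2/c^2)
#              ~ m c^2 + m v^2 / 2 + 3 m v^4 / (8 c^2)  for v << c.
import math

m = 1.0                      # 1 kg (illustrative)
c = 299792458.0              # speed of light [m/s]
v = 0.1 * c                  # fast, but still slow enough to expand

exact = m * c**2 / math.sqrt(1 - v**2 / c**2)
two_terms = m * c**2 + 0.5 * m * v**2
three_terms = two_terms + 3 * m * v**4 / (8 * c**2)

# The 3/8 term shrinks the remaining error by roughly a factor of 100
print(abs(exact - two_terms) / abs(exact - three_terms) > 50)  # True
```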

And a part of this energy may be released if you change the structure of the material. Thermonuclear fusion releases roughly \(mc^2/100\), fission gives us ten times less, \(mc^2/1,000\), and chemical reactions release about 1 million times less than the nuclear reactions (the million also arises as the ratio of \(1\MeV\) and \(1\eV\) which are the estimated energies coming from one atom/nucleus in nuclear and chemical reactions, respectively), i.e. about \(mc^2/100,000,000\).
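A back-of-the-envelope script using the rough \(mc^2\) fractions quoted above (order-of-magnitude figures only, not precise reaction data):

```python
# Rough energy yields per kilogram of fuel, using the mc^2 fractions
# from the text (order-of-magnitude estimates, not measured reaction data).
c = 299792458.0
mc2 = 1.0 * c**2                # rest energy of 1 kg, about 9e16 J

fusion = mc2 / 100              # ~9e14 J per kg
fission = mc2 / 1000            # ~9e13 J per kg
chemical = mc2 / 100_000_000    # ~9e8 J per kg, i.e. about 1 GJ

print(fusion / chemical)        # the quoted factor of about a million
```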

We got to this "transmutation of materials" through a discussion of special relativity. But long before relativity, people knew some chemistry and they knew that energy – e.g. heat – may be obtained not only by friction (recall our discussions of balls with friction) but also by burning objects, e.g. fire. Burning one kilogram of a particular material in a particular way releases a particular amount of energy (which depends on the material and the way of burning etc.). This chemical energy, chemistry's contribution to the overall energy, has to be added if you want the total energy to be conserved. However, in physics, chemistry may be reduced to the reorganization of electrons and their motion around nuclei so all the chemical energy may be replaced by a more accurate formula for the kinetic and (mostly electrostatic) energy of electrons (and nuclei) in the atoms and molecules.

I have mentioned the kinetic energy of spinning bodies and materials' thermal and chemical energy – and the whole \(E=mc^2\) which is hard to release unless you manage to do something "truly existential" with the material. But in thermodynamics, there are other forms of energy of materials, too. In particular, if you compress gas to a high pressure (or any other material, for that matter, but it's harder for other materials), the gas wants to expand back to the original volume (for which the pressures are balanced) and it is willing to do work to achieve its dream. The ability to do work is always – and this situation is no exception – a measure of the internal energy.

So along the "trajectory" in the space of states of the gas that may be reached without any heat exchange (adiabatic processes), the energy of a body of gas is a decreasing function of the volume. The infinitesimal change of the energy, the work \(p\cdot \dd V\), may be converted either to heat and change the temperature, or it may do mechanical work (and, for example, change the potential energy of heavy solid bodies), or do other things. One must understand all these conversions and which of them occurs if he studies things like aerodynamics or atmospheric physics, of course. Recall our debates related to the greenhouse effect and climatology in general.
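For an ideal gas, the adiabatic work can be written in closed form because \(pV^\gamma\) is constant along the adiabat, so \(W=\int p\,\dd V = (p_1V_1-p_2V_2)/(\gamma-1)\). A sketch with invented initial conditions:

```python
# Work done by an ideal gas in a slow adiabatic expansion.  Along the
# adiabat p V^gamma = const, so W = (p1 V1 - p2 V2) / (gamma - 1).
# (Initial pressure and volumes are invented illustrative numbers.)
gamma = 1.4                     # diatomic gas, e.g. air
p1, V1 = 2.0e5, 0.01            # 2 bar, 10 liters
V2 = 0.02                       # expand to 20 liters
p2 = p1 * (V1 / V2) ** gamma    # pressure after the adiabatic expansion

W = (p1 * V1 - p2 * V2) / (gamma - 1)
print(W)                        # about 1.2 kJ extracted from internal energy
```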

I could go on and go on and go on. The importance of the additional examples would arguably decrease. And I could also discuss the price of 1 kWh of electrical energy and lots of values of energy in the real world. However, I want to address one conceptual question. You may be surprised that pretty much any kind of natural phenomenon contributes to the overall energy. Well, that's a good question. You're actually far too modest. Every type of phenomenon in Nature has to contribute to the energy for a fundamental reason that was already mentioned at the beginning. The total energy – in abstract and quantum mechanics known as the Hamiltonian – is given by a formula and this formula actually determines the evolution of anything and everything in time.

Any quantity (position, velocity, electric field at a given point, volume of a gas, the percentage of an isotope or carbon dioxide in it, and so on) that is evolving in time is evolving because it doesn't commute with the Hamiltonian – with the operator of energy. So the energy has to depend on a related "complementary" quantity describing very similar things. It's not hard to realize that it is the reason why the total energy has to depend on "really everything that may change and that may be measured in the world".

The energy conservation law – one of the defining reasons why we talk about energy at all – is only one real constraint on the degrees of freedom in your system. If you have too many, it tells you "almost nothing" about their evolution. However, the formula for the energy – for the Hamiltonian – isn't just a curiosity, a number that accidentally happens to be conserved. It determines the evolution of the whole Cosmos and every piece of it in time.
