Thursday 31 January 2013

Czech temperature record 1961-2012

Because Frank E. L. asked me about related issues, I downloaded all of the Czech Hydrometeorological Institute's monthly, regional temperature data for the years 1961-2012 into Mathematica and calculated all the statistical quantities I considered interesting.
Here is the PDF preview of the notebook.
Click on the red tile. Let me describe what I have found.

First, I had to figure out the right URLs of the HTML pages that contain the tables in a sufficiently importable format, import them into Mathematica with some sensible formatting options so that they quickly become an array, and replace the decimal commas by decimal points.
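
If you want to play with it yourself, this step is a few lines of Mathematica – a minimal sketch, with a hypothetical URL and a hypothetical row pattern (the real CHMI addresses and table layouts differ in the details):
(* a minimal sketch; the URL and the row pattern are placeholders, not the real CHMI specifics *)
url = "http://portal.chmi.cz/some-hypothetical-path/T1961-2012.html";
raw = Import[url, "Data"];                        (* the HTML tables as nested lists of strings *)
rows = Cases[raw, {_String, __}, Infinity];       (* keep the table rows, one per region *)
toNumber[s_String] := With[{t = StringReplace[s, "," -> "."]},
   If[StringMatchQ[t, NumberString], ToExpression[t], s]];   (* decimal comma -> decimal point *)
toNumber[x_] := x;
data = Map[toNumber, rows, {2}];                  (* region names followed by the monthly numbers *)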

When we're done, we have the temperature for each of the 12 months of each of the 52 years for each of the 1+13 regions of the Czech Republic. Just to be sure, Czechia has 14 administrative regions (the division was different in the communist era and it's not one of the things in which I considered communism to be worse: I don't really care). In this dataset, however, Prague is unified with the Central Bohemian region that surrounds the capital, which means that we only have 13 "weather regions", but they're sometimes supplemented by a new 14th "region" (put at the beginning) which is the whole country.

Well, your humble correspondent rewrote the names of the regions with the right character set and diacritics. They were still displayed incorrectly in the PDF and most readers can't read Czech, anyway. ;-)




Now, I have checked the consistency of the data – the web pages show the actual monthly temperature for each month/year/region combination, the normal temperature for this month/region combination (that should be independent of the year), and their difference, the temperature anomaly.

The "normal temperatures" were listed identically almost everywhere. The only mistake appeared in January 1961 (the first year) for the Southern Bohemian region which says +2.8 although all other years agree it should be –2.8 – all these numbers are in Celsius degrees. So I suppose +2.8 is a typo which also makes the anomaly for that single month/region combination incorrect.

Then I verified whether their anomalies are really equal to the differences between the actual temperatures and the normal temperatures. It turned out to be exact almost everywhere. However, there were 49 month/year/region combinations where the difference was plus or minus 0.1 Celsius degree, probably due to rounding. Curiously enough, all these mismatches (a tiny minority of the month/year/region combinations: such rounding errors should have occurred in roughly 1/2 of the data, i.e. in thousands of combinations, if an algorithm capable of producing rounding errors had been used everywhere) appeared in the years 1974, 1981, 2003, 2005, and 2012. One may see some inconsistency in the rounding schemes – it was probably done by different people or at different moments for different years.
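
For the record, the check itself is trivial once the three tables are reshaped into parallel arrays – a sketch assuming hypothetical 52 x 12 x 14 arrays actual, normal, and anomaly indexed by year, month, and region:
diff = actual - normal - anomaly;                 (* zero wherever the listed anomaly is exact *)
badCells = Position[diff, x_ /; Abs[x] > 0.05];   (* the mismatching year/month/region triples *)
Length[badCells]                                  (* should reproduce the 49 combinations above *)
Union[Round[Abs[Extract[diff, badCells]], 0.01]]  (* and all of them should be off by 0.1 deg C *)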

At any rate, the actual temperatures seem trustworthy enough. I haven't verified whether the tables did the correct averaging over the years and the correct averaging over the 13 weather regions – some new rounding issues could be found and discussed here as well, I guess.

So I calculated my own normal temperatures. For each region, they're a pretty nice sine that goes from –4 through –2 °C in January to +15 through +18 °C in July. You may see that the differences between the regions are slightly larger in the spring and the summer than they are in the fall or the winter.
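
A sketch of this step, with the same hypothetical array actual[[year, month, region]] as above:
myNormal = Mean[actual];              (* averaging over the first (year) index: a 12 x 14 table of normals *)
ListLinePlot[Transpose[myNormal]]     (* one roughly sinusoidal annual curve per region *)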

The maximum temperature anomaly for a month/year/region combination was 6.3 degrees. The root mean square anomaly was 1.93 °C. Yes, if you average the temperature over a month, the monthly averages give you a pretty nice Gaussian centered at the "normal temperature", with a standard deviation of almost two degrees. That's true in the Czech Republic. The numbers may differ in your country or region. The fluctuations are likely to be smaller near the sea and larger deep inside the continents.
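
The corresponding anomalies, their root mean square, and the histogram, continuing the sketch above:
anom = Flatten[actual - ConstantArray[myNormal, 52]];   (* one anomaly per month/year/region combination *)
Sqrt[Mean[anom^2]]                                      (* should come out close to the 1.93 deg C above *)
Histogram[anom, {0.5}]                                  (* a roughly Gaussian bell centered at zero *)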

When I divided the anomalies into individual years, the histograms were much less Gaussian and more "noisy". In different years, the root mean square anomaly went from 1.2 to 3.0 °C or so. When I divided the anomalies into the 13 or 14 individual regions, the Gaussians were a bit smoother and all the root mean square values of the anomalies were between 1.85 and 2.05 °C.

Now, the trends. Note that when we calculate the trends, the year variable "drops out" because it's already been used to calculate the trend. So we only have at most 12*14 = 168 month/region combinations for which the trend may be computed. The mean trend is 2.8 °C per century – Czechia has obviously seen a faster rate than the globe, which has seen close to 1 °C in the last 50 years. The histogram is somewhat noisy and almost all the entries show a trend between 0 and 6 °C per century.
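
A sketch of the trend calculation, again with the hypothetical actual[[year, month, region]] array; the slope of a linear fit over the 52 years is converted to °C per century:
years = Range[1961, 2012];
trend[m_, r_] := 100*LinearModelFit[Transpose[{years, actual[[All, m, r]]}], t, t]["BestFitParameters"][[2]];
trends = Table[trend[m, r], {m, 12}, {r, 14}];   (* the 12 x 14 month/region trends in deg C per century *)
Mean[Flatten[trends]]                            (* should reproduce the 2.8 deg C per century above *)
ListLinePlot[Mean /@ trends]                     (* one averaged trend per calendar month: the zigzag discussed below *)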

When the trends are divided into the 14 individual regions, they are all between 2.45 and 3.15 °C per century – we still have the degeneracy over the months. All these differences between the regions may be described as noise.

However, a shocking and kind of curious observation is that the temperature trend vastly depends on the month – from January to December – if all the regions are lumped together. It changes in a zig-zag way. The trend for Octobers is close to 0.2 °C per century, virtually zero, while the trend for Mays and Augusts exceeds 4.3 °C per century.

Do you have an explanation for this zigzag behavior?

A possible explanation you may suggest is that due to leap years, Januaries of different years are "different parts of the year", and similarly for the other 11 months. However, this explanation seems to produce far too weak an effect. At most, the shift of a month is by half a day in one way or another. Seasonally, half a day only makes 0.1 °C of a difference and a vast majority of these effects gets averaged out, anyway, because among the 52 years, some of them will have excesses and some of them won't.
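
To quantify that claim with a back-of-the-envelope estimate, model the seasonal cycle of the monthly normals as a sine with the amplitude \(A\approx 10\,^\circ{\rm C}\) (from roughly \(-3\,^\circ{\rm C}\) in January to \(+17\,^\circ{\rm C}\) in July); the steepest possible drift is then\[

\left|\frac{dT}{dt}\right|_{\rm max}=\frac{2\pi A}{365\,{\rm days}}\approx 0.17\,^\circ{\rm C}/{\rm day},

\] so shifting the calendar by half a day moves a monthly mean by at most \(0.09\,^\circ{\rm C}\) or so – the \(0.1\,^\circ{\rm C}\) quoted above.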

So we are forced to conclude that there's been no climate change in Octobers, almost no climate change in Septembers, and a negligible climate change in Februaries. ;-) Does a similar pattern exist at other places? Do you have an explanation of this strong dependence on the month?

One more hint: it could have something to do with other drivers such as aerosols. Maybe the no-trend months – October, September, February – are the worst smog months and smog has contributed a negative term to the warming trend, a term which only operates when there's actually smog? Do you have a better idea? What differences should be expected from a sensible statistical/weather model you would think of?

Evolving portrait of the electron

About 150 years ago, people began to do the experiments that would lead to the discovery of the electron.



They would study the electrical conductivity of rarefied gases. Finally, in 1897, J.J. Thomson and his collaborators proved that the colorful fog coming from the cathodes is composed of individual discrete corpuscles, the electrons.




It just happened that within 10 years, electrons were seen by Henri Becquerel in a completely different context – as the radioactive beta-radiation of some nuclei. In 1909, Robert Millikan and Harvey Fletcher performed their oil-drop experiment. The charge of an oil drop may be as small as one elementary charge, which translates into an elementary force acting on the drop in an external electric field. Another force acting on the oil drops is the friction force, and the equilibrium between these forces determines the terminal speed of the oil drops.
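
Schematically – a textbook sketch using Stokes' drag for a small sphere of radius \(a\) in air of viscosity \(\eta\); Millikan's real analysis added buoyancy and other corrections – the two terminal velocities obey\[

mg = 6\pi\eta a\,v_{\rm fall}\;\;\text{(field off)},\qquad qE = mg + 6\pi\eta a\,v_{\rm rise}\;\;\text{(field on)},

\] so the measured speeds determine the charge \(q\) (the drop's radius \(a\) follows from \(v_{\rm fall}\) and the oil density), and \(q\) always comes out as an integer multiple of \(e\).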

The 1910s and 1920s were all about the "quantum motion" of the electrons: people started to understand the structure of the atoms. First, Bohr offered his classical planetary model of the atom equipped with the extra ad hoc quantization rules for the orbits. Finally, in the mid 1920s, the correct laws of quantum mechanics were found and replaced Bohr's model that wasn't quite correct even though it had some desirable properties that had partially captured the "spirit" of the coming quantum revolution.

Motion of the electron vs structure of the electron

I want to spend the bulk of this article on discussions of the internal structure of the electron – and how the picture of it has evolved over the years. That's why I find it very important to clarify a widespread misconception. What is it? People tend to confuse the wave function of the electron in space with the internal structure of the electron. They're completely different things.

The wave function of the electron in a hydrogen atom is a "cloud" of radius 0.1 nanometers or so. This distance scale doesn't determine the order-of-magnitude size of the electron; instead, it determines the size of the atom. Imagine that an electron is a ball of radius \(r_e\) for a while. The center of this ball may be located at various points in space. In the hydrogen atom, the location of the center of this electron ball is undetermined and the uncertainty is approximately 0.1 nanometers (Bohr radius times two). But the size of the ball – the electron – is a completely different, independent length that is much shorter.

Classical radius of the electron

Hendrik Antoon Lorentz began to discuss the radius of the electron as early as 1892, five years before Thomson's discovery of the electron. He would offer a clever idea that the electrostatic potential energy of the electron – a distribution of the electric charge – is equal to the latent energy \(E=mc^2\) stored in the electron's rest mass. However, it was 13 years before Einstein found his special theory of relativity, so all these "early glimpses" of relativity always had some bugs in them. For example, Max Abraham would claim that the aether theory implied \(E_0=(3/4)m_e c^2\) as the relationship between the interaction energy and the mass. Many of the dimensionless numerical coefficients of order one were simply wrong.

By the classical electron radius, we usually mean\[

r_e = \frac{1}{4\pi\varepsilon_0} \frac{e^2}{m_ec^2}\approx 2.818 \times 10^{-15}\,{\rm m}.

\] If you place two charges \({\mathcal O}(e)\) at the distance \(r_e\) from one another, the electrostatic potential energy between them will be \(m_e c^2\). It's not shocking that the electrostatic potential energy of a "diluted electron" represented as a sphere of radius \(r_e\) or a ball of this radius is of the same order, up to coefficients such as \(3/5\) or \(1/2\).

You should note that the classical electron radius is some 10,000-100,000 times shorter than the Bohr radius. The electron is much, much smaller than the atoms. For the sake of completeness, let me also mention that there exists another length, the (reduced) Compton wavelength of the electron \(\hbar/m_e c\approx 3.9\times 10^{-13}\,{\rm m}\), which is exactly the geometric average of the classical radius of the electron and the Bohr radius – logarithmically, it sits right in between. The longer Bohr radius and the shorter classical electron radius are the fine-structure-constant times longer and shorter, respectively.
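
To make the geometric-average claim explicit, write all three lengths in terms of the fine-structure constant \(\alpha\approx 1/137\):\[

r_e = \alpha\,\frac{\hbar}{m_e c},\qquad a_0 = \frac{1}{\alpha}\,\frac{\hbar}{m_e c}\qquad\Rightarrow\qquad \sqrt{r_e\,a_0}=\frac{\hbar}{m_e c}\approx 3.9\times 10^{-13}\,{\rm m},

\] where \(a_0\approx 0.53\times 10^{-10}\,{\rm m}\) denotes the Bohr radius.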

But let's return to the main story. If the electron is visualized as a sphere of radius \(r_e\), the electrostatic potential energy is of order \(E_0=m_e c^2\) which allows us to say that all of electron's mass actually arises from the electrostatic energy. Of course, for this "explanation" to be a meaningful one rather than an example of circular reasoning, we should also find a theory explaining why the electron wants to keep its radius at \(r_e\), why it doesn't want to blow up. Recall that the like-sign charges repel so "halves" or other "pieces" of the electron want to repel from the rest, too.

If you want to succeed in this task (a task that is misguided, however, as we will mention in a moment), linear electrodynamics is no good. People proposed various non-linear modifications of classical electrodynamics, most famously the Born-Infeld model in the 1930s. The Lagrangian\[

\mathcal{L}=-b^2\sqrt{-\det\left(\eta+{F\over b}\right)}+b^2

\] reduces to the Maxwell Lagrangian \(-\frac{1}{4}F_{\mu\nu}F^{\mu\nu}\) if \(F_{\mu\nu}\to 0\) but for larger values of the electromagnetic field strength, it develops non-linear terms that prevent the electron from shrinking to \(r_e\to 0\) or exploding to \(r_e\to\infty\). There are infinitely many nonlinear deformations of classical electrodynamics one could think of. But remarkably enough, the Born-Infeld action is exactly what was derived in the 1990s as the only right answer for the electromagnetic fields on D-branes in string theory. This fact showed that Born and Infeld had a remarkably good intuition for the "right equations". Everyone who finds an equation that later turns out to be "unique" by insights of string theory may count as a visionary of a sort.
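
For the sake of completeness, the weak-field expansion follows from the standard identity for the \(4\times 4\) determinant,\[

-b^2\sqrt{-\det\left(\eta+{F\over b}\right)}+b^2 = -b^2\sqrt{1+\frac{F_{\mu\nu}F^{\mu\nu}}{2b^2}-\frac{(F_{\mu\nu}\tilde F^{\mu\nu})^2}{16b^4}}+b^2 = -\frac{1}{4}F_{\mu\nu}F^{\mu\nu}+{\mathcal O}\left(\frac{F^4}{b^2}\right),

\] so Maxwell's theory is recovered for weak fields while the non-linear corrections only kick in when \(F\sim b\), i.e. near the core of the electron.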

Renormalization: how the attempts to regularize the electron became obsolete

If the electron radius were exactly \(r_e=0\), its electrostatic interaction energy would be infinite. This is true even in classical (non-quantum) physics. However, these infinities arising from short distances – and \(r_e\to 0\) is an example of short distances – reappear in quantum physics all the time. While the Born-Infeld action was an attempt to get rid of the infinities in classical physics, many attempts to remove analogous divergences have been made in quantum physics in general – and quantum field theory in particular.



The first image, (a), is the leading correction to the electron's self-energy. It's a Feynman diagram most directly corresponding to the classical electron's self-interaction, electrostatic energy. Because of this diagram, the electron mass (and other quantities) is modified by an infinite amount. The infinity arises from the part of integrals in which the two interaction points (events in spacetime from the Feynman diagram) are very close to each other i.e. \(x\to 0\) or, equivalently, in which the loop momentum \(p\to\infty\). This region of the integration variables is known as the ultraviolet (UV, short-distance) region. The other two diagrams – and many others – contribute their own UV divergences to the dynamics, too.

However, starting from the 1940s, people learned how to subtract these UV divergences. Even though the individual terms may be infinite, when all of the terms are properly summed and a finite number of "genuinely physical" parameters is set to their measured values, all the infinities cancel and the predictions for all "genuinely physical" quantities will be finite. This is the magic of the renormalization.

Renormalization has been emotionally frustrating to various people – including giants such as Paul Dirac – but this dissatisfaction was always irrational. What's important in science is that one has a well-defined procedure to extract the physical predictions and these predictions agree with the observations. QED and other quantum field theories supplemented with the renormalization technique can't get a different grade than A, at least in the subject of science. They could get a D or worse from philosophy or emotions but a bad grade from philosophy may often be a reason to boast.

Ken Wilson's concept of the renormalization group from the 1970s gave us a new way to understand why renormalization worked. Before Wilson, people would think it was necessary to imagine that the electron had to be exactly point-like and the subtraction of the infinities – and they had to be strict infinities – was an essential part of the game. However, after Wilson, people would interpret the renormalization differently. They would say that they work with an "effective theory" that is able to predict all sufficiently long-distance, low-energy processes.

This effective theory doesn't force you to believe that the electron is exactly point-like. Instead, you may imagine that it has a nonzero size and its inner architecture may be "pretty much anything". The effective theory allows you to be agnostic about the inner architecture of the electron. It allows you to prove that whatever the internal structure of the particles is, the predictions for all the long-distance, low-energy phenomena will only depend on a finite number of constants such as \(m_e,m_\gamma,e\) – constants which may in principle be calculated as functions of the internal architecture of the particles. All the other details about the internal structure make no impact on the long-distance, low-energy observables!

To say the least, it's a way of thinking about the divergences that makes the whole process of renormalization more acceptable, more philosophically pleasing. The divergent terms may be finite, after all. And while the calculations become most beautiful if the electron is strictly point-like, you're allowed to imagine it is not exactly point-like but you may prove that almost all the dependence on the messiness of a "finite-size electron" evaporates if you study long-distance, low-energy processes only. According to the renormalization group's philosophy, infinities in renormalization that cancel are shortcuts for unknown large yet finite numbers whose detailed value is mostly irrelevant.

Going to high energies

Well, while this is perfectly sufficient for all questions that affect atomic physics, you may still want to know what is hiding inside the electron; you may want to go to high energies and short distances, either theoretically or experimentally. When you look at the electron, it seems obvious that its size has to be much smaller than the classical electron radius, probably at least 2-3 orders of magnitude shorter than that. The Standard Model allows you to "create" electrons by the Dirac field \(\Psi(x,y,z,t)\) which depends on the point in spacetime – so the electron is apparently created at a single point only. And the interactions are perfectly local, too. In this sense, the electron is exactly point-like in the Standard Model – although the electron is obviously acting on objects in its vicinity in various ways as well. And we know that the Standard Model is a good theory for all distances longer than \(10^{-19}\,{\rm m}\) or so, which implies that the internal structure of the electron can't be larger than that.

So what is inside the electron according to the cutting-edge theories?

In the 1970s, people proposed preons. Quarks and leptons could be composite particles much like protons and neutrons. It wouldn't be the first time that the "indivisible" particles of the day were divided into smaller pieces. In other words, it wouldn't be the most original idea about how to make further progress in physics. However, when one looks at the preon models, they don't seem to work well, they predict lots of new particles that don't exist according to the experiments, and they don't seem to help solve any open puzzles in physics.

In other words, the evidence is now pretty strong that if you want to stay at the level of point-like quantum field theories, electrons are strictly point-like particles. It doesn't mean that electrons are strictly point-like in general, however. If you upgrade your physics toolkit to string theory, the only known (and quite possibly, the only mathematically possible) framework that goes beyond that of point-like-particle-based quantum field theory, the effective field theories derived from all the convincing and viable vacua of string theory will look at the electron as an exactly point-like particle. However, if you look at the electron with the string accuracy, it's still a string.



In most vacua, the electron is a closed string although models where the electron is an open string exist, too. An electron as a compact brane is in principle possible as well but it is much more exotic and maybe impossible when all the known empirical constraints are imposed. The typical size of the string is of order \(10^{-34}\) meters although models where it's a few orders of magnitude longer also exist. At any rate, the size of the string hiding in the electron is incomparably shorter than the classical electron radius.

You may interpret a string as a chain of "string bits". In this sense, a string is a composite system that has many internal degrees of freedom, much like atoms and molecules. However, the stringy compositeness has some advantages that allow us to circumvent problems of the preon models. I discussed them in the article Preons probably can't exist three months ago.

Because string theory suggests that the internal size of the "things inside the electron" is much shorter than the classical electron radius, you may rightfully conclude that the most modern research has led to the verdict that the classical electron radius isn't such an important length scale. You may calculate it from the electron mass and the elementary charge; however, nothing too special is happening at the distance scale comparable to the classical electron radius. Instead, the size of "things inside the electron" may be much shorter than the classical electron radius and the electron emerges as a rather light particle because its interaction with the mass-giving Higgs field is rather weak – and because the Higgs condensate is rather small, too (the latter fact seems to be "unlikely" from a generic short-distance viewpoint: this mystery is known as the hierarchy problem).

Long distances: everything is clear

I think it's appropriate to emphasize once again that all these ambitious questions about the internal structure of the electron make pretty much no impact on the behavior of the electron in atoms and other long-distance situations. If we know that there is one electron in a state, this electron is fully described by its momentum (probabilistically, by a wave function) or position (by another wave function) and by its spin (one qubit of quantum information). And yes, the electron is spinning, after all.

The transformation of the spin degree of freedom to another basis – a basis connected with a different axis than the \(z\)-axis – is completely understood and dictated by the group theory applied to the group of rotations. Many people tend to be misguided about this point. They think that if they "deform" the electron by equipping it with some non-linear terms or by seeing its internal stringy or preon-based structure or by acknowledging the fields around the electron, the states of the electron will deviate from the simple Hilbert space whose basis is given by the states \(\ket{\vec p,\lambda}\) where \(\lambda\) is "up" or "down".

But this isn't possible. Even if you incorporate all the facts about the electron's structure, its interactions with all other fields, preons or (more likely) strings that may be hiding inside, as well as higher-order interaction terms that we neglect in the Standard Model, it's still exactly true that what I wrote is the basis of the electron's Hilbert space. This claim follows from the spacetime symmetries – and the basic, totally well-established facts about the electron such as \(J=1/2\). The symmetries are not only beautiful but, as the experiments show, they hold in Nature. Your full theory – including all the corrections and subtleties – must conform to them.

The behavior of the electron at long enough (atomic and longer) distances is described by the Dirac equation. When the speed of the electron is much smaller than the speed of light, you may simplify the Dirac equation to the Pauli equation which is nothing else than the non-relativistic Schrödinger's equation with an extra qubit, two-fold degeneracy for the spin (but the operator \(\vec S\) doesn't enter the Hamiltonian, at least not in the leading approximation in which the spin-orbit and other relativistic interactions are neglected).

Well, the electron also acts as a tiny magnet whose magnetic moment is a particular multiple of the spin, \(\vec\mu\sim \vec S\): we have to add \(-\vec\mu\cdot \vec B\) to the Hamiltonian. The coefficient relating \(\vec \mu\) to \(\vec S\) may be calculated from the Dirac equation, up to the 0.1% corrections from loop processes in Quantum Electrodynamics. The magnitude of the Dirac-equation-calculable magnetic moment is twice as large as what we would expect from a classical "spinning charge/current" of the same magnitude.
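
Quantitatively, the Dirac equation gives the electron the gyromagnetic ratio \(g=2\) while the QED loops shift it by roughly a tenth of a percent,\[

\vec\mu = -g\,\frac{e}{2m_e}\,\vec S,\qquad \frac{g}{2}=1+\frac{\alpha}{2\pi}+\dots\approx 1.00116,

\] where \(e>0\) is the elementary charge; a classical spinning body with a uniform charge-to-mass ratio would give \(g=1\) instead.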

The electron may hide lots of wonderful new structure inside. However, the particle's behavior in the atoms is independent of these not-yet-settled mysteries. It's both good news and bad news. It's good news because our understanding of atoms and similar, relatively long-distance physical situations may be rather complete despite the incompleteness of our understanding of the internal structure. It's bad news for the same reason: the observations of the atomic and other phenomena can't tell us anything about the very short-distance physics even though we would love to learn about it.

Wednesday 30 January 2013

Feed URLs for blog categories

This is a purely technical blog entry. Magnus Andersson asked me to create a feed with the climate articles only.

Blog posts on this blog are assigned several categories or labels – see "other texts on similar topics" beneath each post or the list of categories or labels in the right sidebar. This feature only appears in the widget-heavy, green template, not in the mobile one.

If you have a favorite category, you may find its feed at URLs such as
http://motls.blogspot.com/feeds/posts/default/-/climate

http://motls.blogspot.com/feeds/posts/default/-/weather records
The second URL was mentioned to show that spaces may be included in the URLs. If your browser happens to have problems with spaces, replace each with %20

(I couldn't add the period after the last sentence because it would be misleading.)

Your humble correspondent has created a Feedburner copy of the climate feed – in a new format – on a special URL.




The Feedburner climate feed URL is at
http://feeds.feedburner.com/blogspot/trfclimate
If you want another category to be put on Feedburner, let me know. The Feedburner version of the blog-wide Disqus comment feed has also been created.

I don't expect the number of subscribers to these feeds to reach a number similar to the number of subscribers to the main TRF feed, which is about 17,000 right now (16,000 of which come via Google Feedfetcher), so it may be a waste of time to preemptively prepare too many feeds.

Let me mention that in the right sidebar of the widget-heavy, green template, you find several colorful feed icons such as
[feed icons: Google Reader (string theory blog), Findory (one sentence from each article of this blog), the physics blog feed, the physics blog comment feed, and the Luboš Motl Twitter feed]
which link to assorted feeds with or without Google Reader (add http://www.google.com/reader/view/feed/ right before the http URL of the feed), feeds for Disqus (whole blog), Twitter (whole blog: I am de facto not using Twitter for anything else aside from the automatic alerts to blog articles), and so on. Try to hover over the icons or click on them to see what they do.



If you don't know what feeds are and who is actually feeding whom by what, don't worry: the video above explains it.

Also, at the bottom of the Disqus comment section of each blog entry, there is a special "comment feed" for that article as well as a "subscribe via e-mail" link (try clicking it; the subscription may be undone by another click on the same link). You will also be subscribed to the feeds (or via e-mail?) if you star a discussion.

Monday 28 January 2013

A theory of everything is an important research project

Richard Feynman stressed that we shouldn't make preconditions about what our future description of Nature is going to look like:



Lisa Randall, a top phenomenologist whom I know very well, gave an interview to Nude Socialist in which she says that it's an illusion that physics is mostly about the search for the final theory (among other things: read it). To a large extent, her answers are similar to Feynman's.

Phil Gibbs wrote that we need to find a TOE, after all.




Despite the contradictions in the "spirit" of their answers, I agree with all three of these folks but I still think that some of the uncertainty in the first two people's comments is, to a certain extent, obsolete.

So, I agree with Feynman that it can't a priori be clear whether the Universe obeys the laws of a concise final theory that may be found after a finite time. It's critically important in science not to confine your reasoning by some assumptions whose validity isn't really certain – and sometimes not even justified – i.e. by dogmas.

On the other hand, I think that the evidence has accumulated that the alternative non-TOE scenario of the onion with infinitely many layers can't operate in Nature. Ken Wilson taught us to organize the phenomena in Nature according to their characteristic distance scale (or time scale or energy scale).

We may seemingly go to ever shorter distances and discover new and previously unknown layers of the onion, matryoshkas inside the larger matryoshkas, and so on. However, I am confident that we pretty much know that this "seemingly infinite" process inevitably stops at some point – the Planck scale. There are no distances shorter than the Planck scale that may be physically resolved, that make sense in the usual physical sense.

So once you describe all the effective theories – layers of the onion of knowledge – for all distance scales up to (longer than or equal to) the Planck scale, that's the end of the story. The last layer – the explanation how Nature behaves at the scale of the quantum gravity – will be the only task that you have to solve fully. No additional infinite hierarchy of effective theories can be squeezed over there. Because of the uncertainty principle's impact on the proper length that becomes severe and of order 100% near the Planck scale, energies that exceed the Planck energy no longer allow you to probe shorter distance scales and qualitatively new physical phenomena; instead, if the center-of-mass energy of a collision dramatically exceeds the Planck scale, you create a black hole (an ever larger one if you increase the energy) whose rough behavior is captured by the low-energy effective equations once again. No new physics emerges.

These were general comments boiling down to the Renormalization Group and the existence of gravity in our quantum world. But we have been collecting some precious, much more specific evidence for a few decades when it looked increasingly indisputable that string/M-theory is the final theory of everything. It apparently possesses everything that a final theory needs to have. It seems to be 100% robust and predictable: there's no way to "deform it" without spoiling its consistency. It allows no adjustable yet non-dynamical continuous dimensionless parameters. It seems that everything that's left is to understand string/M-theory more completely (including persistently confusing aspects such as the initial conditions and Big Bang singularity and the vacuum selection mechanisms in general if there are any) and isolate the solution that is relevant for the environment seen in Nature around us.

I could spend lots of time with mostly linguistic disclaimers that I don't find too deep or interesting here.

Of course, the term "theory of everything" has to be interpreted correctly – we/physicists don't know "everything", just the elementary laws to which (plus the knowledge of the initial state and "the right questions") everything may be reduced in principle. I think that people realize that a TOE has to be interpreted this carefully – as the theory of the elementary forces and building blocks (or the maths replacing them) only – and that science doesn't completely stop when you find a TOE, because lots of complex questions always remain. That's why I think that the opposition to the term TOE for this reason is really unsubstantiated. Also, it's obvious that even in particle physics, many physicists are working on many things that have almost nothing to do with the search for a final theory – and that don't even depend on its existence in any way.

Still, the fact that physicists are working on various things doesn't mean that they're equally important. I agree with Phil that the search for a TOE is a very important research industry in physics, one that – according to the present evidence – should be solvable in principle but one that is so ambitious that it's clearly impossible to promise any deadlines for the date when the problem will be fully solved. We're not there yet but the "TOE research program" has already generated lots of profound spin-offs that have been valuable even for those who don't give a damn about a TOE.

I have always found the possibility that there is a TOE important enough for much of my thinking time to be occupied by questions related to this project – although the particular things one is thinking about are always much more concrete, limited, and modest than overarching claims about a TOE (a fact that laymen may easily misunderstand, too: physicists are simply not meditating about a TOE, OM, TOE for hours in their office, they're looking at well-defined, seemingly more special questions). On the other hand, Lisa Randall isn't passionate about a TOE – a fact that is correlated with her being a phenomenologist rather than a string theorist. Needless to say, I view both approaches as important ones, but I consider the TOE-focused, string-theoretical approach to have been significantly more successful as a generator of progress in the last 35 years or so – a trend that is more likely than not to continue in the coming years, I think.

Sunday 27 January 2013

A visit to Crumlaw

Fun paper: More than 12 years after the LEP collider was closed (now the LHC sits in the same tunnel), the collaborations publish a paper on the search for charged Higgses. That's quite a delay! ;-)
On Sunday, I gave an interactive blackboard physics talk in Český Krumlov, an architectural and historical pearl of Southern Bohemia (14,000 inhabitants today), to a group of a dozen or so curious, wonderful young people.



I began like this. What is physics? It comes from the word physis, Nature. We started in ancient Greece 2,600 years ago, on a warm sunny evening when you just finished your shopping in a local market or agora and you looked at the night sky... ;-)

And so on: elements, particles, quantum mechanics, the double slit experiment, sound and light and aether, relativity, the LHC etc. (over two hours of the lecture was cut from The Big Bang Theory, as you may have noticed). Of course, most attendees had to be disappointed when I elaborated upon the physicists' opinions – well, conclusions, I may say on my own blog – that fewer supernatural phenomena are possible relative to what they believe. To mention a well-defined yet representative example, the idea that one may detectably increase the rate of the Higgs boson production by thinking about it intensely – by telekinesis of a sort – is very hot in Krummau these days. :-)

My hosts were very kind, creative, generous, and educated, and as they were voters for Schwarzenberg, I've heard some stories and arguments that would have increased the odds that I would vote for Schwarzenberg if I had heard those things before Friday – but probably not quite above 50 percent. ;-)

One of the hosts – a proud descendant of people of many ethnic groups and the Germans in particular (the Sudetenland border was crossing the town) – also clarified some facts about the life and events in the Sudetenland etc.; I also learned the answer to a question by a TRF reader why the maps of the low election turnout (today) coincide with the map of the Sudetenland (newly imported Romanian citizens were a part of the story).




After almost 3 hours of the talk, we were shown the local amazing castle and chateau, the 2nd largest in the Czech Republic after the Prague Castle (well, 1-2 buildings among 50), including some intimate "guts" that a very small number of people may ever see. In particular, we focused on the complex baroque wooden-and-rope-based technologies to switch backcloths in the local famous baroque theater. Truly impressive. I vaguely knew there was a theater but I was completely ignorant about what I was missing.

The castle manager – well, the first vice-castle manager after a boss, if I may put it this way – has clearly done an unbelievable amount of work on that place. The anecdotal evidence led me to the conclusion that the work of the people over there is significantly underrated and underpaid. There's a lot of everyday technological know-how surrounding the baroque (and older) "technologies" that coming generations may start to forget. Of course, most of these technologies have been superseded by modern ones but I still think it would be a pity if the sophisticated mechanical constructions became unusable for the living generations.

Saturday 26 January 2013

Klaus' successor: Miloš Zeman elected Czech president

I have "more positive than negative" opinions about both candidates but I voted for Miloš Zeman (*1944, see a sensible story in NYT) in both rounds of the first direct presidential elections because he seemed like a decent enough successor to Klaus to me – and I am not claiming that Zeman is quite in the same category as Klaus (yet). As the de facto founder of a non-communist, major, left-wing party (in Czechoslovakia and later Czechia) that other nations know, this guy who has lived as a pensioner in the last 10 years is one of the last active "founding fathers" of the post-revolutionary political system in my nation.

Whether or not the politicians – and the society – preserve the continuity and the respect for the first events of our modern democracy (instead of the currently widespread and fashionable, de facto anti-democratic, Hitler-like screaming that the system we have is intrinsically or even inevitably corrupt and so on) is something that I find very, very important. So is the genuine rule of law – as opposed to arbitrary decisions of some self-anointed people following some unwritten, would-be "moral" criteria; and so are genuine freedom and democracy – as opposed to freedoms that only exist for those who agree with the self-anointed ones and a "democratic" selection with only one allowed choice where everyone else is immediately attacked, demonized, and labeled unethical.



I am not hugging the trees in order to extract energy from them. I am just sometimes caressing the bark of such a tree that looks beautiful to me. I am not preparing my opinions at the moment when I am giving an interview or writing a book. They have to be clear to me a long time earlier. Our politics is grey and boring and people are already tired of ripoffs and scandals. But there's an act that is better than to curse the darkness: to light at least one candle. Who doesn't believe in himself shouldn't enter politics at all. To make fun out of other people is only possible if you can make fun of yourself, too. My worst vice is trustfulness but maybe it's my best virtue, too. I have always lived so that I didn't have to be ashamed of myself, and so that my daughter Kateřina didn't have to be ashamed of me. Each president should work for the future. And the future is also kids.

Despite his being officially and verbally a "leftist" politician and in spite of his opposition to Klaus in the early and mid 1990s (when he sometimes drove me up the wall – and I would consider him Sládek light: but the political struggle ultimately respected the etiquette), he seems like the more right-wing of the two candidates to me in the respects I find most important – the undesirability of power for the uncontrolled "civil society", opposition to the environmental movement and green parties, global warming skepticism, a hawkish attitude to foreign matters, especially in the Middle East, the defense of basic national interests and belief in the legitimacy of the expulsion of most Germans after the war, and even minor things such as his opposition to the independence of Kosovo.

In the recent two weeks, I was pretty much sure that Zeman would win. My methodology was to look at the results of the first round and estimate the percentage of the voters of other candidates that would switch to Zeman. All the remaining 3 candidates in the top 5 voted for Zeman – Jiří Dienstbier (because he is a social democrat), Jan Fischer (who supported Zeman indirectly, citing anti-nationalism as the problem with Schwarzenberg), and (surprisingly) Vladimír Avatar Franz (most of the aides to these folks, especially Franz, became Schwarzenberg voters but they were not too important). Getting just 60% out of these people's voters would be enough for a smooth Zeman victory – above 55% – despite the unlimited and insane pro-Schwarzenberg brainwashing in pretty much all the media during the last two weeks.




And that's where we are now, 70 minutes after the polls closed. Almost 57% is behind Zeman while 77% of the votes have already been counted. It's implausible that Schwarzenberg would still get the edge to win. Why? Only 23% of the votes or districts remain to be counted. Schwarzenberg's percentage is higher in Prague, where only 50% of the small districts have been counted, but the remainder is not enough for Schwarzenberg to win.

In Prague, Schwarzenberg is getting a 66-34 advantage over Zeman. Half of Prague (which remains uncounted) is the only major part of the votes that exhibits a significant pro-Schwarzenberg bias. Prague is 1/10 of the nation, 1/2 of Prague is therefore 1/20 of the nation, and because the bias in this 1/20 of the nation – still to be added – is just 66-34=32 percent in favor of Schwarzenberg, Schwarzenberg is expected to get a boost of 32/20 = 1.6 percent from this half of Prague once it is included. That's much less than the 7 percent he needs to win.

Update: at 15:18, 83% of the votes are counted and Zeman is still ahead with 56.5%. The share of counted votes in Prague, where Schwarzenberg still has 66%, is approaching 60%. Only 1/25 of the nation – 40% of Prague – has a chance to systematically distort the present percentages. But 32%/25 is much less than the 6.5% that the prince needs to eliminate.

Update: at 15:38, less than 100 minutes after the polls closed, 95% is already counted, Zeman stands at 55.6%, and 82% of Prague is already counted. The story seems settled to me, so I won't add further updates. Zeman will clearly win 55-45 or so.

Tough campaign, possible doubts

It's been a tough campaign in which Zeman was being linked to former communists, agents, criminals, and corruption etc. – all these things are so indirect that one could summarize them as bogus – while potentially anti-Czech, pro-German opinions and plans of the prince (who has lived in Austria for decades) had to be discussed but the media tried to hide them.

When the polls had already opened yesterday, Schwarzenberg became upset about a paid commercial in a tabloid (Blesk = Flash) that accurately summarized some of Schwarzenberg's opinions about the Second World War and its consequences and suggested – perhaps speculatively but carefully, I would say – that he is preparing the ground for the return of the assets to the descendants of the expelled Germans. Schwarzenberg, who had previously claimed that he never gets angry, sued the person behind the ad and got so angry that he submitted an invalid vote himself (he didn't place it in the envelope as required by the law). Schwarzenberg was the only person in his village who was incapable of voting properly; there was, however, one more invalid vote, by a person who wanted to elect Thomas Garrigue Masaryk (the founding president of Czechoslovakia) using a specially designed ballot. ;-)

We would read tons of positive articles (and watch positive TV pieces) about Schwarzenberg, his nice and humane wife, the prince's love for soups, the cooks in his chateaux who love him, his dissatisfaction with "untrue" claims by the Zeman campaign; and tons of negative articles about Zeman, his contacts with the unpopular ex-communist Šlouf (a good friend of Zeman who is primarily a grandfather of many children these days and who has a pleasant smile – that's how I would summarize this guy whose politics has clearly been alien to me), the fact that no one knows where Zeman's wife Ivana Zemanová is employed as a secretary right now (obviously a story written in order to suggest that she may be corrupt or stealing products in supermarkets or whatever), and so on.

The anti-Zeman, pro-Schwarzenberg bias of the media has been staggering. It's hard to tell whether the whole cultural-media front/complex has really been this brainwashed or whether what we saw was mainly a consequence of the fact that the leading Czech newspapers currently (but not just currently) have German owners. At any rate, it apparently didn't matter much. Zeman suggests that the voters ultimately tend to vote for the opposite candidate than the one the media prescribe – something that the stupid journalists may fail to see. I think he may be right but I also think that the most important drivers that decide about the chosen candidate have nothing to do with propaganda in the campaign, neither in the positive sense nor in the negative one.

Now, what can we expect from Zeman?

Global warming skeptics should be happy that climate skepticism keeps the Prague Castle although Zeman has been far less visible on this topic. Your humble correspondent will continue to work on the project of changing this situation. (I think that Zeman's English is somewhat weaker than Klaus', so I would have to work on that, too.) Zeman has been opposing the environmental activists on tons of topics – most characteristically for him, he supported the chopping-down fight against the bark beetle in the Šumava National Park against the green advocacy groups that insisted that the forest must always take care of itself and humans have no right to intervene. But he has said that global warming is one of the topics in which Klaus was wise and right.

Much of the talk about the economy – if he is active in speaking – will get a more left-wing flavor (he already promises to be a president of the lower 10 million; give me a break with this garbage). But it's the kind of moderate, centrist left-wing flavor I can live with (although the financial disclosures for sufficiently wealthy people that Zeman seems to advocate in order to fight corruption would almost certainly mean an annoying amount of bureaucracy for me if they turned out to apply to me), because I think it's mostly Zeman's image from the 1990s, the image by which he carved out enough political room to compete with Klaus after the Velvet Revolution (Klaus dominated the clearly cut right wing of the spectrum from the beginning). Zeman is otherwise a pro-market guy who realized a very important part of the privatization – the privatization of the banks – in his tenure as the prime minister (1998-2002), by which he ended the "bank socialism" (things are superficially privately owned except that they are often or mostly owned by banks that are state-owned), something that Klaus was afraid to do for some reason or whatever, but at any rate Klaus simply hadn't done it for years. Much of the qualitative difference in the economic policies between Klaus and Zeman is arguably at the level of words only. Well, I believe that Zeman's natural membership in the "poor class" – he is proud to have lived on a $700 pension for the last decade – is completely genuine and I have no problem with that, either.

There are many reasons why Zeman's actual policies in the economy (1998-2002, prime minister) were characterized as "hidden Reaganomics". If he uses the adjective "left-wing" for those things, it's just fine with me – it will redefine the adjective "left-wing" to something I don't find terrible. But he won't have much influence over the economy, anyway.

Zeman is officially a Eurofederalist but he denounces the control of Brussels over many things such as the "ban on light bulbs" (he hates fluorescent light bulbs, among other things, and this topic is often picked as his #1 example of how the centralization in Brussels may end up with absurd outcomes). Just like Klaus is a national patriot of a sort, Zeman is an intense patriot favoring the "Western civilization", if I may put it this way. He supports a preemptive strike against Iran and the redefinition of NATO (ideally with Israel among the members) as an anti-Islamic alliance – and be sure that he isn't trying to suggest that it's just some rare special form or feature of Islam that is the problem; it's the real-world Islamic culture itself.

Now, tons of ordinary people, the typical Czech whiners, have voted for him because they're truly left-wing (think about the various pensioners, many of whom preferred life under communism), unlike your humble correspondent who voted for him because he was the more right-wing candidate ;-) but I don't care – this is what we may describe by the words "Zeman is a uniter". :-)

I think that many right-wing voters have already understood the picture and the reasons why I preferred Zeman despite his self-described left-wing image. Of course, Klaus voted for Zeman as well and Zeman's victory means that Klaus doesn't have to emigrate (as he suggested in a text message to a trusted female friend who immediately leaked the text message).

Most of the people on Facebook, in culture, and so on are probably getting shocked right now because an overwhelming majority of them quickly switched to the "only correct choice" which was Schwarzenberg for them. But neither the media nor Facebook are omnipotent, everyone should learn, even if they are very loud and very biased.

And that's the memo.

Bonus: map, reactions



The map: In Pilsen, Schwarzenberg won with 55 percent or so. A few more large cities were similar. Prague, the most pro-Schwarzenberg area of the Czech Republic, gave him 66 percent. That's nothing compared to the Czechs who live abroad – Schwarzenberg stood at 84 percent.

I find Schwarzenberg himself fine but I think that these heavily uniform results deserve to be called the dictatorship of political correctness or mediocracy. It's driven by some societal pressure in which everyone must vote for the "right thing", a pressure that seems far more prevalent among the people who would like to call themselves the "elite" even though the would-be arguments (personal attacks or, on the contrary, glorification) promoted by the media are usually so incredibly dumbed down that the word "elite" sounds ironic here. The very fact that I would belong to the 16 percent of the "expatriate" Czech population – a pretty small minority – if I were still in the U.S. would look like a bit of a worrisome fact to me, after some brutal experience of mine with PC at Harvard.

Poor countryside in Northern Moravia approached 75 percent for Zeman (e.g. around the town of Bruntál) who was also above 60 percent in the Vysočina Hills near the middle of the country where he lived, and got those 55 percent or so in the bulk of the country.

Happy president Klaus wittily said (audio) that by the victory of Zeman, "the truth and love have finally prevailed over the lies and hatred", a favorite slogan of Havel who would surely disagree with its application here, however. ;-) Klaus is on his visit to Chile again – I hope he will bring us some nice pens again. :-) Slovakia is mostly satisfied with the choice of Zeman, not only because the first lady will be partly Slovak just like the current one. However, my reading of the comments at SME.SK, a PC center-right counterpart of iDNES.CZ, suggests that the readers over there are mostly anti-Zeman – but they appreciate that the Czechs have proved that their character is almost identical to the Slovak one. ;-)

I hope that the terror against Zeman voters in various corners where they turned out to be a small minority – the victims included actors but also a 12-year-old girl with Zeman-voting parents (unlike the parents of her classmates): the girl was bullied and a big sentence "[her name] is a cu*t" appeared on the blackboard after a class on civic issues in which the teacher aired anti-Zeman YouTube videos, among other things – will stop now that Zeman has won. But some of the reactions after the results were announced are unbelievably arrogant and bitter – for example, the editor-in-chief of lidovky.cz Mr Balšínek wrote a disgusting tirade about all the voters of Zeman who failed to transcend their shadow and... OK, let me stop, I would vomit whenever I see these intellectually average people who would still love to dictate to everyone how to live.

Zeman has some personal characteristics and idiosyncratic opinions that are often being used against him. For example, he believes that a huge portion of the journalists are journalistic hyenas of mediocre intelligence who often try to intervene in things they have no (political or intellectual) credentials to affect. Some people are shocked whenever he reveals that he still thinks so. Needless to say, I completely agree with Zeman on this point and it was one of the numerous reasons why I supported him. The infallibility of mediocre journalists became one of the dogmas of the PC-controlled West and indeed, I am grateful that we may hopefully avoid this cesspool for 5 more years. If a journalist wants to do politics, he or she must be treated as fair game by a politician and attacked just like the politician would attack another politician who was trying to do the same thing (i.e. demagogy). And if a journalist finds something "inappropriate" (read: "inconvenient"), it doesn't mean it shouldn't be discussed, and so on.

Friday 25 January 2013

Weinberg's evolving views on quantum mechanics

Cool anniversary (1/25): CERN discovered the W-boson (UA1 experiment) exactly 30 years ago, two months after their first W candidate; there was a press conference. Via Joseph S.

A pulsar with a button periodically switched by ET aliens in between two regimes, to broadcast a binary message to us, was found. This answers the question "Where are they?" The new question is "What are they talking about?" Via The Register.
Lectures on Quantum Mechanics by one of the world's most accomplished living physicists may be grabbed from the bookshelves; click on the amazon.com link on the left side.

Aside from the Weinbergization of lots of the usual technical topics you expect in similar textbooks, there is also a section, Section 3.7, dedicated to the interpretations of quantum mechanics.

One may see that Weinberg's views have changed. Unfortunately, the direction of the change may be associated with the word "aging".




Lots of web pages such as a Facebook Weinberg fan page and John Preskill's blog (comments) quoted the most characteristic sentences in the book:
My own conclusion (not universally shared) is that today there is no interpretation of quantum mechanics that does not have serious flaws, and that we ought to take seriously the possibility of finding some more satisfactory other theory, to which quantum mechanics is merely a good approximation.
I think that I remember the Czech translation of Weinberg's "Dreams of a Final Theory" that I bought in 1999. At that time, I was already completely certain that quantum mechanics – in the founders' (refined or unrefined) interpretation – made a complete sense and it was a complete theory linking observations to mathematical objects and able to make (probabilistic) predictions.

And my memory indicates that Weinberg just confirmed my conclusion that the universal postulates of quantum mechanics were exact. They had to be a final answer to the "foundational questions" and they couldn't be deformed.

By now, Steven Weinberg has revised his opinions and – because "no interpretation looks good enough" to him – he believes that quantum mechanics isn't exact, isn't the final story, and so on. Too bad. The basic axioms and postulates of quantum mechanics are rather crisp and easy and if there is a problem with them, I wonder why Steven Weinberg didn't analyze them (rather important questions in science) and didn't articulate their hypothetical problems about 30 or 40 years ago when no one had doubts that his brain was among the 10 most penetrating and reliable physics brains in the world.

However, there are also comments about the foundations of quantum mechanics in his new book that are right on the money. In particular, Weinberg says that all attempts to derive Born's rule for the probabilities out of something "more fundamental" seem to involve circular reasoning.

Circular reasoning is found everywhere – for example, in the very meme that the proper Copenhagen-like interpretations of quantum mechanics are "incomplete". Whenever people say such a thing, it's because they first convince themselves that there must be a whole skyscraper of mechanisms that explain the rules of quantum mechanics using "something more fundamental". Then they search for possible forms of this "more fundamental skyscraper" and when they do it well, they find out that none of the candidates really works. Therefore, they conclude that there's a problem with the interpretations of quantum mechanics – even though their failure only actually proves that there is a problem with the assumption that there is something else and "deeper" to be found about the foundations of quantum mechanics.

But let me return to Weinberg's more specific claim, "published derivations of Born's rule from something else involve circular reasoning". I want to mention one example I was forced to get familiar with, Brian Greene's "The Hidden Reality" that I translated to Czech. In general, it's a very good book about all types of parallel universes one may encounter in physics. The book has many flaws, too. In particular, the whole quantum mechanical chapter is a deeply misguided promotion of the Many Worlds Interpretation and "realism" in quantum mechanics where pretty much every sentence is invalid even though all the misconceptions are presented with Brian's extraordinary clarity. But there are also some would-be "new discoveries" in the chapter that aren't new at all and that don't prove or derive what Brian claims to be proven or derived. I hope it's OK to copy a part of the end note #9 for Chapter 8:
Over the years, a number of researchers including Neill Graham; Bryce DeWitt; James Hartle; Edward Farhi, Jeffrey Goldstone, and Sam Gutmann; David Deutsch; Sidney Coleman; David Albert; and others, including me, have independently come upon a striking mathematical fact that seems central to understanding the nature of probability in quantum mechanics.
As I will discuss in some detail, this list is totally misleading because the "finding" is a general property of all probabilities and it was totally comprehensible to the founding fathers of quantum mechanics, too. For example, if you watch the lecture Quantum Mechanics In Your Face by Sidney Coleman – one of the physicists in Brian's list – he explains this very same argument (without some of the philosophical conclusions that don't follow from the argument) but he also tells you, at the very beginning, that nothing in his lecture is new – except for his style of presentation of these things.
For the mathematically inclined reader, here’s what it says: Let \(\ket\psi\) be the wavefunction for a quantum mechanical system, a vector that’s an element of the Hilbert space \(\HH\). The wavefunction for \(n\) identical copies of the system is thus \(\ket\psi^{\otimes n}\). Let \(A\) be any Hermitian operator with eigenvalues \(\alpha_k\), and eigenfunctions \(\ket{\lambda_k}\). Let \(F_k(A)\) be the “frequency” operator that counts the number of times \(\ket{\lambda_k}\) appears in a given state lying in \(\HH^{\otimes n}\). The mathematical result is that\[
\lim_{n\to \infty} [F_k(A) {\ket\psi}^{\otimes n}] = \abs{ \braket{\psi}{\lambda_k} }^2 {\ket\psi}^{\otimes n}.
\] That is, as the number of identical copies of the system grows without bound, the wavefunction of the composite system approaches an eigenfunction of the frequency operator, with eigenvalue \( \abs{ \braket{\psi}{\lambda_k} }^2\).
But this "discovery" is a trivial consequence of probability theory. It doesn't depend on any new fact about quantum mechanics. At most, it is one consistency check you can make if you want to verify that the probabilities predicted by quantum mechanics are consistent with the usual rules of probability theory.

While the formalism above may look intimidating, its essence is completely simple. It says that if some property is predicted to appear with probability \(p\) and you repeat the same experiment \(n\) times, then the probability is nearly 100 percent that the property will appear in \(pn\pm 5\sqrt{pn}\) cases (more precisely, the standard deviation is \(\sqrt{np(1-p)}\leq\sqrt{pn}\), so this window is slightly conservative), i.e. in \(pn\) cases with an error margin that becomes tiny, relatively speaking, as you send \(n\to\infty\). The number \(5\) means that I wanted a 5-sigma certainty that we will be inside the interval. So by combining \(n\to\infty\) independent propositions of the same form, each with the same probability \(p\) to be true, and by counting, you may construct propositions about the number of successes that are almost certainly true.

This fact is true pretty much by the definition of probabilities: it's what the notion of probabilities means according to the frequentists. In other words, you may prove it by deducing the binomial distribution simply from the distributive law applied to the power \([p+(1-p)]^n\) and from a separation of different powers of \(p\). And this simple claim that is true by the definition of probabilities – and that was true even in statistical physics applied to models of classical physics – was pretty much just translated to the notation of quantum mechanics. Claims about the numbers' having a value were translated to eigenvalue equations; the calculated probabilities were translated, via Born's rule, to the squared absolute values of the complex probability amplitudes.
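
If you want to see how little quantum input this translation actually needs, here is a minimal numerical sketch – my illustration, not anything from Brian's book, and the amplitudes are made up: take a qubit state with amplitudes \(0.6\) and \(0.8\), compute Born's probability \(p=0.8^2=0.64\) for one of the outcomes, simulate \(n\) independent measurements, and watch the observed frequency squeeze into the \(5\sqrt{p(1-p)/n}\) window around \(p\). The quantum mechanics enters only through the single number \(p\); everything else is the binomial statement above.

import random, math

# A toy check (hypothetical amplitudes, not from any real experiment):
# a qubit state a|0> + b|1> assigns Born probability p = |b|^2 to the
# outcome "1"; after n repeated measurements, the observed frequency
# should sit within ~5*sqrt(p*(1-p)/n) of p, as the binomial argument says.
a, b = 0.6, 0.8              # |a|^2 + |b|^2 = 1
p = b ** 2                   # Born's rule for the outcome "1"

for n in (100, 10_000, 1_000_000):
    hits = sum(random.random() < p for _ in range(n))
    freq = hits / n
    print(n, round(freq, 4), "5-sigma window:", round(5 * math.sqrt(p * (1 - p) / n), 4))
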
This is a remarkable result.
No, it's not. Again, it's at most one of the trivial consistency checks one can make to verify that the probabilities predicted by quantum mechanics are compatible with the rudimentary, general, frequentist properties of the notion of probability. Also, one doesn't have to make infinitely many consistency checks one by one. Instead, one can verify that all axioms of general probability theory are satisfied by the quantum-predicted probabilities which implies that all conceivable consistency checks like that would work.
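
To be concrete about what such a one-shot verification looks like – this is standard textbook material, not a new claim – assume that the eigenstates \(\ket{\lambda_k}\) form a complete orthonormal basis and that \(\ket\psi\) is normalized. Then the Born probabilities obey the basic axioms of probability theory,
\[
P_k = \abs{ \braket{\lambda_k}{\psi} }^2 \geq 0,\qquad \sum_k P_k = \sum_k \abs{ \braket{\lambda_k}{\psi} }^2 = \braket{\psi}{\psi} = 1,
\]
and the probability of a union of mutually exclusive outcomes is the sum of the individual \(P_k\)'s because the corresponding projectors are orthogonal. Once these axioms are verified, every frequentist consistency check of the type quoted above is automatically guaranteed to work.
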
Being an eigenfunction of the frequency operator means that, in the stated limit, the fractional number of times an observer measuring \(A\) will find \(\alpha_k\) is \( \abs{ \braket{\psi}{\lambda_k} }^2\) – which looks like the most straightforward derivation of the famous Born rule for quantum mechanical probability.
As Weinberg correctly said about similar "derivations" in general, this derivation boils down to circular reasoning. Brian Greene concluded that an identity for the limit of some ket vector acted upon by some operator implies that we may say something certain about the number, \(pn\), of repetitions of the same experiment – out of \(n\) repetitions in total – in which a property is satisfied.

But in this claim, it's being assumed that the (nonzero) deviations of the state vectors in the sequence from the ultimate limit of the sequence "don't matter" when \(n\) is large. However, for any finite \(n\), they do matter. If you want to calculate how large \(n\) has to be for your confidence level to exceed a certain threshold (e.g. to discuss the reasonable deviation away from \(pn\) that you may expect, which is of order \(\sqrt{pn}\)), you will need to know and use Born's rule for the probabilities \(0\lt p\lt 1\) of individual repetitions of the experiment, anyway.
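
Just to quantify the point – this is a simple estimate of mine, not something Brian writes – the relative spread of the number of successes is
\[
\frac{\sqrt{np(1-p)}}{np} = \sqrt{\frac{1-p}{np}},
\]
so if you want, say, a 5-sigma statement accurate to one percent, you need \(5\sqrt{(1-p)/(np)}\leq 0.01\), i.e. \(n\geq 250{,}000\,(1-p)/p\). You can't even write down this requirement without already knowing the value of \(p\) that Born's rule supplies.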

At most, Brian "derived" Born's rule for propositions with \(p=1\) – by assuming a "simpler, more plausible, special form" of Born's rule for such \(p=1\) propositions – while he allowed himself to be sloppy about the quantification of the small differences of \(p\) from one and about the rigorously required calculation of how small they actually are (and whether they are small at all). At any rate, the reasoning is circular. One may use the general Born's rule for any value of \(p\) and derive the same thing, or one may use the special Born's rule (the eigenvalue equation) for propositions that happen to have \(p=1\) and derive the same thing (which is useless at the rigorous level, however, because no nontrivial propositions with \(p=1\) exactly may be constructed out of propositions with general values of \(p\)).

It shouldn't be surprising that all derivations of Born's rule for the probabilities have to be circular – simply because Born's rule is a fundamental and concise enough postulate of quantum mechanics. Probabilities play a fundamental role in quantum mechanics so they can't honestly be derived from anything simpler or more fundamental. If someone has managed to run through consistency checks such as Brian's, it should assure him or her that the way quantum mechanics incorporates probabilities is totally smooth, natural, and internally consistent. It should weaken his or her attempts to "fight" against the foundations of quantum mechanics. And because things would fail to work if anything were "deformed" away from the rules of quantum mechanics, the consistency suggests that the universal laws of quantum mechanics can't be deformed at all. Too bad that so many people – including Brian – try to interpret the success of these consistency checks in exactly the opposite way!
From the Many Worlds perspective, it suggests that those worlds...
I decided to terminate this quote because it's getting preposterous at this point. Brian effectively tries to argue that the trivial translation of the frequentist definition of probabilities to the formalism of quantum mechanics proves the Many Worlds Interpretation. It surely doesn't. At most, Brian attempted to present a "story" whose goal is to demonstrate the compatibility of the quantum mechanical probabilities with the Many Worlds paradigm; even if true, this compatibility would be very far from "proving" the Many Worlds paradigm.

But the truth is that these two things aren't even compatible. The Many Worlds paradigm isn't compatible with the very fact that individual questions usually have probabilities \(p\) that are strictly in between \(0\) and \(1\) and that are, by the way, almost always irrational numbers. This can't really be achieved with "many worlds" at all. For any finite number of many worlds that are "equally likely", the probabilities will be rational (repetitions of the same world are needed to allow even a trivial and generic situation, namely that \(p\) differs from \(1/2\) at all). And if you pick infinitely many worlds, in an attempt to approximate an irrational value of \(p\), the probabilities will be an indeterminate form of the type \(\infty/\infty\), so they will be ill-defined.
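
A one-line illustration of the problem: with \(N\) equally weighted worlds, every frequency is of the form \(k/N\), a rational number, while a generic Born probability – say \(p=\cos^2(\pi/8)=(2+\sqrt{2})/4\approx 0.854\) for a spin measured along a tilted axis – is irrational, so no finite \(N\) reproduces it and the limit \(N\to\infty\) only gives you the ill-defined ratio \(\infty/\infty\).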

This has to be combined with all the other severe diseases of the Many Worlds paradigm – no sensible or natural rule for "when" the worlds should split and why, failure to obey conservation laws, failure to acknowledge that an arbitrarily large yet finite system always has a nonzero chance to "recohere", so the apparently irreversible "splitting of the worlds" should really never occur, and so on. If I summarize the flaws, the Many Worlds Interpretation contradicts the fundamental fact that the conditions for two possibilities to be already "decohered" are intrinsically subjective conditions, depending on the observer's choice of questions (and her choice of the set of consistent histories), her desired accuracy and confidence level, and other things. A fundamental, undebatable goal of any version of the Many Worlds Interpretation is to make these intrinsically subjective and fuzzily defined "events" look objective, and this basic confusion of subjective and objective facts about the real world makes all conceivable mutations of the Many Worlds Interpretation deeply flawed.

In Brian's comments about "derivations of the probabilities", and in many similar philosophically oriented remarks about quantum mechanics, illogical and flawed moves appear at pretty much every step. If Brian were a student who used so many sloppy steps and incorporated so many logical errors in a calculation of some technical issue that isn't connected with philosophy and widespread misconceptions, he would just fail the exam and that would be the end of the story.

But because the misconception that the probabilities in quantum mechanics (and, more generally, postulates of quantum mechanics) aren't fundamental and exact is so incredibly widespread, all these numerous errors just "don't matter". Brian's end notes – and tons of articles with similarly flawed content – are viewed as OK simply because there are always lots of prejudiced people who feel "certain" about the totally invalid assumption of "realism behind quantum mechanics" and who are therefore ready to forgive an arbitrary number of mistakes as long as the basic spirit of the conclusion agrees with their prejudices.

As I have written many times, this belief in realism is analogous to any other religious belief. People's rational thinking simply gets turned off as soon as they hit questions that could threaten some opinions and assumptions that they view as fundamental for their world view. That's a pity. The advocates of "realism behind quantum mechanics" are running an industry of arguments that is analogous to creationism and its claims that Darwin's evolution has lethal flaws.

Now, Steven Weinberg apparently knows and admits that none of the existing attempts to derive quantum mechanics from something "more fundamental" is more than an example of circular reasoning or, if you want one word instead of two, gibberish. But the older Weinberg is still prejudiced that there should be some "less quantum" foundations beneath the quantum phenomena, although this more general prejudice is still scientifically unjustifiable and ultimately wrong.

Thursday 24 January 2013

HEP: the bias favors women

In Time magazine, when Jeffrey Kluger wrote about Ms Fabiola Gianotti, the spokesperson of ATLAS at the LHC, as the runner-up for their "Person of the Year", he wrote, among other things:
Physics is a male-dominated field, and the assumption is that a woman has to overcome hurdles and face down biases that men don’t.

But that just isn’t so. Women in physics are familiar with this misconception and acknowledge it mostly with jokes.
This is absolutely accurate in most cases. Pretty much all competent women in high-energy physics whom I have met acknowledge that this "myth about extra hurdles" for women is just nonsense. You won't hear about these women because their politics is inconvenient for the PC Nazis who have hijacked most of the media. However, the women with this opinion on the situation produce well over 90% of the actual scientific output that women contribute to science, an enterprise of all humans.

(Some feminist activists who are good enough physicists, e.g. Melissa Franklin at Harvard, would love to deny the fact that no bias against women is left and that the bias which remains real has the opposite sign. However, their room to spread this fairy-tale about their "oppression" usually shrinks substantially once they're elected the department chair, for no really good reason.)

I could tell you dozens and dozens of examples of highly productive women – really the bulk of women in proper science – who agree with me but I can't afford to do so because they would face trouble with the PC Nazis just for the fact that their name has appeared on my blog in this context. So even though they're the majority among the productive female scientists, you won't hear about them or their opinions. They know something that certain people just don't want to be heard. Instead, you will always be offered stories by obnoxious, constantly whining, largely unproductive "also scientists" who want the vagina to become a universal excuse for incompetence so they will always be dissatisfied with something.

Sabine Hossenfelder and Tommaso Dorigo disagree with Kluger (and with your humble correspondent) and they try various incomprehensible sleights of hand to justify their claims. However, ironically enough, the fact that women are much more likely to have advantages rather than disadvantages is well documented by pretty much all the female names that appear in these texts. What do I mean?




Well, I also mean Sabine Hossenfelder herself. But if I followed this example and listed some details, this article could be excessively controversial.

So let's pick Fabiola Gianotti. An article about her – a discoverer of the Higgs boson – started this whole story. Is she an example of the discrimination against women?

I think she is a very good physicist; articles about her on this blog are universally positive (including comments about her wise choice of fonts, Comic Sans). But we should still notice that ATLAS is not the only major detector at the LHC.

There is also CMS – a detector surrounded by a collaboration that Tommaso Dorigo belongs to – and the CMS has discovered the Higgs boson, too. It was done in the same channels, at the same time, pretty much at the same confidence level (up to differences that were clearly due to chance). The spokesman for CMS is male and American, Joe Incandela of Santa Barbara.

In many situations in real life, one may compare "analogous situations" to see that women are surely not discriminated against, quite on the contrary. But non-experts may fail to understand which situations are really analogous to each other and which situations are not. Outsiders just can't determine whether two physicists are equally good or not.

However, almost by construction, we have a situation in which it may be done almost rigorously. Sociologically speaking, there is an almost perfect \(\ZZ_2\) symmetry between ATLAS and CMS – and their discoveries of the Higgs boson. Fine, who was at an advantage in the wake of the discovery of the Higgs boson? Joe Incandela's description of the situation was arguably more clearly organized and more comprehensible – which is partly due to his being a native speaker.

When you look at the "Person of the Year" contest itself, you will find Ms Fabiola Gianotti but you will not find Mr Joe Incandela. The symmetry has been broken and it has been broken in the opposite direction from the one you would hear about from the dishonest promoters of the PC propaganda, right?

Just to be sure that we're not talking merely about some perception by editors of non-scientific magazines, there's a difference in the funding, too. A recent Milner bonus prize gave some money to the experimenters in particle physics, too. How did the current spokespeople do?

Well, Ms Gianotti received $500,000 while Mr Incandela only received $250,000. Which number is larger? What is the ratio? Is it substantial? Is it meritocratically justifiable? Of course, by looking at the structure of the winners, one may find an "excuse" why Incandela got less money than Gianotti: three more spokespeople shared a million with him while Gianotti has only shared a million with one additional person.

But this would be a truly lame excuse because the current spokespersons have nothing to do with the number and composition of the past spokespersons. There is a symmetry between their work. A more natural distribution would give 1/6 of those $2 million to the six people. But it didn't happen. Why? Easy. I think that Yuri Milner is a meritocrat but he – or his advisers – just had to give a factor-of-two advantage to a visible enough woman in order to improve their image in the broader scientific/leftist community where the dishonest PC Nazis have accumulated a huge power.
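
Spelled out, the arithmetic of the actual split versus the symmetric one discussed above is
\[
\frac{\$1{,}000{,}000}{2}=\$500{,}000,\qquad \frac{\$1{,}000{,}000}{4}=\$250{,}000,\qquad\text{while}\qquad \frac{\$2{,}000{,}000}{6}\approx\$333{,}000
\]
per person.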

I could give you tons of similar examples but the symmetry or asymmetry between the men and women in the story would be far less obvious to the outsiders so they wouldn't be as convincing and indisputable as this particular example.

Now, Ms Hossenfelder argues that some people working in academia (not necessarily scientists) heard her last name and talked as if they thought she was male. Well, it was their guess because it's far more likely that a random theoretical physicist is male, and ordinary people simply use sentences with "he" or "she" so they have to decide.

In the U.S., I have met almost no one among these people who would pronounce my name correctly at all and I have never complained. In fact, I would be asking for too much if I expected Americans to correctly guess whether Luboš is a male or female name. ;-)

Ms Hossenfelder would do better in the Czech Republic because almost all surnames are nontrivially feminized. Her name would be Sabine (or Sabina, if truly Czech) Hossenfelderová. This is not a joke: this is how a book or article about her would really write down her name as long as it respected the rules of grammar. The ending -ová produces a feminine adjective related to the original male name, Hossenfelder. When her name is pronounced by Czechs in the proper form, Ms/pí Hossenfelderová, everyone knows it's "she". But I am afraid that in Czechia, she would also complain – namely about her name's having a different, derived form! ;-)

(Incidentally, -ová is one of the numerous possible feminine suffixes. We use the flexibility and diversity of the language at many places, for example in chemistry, where suffixes distinguish the oxidation number. So -ová also appears in "kyselina sírová" which is "sulfuric acid" and denotes the oxidation number six; "síra" is "sulfur". Kids learn the suffixes for oxidation numbers between one and eight as -ný, -natý, -itý, -ičitý, -ičný/-ečný, -ový, -istý, -ičelý. The masculine ending "-ý" may be replaced by the feminine "-á". It's kind of clever and poetic.)

But at any rate, janitors' guessing that Hossenfelder is a man isn't discrimination. On the other hand, keeping someone whose latest 10 papers are absolutely and entirely wrong, nonsensical, and absurd in the system for many, many years is an example of reverse discrimination.

Incidentally, their comments about the need to take care of children are inappropriate, too. A woman may play the more important and less avoidable role in the care of children; it's important and the society may appreciate it. But if that's the key activity that a woman is doing well, she should be getting some special money for being a mother – and not for being a physicist. In the same way, a day care center is something else than a physics department. Be sure that some people confuse these two things deliberately because a physicist's salary is still significantly larger than the social aid that mothers may be receiving. But one simply shouldn't confuse these two activities.

Please, feminist demagogues, be ashamed, be very ashamed. You must know very well that the claims about the discrimination against women are just malicious lies but you use them nevertheless to elevate your status and the status of your political ideas/delusions/lies.

CNN: Marc Morano on extreme weather trends

CNN and its Piers Morgan show just aired a very short exchange of opinions
Michael Brune vs Marc Morano (video)
between a defender of the climate alarm (an activist in the Sierra Club, well, its executive director) and the man behind the skeptical ClimateDepot.com website.

I think that Marc Morano couldn't work as a full-fledged scientist – I mean someone who actually calculates various things from the observed data, including confidence levels and other quantities. It's my understanding that he doesn't have the technical background for that.

However, when it comes to his ability to locate the relevant literature and data for a question, to figure out their implications for general enough questions, to memorize all these things, and to clearly present them, he would probably beat a vast majority of scientists and non-scientists on both sides of the dispute about climate change.




A significant part of scientific research – even in climatology – is complex enough that laymen can't really understand or reproduce it. They may end up with wrong conclusions if they think using their inadequate approximate methods and their not really solid or verified "idiosyncratic theories of physics" – and they should know that these methods aren't rigorous.

On the other hand, I believe that there exists a certain elementary layer of all these scientific questions that even a layman, assuming his or her basic intelligence, should be able to follow. When a layperson is actually told some data about some question, he or she should be able to deduce the right conclusions.

Brune said that everything is settled and that climate change is real. But what does this widespread yet ill-defined slogan mean? He immediately switched to the enumeration of some recent extreme weather events or, to use a stronger word, recent natural catastrophes. Fires, droughts, hurricanes, floods in his parents' house, and so on.

He doesn't say so comprehensibly or explicitly but these events are supposed to be summarized by the phrase "climate change" (in its meaning as of January 2013; the meaning keeps on mutating and changing). But despite its ill-defined character, "climate change" must surely have something to do with the "climate" and with a "change". But an elementary evaluation of the data shows that the "examples" that Brune enumerated have nothing to do with the climate and nothing to do with a change.

They have nothing to do with the climate because they are weather events. If you talk about these events in isolation, they're weather events of the kind that simply do occur sometimes (or generalized weather events: wildfires are surely not "just" weather events). If you want to talk about the climate, you must look at their frequency or statistical distribution over longer periods of time, at least 30 years or so.
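
As a concrete illustration of the difference – with made-up numbers, not real observations – a "climate" statement about, say, extreme heat is a statement about how often a threshold is exceeded per 30-year window, something like the toy calculation below, rather than a statement about any single hot afternoon.

import random

# Purely synthetic daily "temperatures" for 1961-2020, used only to show
# the kind of statistic a climate statement refers to: counts of extreme
# days per 30-year window, not individual weather events.
random.seed(0)
daily_max = {year: [15 + 10 * random.gauss(0, 1) for _ in range(365)]
             for year in range(1961, 2021)}

THRESHOLD = 32.0   # an arbitrary "extreme heat" threshold in deg C
for start in (1961, 1991):
    window = range(start, start + 30)
    count = sum(t > THRESHOLD for year in window for t in daily_max[year])
    print(f"{start}-{start + 29}: {count} days above {THRESHOLD} °C")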

But as Marc Morano was able to efficiently and quantitatively explain in the highly limited room he was given, none of these "bad events" has actually had any positive (i.e. increasingly worrisome or harmful) trend in recent years or decades. These are just scientific facts. Talk about some particular events may impress some viewers but to suggest that these events have been getting increasingly frequent, strong, or devastating is simply a lie. It is incompatible with all the observations.

So Piers Morgan may be obsessed with the rise of CO2 but the scientific fact is that despite the nearly religious status that CO2 seems to have in the world view of people such as Morgan, it hasn't mattered for the weather at all, at least not at a detectable level. After all, this conclusion drawn from numerous explicit observations isn't just an empirical fact. Even according to the state-of-the-art theories of the climate, there doesn't seem to be the tiniest reason why the increasing CO2 should directly yet strongly enough (for the detection to be possible) affect the frequency of fires (more CO2 usually means less oxygen, and this actually makes fires slightly harder to ignite, not easier), hurricanes, tornadoes, droughts, or floods (those things seem to be uncorrelated).

The only hypothetical, theoretically justifiable influence of CO2 on the weather events is an indirect one: increasing CO2 concentrations first increase the global mean temperature (or at most the overall latitude-dependent temperature gradient), and then this higher global mean temperature changes the frequency and composition of extreme events (although it's usually not clear why the overall shift of the global mean temperature should seriously matter for the local weather: there isn't any counterpart of the "greenhouse effect" here that would make it natural to believe in these correlations). Because the link is indirect and at least one of the links in the chain is speculative, one should expect – even theoretically – that this relationship will be even weaker and less visible in the data than the hypothesized influence of CO2 on the global mean temperature.

But even the hypothetically stronger influence of CO2 on the temperature has been empirically non-existent in the last 15 years. If we assume that the changes in the extreme events are caused by the change of the global mean temperature, and it's the only mechanism that is theoretically defensible these days, then we may predict that there should be no significant CO2-driven change in the frequency of extreme events in the last 15 years, either. And indeed, this expectation seems to be true.

However, even if you found some change in the frequency of some extreme events, e.g. one type of them, in recent years (fewer than 15), it simply couldn't have been caused by global warming because there's been no global warming for 15 years (the slope of the straight line that most accurately approximates the temperature graphs, calculable by linear regression, is essentially flat). This is such a simple logical observation that I believe that everyone who has a college degree (or even a good enough high school in his CV) should be able to "rediscover" this simple argument and reach the simple conclusion. No change of extreme events between the late 1990s and the recent 5 years could have been caused by global warming because the latter just didn't exist.
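
For completeness, here is what the "flat slope" check technically amounts to: an ordinary least-squares fit of a straight line to 15 annual temperature anomalies. The numbers below are placeholders I made up to keep the sketch self-contained, not any official dataset, so only the procedure is meant to be illustrative.

# Ordinary least-squares slope of a straight line fitted to 15 annual
# temperature anomalies (placeholder values, in deg C).
anomalies = [0.40, 0.45, 0.43, 0.47, 0.44, 0.46, 0.42, 0.48,
             0.45, 0.44, 0.47, 0.43, 0.46, 0.45, 0.44]
years = list(range(1998, 1998 + len(anomalies)))

mean_x = sum(years) / len(years)
mean_y = sum(anomalies) / len(anomalies)
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(years, anomalies))
         / sum((x - mean_x) ** 2 for x in years))
print(f"linear-regression slope: {slope:+.4f} °C per year")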

What do the champions of the climate alarm respond to this simple and pretty much indisputable argument? They usually switch topics, typically to personal attacks (and absurd claims about the funding behind the skeptics rather than the alarmists: Brune did make this switch immediately), so that they don't have to deal with some basic science that is inconvenient for their beliefs. They want everyone to forget this argument quickly, to forget their knowledge of science and even of logic in general. However, when they do respond, they usually say that the global warming is still there. It is "underlying" the noisy evolution of the temperature record.

The separation of the temperature graphs into the "noise" and the "underlying trend" of course can't be reliably done and it is hypothetical – this separation, i.e. attribution, is just another way to formulate the basic climate dispute. But more importantly, it is physically inconsequential, too. It's unimportant because if the frequency of some extreme weather events depends on the temperatures (imagine a whole map of them, one for each moment of time), it depends on the actual temperatures measured by thermometers, not on some hypothetical "underlying" temperatures that can't be measured by thermometers! For example, if the frequency of wildfires were affected by the temperature and if the temperature were changing, it wouldn't matter whether it was changing because of the Sun, black carbon, CO2, or an accidentally lowered number of clouds in the air.

And the actual temperatures simply haven't risen for 15 years which is why this non-existent rise couldn't have increased the frequency of wildfires, droughts, hurricanes, or floods in the houses of parents of Sierra Club hired guns. It's just not physically and logically possible. Something that doesn't exist can't really cause anything!

(The number of flooded Sierra Club members' parents' houses was increasing but that's not because of climate change; it's because of the dramatically increasing number of these demagogic parasites in that NGO and other alarmist organizations, the implied increase of the number of their parents and their houses, and maybe even the increase of the number of houses per Sierra Club family as these jerks have been accumulating wealth.)

These logical arguments about the possible causal relationships are so simple that it's very hard for me to imagine that some superficially moderately intelligent people such as Piers Morgan – who talk about climate change very often and listen to many comments about it – haven't been able to understand them by now. But maybe I am wrong. Maybe they are really not getting it. Alternatively, they know that their opinions about the climate are pure crap but they spread them because the belief in them leads to some convenient conclusions and they're convinced that many TV viewers are stupid enough to buy these illogical comments, anyway.

I am leaning towards the latter scenario. Piers Morgan and others like him aren't completely stupid; they're more dishonest than stupid.