
Analysis and Synthesis II: Universal Characteristics

JUL 01, 2003

DOI: 10.1063/1.1603051

Frank Wilczek

Quantum electrodynamics (QED) runs the show outside atomic nuclei, and quantum chromodynamics (QCD) runs it inside. This Q*D dynasty governs ordinary matter. The Q*Ds, built from purely abstract concepts, provide good impressionistic models of matter when supplemented with just two numerical parameters, and quite a lifelike rendering with four (see my previous column, “Analysis and Synthesis,” Physics Today, May 2003, page 10). From these ingredients, we can synthesize a universe of mathematical possibilities that we believe, on good evidence, accurately mirrors the physical universe of materials and their chemistry.


The polished core of astrophysics

In astrophysics, we study the behavior of very large amounts of matter over very long periods of time. Small but cumulative effects (gravity), or rare but transformative ones (weak interactions), which are negligible for most terrestrial and laboratory concerns, must be taken into account when we so widen our horizons. Yet by adding just two more parameters, we can extend our analysis of matter to cover most of astrophysics.

We have an excellent theory for gravity, namely Einstein’s general theory of relativity. It contains just one new parameter, Newton’s constant $G_N$. As I explained in an earlier column (Physics Today, August 2002, page 10), there is absolutely no practical problem in combining the successful theories of matter with general relativity. For example, the intricate global positioning system (GPS), which defines spacetime operationally with precision and versatility, works just fine. The GPS is so accurate that it is sensitive to Earth’s gravitational redshift, a direct reflection of the warping of time by matter. Yet while the GPS takes note of Einstein’s theory, it seems so far perfectly oblivious to the ongoing, much-heralded crises and revolutions in spacetime concepts. That obliviousness is consistent with expectations from the working theory of quantum gravity I sketched in that earlier column—which is also the theory tacitly assumed throughout astrophysics.
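
To get a feel for the size of that effect, here is a back-of-envelope estimate (a sketch using standard values for Earth’s mass and the satellites’ orbital radius, none of which appear in the column). A clock at GPS orbital radius $r \approx 26\,600$ km runs fast relative to one on the ground at $R_E \approx 6\,400$ km by the difference in gravitational potential:

$$
\frac{\Delta f}{f} \;\simeq\; \frac{\Delta\Phi}{c^2} \;=\; \frac{G_N M_E}{c^2}\left(\frac{1}{R_E}-\frac{1}{r}\right) \;\approx\; 5\times 10^{-10},
$$

or roughly 45 microseconds per day. Special-relativistic time dilation of the orbiting clocks works in the opposite direction and reduces the net offset to about 38 microseconds per day; left uncorrected, a discrepancy of that size would translate into position errors of roughly ten kilometers per day.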

The weak interaction powers the energy release of stars and drives their evolution. For most purposes, it is enough to include the basic interaction $d \to u + e^- + \bar{\nu}_e$, which converts a d quark into a u quark, an electron, and an antineutrino. The overall strength of that process is governed by one more new parameter, the Fermi constant $G_F$. (Strictly speaking, what appears is $G_F$ times the cosine of the Cabibbo angle; this numerical factor is about 0.98, that is, very nearly unity.) This basic quark-conversion process underlies both radioactive β decays on Earth and the complex forms of nuclear cooking that occur in normal stars. Inclusion of the weak interaction also allows us to smooth out a slight imperfection in the logical structure of our account of matter based on QED and QCD. By destabilizing nuclei that would otherwise be stable, the weak interaction plays an important negative role: it defines the boundary of the periodic table for practical chemistry at the line of β stability, and it removes spurious isotopes that would otherwise appear.
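
In modern notation (a textbook form, not spelled out in the column, with conventions that vary slightly among authors), this interaction is Fermi’s current-current coupling:

$$
\mathcal{L}_{\mathrm{eff}} \;=\; \frac{G_F \cos\theta_C}{\sqrt{2}}\,
\left[\bar{u}\,\gamma^{\mu}(1-\gamma_5)\,d\right]
\left[\bar{e}\,\gamma_{\mu}(1-\gamma_5)\,\nu_e\right] \;+\; \mathrm{h.c.}
$$

Embedded in a nucleus, the quark-level transition $d \to u + e^- + \bar{\nu}_e$ becomes neutron β decay, $n \to p + e^- + \bar{\nu}_e$.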

And the ragged edge

Other significant weak interaction processes involve strange (in the technical sense) particles or neutral currents producing neutrino-antineutrino pairs. Such processes become important in the later stages of stellar evolution and especially during the cataclysmic explosions of supernovae and their aftermath. Analysis of this physics brings in three more numerical parameters: the Cabibbo angle, the Weinberg angle, and the strange quark mass. With these, we can set up the governing equations. Of course, as in previous cases, solving the equations poses problems of a different order.
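
For orientation, the approximate measured values of those three parameters (standard numbers, not quoted in the column) are

$$
\theta_C \approx 13^\circ, \qquad \sin^2\theta_W \approx 0.23, \qquad m_s \approx 100\ \mathrm{MeV}.
$$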

In the aftermath of supernova explosions, and perhaps in other extreme astrophysical environments, a few particles get accelerated to extremely high energies—they become cosmic rays. When cosmic rays collide with interstellar material, or impact our atmosphere, the debris of the collisions contains muons, taus, and heavy quarks. To describe the properties of all those particles, we must introduce many new parameters, specifically their masses and weak mixing angles. Recently we’ve learned, also mainly through the study of cosmic rays (including those emanating from the Sun), that neutrinos oscillate, so we must include masses and mixings for them, too.
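
In the simplest two-flavor description (a textbook formula, not the column’s), the probability that a neutrino of energy $E$ changes flavor after traveling a distance $L$ shows exactly how masses and mixings enter:

$$
P(\nu_\alpha \to \nu_\beta) \;=\; \sin^2 2\theta \,\sin^2\!\left(\frac{\Delta m^2\,L}{4E}\right),
$$

where $\theta$ is the mixing angle and $\Delta m^2$ the difference of squared masses; with $\Delta m^2 = 0$ there would be no oscillation at all.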

At the high-energy frontier, parameters begin to proliferate more rapidly than major new phenomena. At lower energies, our use of intermediate, truncated models sustained a magnificent price-to-earnings ratio of parameters to results. But at the high-energy frontier, I think, we’ve reached the point of diminishing returns. Our most accurate understanding of the laws of nature, including all known details—the standard model, supplemented with nonzero neutrino masses and mixings, merged with general relativity—is clearly an inventory of raw materials, not a finished product.

That batch of raw materials is where the program of understanding matter by analysis and synthesis stands today. The analysis, though clearly unfinished, is already stunningly successful. It supports the synthesis of a remarkably economical conceptual system encompassing, in Dirac’s phrase, “All of chemistry and most of physics.” And the “most” now includes, as it did not in Dirac’s time, nuclear physics and astrophysics.

Cosmology by numbers

Remarkably, a parallel program of analysis and synthesis can be carried through for cosmology. We can construct conceptual models that use a very small number of parameters to describe major aspects of the universe as a whole.

The first model treats the universe as homogeneous and isotropic, or, in plain English, uniform. It isn’t, of course. Indeed, superficially the distribution of matter in the universe appears to be anything but uniform. Matter is concentrated in stars, separated by vast, only tenuously filled spaces, and gathered into galaxies separated by still emptier spaces. We’ll be well rewarded for temporarily ignoring such embarrassing details, however.

The parameters of the model specify a few average properties of matter, taken over large spatial volumes. Those are the densities of ordinary matter (baryons), dark matter, and dark energy. We know quite a lot about ordinary matter, as I’ve been discussing, and can detect it at great distances by several methods. It contributes about 3% of the total density. About dark (actually, transparent) matter we know much less. It has been seen only indirectly, through the influence of its gravity on the motion of visible matter. Dark matter is observed to exert very little pressure, and it contributes about 30% of the total density. Dark energy, which contributes about 67% of the total density, has a large negative pressure. It is the most mysterious and disturbing component, as I’ll elaborate next time.

Fortunately, our nearly total ignorance concerning the nature of most of the mass of the universe does not bar us from modeling its evolution. That’s because the dominant interaction on large scales is gravity, and gravity does not care about details. According to general relativity, only total energy-momentum counts—or equivalently, for uniform matter, total density and pressure.
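
For uniform matter, that statement takes a compact standard form (the Friedmann equations, not written out in the column). The cosmic scale factor $a(t)$ evolves according to

$$
\left(\frac{\dot{a}}{a}\right)^2 = \frac{8\pi G_N}{3}\,\rho \;-\; \frac{k}{a^2},
\qquad
\frac{\ddot{a}}{a} = -\frac{4\pi G_N}{3}\,\left(\rho + 3p\right),
$$

where $k$ encodes the spatial curvature. Each component of the universe, however exotic its microscopic nature, enters only through its contribution to the total density $\rho$ and pressure $p$.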

We can use the equations of general relativity to extrapolate the present expansion of the universe back to earlier times, assuming the observed relative densities of ordinary matter, dark matter, and dark energy, taking the geometry of space to be flat, and continuing to assume uniformity. This extrapolation defines the standard Big Bang scenario. It successfully predicts several things that would otherwise be very difficult to understand, including the redshift of distant galaxies, the existence of the microwave background radiation, and the relative abundances of light nuclear isotopes. The procedure is also internally consistent, and even self-validating, in that the microwave background is observed to be uniform to high accuracy, namely to a few parts in $10^5$.
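
As a minimal illustration of such an extrapolation, here is a sketch that integrates the flat-universe expansion equation backward to estimate the time since the Big Bang. The relative densities are the ones quoted above; the Hubble constant is an assumed fiducial value, not a number from the column.

```python
# Sketch: age of a flat universe from the Friedmann equation,
# using the relative densities quoted in the column. H0 is an
# assumed fiducial value; radiation, which matters only at very
# early times, is neglected.
import numpy as np
from scipy.integrate import quad

H0 = 70.0            # Hubble constant, km/s/Mpc (assumed)
OMEGA_M = 0.33       # ordinary matter (~3%) plus dark matter (~30%)
OMEGA_DE = 0.67      # dark energy (~67%)

def hubble(a):
    """Expansion rate H(a) in km/s/Mpc; scale factor a = 1 today."""
    return H0 * np.sqrt(OMEGA_M / a**3 + OMEGA_DE)

# Age of the universe: t0 = integral of da / (a * H(a)) from a = 0 to 1.
# The factor 977.8 converts Mpc*s/km into gigayears.
integral, _ = quad(lambda a: 1.0 / (a * hubble(a)), 0.0, 1.0)
print(f"age for these densities: {977.8 * integral:.1f} Gyr")
```

With these inputs the answer comes out near 13 billion years, comfortably older than the oldest known stars.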

A second model, refining the first, allows for small departures from uniformity in the early universe and follows the dynamical evolution of some assumed spectrum of initial fluctuations. The seeds grow by gravitational instability, with overly dense regions attracting more matter, thus increasing their density enhancement with time. Starting from very small seeds, the process plausibly could eventually trigger the formation of galaxies, stars, and other structures observed today. A priori, one might consider all kinds of assumptions about the initial fluctuations, and over the years many hypotheses have been proposed. But recent observations, especially the gorgeous WMAP measurements of microwave background anisotropies, favor what, in many ways, is the simplest possible guess, the so-called Harrison-Zeldovich spectrum. In that theory the fluctuations are assumed to be strongly random—uncorrelated and Gaussian with a spatially scale-invariant spectrum, to be precise—and to affect both ordinary and dark matter equally (adiabatic). Given such strong assumptions, just one parameter, the overall amplitude of fluctuations, defines the statistical distribution completely. With the appropriate value for that amplitude, the second cosmological model fits the WMAP data and other measures of large-scale structure remarkably well.
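
Stated as a formula (in standard present-day notation, not the column’s), the Harrison-Zeldovich hypothesis is that the primordial power spectrum of the fluctuations is a pure power law with spectral index one:

$$
P(k) \;=\; A\,k^{\,n_s}, \qquad n_s = 1,
$$

so the single amplitude $A$ fixes the entire statistical distribution; $n_s = 1$ is the scale-invariant case, in which curvature fluctuations have the same strength on every length scale.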

Yearning, opportunity, discontent

As I have just sketched, cosmology has been reduced to some general hypotheses and just four new continuous parameters. It is an amazing development. Yet I think that most physicists will not, and should not, feel entirely satisfied with it. The parameters appearing in the cosmological models, unlike those in the comparable models of matter, do not describe the fundamental behavior of simple entities. Rather, they appear as summary descriptors of averaged properties of macroscopic (VERY macroscopic) agglomerations. They are neither key players in a varied repertoire of phenomena nor essential elements in a beautiful mathematical theory. These shortcomings leave us wondering why just these parameters appear necessary to make a working description of existing observations, and uncertain whether we’ll need to include more as observations are refined. We’d like to carry the analysis to another level, where the four working parameters will give way to different ones that are closer to fundamentals.

A different limitation to our insight penetrates modern cosmology so deeply that most physicists and cosmologists, inured by long familiarity, aren’t as discontented as they ought to be. Modern cosmology consigns everything about the world, apart from a handful of statistical regularities, to chance and contingency. In Dante’s universe everything had its reason for being and its proper place. Kepler aspired to explain the specific form of the Solar System. Now it appears that the number of planets in the Solar System, their masses, and the shapes of their orbits—and more generally, every specific fact about every specific object or group of objects in the universe—are mere accidents, not susceptible to fundamental explanation. Some are born modest and some achieve modesty, but cosmologists have had modesty thrust upon them.

There are genuine, exciting opportunities for carrying the analysis of cosmological parameters further. It seems most unlikely, however, that the pervasive indeterminacy and randomness of the core of our model of the universe will go away. That uncertainty confronts us with an opportunity of another sort: the opportunity to expand our vision of what constitutes fundamental explanation. I’ll say more about these opportunities next time.

[Cartoon caption: “I’ll be working on the largest and smallest objects in the universe—superclusters and neutrinos. I’d like you to handle everything in between.”]

More about the Author

Frank Wilczek is the Herman Feshbach Professor of Physics at the Massachusetts Institute of Technology in Cambridge, Massachusetts.

