Portal:Physics/2007 Selected articles

From Wikipedia, the free encyclopedia


This is an archive of entries that appeared on Portal:Physics's Selected Article section in 2007.

Please do not edit this page directly. Instead, use one of the "Edit selected article" links on the right of this page. This will ensure that you edit the correct page for your changes to appear on Portal:Physics in the correct week.

Week 1

Photons emitted in a coherent beam from a laser

In modern physics, the photon is the elementary particle responsible for electromagnetic phenomena. It mediates electromagnetic interactions and makes up all forms of light. The photon has zero invariant mass and travels at the constant speed c, the speed of light in empty space. However, in the presence of matter, a photon can be slowed or even absorbed, transferring energy and momentum proportional to its frequency. Like all quanta, the photon has both wave and particle properties; it exhibits wave–particle duality.
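The energy–frequency relation mentioned above (E = hf, equivalently E = hc/λ, with momentum p = h/λ) can be sketched numerically. The constants are CODATA values; the 532 nm wavelength (green laser light) is chosen purely for illustration:

```python
# Energy and momentum of a single photon: E = h*f = h*c/lambda, p = h/lambda.
# Constants are CODATA values; the 532 nm wavelength is illustrative.
h = 6.62607015e-34   # Planck constant, J*s
c = 2.99792458e8     # speed of light in vacuum, m/s

def photon_energy(wavelength_m):
    """Energy in joules of a photon with the given vacuum wavelength."""
    return h * c / wavelength_m

def photon_momentum(wavelength_m):
    """Momentum in kg*m/s; p = h / lambda."""
    return h / wavelength_m

E = photon_energy(532e-9)
p = photon_momentum(532e-9)
print(f"E = {E:.3e} J  ({E / 1.602176634e-19:.2f} eV)")
print(f"p = {p:.3e} kg m/s")
```

A green photon thus carries a few electronvolts of energy, which is why visible light can drive photochemistry but not nuclear processes.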

The modern concept of the photon was developed gradually (1905–17) by Albert Einstein to explain experimental observations that did not fit the classical wave model of light. In particular, the photon model accounted for the frequency dependence of light's energy, and explained the ability of matter and radiation to be in thermal equilibrium. Other physicists sought to explain these anomalous observations by semiclassical models, in which light is still described by Maxwell's equations but the material objects that emit and absorb light are quantized. Although these semiclassical models contributed to the development of quantum mechanics, further experiments proved Einstein's hypothesis that light itself is quantized; the quanta of light are photons.

The photon concept has led to momentous advances in experimental and theoretical physics, such as lasers, Bose–Einstein condensation, quantum field theory, and the probabilistic interpretation of quantum mechanics. According to the Standard Model of particle physics, photons are responsible for producing all electric and magnetic fields, and are themselves the product of requiring that physical laws have a certain symmetry at every point in spacetime. The intrinsic properties of photons — such as charge, mass and spin — are determined by the properties of this gauge symmetry. Photons have many applications in technology such as photochemistry, high-resolution microscopy, and measurements of molecular distances. Recently, photons have been studied as elements of quantum computers and for sophisticated applications in optical communication such as quantum cryptography.


Week 2

The NASA Astrophysics Data System (usually referred to as ADS) is an online database of over 5,000,000 astronomy and physics papers from both peer-reviewed and non-peer-reviewed sources. Abstracts are freely available online for all articles, and older articles are available in full as scanned GIF and PDF files. New articles have links to electronic versions hosted at the journal's webpage, but these are typically available only by subscription (which most astronomy research facilities have).

ADS is an extremely powerful research tool, and has had a significant impact on the efficiency of astronomical research since it was launched in 1992. Literature searches which previously would have taken days or weeks can now be carried out in seconds, via the sophisticated ADS search engine, which is custom-built for astronomical needs. Studies have found that the benefit to astronomy of the ADS is equivalent to several hundred million US dollars annually, and the system is estimated to have tripled the readership of astronomical journals.

Use of ADS is almost universal among astronomers worldwide, and therefore ADS usage statistics can be used to analyse global trends in astronomical research. They have revealed that the amount of research an astronomer carries out is related to the GDP per capita of the country in which they are based and that the number of astronomers in a country is proportional to the GDP of that country, so the amount of research done in a country is proportional to the square of its GDP divided by its population.
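The scaling stated above follows directly from combining the two observed proportionalities:

```latex
\frac{R}{N} \propto \frac{G}{P}, \qquad N \propto G
\quad\Longrightarrow\quad
R \propto N \cdot \frac{G}{P} = \frac{G^{2}}{P},
```

where \(R\) is a country's total research output, \(N\) its number of astronomers, \(G\) its GDP, and \(P\) its population.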


Week 3

Blaise Pascal (June 19, 1623 – August 19, 1662) was a French mathematician, physicist, and religious philosopher. He was a child prodigy who was educated by his father. Pascal's earliest work was in the natural and applied sciences, where he made important contributions to the construction of mechanical calculators and the study of fluids, and clarified the concepts of pressure and vacuum by generalizing the work of Evangelista Torricelli. Pascal also wrote powerfully in defense of the scientific method.

He was a mathematician of the first order. Pascal helped create two major new areas of research: he wrote a significant treatise on the subject of projective geometry at the age of sixteen, and from 1654 corresponded with Pierre de Fermat on probability theory, strongly influencing the development of modern economics and social science.

Following a mystical experience in late 1654, he abandoned his scientific work and devoted himself to philosophy and theology. His two most famous works date from this period: the Lettres provinciales and the Pensées. However, he had suffered from ill-health throughout his life and his new interests were ended by his early death two months after his 39th birthday.


Week 4

The Sun is the star of our Solar System. The Earth and other matter (including other planets, asteroids, meteoroids, comets and dust) orbit the Sun, which by itself accounts for more than 99% of the Solar System's mass. Energy from the Sun—in the form of insolation from sunlight—supports almost all life on Earth via photosynthesis, and drives the Earth's climate and weather.

About 74% of the Sun's mass is hydrogen, 25% is helium, and the rest is made up of trace quantities of heavier elements. The Sun has a spectral class of G2V. "G2" means that it has a surface temperature of approximately 5,500 K, giving it a white color which, because of atmospheric scattering, appears yellow. Its spectrum contains lines of ionized and neutral metals as well as very weak hydrogen lines. The "V" suffix indicates that the Sun, like most stars, is a main sequence star. This means that it generates its energy by nuclear fusion of hydrogen nuclei into helium and is in a state of hydrostatic equilibrium, neither contracting nor expanding over time. There are more than 100 million G2 class stars in our galaxy. Nevertheless, because faint red dwarfs make up the bulk of the stellar population, the Sun is actually brighter than 85% of the stars in the Galaxy.

The Sun is a magnetically active star; it supports a strong, changing magnetic field that varies year-to-year and reverses direction about every eleven years. The Sun's magnetic field gives rise to many effects that are collectively called solar activity, including sunspots on the surface of the Sun, solar flares, and variations in the solar wind that carry material through the solar system. The effects of solar activity on Earth include auroras at moderate to high latitudes, and the disruption of radio communications and electric power. Solar activity is thought to have played a large role in the formation and evolution of the Solar System, and strongly affects the structure of Earth's outer atmosphere.


Week 5

A potassium Faraday filter designed, built and photographed by Jonas Hedin for making daytime LIDAR measurements at Arecibo Observatory.

An atomic line filter (ALF) is an advanced optical band-pass filter used in the physical sciences for filtering electromagnetic radiation with precision, accuracy, and minimal signal strength loss. Atomic line filters work via the absorption or resonance lines of atomic vapors and so may also be designated an atomic resonance filter (ARF).

The three major types of atomic line filters are absorption-re-emission ALFs, Faraday filters and Voigt filters. Absorption-re-emission filters were the first type developed, and so are commonly called simply "atomic line filters"; the other two types are usually referred to specifically as "Faraday filters" or "Voigt filters". Atomic line filters use different mechanisms and designs for different applications, but the same basic strategy is always employed: by taking advantage of the narrow lines of absorption or resonance in a metallic vapor, a specific frequency of light bypasses a series of filters that block all other light.

Atomic line filters can be considered the optical equivalent of lock-in amplifiers; they are used in scientific applications requiring the effective detection of a narrowband signal (almost always laser light) that would otherwise be obscured by broadband sources, such as daylight. They are used regularly in Laser Imaging Detection and Ranging (LIDAR) and are being studied for their potential use in laser communication systems. Atomic line filters are superior to conventional dielectric optical filters such as interference filters and Lyot filters, but their greater complexity makes them practical only in background-limited detection, where a weak signal is detected while suppressing a strong background. Compared to etalons, another high-end optical filter, Faraday filters are significantly sturdier and may be six times cheaper at around US$15,000 per unit.


Week 6

Herbig-Haro object HH47, imaged by the Hubble Space Telescope. The scale bar represents 1000 astronomical units, equivalent to about 20 times the size of our Solar System, or 1000 times the distance from the Earth to the Sun

Herbig–Haro objects are small patches of nebulosity associated with newly born stars, formed when gas ejected by young stars collides with nearby clouds of gas and dust at speeds of several hundred kilometres per second. Herbig–Haro objects are ubiquitous in star-forming regions, and several are often seen around a single star, aligned along its rotational axis.

HH objects are transient phenomena, lasting only a few thousand years at most. They can evolve visibly over quite short timescales as they move rapidly away from their parent star into the gas clouds in interstellar space (the interstellar medium or ISM). Hubble Space Telescope observations reveal complex evolution of HH objects over a few years, as parts of them fade while others brighten as they collide with clumpy material in the interstellar medium.

The objects were first observed in the late 19th century by Sherburne Wesley Burnham, but were not recognised as being a distinct type of emission nebula until the 1940s. The first astronomers to study them in detail were George Herbig and Guillermo Haro, after whom they have been named. Herbig and Haro were working independently on studies of star formation when they first analysed Herbig–Haro objects, and recognised that they were a by-product of the star formation process.


Week 7

Albert Einstein (March 14, 1879 – April 18, 1955) was a German-born theoretical physicist widely considered one of the greatest physicists of all time.[1][2] While best known for the theory of relativity (and specifically mass–energy equivalence, E = mc²), he was awarded the 1921 Nobel Prize in Physics for his 1905 (Annus Mirabilis) explanation of the photoelectric effect and "for his services to theoretical physics". In popular culture, the name "Einstein" has become synonymous with great intelligence and genius.

He was known for many scientific investigations, among them: the special theory of relativity, which stemmed from an attempt to reconcile the laws of mechanics with the laws of the electromagnetic field; the general theory of relativity, which extended the principle of relativity to include gravitation; relativistic cosmology; capillary action; critical opalescence; classical problems of statistical mechanics and their merger with quantum theory, leading to an explanation of the Brownian motion of molecules; atomic transition probabilities; the probabilistic interpretation of quantum theory; the quantum theory of a monatomic gas; the thermal properties of light at low radiation density, which laid the foundation of the photon theory of light; the theory of radiation, including stimulated emission; the construction of a unified field theory; and the geometrization of physics.

References

  1. ^ "Einstein the greatest". BBC. November 29, 1999.
  2. ^ "Einstein tops physicist pop chart". Institute of Physics. Retrieved 2006-09-28.


Week 8

According to the Big Bang, the universe emerged from an extremely dense and hot state (bottom). Since then, space itself has expanded with the passage of time, carrying the galaxies with it.

In physical cosmology, the Big Bang is the scientific theory that the universe emerged from a tremendously dense and hot state about 13.7 billion years ago. The theory is based on observations indicating that space is expanding in accord with the Robertson–Walker model of general relativity: the Hubble redshift of distant galaxies, taken together with the cosmological principle.

Extrapolated into the past, these observations show that the universe has expanded from a state in which all the matter and energy in the universe was at an immense temperature and density. Physicists do not widely agree on what happened before this, although general relativity predicts a gravitational singularity.

The term Big Bang is used both in a narrow sense, to refer to the point in time when the observed expansion of the universe (Hubble's law) began, calculated to be 13.7 billion (1.37 × 10¹⁰) years ago (± 2%), and in a more general sense, to refer to the prevailing cosmological paradigm explaining the origin and expansion of the universe, as well as the composition of primordial matter through nucleosynthesis as predicted by the Alpher–Bethe–Gamow theory.
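The quoted age is close to the Hubble time 1/H₀, the naive expansion age. A rough sketch, assuming H₀ ≈ 71 km/s/Mpc (a representative value of that era, not taken from the text; the precise age also depends on the expansion history):

```python
# Rough age estimate from the Hubble time t_H = 1/H0.
# H0 = 71 km/s/Mpc is an assumed representative value; the true age
# also depends on the matter/dark-energy content of the universe.
MPC_IN_M = 3.0857e22          # metres per megaparsec
SEC_PER_YEAR = 3.156e7        # seconds per year (approx.)

H0 = 71e3 / MPC_IN_M          # Hubble constant in 1/s
hubble_time_yr = 1.0 / H0 / SEC_PER_YEAR
print(f"1/H0 ≈ {hubble_time_yr / 1e9:.1f} billion years")
```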

From this model, George Gamow was able to predict in 1948 the existence of cosmic microwave background radiation (CMB). The CMB was discovered in 1964[1] and corroborated the Big Bang theory, giving it more credence over its chief rival, the steady state theory.

References

  1. ^ A. A. Penzias and R. W. Wilson, "A Measurement of Excess Antenna Temperature at 4080 Mc/s," Astrophysical Journal 142 (1965), 419.


Week 9

An extrasolar planet, or exoplanet, is a planet beyond the Solar System. As of February 2007, the count of known exoplanets stands at 212. The vast majority have been detected through various indirect methods rather than actual imaging. Most of them are giant planets likely to resemble Jupiter more than Earth.

Known exoplanets are members of planetary systems that orbit a star. There have also been unconfirmed reports of free-floating planetary-mass objects (sometimes called "rogue planets"): that is, ones that do not orbit any star. Since such objects do not satisfy the working definition of "planet" adopted by the International Astronomical Union, and since their existence remains unconfirmed, they will not be discussed in this article. For more information, see interstellar planet.

Extrasolar planets became a subject of scientific investigation in the mid-nineteenth century. Astronomers generally supposed that some existed, but it was a mystery how common they were and how similar they were to the planets of the Solar System. The first confirmed detections were finally made in the 1990s; since 2002, more than twenty have been discovered every year. It is now estimated that at least 10% of sunlike stars have planets, and the true proportion may be much higher. The discovery of extrasolar planets further raises the question of whether some might support extraterrestrial life.


Week 10

Carl Friedrich Gauss (Gauß) (30 April 1777 – 23 February 1855) was a German mathematician and scientist of profound genius who contributed significantly to many fields, including number theory, analysis, differential geometry, geodesy, magnetism, astronomy and optics. Sometimes known as "the prince of mathematicians" and "greatest mathematician since antiquity", Gauss had a remarkable influence in many fields of mathematics and science and is ranked as one of history's most influential mathematicians.

In 1831 Gauss developed a fruitful collaboration with the physics professor Wilhelm Weber; it led to new knowledge in the field of magnetism (including finding a representation for the unit of magnetism in terms of mass, length and time) and the discovery of Kirchhoff's circuit laws in electricity. Gauss and Weber constructed the first electromagnetic telegraph in 1833, which connected the observatory with the institute for physics in Göttingen. Gauss ordered a magnetic observatory to be built in the garden of the observatory and with Weber founded the magnetischer Verein ("magnetic club"), which supported measurements of earth's magnetic field in many regions of the world. He developed a method of measuring the horizontal intensity of the magnetic field which has been in use well into the second half of the 20th century and worked out the mathematical theory for separating the inner (core and crust) and outer (magnetospheric) sources of Earth's magnetic field.


Week 11

A quantum computer is any device for computation that makes direct use of distinctively quantum mechanical phenomena, such as superposition and entanglement, to perform operations on data. In a classical (or conventional) computer, the amount of data is measured by bits; in a quantum computer, the data is measured by qubits. The basic principle of quantum computation is that the quantum properties of particles can be used to represent and structure data, and that quantum mechanisms can be devised and built to perform operations with these data.
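The difference between a bit and a qubit can be made concrete with a minimal state-vector sketch: a qubit is a pair of complex amplitudes, and a gate such as the Hadamard puts it into an equal superposition. Function names here are illustrative, not from any particular library:

```python
import math

# A qubit as a 2-component complex amplitude vector (amplitude of |0>, of |1>).
# Minimal sketch: apply a Hadamard gate to |0> and read off the
# measurement probabilities |amplitude|^2.
def hadamard(state):
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

def probabilities(state):
    return tuple(abs(amp) ** 2 for amp in state)

zero = (1 + 0j, 0 + 0j)          # the basis state |0>
plus = hadamard(zero)            # equal superposition (|0> + |1>)/sqrt(2)
print(probabilities(plus))       # each outcome has probability 1/2
```

A classical bit in this picture would always have one amplitude equal to 1 and the other 0; superposition is exactly what the intermediate amplitudes represent.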

Though quantum computing is still in its infancy, experiments have been carried out in which quantum computational operations were executed on a very small number of qubits. Research in both theoretical and practical areas continues at a frantic pace, and many national government and military funding agencies support quantum computing research to develop quantum computers for both civilian and national security purposes, such as cryptanalysis. (See Timeline of quantum computing for details on current and past progress.)

It is widely believed that if large-scale quantum computers can be built, they will be able to solve certain problems exponentially faster than any classical computer. Quantum computers are different from other computers such as DNA computers and computers based on transistors, even though these may ultimately use some kind of quantum mechanical effect (for example covalent bonds). Some computing architectures such as optical computers may use classical superposition of electromagnetic waves, but without some specifically quantum mechanical resource such as entanglement, they do not share the potential for computational speed-up of quantum computers.



Week 13


A simplified mathematical model of bike and rider.

Bicycle and motorcycle dynamics is the science of the motion of bicycles and motorcycles. It is concerned with the motions of bikes, their parts, and the forces acting on them. Specific subjects include balancing, steering, braking, and suspension.

Experimentation and mathematical analysis have shown that a bike stays upright when it is steered to keep its center of mass over its wheels. This steering is usually supplied by a rider, or in certain circumstances, by the bike itself.

While remaining upright may be the primary goal of beginning riders, a bike must lean in order to turn. The higher the speed or the smaller the turn radius, the more lean is required. This is necessary in order to balance centrifugal forces due to the turn with gravitational forces due to the lean.
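The balance condition described above reduces to tan θ = v²/(gr), where θ is the lean from vertical, v the speed, and r the turn radius. A sketch with illustrative numbers (not taken from the text):

```python
import math

# Lean angle needed to balance a steady turn: tan(theta) = v^2 / (g * r).
# The speed and radius below are illustrative values.
G = 9.81  # gravitational acceleration, m/s^2

def lean_angle_deg(speed_m_s, turn_radius_m):
    """Lean from vertical, in degrees, for a steady turn."""
    return math.degrees(math.atan(speed_m_s ** 2 / (G * turn_radius_m)))

# 10 m/s (36 km/h) around a 20 m radius curve:
print(f"{lean_angle_deg(10, 20):.1f} degrees from vertical")
```

Doubling the speed quadruples v², which is why the required lean grows quickly with speed.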

When braking, depending on the location of the combined center of mass of the bike and rider with respect to the point where the front wheel contacts the ground, bikes can either skid the front wheel or flip the bike and rider over the front wheel.


Week 14

Photo from NASA of the Bullet Cluster showing the inferred dark matter distribution as blue and the measured hot gas distributions in red.

In astrophysics and cosmology, dark matter is matter, not directly observed and of unknown composition, that does not emit or reflect enough electromagnetic radiation to be detected directly, but whose presence can be inferred from gravitational effects on visible matter. According to the Standard Model of Cosmology, dark matter accounts for the vast majority of mass in the observable universe. Among the observed phenomena consistent with the hypothesis of dark matter are the rotational speeds of galaxies and orbital velocities of galaxies in clusters, gravitational lensing of background objects by galaxy clusters such as the Bullet cluster, and the temperature distribution of hot gas in galaxies and clusters of galaxies. Dark matter also plays a central role in structure formation and Big Bang nucleosynthesis, and has measurable effects on the anisotropy of the cosmic microwave background. All these lines of evidence suggest that galaxies, clusters of galaxies, and the universe as a whole contain far more matter than that which interacts with electromagnetic radiation: the remainder is called the "dark matter component".


Week 15

The concept of entropy (from the Greek ἐν en "in" and τροπή tropē "turning, transformation") in thermodynamics is central to the second law of thermodynamics, which deals with physical processes and whether they occur spontaneously. Spontaneous changes occur with an increase in entropy. Spontaneous changes tend to smooth out differences in temperature, pressure, density, and chemical potential that may exist in a system, and entropy is thus a measure of how far this smoothing-out process has progressed. In contrast, the first law of thermodynamics deals with the concept of energy, which is conserved. Entropy change has often been defined as a change to a more disordered state at a molecular level. In recent years, entropy has also been interpreted in terms of the "dispersal" of energy. Entropy is an extensive state function that accounts for the effects of irreversibility in thermodynamic systems.
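The classic melting-ice example admits a one-line calculation: for a reversible phase change at constant temperature, ΔS = Q/T. A sketch using standard handbook values (the 1 kg mass is illustrative):

```python
# Entropy change of ice melting at its melting point: dS = Q / T.
# Latent heat and temperature are standard handbook values;
# the mass is chosen for illustration.
L_FUSION = 334e3   # latent heat of fusion of water, J/kg (approx.)
T_MELT = 273.15    # melting point of ice, K

def entropy_of_melting(mass_kg):
    """Entropy increase (J/K) when mass_kg of ice melts reversibly."""
    return mass_kg * L_FUSION / T_MELT

print(f"dS = {entropy_of_melting(1.0):.0f} J/K per kg of ice")
```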

Image: ice melting - classic example of entropy increasing described in 1862 by Rudolf Clausius as an increase in the disgregation of the molecules of the body of ice.


Week 16

In quantum mechanics, the EPR paradox is a thought experiment which challenged long-held ideas about the relation between the observed values of physical quantities and the values that can be accounted for by a physical theory. "EPR" stands for Einstein, Podolsky, and Rosen, who introduced the thought experiment in a 1935 paper to argue that quantum mechanics is not a complete physical theory. It is sometimes referred to as the EPRB paradox for David Bohm, who converted the original thought experiment into something closer to being experimentally testable.

The EPR thought experiment, performed with electrons. A source (center) sends electrons toward two observers, Alice (left) and Bob (right), who can perform spin measurements.

The EPR experiment yields a dichotomy. Either

  1. The result of a measurement performed on one part A of a quantum system has a non-local effect on the physical reality of another distant part B, in the sense that quantum mechanics can predict outcomes of some measurements carried out at B; or
  2. Quantum mechanics is incomplete in the sense that some element of physical reality corresponding to B cannot be accounted for by quantum mechanics (that is, some extra variable is needed to account for it).

Although originally devised as a thought experiment that would demonstrate the incompleteness of quantum mechanics, actual experimental results refute the principle of locality, invalidating the EPR trio's original purpose. The "spooky action at a distance" that so disturbed the authors of EPR occurs consistently in numerous, widely replicated experiments. Einstein never accepted quantum mechanics as a "real" and complete theory, struggling to the end of his life for an interpretation that could comply with relativity without implying "God playing dice", as he summarized his dissatisfaction with quantum mechanics' intrinsic randomness and counter-intuitive nature.


Week 17

Nikola Tesla (10 July 1856 – 7 January 1943) was a world-renowned inventor, physicist, mechanical engineer and electrical engineer. He was born an ethnic Serb citizen of the Austrian Empire and later became an American citizen. Tesla is best known for his revolutionary work in, and numerous contributions to, the discipline of electricity and magnetism in the late 19th and early 20th centuries. Tesla's patents and theoretical work formed the basis of modern alternating current (AC) electric power systems, including polyphase power distribution systems and the AC motor, with which he helped usher in the Second Industrial Revolution.

After his demonstration of wireless communication in 1893 and after being the victor in the "War of Currents", he was widely respected as America's greatest electrical engineer. Much of his early work pioneered modern electrical engineering and many of his discoveries were of groundbreaking importance. In the United States, Tesla's fame rivaled that of any other inventor or scientist in history or popular culture, but due to his eccentric personality and, at the time, unbelievable and sometimes bizarre claims about possible scientific and technological developments, Tesla was ultimately ostracized and regarded as a mad scientist. Never putting much focus on his finances, Tesla died impoverished at the age of 86.

The SI unit of magnetic flux density or magnetic induction (commonly known as the magnetic field), the tesla, was named in his honour (at the Conférence Générale des Poids et Mesures, Paris, 1960).

Aside from his work on electromagnetism and engineering, Tesla is said to have contributed in varying degrees to the establishment of robotics, remote control, radar and computer science and to the expansion of ballistics, nuclear physics and theoretical physics. In 1943, the Supreme Court of the United States credited him as being the inventor of the radio. Many of his achievements have been used, with some controversy, to support various pseudosciences, UFO theories and new age occultism. Contemporary researchers of Tesla have deemed him "the man who invented the twentieth century" and "the patron saint of modern electricity."


May

In physics and astronomy, redshift occurs when the electromagnetic radiation, usually visible light, that is emitted from or reflected off an object is shifted towards the red end of the electromagnetic spectrum. More generally, redshift is defined as an increase in the wavelength of electromagnetic radiation received by a detector compared with the wavelength emitted by the source. This increase in wavelength corresponds to a decrease in the frequency of the electromagnetic radiation. Conversely, a decrease in wavelength is called blue shift.
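The definition above is usually written z = (λ_observed − λ_emitted)/λ_emitted, and for small z the Doppler interpretation gives a recession speed v ≈ cz. A sketch with illustrative wavelengths (a hypothetical shifted H-alpha line):

```python
# Redshift from observed vs emitted wavelength: z = (obs - emit) / emit.
# For z << 1 the Doppler interpretation gives recession speed v ≈ c*z.
# The wavelengths below are illustrative values, not measured data.
C = 2.99792458e8  # speed of light, m/s

def redshift(lambda_obs, lambda_emit):
    return (lambda_obs - lambda_emit) / lambda_emit

# H-alpha emitted at 656.3 nm, observed at 662.9 nm:
z = redshift(662.9e-9, 656.3e-9)
print(f"z = {z:.4f}, v ≈ {C * z / 1e3:.0f} km/s")
```

A negative z by the same formula is a blue shift, i.e. the source is approaching.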

A redshift can occur when a light source moves away from an observer, analogous to the Doppler shift that changes the frequency of sound waves. Although observing such redshifts, or complementary blue shifts, has several terrestrial applications (e.g., Doppler radar and radar guns), spectroscopic astrophysics uses Doppler redshifts to determine the movement of distant astronomical objects.[1] This phenomenon was first predicted and observed in the 19th century as scientists began to consider the dynamical implications of the wave nature of light.

Another redshift mechanism is the expansion of the universe, which explains the famous observation that the spectral redshifts of distant galaxies, quasars, and intergalactic gas clouds increase in proportion to their distance from the observer. This mechanism is a key feature of the Big Bang model of physical cosmology. Yet a third type of redshift, the gravitational redshift (also known as the Einstein effect), is a result of the time dilation that occurs near massive objects, according to general relativity.

Image: Redshift of spectral lines in the optical spectrum of a supercluster of distant galaxies (right), as compared with that of the Sun (left). Wavelength increases towards the red and beyond (frequency decreases).

References

  1. ^ See Binney and Merrifield (1998), Carroll and Ostlie (1996), Kutner (2003) for applications in astronomy.


June

In classical mechanics, the Laplace–Runge–Lenz vector (or simply the LRL vector) is a vector used chiefly to describe the shape and orientation of the orbit of one astronomical body around another, such as a planet revolving around a sun. For two bodies interacting by Newtonian gravity, the LRL vector is a constant of motion, meaning that it is the same no matter where it is calculated on the orbit; equivalently, the LRL vector is said to be conserved. More generally, the LRL vector is conserved in all problems in which two bodies interact by a central force that varies as the inverse square of the distance between them; such problems are called Kepler problems.

The hydrogen atom is a Kepler problem, since it comprises two charged particles interacting by Coulomb's law of electrostatics, another inverse-square central force. The LRL vector was essential in the first quantum mechanical derivation of the spectrum of the hydrogen atom, before the development of the Schrödinger equation. However, this approach is rarely used today.

In classical and quantum mechanics, conserved quantities generally correspond to a symmetry of the system. The conservation of the LRL vector corresponds to an unusual symmetry; the Kepler problem is mathematically equivalent to a particle moving freely on a four-dimensional sphere, so that the whole problem is symmetric under certain rotations of the four-dimensional sphere. This higher symmetry results from two properties of the Kepler problem: the velocity vector always moves in a perfect circle and, for a given total energy, all such velocity circles intersect one another in the same two points.
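The conservation claimed above can be checked numerically: for the inverse-square Kepler problem, A = p × L − mk r̂ should come out the same wherever it is evaluated along the orbit. A sketch in the orbital plane, with illustrative units and initial conditions (m = k = 1):

```python
import math

# Numerical check that the LRL vector A = p x L - m*k*r_hat is conserved
# along a Kepler orbit (force F = -k r_hat / r^2). Units and the initial
# state are illustrative choices.
m, k = 1.0, 1.0

def accel(x, y):
    r3 = (x * x + y * y) ** 1.5
    return -k * x / r3 / m, -k * y / r3 / m

def lrl(x, y, vx, vy):
    px, py = m * vx, m * vy
    Lz = x * py - y * px                 # angular momentum (z-component)
    r = math.hypot(x, y)
    # In-plane components of p x L - m*k*r_hat, with L along z:
    return (py * Lz - m * k * x / r, -px * Lz - m * k * y / r)

# Integrate part of a bound elliptical orbit with velocity Verlet
# (symplectic, so conserved quantities drift very little).
x, y, vx, vy = 1.0, 0.0, 0.0, 1.2
dt = 1e-3
A0 = lrl(x, y, vx, vy)
for _ in range(8000):
    ax, ay = accel(x, y)
    vx += 0.5 * dt * ax; vy += 0.5 * dt * ay
    x += dt * vx; y += dt * vy
    ax, ay = accel(x, y)
    vx += 0.5 * dt * ax; vy += 0.5 * dt * ay
A1 = lrl(x, y, vx, vy)
print(A0, A1)   # the two vectors agree to integration accuracy
```

Replacing the force law with anything other than inverse-square (say 1/r^2.1) makes A1 visibly rotate away from A0, which is the numerical signature of orbital precession.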

Image: The LRL vector A (shown in red) at a point on the elliptical orbit of a bound point particle moving under an inverse-square central force. The center of attraction is shown as a small black circle from which the position vectors (likewise black) emanate. The angular momentum vector L is perpendicular to the orbit. The coplanar vector (mk/r)r, where m is the mass, k the strength of the central force, and r the radius, is shown in green. The vector A is constant in direction and magnitude.


July

In physics, a laser is a device that emits light through a specific mechanism for which the term laser is an acronym: light amplification by stimulated emission of radiation. This is a combined quantum-mechanical and thermodynamic process. As a light source, a laser can have various properties, depending on the purpose for which it is designed. A typical laser emits light in a narrow, low-divergence beam with a well-defined wavelength (or color, when the laser operates in the visible spectrum). This is in contrast to a light source such as the incandescent light bulb, which emits into a large solid angle and over a wide range of wavelengths. These properties can be summarized in the term coherence.

A laser consists of a gain medium inside an optical cavity, with a means to supply energy to the gain medium. The gain medium is a material (gas, liquid, solid or free electrons) with appropriate optical properties. In its simplest form, a cavity consists of two mirrors arranged such that light bounces back and forth, each time passing through the gain medium. Typically, one of the two mirrors, the output coupler, is partially transparent. The output laser beam is emitted through this mirror.

Light of a specific wavelength that passes through the gain medium is amplified (increases in power); the surrounding mirrors ensure that most of the light makes many passes through the gain medium. Part of the light that is between the mirrors (i.e., is in the cavity) passes through the partially transparent mirror and appears as a beam of light. The process of supplying the energy required for the amplification is called pumping and the energy is typically supplied as an electric current or as light at a different wavelength. In the latter case, the light source can be a flash lamp or another laser. Most practical lasers contain additional elements that affect properties such as the wavelength of the emitted light and the shape of the beam.
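The interplay of amplification and mirror loss described above can be illustrated numerically. The sketch below is an illustration added here, not part of the article: it computes the net intensity multiplier for one round trip of a two-mirror cavity, using hypothetical gain and reflectivity values chosen only to show that light builds up when round-trip gain exceeds round-trip loss.

```python
# Illustrative sketch of the laser threshold condition.
# All numbers are hypothetical, chosen only to demonstrate the idea.

def round_trip_factor(gain, r_back, r_out):
    """Net intensity multiplier for one cavity round trip:
    two passes through the gain medium plus one reflection at each mirror."""
    return gain * r_back * gain * r_out

# Hypothetical cavity: fully reflective back mirror, 95%-reflective output coupler.
r_back, r_out = 1.00, 0.95

below = round_trip_factor(gain=1.02, r_back=r_back, r_out=r_out)  # < 1: light dies out
above = round_trip_factor(gain=1.05, r_back=r_back, r_out=r_out)  # > 1: light builds up

print(f"below threshold: {below:.3f}")
print(f"above threshold: {above:.3f}")
```

Pumping harder raises the single-pass gain; lasing starts once the round-trip factor exceeds 1, after which the intensity grows until gain saturation balances the losses.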

The first working laser was demonstrated in May 1960 by Theodore Maiman at Hughes Research Laboratories. Nowadays, lasers have become a multi-billion dollar industry. The most widespread use of lasers is in optical storage devices such as compact disc and DVD players, in which the laser (a few millimetres in size) scans the surface of the disc. Other common applications of lasers are bar code readers and laser pointers. In industry, lasers are used for cutting steel and other metals and for inscribing patterns (such as the letters on computer keyboards). Lasers are also commonly used in various fields of science, especially spectroscopy, typically because of their well-defined wavelength or, in the case of pulsed lasers, their short pulse duration. Lasers are also used for military and medical applications.

Image: Experiment with a laser (likely an argon type) (US Military)


August

In physics, the Casimir effect or Casimir–Polder force is a physical force exerted between separate objects, which is due not to charge, gravity, or the exchange of particles, but to the resonance of all-pervasive energy fields in the intervening space between the objects. This is sometimes described in terms of virtual particles interacting with the objects, owing to the mathematical form of one possible way of calculating the strength of the effect. Since the strength of the force falls off rapidly with distance, it is only measurable when the distance between the objects is extremely small. On a submicrometre scale, this force becomes so strong that it becomes the dominant force between uncharged conductors. Indeed, at separations of 10 nm, about a hundred times the typical size of an atom, the Casimir effect produces the equivalent of 1 atmosphere of pressure (101.3 kPa).
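The 1-atmosphere figure can be checked against Casimir's ideal parallel-plate formula, P = π²ħc / (240 a⁴). The short sketch below is an illustration added here, not part of the article; it evaluates the formula at a plate separation of a = 10 nm.

```python
import math

# Casimir pressure between ideal parallel plates: P = pi^2 * hbar * c / (240 * a^4)
hbar = 1.054_571_817e-34  # reduced Planck constant, J s
c = 2.997_924_58e8        # speed of light in vacuum, m/s
a = 10e-9                 # plate separation: 10 nm, in metres

pressure = math.pi**2 * hbar * c / (240 * a**4)  # in pascals
print(f"Casimir pressure at 10 nm: {pressure / 101_325:.2f} atm")
```

The result is on the order of 1 atmosphere, consistent with the figure quoted in the paragraph; the steep a⁻⁴ dependence is why the effect is negligible at everyday separations.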

Dutch physicists Hendrik B. G. Casimir and Dirk Polder first proposed the existence of the force, and formulated an experiment to detect it in 1948 while participating in research at Philips Research Labs. The classic form of the experiment used a pair of uncharged parallel metal plates in a vacuum, and successfully demonstrated the force to within 15% of the value predicted by the theory.

The van der Waals force between a pair of neutral atoms is a similar effect. In modern theoretical physics, the Casimir effect plays an important role in the chiral bag model of the nucleon; and in applied physics, it is becoming increasingly important in the development of the ever-smaller, miniaturised components of emerging micro- and nano-technologies.

Image: Casimir forces on parallel plates.


September

Introduction to general relativity provides a generally accessible introduction to the subject. For the main encyclopedia article, please see General relativity.

General relativity (GR) is a theory of gravitation that was developed by Albert Einstein between 1907 and 1915. According to general relativity, the observed gravitational attraction between masses results from those masses warping nearby space and time. Previously, Newton's law of universal gravitation (1686) had described gravity as a force between masses, but experiments have shown that Einstein's description is more accurate. What is more, general relativity predicts interesting new phenomena such as gravitational waves.

General relativity accounts for several effects that are unexplained by Newton's law, such as minute anomalies in the orbits of Mercury and other planets, and it makes numerous predictions – since confirmed – for novel effects of gravity, such as the bending of light and the slowing of time. Although general relativity is not the only relativistic theory of gravity, it is the simplest such theory that is consistent with the experimental data. However, a number of open questions remain; the most fundamental is how general relativity can be reconciled with the laws of quantum physics to produce a complete and self-consistent theory of quantum gravity.

The theory has developed into an essential tool for modern astrophysics. It provides the foundation for our current understanding of black holes; these are regions of space where gravitational attraction is so strong that not even light can escape. Their strong gravity is thought to be responsible for the intense radiation emitted by certain types of astronomical objects (such as active galactic nuclei or microquasars).
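The "not even light can escape" criterion corresponds to the Schwarzschild radius, r_s = 2GM/c²: the size to which a mass must be compressed before it forms a black hole. The sketch below is an illustration added here, not from the article; it evaluates r_s for one solar mass.

```python
# Schwarzschild radius r_s = 2 G M / c^2 for a body of mass M.
G = 6.674_30e-11     # gravitational constant, m^3 kg^-1 s^-2
c = 2.997_924_58e8   # speed of light in vacuum, m/s
M_sun = 1.988_47e30  # mass of the Sun, kg

r_s = 2 * G * M_sun / c**2  # in metres
print(f"Schwarzschild radius of the Sun: {r_s / 1000:.1f} km")
```

The Sun's Schwarzschild radius is about 3 km, far smaller than its actual radius, which is why the Sun is not a black hole; only much more compact objects cross this threshold.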

The bending of light by gravity can lead to the curious phenomenon of multiple images of one and the same astronomical object being visible in the sky. This effect is called gravitational lensing, and its study is an active branch of astronomy. Direct evidence of gravitational waves is being sought by several teams of scientists, as in the LIGO and GEO 600 projects; success should allow scientists to study a variety of interesting phenomena, from black holes to the early universe, by analyzing the gravitational waves they produce. General relativity is also the basis of the standard Big Bang model of cosmology.

Image: High-precision test of general relativity by the Cassini space probe (artist's impression): radio signals sent between the Earth and the probe (green wave) are delayed by the warping of space and time (blue lines).


October

The Compact Muon Solenoid (CMS) experiment is one of two large general-purpose particle physics detectors being built (as of 2007) at the proton–proton Large Hadron Collider (LHC) at CERN in Switzerland. Approximately 2,600 people from 180 scientific institutes form the collaboration building it. It will be located in an underground chamber at Cessy in France, just across the border from Geneva. The completed detector will be cylindrical, 21 metres long and 16 metres in diameter, and will weigh approximately 12,500 tonnes.

The main goals of the experiment are to explore physics at the TeV scale, including the search for the Higgs boson, and to look for evidence of physics beyond the Standard Model, such as supersymmetry or extra dimensions.

Image: The layout of the CMS. In the middle, under the so-called barrel, a man is shown for scale. (HCAL = hadron calorimeter, ECAL = electromagnetic calorimeter)


November

The Casimir force is a force exerted between separate objects due to resonance of electromagnetic energy fields in the intervening space between the objects. This is sometimes described in terms of virtual particles interacting with the objects. Because the strength of the force falls off rapidly with distance, it is only measurable when the distance between the objects is extremely small. On a submicrometre scale, the Casimir force becomes so strong that it can be the dominant force between uncharged conductors. Indeed, at separations of 10 nm — about a hundred times the typical size of an atom — the Casimir effect produces the equivalent of 1 atmosphere of pressure (101.3 kPa).



December

Image: A simulation of a proton collision at the LHC's CMS, causing the decay of Higgs bosons

The Higgs boson is a hypothetical massive scalar elementary particle predicted by the Standard Model. It plays a key role in explaining the origin of mass. It was first theorized in 1964 by Peter Higgs, François Englert, Robert Brout, Gerald Guralnik, Carl Hagen, and Tom Kibble. It is the only particle of the Standard Model not yet observed, despite the attempts of researchers at Fermilab to detect it. Detecting it is also a major aim of the upcoming Large Hadron Collider.

The properties of the Higgs boson would have important implications for physics. For example, if the mass of the Higgs boson is between 115 and 180 GeV, then the Standard Model can be valid at energy scales all the way up to the Planck scale. Measurements of the Higgs boson may also lead to a deeper understanding of the fundamental forces.