
June 29, 2007

Metastable Alloys

If thermodynamics could be summarized in one word, that word would be "equilibrium." Strictly speaking, thermodynamics portrays processes accurately only under equilibrium conditions. Fortunately, Mother Nature is not that strict, so the equations we've learned work well under most conditions. Often, it's useful to turn this restriction around and purposely introduce a non-equilibrium condition to generate an unusual, metastable material. Honeywell did this many years ago when it produced Metglas, an amorphous metal with many unique properties. It was formed by quenching a molten metal onto a rapidly spinning, cooled wheel. The cooling rate is nearly a million degrees per second, and the process is aided by using alloy compositions with deep eutectics. Materials scientists at Sandia National Laboratories are investigating a more extreme process for producing metastable alloys. They are using intense radiation to produce superalloy nanoparticles from precursor solutions [1].

The Sandia team, led by Tina Nenoff, is investigating this unique method for production of superalloy nanoparticles to improve weapons casings and gas turbine engines. Superalloys, as their name implies, have excellent oxidation and corrosion resistance, and they retain their strength at high temperature. Development of improved superalloys has followed the traditional path of changing the proportions of the alloyed elements, adding new elements, and changing heat-treatment conditions. The Sandia work takes the novel approach of using intense electron and proton irradiation to decompose precursor molecules and grow nanoparticles of metal alloys in an alcohol solution, a process called "radiolysis." Radiolysis allows the creation of metastable alloy phases that can't be prepared by the typical melting/solidification/heat-treatment route. The Sandia process produces particles that are of uniform size and defect-free, and the particles themselves are not radioactive.

Reference:
1. Darrick Hurst, "Nanoparticles unlock the future of superalloy metals" (Sandia Press Release, June 13, 2007).

June 28, 2007

Standard Kilogram

What's a few micrograms between friends? It's thought that the standard kilogram, a platinum-iridium alloy cylinder kept at the Bureau International des Poids et Mesures (International Bureau of Weights and Measures), near Paris, France, may have lost or gained about 100 micrograms over the century of its existence. The standard kilogram replaced an earlier, easily realized laboratory standard: a measured volume of water at its temperature of maximum density (277.13 K, or 3.98 °C). This earlier standard had an unusual source of error, aside from obtaining very pure water, maintaining it at a very precise temperature, and correcting for meniscus effects. The density of water depends somewhat on atmospheric pressure, and pressure was itself expressed as a gram-force per unit area, so the definition was circular.

The growth of silicon single crystals is an extremely well developed process that produces huge crystals of extreme perfection for integrated circuits. There is an idea that a perfectly dimensioned cube of an isotopically pure element, such as carbon [1] or silicon, could be used as a mass standard. For silicon, silicon-28 (28Si), which makes up 92.23% of natural silicon, would be used. Atoms in silicon are arranged in a very regular lattice structure (diamond cubic), so that, in principle, the number of atoms in the cube can be "counted." Recently [2], a group of scientists at the Institute of Crystal Growth (Berlin, Germany) prepared a five-kilogram crystal enriched to 99.994% silicon-28. They used a zone-refining technique to purify the material to an impurity level of about 100 ppb. They intend to prepare perfect spheres of this material.
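For a sense of scale, here is a small back-of-the-envelope sketch of what "counting" the atoms in a one-kilogram silicon-28 sphere involves. The molar mass and density values are typical handbook numbers assumed for illustration; they do not come from the article.

import math

# Back-of-the-envelope count of atoms in a 1-kg silicon-28 sphere.
# The molar mass and density below are assumed handbook values.
AVOGADRO = 6.0221415e23     # atoms per mole
MOLAR_MASS_SI28 = 27.977    # g/mol, approximate molar mass of silicon-28
DENSITY_SI = 2.329          # g/cm^3, approximate density of crystalline silicon

mass_g = 1000.0                               # one kilogram
atoms = mass_g / MOLAR_MASS_SI28 * AVOGADRO   # roughly 2.15e25 atoms
volume_cm3 = mass_g / DENSITY_SI
diameter_cm = 2.0 * (3.0 * volume_cm3 / (4.0 * math.pi)) ** (1.0 / 3.0)

print(f"atoms in 1 kg of Si-28: {atoms:.3e}")
print(f"sphere diameter: {diameter_cm:.1f} cm")   # a bit over 9 cm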

Although fabrication of spheres is an easier process than preparing cubes, Frank J. Donahoe of Wilkes University (Wilkes-Barre, PA) suggested the following idea [3]: 84,446,891 is a prime number, and a cube of 84,446,891 atoms on a side would contain just slightly more atoms than the accepted value of Avogadro's number (6.0221415 x 10^23).
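It takes only a few lines to check that arithmetic; the cube of Donahoe's prime does indeed land just above the quoted value of Avogadro's number.

# Quick check of Donahoe's suggestion: a cube with a prime number of
# atoms along each edge that just exceeds Avogadro's number.
n = 84_446_891
avogadro = 6.0221415e23

cube = n ** 3
print(f"{n}^3 = {cube:.7e}")      # just above 6.0221415e23
print(f"excess over Avogadro's number: {cube - avogadro:.1e} atoms")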

References:
1. Ronald F. Fox and Theodore P. Hill, "An Exact Value for Avogadro's Number," American Scientist, vol. 95, no. 2 (March-April 2007).
2. Nicola Jones, "Silicon crystal cooked to perfection," Nature Online (29 May 2007).
3. Frank J. Donahoe, Letter to the Editor, American Scientist, vol. 95, no. 3 (May-June 2007).

June 27, 2007

What, Exactly, is Astronomy?

The word astronomy comes from the Latin astrum (star or constellation) and nominare (to name). Originally, astronomy was just finding and naming stars. By the time of Johannes Kepler in the seventeenth century, astronomy became more of an exact science, and many stars were named by their position, rather than after some mythological character. Eventually, astrometry, the precise measurement of galactic and stellar positions and their movements, emerged as an important subfield of astronomy. Today, astronomers measure the spectra of stars, the distance to galaxies, galactic magnetic fields, and the like. They also measure things such as the anisotropy of the cosmic background radiation, they search for dark energy and gravitational waves, and they put instruments onto satellites. But are these activities really astronomy? In an era when funding for astronomy is severely constrained [1], when it comes time to allocate money for research, is it fair that things relating more to particle physics are funded from the same pot as "real" astronomy?

Simon D. M. White of the Max Planck Institute for Astrophysics, Garching, Germany, says that dark energy studies, which are more relevant to physics than astronomy, will take hundreds of millions of dollars away from other astronomy projects [2]. All of this effort would be to measure just a single ratio; that is, the ratio of dark energy to matter in the universe. In an article scheduled for publication in Reports on Progress in Physics, White draws a sharp distinction between projects that are really astronomy and those that are merely particle physics in disguise. As an example, the Hubble Space Telescope is an astronomy project: it's an observatory, designed for general tasks, that serves a diverse community, is used by teams of all sizes, produces unanticipated results, prepares a new generation of astronomers, and generates great public interest. In contrast, the Wilkinson Microwave Anisotropy Probe is a single experiment, designed for a specific task, that serves a single community of scientists, has a planned result, and employs more computer programmers than astronomers.

White admits that dark energy is an important research topic, but he argues that it can be probed by conventional astronomical observations alone, and that astronomy funds should not be drained into a separate dark energy experiment.

References:
1. William Atkins, "Arecibo Observatory may be nixed" (IT-Wire, June 2, 2007).
2. Simon D.M. White, "Fundamentalist physics: why Dark Energy is bad for Astronomy" (arXiv Preprint).
3. Geoff Brumfiel, "A clash of cosmologies: Is too much physics bad for astronomy?" Nature, vol. 447 no. 7141 (10 May 2007), p. 122f.

June 26, 2007

Lawrence J. Fogel

Computer Science has been around for only fifty years, so many of the well-known algorithms are less than fifty years old. Many of the founders of the field have passed away in the last several years. One of them, Lawrence J. Fogel, died on February 18, 2007 [1].

Lawrence J. Fogel (1928-2007) was the father of evolutionary programming, but like most computer scientists of his age, his career started in a different field for the simple reason that there were no computers. Fogel received his B.S. in electrical engineering from New York University in 1948, and he worked from 1948 until about 1960 designing antennas and military electronics systems. While working, he obtained his M.S. in electrical engineering from Rutgers University (New Brunswick, NJ) in 1952. Many of his patents were in the area of active noise cancellation, and he was also the inventor of an advanced cockpit display. From 1960-61, Fogel was a Special Assistant to the Associate Director of Research of the National Science Foundation, and it was there that he invented evolutionary programming. He developed the method further from 1961-1963 while at General Dynamics (San Diego).

In 1964, Fogel received his electrical engineering Ph.D. from the University of California, Los Angeles. His thesis, entitled "On the Organization of Intellect," was on evolutionary computation [2]. This was about the same time that John Henry Holland was inventing the complementary genetic algorithm. Fogel expanded his thesis into a book [3], "Artificial Intelligence through Simulated Evolution," co-authored with Alvin Owens and Michael Walsh. To capitalize on this new programming method, Fogel founded Decision Science, Inc. in 1965. Decision Science developed a combat flight simulator program that was so effective that they needed to weaken it to give the trainee pilots a chance to win. Decision Science was acquired by another company in 1982. Some years after the sale of his company, Fogel joined ORINCON Corporation. In 1993 he founded another company, Natural Selection, Inc., which exists today, developing applications in such diverse areas as navigation of unmanned vehicles, aeronautics, and industrial control.

Although he was an industrial scientist, Fogel published voluminously. He authored hundreds of journal articles and many books. Fogel was the founding Editor-in-Chief of the Journal of Cybernetics, and he was a Fellow of the IEEE. Like many electrical engineers, he was a hobbyist, building model planes and boats. He was a columnist for Model Builder Magazine in the 1980s, after the sale of his first company. As if that weren't enough, Fogel was a jazz musician, so perhaps he was the model for Buckaroo Banzai!

References:
1. George H. Burgin, "In Memoriam Dr. Lawrence J. Fogel 1928-2007," IEEE Transactions on Evolutionary Computation, vol. 11, no. 3 (June 2007), pp. 290-293.
2. L. J. Fogel, "On the organization of intellect," Ph.D. dissertation, Univ. California at Los Angeles, Los Angeles, CA, 1964.
3. L. J. Fogel, A. J. Owens, and M. J. Walsh, "Artificial Intelligence Through Simulated Evolution," Wiley (New York), 1966.

June 21, 2007

Short Vacation (Plus a Joke)

I'll be away until Tuesday, June 26, 2007, on a short vacation. I'll leave you with a variation on a joke you may have heard before. You can credit me with authorship of this variant.

A mathematician was walking in the research building when he saw there was a small fire in a room. He noticed that a nearby waste-paper basket could be used as a bucket, and there was an eye wash station a few feet down the corridor where he could fill the waste-paper basket with water. Knowing that he could, in principle, extinguish the fire, he continued on his way.

Shortly after that, a physicist saw the same fire, the waste-paper basket, and the source of water. He estimated the quantity of fire and its rate of growth, measured the volume of the waste-paper basket, and measured the flow rate of the water source. He decided that the fire could be extinguished with more than 95% certainty, posted his calculations on the door of the room, and left.

An engineer smelled smoke and traced the source to the fire. He quickly filled the waste-paper basket with water. He was about to extinguish the fire when his PDA beeped to remind him that he was due at a Six Sigma training class. He left the waste-paper basket on the floor and rushed to his class.

A few seconds later, the Six Sigma instructor walked by the same room on his way to teach his class. He saw the fire and the waste-paper basket full of water, and he decided that there was far too much water for such a small fire. He started to calculate the smallest quantity of water needed to extinguish the fire, but as the fire grew in size, he needed to revise his calculation again and again. Finally, the calculated quantity of water exceeded the quantity in the waste-paper basket.

The Six Sigma instructor was the only person to perish in the fire. Without Six Sigma training, the company folds, and the mathematician, physicist and engineer go on to found a successful fire prevention company.

June 20, 2007

Flower Power

The Italian mathematician, Leonardo of Pisa, is considered to be one of the most important mathematicians of the Middle Ages. Since his father had the nickname, "Bonaccio," which today would translate to "Great Guy," Leonardo was called "filius Bonacci," for "son of Bonaccio." That's why he's known as Fibonacci. One of his legacies is a sequence of numbers we now call the Fibonacci Sequence. This number sequence is easy to generate. Start with {0,1}, add these two numbers to get a third number, in this case 1, and add that to the sequence to get {0,1,1}. From that point onwards, just keep adding the final two numbers of the sequence to get the next number; i.e., {0,1,1,2}, {0,1,1,2,3}, {0,1,1,2,3,5}, {0,1,1,2,3,5,8}, and so on. Simple enough, but why would it be interesting? As it turns out, the Fibonacci Sequence models some biological processes, such as the arrangement of segments on a pine cone, branching in trees, and the spirals of sea shells [1]. It also models the pattern of petals in a sunflower [2] according to a simple equation that gives the rotation angle from one petal to another as you move from the center of the flower. This arrangement makes sense, since it gives adequate space to each petal as the flower grows outwards, but why does such a thing happen?
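A minimal sketch in Python makes both ideas concrete: the sequence itself, and the "golden angle" of roughly 137.5 degrees that turns up as the petal-to-petal rotation in Vogel's sunflower model [2]. The placement formula at the end (angle = n times the golden angle, radius proportional to the square root of n) is Vogel's, summarized here rather than quoted from the post.

import math

def fibonacci(count):
    seq = [0, 1]
    while len(seq) < count:
        seq.append(seq[-1] + seq[-2])    # each term is the sum of the previous two
    return seq

print(fibonacci(15))                     # [0, 1, 1, 2, 3, 5, 8, 13, ...]

# Ratios of successive terms approach the golden ratio phi, and the
# corresponding divergence angle is 360/phi^2, about 137.5 degrees.
phi = (1 + math.sqrt(5)) / 2
golden_angle = 360.0 / phi ** 2
print(f"golden angle: {golden_angle:.1f} degrees")

# Vogel's model: the n-th floret sits at angle n*golden_angle, radius ~ sqrt(n).
florets = [(n * golden_angle % 360.0, math.sqrt(n)) for n in range(1, 200)]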

In 1868, the German botanist, Wilhelm Hofmeister, proposed that plant cells tend to move in directions away from existing growth. This was later shown to be an effect of chemical signalling in the plant via a plant hormone, auxin. Stephane Douady and Yves Couder, two physicists from the Laboratoire de Physique Statistique, Paris, France, reproduced this effect, known by its proper name, phyllotaxis (from the Greek φυλλον, leaf, and τακτικος, arrangement), in a numerical simulation in 1992 [3]. The physical basis of this simulation is the idea of having the system evolve in a way that avoids a rational, periodic organization. Their simulation has only one parameter that describes the successive appearance of new elements.

This simulation builds on the results of an experiment that Douady and Couder performed [3]. In this experiment, they dropped droplets of ferrofluid (a suspension of fine magnetic particles) into a dish of silicone oil in the center of a magnetic ring. Ferrofluid and silicone oil are immiscible, so the droplets remained as small spheres in the silicone oil. At a slow dropping rate, the droplets were attracted to the edge of the dish and repelled from each other. However, as the dropping rate increased, there were more droplets in the vicinity of the newer droplets, and the repulsion forces were more complex. At a certain rate, a spiral pattern of droplets emerged. One thing that emerged from this experiment was that the spiral need not always obey the Fibonacci Sequence. It could instead obey a closely related sequence based on the Lucas numbers, {2, 1, 3, 4, 7, 11, 18, 29, 47, 76, 123, ...}. A few plants have been identified with phyllotaxis in this pattern.

Flower Power was the slogan of the US hippie movement during the time of the Vietnam War. It is thought that the expression, "Flower Power," was coined by the beat poet, Allen Ginsberg, in 1965. Followers of the movement were known as "Flower Children."

References:
1. Fibonacci numbers in nature (Wikipedia).
2. H. Vogel, "A better way to construct the sunflower head," Mathematical Biosciences, vol. 44 (1979), pp. 179-189.
3. S. Douady and Y. Couder, "Phyllotaxis as a physical self-organized growth process," Phys. Rev. Lett., vol. 68, no. 13 (1992), pp. 2098-2101.
4. Julie J. Rehmeyer, "The Mathematical Lives of Plants," Science News, vol. 171, no. 18 (May 5, 2007).

June 19, 2007

Time Travel

We all are time travelers. The problem is, we can only move forward through time, and all at the same rate, whatever "forward" means. Our current sense of time is dictated by the second law of thermodynamics, the entropy law. This law, simply stated, is "The entropy of an isolated system not in equilibrium will tend to increase over time, approaching a maximum value at equilibrium [1]." Entropy is the arrow that points in the direction of increasing time. The Time Machine by H. G. Wells is an interesting story, but mainstream physics doesn't see any way for time travel in a "time machine" sense. As a consequence, there are no funded proposals for time travel.

This lack of traditional funding for time travel research didn't deter John Cramer, a University of Washington physicist. Since public money wasn't available, he convinced several private donors to fund his research. He was aided by the fact that he is known outside traditional scientific circles, since he writes science fiction books. (Gregory Benford, a professor at the University of California, Irvine, is another physicist who writes science fiction novels.) Cramer is also aided by his scientific credentials: he has worked at CERN, and he was previously the director of the University of Washington's nuclear physics laboratory. So far, the public has donated $35,000 to fund his research program. One donor, a music industry executive, has donated $20,000 of this sum [2].

How far will this $35,000 get him? Cramer's experiment doesn't involve huge particle accelerators. It's a simple optics experiment that sits on a benchtop. He hopes to detect signaling between distant, entangled photons. This is an experimental test of the Einstein-Podolsky-Rosen (EPR) Paradox, a quantum mechanical effect that seems to allow instantaneous signaling across great distances. Einstein helped to author the paradox as an argument against quantum mechanics. He called the EPR effect "spooky action at a distance," but the paradox did not deter practitioners of quantum mechanics, even though they saw no way around it.

John Wheeler and Richard Feynman, both prominent physicists, dubbed the EPR effect "retrocausality," and they thought it was worthwhile investigating. Cramer calls the version of retrocausality tested by his experiment the "transactional interpretation of quantum mechanics." DARPA told him that his experiment was "too weird." That's the same DARPA that funded his work in creating liquid robots, simpler versions of the one in the Terminator movie.

References:
1. The second law of thermodynamics (Wikipedia).
2. Tom Paulson, "Public donates to UW scientist to fund backward-in-time research".
3. Tax-deductible contributions to the project may be made by contacting Jennifer Raines, UW Department of Physics, at jraines@phys.washington.edu. (More information in ref. 2).

June 18, 2007

It's a Hit!

Music is big business, generating several tens of billions of dollars in wholesale revenue each year. Just as in product development, for every hit there must be at least ten duds. In industry we try to increase the hit-to-dud ratio by doing extensive market analysis. Picking a hit record is more difficult, since the topic is art, not science, and a panel of trained ears needs to make its best guess as to whether a record should go to market or stay in the recording studio vault. There's a possibly apocryphal story I heard in the sixties about a company marketing a music service to the recording industry. Recording industry executives would travel to the company, where there was a computer system in a room. All you needed to do was play the recording in the presence of the computer, and lights would indicate whether or not it was a hit. The service became very popular, since there was a high success rate. As it turned out, much like "The Turk", a famous chess-playing automaton, there was a man in the box who did all the analysis. Fortunately for him, computers were as large as refrigerators in those days [1]. It's hard for machinery to best the working of the human mind.

Today, computers are much faster, and mathematicians are still very clever, so an actual computer system has emerged to analyze recordings for their sales potential [2,3]. Platinum Blue Music Intelligence Inc., or just Platinum Blue, a New York City-based company, has developed a computer algorithm for analyzing songs. How does it work? Platinum Blue identified thirty quantifiable parameters in any song recording. Some obvious first choices for parameters would be tempo, changes in tempo, spectral range, etc. It seems hard to define thirty parameters, let alone detect and analyze them. Actually, the analysis step is simple, at least for a mathematician. A recording can be identified by a point in thirty-dimensional space, and its proximity to other songs in that space is the end result of the analysis.
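Platinum Blue hasn't published its thirty parameters or its weighting, so the following is only an illustrative sketch of the underlying idea: treat each recording as a point in thirty-dimensional space and rank known hits by their Euclidean distance from it. All of the names and numbers here are invented.

import math
import random

DIMENSIONS = 30

def distance(a, b):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Pretend catalog: each known hit is a 30-dimensional feature vector.
random.seed(0)
catalog = {f"hit_{i}": [random.random() for _ in range(DIMENSIONS)] for i in range(1000)}

def closest_hits(candidate, catalog, k=5):
    """Return the k cataloged hits nearest to the candidate recording."""
    ranked = sorted(catalog.items(), key=lambda item: distance(candidate, item[1]))
    return ranked[:k]

new_song = [random.random() for _ in range(DIMENSIONS)]
for title, features in closest_hits(new_song, catalog):
    print(title, round(distance(new_song, features), 3))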

There are two uses for the algorithm. First is a hit detector called Music Xray. Obviously, a song that's a close neighbor to Bono in this thirty-dimensional space is hit material, although it might be better to have Bono do the release recording. The second use of the algorithm, called Platinum Song Seeker, helps people select more of the music they like. You list your record collection, and the mathematical model suggests other recordings in close proximity to your favorites. You're "mathematically guaranteed" to like these. The algorithm is also being marketed for copyright-violation detection and for selecting appropriate music for movies and video games. Can pure mathematics really make money? Perhaps not. Platinum Blue also offers a credit card awards program geared towards music. Anything to pay the bills.

References:
1. Disclaimer - No, people shouldn't shut themselves into refrigerators.
2. Tony Phillips, "Mathematical patterns in songs."
3. Predicting Popularity: The Math Behind Hit Music (National Public Radio Interview).

June 15, 2007

Half-Brained Mice

Ten years ago, an IBM supercomputer called Deep Blue beat chess grandmaster Garry Kasparov in a chess tournament. Blue is the IBM color, and IBM is commonly called Big Blue. Deep Blue, whose 32 processors could examine about 200 million moves per second, won by a full game out of the six played [1]. This accomplishment had long been considered the Holy Grail of computer intelligence, but are computers really that smart?

Fast forward to the present to the latest manifestation of Deep Blue, IBM's BlueGene/L supercomputer, whose largest installation has a peak processing speed of about 360 teraflops (IBM's newest processor, the 4.7 GHz POWER6, was announced just last month [2]). Researchers at IBM and the University of Nevada have used the computing power of a 4,096-processor BlueGene/L to simulate the cortex of a mouse brain, which has about eight million neurons, each connected to eight thousand other neurons. Although neurons fire at a slow rate, about once per second, each neuron must process its inputs from the other neurons to update its state, so there are 64 billion updates per second (eight million times eight thousand). That rate is quite beyond a single processor. The parallelism of the problem allows it to be broken into parts so that the multi-processor BlueGene/L can handle it, but only at a tenth of the mouse rate. Memory constraints limit the run to ten seconds of computation, so what we have is just a single second of a mouse thought.
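The back-of-the-envelope arithmetic in that paragraph is easy to reproduce, using the post's round numbers:

# Update-rate arithmetic for the simulated mouse cortex (round numbers from the text).
neurons = 8_000_000          # neurons in the simulated mouse cortex
connections = 8_000          # connections per neuron
firing_rate = 1.0            # spikes per neuron per second, roughly

updates_per_second = neurons * connections * firing_rate
print(f"{updates_per_second:.2e} synaptic updates per second")   # 6.4e10, i.e. 64 billion

# Running at a tenth of real time, ten seconds of computation
# covers only about one second of simulated mouse time.
slowdown = 10
compute_seconds = 10
print(f"simulated time: {compute_seconds / slowdown:.0f} s")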

The human brain has about 100 billion neurons with similar connectivity, so a much more powerful computer would be required to simulate the human mind. But would this be just a simulation, or a conscious person? As I mentioned in a previous article, some scientists believe that computers will become sentient somewhere between the petaFLOP and exaFLOP (1,000 petaFLOPS) level.

References:
1. Michael Corrado, "Blue Gene, Progeny of IBM's Chess Champ, Deep Blue, Becomes an Engine of Scientific Discovery" (IBM Press Release).
2. Darren Murph, "IBM kicks out energy-efficient 4.7GHz POWER6 processor" (engadget.com, May 22, 2007).
3. Kate Bevan, "Just how powerful is the brain of a mouse?" (The Guardian, May 3, 2007).

June 14, 2007

Mr. Wizard

My parents always encouraged my scientific interests, buying me science books, batteries, and other equipment. There was that one time, though, when my father complained about the smells when I was making my own paper. I was recreating an experiment I saw on a television show, "Watch Mr. Wizard." Don Herbert, who played "Mr. Wizard" in 547 episodes on the NBC television network, died on June 12, 2007, at age eighty-nine [1,2].

Don Herbert (1917-2007) did not have an advanced degree in science. He graduated from the LaCrosse (Wisconsin) State Teachers College in 1940 with credentials in both English and science. His teaching career was interrupted immediately by World War II, in which he flew fifty-six missions as a B-24 bomber pilot in the US Army Air Force. After the war, he worked in radio, and as an actor and model, before his role as Mr. Wizard on the NBC television network in 1951. The concept for using the new medium of television to present science experiments to children was Herbert's idea alone, and the television series ran for fourteen years. Magnetic recording was not well developed until the mid-1960s, so most television shows were broadcast live. This was somewhat of a gamble, since child actors were involved in every episode. However, it was the presence of a child, learning how to do simple scientific experiments, that made the show popular among "Baby Boomers," myself included. "Watch Mr. Wizard" was the winner of a Peabody Award, and the show was reprised on the Nickelodeon cable television channel in the 1980s.

The shows, which aired in black and white, began with an announcer saying, "Watch Mr. Wizard. That's what all the kids in the neighborhood call him, because he shows them the magic and mystery of science in everyday living." Herbert said that the show was directed to twelve-year-old children, and that he had tried doing the show solo, but he was dissatisfied. Having a child there to react and ask questions gave the show some needed energy. Herbert stated in an interview with the Voice of America that "Using everyday equipment made it something that children should not be afraid of. If you used scientific equipment that's strange to the child, it's not going to help him or her understand." [3] The original "Watch Mr. Wizard" shows are available on DVD [4].

References:
1. 'Mr. Wizard' dies at 89 (Los Angeles Times, June 12, 2007); also available here.
2. Donald Jeffrey Herbert (Wikipedia).
3. Art Chimes, "TV's Mr. Wizard Touched the Daily Lives of Generations of Kids with the Magic of Science" (Voice of America).
4. Mr. Wizard Web Site.

June 13, 2007

Another Milestone - 200 Articles

This is the two-hundredth entry of this blog, which started on August 10, 2006. You can read my welcoming message here.

In honor of this occasion, some levity is in order, so I present the "barometer question." This is a simple question intended for freshman physics students: "How would you measure the height of a building with a barometer?" Of course, physicists are creative and playful people, and they don't stop at a simple answer. One web site lists fifteen possible methods for finding the height of a building with a barometer. The first is the "expected" answer.

• Measure the barometric pressure at the top and at the bottom
• Drop the barometer and time how long it takes to fall
• Use the barometer as a measuring stick
• Offer the barometer to the superintendent in return for the height of the building
• Measure the shadow of the barometer and the building
• Measure the shadow of the building, calibrated by the barometer
• Find a barometer with heights of local buildings on it
• Compare the barometer height to the building height (relative triangles)
• Trade the barometer for a long measuring tape
• Drop the barometer on the roof and on the ground (inverse square gravitation change)
• Use the barometer as a pendulum
• Drop the barometer on a windless day (mercury vapor diffusion from broken barometer)
• Seal the building and fill with water
• Take the barometer and building to an airless world (achieve orbital motion of barometer)
• Clap the barometer and listen for the echo

June 12, 2007

Granular Materials

Granular materials have some peculiar properties [1]. The most interesting is the "Brazil Nut Effect," named after the tendency of the largest nuts to end up at the surface of a container of mixed nuts. This can be reproduced in any mixture of large and small particles that is shaken vertically. If you make a conical pile of sand by pouring it through an orifice, the region of highest pressure at the bottom of this pile is not below the center of the cone; it's in a ring at one third the radius of the base. More surprisingly, the flow rate of sand through an hourglass is independent of the height of the sand in the upper chamber, quite unlike the case for water.

Granular materials flow well under most conditions. An hourglass, mentioned above, is one example. In the real world, however, blockages occur frequently in the transport of granular materials in industrial processes. Some simple experiments have been performed recently to examine the jamming process [2,3].

Harry L. Swinney, the Sid Richardson Foundation Regents Chair of the Department of Physics, University of Texas at Austin, became interested in granular flow after seeing a magician's trick. A knife is inserted into a bottle filled with sand, the bottle is shaken slightly, and the magician is able to raise the bottle using the knife as a handle and swing it above his head. As Swinney found, there's more physics than magic at work here. He added small glass beads to a tube filled with water and found that when the beads reached 59 volume percent, there was a transition, similar to water freezing, in which the beads locked together into a coherent mass. Swinney thinks the same phenomenon is at work in the magician's trick.

Robert Behringer, a professor of physics at Duke University, and his colleagues studied the same jamming phenomenon using a single layer of transparent plastic disks in a channel - a two-dimensional analog of a granular material. By viewing the disks through crossed polarizing filters, the Duke physicists were able to record the stress in each disk with a video camera while the channel dimensions were changed. The changing channel dimensions were a way to change the volume fraction of the disks in the channel. Their measurements showed a similar jamming phase transition, and they also showed the development of "force chains" in the material. The granular material is not uniformly stressed when it jams. Instead, the stress is carried by linear filaments of contacting disks. Behringer's research may lead to techniques for preventing jamming in solids hoppers, but for now he offers this advice, "Bang on the side of your hopper with a sledgehammer."

The hammer is a versatile tool, the expression, "When all you have is a hammer, everything looks like a nail," notwithstanding. A hammer was used on the moon in an attempt to fix the Apollo 12 television camera. The camera had a spinning filter disk to enable a color image, and the transmitted image looked as if the wheel had stalled. One of the astronauts tapped the camera with a hammer in an attempt to restart the wheel. Unfortunately, the problem was electronic (the camera was accidentally pointed at the sun, burning out the camera tube), and not a problem with the spinning filter wheel, so the hammer fix didn't work. As a very young child, I wondered how hammers could exist, because it seemed to me that you needed a hammer to make a hammer.

References:
1. Harry L. Swinney Research Group.
2. Glowing discs reveal sudden granular jamming (New Scientist).
3. Magic sand or simple physics? (New Scientist).
4. F. Ludewig, S. Dorbolo and N. Vandewalle, "Effect of friction in a toy model of granular compaction." (Preprint)

June 11, 2007

Extrasolar Planets (Continued)

At one time, we thought there were only five planets. Then, our solar system grew to nine planets, a number reduced by one with the demotion of Pluto last year. Of course, that's just for our own solar neighborhood. If you include planets around other stars, you can count an additional 236 planets discovered to date and cataloged on a new web site. Twenty-eight of these were discovered in just the past year, and a survey of their properties indicates that billions of habitable planets are likely to exist in our galaxy. It's as if the Star Trek universe, with its many "Class M" planets, is coming true.

There is much present activity in the search for extrasolar planets, and there are new technologies to aid discovery. There were actually thirty-seven objects discovered in the past year circling other suns, but seven of these were brown dwarfs, nearly star-sized objects which do not have enough mass to ignite a thermonuclear reaction and qualify as a star. Most of the search for extrasolar planets is conducted by astronomers from the California and Carnegie Planet Search and the Anglo-Australian Planet Search teams. Their work is in addition to other efforts, such as the European Southern Observatory search that discovered planets around Gliese 581, but this collaboration has discovered more than half of the extrasolar planets.

The California and Carnegie Planet Search/Anglo-Australian Planet Search collaboration has concentrated on the discovery of planets within 650 light years of Earth. Such a short distance, cosmologically speaking, allows the possibility of actually imaging the planets sometime in the future. One planet the team discovered, circling Gliese 436, a star just 30 light years from Earth, is especially interesting. A transit of the planet across the star allowed a measurement of its density as about two grams per cubic centimeter. This would be the equivalent of a planet composed of half rock and half water. The planet is so massive, however, that the water would be compressed into a solid and would not exist as a liquid.
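As a rough check of that interpretation, a half-rock, half-water mixture does average out to about two grams per cubic centimeter. The rock density assumed below (about 3 g/cc) is a typical value, not a figure from the press release.

# Why a density of 2 g/cc suggests a half-rock, half-water composition.
rho_rock = 3.0     # g/cc, assumed typical density of rocky material
rho_water = 1.0    # g/cc, density of water
fraction_rock = 0.5                      # half the planet's volume is rock

mean_density = fraction_rock * rho_rock + (1 - fraction_rock) * rho_water
print(f"mean density: {mean_density} g/cc")   # 2.0 g/cc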

Now that a sizeable number of planets have been discovered, it's possible to do some statistics. Massive stars appear more likely to have massive planets. This is probably a consequence of the formative solar system having more material to start with.

References:
1. Extrasolar Planets (This Blog).
2. Lynette Cook, "28 New Exoplanets and Four Multi-Planet Systems" (University of California Press Release).
3. R. P. Butler, J. T. Wright, G. W. Marcy, D. A. Fischer, S. S. Vogt, C. G. Tinney, H. R. A. Jones, B. D. Carter, J. A. Johnson, C. McCarthy, and A. J. Penny, "Catalog of Nearby Exoplanets," Astrophysical Journal, vol. 646 (2006), pp. 505f. (PDF file).

June 08, 2007

Plasma Lamps

Mercury is a toxic chemical element. Mercury's toxicity is exacerbated by its high vapor pressure at room temperature, and its tendency to form metallo-organic compounds. Acute exposure to mercury vapor results in central nervous system effects, most graphically illustrated by the mercury poisoning of the people of Minamata, Japan. The Mad Hatter in Alice's Adventures in Wonderland, written by the English mathematician, Charles Dodgson (a.k.a. Lewis Carroll), was mad from mercury poisoning. Unfortunately, mercury is present in fluorescent lamps, where it acts as an ultraviolet emitter (253.7 and 185 nm) that excites the lamp phosphor. Of course, a mercury-free efficient lighting source would be welcome.

A team from the University of Illinois at Urbana-Champaign Department of Electrical and Computer Engineering has developed a thin plasma lamp that's designed to replace traditional fluorescent lamps in residential and commercial lighting applications. As the world waits for thin, white-light emitting diodes for illumination, this team has developed a light source that's six times thinner than light-emitting diode panels [1].

The plasma of this lamp is contained in microcavities etched in a thin aluminum sheet of a few hundred micrometers thickness. The cavities are 500 micrometers in diameter. This perforated aluminum sheet is rendered non-conductive by surface oxidation and coating with a thin glass layer, and it's then bonded to an aluminum substrate. A glass window of about 500 micrometers thickness with a 10 micrometer phosphor layer is then bonded to the perforated sheet. The structure is evacuated and filled with a plasma-producing gas. Voltage excitation is apparently applied between the aluminum substrate and the perforated aluminum sheet. The overall thickness of the lamp structure is about 0.8 mm.

The team has fabricated panels of more than 200 square centimeters in size with a luminous efficiency of up to fifteen lumens per watt. They anticipate the efficiency to reach 30 lumens per watt. For comparison, an incandescent bulb has an efficiency of 10 - 15 lumens per watt, depending on its wattage rating (e.g., a hundred watt bulb is more efficient than a forty watt bulb). An efficient fluorescent light has a luminous efficiency of a hundred lumens per watt, and an "ideal" white light source would have a luminous efficiency of 242.5 lumens per watt.
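To put those figures side by side, here is a small sketch computing the electrical power each source would need to match the light output of a hundred-watt incandescent bulb. The target of roughly 1,500 lumens follows from the 15 lumens-per-watt figure and is my own assumption, not a number from the press release.

# Power needed to match the light output of a 100 W incandescent bulb,
# using the efficacy figures quoted in the paragraph above.
target_lumens = 100 * 15          # roughly 1,500 lumens (assumed target)

efficacy = {                      # lumens per watt
    "incandescent": 15,
    "microcavity plasma (today)": 15,
    "microcavity plasma (projected)": 30,
    "fluorescent": 100,
    "ideal white source": 242.5,
}

for source, lm_per_watt in efficacy.items():
    watts = target_lumens / lm_per_watt
    print(f"{source:32s} {watts:6.1f} W")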

Flexible displays built on polymers have been demonstrated. A paper on this light source, authored by Gary Eden, a professor of electrical and computer engineering, three of his students, and Sung-Jin Park, a visiting research scientist in his laboratory, appears in the June issue of the Journal of Physics D: Applied Physics [2]. This is a special issue on mercury-free discharges for lighting. This work was funded by the U.S. Air Force Office of Scientific Research and the Office of Naval Research.

References:
1.James E. Kloeppel, "Aluminum foil lamps outshine incandescent lights" (University of Illinois at Urbana-Champaign Press Release, 4-Jun-2007).
2. S-J Park, J D Readle, A J Price, J K Yoon and J Gary Eden, "Lighting from thin (< 1 mm) sheets of microcavity plasma arrays fabricated in Al/Al2O3/glass structures: planar, mercury-free lamps with radiating areas beyond 200 cm2," Journal of Physics D: Applied Physics (online publication, June 2007; print publication, July 2007).

June 07, 2007

Inverse Woodpile

Many years ago, Honeywell developed an inverse opal photonic material [1,2]. Opal is composed of stacked spheres of silica of regular size. The spheres are stacked like oranges with a planar spacing of a few hundred nanometers. The inverse opal structure is formed by filling the space between spheres and then etching away the silica.

Now, a team of materials scientists from the University of Illinois at Urbana-Champaign has developed an inverse woodpile structure of germanium. This material has an extremely large photonic band gap, so it is able to reflect light over a large range of wavelengths [3]. Their research has been reported online at the web site of the journal Advanced Materials [4].

The University of Illinois team used a type of solid free-form fabrication to create this structure. A polymeric ink is dispensed as a filament from a micrometer-sized nozzle to form rods in a woodpile arrangement under computer control. Then, a coating of alumina and silica is deposited to ensure contact between the rods and protect the polymer. Finally, the space between the rods is filled with germanium, the polymer is burned away, and the alumina-silica layer is removed by an acid etch. What remains is germanium in an inverse woodpile structure. Their proof-of-concept device consisted of twelve layers of tubes with a thickness of about fifteen micrometers. The structure is about a half-millimeter square.

The inverse woodpile research was funded by the U.S. Department of Energy (DEFG-02-91-ER45439), and the U.S. Army Research Office (DAAD19-03-1-0227).

References:
1. Anvar Zakhidov, Ray Baughman, Changxing Cui, Ilyas I. Khayrullin, Lo-Min Liu, Igor Udod, Ji Su, Mikhail Kozlov, US Patent No. 6,261,469, "Three dimensionally periodic structural assemblies on nanometer and longer scales" (Jul 17, 2001).
2. Anvar Zakhidov, Ray Baughman, Changxing Cui, Ilyas I. Khayrullin, Lo-Min Liu, Igor Udod, Ji Su, Mikhail Kozlov, US Patent No. 6,517,763, "Three dimensionally periodic structural assemblies in nanometer and longer scales" (Feb 11, 2003).
3. James E. Kloeppel, "Inverse woodpile structure has extremely large photonic band gap" (University of Illinois at Urbana-Champaign Press Release, 21 May 2007).
4. F. Garcia-Santamaria, M. Xu, V. Lousse, S. Fan, P. V. Braun, and J. A. Lewis, "A Germanium Inverse Woodpile Structure with a Large Photonic Band Gap," Advanced Materials (Published Online: 18 May 2007).

June 06, 2007

Forensic Nanoparticles

Honeywell has had a no-smoking policy for many years. Before then, meetings were nearly unbearable for me, since several of my co-workers were smokers. One thing I remember is that my hands felt dirty after sitting through such meetings. Well, it wasn't my imagination. Research shows that fingerprints can be used to detect not only a smoking habit, but also drug use. Even my own secrets are not safe, since coffee drinkers can be detected by their fingerprints! This fingerprinting method was developed in England by scientists at the University of East Anglia and King's College London. Should we be surprised at the venue of this research? England has become an Orwellesque country of more than four million surveillance cameras, one for every fourteen people. It is estimated that people there are seen on camera about 300 times each day [1].

The detection method is simple [2]. When such substances are processed by the body, certain by-products (metabolites) are produced, and some of these are excreted through the skin. Of course, the quantities of these metabolites are small, so the novelty here is in the detection method, which is so sensitive that hand washing will not prevent detection. The secret ingredient is gold nanoparticles. For detection of a tobacco habit, gold nanoparticles tagged with an antibody that binds to cotinine, a metabolite of nicotine, are applied to the fingerprint. A second antibody for cotinine, tagged with a fluorescent dye, is then applied, and the fingerprint is examined under ultraviolet light. The method can be extended to other metabolites, such as those of alcohol and sports drugs, provided that suitable antibodies can be developed [3].

References:
1. George Orwell, Big Brother is watching your house (thisislondon.co.uk, March 31, 2007)
2. Paul Marks, "New fingerprint analysis identifies smokers," New Scientist Online (18 May 2007).
3. Richard Leggett, Emma E. Lee-Smith, Sue M. Jickells, and David A. Russell, "Intelligent Fingerprinting: Simultaneous Identification of Drug Metabolites and Individuals by Using Antibody-Functionalized Nanoparticles," Angewandte Chemie International Edition, Volume 46, Issue 22 (27 Apr 2007), pp. 4100-4103.

June 05, 2007

Radiation-Eating Fungi

Many plants concentrate trace elements, so they're useful for bioremediation of hazardous waste sites. One example is locoweed (genus Astragalus), which concentrates selenium up to several percent of its dry weight. Animals grazing on locoweed suffer neurological damage from organo-selenium compounds, and this leads to unusual behavior. In fact, the common name of locoweed derives from the Spanish loco, for crazy. In the age of high nickel prices, we should be interested in Sebertia acuminata, a tree indigenous to New Caledonia, the dry weight of which is twenty percent nickel! This same mineral-concentrating behavior of plants can be applied to remediation of radioactive waste if the plants concentrate actinide elements. Now, scientists have discovered a fungus that doesn't eat radioactive elements - it "eats" radiation itself [1, 2].

A group of scientists from the Albert Einstein College of Medicine, New York City, New York, have published a paper [3] that describes the radiation-harvesting mechanism of the fungi, Cladosporium sphaerospermum, Wangiella dermatitidis and Cryptococcus neoformans. Cladosporium sphaerospermum is a common household mold. One of the authors, Arturo Casadevall, read an article about a robot sent to explore the highly radioactive Chernobyl reactor. The robot harvested a black fungus growing on the reactor walls and it was found to be rich in melanin. Melanin is a pigment found in many species of fungi, and also in human skin. It is known that melanin-rich fungi are more common in soils containing radioactive minerals.

In their experiments, they exposed fungi to the radioactive isotope, cesium-137, to produce a radiation level about 500 times stronger than typical background radiation levels. The fungi grew faster in this radioactive environment, and their melanin was found by electron spin resonance measurements to have transformed to a different molecular conformation. It appears that just as chlorophyll converts sunlight to chemical energy, the fungal melanin does the same for ionizing radiation. As a control experiment, they found that fungi without melanin did not grow faster upon radiation exposure. To think that I grew up thinking that environmental radiation was always bad.

References:
1. Karen Gardner, "Einstein researchers' discover 'radiation-eating' fungi" (Albert Einstein College of Medicine Press Release, 22 May 2007).
2. Heidi Ledford, "Hungry fungi chomp on radiation" (Nature Online, 23 May 2007, doi:10.1038/news070521-5).
3. Ekaterina Dadachova, Ruth A. Bryan, Xianchun Huang, Tiffany Moadel, Andrew D. Schweitzer, Philip Aisen, Joshua D. Nosanchuk, and Arturo Casadevall, "Ionizing Radiation Changes the Electronic Properties of Melanin and Enhances the Growth of Melanized Fungi," PLoS ONE (2007).

June 04, 2007

Lightning Detection

Lightning kills more people in the US than any other weather phenomenon. There are approximately 67% as many deaths from tornadoes, 58% as many deaths from floods, and 25% as many deaths from hurricanes [1]. The following is a list of the top ten states for lightning-related deaths from 1940-1984, and the total number of deaths in that period.

• Florida - 1523
• Michigan - 732
• Pennsylvania - 644
• North Carolina - 629
• New York - 577
• Ohio - 545
• Texas - 498
• Tennessee - 473
• Georgia - 410
• Colorado - 394

For those of us who still listen to AM broadcast radio, it's easy to detect a proximate lightning storm, since each lightning strike is heard as a noise spike. The detection range for these frequencies (530-1750 kHz) is about twenty miles, but interference is also present at lower and higher frequencies. A quantitative picture of radio signals from a lightning storm can be found on my daughter's meteor detection web site. In this case the amplitude modulated interference signal from lightning at 97.7 MHz is plotted as a function of time. The electromagnetic interference (EMI) caused by a lightning stroke can be detected in nearly all radio frequency bands from 10 Hz to 5 GHz. There's a peak at around 500 Hz. A lightning storm at about five miles distance produces radio pulses with amplitudes of up to a tenth of a volt per meter in a kilohertz bandwidth.

In a recently published US patent application [3], Finnish inventors at the cellphone manufacturer Nokia have applied this same radio detection method as a lightning warning system for cellphone users. Modern cellphones contain radio receivers not just for their primary communications function (typically, GSM). They also contain radio receivers for networking (Bluetooth, Wi-Fi and RFID) and for FM radio. Each of these frequency bands has its own detection range for lightning. Nokia's idea is to use software to predict things such as the distance to a storm, its intensity, and its speed towards your location.
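Just to make the multi-band idea concrete, here is a purely illustrative sketch (not Nokia's algorithm): if a stroke shows up as noise in some receiver bands but not in others, the bands' detection ranges bracket the distance to the storm. Apart from the roughly twenty-mile AM-band figure mentioned above, the ranges are made-up placeholders.

# Illustrative only: bracket the distance to a storm from which bands heard the stroke.
band_range_miles = {
    "AM broadcast": 20,    # from the text above
    "FM broadcast": 10,    # placeholder value
    "Bluetooth": 2,        # placeholder value
    "Wi-Fi": 5,            # placeholder value
}

def bracket_distance(bands_with_noise):
    """Return (lower, upper) bounds in miles on the distance to the storm."""
    heard = [band_range_miles[b] for b in bands_with_noise]
    silent = [r for b, r in band_range_miles.items() if b not in bands_with_noise]
    upper = min(heard) if heard else None                    # within the shortest range that heard it
    lower = max((r for r in silent if upper is None or r < upper), default=0)
    return lower, upper

print(bracket_distance(["AM broadcast", "FM broadcast"]))    # (5, 10): beyond Wi-Fi range, within FM range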

References:
1. Weather Fun Facts
2. NOAA Tech Memo NWS SR-193, Section 3, "Variations by State in Reported Frequencies. a. Deaths and Injuries Combined (Casualties)".
3. Jantunen, et al., "Detection of lightning," US Patent Application 20070085525 (Application Date October 14, 2005, Publication Date April 19, 2007).
4. Cellphones could warn of imminent lightning strike (New Scientist Online, 20 May 2007).
5. Lightning primer -- (NASA/GHCC) basic information about lightning.
6. Where Lightning Strikes (NASA)

June 01, 2007

Carolus Linnaeus

Most of us have heard of Galileo, but what about Marcello Malpighi? Readers of this blog are likely specialists in the physical sciences. Although they know most of the founders of their own field, few have heard about Malpighi, since his research areas were physiology and anatomy. May 23 marked the 300th anniversary of the birth of Carolus Linnaeus (1707 - 1778), who is perhaps unknown to most physical scientists, but is considered to be the father of modern taxonomy.

Linnaeus was named Carl at birth, but the custom of scholars in those days was to publish in Latin and assume a latinized name, so he became Carolus. Linnaeus had an early interest in botany, so he was enrolled at the nearby Lund University, his father's alma mater. Lund's botanical facilities, however, were meager, so Linnaeus left within a year for Uppsala University. At Uppsala he was noticed by Olof Celsius (an uncle of Anders Celsius, of temperature-scale fame), who gave him a research stipend to defray his meals and lodging. His first research interest was the stamens and pistils of flowers, and he wrote a treatise on the sexes of plants in 1729. He led a botanical expedition to Lappland, a region of northernmost Sweden, in 1732, and published a book, Flora Lapponica, about his discoveries in 1737.

We physical scientists have an easy time with our research, since we deal with equations and numbers. Everything is very nicely organized and manipulated. During Linnaeus's time, botany was essentially just "stamp collecting," as Ernest Rutherford would later quip, but these botanical collections had no organization whatever. Linnaeus sought to organize the tree of life in a classification system, which he introduced in his 1735 publication Systema Naturae. This classification system was further developed in his Species Plantarum, published in 1753. The Species Plantarum is an important work, since it contained descriptions of all plants known at that time!

The Linnaean system of taxonomy is hierarchical. There are three "Kingdoms" at the top (Animal, Mineral, and Vegetable) which are subdivided first into Classes, then Orders, Genera, Species and Varieties. As taxonomy advanced, Linnaeus' original system was modified considerably to be more descriptive. Linnaeus was able only to identify gross physical differences, whereas advanced technologies such as DNA analysis allow greater precision. As an example, a modern horse is identified as

• Kingdom: Animalia
• Phylum: Chordata
• Class: Mammalia
• Order: Perissodactyla
• Family: Equidae
• Genus: Equus
• Species: E. caballus

Linnaeus placed humans with the primates, recognizing his fellow men as Homo sapiens, but specifying another species of human called Homo troglodytes ("cave-dwelling man"). This was before the discovery of the Neanderthal man (1829). The classification of man among the primates displeased the Lutheran Archbishop of Uppsala, who accused Linnaeus of "impiety." Linnaeus seemed to enjoy his role in this controversy, and he would say, "Deus creavit, Linnaeus disposuit" ("God created, Linnaeus organized"). Physical scientists owe Linnaeus a debt of gratitude for at least one thing - he modified Celsius' temperature scale, in which the melting point of ice was 100 and the boiling point of water was zero, to the modern form.

References:
1. Sweden celebrates 300th birthday of Linnaeus (New Scientist Online, 23 May 2007).
2. Carolus Linnaeus (Wikipedia).
3. "Linnaeus," Encyclopedia Britannica (1911), vol. 16, p. 733.