Captain Jack and the Large Hardon Collider

Doctor Who’s maverick sidekick Captain Jack, played by big show-tunes fan John Barrowman, took time out from his busy schedule to indulge a passion for mini big bangs with a visit to CERN, the world’s largest
particle physics laboratory and home of the large hardon collider (LHC). I suspect that Barrowman has not read my earlier post on the LHC and misread the title, but you never know.

Anyway, Barrowman took Manchester-CERN high-energy physicist Brian Cox along for the ride, and yes, there really are just far too many lewd puns to be made in the context of Cox, hadrons, and Barrowman to be worth the effort. Actually, Cox was a ray of Sunshine: he was one of the scientific advisers on the recent sci-fi flick of that name. Barrowman, apparently, is genuinely interested in exploring the boundaries between science fact and science fiction. His response when confronted with the notion that a speeding proton in the particle accelerator experiences every second of our time as a seven-thousandth of a second was illuminating, to say the least: “Holey Moley”, he exclaimed. But at least he went and donned the hard hat in the name of science.
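That seven-thousandth-of-a-second figure is just the proton’s relativistic Lorentz factor, and it is easy to sanity-check from the beam energy. A back-of-envelope sketch of my own (not something from the CERN visit; the 7 TeV figure is the LHC’s design beam energy):

```python
# Lorentz factor of an LHC proton: gamma = E / (m * c^2).
# On-board time runs slower than lab time by this same factor.
PROTON_REST_ENERGY_GEV = 0.938272   # proton rest-mass energy in GeV
beam_energy_gev = 7000.0            # LHC design energy: 7 TeV per proton

gamma = beam_energy_gev / PROTON_REST_ENERGY_GEV
onboard_seconds_per_lab_second = 1.0 / gamma  # roughly 1/7000 of a second
```

So one of our seconds really does correspond to roughly a seven-thousandth of a second for the proton, just as Barrowman was told.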

Check out the video. Dig the groovy tune. And if you’re into that kind of thing you get to see Barrowman’s teeth, which are a miracle of modern science in themselves.

Funding the All-electric Aircraft

Today, Philippe Masson of the FAMU-FSU College of Engineering and Center for Advanced Power Systems and colleagues at NASA and Georgia Tech publish details of an entirely new class of aircraft engine that, if it takes off, could lead to an all-electric aircraft that would cut airport pollution and reduce aircraft vapor trails to a distant memory. You can read my write-up about the work on the AlphaGalileo site here.

Unfortunately, while the science is sound, no one is yet beating a path to the inventors’ door, despite NASA backing. I asked Masson why he thought this was the case and his answer provides some cutting insights into the nature of the transport industry and the manufacturers that currently underpin it.

First off, he pointed out that, “Conventional jet engines (turbofans) are very reliable and can still be improved: people are still working on NOx and noise reduction (including as part of our NASA sponsored project). Therefore, there is a lot of inertia, and imposing a new and totally different technology would be very difficult.” The major advantage of electrical power is environmental: the performance of an all-electric aircraft would be unchanged, and arguably improved once increased controllability and decreased maintenance requirements are taken into account.

Masson’s electric jet is based on using zero-resistance superconducting materials as the magnetic components of the turbo-driving motor, but he points out that these, and the cryogenic support systems needed to make them work, are still very expensive, which makes funding difficult to find. It is possible that mass production would reduce costs to an economically viable level, but that is probably not going to happen any time soon.

“The motor designs we proposed can exhibit impressive power densities that would unfortunately almost only benefit airborne applications, there are no other applications with critical constraints in terms of weight and volume,” he told me, “As for the car industry in which combustion engine manufacturers are putting a lot of pressure to prevent new clean technologies to take off, jet engine manufacturers would not be happy to see electrical propulsion systems becoming a new standard.”

“An all-electric aircraft prototype is feasible,” he adds, “but imposing this technology as a replacement to gas turbines would still require a lot of research and development to meet flight requirements in terms of reliability.” However, Masson asserts that the appearance of increasingly electrical airliners from both Airbus and Boeing could hint at a future of all-electric aircraft. “I am convinced that one day in a not so far future we will see small electrically powered aircraft,” he says. He concedes that, “It will be years, probably tens of years, before we can see a truly all-electrical aircraft as all the components require extensive testing and a very high reliability before being implemented in airplanes.”

Masson and his colleagues have approached several companies and aircraft manufacturers and have not yet been successful in getting funding to build a prototype of their superconducting propulsion motor for which patents are pending. “We are still hopeful and will keep looking for funding,” he says.

Measuring Up Size Comparisons

After such a long, serious, and scientific post on genetics and disease yesterday, I thought it was time to post a slightly less serious, shorter, but hopefully useful item on length. As a science journalist, I often need to explain the scale of nanometres, picometres, and very, very rarely yoctometres (okay, never) to a non-scientific audience in my writing. Similarly, visitors to this site often ask questions relating to size and the relative scale of something like a femtometre. For more on the definitions of the prefixes you can check out this earlier article on yotta to yocto. Meanwhile, here’s a digest of some of the more common size comparisons:

One metre (1 m): that’s about the length of our dog, the height of a two-year-old toddler, or roughly the length of a six-foot adult male’s arm, give or take a couple of inches.

One millimetre (1 mm): a sheet of fairly stiff, but plain, cardboard is about 1 mm thick. A pinhead is approximately 1.7 mm across.

One micrometre (1 µm) is the size at which things start to get a bit tricky. Because of the “micro”, these things are by definition microscopic: a bacterium is about 1 micrometre across, while a red blood cell is roughly seven micrometres and a grain of pollen tens of micrometres. A human hair is about 100 micrometres thick, for comparison.

One nanometre (1 nm). Now comes the really interesting bit: a nanometre is a billionth of a metre. Viruses are on this scale, as too is the breadth of a strand of DNA. Cell membranes, too, are just a few nanometres thick. However, when researchers talk about nanotechnology, the scale of the objects they are discussing can stretch from this very large molecule size all the way up to several hundred nanometres, which strictly speaking is probably better thought of as a few tenths of a micrometre instead.

The Sense of Scale site has some additional comparisons, although they seriously let themselves down by talking of “flourine”, as opposed to fluorine, atomic nuclei. Nevertheless, they do offer some interesting size comparisons, such as 260 nm being the length of the smallest transistor in a Pentium 3 chip. A Pentium 3 chip, you say? Well, presumably the site was produced when those chips were cutting edge and long before 65 nm and 45 nm processes in microelectronics had become reality. A grain of salt, meanwhile, is about 100 micrometres across, which, given that it is a near-perfect cube, means its volume is about a million cubic micrometres.
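The grain-of-salt arithmetic is worth spelling out. A trivial sketch, using the “micro” prefix discussed in the yotta-to-yocto article:

```python
MICRO = 1e-6  # the SI prefix "micro": one millionth

side_um = 100                # a salt grain is roughly a 100-micrometre cube
volume_um3 = side_um ** 3    # 100 x 100 x 100 = 1,000,000 cubic micrometres

side_m = side_um * MICRO     # the same edge length in metres: 1e-4 m, i.e. 0.1 mm
```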

All of this relates, of course, to the orders of magnitude primer on Sciencebase some time ago, which is visualised very well in the FSU’s Powers of Ten movie. The interactive clip stretches from 100 attometres (10^-16 metres) to 10 million light years (about 10^23 metres, a tenth of a yottametre).

Of course, I’ve only touched on length in this post. Sometimes I need size comparisons for mass, time, density, in fact most physical properties. If you have any good indicators, leave a comment to tell us about them.

Matrix recharged

One of the big problems facing society in its search for sustainable alternative energy sources is not how to harness wind, solar, or wave power, but how to store the electricity these sources produce at times of low demand. Capacitors could be the answer. These devices can be charged up very quickly, store electrical energy for long periods, and then be discharged rapidly for a range of applications. Such capacitors are on the horizon, but their small-scale cousins are developing even more rapidly for portable applications. Find out more in the May issue of Intute Spotlight.
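To put capacitor storage in perspective, the energy held by a charged capacitor follows the simple formula E = CV²/2. A quick sketch of my own (the 3000 F and 2.7 V figures are typical supercapacitor cell ratings, an illustration rather than anything from the Spotlight piece):

```python
def capacitor_energy_joules(capacitance_farads: float, voltage_volts: float) -> float:
    """Energy stored in a charged capacitor: E = C * V^2 / 2."""
    return 0.5 * capacitance_farads * voltage_volts ** 2

# A large supercapacitor cell: 3000 F charged to 2.7 V.
energy_j = capacitor_energy_joules(3000.0, 2.7)   # about 10.9 kJ
energy_wh = energy_j / 3600.0                     # about 3 watt-hours
```

A few watt-hours per cell explains why capacitors shine at fast charge-discharge cycling rather than raw capacity, and why whole banks of them would be needed for grid-scale smoothing.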

Also under the Spotlight this month, Norwegian scientists have drawn up a league table of alternative fuels for cars based on what they call a “well-to-wheel” analysis. Their approach takes into account the energy costs of manufacturing, total energy use, and overall pollution, including greenhouse gas emissions. Unsurprisingly, petrol and diesel vehicles foot the table, closely followed by hybrid vehicles. In contrast, the greenest way to power a vehicle turns out to be a fuel cell powered by hydrogen made from natural gas, methane.

Finally, in the 1950s, the atomic clock was the pinnacle of split-second time-keeping. Today, physicists use its successors, based on energy transitions in rubidium atoms, which give them 100 times more accuracy. These clocks currently operate at their theoretical limit but are nevertheless accurate to one second every 50 million years. Quantum noise, the random fluctuations of atoms and ions, and the grim truth of Heisenberg’s Uncertainty Principle would probably mean no improvements for the next 50 million years.
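That one-second-in-50-million-years figure is easier to appreciate as a fractional accuracy. A quick back-of-envelope sketch:

```python
SECONDS_PER_YEAR = 365.25 * 24 * 3600        # about 3.16e7 seconds

drift_seconds = 1.0                          # the clock drifts one second...
interval_seconds = 50e6 * SECONDS_PER_YEAR   # ...over 50 million years

fractional_accuracy = drift_seconds / interval_seconds  # about 6e-16
```

In other words, these clocks are stable to a few parts in 10^16, which is exactly the regime where quantum noise starts to bite.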

Listening out for one-over-f noise

The universe is a noisy place – from traffic growling along roads to the random fluctuations in DNA sequences, and from the distribution of stars in galaxies to the hissy fit that is electronic noise. One thing all these forms of noise have in common is that they are related by the phrase “one-over-f”, the reciprocal of frequency.

A new understanding of “1/f” has emerged from a collaboration between scientists in Norway, Russia, and the USA. Their work could lead to more sensitive sensors and detectors based on semiconductor electronics.

“Finding the common origin of one-over-f noise in its many forms is one of the grand challenges of materials physics,” says materials scientist Valerii Vinokur of Argonne National Laboratory, Illinois. He and his colleagues have developed a new theory of 1/f noise that establishes its origin and lower limit in semiconductor electronics, which could help developers optimize detectors for commercial applications.

Noise is nothing more than fluctuations in time, deviations from the average. In microelectronics, noise is generated by random fluctuations of electrons. Vinokur and his colleagues report in the May 11 issue of the journal Physical Review Letters how 1/f noise in doped semiconductors, the platform for all modern electronics, originates in the random distribution of impurities and the mutual interaction of the many electrons surrounding them.

These two ingredients – randomness and interaction – lead to electrons being trapped in a Coulomb glass state in which electrons hop randomly from point to point.

“Our results,” Vinokur explains, “establish that one-over-f noise is a generic property of Coulomb glasses and, moreover, of a wide class of random interacting systems and phenomena ranging from mechanical properties of real materials and electric properties of electronic devices to fluctuations in the traffic of computer networks and the Internet.”
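For readers who want to generate 1/f noise rather than just read about it, a simple way to fake it is the Voss-McCartney trick: sum several white-noise sources, each refreshed half as often as the last. This is a generic signal-processing sketch of mine, not the Coulomb-glass model in the paper:

```python
import random

def pink_noise(n_samples: int, n_sources: int = 8, seed: int = 42) -> list[float]:
    """Approximate 1/f ("pink") noise with the Voss-McCartney algorithm:
    source k is refreshed every 2**k samples, so the slowly varying sources
    supply the low-frequency power that gives the spectrum its 1/f slope."""
    rng = random.Random(seed)
    sources = [rng.uniform(-1.0, 1.0) for _ in range(n_sources)]
    samples = []
    for i in range(1, n_samples + 1):
        for k in range(n_sources):
            if i % (2 ** k) == 0:       # refresh source k at its own rate
                sources[k] = rng.uniform(-1.0, 1.0)
        samples.append(sum(sources) / n_sources)
    return samples
```

Averaging the sources keeps the output bounded, and the octave-spaced refresh rates are what bend the flat white-noise spectrum towards 1/f.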

Cervical fluids and boron nitride

Two more reports of general interest from my SpectroscopyNOW column. The first is on a new informatics approach to understanding cervical vaginal fluids and the second is on a new study of boron nitride, the technological wonder material of the future.
Screening for premature problems
The application of multiple protein identification algorithms to an analysis of cervical vaginal fluid (CVF) can provide a detailed map of biological markers to help researchers understand the course of human pregnancy and the problems that can arise. Preliminary tests suggest it could be used to determine the likelihood of a premature birth.

Inelastic boron nitride
The results of inelastic X-ray scattering and other techniques have been combined with ab initio calculations to characterise and explain the behaviour of the superficially simple binary material boron nitride. Insights from the research could lead to new ways to exploit the electronic and mechanical properties of hexagonal boron nitride.

Open Access Abbreviated, Combined

Just when you thought that the publishers had run out of combinations of shortened discipline names – PhysChemOrgPhys, ChemCommPhysChem, CommPhysOrgGeoAstroChem (you know who you are!) – BioMedCentral(!) is launching yet another: PhysMath Central. PMC, an open access publishing platform, goes live today and is officially accepting papers for publication in its first journals, PMC Physics A, B, and C.

My former colleague on ChemWeb(!), Chris Leonard, who is now heading up PMC, told me why this endeavour is so important to the scientific community and to publishing in general. “Global access to peer-reviewed research is as essential in the physical sciences as it is in the life sciences,” he says, “The same benefits apply, namely: increased readership, increased citations, decreased access barriers and the retention of copyright by the author.” Leonard is on record as saying that his move from the world of traditional publishing to the OA end of the spectrum represented an epiphany. “I started off at ChemWeb.com and subsequently moved to Amsterdam to work for Elsevier,” he explains, “I have now seen the light and am very happy to be developing physics and mathematics journals for the Open Access publisher BioMed Central.”

BMC explains the rationale behind the launch as being aimed at meeting the increasing demand for open access journals from major research institutes (such as CERN, the European Organization for Nuclear Research) and other funding organizations and government bodies. PhysMath Central could make research in physics, mathematics and computer science more widely available and increase access to this research for all institutes and individuals, without the burden of subscription charges. “The demand for open access is growing constantly as all scientists from all disciplines become aware of the benefits of open access publishing,” adds Leonard. Success will hinge, as with any new journal launch, on whether or not putative authors feel that submitting to the new journal will pay off in terms of readership and impact factor.

If the existence of yet more journals in the literature is not enough, PMC is also launching a blog; be sure to add it to your blogroll to keep up with developments and impact factor evolution. Oh, and one more thing, for their British authors: they deliberately missed the “s” off “maths”.

Large Hadron Collider at CERN

UPDATE: OCT 15, 2008 – Investigations at CERN following a large helium leak into sector 3-4 of the Large Hadron Collider (LHC) tunnel have confirmed that the cause of the incident was a faulty electrical connection between two of the accelerator’s magnets. This resulted in mechanical damage and the release of helium from the magnet cold mass into the tunnel.

Proper safety procedures were in force, the safety systems performed as expected, and no one was put at risk. Sufficient spare components are in hand to ensure that the LHC is able to restart in 2009, and measures to prevent a similar incident in the future are being put in place.

‘This incident was unforeseen,’ said CERN Director General Robert Aymar, ‘but I am now confident that we can make the necessary repairs, ensure that a similar incident can not happen in the future and move forward to achieving our research objectives.’

UPDATE: SEPT 10, 2008 – The first particle beam has been sent around the 27 km tunnel of the LHC. This is the equivalent of a computer POST (power-on self-test); they are yet to collide any hadrons at near light speed (that will be the BOOT proper). Sciencebase has now published its Large Hadron Collider LHC-FAQ and will keep you up to date with the latest from the LHC via the site’s RSS newsfeed; subscribe for free now to stay informed, or alternatively you can get updates by email. For concerns about black holes and revelations at the Large Hadron Collider, you may wish to read an extended guest post on the subject.

Physics followers among our readers will no doubt have seen the ubiquitous LHC “typo” a million times, so we’ve been very careful to avoid it in this item (email me if you cannot work out what it is). But one thing that is unavoidable is the huge number of news reports claiming there was some kind of mathematical error that led to the recent little big bang at the CERN site.

Jonathan Leake at The Times, for instance, in an article headlined: “Big Bang at the atomic lab after scientists get their maths wrong” says, “A £2 billion project to answer some of the biggest mysteries of the universe has been delayed by months after scientists building it made basic errors in their mathematical calculations.”

Tests were started on the enormous magnets that will pull particles around the accelerator to great speeds in a giant experiment to mimic conditions at the beginning of time. But, I just heard from Fermilab visiting scientist Peter Limon, who is helping to commission the LHC, and he tells an entirely different story. What exactly was the cause of the accident deep underground at the CERN particle accelerator complex near Geneva in Switzerland?

“The problem with the inner triplet magnets in the LHC is as follows,” Limon told me, “The superconducting magnets themselves are in a pressure vessel (called the cold mass) that will eventually be cooled to 1.9 K for operation. These cold masses are suspended inside a cryostat (a vacuum vessel) so that they can be isolated from the heat that would otherwise make it impossible to cool the magnet. The suspension is made of a composite glass/epoxy material to minimize the heat flow from the outside of the vessel into the magnet.”

Some reports have claimed that the magnet was lifted out of its mountings. “The magnet did not lift itself off its mountings,” Limon said emphatically, “The break was in suspension pieces inside the cryostat. There was no motion of the magnet on its mountings, as far as we can tell.”

Limon then explained that, “Because of the geometry and the connections between magnets in the inner triplet, there is an unbalanced longitudinal force on the cold mass when the cold mass is pressurized.” This force, he adds, is transmitted from the cold mass to the cryostat through the composite suspensions. “The design of the suspensions was inadequate to withstand those forces, and at 20 atmospheres they broke,” he says, “The pressure test would have been successful if the pressure had gotten to 25 atm.”

Engineering calculations completed independently by Fermilab and CERN on March 28, the day after the accident, showed that the G-11 support structure in the magnets was inadequate to withstand the associated longitudinal forces. The word “inadequate” is rather euphemistic in this context; in reality, the equipment simply was not up to the job in hand and it broke under the strain.

“In short,” Limon told me, “this was not a mathematical error, but an engineering omission. The full extent of the unbalanced longitudinal force (as much as 15 tons!) was not taken into account when the suspension was designed.”
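The scale of that force is easy to check with schoolbook physics, F = pA: at 20 atmospheres, even a modest cross-section carries tons. The effective area below is my own assumption for illustration, not a figure from Limon:

```python
ATMOSPHERE_PA = 101_325.0            # one standard atmosphere in pascals
G = 9.81                             # standard gravity, m/s^2

pressure_pa = 20 * ATMOSPHERE_PA     # the pressure at which the suspensions failed
area_m2 = 0.07                       # assumed effective cross-section (~30 cm across)

force_newtons = pressure_pa * area_m2
force_tonnes = force_newtons / (G * 1000.0)   # roughly 14-15 tonnes
```

With those assumptions the unbalanced load comes out in the same ballpark as the 15 tons Limon quotes, which makes the scale of the omission easy to appreciate.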

I don’t believe design sabotage has been ruled out, but it is rather unlikely, although the cynical among us will note that Fermilab, which designed the magnet, is also a scientific rival of CERN itself. Fermilab operates a particle accelerator, the Tevatron, that is less powerful than the LHC but which Fermilab scientists are continually pushing to its limits. Moreover, they hope to beat CERN in the race to find the key particle that could unlock the secrets of the universe – the Higgs boson – before the LHC is even fired up.

The repair work by Fermilab and CERN staff is, according to the CERN website, “being closely coordinated”. Fermilab personnel are on site at CERN, no doubt working under a cloud. That phrase “closely coordinated” would suggest some serious monitoring of activities. Too right.

Additionally, Fermilab is currently examining all aspects of the US-supplied components for the LHC just in case there are any other “potential vulnerabilities.” Whether CERN’s problems were mathematical or engineering in origin, CERN’s plans have been seriously delayed, which could give the Tevatron, with its dearth of “potential vulnerabilities” a particular advantage in the quest for the secrets of the universe.

NEWS FLASH

On Thursday 26 April, the last superconducting magnet of the Large Hadron Collider (LHC), a 15 metre long dipole weighing 34 tonnes, will be lowered into the 27 km tunnel of the accelerator. With this magnet, the world’s largest superconducting installation receives its final component. The LHC is made up of some 1700 superconducting magnet assemblies, which will guide and focus the LHC’s particle beams. Teams are at work in the tunnel to conclude the complex task of magnet interconnection and the sequence of procedures necessary before the machine’s scheduled start-up at the end of the year.

Coming soon: The Large Hadron Collider FAQ (the LHCFAQ) from Sciencebase.

Save a balloon with water

What connects cooling computer chips, melting car engines, and a balloon that will not pop? This week’s science video sees Robert Krampf explaining the principles behind heat sinks, car radiators, water cooling, and how to hold a balloon above a burning candle without it ever popping.

Krampf points out that, “Because we’re using fire, always be sure you keep safety in mind, and be sure you’ve got an adult around, so that you’ll have somebody to blame if something goes wrong!”

So, what is it about water that makes it absorb the heat from the candle flame so fast and so protect the rubber of the balloon from melting or burning? Water has one of the highest specific heat capacities of any known chemical compound, second only to ammonia. This is due to the extensive but transient network of hydrogen bonds that form between the oxygen atom at the centre of each water molecule and hydrogen atoms of neighbouring water molecules. This fluxional network of loose bonds allows liquid water to absorb heat rapidly and also allows the heat to be dispersed quickly through the bulk liquid.
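A rough sketch of what that heat capacity buys you, using Q = mcΔT with standard textbook specific heat values (my own illustration, not from Krampf’s video):

```python
# Specific heat capacities in J/(g*K), standard textbook values.
SPECIFIC_HEAT = {"water": 4.18, "ethanol": 2.44}

def heat_absorbed_joules(mass_g: float, substance: str, delta_t_k: float) -> float:
    """Q = m * c * dT: heat needed to raise mass_g of substance by delta_t_k."""
    return mass_g * SPECIFIC_HEAT[substance] * delta_t_k

# Warming 100 g of liquid by 10 degrees C:
q_water = heat_absorbed_joules(100.0, "water", 10.0)     # 4180 J
q_ethanol = heat_absorbed_joules(100.0, "ethanol", 10.0) # 2440 J
```

Gram for gram, the water in the balloon soaks up getting on for twice as much heat as a fluid like ethanol before its temperature, and hence the rubber’s, climbs by the same amount.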

WARNING: Please don’t attempt this experiment with anything but water in the balloon. Water is about the only fluid that is safe to use but more to the point, it won’t work properly with any other fluid.

R&R leads to molecular recovery

Mark Kuzyk is at it again. The physicist continues to explore a range of novel, light-sensitive compounds and has found one that degrades over time… but, if kept in the dark for a short period, spontaneously heals itself. This remarkable property could be exploited in industrial processes such as optical data storage and photolithography, which could reuse the recyclable material instead of having to replace the expensive stuff at every turnover.

Kuzyk and colleagues at Washington State University have found a molecule that loses its ability to fluoresce when bathed in laser light but regains this talent if it gets plenty of rest in the dark. Recovery starts during a half-hour power nap and is complete after a good eight hours of R&R, say the researchers.

“It’s almost as if you have a piece of paper that’s yellowed over time, and you put it in a dark room for a day, and it comes back brand-new,” enthuses Kuzyk. Previously, I discussed Kuzyk’s work on Sciencebase and Intute Spotlight.

Kuzyk and students Ye Zhu and Juefei Zhou discovered the “self-healing” property of the dye AF455, which excels at two-photon absorption, an important property in optical data storage and in producing microelectronics for photolithography. The team will report details in the April 15 issue of the journal Optics Letters.

I received a follow-up email to this from Kuzyk: I’ve reproduced the Mark Kuzyk email here.