Heavy Metal and Alzheimer’s

More popular science news with a spectroscopic bent from the desk of David Bradley, this week: Heavy metal and Alzheimer’s – The protein-like plaques that form in the brains of people with Alzheimer’s disease, and in other tissues in a wide range of other disorders, are well known; what is less well known is that fairly high concentrations of transition metals, including copper, iron, and zinc, are also present. Do these metals have a role to play in plaque formation, or are they a side-effect? New research using X-ray and NMR spectroscopy could shed light on the issue and perhaps one day lead to new approaches to therapy based on controlling these metals.

Forgetful quanta – Researchers have, for the first time, monitored oscillations in a vanadium-based molecular magnet. These so-called Rabi oscillations are characteristic of the disturbances that have so far prevented scientists from developing a viable quantum bit, or qubit, for use in the next generation of quantum computers and encryption devices. According to one independent commentator, the research represents the passing of a milestone on the road to quantum computers. Now that scientists understand the cause of this problem, they might be able to address it by swapping atoms with spin for isotopes with zero spin, and so cut down on the noise.
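For readers who want the textbook picture (standard quantum mechanics, not a detail from the new study): a two-level spin driven by a resonant field cycles coherently between its two states, the probability of finding it in the excited state oscillating as

$$P_e(t) = \sin^2\!\left(\frac{\Omega t}{2}\right)$$

where Ω, the Rabi frequency, is set by the strength of the driving field. Decoherence – the “noise” mentioned above – damps these oscillations away, which is why observing them cleanly in a molecular magnet matters for qubit design.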

More spec news from DB and others on spectroscopynow.com

Naturally Fibrous Mimic

One of the important components of the extracellular matrix is collagen, the major structural protein of higher organisms. It remains a major challenge, however, to emulate the unique structural and biological properties of native collagenous biomaterials in synthetic analogues. Consequently, numerous opportunities exist for synthetic collagens in biomedical applications as extracellular matrix analogues, if appropriate materials can be constructed that retain, and expand upon, the desirable properties of native collagen fibrils.

Applying chemical and molecular genetic techniques to design and synthesize collagen-mimetic polypeptides and fibers competent for self-assembly into structurally defined protein fibrils is an intriguing avenue of research. In this context, Shyam Rele and colleagues have been leading efforts in the de novo design of nanostructured biological materials through the self-assembly of peptides and proteins.

Rele, together with Elliot Chaikof and Vince Conticello in the Laboratory of Bio/Molecular Engineering and Advanced Vascular Technologies at Emory University School of Medicine, has designed and synthesized the first synthetic collagen peptide system: a 36-amino-acid unit that self-assembles into a fibrous structure with a well-defined periodicity reminiscent of the native collagen found in the human body.

Specifically, the synthesized peptide protomer comprises three peptide repeat blocks: a hydrophobic proline-hydroxyproline-glycine core flanked on either side by distinct sets of peptide repeats containing either negatively charged (glutamic acid) or positively charged (arginine) amino acid residues. Appropriately positioned, these charged residues bias the peptide toward triple-helical self-assembly, which undergoes fibrillogenesis at physiological temperatures, producing D-periodic microfibers driven by electrostatic interactions.
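As a crude way to picture that block architecture (an illustrative sketch only – the residue assignments follow the description above, and the exact published sequence should be checked against the JACS paper), the 36 residues break down as three blocks of four triplets:

```python
# Illustrative sketch of the triplet-block architecture described above.
# Residue choices follow the text; the published sequence may differ in detail.
positive_block = ["Pro", "Arg", "Gly"] * 4  # arginine-bearing, positively charged flank
core_block = ["Pro", "Hyp", "Gly"] * 4      # hydrophobic Pro-Hyp-Gly core
negative_block = ["Glu", "Hyp", "Gly"] * 4  # glutamate-bearing, negatively charged flank

protomer = positive_block + core_block + negative_block
assert len(protomer) == 36  # 3 blocks x 4 triplets x 3 residues = 36 amino acids
print("-".join(protomer))
```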

Transmission electron microscopy on annealed samples revealed that fiber growth proceeded over several hours by the initial formation of smooth fibrils hundreds of nanometers in length and tens of nanometers in diameter. These fibrils displayed tapered tips similar to the tactoidal ends of native collagen fibers, from which continued fiber growth is thought to occur. The D-periodicity of the synthetic collagen-mimetic microfibers was approximately 18 nm. Significantly, the collagen mimic shows a high propensity for self-association via a nucleation-growth mechanism, even at low concentrations (<1.0 mg/mL) and neutral pH.

This discovery is pathbreaking in the field of nanotechnology and bio-inspired biomaterials. For the past three decades, scientists have tried to synthesize and emulate collagen's remarkable properties, but have failed to mimic the long, fibrous molecules found in nature. The ability of Rele, Chaikof, and Conticello to generate a synthetic collagen in the laboratory (in vitro) at the nanomolecular level for the first time therefore represents an important milestone in nanotechnology and biomaterial development.

Such self-assembling peptides may have broad applications in medicine, the study of neurodegenerative diseases, protein-folding catalyst design, bio-nanotechnology, tissue engineering, and origins-of-life research. Furthermore, the generation of nanostructured molecules that mimic native structural proteins will lay the groundwork for unraveling complex phenomena, including collagen fiber formation in protein conformational diseases, and for the design of new materials with biological, chemical, and mechanical properties that exceed those of currently available synthetic polymers.

The capacity of such self-assembling, biologically compatible peptide scaffolds to arrange themselves into fibers, tubules, and a variety of geometric layers provides important substrates for cell growth, differentiation, and biological function, and could have an important impact on the treatment of cardiovascular, orthopedic, and neurological disease.

Adapted from a write-up supplied by Rele. Further details can be found in JACS, 2007, vol 129, 14780-14787.

Recycling Science

My Alchemist column on ChemWeb is live once again: This week’s award is for science that sheds light on a range of physical phenomena, including liquid-metal surfaces and condensed matter. The recipient of the award, Oleg Shpyrko of the University of California, San Diego, will receive the 2008 Rosalind Franklin Young Investigator Award from Argonne National Laboratory. I asked him what the award means to him:

“It is a great honor to have my research recognized in this way but credit should be shared between all of my collaborators, especially the Advanced Photon Source beamline scientists without whom the research simply would not be possible. What makes the APS a truly world-class facility is not just its unique X-ray beam characteristics, but also the outstanding group of scientists working here. The synergy between the users and APS scientists is an absolutely crucial component for the cutting-edge research performed there.”

Meanwhile, in straight chemistry news, nanotubes are feeling the heat of chilies, while analysts are musing on the lack of psychedelics in the artists’ tipple, absinthe. Also this week, X-ray studies are helping in the redesign of novel anticancer compounds, and a connection between the great British seaside holiday, kelp, and iodine as an oxidant is revealed. Finally, plastic lasers could open the door to a new range of spectroscopic and medical diagnostics instrumentation. Get the full alchemical news here.

You may also be interested in science news with a spectroscopic bent, where I report on how recycling old computers and electronics can yield a new type of feedstock oil for the petrochemical industry.

Recycling of a different kind, in which parts from a CD-ROM drive are scavenged for another purpose, could help bring quick and inexpensive DNA diagnostics to poorer parts of the world. More on that here.

Data Loss Disaster Recovery

Until recently, most IT professionals viewed disaster recovery as straightforward file recovery, and a nightly backup was the prescription. Things have changed. Businesses have to maintain continuous access to applications and data for employees, partners, and customers, and website and systems downtime is no longer acceptable. Disaster recovery now means maintaining availability without downtime.

Here are a few free magazines and white papers on the subject.

  • The Data Disappearing Act: Mitigating Virtual Data Loss (slideshare.net)
  • Hacker Destroys Avsim.com Along With Its Backups (it.slashdot.org)

Stem Cell Research

Lots of visitors are hitting the Sciencebase site looking for information on stem cell research. It is a subject I’ve written about before, both on this site and elsewhere, but given that my alma mater is at the forefront of stem cell research in the UK, I thought it might be useful to provide a FAQ on the subject. Just to be clear: when the media says stem cells, it usually means human embryonic stem cells, but the full phrase occasionally takes print journalists over their wordcount, so it is commonly abbreviated to stem cells. For the sake of brevity, I’ll do the same here.

  • What are stem cells?

    Stem cells are primordial cells that can divide without limit and differentiate into the various cell types that build our livers, hearts, bones, brains, skin, and other organs, as well as blood cells and nerves. More details.

  • Where are stem cells found?

    Pluripotent stem cells, which can form any cell type, can be harvested from human embryos that are just a few days old.

  • What do researchers do with harvested stem cells?

    Harvested pluripotent stem cells can be cultured in the laboratory to create “stem cell lines” for research and development.

  • What can be done with cultured stem cells?

    A cultured stem cell line can multiply indefinitely in the lab, so once produced researchers can use the same line without having to harvest new stem cells.

  • What might stem cells be used for?

    Cultured stem cell lines can be “engineered” to differentiate into specific cell types, which researchers are hoping can be transplanted into a patient to treat a wide range of problems, including cancer, spinal cord injury, stroke, burns, heart disease, diabetes, birth defects and neurodegenerative disorders, such as Parkinson’s and Alzheimer’s disease.

  • Have researchers cured diseases with stem cells?

    Not yet; stem cell research is little more than a decade old and is very much in the experimental stages. Legal, funding, and ethical issues in the US, UK, and elsewhere have slowed stem cell advances to some degree during this time.

  • Aren’t bone marrow transplants using stem cells?

    The well-known bone marrow transplant uses the blood stem cells found in bone marrow and has been used to treat a range of diseases, such as leukaemia, for four decades.

  • Do embryos have to be used to harvest stem cells?

    Not necessarily; the umbilical cord is being researched as an alternative source of stem cells that would sidestep some of the ethical issues associated with embryonic stem cells. There is also research into using tissue-specific stem cells from adult donors.

    A much more detailed FAQ on stem cell research can be found on the ISSCR site, while the US National Academy of Sciences has lots of info too. Additional resources may be found on the Applied Biosystems site and at the Stem Cell Companies site.

Genetic Manipulation

European corn borer

Are you happy to eat genetically modified foods? What about your friends and colleagues? Do the GM pros outweigh the cons?

I asked a few contacts for some answers by way of building up to a more formal response to those kinds of questions that will be published soon in the International Journal of Biotechnology (IJBT, 2008, 10, 240-259).

Plant geneticist Dennis Lee, Director of Research at mAbGen in Houston, Texas, suggests that GM crops have several significant advantages. “Total cost per acre can actually be significantly less for GM crops,” he says. This is particularly true for crop species, such as maize, that have been modified to produce natural toxins that fend off insect pests, or to tolerate the herbicides needed to keep weed growth to a minimum. However, he points out that, “In practice, this is often not the case – farmers tend to err on the side of caution and continue to use significant amounts of pesticides and herbicides.”

That said, crops can also be modified to grow in substandard conditions, such as strains of tubers grown in Kenya that are capable of surviving both drought conditions and high-salt soils. “Obviously, this is beneficial to yield – you can actually get some food out of places where you previously could not,” adds Lee. In addition, it could be possible to modify some crops to have greater nutritional content, as in the so-called “golden rice” project of Ingo Potrykus, then at the Institute of Plant Sciences of ETH Zurich.

One of the biggest perceived problems with GM crops is the possible contamination of other species. What if herbicide-resistance genes could jump into weed species, for instance? Lee points out that this putative problem can be overcome by using so-called terminator technology to prevent genes from jumping. “However, in doing so, it creates a different problem,” Lee adds, namely that farmers must buy seed from the agbiotech company each year rather than save seed for planting. One might say that this is an exploitative industry focused purely on maximizing profits, but at the same time it solves a serious technical problem that has been seen as one of the biggest stumbling blocks to the acceptance of GM crops.

Jeff Chatterton, a risk and crisis communications consultant at Checkmate Public Affairs in Ottawa, points out that the pros are well documented: increased yield per acre, ease of use and perhaps, some day, increased ‘consumer level’ benefits such as higher nutritional values. But he echoes others’ comments on the hidden con of farmers the world over potentially being locked into the agbiotech company’s seed, with no recourse to produce their own from one year to the next.

“As traditional family farms are increasingly moving towards ‘Roundup Ready’ corn or soybeans, you’re increasingly seeing a change in the business model of farming,” he says. “Rather than ‘family farms’ using traditional farming practices, agricultural operations are increasingly becoming factory farms.” It might be said that the emergence of factory farms is occurring outside the realm of GM crops, but with pressure being applied to produce more and more crops for non-food purposes, including biofuels, unique polymers, and other products, the notion of a factory farm that doesn’t even feed us could become an increasing reality.

Lee also mentions an intriguing irony in the public perception of the risks and benefits of GM crops: the toxin produced by modified Bt maize is exactly the same toxin produced by the natural soil microbe Bacillus thuringiensis (Bt) itself, and it is this same Bt toxin that so-called “organic” farmers are usually allowed to use instead of “synthetic” pesticides.

Information technology and services professional Bill Nigh of Bluenog, based in New York, provides a layperson’s perspective. “We’ve been engaged in genetic manipulation for a long time now,” he says, “but it was limited by the technology at hand. With recombinant DNA it’s a remarkably more vast field of play and a whole new ball game.” He stresses that his main concern regarding GM crops is that, “We seem to be just smart enough to make drastic breakthroughs and inventions, and are driven by the dynamics of the marketplace and ego to produce a lot of new things quickly. However our systems of governance, oversight and coordination are not mature enough to work through the implications of those new things in a timely fashion, especially the unforeseen synergies the breakthroughs can unleash.”

All that said, an international team has now investigated the various issues and assessed the public’s willingness to accept (WTA) GM foods on the basis of experimental auctions carried out in France, the UK, and the USA. Lead author of the IJBT paper, Wallace Yee, now at the University of Liverpool, worked while at Reading University with colleagues in disciplines ranging from agriculture and food to business and economics, in Italy, New Zealand, the UK, and the US, to explore perceptions of risk and benefit, moral concerns, and attitudes to the environment and technology.

“Trust in information provided by industry proved to be the most important determinant of risk/benefit perceptions,” the researchers conclude, “willingness to accept followed general attitudes to the environment and technology.” They also found that educational level and age could enhance perceived benefits and lower perceived risks of GM foods. “Our research suggests that trust-building by industry would be the most effective approach to enhancing the acceptance of GM foods,” the team says.

“If the industry could educate people that GM technology does not pose any threat to the environment, but provides benefits to society as a whole and consumers as individuals, the attitudes of the public towards GM in food production would be favourable, and in turn increase their willingness to accept,” they conclude.

Computing professional Paul Boddie of Oslo, Norway, comes at the issue of GM crops from an indirect angle, offering an observation about computer programming originally attributed to Brian Kernighan, which Boddie suggests transfers readily to other disciplines, including genetic engineering: “Everyone knows that debugging is twice as hard as writing a program in the first place. So if you are as clever as you can be when you write it, how will you ever debug it?”

Yee, W.M., Traill, W.B., Lusk, J.L., Jaeger, S.R., House, L.O., Moore, M., Morrow, J., Valli, C. (2008). Determinants of consumers’ willingness to accept GM foods. International Journal of Biotechnology, 10(2/3), 240-259. DOI: 10.1504/IJBT.2008.018356

Interstellar Molecular Thermometer

Astronomers have detected, for the first time, the ultraviolet spectroscopic signature of the carbon monoxide molecule in a galaxy located almost 11 billion light-years away. The molecule had eluded astronomers for a quarter of a century.

The detection now allows them to obtain the most precise measurement of the cosmic temperature just 1.5 billion years after the Big Bang (give or take 25 years). The team used the UVES spectrograph on ESO’s Very Large Telescope (it does what it says on the tin) to record the signal from a well-hidden galaxy whose light has taken four fifths of the age of the Universe to reach Earth. Apparently, the Universe was then at a rather balmy (compared with today’s temperature) 9-and-a-bit kelvin.
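As a quick sanity check (my own back-of-the-envelope arithmetic, not figures from the ESO release): in standard cosmology the background temperature scales with redshift as T(z) = T₀(1 + z), so for a galaxy at roughly z ≈ 2.4 one expects about 9 K:

```python
# Standard CMB temperature-redshift relation, T(z) = T0 * (1 + z).
# The redshift below is my assumption for a galaxy ~11 billion light-years away.
T0 = 2.725  # present-day cosmic background temperature in kelvin
z = 2.4     # approximate redshift of the absorbing galaxy
print(f"T at z = {z}: {T0 * (1 + z):.1f} K")  # ~9.3 K, i.e. 9-and-a-bit kelvin
```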

More details from the European Southern Observatory site.

Balls to the Dinosaurs, Oceanic Oxygen, and a Nano Flush

Dinosaur balls

The May issue of my Spotlight column over on the Intute site is now online, this month featuring:

Flush with nanoparticles – What happens to carbon-based nanoparticles when they enter groundwater? Can municipal water supplies filter them out? And, if they cannot, will they cause health problems? These are crucial questions that need answers now, as nanotechnology grows. A new study by Kurt Pennell, of the Georgia Institute of Technology, and colleagues suggests that subtle differences in the solution properties of the water carrying such particles can determine their ultimate fate.

Ocean oxygen starvation – Oxygen-poor regions of tropical oceans are expanding as the oceans warm, limiting the areas in which predatory fishes and other marine organisms can live or enter in search of food, according to a major ongoing marine exploratory project. The phenomenon could cut overall marine biodiversity.

The Collaborative Research Centre programme – Climate: Biogeochemistry Interactions in the Tropical Ocean – funded by the German Research Foundation, is working in close cooperation with the University of Kiel and researchers at the Scripps Institution of Oceanography at the University of California, San Diego.

Carbon balls to the dinosaurs – Palaeontologists presume that an asteroid impact led to such enormous and widespread environmental upheaval that it wiped out the dinosaurs and thousands of other species when it struck the Earth. Now, researchers from Italy, New Zealand, the UK, and the USA suggest that the impact force was so great that it would have liquefied carbon in the planet’s crust and sprayed tiny airborne carbon beads into the atmosphere in unimaginable quantities.

You can check out the Spotlight archives via the Sciencebase recent scientific discoveries page.

Teatime

I commented on a post on the Bad Language blog, produced by my good friend Matthew Stibbe, earlier this week. He was waxing lyrical about cutting power consumption in his SOHO (small office/home office) and mentioned how he prefers to brew tea with freshly drawn water. I pointed out that while this may have benefits, it would actually increase his kettle limescale problems through the addition of extra calcium and magnesium ions. The effect will be negligible, but if we are adding up every single kilowatt-second, it could make a difference. Of course, brewing tea is not environmentally friendly in the first place and we should all really be drinking trapped dew under a hessian bivouac, or somesuch.

Anyway, Matthew immediately followed up my comment with a defence of using freshly drawn water for making a cuppa. He’s a man after my own heart. I’ve done this once or twice in the past, and it exemplifies precisely how blogs are, if nothing else, a dialogue (please don’t prove me wrong by not commenting on this post…)

I’d better qualify my boiling/reboiling comment on his blog. Chemically speaking, the difference between starting with freshly drawn water each time and reboiling what is already in the kettle is a simple matter of the formation of insoluble calcium and magnesium salts. With freshly drawn water you are adding new metal ions, which will effectively add to your limescale. However, the de-hardening of hard water by heating is not a perfect process, so some ions will be retained in the beverage once you pour it over the tea leaves; the actual balance depends on how soft or hard your water supply is in the first place.
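For the record, the chemistry behind kettle limescale is the textbook decomposition of temporary hardness on heating, with dissolved calcium hydrogencarbonate coming out of solution as chalky scale (the magnesium salt behaves analogously):

$$\mathrm{Ca(HCO_3)_2\,(aq) \rightarrow CaCO_3\,(s) + H_2O\,(l) + CO_2\,(g)}$$

Each boil precipitates some of the dissolved calcium as scale, and each top-up with fresh hard water replenishes the Ca²⁺ and Mg²⁺ ions – which is why starting with freshly drawn water every time means more limescale overall.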

However, now that I’ve had a glass or two of vino (at the time of writing), it has also occurred to me that there are lots of other, organic, components in fresh tapwater, such as humic acids and organochlorine compounds (possibly even fluorine compounds, depending on where you live). These will presumably be degraded and/or boiled off to a degree with the first boil. With a second boiling, it is more likely that you will get rid of all these flavoursome ingredients from the water. So perhaps there is something in the use of fresh water for the best cuppa, but the effect is marginal, given that any flavours in the water will essentially be overwhelmed by the flavour of the tea itself. It’s like worrying about the sounds left out when compressing a music file into mp3 format.

Meanwhile, so legend goes, the origins of tea lie in an attempt at “storing” water in Asia and protecting it from contamination by pathogens (cholera among them, although the agent was unknown at the time). The polyphenolics and other materials that tea infuses into the water are antimicrobial to a degree, but perhaps more importantly the simple act of boiling kills off the microbes quickly and succinctly without any recourse to chemistry.

In the “West”, the equivalent solution to the great clean-water problem was the addition of fermenting fruits and the subsequent production of wine or beer, depending on the region. This is thought to explain why westerners evolved an enzyme system to break down alcohol and its metabolites, whereas some Asians lack it.

Given the choice between a freshly brewed cuppa and a glass of wine, I know which I prefer, especially at this time of the evening… now where’s that corkscrew?

Accounting for Research

Accounting for scientists

How does one measure the worth of the science base? From the scientists’ perspective it is their bread and butter, or low-fat spread and rye biscuit, perhaps, in some cases. From industry’s standpoint, it is occasionally a source of interesting and potentially money-spinning ideas. Sometimes, it sits in its ivory tower and, to the public, it is the root of all those media scare stories. At worst, the science base is perceived as a huge drain on taxpayers’ money, especially when the popular press gets hold of ‘spiders on cannabis’ and the ‘scum on your tea’ as the lead science stories for the year!

For the government though, which more often than not is providing the funds for basic research, the science base is crucial to all kinds of endeavours: wealth creation, the development of fundamental science into practical technology, the demolition of those ivory towers and the mixing of scientists with the great industrial unwashed through collaboration. As such, governments try to ensure that the science they fund is accountable – to government, to its sponsors and to society and the public as a whole.

But I come back to my first question: how does one measure the impact of basic research on society? If one went begging for funding for a new area of chemistry with no practical applications anywhere in sight, funding would likely be meagre. It can be dressed up, of course: natural product chemistry almost always has the potential to yield novel medicinally active compounds, while even the most esoteric supramolecular chemistry could be the nanotechnology breakthrough we have been waiting for. You’ve seen, and maybe even written, such applications yourself. On the other hand, take any piece of genetics with the potential to cure some likely disease and the cash will usually roll in, at least relatively speaking.

So, what does quality mean when applied to scientific research? Was the discovery of the fullerenes quality science? Well, yes it obviously was in that it stirred up the chemistry and other communities and generated mass appeal for a subject that gets rather less of an airing in a positive light than certain other sciences. Fullerenes also provided some of the scientists involved with a Nobel Prize so someone in Sweden must have liked it.

But if we were to apply any kind of standard criterion of usefulness to society, we would be hard pushed to give it the highest score, except as a demonstration that fundamental science can still excite. After all, have you seen any real applications yet? I touched on the potential for medicinal fullerenes early in the fullerenes’ rise to stardom, and it is probably unfair to single them out for accountability, especially as they ultimately inspired carbon nanotubes. You might say that they are simply one of many examples of science as art. They allow us to visualise the world in a new way; they are beautiful – chemically, mathematically, physically.

The pressure is now on scientists to face up to some imposing questions as government-mandated requirements begin to come into effect. [This has become a moot point in the UK since this article was first aired, given funding cuts for big, esoteric science projects]. Efforts to make science accountable come with a massive burden of controversy and are hindered by the almost impossible task of measuring creative activities such as research. Added to this, accountability requires increasing levels of administration especially at times of formal assessment for the scientists themselves.

The careers of most scientists hinge on these assessments, in more ways than one, as the pressure on faculty pushes them in directions they might not naturally go – producing research papers just to satisfy the assessment process, for instance. This, coupled with a general drive to bring science to the public through media initiatives – to demonstrate why science is important and why people’s money should be spent on it – just adds to the pressure.

However, despite the marketing-style talk of stakeholders, and the close industrial analogues, the shareholders, basic scientific research is not about customers and churning out identical components on a production line. There are usually no targets and no truly viable and encompassing methods to assess the quality of any part of the scientific endeavour. Ironically, this means the end-of-year bonus is something on which most scientists miss out, regardless of their successes. Science is the art, technology makes it pay, but some art is fundamental or avant garde and some finds its way on to advertising hoardings. Which do you prefer, fine art or glossy brochure?

By forcing basic science to become accountable in terms of product and efficiency, there is a risk that creativity and autonomy will be stifled. Done right, though, accountability can strengthen the relationship between research and society.

Measuring the socioeconomic benefits of specific scientific investments is tough. Basic research becomes embodied in society’s collective skills, putatively taking us in many more directions than we would otherwise have headed. As such, it can have a future impact on society at entirely unpredictable points in time. Who knows where that pioneering fullerene chemistry will have taken us by the end of this century?

Sir Harry Kroto, co-discoverer of the fullerenes told me in an interview once that, “Scientists are undervalued by a society that does not understand how outstanding someone has to be to become a full-time researcher.” Maybe the measure of science is in its beauty rather than its assessment scores.