Recycling Science

Oleg Shpyrko speckle

My Alchemist column on ChemWeb is live once again: This week's award is for science that sheds light on a range of physical phenomena, including liquid-metal surfaces and condensed matter. The recipient, Oleg Shpyrko of the University of California San Diego, will receive the 2008 Rosalind Franklin Young Investigator Award from Argonne National Laboratory. I asked him what the award means to him:

“It is a great honor to have my research recognized in this way but credit should be shared between all of my collaborators, especially the Advanced Photon Source beamline scientists without whom the research simply would not be possible. What makes the APS a truly world-class facility is not just its unique X-ray beam characteristics, but also the outstanding group of scientists working here. The synergy between the users and APS scientists is an absolutely crucial component for the cutting-edge research performed there.”

Meanwhile, in straight chemistry news, nanotubes are feeling the heat of chilies, while analysts are musing on the lack of psychedelics in artists' tipple, absinthe. Also this week, X-ray studies are helping in the redesign of novel anticancer compounds, while a connection between the great British seaside holiday, kelp, and iodine as an oxidant is revealed. Finally, plastic lasers could open the door to a new range of spectroscopic and medical diagnostics instrumentation. Get the full alchemical news here.

You may also be interested in science news with a spectroscopic bent, where I report on how recycling old computers and electronics can be used to make a new type of feedstock oil for the petrochemical industry.

Recycling of a different kind, in which parts from a CD-ROM drive have been scavenged for another purpose, could help bring quick and inexpensive DNA diagnostics to the poorer parts of the world. More on that here.

Data Loss Disaster Recovery

Until recently, most IT professionals viewed disaster recovery as straightforward file recovery. As such, a nightly backup was the prescription. Things have changed. Businesses have to maintain continuous access to applications and data for employees, partners, and customers, and website and systems downtime is no longer acceptable. Disaster recovery now means maintaining availability without downtime.

Here are a few free magazines and white papers on the subject.

  • The Data Disappearing Act: Mitigating Virtual Data Loss (slideshare.net)
  • Hacker Destroys Avsim.com Along With Its Backups (it.slashdot.org)

Stem Cell Research

Embryonic stem cells

Lots of visitors are hitting the Sciencebase site looking for information on stem cell research. It is a subject I've written about before, both on this site and elsewhere, but I thought it might be useful, given that my alma mater is at the forefront of stem cell research in the UK, to provide a FAQ on the subject of stem cells. Just to be clear, usually when the media use the phrase stem cells, they really mean human embryonic stem cells, but the full phrase occasionally takes print journalists over the wordcount, so it is commonly abbreviated to stem cells; for the sake of brevity, I'll do the same here.

  • What are stem cells?

    Stem cells are primordial cells that can divide without limit and differentiate into the various types of cell used to build our livers, hearts, bones, brains, skin, and other organs, as well as blood cells and nerves. More details.

  • Where are stem cells found?

    Pluripotent stem cells, which can form any cell type, can be harvested from human embryos that are just a few days old.

  • What do researchers do with harvested stem cells?

    Harvested pluripotent stem cells can be cultured in the laboratory to create “stem cell lines” for research and development.

  • What can be done with cultured stem cells?

    A cultured stem cell line can multiply indefinitely in the lab, so once produced researchers can use the same line without having to harvest new stem cells.

  • What might stem cells be used for?

    Cultured stem cell lines can be “engineered” to differentiate into specific cell types, which researchers are hoping can be transplanted into a patient to treat a wide range of problems, including cancer, spinal cord injury, stroke, burns, heart disease, diabetes, birth defects and neurodegenerative disorders, such as Parkinson’s and Alzheimer’s disease.

  • Have researchers cured diseases with stem cells?

    Not yet; stem cell research is little more than a decade old and very much at the experimental stage. Legal, funding, and ethical issues in the US, UK, and elsewhere have slowed stem cell advances during this time to some degree.

  • Aren’t bone marrow transplants using stem cells?

    The well-known bone marrow transplant uses the blood stem cells found in bone marrow and has been used to treat a range of diseases, such as leukaemia, for four decades.

  • Do embryos have to be used to harvest stem cells?

    Not necessarily; the umbilical cord is being researched as an alternative source of stem cells that would sidestep some of the ethical issues associated with embryonic stem cells. There is also research into using tissue-specific stem cells from adult donors.

    A much more detailed FAQ on stem cell research can be found on the ISSCR site, while the US National Academy of Sciences has lots of info too. Additional resources may be found on the Applied Biosystems site and at the Stem Cell Companies site.

Genetic Manipulation

European corn borer

Are you happy to eat genetically modified foods? What about your friends and colleagues? Do the GM pros outweigh the cons?

I asked a few contacts for some answers by way of building up to a more formal response to those kinds of questions that will be published soon in the International Journal of Biotechnology (IJBT, 2008, 10, 240-259).

Plant geneticist Dennis Lee, Director of Research at mAbGen, in Houston, Texas, suggests that GM crops have several significant advantages. "Total cost per acre can actually be significantly less for GM crops," he says. This is particularly true for crop species, such as maize, that have been modified to produce natural toxins that fend off insect pests or to protect the crop from the herbicides needed to keep weed growth to a minimum. However, he points out that, "In practice, this is often not the case – farmers tend to err on the side of caution and continue to use significant amounts of pesticides and herbicides."

That said, crops can also be modified to grow in substandard conditions, such as strains of tubers grown in Kenya that are capable of surviving both drought conditions and high-salt soils. “Obviously, this is beneficial to yield – you can actually get some food out of places where you previously could not,” adds Lee. In addition, it could be possible to modify some crops to have greater nutritional content, such as the so-called “golden rice” project by Ingo Potrykus then at the Institute of Plant Sciences of the ETH Zurich.

One of the biggest perceived problems regarding GM crops is the possible contamination of other species. What if herbicide-resistance genes could jump into weed species, for instance? Lee points out that this putative problem can be overcome by using terminator technology to prevent genes from jumping. "However, in doing so, it creates a different problem," Lee adds, namely that farmers must buy seed from the agbiotech company each year rather than save seed for planting. One might say that this is an exploitative industry focused purely on maximizing profits, but at the same time it solves a serious technical problem that has been seen as one of the biggest stumbling blocks to the acceptance of GM crops.

Jeff Chatterton, a Risk and Crisis Communications Consultant at Checkmate Public Affairs, in Ottawa, points out that the pros are well documented: increased yield per acre, ease of use and perhaps, some day, increased 'consumer level' benefits such as higher nutritional values. But he echoes others' comments on the hidden con of farmers the world over potentially being locked into the agbiotech company's seed and having no recourse to produce their own from one year to the next.

"As traditional family farms are increasingly moving towards 'Roundup Ready' corn or soybeans, you're increasingly seeing a change in the business model of farming," he says. "Rather than 'family farms' using traditional farming practices, agricultural operations are increasingly becoming factory farms." It might be said that the emergence of factory farms is occurring outside the realm of GM crops, but with pressure being applied to produce more and more crops for non-food purposes, including biofuels, unique polymers, and other products, the notion of a factory farm that doesn't even feed us could become an increasing reality.

Lee also mentions an intriguing irony regarding the public perception of the risks and benefits of GM crops: the toxin produced by modified Bt maize is exactly the same as that produced by the natural soil microbe Bacillus thuringiensis (Bt) itself, and it is this same Bt toxin that so-called "organic" farmers are usually allowed to use instead of "synthetic" pesticides.

Information Technology and Services Professional Bill Nigh of Bluenog, based in New York, provides perspective as a lay person. “We’ve been engaged in genetic manipulation for a long time now,” he says, “but it was limited by the technology at hand. With recombinant DNA it’s a remarkably more vast field of play and a whole new ball game.” He stresses that his main concern regarding GM crops is that, “We seem to be just smart enough to make drastic breakthroughs and inventions, and are driven by the dynamics of the marketplace and ego to produce a lot of new things quickly. However our systems of governance, oversight and coordination are not mature enough to work through the implications of those new things in a timely fashion, especially the unforeseen synergies the breakthroughs can unleash.”

All that said, an international team has now investigated the various issues and has assessed the public's Willingness to Accept (WTA) GM foods based on experimental auctions carried out in France, the UK, and the USA. Wallace Yee, lead author of the IJBT paper and now at the University of Liverpool, worked, while at Reading University, with colleagues in various disciplines, from agricultural and food science to business and economics, in Italy, New Zealand, the UK, and the US to explore perceptions of risk and benefits, moral concerns, and attitudes to the environment and technology.

"Trust in information provided by industry proved to be the most important determinant of risk/benefit perceptions," the researchers conclude, "while willingness to accept followed general attitudes to the environment and technology." They also found that educational level and age could enhance perceived benefits and lower perceived risks of GM foods. "Our research suggests that trust-building by industry would be the most effective approach to enhancing the acceptance of GM foods," the team says.

“If the industry could educate people that GM technology does not pose any threat to the environment, but provides benefits to society as a whole and consumers as individuals, the attitudes of the public towards GM in food production would be favourable, and in turn increase their willingness to accept,” they conclude.

Computing professional Paul Boddie of Oslo, Norway, coming at the issue of GM crops from an indirect angle provides an allusion to computer programming that seems quite pertinent and was originally attributed to Brian Kernighan, which Boddie suggests readily transfers to other disciplines including genetic engineering: “Everyone knows that debugging is twice as hard as writing a program in the first place. So if you are as clever as you can be when you write it, how will you ever debug it?”

Yee, W.M., Traill, W.B., Lusk, J.L., Jaeger, S.R., House, L.O., Moore, M., Morrow, J., Valli, C. (2008). Determinants of consumers' willingness to accept GM foods. International Journal of Biotechnology, 10(2/3), 240-259. DOI: 10.1504/IJBT.2008.018356

Interstellar Molecular Thermometer

Carbon monoxide

Astronomers have detected, for the first time in the ultraviolet region, the spectroscopic signature of the carbon monoxide molecule in a galaxy located almost 11 billion light-years away. The molecule had eluded astronomers for a quarter of a century.

The detection now allows them to obtain the most precise measurement of the cosmic temperature just 1.5 billion years after the Big Bang (give or take 25 years). The team used the UVES spectrograph on ESO's Very Large Telescope (it does what it says on the tin) to record the signal from a well-hidden galaxy whose light has taken four-fifths of the age of the Universe to reach Earth. Apparently, it was at a rather balmy (compared with today's temperatures) nine-and-a-bit kelvin.
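The "thermometer" here is the standard cosmological result that the microwave background temperature scales linearly with redshift, T(z) = T0(1 + z). A minimal sketch of that relation, assuming the present-day value T0 = 2.725 K and an illustrative redshift of z ≈ 2.4 for a sightline of roughly this lookback time (the system's exact redshift is not quoted above):

```python
# Predicted cosmic microwave background (CMB) temperature at redshift z,
# using the standard linear scaling T(z) = T0 * (1 + z).
T0 = 2.725  # kelvin, the present-day CMB temperature

def cmb_temperature(z):
    """Return the predicted CMB temperature in kelvin at redshift z."""
    return T0 * (1 + z)

# z ~ 2.4 is an illustrative value for light emitted ~11 billion years ago;
# the prediction lands close to the roughly nine-kelvin figure mentioned above.
print(cmb_temperature(2.4))
```

Comparing a measured molecular "thermometer" such as CO against this prediction is what makes the observation a test of Big Bang cosmology rather than just a temperature reading.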

More details from the European Southern Observatory site.

Balls to the Dinosaurs, Oceanic Oxygen, and a Nano Flush

Dinosaur balls

The May issue of my Spotlight column over on the Intute site is now online, this month featuring:

Flush with nanoparticles – What happens to carbon-based nanoparticles when they enter groundwater? Can municipal water supplies filter them out? And, if they cannot, will they cause health problems? These are crucial questions that need answers now, as nanotechnology grows. Now, a new study by Kurt Pennell, of the Georgia Institute of Technology, and colleagues, suggests that subtle differences in the solution properties of the water carrying such particles can determine their ultimate fate.

Ocean oxygen starvation – Oxygen-poor regions of tropical oceans are expanding as the oceans warm, limiting the areas in which predatory fishes and other marine organisms can live or enter in search of food, according to a major ongoing marine exploratory project. The phenomenon could cut overall marine biodiversity.

The Collaborative Research Centre programme – Climate: Biogeochemistry Interactions in the Tropical Ocean – funded by the German Research Foundation is working in close cooperation with the University of Kiel, and researchers at the Scripps Institution of Oceanography at the University of California San Diego.

Carbon balls to the dinosaurs – Palaeontologists presume that an asteroid impact led to such enormous and widespread environmental upheaval that it wiped out the dinosaurs and thousands of other species when it struck the Earth. Now, researchers from Italy, New Zealand, the UK, and the USA suggest that the impact force was so great that it would have liquefied carbon in the planet's crust and sprayed tiny airborne carbon beads into the atmosphere in unimaginable quantities.

You can check out the Spotlight archives via the Sciencebase recent scientific discoveries page.

Teatime

Chocolate teapot

I commented on a post on the Bad Language blog, produced by my good friend Matthew Stibbe, earlier this week. He was waxing lyrical about cutting power consumption in his SOHO and mentioned how he prefers to brew tea with freshly drawn water. I pointed out that while this may have benefits, it would actually increase his kettle limescale problems through the addition of extra calcium and magnesium ions. The effect will be negligible, but if we are adding up every single kilowatt-second then it could make a difference. Of course, brewing tea is not environmentally friendly in the first place and we should all really be drinking trapped dew under a hessian bivouac, or somesuch.

Anyway, Matthew immediately followed up my comment with a defence of using freshly drawn water for making a cuppa. He's a man after my own heart. I've done this once or twice in the past and it exemplifies precisely how blogs are, if nothing else, a dialogue (please don't prove me wrong by not commenting on this post…)

I'd better qualify my boiling/reboiling comment on his blog. Chemically speaking, the difference between starting with freshly drawn water each time and reboiling what is already in the kettle comes down to the formation of insoluble calcium and magnesium salts. With freshly drawn water you're adding new metal ions, which will effectively add to your limescale. However, the de-hardening of hard water by heating is not a perfect process, so some ions will be retained in the beverage once you pour it over the tea leaves; the actual balance depends on how soft or hard your water supply is in the first place.
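For the record, the limescale chemistry at work is just the familiar decomposition of temporary hardness on heating; for the calcium case (the magnesium salt behaves analogously):

```latex
\underbrace{\mathrm{Ca(HCO_3)_2\,(aq)}}_{\text{dissolved temporary hardness}}
\;\xrightarrow{\ \Delta\ }\;
\underbrace{\mathrm{CaCO_3\,(s)}}_{\text{limescale}}
\;+\; \mathrm{H_2O\,(l)} \;+\; \mathrm{CO_2\,(g)}
```

Each fresh fill brings a new dose of dissolved hydrogencarbonate, which is why topping up with freshly drawn water adds to the scale.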

However, now that I've had a glass or two of vino (at the time of writing), it has also occurred to me that there are lots of other, organic, components in fresh tapwater, such as humic acids and organochlorine compounds (possibly even fluorine compounds, depending on where you live). These will presumably be degraded and/or boiled off to a degree with the first boil. With a second boiling, it is more likely that you will get rid of all these flavoursome ingredients from the water. So, perhaps there is something in the use of fresh water for the best cuppa, but it's marginal, given that any flavours in the water will essentially be overwhelmed by the flavour of the tea itself. It's like worrying about the sounds they leave out when compressing a music file into mp3 format.

Meanwhile, the origins of tea lie in an attempt at "storing" water in Asia, so legend goes, and to protect it from contamination by pathogens (namely cholera, although the agent was not known at the time). The polyphenolics and other materials that tea infuses into the water are to a degree antimicrobial, but perhaps more importantly the simple act of boiling kills off the microbes quickly and succinctly without any recourse to chemistry.

In the “West”, the equivalent solution to the great clean water problem was the addition of fermenting fruits and the subsequent production of wine or beer depending on the region. It’s thought to explain why westerners have evolved an enzyme to break down alcohol and its metabolites whereas some Asians lack this enzyme system.

Given the choice between a freshly brewed cuppa and a glass of wine, I know which I prefer, especially at this time of the evening… now where's that corkscrew?

Accounting for Research

Accounting for scientists

How does one measure the worth of the science base? From the scientists’ perspective it is their bread and butter, or low-fat spread and rye biscuit, perhaps, in some cases. From industry’s standpoint, it is occasionally a source of interesting and potentially money-spinning ideas. Sometimes, it sits in its ivory tower and, to the public, it is the root of all those media scare stories. At worst, the science base is perceived as a huge drain on taxpayers’ money, especially when the popular press gets hold of ‘spiders on cannabis’ and the ‘scum on your tea’ as the lead science stories for the year!

For the government though, which more often than not is providing the funds for basic research, the science base is crucial to all kinds of endeavours: wealth creation, the development of fundamental science into practical technology, the demolition of those ivory towers and the mixing of scientists with the great industrial unwashed through collaboration. As such, governments try to ensure that the science they fund is accountable – to government, to its sponsors and to society and the public as a whole.

But, I come back to my first question. How does one measure the impact of basic research on society? If one went begging for funding for a new area in chemistry with no practical applications anywhere in sight, funding would likely be meagre. It can be dressed up, of course: natural product chemistry almost always has the potential for novel medicinally active compounds, while even the most esoteric supramolecular chemistry could be the nanotechnology breakthrough we have been waiting for. You've seen, and maybe even written, the applications yourself. On the other hand, take any piece of genetics with the potential to cure some likely disease and the cash will usually roll in, at least relatively speaking.

So, what does quality mean when applied to scientific research? Was the discovery of the fullerenes quality science? Well, yes it obviously was in that it stirred up the chemistry and other communities and generated mass appeal for a subject that gets rather less of an airing in a positive light than certain other sciences. Fullerenes also provided some of the scientists involved with a Nobel Prize so someone in Sweden must have liked it.

But, if we were to apply any kind of standard criteria of usefulness to society, we would be hard pushed to give it the highest score, except as a demonstration that fundamental science can still excite. After all, have you seen any real applications yet? I touched on the potential for medicinal fullerenes early in their rising-star days, and it is probably unfair to single them out for accountability, especially as they ultimately inspired the carbon nanotubes. You might say that they are simply one of many examples of science as art. They allow us to visualise the world in a new way; they are beautiful – chemically, mathematically, physically.

The pressure is now on scientists to face up to some imposing questions as government-mandated requirements begin to come into effect. [This has become a moot point in the UK since this article was first aired, given funding cuts for big, esoteric science projects]. Efforts to make science accountable come with a massive burden of controversy and are hindered by the almost impossible task of measuring creative activities such as research. Added to this, accountability requires increasing levels of administration especially at times of formal assessment for the scientists themselves.

The careers of most scientists hinge on these assessments, in more ways than one, as the pressure on faculty pushes them in directions they may not naturally go, producing research papers just to satisfy the assessment process, for instance. This, coupled with a general drive to bring science to the public through media initiatives, and so demonstrate to people why science is important and why their money should be spent on it, just adds to the pressure.

However, despite the marketing-style talk of stakeholders, and the close industrial analogues, the shareholders, basic scientific research is not about customers and churning out identical components on a production line. There are usually no targets and no truly viable and encompassing methods to assess the quality of any part of the scientific endeavour. Ironically, this means the end-of-year bonus is something on which most scientists miss out, regardless of their successes. Science is the art, technology makes it pay, but some art is fundamental or avant garde and some finds its way on to advertising hoardings. Which do you prefer, fine art or glossy brochure?

By forcing basic science to become accountable in terms of product and efficiency, there is the possibility that creativity and autonomy will be stifled. Done right, however, accountability can strengthen the relationship between research and society.

Measuring the socioeconomic benefits from specific scientific investments is tough. Basic research gets embodied in society's collective skills, putatively taking us in many more directions than we would otherwise have headed. As such, it can have a future impact on society at entirely unpredictable points in time. Who knows where that pioneering fullerene chemistry will have taken us by the end of this century?

Sir Harry Kroto, co-discoverer of the fullerenes told me in an interview once that, “Scientists are undervalued by a society that does not understand how outstanding someone has to be to become a full-time researcher.” Maybe the measure of science is in its beauty rather than its assessment scores.

A Wrench for Social Engineering

Social engineering attacks, what used to be known as confidence, or con, tricks, can only be defeated by potential victims taking a sceptical attitude towards unsolicited approaches and requests for privileged information and resources. That is the message from European researchers.

Most of us have probably received dozens of phishing messages and emails from scammers on the African continent seeking to relieve us of our hard-earned cash. Apparently, these confidence tricksters are so persuasive that they succeed repeatedly in hustling funds even from those among us with a normally cynical outlook and awareness of the ways of the world.

On the increase too are cowboy construction outfits and hoax double-glazing sales staff who wrest the life savings from senior citizens, and so-called boiler-room fraudsters who present get-rich-quick schemes so persuasively that thousands of unwitting individuals lose money totalling millions of dollars each year.

Con artists and hustlers have always preyed on greed and ignorance. As the saying goes, a fool and their money are easily parted. However, the new generation of social engineers are not necessarily plundering bank accounts with promises of riches untold, but are finding ways to infiltrate sensitive databases, accounts, and other resources, using time-honoured tricks and a few new sleights of hand.

Now, Jose Sarriegi of Tecnun (University of Navarra), in San Sebastian, Spain, and Jose Gonzalez, currently in the Department of Security and Quality in Organizations at the University of Agder, Norway, have taken a look at the concept of social engineering and stripped it down to the most abstract level (International Journal of System of Systems Engineering, 2008, 1, 111-127). Their research could lead to a shift in attitude that will arm even the least sceptical person with the social tools necessary to spot an attempt at social engineering and stave off the attack with diligence.

Fundamentally, the researchers explain, social engineering is an attempt to exploit a victim, whether an individual or an organization, in order to steal an asset (money, data, or another resource), to make some resource unavailable to legitimate users in a denial-of-service attack, or, in the extreme, to instigate some terrorist, or equally destructive, activity.

Of course, a social engineering attack may not amount to a single intrusion; it could involve layer upon layer of deceptions at different places and on different people and resources. The creation of a sophisticated back-story, access to less sensitive resources, and targeting of the ultimate goal is more likely to be a dynamic process. This, the researchers suggest, means that looking for "heaps of symptoms", as might occur in attempting to detect someone breaking into a computer system, is no longer appropriate, and a dynamic response to a dynamic attack is more necessary now than ever before.

Recognising the shifting patterns of an ongoing and ever-changing social engineering attack means better detection of problems low on the metaphorical radar, the team suggests. Better detection means improved efficacy of security controls. The best defence is then to build, layer by layer, feedback loops that can catch an intruder at any of many different stages, rather than relying on a single front-line defence that might be defeated with a single blow.
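To make the layered idea concrete, here is a minimal sketch, my own illustration rather than anything from the Sarriegi and Gonzalez paper, in which every request must pass several independent checks, including a feedback loop that accumulates suspicion across repeated probes. The check names, thresholds, and data are all invented for the example:

```python
# Illustrative sketch of layered defence: each layer inspects the same
# request, and any single layer can stop the attack.

def check_caller_identity(request):
    # Layer 1: does the caller's claimed identity match internal records?
    return request.get("employee_id") in {"E100", "E200"}

def check_request_pattern(request):
    # Layer 2: is the request for an unusually sensitive resource?
    return request.get("resource") not in {"payroll_db", "admin_panel"}

def check_escalation_history(request, history):
    # Layer 3: feedback loop - repeated small probes from one source
    # accumulate into a red flag even if each probe looks innocent alone.
    return history.count(request.get("source")) < 3

def is_permitted(request, history):
    """Allow only if every layer passes."""
    return (check_caller_identity(request)
            and check_request_pattern(request)
            and check_escalation_history(request, history))

history = ["x", "x", "x"]  # three earlier probes from source "x"
probe = {"employee_id": "E100", "resource": "hr_docs", "source": "x"}
print(is_permitted(probe, history))  # False: layer 3 catches the repeated probes
```

The point of the structure is exactly the one the researchers make: a single front-line test (layer 1 here) passes this probe, but the feedback loop watching the attack's history does not.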

Latest on Spectral Lines

Spectral Floyd

There have been 32 issues of my science news column on spectroscopynow.com since it was last officially called Spectral Lines, but I thought it was a nice name, so I occasionally resurrect it here when I highlight the latest research findings I cover on the site. It also gives me an excuse to re-use a logo I did in the early days of the site touting the line "David Bradley On Spec" (geddit?).

So this week, the first May issue is brought to you by the letter "F", with articles entitled: Fishing for amines, Fancy ants for arthritis, and Fixing chemotherapy. We also have Rewiring brains therapeutically, Hybrid contact, and Boning up with Raman, but they don't start with an "F" so required a separate sentence. Anyway…

Those fancy ants are perhaps not the first organism one would think to turn to for medical assistance, but researchers in Hong Kong and Japan have now used spectroscopy to study the chemical structures of various compounds extracted from Chinese medicinal ants that are thought to have anti-arthritic activity and to be beneficial in treating hepatitis. There are lessons to be learned here regarding the harvesting of traditional knowledge from folk medicine, as well as yet another reason to try to conserve biodiversity the world over.

In Rewiring brains therapeutically, Edward Taub and colleagues at UAB use MRI scans to lay to rest once and for all the medical myth that the adult brain cannot grow new neurons. They show that a form of therapy, developed by Taub in the early 1990s for helping stroke patients recover use of paralysed limbs, so-called constraint induced (CI) therapy, really does induce a remodelling of the brain.

And in my Hybrid contact item, I discuss how early attempts to create protein-polymer hybrid materials often foundered because the mixed chemistry was simply not up to the task. Now, a UCB team has developed a new approach to hooking up natural proteins with synthetic polymers that could work with almost any protein and any polymer and could be used to develop new types of chemical sensor for medical diagnostics, quality control and environmental analysis. Related materials might also work as highly targeted drug-delivery systems, or even as the components of a future nanomachine.