Scientific stereotypes

Scientific stereotypes persist, stretching even as far as recent Google acquisition, Youtube, the social video upload site.

Today, Youtube had a period of allegedly scheduled downtime and, to explain the lack of vids, displayed a cartoon showing a marginally mad scientist (albeit a youngster rather than the usual aged, balding variety). The scientist in question seemed malevolently preoccupied with pouring one green liquid from a test-tube into another. For what purpose we’re not told, but the caption beneath read:

“We’re busy pushing out some new concoctions and formulas. We’ll be back soon…assuming all reactions are stable”

Inherent in that phrase is a fundamental lack of understanding of chemistry, of course. What, after all, is a “stable reaction”? A reaction, by virtue of being a reaction, is anything but stable; it is intrinsically unstable, in constant flux…reacting! Perhaps they meant to say “steady reaction” instead: a reaction can be steady, with a constant conversion of starting materials into stable products and byproducts, as opposed to an explosive reaction, which one might describe as unsteady. Or perhaps that is what they were alluding to in using the word stable, somehow implying that reactions to their new concoctions might prove unstable and lead to an extension of the scheduled downtime.

I suspect most Youtube users will not care one ion. But the pushing of scientific stereotypes in popular culture is a serious issue. With scientists repeatedly characterised as mad, malevolent or, at best, absent-minded, it is difficult to see how the general public will ever reach a point at which they understand or trust the scientific endeavour.

You can read a feature article I wrote on the subject of scientific stereotypes for the now defunct HMSBeagle webzine on BioMedNet here. Note the hopefully ironic use of a benevolent, slightly madcap, and certainly balding character as illustration.

Viruses Versus Bacteria

In 1919, long before antibiotics were commonplace and long before the notion of drug resistance had emerged, Felix d’Herelle, the scientist whose work would later make the eastern European state of Georgia the home of phage therapy, gave a patient suffering from severe dysentery a seemingly lethal concoction of viruses. You might think such a drink would kill the patient, but these were no ordinary viruses: they were bacteriophages, the nemesis of bacteria.

The patient was well again within a week.

Thus was ushered in the age of phage therapy. Different viral strains were selected for almost every bacterial infection. Diseases were cured. What’s more, because bacteriophages are themselves in some sense alive, they can evolve to keep up with any resistance efforts mounted by the bacteria.

So what happened to bacteriophages? Why are the news headlines filled with stories of deadly new bacteria, such as MRSA, and newly re-emerged forms of tuberculosis? Why are we so worried about outbreaks of E. coli, salmonella, and other bacteria? Surely, we have a whole armoury of trusty phages to turn to that can wipe out the rank and file of resistant microbes quickly?

Well, we don’t. Somewhere between the discovery of penicillin and the Second World War, chemical antibiotics fell into pharmaceutical line as the treatment of choice for bacterial infections. Never mind the fact that within months of the first doses of penicillin being given, doctors were already seeing resistance. Today, there are thousands of antibiotics on the market, some even available over-the-counter in southern Europe. Moreover, in countries that cannot really afford to use them, individuals receive short dose regimens that fail to cure their illness and provide new opportunities for bacteria to develop resistance genes.

Swiss science editor Thomas Häusler tells the story of bacteriophages and phage therapy from its humble roots to its dimly recalled heyday of the 1920s and 1930s in his book Viruses vs. Superbugs. He tells a tale of rancidity and disease that were all but eradicated by bacteriophages but that are gradually returning as hospital wards succumb to the resistant hordes and various sectors of society, such as drug users and the homeless, are dealt a deadly blow as TB and other “old” diseases crawl the streets.

In the USA alone, some 90,000 people die each year from these so-called superbugs. The World Health Organization and other official bodies agree that things can only get worse. Perhaps a discovery from the middle of the Great War of 1914-1918 could take the place of the dozens of obsolete antibiotics stacked on pharmacy shelves and provide a final cure for the bacterial infections that, until the 1960s, the medical profession had all but consigned to the history books.

Intelligent Dawkins Debate


The Intelligent Design and anti-evolution lobbies often argue that evolution is but a theory and that opposing theories must be taught in order to be properly scientific about the origins of the human race. Well, if it’s debate they want, then it’s debate they shall have. The Education section of the Guardian reports that the UK government wants religious education classes for 11- to 14-year-olds to encompass the notion of intelligent design (ID) and to highlight texts such as the writings of evolutionary biologist Richard Dawkins, Galileo, and Charles Darwin.

It’s about time. While it is all well and good giving our children an education that offers them the opportunity to understand the traditional religions – Buddhism, Christianity, Hinduism, Judaism, Islam etc – the only way to get a true perspective on philosophical thinking is to provide them with the perspective of those who have no religion.

“ID,” The Guardian says, “argues that the creation of the world was so complex that an intelligent – religious – force must have directed it.” The debate has been an incredibly contentious issue for scientists and “people of faith” in Britain and the US in recent years, with several education boards (Kansas in the US, Gateshead in the UK) famously scratching evolution from the curriculum because it is purportedly an “unproven theory”.

Scientifically speaking, evolution is a theory, of course, and, as all good scientists know, theories cannot be proved. Science can only look for contradictory evidence that requires said theory to be refined, or discarded if too many observations conflict with its predictions. Scientists have yet to find any such conflicting evidence when it comes to evolution. In contrast, there is much evidence that ID “as a theory” is wholly invalid.

Take the eye, for instance. How on earth could such a device have been designed and, if it were, why were so many variations developed, from the simple light sensors of flatworms to the prismatic arrays of fruit flies to the honed sensors of the golden eagle?

Debate is a good thing, and it is certainly a positive step to at least address the concerns of scientists about the degrading of evolutionary theory by the ID lobby, but there is the worry that 11- to 14-year-olds who are not generally keen on science will become even more confused by the complexities of evolution as a sound explanation for the origin of species. It might even nudge a proportion of them towards the far more easily grasped fairy tales of benevolent sky gods.

How do you feel about this development? Does evolution have a place alongside Intelligent Design in religious education or should they both be kept for science lab debates?

On the origin of chemical species

Organic chemist Dan Lednicer has provided us with a guest Sciencebase editorial. “The enormous strides that have recently been made in molecular biology hold great promise for speeding the discovery of pharmaceuticals to treat diseases that have so far been recalcitrant to drug therapy,” he explains, “and the day may well be in the offing when a majority of important new pharmaceutical products will owe their existence to carefully crafted research programs based on the increasingly detailed understanding of the molecular biology involved in the particular disease that is being addressed.

Read the full feature from Dr Lednicer here in Serendipity and Science

Sex and diabetes

Approximately half of men with diabetes suffer at least one episode of erectile dysfunction and there are several strategies available to overcome what is in those cases usually a problem of body chemistry. According to a report in the Cochrane Review of clinical trials, the well-known drugs for treating erectile dysfunction really do improve sexual satisfaction for sufferers. The report covers the three main phosphodiesterase type 5 (PDE-5) inhibitors, sildenafil (Viagra), vardenafil (Levitra) and tadalafil (Cialis).

According to the study, side-effects, such as headache and flushing, are common, but not sufficiently adverse as to lead users to abandon the drug.

The Cochrane Review draws data from eight clinical trials (totalling almost 1800 participants) in which 976 men had been given a PDE-5 inhibitor, and 741 a placebo.

‘If taken as prescribed and when no contra-indications exist, PDE-5 inhibitors provide a useful option for men with diabetes who suffer from erectile dysfunction,’ says report author Moshe Vardi of the Carmel Medical Center, in Haifa, Israel.

You can read the abstract from the report at the Cochrane Library site. For more on the origins of Viagra and the other PDE-5 inhibitors that followed in its wake, check out the Sciencebase archives.

Mobile Phones and Cancer

The UK’s Times newspaper reported on Saturday that a leading cancer researcher, Professor Lawrie Challis, chairman of the government-funded mobile telecommunications and health research programme, believes it is time for a large-scale study of the long-term risks associated with cellphone use.

Intriguingly, health and medicine writer Caroline Richmond pointed out that just such a study had actually been published three days before The Times article appeared.

The abstract of this paper, from STUK, the Radiation and Nuclear Safety Authority in Helsinki, Finland, says:

“We conducted a population-based case-control study to investigate the relationship between mobile phone use and risk of glioma among 1,522 glioma patients and 3,301 controls. We found no evidence of increased risk of glioma related to regular mobile phone use (odds ratio, OR = 0.78, 95% confidence interval, CI: 0.68, 0.91).” The study encompasses digital and analog mobile phone use lasting ten years.
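A brief aside for readers unfamiliar with figures like “OR = 0.78, 95% confidence interval: 0.68, 0.91”: the odds ratio compares the odds of exposure (here, regular phone use) among cases with the odds among controls. The sketch below shows the standard crude calculation from a 2x2 table; the counts are invented purely for illustration and are not the STUK data (the published figure comes from a regression model adjusted for confounders, not a raw table like this).

```python
import math

# Hypothetical 2x2 case-control table (invented numbers, NOT the STUK data)
#                   glioma cases   controls
# regular users          500          1000
# non-users              700          1200
cases_exposed, controls_exposed = 500, 1000
cases_unexposed, controls_unexposed = 700, 1200

# Crude odds ratio: odds of exposure among cases divided by odds among controls
odds_ratio = (cases_exposed * controls_unexposed) / (controls_exposed * cases_unexposed)

# Approximate 95% confidence interval via the standard error of log(OR)
se_log_or = math.sqrt(1 / cases_exposed + 1 / controls_exposed +
                      1 / cases_unexposed + 1 / controls_unexposed)
lower = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
upper = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)

print(f"OR = {odds_ratio:.2f}, 95% CI: {lower:.2f}-{upper:.2f}")
# An OR of 1 means no association; a CI that excludes 1 indicates statistical significance
```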

More than 200,000 volunteers and £3 million ($6m) of government and phone-industry money would be needed to assess risks over the long term, five years or so, for cancer as well as Parkinson’s and Alzheimer’s diseases. Challis is currently negotiating for the necessary funding.

It is odd that this news story broke so close to the paper’s online publication in the International Journal of Cancer. It also makes one wonder why there seems to be such a continued “hope” among certain segments of the media to find a correlation between mobile phone use and brain cancer. Surely, there isn’t an expectation that, if such a correlation were ever demonstrated, the industry would cough up compensation to the literally millions upon millions of regular, long-term mobile phone users. Moreover, if such a demonstration were published, might not similar investigations once again raise concerns about other sources of electromagnetic radiation, such as powerlines, computer screens, microwave ovens and, most recently, wireless internet connections?

What do Sciencebase readers think? Would this be £3m well spent, or shouldn’t The Times simply publish a front-page story about the STUK study, so similar to the one Challis is after, that has already been carried out, peer reviewed and published?

Plos One latest

Plos One, the new OA science journal whose launch we announced here on January 1, seems to be building up quite a head of steam; it’s almost superheated, in fact (more on that via the link). There are some rather fancy paper titles on its homepage as I write, covering some very disparate subject areas, which is what the journal needs if it is to compete in the open market with the likes of Nature, PNAS, and Science. Among the latest, at the time of writing, are:

Superheated Water

TL;DR – Superheated water is water that has been heated above its boiling point while held under sufficiently high pressure to keep it liquid. Technically, it is at a temperature higher than its vaporization point at the absolute pressure where the temperature is measured. It is possible to superheat liquids other than water for a range of industrial uses.
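To put rough numbers on that definition, here is a minimal sketch that uses the Antoine equation to estimate the saturation vapour pressure a sealed vessel would have to withstand to keep water liquid at a given temperature. The coefficients are commonly quoted values for water above 100 degrees Celsius, but treat them, and the whole calculation, as an illustrative assumption rather than engineering guidance.

```python
import math

def water_vapour_pressure_atm(temp_c: float) -> float:
    """Estimate water's saturation vapour pressure (atm) via the Antoine equation.

    The coefficients below are commonly quoted values for water in roughly the
    99-374 degC range, with pressure in mmHg; treat them as illustrative only.
    """
    a, b, c = 8.14019, 1810.94, 244.485
    p_mmhg = 10 ** (a - b / (c + temp_c))
    return p_mmhg / 760.0

# A sealed vessel holding liquid water at each temperature must withstand at
# least this pressure, otherwise the water flashes to steam.
for t in (100, 110, 120, 150):
    print(f"{t} degC: ~{water_vapour_pressure_atm(t):.2f} atm")
```

Run as written, the sketch suggests roughly 1 atm at 100 degC, about 1.4 atm at 110 degC and nearly 5 atm at 150 degC, which is why superheating normally needs a pressure vessel rather than a kitchen mug.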


WARNING: DO NOT TRY THIS AT HOME

Put simply, superheating involves raising the temperature of a liquid beyond its boiling point without allowing it to vaporize. This can be done by heating water in a sealed container above 100 degrees Celsius. There is an urban myth that has done the rounds for many years that it is possible to superheat the contents of a liquid-filled cup in a microwave and trigger a geyser of fluid when you remove it and stir. Who hasn’t received the spam-mail describing the 26-year-old who was severely disfigured by such an incident?

Well, there are risks associated with all cooking, and heating a liquid in a microwave for long enough will produce a liquid at boiling temperature and a container surface coated with scalding-hot “condensation” that could make you jolt and splash yourself with scalding liquid.

Apparently, it happens, so be careful. However, I think it would be hard to actually “superheat” the liquid, although the guys in this video may have done just that using pure, distilled water or similar.

The liquid would have no so-called “nucleation” points (specks of dust, particles, or even scratches inside the beaker) to seed bubbles of steam. The water could very easily surpass its boiling point without actually boiling.

Like Snopes says, it is possible but takes a lot of effort to cause superheating in a normal cup under normal conditions in a microwave oven. Nevertheless, it’s not worth risking a scalding in an attempt to duplicate the above experiment with your morning coffee.

Ambiguous aspirin

Two back-to-back papers published recently in the well-known chemistry journal Angewandte Chemie could have potentially serious consequences for the pharmaceutical industry, because they reveal what the authors claim are inherent ambiguities in the crystalline forms of aspirin.

A team of scientists from Denmark, Germany, and India suggest that the recently reported form II of the ubiquitous pharmaceutical may indeed exist, but that the crystallographic evidence could just as readily be interpreted as coming from a single crystal of form I. The findings could have implications for patent arguments over novel forms of the purportedly generic drug.

Even from the early days of crystal studies into aspirin, there were serious issues surrounding its structure. PJ Wheatley obtained the first crystal structure in 1964, but certainly not without a degree of ambiguity. “After Wheatley, Chick Wilson got very high R-factors in his neutron study of 2000,” team member Gautam Desiraju told SpectroscopyNOW.com. “This is what was nagging me throughout: why were these R-factors so high?” Desiraju and his colleagues suggest that the Zaworotko study, which reported form II, does not represent much of an improvement on the precision of these earlier studies and, moreover, confounds attempts to define a new polymorphic form of aspirin.
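For readers unfamiliar with the jargon, the crystallographic R-factor is a standard measure of how far the structure-factor amplitudes calculated from a model deviate from those observed in the diffraction experiment; a higher value means a poorer fit. Here is a minimal sketch of the conventional calculation, with amplitudes invented purely for illustration and not taken from any of the aspirin studies.

```python
# Conventional crystallographic R-factor:
#   R = sum(| |F_obs| - |F_calc| |) / sum(|F_obs|)
# The amplitudes below are invented purely for illustration.
f_obs  = [120.5, 85.2, 60.7, 45.1, 30.9]   # observed structure-factor amplitudes
f_calc = [118.0, 88.4, 58.9, 47.3, 29.5]   # amplitudes calculated from the structural model

r_factor = sum(abs(o - c) for o, c in zip(f_obs, f_calc)) / sum(f_obs)
print(f"R = {r_factor:.3f}")  # lower is better; a high R hints the model fits the data poorly
```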

Read the full story on SpectroscopyNOW.com

Antibiotics from green tea

Researchers from Slovenia have used spectroscopy to home in on the active site of an essential bacterial enzyme, DNA gyrase. They say they now understand more clearly how a compound found in green tea, EGCG, which is a health-boosting antioxidant, works to kill bacteria.

The findings should allow researchers to design new, synthetic versions of EGCG that improve on its activity without side effects.

“I think that this direction is worth pursuing,” team leader Roman Jerala told me. “EGCG, besides being unpatentable, is not very stable in the body and has low bioavailability, but this could be improved.” In their paper, the researchers discuss several possible research directions; however, Jerala concedes that he and his colleagues lack the synthetic capabilities to pursue them. “We could only go in this direction with support from other labs,” he says. “Hopefully pharmaceutical companies will consider it.”

More…