Adverse Drug Reactions

A statue of Asclepius. The Glyptotek, Copenhagen.

The Wall Street Journal reports (Jan 2, 2009) that a new collaboration between pharmaceuticals giant Pfizer and two Boston hospitals will test whether computerized patient records can boost reporting of adverse drug reactions (ADRs) by making it a routine part of filling out electronic patient charts.

Some time ago (Catalyst column, ChemWeb.com, June 1998), I discussed the implications of the more than 100,000 deaths in the US each year allegedly caused by patients’ reactions to their medication – three times the number killed in car accidents. So-called adverse drug reactions (ADRs) are estimated to be the fourth biggest killer in the US after heart disease, cancer and stroke. Recently, there has been an upsurge of interest in ADRs and calls in the US for an independent body to be established to make control of drugs easier once they have passed through the regulatory process, and so save lives.

That 100,000 is just a statistic of course, except for those patients and their loved ones affected. Every drug has side-effects and although they do not exist through malicious design, one can perhaps see that the drug R&D process is not perfect.

A pharmaceutical company, for reasons of economics and politics, cannot possibly study the effects of every putative drug on every ‘type’ of individual in the different circumstances in which it might be used. This is where medication monitoring services come in handy. Pharmacogenomics and personalised medicine that focus on each patient’s single nucleotide polymorphisms (SNPs) may remedy this. But, despite the emergence of inexpensive genomics and predictions of the $1000 genome, this remains a problem when it comes to administering drugs to the elderly and to children, who can be more sensitive than the proverbial average adult. Moreover, in the supposedly clinically controlled environment of the hospital there are likely to be even more exacerbating factors at work for each individual patient than there might be for a patient with a straightforward bacterial infection, say.

An individual’s genome may be at the root of a particular type of adverse drug reaction, as Catalyst discussed early in 1998. Ten percent of Caucasians and about two percent of Chinese people cannot metabolise the analgesic (painkiller) codeine into its active form, morphine. The drug therefore simply does not ‘work’ for them. The problem boils down to those patients lacking the gene for the liver enzyme CYP2D6 responsible for the conversion. This particular effect was discovered by Alastair Wood, a clinical pharmacologist at Vanderbilt University in Nashville, Tennessee. The drug having no apparent effect might lead the GP to prescribe a higher, perhaps intolerable, dose. For a Chinese person lacking CYP2D6 the result can be severe nausea.

CYP2D6 metabolises a variety of drugs in addition to codeine, for instance the antihypertensive propranolol (Inderal), propafenone (Rythmol) for heart arrhythmia, and many of the tricyclic antidepressants. In these cases, though, people lacking CYP2D6 actually experience an exaggerated effect, as the active form stays in their system longer.

In the hospital environment, muscle relaxants used in anaesthesia can be a particular problem for some patients because they have a faulty gene for the enzyme, butyrylcholinesterase, that would naturally metabolise the drug. For example, succinylcholine stops patients breathing during surgery; this is fine while mechanical ventilation continues, but for some patients the apnoea does not cease when it should and they can die. Peculiar peak concentrations of the TB drug isoniazid have also been seen in some patients and have been correlated with a faulty N-acetyltransferase.

In fact, there are many, many variations in drug response that have been recognised and the pharmaceutical companies are becoming well aware of the potential for profit these variations might bring if they can develop drugs tailored to an individual’s genome. The National Institutes of Health in the US has also recognised the potential for improving medicine and is in the process of establishing a Pharmacogenetic Polymorphic Variants Resource database for genes encoding proteins that determine variations in drug responses.

Pharmacogenomics ties in closely with the reporting of adverse drug reactions, although not all ADRs are due to genes. The anti-obesity drugs dexfenfluramine and fenfluramine, often taken in combination with phentermine – as fen/phen – caused serious ADRs in the form of major heart valve problems in 31% of patients taking the combined medication. The eventual withdrawal of the drug, once the problem was widely recognised and publicly known, was swift, but fenfluramine had by then been on the market for 24 years.

However, while the voluntary reporting of ADRs is fairly common within the medical profession, the reactions themselves are not well known. Indeed, aside from mentioning a few cursory side effects, doctors are often unaware of potentially serious reactions to particular drugs, and this is compounded by the fact that reporting of ADRs is purely voluntary, with the onus on the pharmaceutical companies. As such, many people are unfairly affected by these drugs, and there are actions against pharmaceutical companies, such as the lawsuits over Levaquin, Actos and others. It took twelve years before terfenadine, the antihistamine used by countless hayfever sufferers every summer, was withdrawn in favour of its safer metabolite. The major ADR of terfenadine is potentially fatal heart arrhythmia, especially in users taking certain antibiotics at the same time.

A group of medical scientists led by Alastair Wood published a paper in the New England Journal of Medicine (1998, 339, 1851) calling for an independent drug safety board to be established to keep tabs on ADRs. This body would be there to help protect patients as well as to ensure that medical practitioners were made fully aware of the putative hazards of the countless drugs they prescribe.

According to Wood and his colleagues, ADRs are a serious cause of patient morbidity and mortality. They make the point that independent bodies have been in place for many years to investigate the likes of plane crashes, train and major traffic incidents, and chemical and radiation accidents. These bodies can make recommendations following an accident to prevent similar serious episodes happening again. But there is no organisation with responsibility for monitoring ADRs and for ensuring that proposals put forward following an investigation are taken on board.

The ad hoc approach to reporting ADRs and reactions to drug products seems at odds with the Internet and information technology now available. Wood and his colleagues say that, for all this technology, it is remarkable how little use is made of it for drug surveillance to help avoid the huge numbers of deaths that occur. Drugs such as terfenadine and fen/phen, which do end up being withdrawn by the FDA, are few and far between, and the evidence on which such decisions are based, while strong, is not often in the form of formal statistical analysis. One of the problems is that the US Food and Drug Administration (FDA) does not have the resources to carry this out, nor is it in the interests of the pharmaceutical marketers to gather such data.

Wood and his colleagues believe that the solution to the problem is to make this surveillance obligatory through the creation of a body independent of the agency that carries out drug approvals – the FDA. A second, independent body would help avoid conflicts of interest, in that the FDA would not have to investigate problems with drugs it had approved! In their paper in the NEJM the authors state,

We must expect that predicted and unpredicted adverse events from drugs will continue to occur. If we accept that the true safety profile of a new drug is dependent on the experiment that necessarily follows the drug’s release into the marketplace, then we must fund and implement mechanisms to ensure that the experiment is properly monitored, the data appropriately analysed, and the conclusions disseminated rapidly.

Clinical trials may involve only a few thousand people; once a drug is approved, millions may take it soon afterwards, especially now that companies can market directly to consumers on US television.

Not all ADRs are lethal, just adverse, and some are simply unavoidable because of the individual circumstances in which a drug is administered. They may be unpredictable and unavoidable in some cases but once an ADR occurs the medical community should be made aware of the risks as soon as possible so that better judgements about prescribing a drug can be made and ADRs pushed right down that list of causes of death.

The original version of this article appeared in my Catalyst column in ChemWeb’s The Alchemist in March 1999 – before Vioxx, before TGN1412 – and only the introduction was updated in January 2009.

Automatic for the chemist

UPDATE: This work eventually led to the Synthia software from Merck.

For decades, chemists have toiled over reaction flasks searching for new ways to mix and match atoms to make new molecules with which to cure ills, boost crops and generally improve our standard of living. There are countless still who spend their working days scouring the scientific literature for shortcuts and using trial and error to find fast and efficient synthetic routes to that all-powerful catalyst or a wonder drug from an obscure soil fungus. Less flask-happy chemists hope to use computer programs to design their reactions and, ultimately, to control the robot arm that shakes the test-tube for them.

German chemist Johann Gasteiger, together with colleagues at the Institute for Organic Chemistry at the University of Erlangen-Nürnberg, has spent fifteen years or so designing a neural network program that might be a first step on the way to hanging up the lab-coat.

Cannabinoid comes easier? Why spend weeks designing a synthesis?
His system uses the accrued information found in commercial databases containing hundreds of thousands of chemical reactions – each with its own reaction conditions: cooking time, pressure, catalysts, reagents and acidity, listed together with physical parameters about the molecules involved.

Today, a chemist might search such a database manually or use a search program to pick out reactions of interest. This, according to Gasteiger, can get embarrassing, “A single search can lead to a list of several hundred reactions from a database that can contain millions,” he explains, “so manual analysis is both laborious and time consuming.” One way to cut down on the effort involved is to classify the multitude of reactions.

Chemists have been classifying whole swathes of reactions for years by naming them after their inventors – the Wittig, the Beckmann, the Diels-Alder – but, posits Gasteiger, this system does not help very much in indicating to the chemist exactly what takes place in a particular reaction brew. This is especially true because there are literally dozens of variants in each class. He felt that the solution would be a neural network that could do the sorting for him. “There are two approaches to teaching a neural network chemistry”, explains Gasteiger, “supervised and unsupervised learning.” The former is labour intensive and involves presenting the network with input patterns for thousands of reactions and telling it which ones work in which circumstances. “We prefer the unsupervised approach,” says Gasteiger, “It cuts the workload considerably.”

How to teach a neural network chemistry? Gasteiger and his team have used a Kohonen network – a computer model of how our brains organise sensory information (sights, sounds and tactile feelings) in which inputs are mapped onto a two-dimensional network of neurones. By extending this mapping process to the properties of reactions in a database they could gain important information about many reactions at once.

Instead of sensory inputs for the network the researchers used each factor affecting a reaction – such as temperature and acidity – and these co-ordinates were fed into the neural network.

The team picked on a single broad class of reactions to test their networking ideas: reactions that involved adding a carbon-hydrogen group to an alkene. This type of reaction encompasses a variety of important schemes used to produce many industrially useful chemicals such as esters for artificial flavourings – so-called Michael additions, Friedel-Crafts alkylations by alkenes and free radical additions to alkenes.

They used a search program to narrow things down first – they obtained a set of 120 reactions from a 370 000 strong database. They then chose seven characteristic physical properties associated with the actual portion of the molecule that changes – the reaction centre – as the input for the neural network. For instance, the ability of the double bond between carbon atoms to attract electrons (its electronegativity), the total charge, and the degree of possible distortion of the electron cloud in the bond (its polarisability).

The network they used is a grid of 12×12 neurones, each with a “weight” associated with each of the seven chosen variables. When a reaction is input, the variables are mapped onto the neurone whose weights are most similar to the input. Reactions are input sequentially and after each entry the weights on each neurone are adjusted to make them more similar to the input variables. The adjustment is greatest for the winning neurone and its nearest neighbours and tails off with distance.

The next input, if it has similar variables, will be mapped onto a neurone close to the first, but if it is different it will land on a distant neurone, and the weights will be adjusted again. The result of these weight adjustments is that the network is trained to recognise patterns of parameters and to place a particular reaction accordingly. Eventually a 2D landscape of reactions is built up, with similar reactions close to each other forming groups of reaction types. Logically, reactions far apart in the landscape are very different. Isolated peaks in the landscape point to unusual and uncommon reactions.
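To make the mechanics concrete, here is a minimal Kohonen-style training loop in Python (with NumPy). It is only a sketch of the general scheme described above, not Gasteiger's actual software; the 12×12 grid and seven descriptors match the article, but the learning-rate and neighbourhood schedules and all variable names are illustrative assumptions.

```python
import numpy as np

# Illustrative Kohonen self-organising map: a 12x12 grid of neurones, each
# holding a weight vector over seven reaction descriptors (electronegativity,
# charge, polarisability of the reaction centre, and so on).
GRID, N_FEATURES = 12, 7
rng = np.random.default_rng(0)
weights = rng.random((GRID, GRID, N_FEATURES))

def best_matching_neurone(weights, x):
    """Return the (row, col) of the neurone whose weights most resemble x."""
    dist = np.linalg.norm(weights - x, axis=2)
    return np.unravel_index(np.argmin(dist), dist.shape)

def train(weights, reactions, epochs=100, lr0=0.5, radius0=GRID / 2):
    """Present reactions one at a time, nudging the winning neurone and its
    neighbours towards each input; the adjustment tails off with distance
    on the grid and shrinks as training proceeds."""
    rows, cols = np.indices((GRID, GRID))
    for t in range(epochs):
        lr = lr0 * np.exp(-t / epochs)          # learning rate decays
        radius = radius0 * np.exp(-t / epochs)  # neighbourhood shrinks
        for x in reactions:
            r, c = best_matching_neurone(weights, x)
            d2 = (rows - r) ** 2 + (cols - c) ** 2
            influence = np.exp(-d2 / (2 * radius ** 2))  # Gaussian neighbourhood
            weights += lr * influence[..., None] * (x - weights)
    return weights

# 120 reactions, each described by seven (scaled) physical parameters.
reactions = rng.random((120, N_FEATURES))
weights = train(weights, reactions)
```

One practical note: the seven descriptors need to be scaled to comparable ranges, otherwise the distance calculation is dominated by whichever parameter has the largest numerical spread.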

The most exciting aspect of the way Gasteiger’s neural network can classify reactions is not that it verifies the system already used by chemists every day, but that if they have a new compound they can look at the seven variables, feed them into the trained network and the network will assign the compound to a specific neurone. This allows the chemist to see the likely reaction a molecule will undergo in the lab. For instance, if a molecule finds itself at the centre of the area of the map covered by the so-called Michael addition then it is likely to undergo a standard Michael addition. If it is further afield it will probably undergo something more exotic.
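In code terms, classifying a new compound is then just a matter of finding its best-matching neurone on the trained map; continuing the hypothetical sketch above:

```python
# A hypothetical new compound, described by the same seven scaled parameters.
new_compound = rng.random(N_FEATURES)
row, col = best_matching_neurone(weights, new_compound)
print(f"New compound maps to neurone ({row}, {col})")
# If that neurone lies in the region of the map occupied by Michael additions,
# a standard Michael addition is the likely reaction in the flask.
```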

It took less than 20 seconds for Gasteiger’s team to train the network with their sample of 120 reactions on a Sun workstation, so training it on the full reaction database would take little more than a day or two, allowing some time for checking. Gasteiger points out that, once the neural network is trained, the computer time needed is very short (less than half a second), so making predictions about a particular molecule is very fast.

Classifying reactions is not the whole story though – once you know what type of reaction a molecule will undergo, the next step is to work out how it can be used to build up more complex molecules. Chemists usually picture a target molecule and cut it up into smaller jigsaw pieces that can then be re-assembled in the reaction flask. The difficulty lies not only in knowing where to make the breaks to simplify the reactions needed to put the puzzle back together, but in finding reactions that can make the lugs of each jigsaw piece fit together properly. This might be where Gasteiger’s neural network could help in predicting what would work.

Harvard chemist E J Corey pioneered this retrosynthetic approach, and his own program for automating the process, LHASA (Logic and Heuristics Applied to Synthetic Analysis), is marketed by LHASA UK, a company based at the University of Leeds. According to Nigel Greene of LHASA UK, “LHASA is a knowledge-based expert system not a reaction database.” It uses what he calls transforms to describe a generic chemical reaction class, e.g. the Michael addition. These transforms are compiled manually from a study of the chemical literature. The program then searches the query compound for the correct structural requirements in order to apply the transforms, which is tantamount to picturing the break-up of the jigsaw.

According to James Hendrickson of Brandeis University, “there are literally millions of different routes possible, from different starting materials, to any substance of interest.” He and his team have devised a program (SYNGEN), which can find the shortest route to any molecule from available starting materials. First, SYNGEN looks for the best way to dismantle the target jigsaw. Then, for each dissection it generates the reactive chemical groups needed to carry out that reaction sequence to build the product. Results are displayed onscreen. “In a number of cases to date, the computer has generated the current industrial routes to several pharmaceuticals, such as estrone,” explains Hendrickson. SYNGEN has also proposed more efficient routes to numerous compounds such as lysergic acid, the precursor to ergot drugs and LSD. A new version of the program is in development ready for licensing to pharmaceutical companies this year.
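The jigsaw metaphor behind LHASA-style transforms and SYNGEN’s dissection of a target can be illustrated with a toy recursive search in Python: cut the target into precursors using a table of transforms and recurse until everything is an available starting material. The molecules and transforms below are invented placeholders, not either program’s real knowledge base.

```python
# Toy retrosynthetic search: recursively cut a target into fragments until
# every fragment is an available starting material. Names are placeholders.
TRANSFORMS = {
    # target: list of possible precursor sets (one per known "transform")
    "ester":   [["acid", "alcohol"]],
    "acid":    [["nitrile"], ["aldehyde"]],
    "alcohol": [["ketone"]],
}
STARTING_MATERIALS = {"nitrile", "aldehyde", "ketone"}

def routes(target):
    """Yield lists of starting materials that could be assembled into target."""
    if target in STARTING_MATERIALS:
        yield [target]
        return
    for precursors in TRANSFORMS.get(target, []):
        partial = [[]]
        for p in precursors:
            partial = [r + sub for r in partial for sub in routes(p)]
        yield from partial

for route in routes("ester"):
    print(route)
# e.g. ['nitrile', 'ketone'] and ['aldehyde', 'ketone']
```

Real systems, of course, rank the routes they find by cost, yield and availability of the starting materials rather than simply enumerating them.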

William Jorgensen of Yale University in New Haven Connecticut is working on yet another program CAMEO (Computer Aided Mechanistic Evaluation of Organic reactions). The chemist feeds the starting materials – using a sketchpad – and the reaction conditions – via drop-down menus – into CAMEO, virtually speaking, and the program attempts to predict the course of the reaction. It assembles a reaction from underlying mechanistic steps because as Jorgensen points out a large fraction of organic reactions are just combinations of various fundamental steps.

Sometimes CAMEO (also marketed by LHASA UK) claims that no reaction product will emerge because a chemical rule would be broken if it did. The chemist can then run the reaction again virtually, in a different solvent or at a higher temperature, and watch the result, cutting testing time in the lab.

Individually, the various programs may not seem to offer the chemist much chance of boosting their leisure time, but together they may provide a way of classifying reaction types with a neural network, working out what type of reaction might be needed to yield a new molecule, feeding it into CAMEO to see whether reactions with other molecules could lead to it, and then using SYNGEN to optimise the route.

Some chemists are not worried about losing their jobs just yet though. Al Meyers of Colorado State University at Fort Collins says, “The delicate balance between reacting species, solvent concentrations, selective reaction behaviour and, most important, the human ability to observe what is happening, cannot be incorporated into a reaction software package.” Software will play its role though: “The synthesis programs can bring into focus the many options available to the seasoned chemist”, he adds.

We will have to wait and see who or what is shaking the reaction flasks in ten years time.

Touch Wood! A Guide to Viagra Louts

Elephant penis, photo by David Bradley
When it comes to keeping up appearances, even a bucketful of gooey oysters, half a dozen XXX videos and a smattering of sensual massage don’t always have the desired result. In desperation, even tiger penis tea and rhino horn – is it hung round the neck, or what? – sound more appealing than the best medicine had to offer those who are willy nilly: a drug to circumvent the problem injected straight into the penis through the front opening in one’s underwear. A romantic interlude before lovemaking I don’t think. More a case of ‘take away the pain Doctor, but leave the swelling’.

Then along came Viagra – which was just swell! A solution, or rather a pill, for that little bedroom engineering problem. Medical science had come good, providing a much-needed boost for flaccid men the world over. With medical approval in hand, pharmaceutical company Pfizer launched their product on to a desperate market shooting to number one almost overnight. Prozac? Who needs it when Viagra gets to the root?

The story began several years ago when scientists found that a tiny gas molecule called nitric oxide (NO) acts as a chemical communicator in our cells. Importantly for sexual health, the release of NO by cells in the penis activates an enzyme called guanylate cyclase. This crucial reaction makes another molecule, cyclic guanosine monophosphate (cGMP). cGMP relaxes penile smooth muscle, allowing the arteries to expand so that blood rushes in, giving an erection.

Horny old goat, photo by David Bradley

Normally, a second enzyme (PDE5, found mainly in the penis) gradually breaks down cGMP, so without continued stimulation an erection flops. Pop a Viagra, though, and this second enzyme is blocked. Even the most flaccid of penises will release some NO with erotic stimulation, and once it does there is no PDE5 available to damp it down again, so for two-thirds of men an erection will ensue – within an hour.

Viagra, aka sildenafil citrate, was originally to be a drug for treating angina and high blood pressure. When patients failed to return spare tablets after the clinical trial, though, Pfizer suspected something was up. It turned out that one side-effect was a spontaneous and sustained erection. Pfizer grasped the potential in treating what the doctor would call erectile dysfunction – ‘not getting it up’, to you and me. After the usual safety checks, they carried out a trial on more than 3000 patients aged 19 to 87 to see what effect the drug would have on impotent men and whether it could help them achieve a satisfactory sexual result. Rather than use video cameras to monitor ‘activity’, Pfizer opted for a questionnaire approach and trusted the patients to be honest.


The responses tallied with anecdotal evidence from the heart patients: Viagra gets it up in four out of five men regardless of the underlying problem, general health, race, or age. There are so many causes of impotence, from the psychological and old age to injury and prostate problems, that a drug aimed at curing any one problem might not have been so successful. Viagra, or to give it its proper chemical name 1-{[3-(6,7-dihydro-1-methyl-7-oxo-3-propyl-1H-pyrazolo[4,3-d]pyrimidin-5-yl)-4-ethoxyphenyl]sulfonyl}-4-methylpiperazine citrate, is satisfaction almost guaranteed, despite the name being more of a mouthful than the tiny sky-blue diamond tablets.

Viagra has been available in the USA on prescription since March 1998 and, according to George Dunea of the Cook County Hospital in Chicago writing in the British Medical Journal that year, has probably led to a lot of doctors there with writer’s cramp pumping out almost two million prescriptions! Shares in Pfizer quickly reached a high plateau and should be sustained with anticipated billion dollar sales for next year – sex truly does sell.

Everything in the bedroom is not rosy though – far from it. One bizarre side-effect experienced by some users is a strange blue-green hue to their vision. The colour blindness passes but aside from jokes about too many erections making you blind, some eye specialists are worried about long-term effects on sight. Viagra has also been blamed for headaches, hot flushes, rash, dizziness, diarrhoea, priapism (sustaining an erection for hours after orgasm) and Pfizer provide an even longer list in the accompanying notes.

There is also the tragedy of sixteen men who have died while allegedly using the drug. Eight others died during the clinical trials. There is concern that simply resurrecting one’s sex life with a drug could have been the cause, with hearts pumping blood to places it has not been for a while. Most victims were in their sixties and seventies, and most had heart problems or diabetes. This does not prove that Viagra has side-effects on the heart, although Pfizer’s clinical research did find that reduced blood pressure could occur.

Another problem has reared its ugly head too. It did not take long for people to realise that if Viagra can boost the ego, so to speak, of impotent men, then it could probably redouble the efforts of those without problems. Healthy men claiming impotency may be among the biggest beneficiaries of those millions of prescriptions, popping a Viagra in the hope of giving even an active sex life that extra rise.

Not surprisingly, an Internet blackmarket quickly emerged with genuine Viagra as well as dubious products called Veagra and Viagre being marketed. Viagra is fast becoming a recreational drug and the UK’s Medicines Control Agency has even set up The Special Enquiry Unit of ‘V-men’ to hunt down anyone illegally selling Viagra. Once Viagra is made available on the NHS later this year, the government has said that individual GPs will be responsible for making sure only needy patients are prescribed it and they will be responsible for the outcome. There have already been more than a dozen attempts to market Viagra illicitly in Britain.

With abuse of drugs come increased risks, especially for particular users. Certain social groups have known for many years that a group of compounds known as organic nitrates (amyl nitrate/nitrite, or poppers, are probably the best known) can speed your pulse, boost libido and allegedly make sex a more fast-paced and thrilling experience. There are also numerous nitrate-based prescription drugs, including some based on nitroglycerine used to treat angina, and these too lower blood pressure. The problem is that anyone taking nitrates with Viagra could suffer plummeting blood pressure and possibly death. There have been reports of patients on nitroglycerine who have experienced blackouts, and doctors are warned to check for other medications and drugs before prescribing Viagra.

But what about the ladies? Pfizer is currently carrying out trials in Europe and discussions are taking place in the US to see whether women with sexual dysfunction, reduced libido, or lubrication problems (which can start at menopause) might benefit from taking the drug. There is now evidence that male erectile dysfunction might be more closely related to similar problems in women where blood flow to the genitals is just as important in sex. What’s sauce for the goose…after all.

There could soon be an alternative to Viagra. Vasomax (phentolamine) made by Zonagen is still passing through the approval pipeline. It apparently has the distinct advantage over Viagra in having only a 20 minute pre-love period. It’s also safer for patients taking nitrates. Pfizer, of course, intend to pre-empt approval of Vasomax and are working on a wafer form of Viagra that would be absorbed through the tongue and so act much more quickly.

The Public Health Minister Tessa Jowell has announced that Viagra will be available by prescription this year, but declined to say how the prescriptions would be limited and exactly how patients would be assessed for need.

Anyone thinking about Viagra must weigh up the risks. The best bet might be to give the soft lights, romantic music and oysters another try. If you do opt for Viagra, just pray you don’t discover you suffer from premature ejaculation…

Article by David Bradley, science writer. It originally appeared in BBC Tomorrow’s World magazine.

Footnote added subsequently: But what about the question of dissolving it in water? Sildenafil citrate is orally available, so presumably it dissolves in water, but I doubt the manufacturers will be creating a fizzy version any day soon.

Since this article was first published, several other drugs have become available – Cialis, Levitra, Uprima (which goes under the tongue, sublingually as it were) – along with new delivery methods for Alprostadil.

Interview with Eric Scerri

This “Personal Reactions” interview with Eric Scerri originally appeared in my column in The Alchemist webzine, 1998-04-03.

Biography:
Professor Eric Scerri, born 30th August 1953, Malta. Nominated for the Dexter Award in the History of Chemistry. Interested in the philosophy of chemistry, especially philosophical aspects of the periodic system and of quantum chemistry.


Position:

Assistant Professor, Bradley University, Illinois.

Major life events:
Gaining a PhD in History and Philosophy of Science at King’s College, London on the Relationship of Chemistry to Quantum Mechanics. Being invited to the home of philosopher of science Sir Karl Popper for a discussion on quantum mechanics, chemistry, philosophy, life and the universe. Going to the US as a postdoctoral fellow in History and Philosophy of Science at Caltech. Becoming editor of Foundations of Chemistry.


How did you get your current job?
Job advert in Chemical and Engineering News.

What do you enjoy about your work?
Lecturing to students and generally interacting with people. Being paid to do what I enjoy the most, chemistry.

What do you hate about your industry?
The presence of large numbers of people who do no research, do not keep up with recent developments and pontificate endlessly about how “professional” they are.

What was your first experiment?
My first experiment while teaching was the fountain experiment.

Did it work?
No it did not. As anyone who has tried it will tell you, it’s tricky. I made sure I got it to work the second time.

What was your chemistry teacher at school like?
Excellent, warm and inspiring. Both women: Mrs Davis and Mrs Walden at Walpole Grammar, Ealing, London. The school has now been demolished to make space for a housing estate.

Meeting Popper must have been a formative experience?
You bet! First, he got very angry with me because I had sent him an article in which I was criticising his views on the discovery of hafnium. According to him and many others, Bohr predicted that hafnium should be a transition metal and not a rare earth, and that led directly to the discovery of hafnium by Coster and von Hevesy. The full story is far more complicated, as I and others have emphasised.

Popper in fact accepted my specific criticisms on the hafnium case. I think his initial anger was a sort of knee-jerk reaction, which he had to all critics. After about five minutes, he became a perfectly charming host and answered all my questions and made me feel like an equal even in purely philosophical matters.

What is your greatest strength?
Presentation of ideas in lectures. Being able to criticise arguments.

Weakness?
Sometimes over-critical.

What advice would you give a younger scientist?
Concentrate on mastering mathematical techniques. If the student ever wants to go into theory she will have to be a master of mathematical techniques. Chemical theory is very, very interesting.

What would you rather be if not a scientist?
A jazz and blues musician.

In whose band?
In my own band! I have been playing since I was 16 or so.

Which scientist from history would you like to meet?
Linus Pauling

What would you ask him?
About the genesis of quantum chemistry and about the people he came into contact with during his postdoctoral stay in Germany. I think he had the deepest respect for them but was personally more interested in applications to chemistry than reaching a deep understanding of quantum physics. His own approach may have appeared a little too cavalier to the European purists. By his own admission Pauling was working with Bohr’s old quantum theory when he first went to Europe only to be informed by Wolfgang Pauli that more sophisticated versions of quantum mechanics had been developed. Pauling immediately made the switch.

How has the Internet influenced what you do?
Enormously. First of all on a practical level I can find addresses, e-mails, phone numbers of anyone I care to with a little bit of searching. If I read an interesting article I can track down the author and ask them a question a few moments after first reading their ideas.

I should also point out that the Internet brings problems. A student recently wrote a paper for me on the history of the periodic table. He referred exclusively to material on the Internet. Most of the paper was filled with inaccuracies, complete mistakes etc. It was not the student’s fault. The problem is that anyone can set up a beautifully illustrated web page without bothering about the academic content and cast it out on to the Web for unsuspecting students to find. There is of course no [peer] review process for what goes on to the Web.

Wasn’t the student a bit naive to assume total credibility of unqualified sources?
Okay, you are right. He was not a brilliant student and he was lazy. Let’s just say it is tempting for students to sit in their own rooms and surf the Web instead of getting their butts into the library.

Why do you think the public fears science?
Lack of knowledge of course and the hard-edged and clinical image portrayed by many scientists.

What are the ultimate goals for chemists?
I am a philosopher of chemistry and chemical educator. I cannot really answer this question which seems to be directed towards “real chemists”. But do you really mean “ultimate goals”? If I were a theoretical chemist I would say to be able to calculate everything from first principles so that we would never need to do experiments and could pack up and go home. If I were a real chemist reaching such “ultimate goals” would not be much fun.

What will chemistry do in the next ten years?
Nor am I a fortune-teller.

You could speculate though…
Well, I really think computational chemistry and modelling will go on expanding as quickly as do developments in the computer industry. Chemists are going to have to get used to the idea that more and more “experiments” will be done on the computer. This should not imply however that quantum chemistry could explain everything in chemistry – that chemistry has been reduced. Far from it. It just means that computational chemistry can be used as a useful tool along with the various spectroscopic techniques, which have already revolutionised chemistry.

What invention would you like to wipe from history?
Chemical weaponry

Size matters – quetta ronna yotta…atto zepto yocto ronto quecto

TL;DR – The SI prefixes for units in multiples of three orders of magnitude:

quetta (10³⁰), ronna (10²⁷), yotta (10²⁴), zetta (10²¹), exa (10¹⁸), peta (10¹⁵), tera (10¹²), giga (10⁹), mega (10⁶), kilo (10³)

milli (10⁻³), micro (10⁻⁶), nano (10⁻⁹), pico (10⁻¹²), femto (10⁻¹⁵), atto (10⁻¹⁸), zepto (10⁻²¹), yocto (10⁻²⁴), ronto (10⁻²⁷), quecto (10⁻³⁰)


UPDATE: 2022 – It’s time for an update to my 1997 article about SI prefixes as the Système International has added a couple more to its list. From 1991 until this year, the prefix with the mostest was “yotta”. It represented 10²⁴.

SI now has two above that: “ronna” meaning 10²⁷ and “quetta” meaning 10³⁰. At the other extreme, beyond “yocto”, meaning 10⁻²⁴, we now have “ronto” (10⁻²⁷) and “quecto” (10⁻³⁰).

British metrologist Richard J. C. Brown proposed the two new macro prefixes ronna and quetta to stay ahead of the needs of big data science and to preclude the use of unofficial prefixes. The micro prefixes ronto and quecto were enlisted for the sake of symmetry.

How low can you go? Come to that, how high can you go?

Small graphic showing a mock up of an eye test chart of the letters representing the size prefixes discussed in the article

quetta ronna yotta zetta exa peta tera giga mega kilo….milli micro nano pico femto atto zepto yocto ronto quecto

Q R Y Z E P T G M k h da d c m µ n p f a z y r q

Many readers will probably be familiar with measuring millilitres of solution, more than likely conversant in micromolar concentrations, well aware of picoamps and certainly not averse to a discussion on nanometre dimensions. Probe the majority about zepto and yocto and you may draw a blank. For the analytical chemist though, zepto and yocto represent the smallest of the small (at the moment). Is there any point to the infinitesimal? asks David Bradley.

Zepto and yocto, I should explain, are the rather bizarre extension of the sliding scale that drops by thousandths from milli to atto and beyond. Zepto is 10⁻²¹ (symbol z) and yocto 10⁻²⁴ (symbol y), and on a molar scale they represent quantities of about 600 to 600 000 molecules (zeptomoles) and from about 1 to 600 molecules (yoctomoles). These terrifyingly small amounts characterise actual detection limits for certain analytical methodology and, while they demonstrate the technical prowess of the scientists and the power of the instrumentation that has attained them in the last decade or so, they might seem to be taking things a bit too seriously. After all, what possible relevant effect could such minuscule concentrations of any particular analyte have?
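For anyone who wants to check those figures, converting an amount in moles into a number of molecules is simply a multiplication by Avogadro’s number; a quick back-of-the-envelope calculation in Python:

```python
AVOGADRO = 6.022e23  # molecules per mole

for label, moles in [("1 zeptomole", 1e-21), ("999 zeptomoles", 999e-21),
                     ("1 yoctomole", 1e-24), ("999 yoctomoles", 999e-24)]:
    print(f"{label}: about {moles * AVOGADRO:,.0f} molecules")

# 1 zeptomole is roughly 600 molecules and 999 zeptomoles roughly 600,000;
# 1 yoctomole is less than a single molecule (about 0.6) and 999 yoctomoles
# roughly 600, matching the ranges quoted above.
```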

10ⁿ Prefix Symbol
10³⁰ quetta Q
10²⁷ ronna R
10²⁴ yotta Y
10²¹ zetta Z
10¹⁸ exa E
10¹⁵ peta P
10¹² tera T
10⁹ giga G
10⁶ mega M
10³ kilo k
10² hecto h
10¹ deca, deka da
10⁰ (none) (none)
10⁻¹ deci d
10⁻² centi c
10⁻³ milli m
10⁻⁶ micro µ
10⁻⁹ nano n
10⁻¹² pico p
10⁻¹⁵ femto f
10⁻¹⁸ atto a
10⁻²¹ zepto z
10⁻²⁴ yocto y
10⁻²⁷ ronto r
10⁻³⁰ quecto q
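The same table can be expressed as a small lookup for converting a prefixed value into base units; the dictionary and function below are purely illustrative:

```python
# SI prefixes and their powers of ten (2022 list, including ronna and quetta).
SI_PREFIXES = {
    "quetta": 30, "ronna": 27, "yotta": 24, "zetta": 21, "exa": 18,
    "peta": 15, "tera": 12, "giga": 9, "mega": 6, "kilo": 3,
    "hecto": 2, "deca": 1, "deci": -1, "centi": -2, "milli": -3,
    "micro": -6, "nano": -9, "pico": -12, "femto": -15, "atto": -18,
    "zepto": -21, "yocto": -24, "ronto": -27, "quecto": -30,
}

def to_base_units(value, prefix):
    """Convert, for example, 5 zeptomoles into 5e-21 moles."""
    return value * 10 ** SI_PREFIXES[prefix]

print(to_base_units(5, "zepto"))     # 5e-21
print(to_base_units(2.5, "quetta"))  # 2.5e+30
```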

According to Steve Morton of Unicam Atomic Absorption, a Cambridge-based instrument manufacturer, analytical scientists are interested in two different types of detection limit, which make the ability to observe yocto and zepto amounts of material crucial in every arena in which the analytical chemist works, from pollution detection to watching biochemical reactions in single nerve cells and detecting enzymes and traces of drugs.

First, there is the instrumental detection limit (IDL). “This,” he explains, “is a measure of the detecting power of the instrument.” It is perhaps the most important selling point of a machine. It can be thought of more specifically as the machine’s ability to distinguish between a sample with a nominal concentration of zero units of the analyte one is interested in – a so-called blank – and the sample being analysed. (For anyone particularly interested in the details of what is meant by a blank, there are no doubt several heavy statistical analysis books available.) Put simply, however, a blank is ideally a sample that is physically and chemically identical to the sample of interest but does not contain the analyte of interest.

Adam McMahon of Manchester Metropolitan University adds that there are various definitions of detection limits. “All,” he says, “should be based on the principle that the smallest quantity or concentration detectable is that which can be shown to give a significantly larger analytical response than a blank.”

Morton points out that the limitations of an analysis are not necessarily those of the machine but of the quality of this blank. “Good blanks are increasingly difficult to obtain as the detecting power of the instruments improves,” he says. For example, calcium is detected very sensitively and is a very common element. To measure it by a technique such as graphite furnace atomic absorption spectroscopy, the detection limit is determined by the quality of the blank sample rather than, as one might intuitively expect, the instrument.

The second type of detection limit is determined more by the operator and the equipment: the method detection limit (MDL). This is a measure of the performance of the method and, says Morton, includes all the sample preparation and dilution steps as well as the actual measurement – it is what he refers to as a ‘real world’ limit. It represents the minimum concentration in the sample that can be distinguished from ‘none’. Again, there are numerous statistics books that will lead the interested reader through the ins and outs of standard deviations and what they mean in terms of this ‘none’.
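One common way to turn the “significantly larger than the blank” principle into a number is the 3σ convention: take replicate measurements of the blank and call the detection limit three standard deviations above its mean signal, converted to concentration via the calibration slope. The sketch below illustrates that convention only; the figures are invented and this is not Unicam’s or any particular instrument’s procedure.

```python
import statistics

# Invented replicate signals from a blank (instrument counts) and the
# calibration sensitivity (counts per unit concentration).
blank_signals = [10.2, 9.8, 10.5, 10.1, 9.9, 10.4, 10.0]
sensitivity = 250.0  # counts per (ng/mL), from a calibration curve

mean_blank = statistics.mean(blank_signals)
sd_blank = statistics.stdev(blank_signals)

# 3-sigma convention: smallest signal reliably distinguishable from the blank.
signal_at_lod = mean_blank + 3 * sd_blank
detection_limit = 3 * sd_blank / sensitivity  # in ng/mL

print(f"Blank mean = {mean_blank:.2f} counts, sd = {sd_blank:.2f} counts")
print(f"Detection limit is roughly {detection_limit:.4f} ng/mL")
```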

While miniaturisation is very familiar to electronics engineers, it is only really in the last few years that analytical scientists have begun to apply lithographic and other techniques to creating micromachined devices for separation and analysis. The likes of Andreas Manz at the Zeneca/SmithKline Beecham Centre for Analytical Science at Imperial College London have expended a great deal of research time on reducing the volume of analytical devices. As I mentioned in my ChemWeb round-up of 1997, they can create microchannels in glass chips which work as high-performance liquid chromatography (HPLC) machines with the equivalent power of a million theoretical plates. One of the standard analytical tools, a gas chromatograph, can take anything up to half an hour to achieve the equivalent of 100 000 theoretical plates. Manz and others working on similar devices can cut the time considerably, or produce far better separations on minute volumes.

Nanotechnology is not only about tiny machines for chopping up arterial plaques and assembling miniature steam engines. Instead, it has the power to create yet more powerful analytical devices that can measure ever smaller quantities. There is a world of difference between measuring the glucose in a single heart cell and the sugar in a soft drink!

According to Bill Marton, an independent consultant and former manager of the analytical development laboratory at one of the largest pharmaceutical companies, “One place that nanotechnology and measuring single molecules will be important is in neurochemistry.”

When such small volumes are involved in an analysis the signals received are concomitantly small. Labelling an analyte with a glowing tag or an enzyme can boost the signal and cover up the noise without the need for soundproofing. For those chemists working in genomics another approach has been analyte amplification: if you don’t have enough of the stuff, just make more! The Nobel-winning PCR (polymerase chain reaction) boosts the tiny amounts of nucleic acids being studied to the point at which they can be sequenced, or at the least mapped.

According to William Heineman of the University of Cincinnati, “The key to achieving zepto- and yocto-mole detection limits is to combine two or more strategies.” He says that the likes of capillary electrophoresis, a separation technology that can work on the tiniest of scales but with big molecules, such as proteins, can work with laser techniques, like laser-induced fluorescence to push the limits down.

To achieve the ultimate detection limits, high sensitivity must be combined with selectivity and minimization of blank signals. High sensitivity can be achieved by an inherent gain mechanism that is most frequently provided by an electron multiplier or photomultiplier. Selectivity excludes signals from interfering species and can be achieved by a separation method, such as gas chromatography, liquid chromatography and electrophoresis, by the selectivity of enzyme or antigen reactions, or by spectroscopic selectivity particularly in atomic spectroscopy.

Contamination of the sample with the analyte (or loss of analyte from the sample) must be minimised by careful sample handling, use of clean reagents and even the use of clean-room facilities. McMahon considers such areas as being the way forward for the ultimate detection limit: “These three themes,” he posits, “indicate the directions for future research effort.”

There are a few spare Latin prefixes left for the ambitious analyst looking to go lower than zepto and yocto although a sample containing less than one molecule of analyte is perhaps going a bit far.

Footnote
For the etymologists among you: zepto is an adaptation of the Latin for seven, with the s changed for a z to avoid confusion with other esses and the “a” swapped for an “o” to make it consistent with nano, pico etc. Yocto has similar roots, in octo (eight), but with the y added to clarify the abbreviation.

Further reading
M A Cousino, T B Jarbawi, H B Halsall and W R Heineman, Anal Chem, 1997, 69, 544A-549A.

Hawthorn for Health

Farmers who tear up hedgerows may be destroying a source of new medicines for treating heart disease.

According to Ann Walker of Reading University’s Department of Food Science and Technology, a number of plant extracts, including that from the hedgerow favourite, the hawthorn (Crataegus laevigata), could have positive medicinal effects. Walker and her team are studying the effects of non-nutrient phytochemicals found in hawthorn berries on volunteers with mildly raised blood pressure. She says that antioxidant nutrients and phytochemicals can play a key role in maintaining a healthy cardiovascular system.

A recent study carried out in the Netherlands showed that the flavonoids found in apples, black tea and onions reduce the risk of heart disease in elderly men. “Compounds, including flavonoids, from hawthorn have a dilatory effect on large and small arteries” says Walker, “causing an increase in blood volume, which reduces pressure”.

Almost all of the studies on hawthorn have been carried out in Germany on heart failure cases, and this has given hawthorn a continental reputation as a herb with a powerful action on the heart. However, the constituents of hawthorn are similar to those found in foods, so its physiological action is mild. Indeed, practitioners of herbal medicine regard hawthorn as a tonic herb – it also has some bizarre pagan roots – that is suitable for long-term treatment of even the mildest hypertension.

Walker reckons that, compared with new synthetic drugs, a shorter R&D time is required to produce proven herbal extracts that work. “Herbal medicines, as well as having a traditional history of use going back over the centuries, are used by practitioners of phytotherapy on a daily basis. Hence clinical trials can be started straight away.”

Synchrobite: The Hawthorne effect is the psychological principle that any group that is singled out for study or consideration will perform better for knowing that it has been so selected.

This item appeared first in Issue 3 of Elemental Discoveries in February 1997, see our announcement in the CHEMED-L archives.

Shining, Unhappy Plants

It is the dead of night, one summer just after the turn of the next century. Despite the darkness, a Midwestern farmer is surveying his acres of crops. From several clumps of plants scattered randomly throughout his fields there emanates an eerie blue glow. The farmer worries: The plants are obviously under stress.

If scientists in the United Kingdom are right, this scene might be played out all over the world. Glowing blue plants may someday provide an early-warning system that will alert farmers to infection and herbivore attack in time for defensive action.

At the Institute of Cell and Molecular Biology at the University of Edinburgh, a team led by plant biochemist Anthony Trewavas has been developing a genetic-engineering program to meet this goal. They are working with a protein that causes certain marine creatures, such as the jellyfish Aequorea victoria, to give off light when they are attacked by predators. In response to touch, jellyfish cells fill rapidly with calcium ions, which act as a cellular alarm signal during the organism’s response to stress. The calcium ions bind to various molecules, including the protein aequorin. In binding calcium, aequorin gains energy, which it dissipates by giving off photons. In other words, it glows.

Plant cells also have an electrical response to stresses such as infection, touch and cold shock. Calcium ions pour in, again playing a signaling role in mobilizing the organism’s defenses. Trewavas and his team wanted to effectively amplify the calcium signal so that the farmer could lend a helping hand to a stressed plant. He reported the team’s latest results at the annual Science Festival of the British Association for the Advancement of Science in Newcastle-upon-Tyne in September.

A motivation for the research is the widespread use of blanket spraying of pesticides. Farmers practice blanket spraying in anticipation of infection or infestation because they would lose crops if they waited for visible signs of attack on leaf surfaces – if you wait, it is often too late to rescue the harvest. Farmers equipped with an early-warning system might be able to spray in time to prevent losses, and to spray only areas affected.

In the early stages of their work, the Edinburgh team transferred the genes that code for the fluorescent calcium-binding protein aequorin from the jellyfish into tobacco plants and mosses. They succeeded in their first goal: When wounded or infected or otherwise stressed, test plants responded quickly by giving off a very faint blue glow, detectable by ultrasensitive camera equipment.

“At the moment,” says Trewavas, “the light is not visible to the naked eye, but that is because this is a jellyfish gene, not a plant gene.” The jellyfish gene includes a number of DNA sections (codons) that plants use rarely, if ever, and this difference in how the genetic information is arranged limits plants’ ability to “read” the gene. “That means we need to resynthesize the gene to optimize it for plants,” he said.

The team hopes to increase expression of the protein, using appropriate promoters, so that the glow is visible in darkness. The choice of promoters could also make the signal more specific, so that, for instance, it would indicate a response to infection rather than to cold shock. Even if one seed in a thousand produced a plant capable of glowing, the warning would be more effective than that achieved in experiments using microinjected fluorescent dyes. Dyes that respond to accelerated calcium flow have been used to monitor plant stress, but these techniques are limited to single or small groups of cells.

Trewavas is optimistic that his technology will be available to farmers by 2000. “If the jellyfish can do it,” he says, “then so can we.” Neal Stewart, Jr., assistant professor of biology at the University of North Carolina at Chapel Hill, shares Trewavas’s bullish outlook and is beginning his own research. “I think that perhaps the year of commercialization may be optimistic – maybe not – but new and improved fluorescent proteins should be on line soon.”

The reference for my original article on this topic is American Scientist, Volume 84, Issue 1, p.25-26

Photo flattery and copyright

I am always rather flattered when somebody asks to use my photos in their publication or on their social media. I had a request from Mark McG, one of the organisers of Strawberry Fair, to use some of my photos from the famous one-day festival in the Cambridge Edition magazine, and I was very happy for him to do that, with credit. All for a good cause.

The flipside is when you see a photo you took at another event appear on an organisation’s Facebook page where they didn’t ask permission, didn’t give credit, and, worse still, cropped off the logo. Frustrating, irritating, annoying. It doesn’t really matter, you might think, but hey…credit where credit’s due, right?

Aside from it being a simple courtesy to credit the photographer, the online or print use without permission may have scuppered the photographer’s chances of selling the photo to another outlet or even just entering it into a photography competition.

So, here are the rules; they apply to all creative output really, whether words, pictures, or music:

  1. All of my photos are my copyright. Nobody needs to assert that; it is a given in law, unless otherwise stated.
  2. If you wish to use any of my photos, regardless of where I have already posted them myself, I expect a permission request – email me.
  3. Wherever you use any of my images, I expect a credit – Photo by Dave Bradley, https://sciencebase.com/photos (you can request not to include the web address, but I’d prefer you to use it and to make the whole credit a dofollow link)
  4. If you wish to crop the photo, please ask, especially if you plan to crop the logo from the image
  5. If your site/social media is a commercial concern, I expect payment. An invoice will be forthcoming based on prominence and value; if you asked permission, we can discuss the actual fee. However, if there was no permission, the standard fee will be £250 per photo for any site/social media with fewer than 250k readers/subscribers. Above that, we need to talk.
  6. A bonus tip for editors and social media managers: If you are supplied with a photo of unknown provenance, make sure you find the source and have clearance from the copyright holder before using it.

This post was actually written on 8th May 2019, but I’ve put it back at the beginning of the blog in 1996…

Science and Stuff

 


I posted my first web page in December 1995, it was the online version of my first chemistry news roundup, a column I dubbed Elemental Discoveries, the RSC younger chemist magazine formerly known as Gas Jar, which I’d renamed along with Dr Mac as “New Elements”. That column persisted on various free servers until I got patronage from a well-known chemistry software company who began hosting it thereafter until I registered the domain name Sciencebase.com in July 1999. This post is just holding page to give visitors a bit of history. Thank you very much. Come again.