The open heart of science.

Posted 13/03/2012 by adamqshaw
Categories: Uncategorized

Most people are familiar with the basic elements of the scientific method: observe a phenomenon, form a possible explanation, make predictions following from that hypothesis and design experiment after experiment to test those predictions until you have reasonable evidence to support (or disprove) your idea. This process represents humanity's best attempt to rid ourselves of the baggage of our egos; our biases, our experiences, our preconceived intuition of how things should be, and to get at the truth of how things really are. Though not everyone may acknowledge or accept this (post-modernists, for example).

However, people may be less familiar with the next steps in the chain: peer review and publication. This is, in principle at least, more than a hoop to jump through or something to pad a scientist's CV. Openness is at the heart of science, and it has to be that way, for several reasons.

One is that bypassing our own assumptions and intuitions is too large a task to be left to one person, or one group of people. The person whose hypothesis it is, who did the experiment, is too close to the matter. Not only because of the rather common desire to be right (though there are many stories in science of people happy to be proven wrong), but also because if you are immersed in the nitty-gritty details of the work you may not be able to see the whopping great flaw in your logic, methods, assumptions etc. It may take someone looking at the situation from a completely different field to see the figurative pink elephant in the vacuum chamber.

Another reason, one that (in my opinion) is often overlooked amid the academic pressure to publish new results, is that people need all the details of your methods, results and conclusions if they are to repeat your experiment. It's not enough for you to repeat your experiment 3 million times if there is something wrong with the equipment: a dodgy fibre-optic cable, perhaps? Someone else has to do the same (or a similar) thing in a different place and get the same results for people to be really convinced that you are truly examining something about the universe and not the effects of stray fields from the microwave oven. This is also the way that scientists police themselves and catch frauds, as in the infamous case of Jan Hendrik Schön.

The key point is that a truly open space for ideas to be challenged and attacked is necessary to build a respectable consensus. Nothing can ever be completely proven beyond any doubt. Eternal Truth, with a capital T, doesn't exist, and to do science one must accept that all knowledge is provisional. Any theory, no matter how successful, can be overturned if enough evidence and experiments present themselves. Of course, that raises the question: who decides what is enough evidence? The short answer is no one. If you want your idea to be taken as a serious explanation of natural phenomena, you need to convince people of it. This is not a democracy; you can't buy, win or charm votes. It's a meritocracy, and the only thing that counts is the strength of your evidence (this is what disqualifies creationists, climate denialists, homeopaths and other bullshit peddlers from the game).

You have to throw your work to the wolves and let them tear at it. If your work stays standing, if the work of other scientists stands in support of it, then you may convince enough people to form a "consensus". This is what draws the line at the current moment, saying "This is the way it is (we think)."

Now, this is probably all very idealistic and is definitely very idealised. I've heard enough horror stories from colleagues of slow editors, nonsensical replies from referees and a preference for buzzword articles. I am well aware that the current academic publishing system is not perfect. It suffers from numerous flaws, such as publication bias, and often seems to drive scientists more than the science itself. However, I wrote what I did above to declare, and to reaffirm to myself, just how important openness is to science and the key role peer-reviewed journals play in that.

Think of it a bit like how, after watching the news and seeing the morons who get elected to government, you might need to think about how democracy should work to motivate your arse to go out and vote.

The reason for this affirmation, and for this post, is a couple of interesting issues pertaining to openness and the current system of peer review.

The first concerns the production of mutant strains of avian flu. Those words are strong enough to make some people uncomfortable already. The science involved and what can be discovered from these strains is bound to be amazing and of very high impact, though I'm not a biologist so cannot comment too much. The question is: should such a study be published in full, with all the details of how these strains were created? Remember, part of the reason these details are revealed in most papers is so the results can be replicated. However, the costs of a functioning microbiology lab aren't what they used to be, and there are a lot of angry/deluded people out there with very malleable consciences.

The details of this issue can be read here at Nature.

This raises the question: how open is too open? I sincerely believe that no area of science should ever be put off-limits. It is a necessity that people explore every aspect of life, the universe and everything. Even if we may never get any answers, even if we strongly suspect we never will, we can never let ourselves give in to that suspicion. Just as we will never know everything, we can never truly know what's "unknowable," and the moment we think we do, science stops! If, for example, we could never understand or explain what happened in the first nanosecond after the Big Bang, we would still have to keep trying. If we stopped, shrugged and said it was unknowable, we might not find out what happened in the second nanosecond or the third. Each piece of knowledge is always worth the effort to find it, and if we box it off as impossible to know, that effort will grind to a stop. This, incidentally, is a large part of why religious "answers" are so ultimately unsatisfying.

However, this does not mean that every result needs to be shared. Most (sane) people would accept the need to keep certain things secret, in military circles for example. No one expected people on the Manhattan Project to be publishing their results. So what happens if everyday academics "stumble" across military-level science, or at least potentially dangerous stuff? If I strain and ponder a bit, I could probably think of a few other things that humanity might be better off not knowing, if the data existed (well, maybe just the one). Should there be limits on this academic openness?

To be honest, I don't have a bloody clue. I didn't know when the issue of this research came up a few months ago and my wife asked my opinion. I can see that there would be good reasons to withhold some data or some results because of potentially dangerous implications. You could also probably guess my reluctance, based on how important I feel the spirit of openness is to the scientific process. My cynical nature also makes me suspicious about what criteria would be used to judge what was "too dangerous" and who would do the judging. As I said, the scientific community needs open discourse, in part, to try to overcome the short-sighted egos of individuals and small groups. It seems counter-productive to this effort to hand censorship power to whichever group is set up for the task. Also, just as we can't box away the "unknowable" without risking the loss of knowledge, can we box away the dangerous without risking the loss of research findings that could be of great benefit to society?

It’s above my pay grade to know such things, but I guess it’s probably a good thing that the WHO recommended that the paper be published in full.

It is interesting to note that the scientists doing the work did decide to halt their endeavours to allow debate concerning the potential risks and benefits of this line of research.

The second line of thought is linked to a growing debate about the business of peer-reviewed publications. The full story can be read in this APS news piece. The short version is: there are two bills duelling in the US House of Representatives, one seeking to strike down, and one seeking to expand, current laws stating that research articles stemming from public funding via the NIH must be made publicly available on PubMed within a year. This is rubbing some of the journal publishers the wrong way, as they make their money by charging institutions for access to the articles. This political debate is stirring the growing sense of disgruntlement within the university community concerning the role of private businesses in academic publishing.

The details are well summed up in the APS article I linked to. For my part, although I admire the desire to make scientific results available to the public, I do wonder if full access to the article is necessary or even preferable. If we only consider the public angle, i.e. the responsibility to let taxpayers know what is being done with their money (what, why and what has been found), perhaps lay summaries would be a more suitable approach. The simple fact is that scientific literature requires a lot of specialist knowledge, familiarity with technical jargon, statistical measures of significance etc. More often than not, I don't feel qualified to understand papers in my own field. If scientists were to write summaries or abstracts of their work specially aimed at a lay audience, detailing what they found, how they found it, possible areas of future research and (perhaps most importantly) why it matters, this might be more effective at informing the public than simply putting the paper online with no further comment. The Cochrane Collaboration, stalwart defenders against quackery, seems to be leading the way in such summaries. Another advantage of such a policy would be forcing scientists to write with the aim of engaging the public in their work, something far too many in all fields seem to neglect, if not openly disdain. One would also hope it would stop public figures sneering at things like "fruit-fly research" without a clue of how valuable such work is.

However, there is something very wrong with the current publishing status quo. Currently, universities not only have to pay large sums in subscription fees to access scientific journals, but academics must also pay to have their work published in those journals. The authors cede the copyright of their work, data and images to the publishers. Also, the scientists who serve as referees in the peer-review process for these journals receive no payment for this effort; it is entirely voluntary.

A good run-down of the problem was written by George Monbiot in the Guardian.

Though I know that the staff of such publishers put in a lot of hard work, I can't help but feel the current arrangement reads like something straight out of Marx, with one group responsible for a large part of the means of production and another reaping the profit. The publishers themselves also seem to resort to the tried-and-true business defence of "We agree, but we don't like the government forcing us to do things or getting involved with our business." Please! Where do they think the money for the subscriptions comes from? The same public funds that are the source of research budgets and other university costs. At the very least, the current system seems to be a giant waste of money that could be better spent on hiring scientists, buying equipment or improving teaching.

It would also help if publishers stopped bringing out all those extra titles, which allow them to charge more for their various bundles (with access to different journals grouped like access to cable channels). Surely, if something was not suitable to be published in Nature or Nature Physics etc., someone else should get to publish it rather than it being bumped down the road to Nature Comms.

Can't say I know the answer. That's well beyond me. I'm not sure a world archive of peer-reviewed journals is practical in a non-Utopian world. Any global collaboration seems to get quickly bogged down by politics and division, and the integrity of the scientific method and peer review needs to be kept clear of such things (though that itself may only be possible in Wonderland). Also, quality products need money, so charging for something in the process is not unreasonable. However, do universities have to pay quite so much? Or to pay both to be published and to read? Must academics also sacrifice the copyright on their work and data to the publishers too?

Tricky problem. Something about this needs to change, but I’m glad I’m not the one who has to solve it.

The Beauty of Reality…and Miley Cyrus?

Posted 04/03/2012 by adamqshaw
Categories: Reason

When I was an undergraduate one of my lecturers had a phrase. It would come after he had introduced us to some amazing fact that defied our everyday “middle-world” expectation or displayed the power of science. The wave-like interference of particles, for example, or how measurements of the cosmic-microwave background match the theory so elegantly (as emphasized by the always excellent xkcd).

After “dropping the knowledge” he would turn from the board and look at us; as if speaking to us all individually, but at the same time, and say:

“If you don’t find that amazing, there’s something wrong with you.”

You could claim he was just preaching to the choir; he was, after all, giving a physics lecture to physics students. But I think he was talking about something deeper. What if you knew someone who, after seeing the sun sink over the ocean from a mountain-top, only said, "Meh!"? Or someone who bypassed the greatest works of art and music, and instead turned to wallpaper samples and the Bay City Rollers? In short, there is a beauty and a power in reality, as revealed by science, that is equal in every way to the most admired masterpieces of paint, sculpture or sound. Something that goes beyond simple interest in scientific facts, and captivates you from that moment when you feel that exquisite sensation of the penny dropping and you're flooded with sudden understanding.

If you don't get that feeling, then either you have not been introduced to the power of science in the right way (a ringtone of Vivaldi's Storm is not going to have the same effect as being at the symphony) or there is something wrong with you.

That something, at least, does appear to be right with Miley Cyrus, who posted this on her Twitter feed.

Lawrence Krauss is an excellent spokesman for cosmology and space science, and it is hard to overstate the awe that accompanies the thought of atoms scattered across space to become you. And though he is a well-known skeptic and critic of religion, I believe that the "Forget Jesus" part of the speech was not meant as a jibe at Christianity. It is an effective, if provocative, comparison, and you can't blame a guy for trying to add some dramatic flair when giving a public lecture. If you're trying to talk about something being destroyed so that you can live, JC is the obvious reference point. However, with all of my admittedly meagre respect for the Bible, a giant ball of fusion-powered plasma erupting into a supernova is a fair bit more impressive a death scene than being nailed to a plank of wood.

However, with Ms Cyrus being famous for country music, the genre most associated with middle America, the reaction of her fans could be seen a mile off.

Further reactions can be seen on Twitter here.

As the reactions of offended Christians go, this is very mild. Some of the responders clearly share Ms Cyrus's awe of our stellar origins. However, the Lords of PR swiftly swung into action, citing the image and the quote as "sensitive content" and, as far as I can tell, closing off any further comment. I wouldn't be surprised if some kind of apology and clarifying statement were to follow, as "damage control."

To me, this is just sad: that a celebrity cannot stretch herself and try to express some wonder at the beauty of the world revealed to us by science without being forced into self-censorship by petty morons.

In other circumstances this would be a miracle for science promoters; a celebrity with a large youth following stepping up and saying "Hey, there's more to this science stuff than nerds looking at test tubes, and some of it's pretty damn awesome." But no, instead everyone has focussed on three words in a speech that contained such beautiful truths about the universe, three words that formed part of a comparison made for dramatic effect: "So forget Jesus."

This is not even a (specifically) anti-religious point, but rather a point about people’s fascination with petty details when they could be gaining an insight into the workings of the universe itself. It just seems like being presented with a stunning landscape of a mist-shrouded forest around a snow-capped mountain, and complaining that one of the trees has “The Rangers suck!” carved into the trunk. Feel free to insert your own favourite sports team as needed.

Maybe that’s the “something wrong” my teacher referred to. Sometimes we (myself included) can all be too shallow to see the wood for what’s written on the trees.

Hat tip: Jerry Coyne at Why Evolution is True.

SPM moves forward: Imaging the charge in a single-molecule.

Posted 01/03/2012 by adamqshaw
Categories: Science Research


In experimental science, and I would imagine in other fields, there are often very clear trailblazers. The guys who seem to sneeze out high-impact publications that push the boundaries of what is possible, usually with data so beautiful you could weep. The kind of guys who, after you have devised an amazing experiment and just begun to get the resources together for it, publish the same idea the next week with the sort of ease that makes you suspect they just 'knocked it out' on a lazy Sunday afternoon.

In the field of Scanning Probe Microscopy (SPM) one such trailblazer is Leo Gross.

Scanning Probe Microscopy is the name given to a group of scientific instruments that can allow us to actually see the atoms on a material’s surface.

I'd like you to take a moment to consider that. We can actually make devices that allow us to see atoms. Until the beginning of the 20th century, the very existence of atoms was doubted by much of the scientific community. In the 1980s, the first microscopes capable of seeing these elusive building blocks were invented. Now, thanks to Leo Gross, the technology moves forward again.

These microscopes do not work in the way most people expect when you say the word "microscope". They do not use light and there are no lenses. A common analogy used to introduce people to the field is an old-fashioned record player. Like the record player, a scanning probe microscope slides a sharp tip across the sample to "feel" the atoms and molecules upon it. I write "feel" in quotation marks as the actual mechanism of detecting the atoms differs between instruments and the exact needs of the experiment.

The two big boys of the SPM world are the Scanning Tunnelling Microscope (STM) and the Atomic Force Microscope (AFM). Those interested should follow the links for the full details of how these devices work. For some time there was a little bit of smugness on the side of STM users, as their instrument was the first to achieve atomic resolution and continued to do so reliably as a matter of course, whereas the AFM had to be worked at from its invention to yield its first pictures of atoms.

In 2009, Leo Gross spun that around by taking pictures of molecules that actually look like the diagrams you would draw in chemistry class. The details can be found in this IBM press release from the time and this Science paper. The key to this coup for AFM is that STM is, by its nature, an electrical measurement, a very sensitive and glorified current detector. As such, its ultimate resolution is limited by the electron states surrounding the molecule like a cloud. AFM does not have this limit: as its name suggests, it works by "feeling" the various force fields that surround the atoms in the molecule, and so shows them as they "are". This difference is shown below in the image from the Science paper: A) shows the chemical model of the pentacene molecule, B) shows the STM image and C) shows the AFM image of the same molecule. Note the amazing similarity between A and C and the comparative "fuzziness" of B.

Not content to leave it there, Leo Gross' group at IBM has continued to push the envelope of what the AFM can do. Now they have applied their uncanny ability to take amazing, high-resolution images to an AFM spin-off: Kelvin Probe Force Microscopy (KPFM). With this they have imaged the distribution of charge within a single molecule. The story can be read for the public at the BBC, and the full article is published in Nature Nanotechnology for those with access.

Kelvin probe microscopy uses the fact that when two metals are brought together, a voltage (a contact potential) forms between them, due to the difference in work functions (how easy it is to pull the electrons out) of the metals. In KPFM, this voltage forms between the sample and the tip of the probe, which responds to the localised change. A feedback circuit then applies its own voltage between tip and sample, countering the effect. The voltage needed to cancel out the contact potential is recorded with sub-nanometre resolution and used to form an image.
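To make that nulling idea a bit more concrete, here is a toy sketch in Python. It is very much my own illustration, not anything from the IBM group's actual instrument: the electrostatic force between tip and sample grows with the square of the mismatch between the applied bias and the local contact potential difference, so the feedback hunts for the bias that minimises the force, and that bias is what gets recorded at each pixel. All function names and numbers below are made up for illustration.

# Toy model of the KPFM nulling idea (illustrative only, not real instrument code).
def electrostatic_force(v_applied, v_cpd, c_grad=1.0):
    # The force term grows with the square of the bias mismatch.
    return 0.5 * c_grad * (v_applied - v_cpd) ** 2

def find_nulling_bias(v_cpd_true, v_min=-1.0, v_max=1.0, steps=2001):
    # Scan the applied bias and keep the one that minimises the force.
    best_v, best_f = v_min, float("inf")
    for i in range(steps):
        v = v_min + (v_max - v_min) * i / (steps - 1)
        f = electrostatic_force(v, v_cpd_true)
        if f < best_f:
            best_v, best_f = v, f
    return best_v

# Pretend the local contact potential difference at this pixel is 0.35 V:
print(find_nulling_bias(0.35))  # prints ~0.35, the value that goes into the KPFM image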

To test the technique, Gross' group studied a molecule called naphthalocyanine, which has been shown to behave as a molecular switch. The molecule is shaped like a + sign, with four distinct arms, and hydrogen atoms can be switched from one pair of arms to another in a controlled way. In one state, the KPFM images show charge concentrated on two opposing arms (call them North and South). After the switch is 'pressed', the images show the charge concentrated on the other two arms (East and West), having rotated by 90 degrees around the molecule. The observed images matched very well with the distribution of charges predicted by theory (DFT calculations).

Still not satisfied, they went further and improved the resolution of their images by refining their AFM probe: picking up a single carbon monoxide molecule on the end of the tip to use as a much smaller, more refined and more precise probe in their experiment.

Long blog post short, they can now see where the charge is gathered within one naphthalocyanine molecule: which parts hold the positive charges and which the negative. This knowledge is of great interest to scientists working on nanoelectronic devices. In such materials, especially those made from organic molecules, function depends on the separation of charge and the dynamics of its storage and transport. In particular, these processes are critical to the operation of solar cells and their natural analogues, the photosynthetic proteins in some plants and bacteria. The ability to actually see these effects is a major step forward in both the development of SPM and future electronics.

About those cables again…

Posted 24/02/2012 by adamqshaw
Categories: Uncategorized

Two posts into my science blogging career and I already have to issue a bit of a correction. Well, a clarification at least.

The ScienceExpress post that first brought us news about the possible hiccups in those amazing "faster-than-light neutrino" results from OPERA seems to have been inaccurate in its details. The official statement from OPERA explains that, although a potentially iffy optical fibre connection is a suspect in the 60 nanosecond mystery, it has not yet been confirmed to exactly account for the discrepancy. That little detail seems to have originated in the ScienceExpress post.

The statement is repeated in full in this Nature piece.

It's also worth checking out this related post by Matt Strassler, and this interesting opinion piece by Robert Garisto, revealing a journal editor's take on the situation.

Always, always, check the cables!

Posted 22/02/2012 by adamqshaw
Categories: Science Research


The latest news from CERN is that the famous "faster than light neutrinos" that filled both the scientific and popular press late last year may be down to an experimental error after all.

The breaking news has been announced on Science Magazine’s “Science Express” board. I’ve put the link below and the note is worth reading for yourselves.

Science Express Post

However, in a nutshell, the big brouhaha started when scientists in the OPERA project sent neutrinos from the CERN lab in Geneva to a lab at Gran Sasso in Italy. Neutrinos are subatomic particles which, due to their incredibly small mass and neutral charge, interact with practically nothing. They fly through space and whizz through the Earth and all of us with hardly any effect. They don't even stop to wave. It was always thought that they travelled at speeds very close to, but just under, the speed of light, as the Special Theory of Relativity demands that nothing can travel faster than light itself.

This claim, which has been tested and retested for nearly a century, was called into question when the OPERA scientists found that their neutrinos reached Gran Sasso 60 nanoseconds (one billionth of a minute) earlier than they should have done travelling at light speed. These scientists knew that this would be a major upset to one of the fundamental theories of modern physics, and though scientists are always open-minded to new (reasonable) ideas, you don't overturn Einstein unless you are really, really sure. So they repeated the experiment, over 3000 times, and after finding the same result, threw their findings out to the scientific community. They hoped, in a wonderful example of how science should be done, that a much larger number of eyes on the data and minds on the problem might spot anything they had overlooked.
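Just to put those 60 nanoseconds in perspective, here is the back-of-the-envelope arithmetic, using rough figures of my own rather than OPERA's precise values: the CERN to Gran Sasso baseline is roughly 730 km, so the flight takes about 2.4 milliseconds at light speed, and 60 ns is only around 25 parts per million of that.

# Rough, illustrative numbers only -- not the collaboration's exact figures.
c = 299792458.0            # speed of light in m/s
baseline = 730e3           # approximate CERN-Gran Sasso distance in metres
early = 60e-9              # the reported 60 ns head start

flight_time = baseline / c
print(flight_time * 1e3)   # ~2.44 ms to cover the baseline at light speed
print(early / flight_time) # ~2.5e-5, i.e. roughly 25 parts per million "too fast"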

Now the problem may have been spotted (may, I said, MAY, more data is needed). In a fashion hauntingly familiar to anyone working in a lab, the problem may be the result of a faulty cable: in this case, an optical fibre in the GPS system used to accurately track the distance between the two stations and keep the timing of the neutrinos' flight. It seems data was passing through this cable faster than it was believed to be. How much faster, you say? Well, about 60 nanoseconds, funnily enough.

Though such nanosecond precision in data transmission isn't a requirement in my line of work, it just reinforces an important lab-life lesson. Always check the cables.

At the limit of Moore’s Law

Posted 22/02/2012 by adamqshaw
Categories: Science Research


Stop to consider the amazing explosion of technology that has rapidly filled our lives in the last 50 years. Such as the computer you're reading this blog on, or perhaps you are viewing this page on a device smaller than your hand, or even the fact that these words can be read by anyone, anywhere in the world.

Where did this all begin? What was the ‘seed’ that allowed all this technology to grow so fast and sprout up in every aspect of our lives?

I think a good argument could be made that the answer to this question is: The transistor.

A transistor is, basically, a tiny electrical switch. It stops and starts the flow of current along a circuit in response to a series of electrical pulses. This allows a computer to send control signals to all of its various parts, using the off/on flow of current to "speak" the 0/1 language of binary and perform operations by forming logic gates. In this way, the transistors form part of the computer's "nervous system", relaying signals from the CPU (its brain) to the various "organs" (the hard drive, the speakers etc.).
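To see why that off/on switching is enough to do arithmetic, here is a little toy sketch of my own (purely illustrative; real chips are rather more involved): treat an n-type transistor as a switch that conducts when its gate input is 1, put two of them in series between the output and ground, and you get a NAND gate, from which every other logic gate a computer needs can be built.

# Toy sketch: transistors as switches, building logic gates from NAND.
def nand(a, b):
    # Output is pulled low only when BOTH series "transistor switches" conduct.
    return 0 if (a == 1 and b == 1) else 1

def not_(a):
    return nand(a, a)

def and_(a, b):
    return not_(nand(a, b))

def or_(a, b):
    return nand(not_(a), not_(b))

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "-> AND:", and_(a, b), "OR:", or_(a, b))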

However, modern transistors are tiny things, and the driving force behind the rapid increase in computing power (while computers themselves shrink) is that scientists have found ways to make smaller and smaller transistors, allowing them to pack more and more onto the circuit boards in your PC. Unfortunately, there is a rather obvious physical limit to how small you can make something: the atom. And it seems we have hit that limit with the latest report in Nature Nano of a functioning single-atom transistor.

A common type of transistor is the Field Effect Transistor (FET), where the current flows between two electrodes called the Source and the Drain, and is controlled by a third called the Gate. The names are quite helpful, as the operation of the FET can be described by a comparison to water. Imagine the Source to be a tap and the Drain a plughole, and that the water needs to pass through a length of rubber tubing dangling from the end of the tap into the sink. Turn on the tap and set the water going, then start to squeeze and release the tube rhythmically, stopping and restarting the flow. This is your Gate.

Of course, in a real FET the water corresponds to the movement of charge carriers such as electrons or holes, and the hose is a narrow channel of semiconductor whose size and shape are controlled by applying a voltage across it, restricting the flow of charge.
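For those who like their analogies with a few numbers attached, the standard textbook (long-channel, "square-law") picture of a conventional FET looks something like the sketch below. This is just the usual classroom model with made-up parameter values; it is not taken from the Nature Nano paper, and the simple picture breaks down for the single-atom device discussed next.

# Textbook long-channel FET model, illustrative values only.
def drain_current(v_gs, v_ds, v_th=0.5, k=1e-3):
    # v_th: threshold voltage in volts; k: device constant in A/V^2.
    if v_gs <= v_th:
        return 0.0                                           # "off": channel squeezed shut
    if v_ds < v_gs - v_th:
        return k * ((v_gs - v_th) * v_ds - 0.5 * v_ds ** 2)  # linear region: hose partly squeezed
    return 0.5 * k * (v_gs - v_th) ** 2                      # saturation: hose fully open

for v_gs in (0.0, 0.6, 1.0, 1.5):
    print(v_gs, drain_current(v_gs, v_ds=1.0))  # gate voltage throttles the drain current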

The single-atom transistor works in the same way. Michelle Simmons' group at the University of New South Wales started with a hydrogen-passivated silicon surface. Using the tip of a Scanning Tunnelling Microscope (STM), they carved away the hydrogen atoms in a "+" shape to form the Source, Drain and two Gate electrodes, leaving a tiny gap in the centre. They then added phosphine (PH3) to the system, which binds to the newly bare silicon and ignores the hydrogen-covered parts. This dopes the electrodes to make them more conducting. In the tiny gap at the centre of the + shape, the authors managed to take three PH3 molecules on the surface and, by breaking, swapping and reforming bonds through some impressively simple thermal treatment, replace a single silicon atom with one of phosphorus. Measurement of the current between source and drain showed that it did indeed depend on the voltage applied across the gate electrodes, confirming the device was "transisting".

I won't go into further detail about the conduction measurements and the nitty-gritty of energy levels and electrostatic potential calculations; however, I do hope to impress upon you the importance of transistors and the gravity of this work. Though single atoms have previously been shown to behave like transistors in the right circumstances, this is the first time a single-atom transistor has been engineered, i.e. the electrodes, the doping and the transistor atom itself have been deliberately and deterministically put in place with atomic precision. This is an amazing feat of nano-fabrication and represents the ultimate size limit of present computer hardware: it doesn't get smaller than one atom.

As an extra bonus, there are some signs that such a device could go beyond the current level of computing. At low gate voltages, the conduction measurements, along with calculations, show that the phosphorus atom retains its discrete quantum levels, allowing for potential applications in quantum computing, logical operations and devices that make use of the "spookiness" of quantum mechanics.

Of course, this is a long step from use in an actual computer, let alone your laptop. The device can only be operated below liquid nitrogen temperatures (<77 K) and its fabrication makes use of ultra-high vacuum conditions. However, this is how science evolves into technology: someone finds out what is actually possible, and then someone tries to make it work at a higher temperature or make the device a little bit more stable, inching closer and closer to practicality.

Not every scientific breakthrough makes it from the lab into your home, just as not every drug gets from the petri dish to your medicine cabinet or every species gets from an amoeba to the zoo. However, where there is enough passion and excitement, the Nerds will find the way.