Archive for July 2010

Building up graphene nanoribbons

July 22, 2010

Just a quickie… to let you know that another article I wrote on graphene nanomaterials is now online on the Chemistry World website. That’s the third article on nanographene in two months – clearly a hot area or something!

Science and the Public Conference – 3rd July

July 17, 2010

This interesting conference, organised mainly by Alice Bell from the Imperial College Science Communication department, made me realise that until quite recently, most “science communication” has been done by social scientists rather than by physical scientists. You could discuss the pros and cons of this situation for quite a while. As I see it, the pros are that they can be more objective, not being so close to the science, and that they generally have better communication skills and are more encouraged to use them. The cons are that they may not understand the science properly, can’t cut through jargon and hype so easily, don’t relate so well to people in general and, not being scientists themselves, are not used to scientific methodology – more on this later. But, apart from a few star scientist communicators, that is the way things have been. It was interesting that most of the attendees seemed to be scientists, whereas most of the speakers (certainly more than 50%) seemed to be social scientists. The point I am trying to make is that at a science communication conference you would expect scientists and social-scientist science communicators to be able to find a middle ground where they could actually discuss things on terms that everyone understands. For the most part this succeeded, though I felt there were one or two notable exceptions.

But first the highlights:

I saw a highly entertaining talk by a chap from Carbon Visuals who is trying to get people to understand amounts and relative sizes without giving numbers. At first I thought this was going to be akin to New Scientist’s ongoing “this is equivalent to x blue whales” commentary, but I was pleasantly surprised. Basically he was trying to put amounts into a context that people can relate to: for Imperial College students he used a campus map and related CO2 emissions to the size of the buildings, for an office near Kings Cross he used the new St Pancras station building, for a presentation to Londoners he used Trafalgar Square, and so on. Not only did he have some really cool graphics, but getting people to relate something unknown to something they know intimately was a great idea.

There were some interesting presentations about science education too, including the effect of inaccuracies in cartoons on how and what children learn (like when Tom from Tom and Jerry bangs his head and forgets who he is, then after another bang on the head all his memories come back – sadly, knocking an amnesiac on the head is unlikely to bring about positive effects in real life). The effect depends on their age, apparently, but kids do learn from cartoons, and they don’t always realise that they are learning. There were also a couple of good presentations about how people learn and how to get at the public’s general attitudes to science (pretty hard, because most people don’t even know what is meant when you ask about “science”, and they also confuse the concept of science with the concept of learning about science). I’m not going to go into great detail on any of these, but I did find most of the day really interesting and informative, and it gave me a lot to think about.

Now, without wishing to hurt any feelings, on to the part I personally found less useful. There was a keynote lecture by a clearly eminent social scientist who had been studying how social scientists study communication of science (I think I mean the theory of science communication and the methods used) for some years. His first less-than-groundbreaking conclusion was that when you ask questions and get answers that don’t fit your expected categories of answer, you should not just junk those answers, as they may tell you something important about the survey you are doing. To a scientist this is like saying don’t junk the outliers as they may tell you something about the system you are studying, i.e., blindingly obvious. To me the idea that social scientists may have been ignoring their outliers all this time is quite a scary one.

He then went on to discuss a specific type of what he called science communication, which is basically those gadgety toy/model/arty things that are based on science, like mousetrap tables and personalised rings made from a culture of your own bone (I was fascinated by the latter idea, though I can’t think who would want to do this – it’s a bit like Brangelina allegedly exchanging vials of each other’s blood). He then compared these specific models to the general precepts used by social scientists to engage the public in science, and concluded that the “design” model is a rather passive one which does not really seek to engage or to inform in any real way. I do take issue with the whole of scientists’ efforts at science communication/engagement being lumped in with these rather esoteric toy systems. While I agree that the systems he discussed are passive and don’t actively go out to engage or educate people, they are not designed to do that. They are designed to intrigue. Perhaps he should look at some systems that are designed to go out and educate people, and see how these bear up against his fundamental ideas. Oh yes, and he persistently mispronounced microbial as micro-bile. Clearly he hasn’t checked his ideas with scientists, as they would easily have been able to correct him on this point.

So, in conclusion: yes, there is a lot to be gained from scientists and social scientists interacting and trying to work out together the best way to communicate science and scientific workings to “the public” (whoever they are). As to whether that is really happening yet… well, in some circumstances it is, and in others it is not. Both scientists and social scientists have important skills to bring to the table: the scientists bring their understanding, their rigour and their enthusiasm, and the social scientists bring their understanding of how people react, and their education and communication skills. This can be a meeting of equals, but unless both sides come together and are prepared to listen to and work with a system that at first will seem quite alien to them, it will simply be a look through a frosted window at someone else’s house, and the details will still be misunderstood.

Materials Views

July 13, 2010

Also please check out my contributions to the Nano Channel of Materials Views from Wiley.

So far, I have written about:

Brushes grown on brushes

Hot-spot-light on cells

Nanoparticle temperature memory sensor

Electronics made from ribbons

Micro-medics

I’ll keep you up to date with anything else I do. Enjoy!

Sense about Science Annual Lecture – Dr Fiona Godlee

July 13, 2010

Firstly, I want to apologise for being offline for so long – we moved house and it took a while to get the internet hooked up again. Then there was all the catching-up with work to do… anyway, here I am again at last.

So the Sense about Science Lecture. SaS were kind enough to send me an invite to this when I asked. As an ex-Editor myself I am very interested in discussions about peer-review and its various failings.

Fiona Godlee is the current Editor of the British Medical Journal (BMJ); she has been in the job since 2005. In that time she has clearly had to grapple with numerous cases of ethical misconduct; in fact, going on what she told us, it would be fair to say that the medical and pharmaceutical literature is rife with it.

She discussed the problem of reporting bias – not unethical in itself, but still something that skews perceptions and misleads the reader. Basically, people don’t like to report negative results. Drugs that don’t work very well will not make any money, and an experiment that shows nothing or, worse, shows a problem, will in many cases jeopardise funding or even the job of that particular researcher. This is particularly true in medical research, where the funding usually comes from large pharmaceutical companies, but there is an element of it in academic research too, even where funding comes purely from the government. All university researchers have limited time and resources, and these are being squeezed at every corner. Papers that can be published in high-impact-factor journals will advance a researcher’s career and help secure further funding. But papers that report negative results are rarely accepted for publication in high-impact or indeed reputable journals, as the editors know they will not be cited and would bring down their impact factor. This propagates reporting bias, and also means that the experiments that did not work are doomed to be repeated again and again in different labs, as no-one knows that someone else has already tried them. This is a waste of the precious resources that are being so tightly squeezed already.

But Godlee’s lecture concentrated mainly on two perceived threats to science – the external threat of those who refuse to accept the evidence-based approach to science, and the internal threat of undisclosed conflict of interest, which leads to data suppression, ghost-writing and guest-writing. In my background of materials, physics and chemistry, these are not such a big issue, but nevertheless it was interesting to consider.

Where researchers receive a large amount of their funding from industry, as is the case in medical research, these sorts of conflicts can always arise. This is because the researchers have a vested interest in the outcome. Of course you could argue that researchers always have a vested interest in the outcome of their research, as breakthroughs will advance reputation and facilitate additional funding (this could be another whole blog posting in itself). However, the peer-review system generally does a reasonable job of picking up most of those who attempt to play the system. But when the vested interest is a large financial one, the pressures become greater. Government-funded researchers will generally only lose that one grant if a project does not work out. The decision about who to award the grant to next time is made again using peer-review, and is regulated by committees and procedures which, however much the researchers rail against them, are there for a good reason. If, on the other hand, all your money comes from one or two companies who are focussed on being able to sell their particular drugs to the public in a few years’ time, there is a much greater pressure on you to create and publish positive results only. If the project yields a negative, your chances of receiving future funding from that source are very much reduced, being decided upon by a couple of top executives at that company who will tend to scratch the backs of those who scratch theirs.

Godlee had a few proposals to make to deal with this problem.

1) She wants article-level impact metrics rather than journal-level ones to become the norm. I think this is very sensible; it could be applied easily and quickly, and it would help a lot. Of course, it would not solve the problem on its own, but it would give a better guideline as to which are the good papers. I don’t think it would prevent journals from publishing dross, though, as some of the dross can still be highly cited.
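To see why a journal-level number can mislead, here is a toy sketch (the citation counts are entirely made up, purely for illustration): a journal-level metric such as the impact factor is essentially a mean of citations per article, and a mean can be dominated by a handful of highly cited papers while most articles in the same journal are barely cited at all.

```python
# Toy illustration with made-up citation counts for ten hypothetical
# articles in one journal. A journal-level metric (roughly, the mean
# citations per article) can look impressive even when most individual
# articles are barely cited.
citations = [250, 120, 3, 2, 1, 1, 0, 0, 0, 0]

# Journal-level view: one number summarises every paper in the journal.
journal_level = sum(citations) / len(citations)

# Article-level view: each paper is judged on its own citation count.
ranked = sorted(citations, reverse=True)
median_article = ranked[len(ranked) // 2]

print(f"Journal-level mean citations: {journal_level:.1f}")  # 37.7
print(f"Median article's citations:   {median_article}")     # 1
```

Two heavily cited papers pull the journal-level mean up to 37.7, yet the typical article in this imaginary journal has about one citation – which is exactly the gap that article-level metrics are meant to expose.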

2) She wants to stop big pharma from directly evaluating its own products (the current committee to approve drugs in the UK is made up of industry representatives). Again, this is sensible in theory, though in practice I wonder if you would find that most of those with the necessary expertise to make the decisions are involved in some way with the pharmaceutical industry, and there is no-one suitable who is not.

3) She wants to see central industry funding for independent advice and independent drugs trials. If you can get them to agree to it, this is probably a good idea. I’m not sure why any company would sign up to this, though, unless it is made law. This system is apparently in use in other countries, so I guess they must have laws about it. It would be interesting to see how these came about.

4) She wants to see publication of entire data sets to create better transparency. I’ve touched on my opinions about this before. This really only creates transparency if you are in a position to be able to use the data. It might solve some of the issues but it would create a load more when those who do not properly understand the data get hold of it and start to draw conclusions from it.

5) She would also like to see more investigative journalism. This is always a good thing, but how to promote it? Journalists are busy people, with ever shorter deadlines and ever more stories to write. To produce a good investigation they need to be given time and resources to do this, which means that they will be less productive in terms of numbers. Editors will not be prepared in most cases to countenance this unless it is strongly called for by the readers and subscribers. It is up to us!

The final point Godlee made was simple: The public must question everything. This is not a new idea – the motto of the Royal Society translates as “take nobody’s word for it”. But it is sadly a point that, in today’s society of quick headlines and instant gratification, probably does need making.