I am currently working on the premise that there are actually two problems here: (1) what are my feelings about how science should be conducted in areas of challenging, controversial and globally significant science, and (2) assuming we get (1) right, how is that science then communicated to policy makers?
I think I can get at the first question easily, so, for now, I'll focus on the second. I've been reading various things - the literature, 'The honest broker', CSA guidelines on the use of scientific and engineering advice in policy making, and other bits and bobs from luminaries such as Jonathan Porritt and Crispin Tickell.
I confess, the first time I read 'The honest broker' I struggled to perceive any difference between the three modes of interaction that were non-advocative (i.e. pure scientist, science arbiter and honest broker). To me, they felt like the same thing. I've now realised they form a spectrum, and the defining axis of that spectrum is consideration of the policy maker (from providing information that the scientist thinks is relevant, to providing information that the policy maker asks for, through to providing information that expands the choices of the policy maker). Of course, in complete contrast, the advocate seeks to narrow that choice.
The question I have to ask myself within this framework is 'what should I be?' Clearly, this is spectacularly simplified, as one's position varies as a function of time and circumstance AND end-members do not make good exemplars (in that I am unlikely to be 100% of any of them). To what extent, in this context, do the Oxford Principles help? Here they are as a reminder...
Principle 1: Geoengineering to be regulated as a public good
While the involvement of the private sector in the delivery of a geoengineering technique should not be prohibited, and may indeed be encouraged to ensure that deployment of a suitable technique can be effected in a timely and efficient manner, regulation of such techniques should be undertaken in the public interest by the appropriate bodies at the state and/or international levels.
Principle 2: Public participation in geoengineering decision-making
Wherever possible, those conducting geoengineering research should be required to notify, consult, and ideally obtain the prior informed consent of, those affected by the research activities. The identity of affected parties will be dependent on the specific technique which is being researched - for example, a technique which captures carbon dioxide from the air and geologically sequesters it within the territory of a single state will likely require consultation and agreement only at the national or local level, while a technique which involves changing the albedo of the planet by injecting aerosols into the stratosphere will likely require global agreement.
Principle 3: Disclosure of geoengineering research and open publication of results
There should be complete disclosure of research plans and open publication of results in order to facilitate better understanding of the risks and to reassure the public as to the integrity of the process. It is essential that the results of all research, including negative results, be made publicly available.
Principle 4: Independent assessment of impacts
An assessment of the impacts of geoengineering research should be conducted by a body independent of those undertaking the research; where techniques are likely to have transboundary impact, such assessment should be carried out through the appropriate regional and/or international bodies. Assessments should address both the environmental and socio-economic impacts of research, including mitigating the risks of lock-in to particular technologies or vested interests.
Principle 5: Governance before deployment
Any decisions with respect to deployment should only be taken with robust governance structures already in place, using existing rules and institutions wherever possible.
I suppose that, to a certain extent, they confirm some of my feelings about the science - the importance of impartiality, public engagement through transparency, the critical role of governance - but they do less for me in terms of deciding how I will act as a policy advisor. At the moment, I've realised, I trust my sense of right and wrong.

As an example, at the recent LWEC meeting we were challenged to determine where immediate funding might be pointed. The point was raised that EPSRC had an assessment/framework project and an SRM project and that, to balance the portfolio, CDR should be targeted. Clearly, from a purely selfish point of view (and the argument about scientists being driven by research funding really irritates me), I'd have been better off advocating more SRM. However, clearly the right thing to do was to press for CDR, which I did. I am not expecting a medal for this; I simply make the observation that many scientists are much less selfish and not at all driven by financial gain, despite this criticism being levelled at us on a regular basis.