From 27-29 June 2003, Pugwash held a workshop on Science, Ethics and Society in Paris, France.
Pugwash Workshop No. 286 on Science, Ethics and Society
Report by Venance Journé and Judith Reppy
Introduction
The social responsibility of scientists is a central issue for the Pugwash movement, yet only a few workshops have specifically dealt with this topic since Pugwash’s first Conference in 1957. On 27-29 June 2003, a workshop convened in Paris to address the ethical responsibilities of scientists in today’s world. Thirty-one participants from 13 countries attended the workshop, and nine students from six countries gathered for a parallel workshop, with the final session held jointly by the two groups. On June 26th, during a public evening organised at the Centre National des Arts et Métiers, Joseph Rotblat, John Ziman, Francesco Calogero and Mollie Painter-Morland spoke about different aspects of the social responsibility of scientists. This session was followed by a discussion with the public.
The organisation of this workshop benefited from a grant from the French Ministry of Research and New Technologies and from the generous support of the Fondation Charles Léopold Mayer.
Science and technological advances can create numerous opportunities for the benefit of humankind; equally, there is potential for harm through both deliberate misuse and unintended consequences. A number of legitimate scientific research activities are dual use, with both civilian and military applications, and an evaluation of whether to pursue such research always has a moral element. One of the areas of greatest concern at present is biotechnology. The current pace of advance in the biological sciences is such that applications follow very soon from fundamental research. Moreover, the world is experiencing parallel and contrasting processes of globalisation and fragmentation, processes that foster the spread of new technology even as control regimes are undermined. The increasing privatisation of research also has implications for researchers’ sense of responsibility. All these new developments necessitate renewed thinking about the ethical criteria for scientific activities.
The discussion at the workshop was wide-ranging, but it returned repeatedly to a core concern: the tension between democracy and expertise. Although decisions that affect society should be made by the people, on certain issues, scientific or technical expertise is essential. The dangers arising from this tension are two-fold: that essential scientific knowledge may be ignored in favour of democratic decision-making, or conversely, that the authority of science may be extended to domains that are properly the concern of all citizens. Because, in most cases, scientists are in the best position to foresee the consequences of their own work, they bear the ultimate responsibility to help the public understand the scientific, social, or political issues at stake.
Morality and ethics
All of the discussions at the workshop were underpinned by concern with the scope of scientists’ responsibility for their research and the way to translate that responsibility into ethical decisions. The values we pursue emerge out of both our life experiences and the social milieu, which create an enduring belief that certain conduct is preferable. Therefore, for our purposes, we defined morality as the totality of current norms, principles, and values existing in society, and ethics as the process of determining how one should act to balance the interests of various stakeholders, taking into account moral values.
This formulation, however, raises the question of whose values should prevail. Because ethical issues today are of unprecedented complexity, developing a global consensus on values–and, eventually, a shared morality–is especially challenging. We discussed a proposal to avoid both relativism (which provides no means of making normative judgements) and absolutism (which can hide power relations or lead to totalising structures when minority views are eliminated). The goal is to make the process of reaching ethical positions transparent, so that all the relevant parties know what is at stake and can contribute to the process.
What guidelines should we follow in reaching moral judgements? Deontology, as discussed by Kant, is an abstract and rule-based approach that cannot deal with specific circumstances: it holds that the rationally self-evident good should always be chosen, without taking into account the consequences of our actions in specific cases. By contrast, utilitarianism privileges the ends over the means. It assumes that we can always predict the outcomes of our actions and, further, that a cost-benefit analysis can be done to assess their impact on all the stakeholders. Because this is manifestly not the case in real life, utilitarianism, like deontology, cannot satisfactorily answer the question of how to live a good life.
We are left with the practical question of how to provide moral education, specifically for scientists. It was argued that the problem should be construed in terms of a process rather than a fixed set of principles, and this process should strive to equip scientists with tools to grapple with moral dilemmas in their specific historical and social contexts. This includes recognising that different sectors of a community may have quite different views of their identity and interests and that we should respect that diversity.
We discussed a pedagogical approach to moral education known as “narrative witnessing,” which is widely used to teach ethics in Japan. This approach has its theoretical basis in Wittgenstein’s view of language as a form of action involving participation of a social group. Rather than being handed down as a fixed set of rules or definitions, consensus on a moral position emerges from a dynamic process of discussion and exchange. In practice, narrative witnessing begins with a case study that poses a moral dilemma, then encourages students to elucidate the principles at stake through group discussion. Thus, the principles remain grounded in the specifics of the case, but general agreement is possible.
Is science neutral?
The ‘neutrality of science’ is often evoked to defend scientists from criticism when scientific knowledge or a new technology is used in ways harmful to society. The claim that science is neutral shifts the blame away from those who use science, but strict neutrality is in fact impossible to achieve. Scientists are part of society and their activities necessarily are affected by the social context they inhabit, including their institutional homes, sources of funds, and intellectual milieu. Even scientists working in traditionally independent organisations like universities are not exempt from these influences, especially now that universities are deeply involved in joint projects with industry and government.
Instead, science can rightly claim that it strives for objectivity. The value of the scientific method lies in its contribution to objectivity through the checks and balances provided by peer review, replication of experiments, and open publication. The result is that scientists are held accountable for their claims. In effect, the scientific method produces an ethics of science in which the moral values are those of truth-telling and transparency. In turn, science’s reputation for objectivity forms the basis for its special authority in controversial technical issues.
Who should make the decisions involving science and technology? The case was made that all levels of society should be involved, although there are some narrowly defined issues (for example, the value of pi) in which scientific expertise should be privileged. However, in such instances, scientists’ special knowledge brings with it the responsibility to make ethical decisions. Further, ethical decisions should be seen as context-dependent: we will have a different view of animal research done to cure serious disease compared to animal research done to develop commercial cosmetic products.
Responsibility of scientists and society
The issue of scientific responsibility to society has most often arisen after the misuse of scientific discoveries, and we discussed several important examples. For instance, the continuing threat posed by nuclear weapons and vast quantities of excess nuclear weapon material (such as highly enriched uranium in Russia) is not sufficiently recognised by the media and political leaders. If highly enriched uranium is available, it is relatively easy to build a nuclear device with a yield similar to that of the Hiroshima bomb. It is therefore of crucial importance to control and eliminate these materials. Clearly, because of their essential role in creating nuclear weapons, scientists bear a special responsibility in the case of weapons of mass destruction, and they have a responsibility to inform the public about the dangers of such weapons.
An alternative position would argue that, because political authorities have all the power to take decisions, they bear all the responsibility. The social responsibility of any group or individual depends on access to information and influence over decision-making. In some highly charged cases–for example, stem cell research–it is difficult to have an informed public debate, but, it was argued, we need to put up with the misinformation and extreme opinions as part of the price of democracy.
The 2003 Severe Acute Respiratory Syndrome (SARS) epidemic–the way it happened and the way it was reported–is another example of a case in which scientists had a responsibility to inform and advise the public. There is a strong link between the spread of disease (including AIDS) and the structure of society, our lifestyle, and our behaviour. Though an epidemic like SARS was to be expected, once it occurred, much of the reporting was rumour-driven and wrong. The press gets most of its information from the web, and the web makes it easy to spread rumours as truth. For example, The Lancet printed a letter claiming that SARS came from outer space, which triggered a meeting of a NASA advisory board. There were also claims that China had accidentally released the virus from a bio-weapons facility. In addition, the differences among countries in the number of cases and fatalities are not yet completely understood. Vietnam eliminated the disease very quickly, whereas Hong Kong and Canada were among the worst affected, which may indicate again that patterns of lifestyle or types of governmental decisions affect the success of disease control. In China, the government was slow in taking appropriate measures, not because of exaggerated powers of centralised control but rather because of a lack of control: the severity of the problem was not accurately reported to central authorities, and at the same time, they were unable to control population movements spreading the disease.
Sometimes, a scientist may not recognise the potential harm of his research until the government takes interest in it. One participant spoke about an episode during the Vietnam War, when he felt uneasy about the possible uses of work he was doing when the FBI came to interview him.
There was a discussion of measures proposed by scientists to prevent threats to the human species posed by the most powerful technologies currently under development (robotics, genetic engineering, and nano-technologies). These proposals include a set of measures to control nano-technologies (such as a moratorium on research in certain areas), a nano-safety protocol, pledges, activities by scientific academies promoting ethical debates, and early warning committees. Another interesting concept is preventive arms control, which would involve a qualitative assessment of weapons and future technical developments. Such preventive measures would aim to analyse whether military activities might have destabilising or negative effects on international security; if so, these measures would strive to stop development of the new weapons at an early stage. This issue is complex because of the ambiguous line between civilian and military paths of development, particularly in basic research. Nevertheless, restrictions on R&D could be applied in specific areas such as nuclear tests or high-energy beams in the atmosphere and space. These matters need to be studied in much more detail (work that very few scientists find time to carry out); in civilian as well as military research, methodologies have to be designed to determine quantitative and qualitative thresholds, and procedures must be devised to separate legal from banned R&D activities. There is, however, no consensus that there should be restrictions on military technological research.
Secrecy and counter-terrorism
The events of September 11th, 2001 and the incidents with anthrax have resulted in new regulations, particularly in the United States. These new laws have consequences on scientific activities and exchanges.
Perceptions of terrorist danger vary widely. It was argued that, although persons who commit ‘terrorist’ acts may be well educated (e.g., many members of the Aum Shinrikyo sect had PhDs), most terrorists use relatively simple or conventional methods. The danger of the misuse of science lies elsewhere. Specifically, such dangers can arise from offensive state-sponsored programs, such as the large and secret Soviet biological weapons program, which persisted into the 1990s despite being illegal under the 1972 Biological and Toxin Weapons Convention (BWC). Some current U.S. programs in biodefense, by simulating offensive capabilities, also seem to cross the line into violating the BWC. Some would argue that a focus on bioterrorism is misplaced, serving only to cover state bioweapons programs, deflect attention from other problems, or provide an excuse to limit the spread of biotechnology to developing countries. Nevertheless, it was also stated that there is a possibility that a terrorist action could create an enormous catastrophe–just because it has never happened does not mean that it will not happen in the future.
In response to the September 11 attack, the Organisation for the Prohibition of Chemical Weapons met and concluded that the most efficient response to the increased threat is the swift and full implementation of the Chemical Weapons Convention, including the destruction of all stockpiles. With respect to biological weapons, in 2001 the United States blocked the draft verification protocol at the Biological Weapons Convention Review Conference. Ironically, since then the USA has adopted a national system of controls that is much more far-reaching than the one it rejected at the international level.
The argument that open publication of scientific information can provide opportunities for misuse and deliberate harm raises a series of questions for researchers, administrations and publishers. It was argued that in certain cases (bio-weapons technology, for example) ready-to-use information poses an unacceptable risk, and for these cases scientists should refrain from speaking publicly about their research and even from trying to know more. The difference between biological weapons, where one single organism can multiply and spread everywhere, and nuclear weapons was stressed. In addition, it may be possible to engineer organisms to evade natural immunities or vaccines.
Several cases leading to different answers were discussed. In the case of anthrax, the situation could be dangerous if only one country had knowledge about the disease, thereby preventing others from preparing a defence. In such a case, the best defence is information that can be used to ensure public safety. When a weapon’s potential is common knowledge, sharing information provides a basis for making vaccines, thereby greatly lessening its military use.
A completely different case is the publication of the smallpox genome. Once the World Health Organisation announced the eradication of smallpox world-wide, the stocks should have been destroyed and the genome should not have been sequenced. Instead, now that the genome is published on the web and can be reproduced, the disease exists forever. It was strongly argued that it was more important to destroy smallpox forever than to understand it. The damage is irremediable and could have deadly consequences. Engineered organisms could be far more dangerous than the naturally occurring virus because there is no acquired immunity and no assurance that vaccines could be created quickly to protect the population.
There was a discussion of the U.S. perspective on bio-terrorism. New laws increase bio-security by prescribing measures for laboratories possessing dangerous pathogens to limit access. These measures should be internationally harmonised.
In addition to physical security measures for pathogens, there is the issue of controlling the spread of information that might be used for bioterrorism or biological weapons. A meeting was organised in Washington in January 2003 by the National Academy of Sciences to discuss the balance between communication of scientific knowledge and security. Important issues were discussed: Should more research be declared as classified? Should research methods with the potential for misuse be suppressed in publications? Should there be review boards to consider the national security implications of publications and presentations? Should the access and dissemination of scientific information be restricted only to ‘approved persons’? This meeting was followed by a meeting of editors of prominent scientific journals, authors, and government representatives on practical strategies to enhance security.
Agreement in the NAS meeting was reached on the following points: 1) Some information could be dangerous (this was not recognised by all in the meeting), and the review process needs to ensure that benefits from publication outweigh the risks to society (an example of ‘forbidden’ knowledge might be how to weaponise anthrax). 2) Each field has to decide what is dangerous and develop a process to deal with it. The American Society for Microbiology (ASM), for instance, has a review board that does a cost-benefit analysis for articles submitted to the journals it publishes. 3) Editors and scientists should act responsibly without government intervention, which is unlikely to be sensitive to differences among scientific fields. 4) The possibility of including ethical issues in peer review was raised, but no agreement was reached in the NAS meeting.
A statement from the meeting on safeguarding scientific information was published in Nature. It emphasised that the integrity of science must be preserved. The content of a publication should be verifiable–if not, there could be abuses and perpetuation of errors, eroding the foundation of science. The same codes of conduct should apply to all organisational contexts, e.g., no separate rules for the military. In any case, boundaries between the different institutional settings for doing science are no longer clear.
Again, the particular situation of biotechnology, where a single individual can do a lot of harm, was emphasised. Others argued, however, that the current obsession with terrorism is misplaced and that it would be more useful to address the root causes of terrorism. It was argued that it is not clear how these new secrecy rules would prevent terrorism. The new regulations divert resources, including funds, from more valuable research; there is a large asymmetry between terrorist threats and the tools proposed to deal with them; and the rules could have negative impacts on scientific practice.
The issue of the negative impact of the new American security procedures on the developing world was raised. The effects are wide-ranging. Boycotts of technology transfer can hinder technological development in certain countries (nano-technology in Brazil, for example), while discriminatory measures against foreigners–such as the closure of some laboratories to foreign scientists, the surveillance of foreign recipients of PhDs, and the raising of hurdles for student visas–create barriers to scientific exchanges between North and South. It was felt that the same laws should apply in every country. If the same rule is not applied to all countries, how can a country developing high technology be obliged to open its laboratories to foreign scrutiny under threat of being labelled a terrorist country?
Codes of conduct
Chemical and biological research has the potential to be used for the production of biological or chemical weapons by states, or–more worryingly–by individuals or small groups. In the face of such dangers, several proposals for codes of professional conduct have been put forward in the context of existing international Conventions. These include the development of a code of conduct (in the framework of the Biological and Toxin Weapons Convention), a project to raise awareness of ethical issues (in the framework of the Chemical Weapons Convention), and–in an appeal launched by the International Committee of the Red Cross–a network for the prevention of the spread of disease. In addition, professional organisations, such as the ASM, have a renewed interest in codes of conduct. Further, in order to define its own role with respect to the ethical aspects of scientific practices, the International Council for Science (ICSU) is undertaking a strategic review of the rights and responsibilities of Science and Society.
Initiatives connected to international conventions
At the Fifth Review Conference of the Biological and Toxin Weapons Convention (BWC) in November 2002, States Parties to the Convention agreed to have an annual expert meeting until the next review conference in 2006. Participants at these meetings will discuss and promote common understanding and effective action at the national level on several issues, and the 2005 meeting is planned to include discussion of the ‘content, promulgation and adoption of codes of conduct for scientists’. Recent advances in biotechnology and genetic research have created difficulties for policymakers because most of the technologies involved in this research have dual-use potential. The annual meetings of experts can facilitate the exchange of ideas on best practices, increase awareness of biological warfare, create linkages among various organisations, assess changes in biotechnology, and help states with limited resources meet their treaty obligations. The limited mandate of these meetings provides an opportunity for the participation of outside organisations, which can often take initiatives state parties cannot. The meetings could also help create new linkages and networks among policymakers and academia, industry, and intergovernmental and non-governmental organisations. It was felt that Pugwash should be involved in these activities.
Efforts to develop codes of conduct and ethics, in addition to scientific oaths, have been underway in scientific and non-governmental organisations for many years now. The issue of dual-use technology has become more salient and pressing, and recent events, such as the unexpected development of a lethal mousepox virus in Australia, have produced a sense of urgency. As an example, it was mentioned in our meeting that the American Society for Microbiology, which has long had a code of conduct, revised it after the events of September 2001 to include a statement that scientific work should be for the ‘benefit of humanity’ and that scientists have an obligation to report illegal activities. It was noted that the latter policy can cause problems if reports of illegal activities reflect prejudice or are misused to pursue private grudges.
There is in general broad support for the concept of codes of conduct, but no consensus on what form they should take. It is widely recognised, however, that truly effective codes can only come from within the scientific community and cannot be imposed from above. Scientists should, therefore, take a proactive approach. The first task will be the identification of relevant stakeholders, a list that may include more than initially envisaged. The introduction of such a code could have implications in all scientific areas, not only biology. It is important that the scientific communities identify the strategic elements to be discussed (such as responsibilities and liabilities, whistleblower provisions, and implications for scientific freedom). There was agreement that codes of conduct should not be too general or abstract, but, as mentioned above, should be designed with the full participation of all stakeholders. It was further agreed that attention should be given to procedures for reporting illegal activities, including protection for those who report.
Scientists and policymakers have identified codes of conduct as an important element of a holistic approach to addressing biological weapons proliferation. At this intersection of science and security, the concept of neutral science is clearly outdated and cannot justify failures to take responsibility for the possible hostile use of biology. At the same time, however, members of the global society should ensure that security concerns do not unduly or unnecessarily constrain research with great potential benefit for humanity. Several participants mentioned that this initiative could be applied to fields other than biological research; it was also argued that codes of conduct for scientists should also be codes of conduct for all citizens.
Another example of action in the context of international conventions is the Ethics Project of the Organisation for the Prohibition of Chemical Weapons (OPCW). Under the Chemical Weapons Convention (CWC), each state party undertakes broad responsibilities never to develop, produce, otherwise acquire, stockpile, or retain chemical weapons, nor to transfer, directly or indirectly, chemical weapons to anyone–i.e., all citizens of state parties are covered. The CWC sets forth a set of ethical issues which are of importance for professionals. Increased awareness about chemical weapons among professional and scientific communities will further the object and purpose of the Convention and ease its implementation.
The purposes of the Ethics Project are to promote the development of a code of conduct among chemistry and engineering professionals consistent with the object and purpose of the Convention; to identify scientific issues related to chemical disarmament; to develop networks linking chemists to disarmament; and to develop educational programmes. In order to target the relevant students and professionals, scientific societies, professional institutions, and universities should develop public awareness of the ethics involved in their respective fields of study. Two components of the OPCW will implement this policy: the Scientific Advisory Board and the national authorities of each State Party. An information package has been prepared and disseminated to all universities and relevant institutions. Many professional associations at both international and national levels have already adopted codes of ethics, some of which incorporate references to chemical weapons. Therefore, since this project is in its preliminary stage, rather than striving to develop brand new fundamental principles, effort should be made to thoroughly study available ethical norms and their potential for incorporation into the existing codes. The goal of this project is to reach each scientist and student through the relevant professional institutions. Given adequate time and resources, this goal is attainable.
Unlike the CWC, the Biological Convention does not prohibit research; rather, the prohibited activities start with weapons development. The two conventions have very different levels of resources for their implementation. For example, the BWC does not have a secretariat, most countries do not yet have implementing legislation for the BWC, and only five countries implement it effectively. The codes of conduct that do exist do not mention weapons or military work.
The CWC is a very complex instrument affecting the life of many sectors of the economy. The Ethics Project already has an early warning function, and by developing awareness and knowledge, it will help provide a means of avoiding situations requiring sanctions, which in many cases would be impossible to apply. It was noted that no significant breach of the Convention has been detected, and no challenge inspection has been requested. The vast majority of the State Parties are developing countries with no relevant chemical industry, scarce resources to implement the convention, and other more urgent priorities. This is where this project is important.
The question of riot control agents was discussed. Although the Parties to the Convention must report the substances used for law enforcement, there is no verification provision. The 2003 First Review Conference of the 1993 Chemical Weapons Convention failed to address adequately the question of calmative and incapacitating chemical agents, despite the importance of the subject. It is known that a number of countries, in particular the United States, are developing such agents for law enforcement and possibly armed conflict. In this regard, the Moscow theatre siege, which saw the use of a fentanyl-like agent, was an important event and showed that so-called “non-lethal” agents carry a significant mortality when used on the general public. It is of great concern that no government formally expressed concern about the use of this gas in the context of the Chemical Weapons Convention.
Other initiatives for codes of conduct
In the category of initiatives outside international conventions, we discussed the public appeal, ‘Biotechnology, Weapons and Humanity’, launched in September 2002 by the International Committee of the Red Cross. The appeal calls for a wide range of actors to work to prevent the misuse of biotechnology and the spread of infectious disease (www.icrc.org). It draws attention to potential risks brought by advances in biotechnology and identifies the responsibilities of governments and the scientific community to ensure that such advances are used to benefit humanity. The last appeal launched by the ICRC was in 1918 after the use of chemical weapons.
The appeal presumes that violence can be prevented by limiting the ways in which victims are vulnerable and the ways in which weapons can be designed, produced, transferred to the user, and used. Prevention can address any of these points; e.g., vulnerability to deadly disease can be addressed through public health preparedness. Similarly, the design, production, transfer, and use of biological weapons can be prevented by stronger international legal instruments, better intelligence, increased awareness within the scientific, defence, and industrial communities, codes of conduct, and awareness of the dual use potential of biotechnological advances. Each such measure is a necessary step but not, in itself, sufficient to reduce this risk. To be effective, all measures should work in synergy and enhance each other in a ‘Web of Prevention’. The importance of education at all levels was stressed. This appeal has had resonance at the individual level, but it seems that institutions are not interested.
The Peace Pledge initiative in Japan was also discussed. The pledge campaign was motivated by Japan’s growing technical nuclear capabilities and ambiguities about Japan’s nuclear intentions, despite its non-nuclear principles and legal constraints. The initiative asks individual scientists and engineers to pledge not to engage in any activities on nuclear weapons and other weapons of mass destruction. As a commitment to peace, it is also to increase the awareness and responsibility of each scientist and to enhance dialog among scientists. A bottom-up approach was adopted to better reach individual scientists.
The results have not been encouraging. First, there has been negative feedback, with people arguing that Japan should not give up its nuclear option. This is seen as a signal that the long-standing rejection of nuclear weapons by the Japanese people may be weakening. Second, the number of signatories is quite low (fewer than 100 in Japan, mostly non-scientists). There seem to be three major reasons for this: a lack of interest, combined with the attitude that scientists should not be involved in political activities; the organisational culture of Japan, in which individuals are willing to sign only if their organisation signs; and the changing security climate in Japan due to North Korea’s nuclear capabilities. It was noted that the younger generation in Japan does not feel the same degree of moral responsibility to oppose nuclear weapons as previous generations. And scientists are reluctant to adopt codes of conduct because nuclear weapons are already covered by international law.
Education
Education in ethics was not discussed separately but was prominent in many of the issues discussed. Very few educational institutions include ethical training in their curricula. There was agreement that scientific studies, including military studies, should include a presentation of these issues. At present, time limits during training, plus early specialisation and professionalism (which tends to dictate that students should focus only on their research), are barriers to including ethical considerations in research education. Participants debated several questions, including whether the courses should be mandatory or not, at which stage of training they should be taught (from the very beginning of training or when the student starts doing research), who should teach these courses (historians of science, sociologists, or practitioners themselves), and the content of these courses. It was proposed that ethics should not come too early and could be incorporated into the final year of study in professional schools. Courses on science, technology and society, in contrast, would be valuable early in training.
In the context of chemical and biological warfare research, it was suggested that Pugwash should write to deans of universities and encourage them to include these issues in standard biology and chemistry courses. It was noted that few practising professionals understand the relevance of ethics courses in professional training, and most see them as unnecessary. The problem, therefore, is to build broad-based agreement that ethics is an important issue and to get students to attend the courses. It was noted that academic societies are making progress in acknowledging that something has to be done in this respect. It was also mentioned that this issue is not restricted to the natural sciences; moral and ethical content should also be included in social science courses.
The issue of a licence to practise for scientists, modelled on those for practising medical doctors, was raised. There are arguments on both sides of this issue, and it was suggested that national academies should look into the difficult question of whether a licence to generate knowledge would be useful in the current world of techno-science.
Follow-up
At numerous points during the workshop we discussed ways in which Pugwash could undertake activities to promote a larger role for ethics in science. It was pointed out that many of these issues had been discussed in Pugwash in the past, and that we should build on that base. Some of the suggestions were not controversial; others elicited considerable discussion and differing opinions. For example, Pugwash could join with other groups, such as the International Council of Scientific Unions, to promote codes of conduct. Against this suggestion it was argued that Pugwash has in the past had a principle of not associating with other organisations, even when their goals were similar, preferring instead to work mostly behind the scenes. Pugwash could, on its own, issue a letter on the ethical responsibilities of scientists or sponsor a book on the subject that would lay out the choices available.
Another idea was to sponsor a website for discussion of ethical issues among scientists. Others argued, however, that conversations within the workplace were much more likely to have an impact than interactions over the Internet. Alternatively, a website could be used to share information about educational initiatives and materials suitable for use in ethics training.
A third area in which Pugwash could play a role would be to work to institutionalise early-warning functions against the misuse of science: by seeking to strengthen international arms control conventions, providing information about the physical and social risks of new technologies, and increasing societal support for whistleblowers.