1.1.1 Early science and the making of the bomb
Nuclear fission was first recognised by Otto Hahn and Fritz Strassmann in Berlin in 1938. They bombarded uranium with neutrons and found that atoms of barium, roughly half the atomic weight of uranium, were produced. They sent the results to their colleague Lise Meitner, exiled in Stockholm, who worked through them with her nephew Otto Frisch. Together they used Bohr's liquid drop model to explain how the capture of a neutron had set the uranium nucleus vibrating, splitting it in two. The following year, 1939, Frédéric Joliot and his co-workers Kowarski and von Halban showed that each fission event releases further neutrons, opening up the possibility of a chain reaction; Leo Szilard had foreseen this as early as 1933 and had even patented it for the production of bombs. Also in 1939, Niels Bohr established that it was the isotope U-235, constituting only 0.7% of natural uranium, that fissioned; indeed, the physics of the vibrating nucleus was such that only the odd-numbered isotopes could be fissioned by low-energy neutrons.
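The significance of the chain reaction is easily illustrated by its geometric growth. If each fission leads on average to $k$ further fissions, a single initiating fission multiplies after $n$ generations to (an illustrative back-of-the-envelope calculation, not part of the original account):

$$N_n = k^n .$$

For $k = 2$, eighty generations give $2^{80} \approx 1.2 \times 10^{24}$ fissions. Since 1 kg of U-235 contains $(1000/235) \times 6.02 \times 10^{23} \approx 2.6 \times 10^{24}$ atoms, eighty generations suffice to fission roughly half a kilogram and, at about 200 MeV per fission, to release some $4 \times 10^{13}$ J.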
Until 1939, progress in understanding fission, and nuclear reactions generally, had been slow. But with war in Europe, scientists in America, many of them refugees, began working together secretly to see whether fission could be put to military use. America entered World War II in December 1941, and the following year the work was brought together officially under the umbrella of the Manhattan Project.
Even as early as 1939 it was clear that, so far as bomb-making was concerned, kilogram quantities of U-235 would be needed, and at that time there was no way of separating the isotopes. Based on Bohr's work, however, it was realised that odd-numbered isotopes of element 94 (later named plutonium) should also be fissile and, unlike U-235, could be isolated from uranium by purely chemical means. Experiments in which U-238 was bombarded with sub-atomic particles in the Berkeley cyclotron eventually led, in February 1941, to the separation of a minute quantity of plutonium. But in order to produce enough to manufacture a bomb, the nuclear chain reaction first had to be demonstrated.1
Enrico Fermi and Leo Szilard had been working on arrays of graphite and uranium at Columbia University and, building on this work, they created the world's first nuclear reactor at the University of Chicago in December 1942. Fermi's reactor contained 349 tonnes of graphite, 36 tonnes of uranium dioxide and 5 tonnes of uranium metal; it had a power of just 2 watts. Fermi and his team later (1946) formed the nucleus of the Argonne National Laboratory (ANL).

Scaling this up to a reactor of 250 MW was a major undertaking, but design and construction were completed in less than two years. The first of the three Hanford piles went critical in September 1944. The fuel was metallic natural uranium, clad in aluminium and loaded into horizontal aluminium tubes within a graphite moderator; cooling was provided by water from the Columbia River, pumped through the aluminium tubes. Such a reactor is capable of producing about 0.25 kg of plutonium per day. To limit the formation of Pu-240 and higher isotopes, fuel was discharged at low burn-up, which was made practicable by the ability to load and unload fuel at power. The plutonium was separated from the irradiated fuel and shipped as plutonium nitrate slurry to Los Alamos, where it was reduced to plutonium metal.
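The production rate quoted above is consistent with a rough estimate from standard figures (the 200 MeV released per fission and the assumed conversion ratio of about 0.8 atoms of Pu-239 produced per fission are illustrative textbook values, not figures from the source):

$$\dot{N}_f = \frac{P}{E_f} = \frac{2.5 \times 10^{8}\ \mathrm{W}}{3.2 \times 10^{-11}\ \mathrm{J}} \approx 7.8 \times 10^{18}\ \text{fissions s}^{-1} .$$

Over a day's operation, neutron capture in U-238 then yields $0.8 \times 7.8 \times 10^{18} \times 86\,400 \approx 5.4 \times 10^{23}$ atoms of Pu-239, about 0.9 mol or 0.21 kg, of the same order as the 0.25 kg per day quoted.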
Meanwhile, work had been progressing on the isotopic separation of U-235. Four methods were investigated: gas centrifuge, gaseous diffusion, mass spectrometry (electromagnetic separation) and liquid thermal diffusion. Mechanical problems with the centrifuges caused that technique to be abandoned, but the other three yielded useful quantities. This work was performed at the Clinton Engineer Works in Tennessee (part of which later became Oak Ridge National Laboratory, ORNL), and quantities of U-235 were shipped from there to Los Alamos for the construction of a gun-type device, in which two sub-critical masses of U-235 are quickly brought together.

For the plutonium bomb, however, it was discovered that the material supplied by Hanford contained small quantities of Pu-240, whose spontaneous fission releases neutrons that would set off the chain reaction before a gun-type assembly was complete. Consequently, a more sophisticated implosion design was needed, for which a test would be necessary. Enough plutonium had been shipped from Hanford to Los Alamos to create three bombs, and it was decided to use one for a full-scale trial in the New Mexico desert. This was the Trinity test of 16 July 1945. Three weeks later (6 August) the uranium bomb (nicknamed 'Little Boy') was dropped on Hiroshima. Three days later, the second plutonium bomb ('Fat Man') was used to destroy Nagasaki; the third plutonium bomb was never used.1
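The difficulty of the gaseous diffusion route mentioned above can be judged from its ideal separation factor for uranium hexafluoride, the process gas (a standard textbook estimate rather than a figure from the source):

$$\alpha = \sqrt{\frac{M(^{238}\mathrm{UF_6})}{M(^{235}\mathrm{UF_6})}} = \sqrt{\frac{352.0}{349.0}} \approx 1.0043 .$$

In an ideal cascade, the number of enriching stages needed to go from natural uranium ($x_f = 0.72\%$) to highly enriched material ($x_p \approx 90\%$) is

$$n \approx \frac{\ln(R_p/R_f)}{\ln \alpha} = \frac{\ln(9/0.00725)}{\ln 1.0043} \approx 1\,700, \qquad R = \frac{x}{1-x},$$

which is why the wartime diffusion plant ran to thousands of stages.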
What is clear from this brief description is that much of the applied science and technology that, even today, underpins the exploitation of nuclear energy came about as a direct result of a concerted effort to make these fearful weapons. Small wonder, then, that the public has difficulty in dissociating nuclear power from nuclear weapons.