Origins
Ernest Rutherford, widely regarded as the father of nuclear physics,[11]
is credited with splitting the atom in 1919.[12]
His team in England bombarded nitrogen with naturally occurring alpha
particles from radioactive material and observed a proton emitted with
energy higher than that of the alpha particle. In 1932, two of his students,
John Cockcroft and Ernest Walton, split the atomic nucleus by entirely
artificial means for the first time, using a particle accelerator to
bombard lithium with protons and produce two helium nuclei.[13]
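In modern notation, the 1919 nitrogen transmutation and the 1932 lithium experiment correspond to the nuclear reactions

$$^{14}_{7}\mathrm{N} + {}^{4}_{2}\mathrm{He} \rightarrow {}^{17}_{8}\mathrm{O} + {}^{1}_{1}\mathrm{H}, \qquad {}^{7}_{3}\mathrm{Li} + {}^{1}_{1}\mathrm{H} \rightarrow 2\,{}^{4}_{2}\mathrm{He}.$$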
After James Chadwick discovered the neutron
in 1932, nuclear fission was first achieved
experimentally by Enrico Fermi in Rome in 1934, when his
team bombarded uranium with neutrons, although the fission
went unrecognized at the time.[14]
In 1938, the German chemists Otto
Hahn[15]
and Fritz Strassmann, along with the Austrian
physicists Lise Meitner[16]
and Meitner's nephew Otto Robert Frisch,[17]
conducted experiments with the products of neutron-bombarded uranium.
They determined that the relatively tiny neutron had split the nucleus of
the massive uranium atom into two roughly equal pieces, a
surprising result. Numerous scientists, Leo Szilard being one of the first, recognized
that if fission reactions released additional neutrons, a
self-sustaining nuclear chain reaction could result. This spurred
scientists in many countries (including the United States, the United
Kingdom, France, Germany, and the Soviet Union) to petition their
governments for support of nuclear fission research.
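The chain-reaction possibility arises because each fission typically releases two or three new neutrons; one representative channel, for example, is

$$^{235}_{92}\mathrm{U} + n \rightarrow {}^{141}_{56}\mathrm{Ba} + {}^{92}_{36}\mathrm{Kr} + 3n + \approx 200\ \mathrm{MeV}.$$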
In the United States, where Fermi and Szilard had both emigrated,
this led to the creation of the first man-made reactor, known as Chicago Pile-1, which achieved criticality
on December 2, 1942. The work became part of the Manhattan Project, which built large reactors at the Hanford
Site (formerly the town of Hanford, Washington) to breed plutonium
for use in the first nuclear weapons; the plutonium weapon was used on
Nagasaki, while the Hiroshima bomb used uranium from a parallel
enrichment effort.
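The Hanford reactors bred plutonium through neutron capture on uranium-238, followed by two beta decays:

$$^{238}_{92}\mathrm{U} + n \rightarrow {}^{239}_{92}\mathrm{U} \xrightarrow{\beta^-} {}^{239}_{93}\mathrm{Np} \xrightarrow{\beta^-} {}^{239}_{94}\mathrm{Pu}.$$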
After World War II, fear that reactor research
would encourage the rapid spread of nuclear weapons technology, combined with
the expectation among many scientists that development would be a long
road, led the United States government to keep reactor research under
strict control and classification. In addition, most
reactor research centered on military purposes. An
arms and development race followed after the United States declined
proposals from much of its own scientific community to form an
international cooperative to share information and control nuclear
materials. By
2006, things had come full circle with the Global Nuclear Energy
Partnership (see below).
Electricity was generated for the first time by a nuclear reactor on
December 20, 1951, at the EBR-I experimental station near Arco,
Idaho, which initially produced about 100 kW (EBR-I was
also the first reactor to experience a partial meltdown, in 1955). In 1952, a report by the Paley
Commission (The President's Materials Policy Commission) for
President Harry Truman made a
"relatively pessimistic" assessment of nuclear power, and called for
"aggressive research in the whole field of solar
energy."[18]
A December 1953 speech by President Dwight Eisenhower, "Atoms for Peace," emphasized the useful harnessing of the
atom and set the U.S. on a course of strong government support for
international use of nuclear power.