Nuclear power's original mistake: Trying to domesticate the bomb

11:34   09 April 2017

By Stephen Mihm

In recent years, the Fukushima disaster, staggering cost overruns, and a rising tide of cheap solar power have pushed the nuclear energy industry closer and closer to the brink. Last week, the Westinghouse Electric Co., long a leader in nuclear power plant design and construction, went bankrupt. Suddenly, the demise of nuclear energy is no longer impossible to imagine.

Dwight Eisenhower, for one, would not be pleased. More than sixty years ago, the 34th president of the United States launched an idealistic quest to turn what he called the “greatest of destructive forces … into a great constructive force for mankind.” Unfortunately, his campaign, while noble in intent, didn’t pay attention to the bottom line -- a problem that has bedeviled the industry ever since.

The symbolic dawn of the atomic age took place on December 2, 1942, when the physicist Enrico Fermi and his associates achieved the first self-sustaining nuclear chain reaction with the so-called Chicago Pile: a small, ramshackle assembly made of graphite, uranium oxide, wood, and uranium metal. The achievement helped pave the way for the bombs dropped on Hiroshima and Nagasaki.

In 1946, Congress consolidated control over the nation’s nuclear energy in a civilian agency known as the Atomic Energy Commission. The Commission oversaw research on nuclear power at sites like the Bettis Atomic Power Laboratory outside Pittsburgh, though it turned to Westinghouse to run it. At this early stage, research remained focused on military applications: nuclear submarines, ships, and eventually, planes.

Westinghouse helped design and build the reactor that would eventually propel the Nautilus, the first nuclear sub. As he laid down the keel for the Nautilus on June 14, 1952, President Harry Truman also began bending the arc of atomic history toward peacetime needs at home. “This vessel is the forerunner of atomic-powered merchant ships and airplanes, of atomic power plants producing electricity for factories, farms, and homes,” he said.

The election of Dwight Eisenhower later that year helped make this vision a reality; so, too, did the members of the Joint Committee on Atomic Energy in Congress. In July 1953, Congress funded the construction of the nation’s first full-scale civilian nuclear power plant.

Much of this, as historian Paul Boyer has observed, was driven by a desire to find a “silver lining” in the mushroom cloud. While Americans had generally supported the use of nuclear weapons on Japan, the growing specter of thermonuclear war in the 1950s sparked a growing desire to find peaceful applications for the new technology that would compensate for its destructive powers.

The government sought private partners for the project, eventually settling on the Duquesne Light Company in Pittsburgh. It was a pragmatic alliance: the new plant in the nearby town of Shippingport would be very close to the Bettis Atomic Power Laboratory run by Westinghouse. Westinghouse would design and build the reactor, and Duquesne Light would build and maintain the non-nuclear portions of the facility.

Eisenhower wasted no time in trumpeting the news of the new plant. In December 1953, he appeared before the United Nations and delivered his “Atoms for Peace” speech. Eisenhower pledged that the United States would solve what he called the “fearful atomic dilemma” -- to figure out how the new technology could improve life on Earth rather than destroy it.

Foremost among these was the promise of nuclear energy. “The United States knows that peaceful power from atomic energy is no dream of the future,” he declared. “That capability, already proved, is here -- now -- today.”

The president didn’t have that quite right, though: the government didn’t even break ground for the Shippingport reactor until the following December, as one history of the project points out. When they did, though, it was a spectacle worthy of the atomic age. From distant Denver, Eisenhower appeared on television. He picked up a plastic rod with a zirconium handle and a small plastic ball on the tip. This ball, Eisenhower explained, contained small quantities of neutron-emitting polonium and beryllium.

He then waved his magic wand over a small cabinet containing uranium. As the neutrons hit the uranium, a fission counter recorded the collisions via a pointer that would slowly move from left to right, triggering a relay that would shut off electricity coursing over a telephone line that connected Denver and Shippingport. That interruption would trip another relay that would in turn activate the controls used on a power shovel at the work site.

Construction began in earnest the following spring, but was quickly beset by a problem all too familiar to today’s nuclear energy industry: cost overruns. The original price tag of $38 million proved overly optimistic; the plant eventually cost $72 million to build, with research and development costs pushing the total to $120 million.

As the project neared completion in late 1957, its directors worked hard to ensure that the reactor achieved “criticality” -- a self-sustained nuclear reaction -- on the fifteenth anniversary of the day the Chicago Pile had performed the same feat on a more modest scale.

The Shippingport plant soon began generating electricity, and the government pronounced the reactor a success. And on one level, it certainly was: it sparked the construction of dozens of nuclear power plants in the U.S. and abroad, a good number of them designed and built by Westinghouse. Everything miraculous about the nuclear power industry began at Shippingport.

But so did everything that is overwhelming it now. Beyond the construction overruns, the electricity produced at Shippingport was itself quite costly. Though Duquesne Light bought electricity from the government at the rate of 8 mills per kilowatt-hour, the actual cost of producing it ranged between 55 and 60 mills.

In succeeding decades, nuclear power costs declined, but the industry remained heavily dependent on subsidies. And in recent years, the investment cost of developing new nuclear plants ballooned from $2,065 per kilowatt in 1998 to $5,828 in 2015, according to a World Nuclear Association report.

All of this bodes ill for the future of nuclear power. But the failure of nuclear power to live up to its potential should hardly surprise us. It began as an idealistic attempt to domesticate the bomb in peacetime; the actual economic costs and benefits of doing so were a secondary concern at best.

The world has spent nearly sixty years trying to make economic sense of nuclear power, with limited success. It may be time to pull the plug.

Stephen Mihm, an associate professor of history at the University of Georgia, is a contributor to Bloomberg View.
