This article discusses the mathematics of the Manhattan Project, which developed the first nuclear reactors and atomic bombs and established high expectations for the effectiveness of mathematical modeling and computer simulations that continue to the present day. It argues that the mathematics used in the Manhattan Project was unusually easy compared to the mathematics behind other major inventions and discoveries, which has led to unrealistic and often disappointed expectations for subsequent “New Manhattan Projects.”

The Manhattan Project is remarkable in the history of invention and discovery for the number of scientific and technological advances made in a short period of time (1939-1945) and for the fact that the first full system test, the detonation of the first implosion plutonium bomb at the Trinity Test Site on July 16, 1945, was a success. It is a case in which theoretical calculations and simple numerical simulations, using electromechanical devices such as Marchant Calculators and IBM punched-card machines with much less computing power than a 1980s Apple II computer, were seemingly able to correctly design extremely destructive weapons that worked right the first time.

The achievements of the Manhattan Project include the first nuclear reactors, the enrichment and production of the explosive Uranium-235 isotope in large quantities, the discovery of the element plutonium, the bulk production and chemical purification of the explosive Plutonium-239 isotope, and working atomic bomb designs for both uranium and plutonium, based on substantially different principles. All of this was accomplished in about seven years; most of the work took place between 1942 and July 1945. Comparable advances in rocketry, for example, took place between World War I (1914) and Sputnik (1957), a span of over forty years.

The remarkable, perhaps unprecedented, success of the Manhattan Project established high expectations for so-called “Big Science” and for the effectiveness of theoretical calculations, mathematical models, and computer simulations. Although most attempts to replicate the success of the Manhattan Project since World War II have failed, the Manhattan Project continues to exert a powerful influence over the expectations of the general public, policy makers, and scientists. For example, a Google search for the phrase “New Manhattan Project” on July 6, 2013 returned 188,000 hits. These include numerous big budget proposals to address the current high energy and gasoline prices.

This article argues that, despite their dramatic effects and public impact, the atomic bombs were probably quite simple devices, with relatively few parts and a particularly simple geometry, which made mathematical modeling unusually effective compared to most major inventions and discoveries before or since. In addition, the extremely hazardous sub-critical assembly tests that were performed made it possible to test and verify the mathematical models just short of an actual explosion, which helped avoid the many failed full system tests found in the history of other inventions and discoveries. In part for this reason, the Manhattan Project is a poor guide for modern research and development programs, including those aimed at solving the current energy shortages.

**Lies, Damned Lies, and National Security**

A major difficulty in understanding and evaluating the unusual success of the Manhattan Project is that it remains a highly classified research and development program. To this day, many aspects of the program including the exact designs of the Little Boy uranium bomb used on Hiroshima and the Fat Man plutonium implosion bomb used on Nagasaki remain classified. It remains possible that some parts of the official story are false, whether for legitimate national security reasons or other reasons.

The US government and the leaders of the Manhattan Project appear to have prepared a public relations campaign months in advance of the bombings. A key aspect of this was the so-called Smyth Report released to the general public on August 12, 1945. The Smyth Report purported to be a history of the Manhattan Project and a discussion of those technical and scientific issues that could be safely revealed to the public. It included a brief foreword by General Leslie Groves, the leader of the Manhattan Project.

The Smyth Report became a bestseller in the fall of 1945 and early months of 1946. For some time, it was probably the only source of significant information on the Manhattan Project and atomic weapons in the United States.

Most significantly, from the point of view of this article, the Smyth Report established an official history that it was probably illegal for participants of the Manhattan Project to deviate from.

In particular, the foreword (signed by General Leslie Groves) to the version of the Smyth Report published by Princeton University Press contained the following warning:

All pertinent scientific information which can be released to the public at this time without violating the needs of national security is contained in this volume. No requests for additional information should be made to private persons or organizations associated directly or indirectly with the project. *Persons disclosing or securing additional information by any means whatsoever without authorization are subject to severe penalties under the Espionage Act.*

(Emphasis added)

The Smyth Report emphasized theoretical nuclear physics and the basic theory of neutron scattering and fission chain reactions, material that was already in the published physics literature or could easily be duplicated by other nations such as the Soviet Union. The Smyth Report probably made famous Einstein’s equation relating energy and mass,

E = mc²

which appears on one of the first pages of the report. Even to the present day, most popular accounts of the Manhattan Project feature this equation prominently, even though it provides no practical information on how to build an atomic bomb. By design, the metallurgy, the chemistry, and anything else that was deemed difficult for a potential enemy to duplicate was either kept entirely out of the report or discussed in very general, vague terms. Again, it is not inconceivable that incorrect information was incorporated into the report.

Historically, all or almost all major physical inventions involve large amounts of physical trial and error, usually with many full system tests before success is achieved. It is often this physical trial and error that is most costly and time-consuming to duplicate. Hence, in the case of atomic bombs, the physical trial and error and its results would often have been among the most important aspects of the program to keep secret. Thus, the Smyth Report probably had the effect of emphasizing the role of theoretical calculations in nuclear physics and creating the impression of a rather straightforward development of the bombs from theoretical calculations to working prototypes, especially since there seem to have been no failed full system tests.

**Are atomic bombs simple?**

The open/published literature suggests that the atomic bombs were actually rather simple devices with either a simple spherical geometry (Fat Man) or a simple cylindrical geometry (Little Boy). At present, many experts believe that the best unclassified reconstruction of the weapons is found in John Coster-Mullen’s *Atom Bombs: The Top Secret Inside Story of Little Boy and Fat Man*. His proposed designs for both Little Boy and Fat Man are quite simple, with Fat Man in particular having fewer than twenty major parts and spherical symmetry.

It is important to understand the significance of simplicity for mathematical modeling. A rocket engine, for example, has tens of thousands of complicated parts, including valves and pumps. Consequently, simulating a rocket engine is difficult even with modern supercomputers, let alone the primitive tools available to the Manhattan Project. Even the internal combustion engines of the 1940s may have been far beyond the atomic bombs in complexity and difficulty to simulate, both because of a larger part count and more complex geometries.

One may ask: if atomic bombs are so simple, why don’t more nations and groups have them? It is likely that the hard part of building an atomic bomb is producing bulk quantities of the explosive enriched Uranium-235 and Plutonium-239 radioactive isotopes used in the bombs. This is likely the reason for Israel’s June 7, 1981 attack on the Osirak nuclear reactor in Iraq and the alleged recent joint US-Israeli cyberattack on Iran’s uranium enrichment centrifuges.

**Assume a spherical cow**

There is a family of jokes about physicists known as “spherical cow” jokes. In a typical spherical cow joke, a biologist or other non-physicist, an engineer, and a physicist are asked some question about a cow or herd of cows, such as how much milk they will produce. The biologist gives some sort of verbal, airy-fairy biology-type answer. The engineer gives a practical, hard-headed engineer’s answer. Finally, the physicist gives his answer, which starts with “First, we assume a spherical cow…”.

These jokes have a serious point about the oversimplification of the real world by physicists. In particular, in their formal training, physicists are frequently taught to solve a range of theoretical equations, such as Maxwell’s Equations for electromagnetism, for very simple and often unrealistic geometries, especially spheres but also sometimes cylinders and other simple geometric forms.

The implosion bomb design used for the plutonium bombs appears to have consisted of a series of nested nearly perfect concentric spherical shells of materials, forming a solid sphere. The outer shell consisted of high explosives used to compress the spherical charge of plutonium to the critical density at which an atomic explosion would occur. The pictures of Gadget (the Trinity Bomb) and Fat Man (the Nagasaki bomb) above probably show the outermost casing of the spherical implosion bombs.

It is important to understand that many mathematical systems are much easier to solve if they have spherical symmetry and are converted to spherical coordinates. In many cases, these problems can be reduced from three dimensions to a much simpler one-dimensional problem. This is one of the reasons that physicists are notorious for trying to treat cows as perfect spheres. But, in the case of the implosion bombs, the spherical assumption was likely valid.
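To make this concrete, here is a minimal sketch of how spherical symmetry collapses a criticality estimate to a one-dimensional formula. It uses textbook one-group neutron diffusion theory for a bare sphere; all of the material constants below are hypothetical round numbers chosen purely for illustration, not real weapons data:

```python
import math

# Illustrative one-group diffusion-theory estimate of the critical radius
# of a bare sphere. The constants are HYPOTHETICAL, for illustration only.
D = 1.0            # diffusion coefficient (cm)
Sigma_a = 0.06     # macroscopic absorption cross-section (1/cm)
nu_Sigma_f = 0.12  # neutron production rate, nu * Sigma_f (1/cm)

# With spherical symmetry the 3-D diffusion equation reduces to a 1-D
# radial equation, and the bare-sphere criticality condition becomes
#   (pi / R)^2 = (nu_Sigma_f - Sigma_a) / D
buckling = (nu_Sigma_f - Sigma_a) / D
R_critical = math.pi / math.sqrt(buckling)
print(f"critical radius ~ {R_critical:.1f} cm")
```

The point of the sketch is that a problem which in general requires a three-dimensional numerical solution reduces, under the spherical assumption, to a single closed-form expression in the radius.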

**Assume a Cylindrical Cow**

According to John Coster-Mullen’s reconstruction, Little Boy consisted of a cylindrical “gun barrel” with two matching cylindrical charges of Uranium-235 which were slammed together by explosives.

In many cases, mathematical systems with cylindrical symmetry are much easier to solve than general systems after converting to cylindrical coordinates. In many cases, these problems can be reduced from three dimensions to two dimensions, or even one dimension which is generally much easier to solve. This is why, after spheres, physicists are trained to solve mathematical problems with cylindrical symmetry, e.g. cylinders such as pipes, in cylindrical coordinates.
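As a hedged sketch of the same idea in cylindrical geometry, the textbook one-group diffusion estimate for an infinitely long bare cylinder replaces pi with the first zero of the Bessel function J0 (about 2.405); the material constants are again purely illustrative, not real data:

```python
import math

# Illustrative one-group diffusion estimate for an infinitely long bare
# cylinder. Constants are HYPOTHETICAL round numbers, for illustration only.
D = 1.0            # diffusion coefficient (cm)
Sigma_a = 0.06     # macroscopic absorption cross-section (1/cm)
nu_Sigma_f = 0.12  # neutron production rate, nu * Sigma_f (1/cm)

buckling = (nu_Sigma_f - Sigma_a) / D

# With cylindrical symmetry the radial equation is a Bessel equation, and
# criticality requires (j0_1 / R)^2 = buckling, where j0_1 ~ 2.405 is the
# first zero of the Bessel function J0.
j0_first_zero = 2.405
R_cylinder = j0_first_zero / math.sqrt(buckling)
print(f"critical radius (infinite cylinder) ~ {R_cylinder:.1f} cm")
```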

Although Little Boy was probably not as easy to model as Fat Man, it was still a very simple, mathematically tractable geometry, far simpler than that of many mechanical inventions such as the nautical chronometers used to measure longitude or rocket engines.

**The Suicide Club**

A major problem with mathematical modeling is knowing that the model is correct. In the case of physics, the underlying theory may be wrong. In addition, the actual implementation of the model, today as a computer program, may be incorrect. In practice, both problems may be present. How did the scientists and engineers at the Manhattan Project know they had the right predictions for the Trinity bomb? Actually, they weren’t sure. In most cases, it is necessary to compare the model with actual physical tests or experiments, often full system tests of some kind.

In the case of the atomic bomb, the first full system test worked, which is unusual although not unheard of in the history of invention and discovery. In the particular case of the atomic bombs, it was possible to test the accuracy of the theoretical models of neutron scattering and the fission chain reaction by assembling, by hand, almost-critical amounts of Uranium-235 or Plutonium-239 (a critical mass is the amount of Uranium-235 or Plutonium-239 that will initiate a chain reaction and explode), stopping just short of an explosion. This was an extremely dangerous procedure that killed at least two people.

It is actually an unusual situation in mathematical modeling where it is possible to test the mathematical model almost fully without a full system test. In many cases, that cannot be done. A full system test, often many full system tests, is required.

It is perhaps worth pausing for a moment to consider the personal courage and extreme fear of a German/Nazi victory that motivated people to perform such a risky procedure, especially with a new, poorly understood physical phenomenon like nuclear fission.

**Conclusion**

The Manhattan Project had, and continues to have, a powerful influence on expectations for science in general and specifically for the effectiveness of theoretical calculations and, today, computer simulations. It appears to be an example in which theoretical physicists such as Hans Bethe and Richard Feynman were able to make calculations and correctly design and predict the operation of extremely destructive devices that worked right the first time. This remains a powerful ideal not only in the minds of the general public and policy makers, but also in the minds of practicing scientists.

There have been many attempts to replicate the spectacular success of the Manhattan Project since World War II. Most of these “New Manhattan Projects” have failed. Probably the most prominent poster child for the failed New Manhattan Projects remains the War on Cancer, which has consumed roughly ten times the budget of the Manhattan Project with very disappointing results. Many attempts by physicists to replicate the Manhattan Project in physics have also failed or produced disappointing results. Most recently, the National Ignition Facility at Lawrence Livermore National Laboratory, a fusion power project, has suffered massive cost and schedule overruns and disappointing results in the now decades-old effort to achieve practical nuclear fusion power.

The historical record of the last sixty years suggests that the Manhattan Project was anomalous in comparison to other major inventions and discoveries, both before and since. On close examination, it appears likely that mathematical modeling was unusually successful in the Manhattan Project due to the relative simplicity (low part count) and simple geometry of the first atomic bombs. In addition, the sub-critical assembly tests made it possible to validate the mathematical models without failed full-system tests, which is unusual and created high expectations for mathematical modeling in the future.

© 2013 John F. McGowan

**About the Author**

*John F. McGowan, Ph.D.* solves problems using mathematics and mathematical software, including developing video compression and speech recognition technologies. He has extensive experience developing software in C, C++, Visual Basic, Mathematica, MATLAB, and many other programming languages. He is probably best known for his AVI Overview, an Internet FAQ (Frequently Asked Questions) on the Microsoft AVI (Audio Video Interleave) file format. He has worked as a contractor at NASA Ames Research Center involved in the research and development of image and video processing algorithms and technology. He has published articles on the origin and evolution of life, the exploration of Mars (anticipating the discovery of methane on Mars), and cheap access to space. He has a Ph.D. in physics from the University of Illinois at Urbana-Champaign and a B.S. in physics from the California Institute of Technology (Caltech). He can be reached at jmcgowan11@earthlink.net.

**Credits**

Most of the images in this article were produced by the US Government and therefore are in the public domain under US law.

The picture of the Hiroshima bombing is from Wikimedia Commons (http://commons.wikimedia.org/wiki/File:Atomic_cloud_over_Hiroshima_%28from_Matsuyama%29.jpg) It was taken from the “Enola Gay” aircraft after it dropped the bomb and is in the public domain.

The picture of the cover of the Smyth Report is from Wikipedia (http://en.wikipedia.org/wiki/File:Smyth_Report.jpg) and is in the public domain.

The picture of Fat Man being assembled at Tinian is from Wikimedia Commons (http://commons.wikimedia.org/wiki/File:Fat_Man_Assembly_Tinian_1945.jpg) and is in the public domain.

The picture of Gadget (The Trinity Bomb) is a still frame from a YouTube video of the movie of the Trinity Test taken by the US Government; it should be in the public domain.

The picture of spherical coordinates is from Wikimedia Commons (http://commons.wikimedia.org/wiki/File:Spherical_Coordinates_%28Colatitude,_Longitude%29.svg) and is in the public domain.

The picture of Little Boy is from Wikimedia Commons (http://commons.wikimedia.org/wiki/File:Little_boy.jpg) and is in the public domain.

The picture of cylindrical coordinates is from Wikimedia Commons (http://commons.wikimedia.org/wiki/File:Cylindrical_Coordinates.svg) and is in the public domain.

**References and Suggested Reading**

David Samuels, “Atomic John,” The New Yorker, December 15, 2008 (article about John Coster-Mullen).

“New Manhattan Project for Energy Independence.”

“Forbes Introduces New Manhattan Project to Tackle Energy Dependence, Rising Gas Prices.”

William Schreiber, “Is It Time for a New Manhattan Project?” MIT Faculty Newsletter, September/October 2007.

David Woolner, “Time for a New Manhattan Project?” Next New Deal: The Blog of the Roosevelt Institute (blog post), April 1, 2011.

Deborah D. Stine, “The Manhattan Project, the Apollo Program, and Federal Energy Technology R&D Programs: A Comparative Analysis,” CRS Report for Congress, prepared for Members and Committees of Congress, June 30, 2009.

Thomas J. Espenshade and Alexandria Walton Radford, “A New Manhattan Project,” Inside Higher Ed, November 12, 2009.

“The New Manhattan Project: Q&A with NYGC Scientific Director Robert Darnell,” Bio-IT World, November 28, 2012.

Lamar Alexander, “A New Manhattan Project for Clean Energy Independence: The United States Must Marshal Its Resources and Talent to Tackle the Challenge of Coping with Climate Change,” Issues in Science and Technology, Vol. 24, No. 4.

Thom Hartmann Program: Full Show 2/8/13: A New Manhattan Project.

Fox News: “Needed — A New Manhattan Project for Bio-Defense.”

**Comments**

“The biologist gives some sort of verbal, airy-fairy biology type answer.” Yeah, probably silly stuff like liters, rates, hours, protein, fats, and the usual useless things that airy-fairy biologists talk about all the time.

“airy fairy” refers to the way the jokes are often told. It is not my opinion of biologists or the other types of scientists or people who are in the jokes.

John

Yeah, seriously. Way to take a joke.

I’m sure John doesn’t think physicists are as hyper-literal as that joke implies, either.

no.

Designing “spherical” implosion required explosive lensing to focus unstable nonlinear shock waves, in an approximately spherical geometry, to ensure focus on a time scale of a few millionths of a second. It was worked out by John von Neumann.

Balancing implosion against explosion required solving very nasty integral equations; the details are still classified. It was worked out by Stanislaw Ulam.

Preventing predetonation required an extension of Markov chain theory. It was created by Mark Kac, with help from Dick Feynman.

To evaluate certain (classified) integrals, Stan Ulam invented the first practical Monte Carlo method.

It’s not about how good your computers are, it’s about mathematical genius. Similarly, we got to the moon with very primitive computers.

The spherical design of the implosion bombs almost certainly made those integrals and calculations much easier to solve.

There was extensive physical trial-and-error testing of the implosion lens design. They would detonate the explosives, the “lens,” with a blank core (not plutonium or uranium) and see what happened. They likely did this because the calculations were not all that accurate, or because it took a lot of trial and error to achieve the near-perfect spherical symmetry required for the “spherical cow” calculations to be correct.

Despite the fancy name, Monte Carlo simulation/integration is a brute force numerical simulation procedure. It is very heavily used in physics today, but often not with the spectacular results of the Manhattan Project.

Even today it would be difficult to do a detailed Monte Carlo three-dimensional simulation of a bomb. The Monte Carlo simulation had to be greatly simplified in some way to perform it with Marchant Calculators and/or IBM punched-card machines in the 1940s.

Again, the spherical shape of the implosion bomb probably made it possible to do one-dimensional or nearly one-dimensional numerical simulations using the Marchant Calculators and IBM punched-card machines that were nonetheless sufficiently accurate when combined with the implosion lens tests and the sub-critical assembly tests.
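As a toy illustration of the brute-force sampling idea behind Monte Carlo methods (not, of course, the still-classified neutron-transport integrals), one can estimate a volume simply by throwing random points at it; in modern Python:

```python
import math
import random

# Toy Monte Carlo integration: estimate the volume of the unit sphere by
# sampling random points in the enclosing cube [-1, 1]^3 and counting hits.
random.seed(42)  # fixed seed for reproducibility

N = 200_000
hits = 0
for _ in range(N):
    x, y, z = (random.uniform(-1.0, 1.0) for _ in range(3))
    if x * x + y * y + z * z <= 1.0:
        hits += 1

estimate = 8.0 * hits / N    # volume of the cube is 2^3 = 8
exact = 4.0 / 3.0 * math.pi  # exact sphere volume, ~4.1888
print(f"Monte Carlo estimate: {estimate:.4f}  (exact: {exact:.4f})")
```

With 200,000 samples the statistical error is under one percent, which shows both the appeal of the method (no clever analysis needed) and its cost (accuracy improves only as the square root of the number of samples).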

With respect to the Apollo program, I would say the orbital dynamics calculations were relatively “easy” as such things go. The hard part was developing rocket engines capable of getting first to orbit and then to escape velocity. A rocket engine is much more complex than an orbital trajectory or an atomic bomb (probably). Hence, one finds enormous amounts of trial and error and failed full systems tests in the history of rocket development.

A physical test can avoid the need to solve the mathematics at all.

Secondly, inventors and discoverers seem to develop an intuitive feel for their problem through large amounts of trial and error, practice, perhaps in the same way that top athletes must develop an accurate but intuitive understanding of physics, even though they probably cannot express this understanding in formal mathematical equations. Hence the physical trial and error is not blind trial and error but involves educated guesses that are sometimes very accurate. This intuitive understanding and luck probably explains how people invented or discovered many things without modern computers or even mathematics in some cases.

John

The X-15 (which was a hypersonic manned rocket plane) was designed when computers were very primitive; in fact, basically using slide rules and analog computers.

Don’t forget that V2 rocket engines, with their thousands of moving parts, were also designed using slide rules and electric desk calculators.

The engines for the old Atlas were designed using very primitive computers. The new Atlas engines come from Russia; they were designed and tested in the ’70s. These engines were fifty years ahead of ours, and were designed with very primitive computers. Russian engineers knew math.

The spherical shape was only a first approximation: the shock wave hits the inner workings of the bomb, and is refracted and reflected.

The experimental work on implosion was useless, and it was the failure of the experimental team that led to recruiting von Neumann. The math made it work.

> The spherical design of the implosion bombs almost certainly made those integrals and calculations much easier to solve.

Except it was only approximately spherical, because of the parts inside. Anyway, even in one dimension, there are singular nonlinear kernel Wiener-Hopf equations. After they created the theorems to find the right convergent algorithm, they were solvable.

> Even today it would be difficult to do a detailed Monte Carlo three-dimensional simulation of a bomb. The Monte Carlo simulation had to be greatly simplified in some way to perform it with Marchant Calculators and/or IBM punched-card machines in the 1940s.

Yes, and that was a work of genius, as good error bounds had to be proved.

> A physical test can avoid the need to solve the mathematics at all.

Not so easy for nonlinear systems, unless a lot is known about the qualitative mathematics. Theorems, you know.

Susan, if you can, please post your smaller comments grouped within a single comment or two in the future.

A point of clarification on the use of “easy” in my article.

I am using “easy” in the sense that the mathematics was relatively easy compared to the mathematics that was used, or would need to have been used in lieu of physical trial and error, in other major inventions and discoveries (breakthroughs).

This is not to say the mathematics used in the Manhattan Project was, for example, easy relative to the mathematics in a college level applied mathematics course or indeed even in an advanced graduate level applied mathematics course.

Sincerely,

John

John,

In fact, new cutting-edge math had to be created for the Manhattan Project. That’s why von Neumann, Mark Kac, and Stan Ulam were needed. The same was true for rockets.

> A rocket engine is much more complex than an orbital trajectory or an atomic bomb (probably). Hence, one finds enormous amounts of trial and error and failed full systems tests in the history of rocket development.

Rockets are complex; without the mathematics, the trial and error would have increased by thousands of times.

Orbital mechanics is easy only because three centuries of mathematical geniuses made it so. They have names like Newton, Lagrange, Gauss, Hamilton… It is not as simple as the two-page chapter on Newton’s equations makes it appear. Try finding out from scratch what the minimum-fuel orbit between Earth and Mars must be. It’s a nice problem in the calculus of variations.

John,

Aside from all that, I agree with your basic point that calling for a new Manhattan Project for things like fusion reactors does not work.

Steady-state fusion reactors require very difficult unstable nonlinear free-boundary problems to be solved; the math is about five to ten years in the future. It was precisely too much faith in “trial and error,” “experimental intuition,” and “powerful computers in the absence of a good math theory” that led to sixty years of expensive failure. Inertial fusion is an expensive attempt to set up a simpler system, but it is still harder than expected.

So, I agree with your core point. Off the thread, and back to research.

The problem with inertial laser fusion is getting control of the implosion. Once again, it’s only approximately spherical, because even 122 laser beams don’t form a true spherical wave. And it is a highly nonlinear problem.

They assumed the problem could be brute-forced with experimental intuition, experimental tests, and supercomputers. Well, it doesn’t work that way. They need to prove theorems and get the right mathematical theorems first. Too many computer jocks and experimental tinkerers, not enough research mathematicians of von Neumann quality. These fusion guys never seem to learn.

We had a saying when we built something for the first time:

‘If in doubt, make it stout, out of things you know about.’

I used this principle throughout a 30+ year career to build many things – from superconducting magnets to multi-megawatt power converters to bioforming reactors to digital sound amplifiers – the first versions were rather stout and the unknowns were reduced as much as possible. But there was always some modeling that went on – Ansys was king of the magnetic hill at the time and I later used Algor and analysis tools built into CAD software (a bit dicey, there, BTW). I always looked for symmetry and simplification, but often the models were constrained by our knowledge of material and device properties, not the arithmetic.

A reasonable model and an understanding of the basic physics of the thing you are building are essential – you CAN’T build a nuclear weapon w/o understanding the conversion of mass to energy, nor can you build an implosion device based on simple symmetry – the explosives aren’t a uniform shell, to begin with, and they don’t go off at the same time, either – they are a set of very big hammers that hit at slightly different times. Anticipating uncertainty and allowing for it in your design is the mark of an experienced designer – they may not be masters in their field and things may not work the way they expect, but at least they have the humility to know that things will go wrong, and relying on a chain of iron-clad assumptions is a quick way to oblivion.

But when it came to making something that we could sell to a prophet [sic] we started taking things out – and it’s amazing how simple the final devices became. Some of this came about through modeling, but most came about through an intimate understanding of how the things worked, an understanding that couldn’t have been gained any other way than by running – and breaking – the ‘stout’ and ‘known’ prototypes.

Modeling today is a far cry from the intimate understanding of all the individual processes – and their uncertainties – that were used to design the German prototypes that became the Russian rocket engines. I could whomp up a rocket engine model on my CAD system and convince an investor it would work the first time – and every time thereafter – but that’s a fool’s game. Each part is different and will be assembled slightly differently – if I don’t allow for ‘real’ parts, my model will work great but my rocket engine will probably kill somebody (and it’s very instructive to look at the videos of failed Soviet launches and read the stories of the disasters in their program). They went for stout and simple – the US went for delicate and not simple. Who got to the moon first? Who’s on Mars today?

A stout iPhone, anybody?

What an interesting project! It really must be exploited to the full!

Can anyone recommend a math program that is well suited to advanced physics? All that I have seen, such as Mathcad, are primarily written for typical industrial engineering programs. My math skills are weak and I need the assistance of a math program, and/or one capable of advanced math, to describe the physical functions of black holes. As somewhat of a savant I can “see” all the internal functions of black holes but am unable to describe them mathematically. Any help out there?