Ever Wonder Who Invented the Cleanroom?

There wasn’t always an industry-standard approach to protected computing environments. Rather, the cleanroom was pioneered by one man, who is being posthumously honored by the National Inventors Hall of Fame.

In the 1960s, the United States’ biggest political enemy was the Soviet Union. But in the race to build deadlier nuclear weapons, sometimes the enemy was smaller than one-tenth the size of a human hair.

Sandia National Laboratories, in Albuquerque, New Mexico, focused on engineering production. Formerly Los Alamos’s “Z-Division,” it was spun off by Robert Oppenheimer into an independent laboratory in 1945. (Geek trivia: the US wasn’t fighting Russian zombies. The cool sci-fi moniker actually derives from the name of the division’s first director, Jerrold Zacharias.)

In the immediate post-war era, Sandia focused primarily on weapons development, pushing the limits on miniaturization in both mechanical parts and transistors. But as objects became smaller and tolerances for particle contamination dropped, the challenges became larger.

“A particle that’s only one-tenth of a micron—a tenth of the width of your hair—could short out a 1960s transistor,” says Gil Herrera, Sandia’s director of microsystems science, technology, and components. If more particles were involved, even completely mechanical parts could fail.

The cause, explains Herrera, is particle distribution. When a mechanical assembly is being put together, whether it’s a Swiss watch or a weapons control system, particles fall evenly onto it. But once it is encased and operating, electrostatic and other forces cause the particles to attract each other, eventually gathering in clumps big enough to jam the system.

The problem had been known since the 19th century, when particles clogged the delicate works of precision watches. By World War II, an elaborate system of defenses had been developed, from purpose-built clothing that didn’t shed particles (the precursors of “bunny suits”), to vigilant washing of surfaces, to vacuums with HEPA (high-efficiency particulate air) filters, which were developed as part of the Manhattan Project “to remove fissionable particles from the air.”

By the mid-twentieth century, these standards had dropped particle counts to one million particles per cubic foot of air. While that was unprecedented, it was still not good enough for the tolerances required by transistors and modern electromechanical miniaturization.

Sandia’s Willis Whitfield, honored posthumously this year by the National Inventors Hall of Fame, had the great insight that no new technology was needed to solve the problem. Instead, Whitfield suggested a re-thinking of how to use the existing ingredients. Like a chef using butter, flour, and eggs to make a soufflé rather than a cake, he added more air.

In Whitfield’s scheme, called the “laminar flow” cleanroom, air is continuously exchanged through HEPA filters. In a prototype, the filtered air was pushed horizontally, but the standard configuration became air flowing down from the ceiling and out through the floor. Whitfield also included additional air to compensate for the oxygen workers would breathe in.

The scheme sounds simple and obvious, until you think about its Achilles’ heel: What would it really be like to work in a room with air constantly blowing through? Nearly everything in a cleanroom is subject to disturbance by air currents, from papers on desks to the parts under assembly. Cautions Herrera, “Some of these components are not significantly bigger than glitter.”

The success of the method lies in Whitfield’s calculations establishing the minimum flow of air required. It turns out that exchanging a room’s air ten times a minute requires a flow of only about one mile per hour, a rate that is virtually undetectable by the room’s occupants and won’t dislodge tiny components.
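For readers who want to check the arithmetic, here is a rough back-of-the-envelope sketch in Python. The nine-foot ceiling height is an assumption for illustration (the article gives only the velocity and the exchange rate); with downward laminar flow, a room’s air turns over once each time the flow travels from ceiling to floor.

    # Rough check of Whitfield's airflow figure. The ~9 ft ceiling height is an
    # assumed value for illustration; the 1 mph velocity and the "ten times a
    # minute" target come from the text above.
    MPH_TO_FT_PER_MIN = 5280 / 60  # 1 mph = 88 feet per minute

    def air_changes_per_minute(velocity_mph: float, ceiling_height_ft: float) -> float:
        # With ceiling-to-floor laminar flow, the air is swept once each time
        # the flow travels the full ceiling height.
        velocity_fpm = velocity_mph * MPH_TO_FT_PER_MIN
        return velocity_fpm / ceiling_height_ft

    print(round(air_changes_per_minute(1.0, 9.0), 1))  # -> 9.8, roughly ten exchanges a minute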

Whitfield published his results in 1962. Yet, despite the positive reception from many in the industry, a year later he was publicly accused of fraud at a trade meeting by several manufacturers who were skeptical that his claimed reduction could actually have been achieved. In one of the most famous anecdotes in the field, an audience member from Bell Labs agreed with them that the calculations in Whitfield’s paper were terribly wrong: when Bell Labs had tried to replicate his results, they’d discovered he’d under-reported the reduction by a factor of ten!

Refinements to Whitfield’s design set the standard in the semiconductor industry for a Class 1 cleanroom, in which there is no more than one ¼-micron particle per cubic foot of air. Any engineer using standard pressure equations, gas laws, and the like can calculate how to scale up to even the largest modern cleanrooms, which can be as large as a million square feet (for electronic devices, constant temperature and humidity must also be maintained), says Herrera. “I’ve seen the original unit: It looks like a small storage box, but the basic principles are pretty much the same. You flow air down; any other direction, all it would do is push the particles around.”
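Scaling the same principle up really is mostly bookkeeping. A rough illustration follows; the million-square-foot figure is the upper bound cited above, and treating the entire ceiling as one continuous filter bank is a simplifying assumption, not how real facilities are plumbed.

    # Illustrative scaling of the downward-flow principle. Treating the whole
    # ceiling as filter area is a simplification; real cleanrooms divide the
    # flow across many HEPA filter banks and recirculate most of the air.
    DOWNWARD_VELOCITY_FPM = 88      # ~1 mph, from Whitfield's figure above
    FLOOR_AREA_SQ_FT = 1_000_000    # the largest modern cleanrooms cited by Herrera

    # Flow continuity: volumetric flow = face velocity x filter (ceiling) area.
    required_flow_cfm = DOWNWARD_VELOCITY_FPM * FLOOR_AREA_SQ_FT
    print(f"{required_flow_cfm:,} cubic feet of filtered air per minute")  # 88,000,000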

Image Source: Sandia National Labs

Nevertheless, there’s an ironic ending to this story. Modern cleanrooms actually allow more particles in the air than Whitfield’s original design did. The reason? “The nature of these devices has become so sophisticated and the cost of making them so expensive that the wafers are now twelve inches in diameter,” says Herrera. “In Willis’s time, the wafers were one inch in diameter, but the industry has gone to automation. A wafer is rarely exposed to the cleanroom environment, because it’s always in these sub-environments within these tools.”
