In 1987, the US Occupational Safety and Health Administration issued the Hazardous Waste Operations and Emergency Response standards, or HAZWOPER. HAZWOPER could have landed like a bombshell in the semiconductor industry, most of which had long lacked any sort of playbook or competence in dealing with the extremely dangerous chemicals that are critical to chipmaking. But as it turned out, the industry was ready for HAZWOPER, thanks in large part to the efforts of Neal Langerman.
Through most of the 1970s, Langerman had been a chemistry professor at Utah State University, specializing in chemical hazards and human health; he had previously been on the faculty of Tufts University Medical School. Then in 1980, a chemical company called JT Baker asked Langerman to help develop and teach a chemical-safety training course aimed at managers at a major semiconductor plant in Phoenix.
Langerman was surprised to discover how lacking the plant’s existing safety processes were — and even more surprised when it turned out that the plant was ahead of most of the rest of the industry. “Most of these companies had barely begun to think about process safety,” he recalls. “There were no standards, leaving every fab to deal with safety in a disorganized, ad hoc way.”
Langerman started focusing on semiconductor industry safety. Although he would remain on the Utah State faculty for another four years, by 1981, he was doing more and more consulting and training for the industry, both through JT Baker and through a consulting company he set up called Chemical Safety Associates.
The garlic test
Langerman had his work cut out for him: In spite of its leading-edge technology and the unusually hazardous materials it worked with, the industry was still in the Wild West when it came to safety. The risks weren’t theoretical. Silane gas, used in chipmaking, is explosive — it ignites on contact with air — and facilities around the world had seen fatal explosions and fires. Some US factories used phosphorus oxychloride, which, if leaked, decomposes to a skin- and lung-burning mist of phosphoric and hydrochloric acids.
In the late 1970s, a Phoenix fab had suffered an acid-mist release that left 50 employees huddling naked in a parking lot under the spray of a fire hose. Chemicals used in other processes at some plants could decompose into deadly phosgene gas. The concerns were multiplied in 1984, when a gas leak at a Union Carbide insecticide plant in Bhopal, India, killed more than 2,000 people in hours and ultimately resulted in more than 15,000 deaths and nearly half a million people left with chronic health problems. “The semiconductor industry realized it could not risk exposing employees and others and knew it had some serious problems to solve,” says Langerman.
While human safety was the primary concern, the financial costs of shutting down a fab were sobering in their own right: At that time, the price tag for stopping production was reckoned at $1 million per minute, and the full cost of shutting down, cleaning, and restarting a fab could run into the hundreds of millions of dollars. Add in the potential for fires, regulatory penalties, and lawsuits, and it wasn’t hard to see why Langerman was suddenly in big demand. “The goal became to reduce the risk to damn near zero,” he says. By the time he left academia in 1984 to devote all his time to safety consulting, his company employed 15 Ph.D.- and master’s-level chemists and chemical engineers.
The best way to remove risk related to a hazardous chemical is to stop using the chemical. But that wasn’t an option in chipmaking, which revolves around chemically etching metallic surfaces, as well as scouring surfaces clean down to microscopic scales. To cope with its dependence on these harsh substances, the industry needed to revamp the ways it stored, used, and disposed of the chemicals. This would allow it to prevent trouble, in addition to building up its capabilities to identify and respond to leaks and other emergencies.
To help the industry make that leap, Langerman was a persistent advocate for standardizing industry-wide chemical-hazard prevention and response practices. One key development in his favor was the emergence in the early 1980s of a solution to a problem that had been haunting the industry since its birth: the lack of a technology capable of rapidly detecting a dangerous leak. Some chipmaking chemicals are so toxic that they can cause health problems in parts per billion, and there was simply no device or chemical kit that could reliably spot such tiny contaminations in the minutes that might separate a sudden leak from catastrophe. The main detection method for a leak of the highly toxic chipmaking chemical arsine, for example, was to train employees to be on the alert for the chemical’s garlicky odor — a technique that necessitated banning spaghetti sauce at fab cafeterias lest the garlic in the sauce cause a panic.
The detection gap was finally filled in the early 1980s by the introduction of “Chemcassette” technology that could pick up even trace amounts of chipmaking chemicals. Based on a technique first developed in the 1940s, Chemcassette devices relied on a strip of paper tape impregnated with chemicals that would change color with even tiny exposures to particular chemicals. An analytical tools distributor named MDA Scientific, later acquired by Honeywell, developed the Chemcassette device in 1971. But it took another decade to bring out a version sensitive to chipmaking chemicals. Besides being rapid and sensitive, notes Rick Gorny, a retired Honeywell chemical engineer who worked on the Chemcassette at MDA, the resulting stained tape provided striking evidence of a leak. “You finally had physical evidence you could plunk down on a manager’s desk to show them there was a problem,” says Gorny.
Checklists and buddies
At the same time, other companies were bringing out chemicals that could be used to quickly neutralize leaked toxins, and equipment that could more safely move the toxic chemicals in and out of the chipmaking equipment. Armed with these innovations, Langerman and his team made the rounds of semiconductor companies to help them get their processes up to speed. To clue managers into the risks of silane, Langerman would take them outdoors to demonstrate how easily a cylinder of the gas could ignite explosively — a demonstration he finally abandoned after nearly causing a brush fire outside one fab. To turn even routine maintenance workers at the plant into leak-detection technicians, he had them carry pH test strips to dip in any drops of liquid they spotted around equipment and to sound the alarm for anything but a neutral result. “Nothing was off the table, no matter how mundane,” says Langerman.
Borrowing from the airline industry and from the military, Langerman helped create safety checklists for handling chemicals, and “buddy” policies calling for two sets of eyes watching over each step. And he encouraged senior executives to get involved in the safety makeovers. “The strength of a company’s safety culture is a direct reflection of the participation of the most senior people,” he says. “If the CEO doesn’t make it a priority, nobody will.”
Through the early and mid-1980s, Langerman pushed the gospel of formalizing and standardizing safety practices. That journey would continue into the mid-1990s, he says, but the 1980s were the period of key transition. “That’s when order began to prevail over chaos in the safety realm,” he says. As a result, the industry was well along in that transition in 1987, when OSHA brought out the HAZWOPER regulations. “By that time, companies knew how to plan for things to go wrong instead of trying to figure out how to fix it after things went wrong,” says Langerman.
Langerman remained active in improving semiconductor industry safety until just two years ago, when he finally retired. He says he looks back now on his four decades of actively promoting that improvement with a feeling of accomplishment. Yet he’s also quick to note that safety should continue to be seen as a work in progress. “There are still things that go wrong in the industry,” he says. “But because safety processes have been formalized, companies are much, much better at responding to problems and learning from them.”
David H. Freedman is a Boston-based science writer. His articles appear in The Atlantic, Newsweek, Discover, Marker by Medium, and Wired, among many other publications. He is the author of five books, the most recent being “Wrong,” about the failure of expertise.