Metalworking is one of the oldest crafts, going back far beyond recorded history. But until a few thousand years ago, one of the most abundant metals—iron—was virtually unknown. The ancient Egyptians and Sumerians knew iron only from meteors, and considered it heavenly, a gift from the gods. Later civilizations discovered how to smelt it, and their period came to be known as the Iron Age, but they didn’t know how to consistently make it strong, and not brittle. During the Industrial Revolution, we discovered how to make it strong, consistent, and even cheap.
Now high-quality steel is everywhere around us: holding up our buildings (in both girders and reinforced concrete), in our cars and ships, in trains and their rails, in bridges and towers, in electrical infrastructure, in refrigerators and washing machines, in pots and pans, in forks and knives, in hammers and saws, in nails and screws, in tables and chairs.
The wonder of the metal is gone, but the wonder of the process remains. A steel foundry is an amazing sight, with enormous blast furnaces several stories high, raging at thousands of degrees 24/7, attended by a swarm of workers, like a great beast with an insatiable hunger for ore, coke, and lime.
The product they put out is of such purity and consistency that it would be hailed by the ancients as a miracle, a feat of craftsmanship possible only through divine intervention. But in the modern economy, it is a commodity.
What is this material? Why is it so hard to make? And how did it go from mythical to mundane?
Let’s begin at the beginning. Why is metal in general, and iron in particular, even desirable?
The use of a material is based on its properties, both physical and economic. To understand most materials and how they are used, ask three basic questions: (1) Is it suitable for the job—does it have the strength, elasticity, resistance to heat, etc.? (2) Can it easily be made into the desired shape? (3) How much does it cost?
Among the main classes of materials used in the pre-industrial world—stone, wood, clay, metal, glass, and textiles—stone and metal were the strongest, the only ones that could be used for cutting tools and weapons. Stone is brittle: it can only be worked by chipping and chiseling, and when it breaks, it cracks and shatters. Metal, by contrast, is elastic and malleable, especially when heated. It can be hammered into shape; if hot enough, it becomes liquid and can be poured into molds. And when subjected to stress, it deforms by bending or denting, often remaining mostly in shape and potentially repairable, if not still usable.
From the ancient world to today, metal has been valuable for its properties: Because of its strength, it can be used as a structural material, forming the walls of vehicles and the skeleton of buildings; because of its hardness, it can be used for tools, including cutting implements such as knives, saws, and other blades; because of its resistance to heat, it can be used for cooking utensils or engines. And unlike stone, it can be worked into many shapes for all these purposes.
The first metal used widely, in prehistoric times, was bronze, an alloy of copper and tin. Iron, which must be smelted at a higher temperature, was developed later, but took over because it is harder, tougher, and more abundant. If you want to understand one metal and its importance to civilization, look to iron.
Where does iron come from? How do we make it?
Iron is abundant in the Earth’s crust, but like everything else in nature, it comes to us in an inconvenient form. You can’t dig up pure iron out of the ground. Instead, what you find is an oxidized form of the metal called ore. A common form of iron ore looks like red, rusty dirt—in fact, ore is chemically basically the same as rust (or looking at it another way, when iron rusts, it’s effectively turning back into ore). To get any metal out of its ore, you have to separate the elemental metal from the oxygen, by heating it in a furnace at thousands of degrees, a process called smelting.
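In modern chemical terms (nothing the ancients could have known), the carbon monoxide produced by the burning fuel does the separating, reducing the common ore hematite to metallic iron:

```latex
\mathrm{Fe_2O_3} + 3\,\mathrm{CO} \longrightarrow 2\,\mathrm{Fe} + 3\,\mathrm{CO_2}
```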
There is one, rare form of metallic iron that is found in nature: meteoric iron that falls to Earth from space. Before humanity discovered how to smelt iron, meteoric iron was the only form of the metal they knew. Imagine the surprise of the ancients when a large rock fell from the heavens in a violent crash—and then when they tried to chip away at the rock, it turned out to be, not brittle like stone, but malleable. It was inevitable that this would be seen as a divine material, a gift from the gods, and indeed the word for iron in some ancient languages translates roughly as “metal from heaven”. Tutankhamun, the pharaoh of Egypt, was buried with a dagger made of meteoric iron.
We don’t know when, how, or by whom iron smelting was first discovered, but we know how the process was done. A furnace is constructed: the smallest is a cylinder about three feet high, but they can be much larger. Iron ore and charcoal are dumped into the furnace, in alternating layers; it helps to add lime as well. A fire is started; air may be pumped in at the bottom with a bellows. The whole mass burns for hours before the furnace is opened. If successful, a spongy mass of iron will collect at the bottom of the furnace and can be removed with tongs. The mass is called a “bloom” and this type of furnace is a bloomery.
Of course, iron ore is not pure; it can contain impurities such as sulfur and phosphorus. Some of the impurities are separated out during smelting, forming a waste product called “slag”. This is the purpose of adding lime: it acts as a “flux”, that is, a material that helps separate out the slag. A small amount of slag remains in the bloom, however, so the next thing to do is to pound it with a hammer, which forces the slag out, and smooths out the bloom.
The iron produced this way is relatively soft (for a metal!) and can be worked into shape by the blacksmith at his forge. Using hammer, vise, and other tools, against the flat plane of an anvil or its curved horn, a skilled blacksmith can make many implements, from swords to plowshares. The metal is easiest to work when heated, so the smith returns it to the furnace again and again—literally striking while the iron is hot. Because it is worked with tools, this form of iron is called “wrought iron”.
But there is a problem. Wrought iron is too soft for some purposes; it is typically softer than bronze, which had been used for centuries. It’s good for many things that don’t take much abuse, from forks to buckles. But when your life depends on the strength of your material—say, in a sword—you want something tougher. You want the strengthened form of iron known as steel.
And so, over the centuries, blacksmiths learned ways to harden and toughen the material. With no science of chemistry to guide them, they plied their craft by trial, error, and lore. They found that heating wrought iron in a bed of charcoal would harden the surface of the material, which was good for forming a cutting edge. Something seemed to be getting into the material from the outside to harden it. The hardness was increased if, rather than allowing the material to cool slowly in air, it was cooled quickly by plunging it into water or oil, a technique known as “quenching”. These processes made the iron hard (resistant to scratching or denting) but also brittle. If, however, the iron was heated again, to a lower temperature this time, it would relax, losing some of its hardness but making up for it with increased toughness (the ability to absorb energy without breaking). This is known as tempering.
These techniques helped, but they were only skin-deep, strengthening the surface or case of the material. To make an object hard throughout, further techniques were applied. Thin strips of surface-hardened iron could be welded together; this not only distributed the strengthened metal throughout the final product, but it also created aesthetic patterns, and was known as “pattern welding”. (Japanese swordmaking used a similar technique, but instead of welding separate strips together, a single piece of metal was hammered flat and then folded on itself many times.)
There were a few ways—difficult, laborious, time-consuming, expensive ways—to distribute the hardness through the material uniformly. One was to heat the iron in charcoal, as described above, but for a very long time, on the order of weeks. This process was known as “cementation” and its output was “blister steel”. In ancient India there was a process that involved melting the iron with other materials in crucibles; this produced “Seric iron” or “Wootz steel”. Wootz steel was exported to Damascus, where it was made into legendary swords.
Ancient and medieval smiths knew these techniques, but they didn’t know why they worked, and of course they also “knew” and practiced many things that did not work, but which, without scientific methods, couldn’t be distinguished or separated from the things that did. In a deeply superstitious age, this included religious rituals of purification and prayers to invoke the gods.
And so, iron was still mythical. Steel was still rare, and expensive. And it stayed that way for thousands of years.
Now let us fast forward to medieval Europe, where a new form of iron was about to arise.
As Europe climbed out of the Dark Ages after the fall of Rome, the population and the economy slowly grew, and the demand for iron increased. For efficiency, furnaces were built larger and larger, to smelt bigger heats of metal.
As the furnaces grew taller, they needed more air to burn. Sometimes they were built on the side of a hill, to take advantage of natural breezes that blew into them. More and larger bellows were added. Eventually the furnaces were so large that hand-pumped bellows weren’t strong enough to force the air all the way up the stack; the solution was to build the foundry on a river and power the bellows with a water wheel. This large, air-hungry furnace is called a blast furnace, for the powerful blast of air it requires. (There seems to be no hard line or qualitative difference between the bloomery and the blast furnace; rather, it was a gradual evolution.)
Once furnaces were big enough, and the fires burned long and hot enough, something new and different happened to the iron: some of it melted. Liquid iron running out of the furnace had a different character than the solid bloom that collected at the bottom: it was brittle. It could not be worked with tools: if you tried to hammer a solid mass of it after it cooled, it would simply break. It was useless at the forge, a botched product, a nuisance. There was nothing to do but throw it back into the furnace to be remelted.
Or was there? There is, of course, one thing you can do with liquid metal that you cannot do with any type of bloom: pour it into a mold. This had been done with copper and bronze, of course, for thousands of years. Perhaps this liquid iron, if it could not be worked, could instead be cast.
With this perspective, by the 1400s if not before, smelters began making liquid iron on purpose—now called “cast iron”, as opposed to wrought iron. (Indeed, China had probably been making cast iron since about 500 BC. As far as I can tell, though, the technique was independently reinvented in Europe, and European ironmaking techniques weren’t influenced much if at all by China. So, like gunpowder and the compass, China gets credit for being first, but doesn’t really tie into the story of the Industrial Revolution.)
Some of the first cast iron products may have been church bells, which need to be large, single-piece objects; and cannons, which are very different in purpose but similar in structure. (Cannons and cannonballs were especially popular during the religious wars of the 16th and 17th centuries.) Iron foundries may even have converted between the two in times of war and peace, and historian Douglas Fisher mentions “a 15th-century English foundryman whose shield bore a replica of a bell and a cannon.”
Cast iron found demand for many other purposes, but it was still brittle and so not suitable for all uses. Even for cannon it turned out to be problematic: accumulated stress on the cannon with repeated firing caused it to fail catastrophically, i.e., to explode without warning. Not really how you want your cannon to behave.
Whatever was getting into wrought iron to make it harder, cast iron seemed to have too much of. So, just as blacksmiths discovered techniques for hardening the wrought form, techniques were developed to refine the cast iron into less brittle forms: essentially turning it into wrought iron or steel. Cast iron would be formed into ingots called “pig iron” (so named because they were cooled in side troughs off a main trough, and looked like suckling pigs). An ingot of pig iron would be placed in a special “finery forge”; heating it there would—through means still unknown and mysterious—refine the material until it was once again malleable.
A new refining technique called “puddling” was developed by Henry Cort in 1784. Puddling used a “reverberatory furnace” in which the burning fuel and the iron ingots were kept in separate chambers, with the heat radiating off a curved ceiling; this kept the iron from absorbing any impurities in the fuel, which could adulterate and weaken the final product. The other key feature of puddling was turning the ingots with long poles, thus constantly exposing different sides to the air.
Despite all these improvements, by 1850, steel was still difficult and costly to make. The smelting of ore into cast iron had become relatively efficient, but the further refining took days or weeks—an enormous amount of fuel and labor.
That was about to change.
Just what was going on with all these different processes? Why was wrought iron soft and malleable, while cast iron was hard and brittle? What was so special about steel that allowed it to have the best of both worlds? Why did hammering and folding, quenching and tempering, fining and puddling, do their jobs?
Was something being added to the metal to harden it, or removed? Or simply rearranged? And if there was a foreign substance in the iron, what was it? From knowing how different kinds of furnaces affected iron, one might speculate that it was something in the air, or in the fuel, or “in” the heat itself—for heat was at one point theorized to be a kind of fluid.
Scientists and philosophers have asked these questions since the beginning of the Iron Age. Aristotle thought that iron was hardened through purification; in his Meteorologica he wrote that through repeated heating at the forge, “the dross sinks to the bottom and is removed from below, and by repeated subjection to this treatment the metal is purified and steel produced. … the better the quality of the iron the smaller the amount of impurity.” (Lee 1952) This was not a bad hypothesis—the stronger material is more pure—but, like many pre-scientific theories, it was wrong. The correct answer would wait for the science of chemistry.
Thousands of years later, a French chemist named Réaumur found the first piece of the answer: Through experiments melting iron together with many other types of materials in crucibles, he showed in 1722 that steel was not pure iron, but a “dirty” iron that included some other substance; he suspected “sulfurs and salts”. In 1781, T. O. Bergman, called the father of analytical chemistry, isolated the substance and identified it—as “phlogiston”, which was believed to be contained in all combustible bodies, and released in combustion. Unfortunately for Bergman, there was a tiny flaw in his theory: the phlogiston model of combustion is wrong, and phlogiston does not exist. The substance he had isolated was real, however, and in 1786, Monge, Vandermonde and Berthollet correctly identified it as carbon. So the very substance that fueled the smelting process was also what infiltrated the iron to harden it!
Today, we know that iron with less than about 0.1% carbon is wrought iron, with more than 2.1% it is cast iron, and in the sweet spot in between lies steel, the Goldilocks of iron-carbon alloys. The task of refining cast iron, then, is to remove the excess carbon. But in the early 1850s, the best methods for this took days or weeks. How to speed up the process?
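Those thresholds can be captured in a toy classifier (a sketch only; the exact cutoffs vary by source and alloy standard, and 0.1%/2.1% are the approximate figures used above):

```python
def classify_iron(carbon_pct: float) -> str:
    """Classify an iron-carbon alloy by carbon content in weight percent.

    Thresholds are the approximate ones from the text: below ~0.1% is
    wrought iron, above ~2.1% is cast iron, and steel lies in between.
    """
    if carbon_pct < 0.1:
        return "wrought iron"
    if carbon_pct <= 2.1:
        return "steel"
    return "cast iron"

print(classify_iron(0.05))  # wrought iron
print(classify_iron(0.8))   # steel
print(classify_iron(3.5))   # cast iron
```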
Enter Henry Bessemer. Bessemer knew that carbon was the element that he needed to remove from cast iron to refine it, and he knew that oxygen unites readily with carbon. Existing methods of refining cast iron relied on exposure to the air. His breakthrough, in this context, was surprisingly simple:
Why not just use a lot more air?
That’s exactly what Bessemer did. He designed a converter that would hold molten iron coming out of a smelter, and blow a blast of air up through the bottom, greatly increasing the flow of oxygen. The air rapidly oxidizes the carbon in the metal, refining it into steel or wrought iron. When Bessemer first tried this, the effect was so dramatic that a burst of flame leapt out of the top of the furnace like a volcano.
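The chemistry of that flame, in simplified modern terms: the blast oxidizes the dissolved carbon, and the overall reaction is strongly exothermic, heating the melt rather than cooling it:

```latex
\mathrm{C} + \mathrm{O_2} \longrightarrow \mathrm{CO_2} \qquad \Delta H \approx -394\ \mathrm{kJ/mol}
```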
The Bessemer process was a huge speedup: what used to take days or weeks was reduced to less than 30 minutes.
Initially, Bessemer would stop the process before the carbon was completely removed, leaving a small amount in the iron, which resulted in a “mild steel”. This process, however, was difficult to control precisely. In the long term, the best process turned out to be to remove virtually all of the carbon, and then add back a measured amount of carbon and any other elements to create a precisely controlled alloy.
Bessemer’s process had one other problem: It was discovered, once more people started to try it, that it only worked on iron ores that were low in phosphorus. Phosphorus in the ore would ruin the product. This problem was solved by Percy Gilchrist and Sidney Gilchrist Thomas: they used the mineral dolomite for the lining of the converter, which removed the phosphorus. This is known as the Thomas or Gilchrist-Thomas process, or the basic Bessemer process (“basic” here not meaning “simple” but in the chemical sense meaning the opposite of acid).
(Note: a similar technique was simultaneously invented by an American named William Kelly, but Bessemer got his name on the process and went down in the history books.)
The Bessemer process gave the world cheap, quality steel that could be produced in mass quantities. It reduced the cost of steel by over 80%, from £40 to £6–7 per long ton (according to Wikipedia).
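As an arithmetic check on that figure, using the prices quoted above:

```python
old_price = 40.0  # £ per long ton before the Bessemer process
for new_price in (6.0, 7.0):
    reduction = (old_price - new_price) / old_price
    print(f"£{old_price:.0f} -> £{new_price:.0f}: {reduction:.1%} reduction")
```

So “over 80%” checks out: the drop is 82.5–85%, depending on which endpoint of the £6–7 range you take.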
Cheap steel hit the market in the great age of railroads. It was useful for locomotives, as stronger boilers could be made higher-pressure and thus more powerful, but it was crucial for rails. Rails could not be made of cast iron, which, being brittle, would crack under stress—a literal train wreck waiting to happen. So wrought iron was used instead, but being softer, it wore out quickly: some sections of track needed to be replaced every three months. Steel rails lasted years, more than ten times the life of wrought iron, saving the railroads enormous costs. Andrew Carnegie brought the Bessemer process to America in the 1870s, and reduced the cost of steel rails from $100 per ton in 1873 to $18 per ton by the 1890s. In that era, rails were the top product of the steel industry.
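A back-of-envelope view of why rails mattered so much, using only the figures above (the worst wrought-iron sections replaced every 3 months; steel lasting ten times as long; labor and downtime ignored):

```python
horizon_years = 10
wrought_life_years = 0.25                    # worst sections: every 3 months
steel_life_years = 10 * wrought_life_years   # "more than ten times the life"

wrought_replacements = horizon_years / wrought_life_years
steel_replacements = horizon_years / steel_life_years
print(wrought_replacements)  # 40.0 replacements per decade
print(steel_replacements)    # 4.0 replacements per decade
```

On a heavily used section, then, a steel rail could cost several times as much per ton as wrought iron and still pay for itself on replacement costs alone.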
Steel was also used in construction. A skeleton of steel girders could bear the weight of a building, taking that load off the masonry; this let buildings be constructed taller, with more room for windows, leading to the era of skyscrapers. Outside the city, steel found many uses on the farm, especially in John Deere’s steel plow, which broke the tough prairie soils of the Midwest, and in barbed wire, which helped fence in cropland.
Steel has come a very long way.
New methods have emerged: Bessemer was partially replaced by the Siemens–Martin process, which uses an “open hearth” furnace that recycles heated gases for efficiency; both were eventually superseded by the “basic oxygen” process, which uses a pure oxygen blast instead of air. The electric arc furnace, good for recycling scrap, was developed in the 1900s and led to smaller plants called “mini-mills” (championed by the upstart Nucor starting in the 1970s).
New manufacturing techniques allow steel to be formed into virtually any shape with extreme precision and at mass scale. “Rolling” forces hot metal through turning cylinders, lengthening it and squeezing it into rods or beams. “Continuous casting” pours molten steel out of the converter directly into a mold, forming a long, continuous strip. Powerful forging machines stamp and press metal into shape—the blacksmith at his anvil is now part of a romanticized past, seen only in Renaissance faires, Colonial Williamsburg, or Game of Thrones.
Steel is also combined with other metals for specific purposes. Alloying steel with chromium improves corrosion resistance, giving us “stainless steel” for utensils and many other implements; alloying with tungsten increases heat resistance, necessary for cutting or grinding tools that generate a lot of friction, hence its name “tool steel” or “high-speed steel”. It can also be coated with other materials: a thin sheet of steel coated with an even thinner layer of tin makes the light, sturdy, and non-corrosive tin cans that now hold much of our food.
And since the 1960s, all of these processes have come under computer control, for extreme precision and efficiency. The composition of an alloy can be measured to hundredths of a percent, and the rolling gauges are set to ten-thousandths of an inch.
Today, steel is everywhere. Almost all iron is made into steel, and it is the most common metal—in the words of Vaclav Smil, it is “still the iron age”. Without it, the modern world would be unrecognizable—indeed, it would be impossible. Just try to imagine New York City without its skyscrapers. Imagine wooden paneling on cars, or washing machines. Our electrical infrastructure would be less efficient, with no steel available for turbines or transformers. Food would still be packaged in heavy, brittle glass jars, instead of thin cans. Powered flight would be at best difficult and expensive, and the space program would probably have been impossible.
The metal may no longer be mythical, but it is still marvelous.