Iron and steel production is thought to be one of the least environmentally friendly industries going. A major reason for this view is that producing these metals from raw material is extremely energy intensive, quite apart from the other emissions the production process generates. The following image, from a steel industry trade group (yes, I am aware of the potential for bias here), shows that each ton of steel produced today uses 30% less energy than it did just 20 years ago.
Push back further and the picture is even more striking: where a ton of steel requires about 12 MBtu of energy today, in 1975 producers required about 48 MBtu, and back in 1955 they required 58 MBtu. To put it another way, the energy intensity of primary steel production has improved by nearly 80 percent in the last half-century.

It is worth remembering that there has been no crash program to make the steel sector “greener.” These dramatic improvements occurred even during a period when dirty energy costs were falling in real terms (the 1980s and 1990s). Why, then, did such dramatic improvements occur, and why are they likely to continue? Competition. The steel industry has gone global, and it also faces stiff competition from alternative building materials. Producers in a competitive system have every incentive to economize on the use of all inputs, so how come this “race to the bottom” is not widely understood?

By the way, it’s a pretty good bet that the quality of steel coming out of today’s furnaces is higher than it was a half-century ago, so the figures above surely understate just how much improvement we have seen.
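For readers who want to check the arithmetic, here is a minimal sketch using the per-ton figures quoted above (58 MBtu in 1955, 48 MBtu in 1975, roughly 12 MBtu today); the dictionary and helper function are just illustrative names, not from any official data source.

```python
# Energy intensity of primary steel production, in MBtu per ton.
# Figures are the rough ones quoted in the text above.
intensity = {"1955": 58, "1975": 48, "today": 12}

def improvement(old, new):
    """Fractional reduction in energy use per ton, comparing old vs. new."""
    return (old - new) / old

since_1955 = improvement(intensity["1955"], intensity["today"])
since_1975 = improvement(intensity["1975"], intensity["today"])

print(f"Improvement since 1955: {since_1955:.0%}")  # ~79%, i.e. "nearly 80 percent"
print(f"Improvement since 1975: {since_1975:.0%}")  # 75%
```

The 1955-to-today comparison works out to about 79 percent, which is where the “nearly 80 percent” figure in the text comes from.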