New alloy makes it possible to produce iron by electrolysis
Extracting iron from iron ore creates approximately two tons of carbon dioxide (CO2) emissions for every ton of iron produced. To reduce this burden on the environment, researchers at the Massachusetts Institute of Technology (MIT) have developed a way of extracting iron using electrolysis.
Electrolysis is an established means of producing metals such as aluminium and lithium. Until now, its application to iron production has been hindered by the difficulty of finding an anode material capable of surviving the harsh conditions: the anode must withstand temperatures of over 1500°C without being corroded by the molten oxides or by the oxygen released in the process.
Led by Donald Sadoway, John F Elliott Professor of Materials Chemistry at MIT, the researchers experimented with a number of metals before settling on a chromium–iron alloy that is both affordable and capable of enduring the extreme conditions. The low cost is helped by the fact that high-purity chromium is not required.
The chromium–iron anode and a molybdenum cathode are immersed in a melt of iron oxide, calcium oxide, aluminium oxide and magnesium oxide. During electrolysis, iron is deposited on the cathode; notably, it is purer than conventionally produced iron.
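As a rough illustration of the deposition step, Faraday's law of electrolysis relates the charge passed through the cell to the mass of metal deposited at the cathode. The sketch below is not from MIT's published figures; it assumes trivalent iron (Fe³⁺, three electrons per ion) and 100% current efficiency, both simplifying assumptions.

```python
# Faraday's law of electrolysis: m = (M * I * t) / (z * F)
F = 96485.0    # Faraday constant, coulombs per mole of electrons
M_FE = 55.845  # molar mass of iron, g/mol
Z = 3          # electrons per ion; assumes trivalent Fe3+ (an assumption)

def iron_deposited_g(current_a: float, seconds: float) -> float:
    """Theoretical mass of iron (grams) deposited at the cathode."""
    charge = current_a * seconds  # total charge passed, in coulombs
    moles_fe = charge / (Z * F)   # moles of iron reduced at the cathode
    return moles_fe * M_FE

# Example: 1000 A for one hour deposits roughly 0.7 kg of iron
# (ignoring losses to side reactions and imperfect current efficiency).
print(round(iron_deposited_g(1000, 3600), 1))
```

In practice the yield would be lower, since some current is lost to side reactions, but the calculation shows why electrolytic iron production is measured in kiloamperes at industrial scale.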
The method works because a film of chromium and aluminium oxides forms on the surface of the anode, protecting it from corrosion while remaining conductive. In addition to yielding a purer product, the process requires less energy than a blast furnace, with a corresponding reduction in CO2 emissions.
Scaled-up tests are currently being planned and a company has been formed to commercialise the method. The plan is to adapt it to extract other metals from their ores, such as nickel, chromium and titanium.
'We’re going to go hop-scotching across the major part of the periodic table,' said Sadoway.