Corrosion checking – in vitro corrosion testing for implants
In vitro corrosion testing of medical implants is crucial to their implementation. Dr Nigel Corlett, Managing Scientist at engineering and scientific consultancy Exponent International Ltd, Harrogate, UK, explains how such analysis enables effective materials selection.
Implantable medical devices must possess sufficient corrosion resistance to the human body’s saline environment to maintain structural and functional integrity, as well as minimise the release of leachable substances, such as nickel (Ni), chromium (Cr) and cobalt (Co), which might otherwise react adversely with the surrounding tissue.
Implantable devices are commonly manufactured from alloys such as stainless steel, nitinol, titanium alloys, and cobalt-based alloys like MP35N. All these alloys rely on a passive oxide film (several nanometres thick) for corrosion resistance. If this film is damaged or compromised, localised corrosion such as pitting can ensue (see image). Regulatory agencies such as the US Food and Drug Administration (FDA) therefore typically require device manufacturers to demonstrate that their implants will not pit or corrode excessively, and will exhibit an acceptable metal-ion leaching rate, during their expected in vivo service. In vitro corrosion testing can evaluate implants and their subcomponents to this end.
In particular, the ASTM International Standard F 2129 – Standard Test Method for Conducting Cyclic Potentiodynamic Polarisation Measurements to Determine the Corrosion Susceptibility of Small Implant Devices – is commonly used for this purpose.
In the current version of the ASTM F 2129 standard (2006), the device or alloy component is fully immersed in a physiological test solution, such as phosphate-buffered saline (PBS), at 37°C. The solution is deaerated with high-purity nitrogen for 30 minutes prior to immersion and then throughout the test. The rest potential (Er), or open-circuit potential, is then monitored for one hour or until its rate of change falls below 3 mV min-1. The potential is then increased anodically at a slow scan rate, such as 0.167 mV s-1, until either a preset vertex potential (Ev) is reached or, if breakdown occurs and localised corrosion ensues, until a preset current-density threshold (it) triggers scan reversal.
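The stabilisation criterion and the conditions for reversing the scan can be sketched as follows. This is a minimal illustration only: the current-density threshold used here is a hypothetical value, since the standard leaves it to be preset by the tester.

```python
def rest_potential_stable(potentials_mV, interval_min=1.0, limit_mV_per_min=3.0):
    """Return True once the rest potential's rate of change between the
    two most recent readings (taken interval_min minutes apart) falls
    below the 3 mV min-1 stabilisation criterion."""
    if len(potentials_mV) < 2:
        return False
    rate = abs(potentials_mV[-1] - potentials_mV[-2]) / interval_min
    return rate < limit_mV_per_min


def scan_should_reverse(E_V, i_A_cm2, Ev_V=0.8, it_A_cm2=1e-3):
    """Reverse the anodic scan once either the vertex potential Ev is
    reached or the current-density threshold it is exceeded.
    The value of it here (1e-3 A cm-2) is an illustrative assumption."""
    return E_V >= Ev_V or i_A_cm2 >= it_A_cm2
```

In practice the potentiostat software applies such logic continuously during the scan; the point here is simply that the test terminates on whichever condition is met first.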
The ASTM F 2129-06 standard permits a minimum Ev value of 0.8V, with respect to the saturated calomel reference electrode (SCE), on the premise that proceeding beyond this potential is not physiologically relevant. In many instances, however, the potential scan will not reach this value, and instead, breakdown of the passive film and pitting corrosion will ensue.
Breakdown occurs at the so-called breakdown potential (Eb) and is evident from a large and rapid increase in the measured current or current density. For alloys such as 316L and nitinol in PBS, the location of breakdown is typically visible as a cloud of corrosion product appearing in solution around the pits (see image). ASTM F 2129 testing produces a polarisation curve for each device tested, plotting the applied electrochemical potential in V (SCE) against the measured current in A, or current density in A cm-2, from which the values of Er and Eb (or Ev) can be determined. Monitoring the rest potential during the immersion period prior to the polarisation scan also provides information on the rest-potential drift (∆Er).
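One simple way to pick Eb off a polarisation curve is to look for the first large, sustained jump in current density during the forward scan. The sketch below uses a hypothetical detection rule (an order-of-magnitude rise between successive points, above a noise floor); in practice, identifying Eb usually involves inspection of the curve and some judgement.

```python
def find_breakdown_potential(E_V, i_A_cm2, jump_factor=10.0, floor_A_cm2=1e-7):
    """Return the potential (V vs SCE) at the first point where the
    current density rises by more than jump_factor over the previous
    point while exceeding a noise floor, or None if no breakdown is
    detected.  Both jump_factor and floor_A_cm2 are illustrative
    assumptions, not values prescribed by ASTM F 2129."""
    for prev, curr, E in zip(i_A_cm2, i_A_cm2[1:], E_V[1:]):
        if curr > floor_A_cm2 and prev > 0 and curr / prev > jump_factor:
            return E
    return None


# Synthetic curve: a stable passive current of ~1e-6 A cm-2,
# then a sharp rise at 0.45 V marking breakdown.
E = [0.0, 0.1, 0.2, 0.3, 0.4, 0.45, 0.5]
i = [1e-6, 1.1e-6, 1.2e-6, 1.3e-6, 1.4e-6, 2e-4, 5e-3]
Eb = find_breakdown_potential(E, i)
```

If the scan reaches the vertex potential without such a jump, the function returns None, corresponding to the no-breakdown case.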
There remains some debate in the medical-device community about how best to interpret the results of ASTM F 2129 with respect to determining the corrosion resistance of the test device. Unfortunately, neither the FDA nor any other regulatory agency provides guidance as to what constitutes an acceptance criterion, while ASTM F 2129 suggests comparison with a predicate device. It has been suggested that Eb be compared with potential ranges based on empirical experience, such that values of Eb greater than 0.6V (SCE) represent an ‘optimum corrosion-resistant’ condition. This approach, however, takes no account of the alloy or the environment, nor of the different rest potentials attained by different alloys. It is worth noting that neither Er nor Eb is an intrinsic property of a metal or an alloy.
Both parameters are influenced by the environment, including pH, solution chemistry and temperature, and by the surface finish of the alloy, for example mechanically polished versus electropolished. Both are time dependent and can vary with immersion time, and, in the case of Eb, the potentiodynamic scan rate also has an influence. An alternative approach is to consider the gap between the breakdown potential and the rest potential (Eb-Er), such that the larger the gap, the greater the resistance to pitting corrosion. This is not a new concept, as Eb-Er is commonly used by researchers to quantify the pitting-corrosion resistance of alloys.
The premise of using Eb-Er is that pitting will occur only if Er rises to exceed Eb during in vivo service. In this manner, the gap Eb-Er may be regarded as a safety margin, taking into account differences between the physiological test solution and the in vivo environment, the effects of immersion time, and the variability associated with measuring Eb and Er.
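The difference between the two interpretations can be made concrete. Assuming, purely for illustration, the 0.6V (SCE) empirical threshold mentioned above, two hypothetical devices with identical breakdown potentials pass that fixed criterion equally, yet differ greatly in their Eb-Er safety margin once their rest potentials are considered:

```python
def passes_fixed_threshold(Eb_V, threshold_V=0.6):
    """Fixed-potential criterion: Eb above an empirical threshold (V vs SCE)."""
    return Eb_V > threshold_V


def safety_margin(Eb_V, Er_V):
    """Margin criterion: the gap Eb - Er, in volts."""
    return Eb_V - Er_V


# Two hypothetical devices with the same breakdown potential.
device_a = {"Eb": 0.65, "Er": -0.30}   # margin: 0.95 V
device_b = {"Eb": 0.65, "Er": 0.40}    # margin: 0.25 V
```

Both devices clear the 0.6V threshold, but device B's rest potential sits far closer to its breakdown potential, leaving a much smaller margin against pitting.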
Recent research has shown that immersion time is an important factor when evaluating pitting-corrosion resistance based on Eb-Er. The results indicate that, for mechanically polished nitinol wire in PBS, both Er and Eb shift anodically (positively) with immersion time, but that the shift in Er is greater than that in Eb. The average Eb-Er value following a 156-hour immersion was 0.125V less than that measured following a one-hour immersion, as would be determined using ASTM F 2129.
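The arithmetic behind this observation is simple: if both potentials shift anodically but Er shifts further, the margin shrinks even though Eb itself improves. The numbers below are hypothetical, chosen only to reproduce the 0.125V reduction reported; they are not measured values from the cited research.

```python
# Hypothetical one-hour values (V vs SCE); not measured data.
Er_1h, Eb_1h = -0.30, 0.40

# Hypothetical anodic shifts after 156 hours: Er shifts further
# than Eb, as the cited research found for nitinol in PBS.
Er_156h = Er_1h + 0.200
Eb_156h = Eb_1h + 0.075

margin_1h = Eb_1h - Er_1h        # 0.700 V
margin_156h = Eb_156h - Er_156h  # 0.575 V, i.e. 0.125 V smaller
```

A one-hour test per ASTM F 2129 would therefore overstate the margin available after extended immersion.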
The development of a statistically valid model for an acceptance criterion for corrosion resistance would benefit the medical-device community. Recent research has attempted to develop such a model, which is notable for how it treats partially censored data, where some devices exhibit breakdown while others do not. Essentially, the model considers three scenarios: none of the devices exhibit breakdown, all of the devices exhibit breakdown, or only some of the devices exhibit breakdown.
In the first scenario, little analysis is required and the devices can be deemed corrosion resistant, provided a sufficient quantity has been tested. In the second, standard methods for normally distributed data can be used to calculate the probability of breakdown at or before a given Eb-Er, where the minimum Eb-Er is considered a safety margin; an upper bound on the probability of breakdown is then estimated based on the sample size and a prescribed level of confidence in the results.
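For the all-breakdown scenario, a minimal sketch of the normal-distribution calculation might look as follows, using the error function for the normal CDF. The Eb-Er margins and the 0.40V requirement are made-up illustrative values, not data from the research described.

```python
import math
from statistics import mean, stdev


def normal_cdf(x, mu, sigma):
    """P(X <= x) for a normal distribution, via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))


# Hypothetical Eb-Er margins (V) for six devices that all broke down.
margins = [0.55, 0.60, 0.62, 0.58, 0.65, 0.61]

mu, sigma = mean(margins), stdev(margins)

# Estimated probability that breakdown occurs at or before a margin
# of 0.40 V (an assumed safety-margin requirement).
p_at_margin = normal_cdf(0.40, mu, sigma)
```

With these tightly clustered margins well above 0.40V, the fitted model assigns a vanishingly small probability of breakdown at or below the requirement; the upper confidence bound mentioned above would then widen this estimate to reflect the small sample size.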
In the third scenario, a statistical mixture model is constructed, whereby a Poisson distribution is used to determine the probability of breakdown of any given device and a normal distribution is used to calculate the conditional probability that breakdown of those devices will occur at or before a given Eb-Er.
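The mixture calculation described above can be sketched as the product of two factors: the probability that a device breaks down at all, and the conditional probability that, given breakdown, it occurs at or before a given Eb-Er. In this sketch the Poisson term is modelled as 1 - exp(-λ), the probability of at least one breakdown event; how λ and the normal parameters are actually estimated in the cited model is not detailed here, so all parameter values below are illustrative assumptions.

```python
import math


def prob_breakdown(lam):
    """Poisson term: probability of at least one breakdown event,
    given a Poisson rate parameter lam."""
    return 1.0 - math.exp(-lam)


def normal_cdf(x, mu, sigma):
    """Conditional term: P(breakdown at or before margin x | breakdown),
    modelled as a normal distribution of Eb-Er values."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))


def mixture_cdf(x, lam, mu, sigma):
    """Unconditional probability of breakdown at or before margin x:
    the product of the two terms above."""
    return prob_breakdown(lam) * normal_cdf(x, mu, sigma)
```

When λ is zero the mixture probability is zero regardless of the margin (no device breaks down), and for large λ it reduces to the plain normal model of the all-breakdown scenario.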