Linear polarization resistance (LPR) and potentiodynamic polarization (PDP) are two widely used electrochemical methods for corrosion rate determination, yet their reliability across bare, inhibitor-treated, and coated steels remains debated. Systematic comparisons against gravimetric benchmarks under varying scan rates are also lacking, limiting confidence in their quantitative accuracy. Here, we show that the accuracy of both LPR and PDP is governed by the system type rather than the test protocol. For bare steel immersed in NaCl solution, both methods converged toward weight-loss (WL) values, with LPR-derived rates (∼6.1 mpy) closely matching WL (6.7–7.2 mpy). For inhibited steel (1 mM triazole in 1 M HCl), only LPR produced rates within the WL range (52–55 mpy), whereas PDP overestimated them, yielding rates up to 150 mpy at higher scan rates, and disrupted the inhibitor film, as confirmed by a loss of impedance. For polymer-coated steel, LPR yielded ultra-low rates (∼10⁻⁶ mpy) consistent with intact protection, while PDP curves were dominated by capacitive charging and lacked defensible Tafel regions. Statistical analysis (ANOVA, F = 59.05, p < 0.0001; adjusted R² = 0.79) confirmed system type as the dominant factor, with test type significant only through its interaction with system type. These findings establish a practical, risk-based framework: LPR agreed more closely with gravimetry for bare and inhibited steel under the tested conditions, whereas coated systems required barrier-focused diagnostics because PDP-derived kinetics were dominated by non-kinetic artifacts. By aligning test choice with system context, this study resolves longstanding inconsistencies in the corrosion literature and gives industry a quantitative basis for more reliable electrochemical monitoring.
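To make the LPR-to-rate conversion behind figures such as the ∼6.1 mpy value concrete, the sketch below shows the standard Stern-Geary/ASTM G102 calculation from polarization resistance to a corrosion rate in mpy. This is a minimal illustration, not the study's procedure: the Tafel slopes, polarization resistance, equivalent weight, and density used here are assumed placeholder values (roughly typical of carbon steel), not measured data from this work.

```python
# Minimal sketch of the Stern-Geary / ASTM G102 conversion used in LPR analysis.
# All numerical inputs below are assumed placeholders, not this study's data.

def stern_geary_icorr(rp_ohm_cm2, beta_a=0.12, beta_c=0.12):
    """Corrosion current density (uA/cm^2) from polarization resistance.

    B = beta_a * beta_c / (2.303 * (beta_a + beta_c))   [V]
    i_corr = B / Rp
    """
    b = (beta_a * beta_c) / (2.303 * (beta_a + beta_c))  # Stern-Geary constant, V
    return (b / rp_ohm_cm2) * 1e6  # A/cm^2 -> uA/cm^2


def corrosion_rate_mpy(icorr_ua_cm2, eq_weight=27.92, density=7.87):
    """ASTM G102: CR (mpy) = 0.1288 * i_corr (uA/cm^2) * EW / rho (g/cm^3).

    Defaults approximate carbon steel (Fe -> Fe2+: EW ~ 27.92; rho ~ 7.87 g/cm^3).
    """
    return 0.1288 * icorr_ua_cm2 * eq_weight / density


if __name__ == "__main__":
    rp = 2000.0  # ohm*cm^2, an assumed polarization resistance for bare steel
    icorr = stern_geary_icorr(rp)
    print(f"i_corr = {icorr:.2f} uA/cm^2 -> {corrosion_rate_mpy(icorr):.2f} mpy")
```

With these assumed inputs the sketch returns roughly 6 mpy, the same order as the bare-steel LPR values quoted above; the point is only to show how uncertainty in the assumed Tafel slopes (the B constant) propagates directly into the reported rate, which is one reason LPR and PDP results can diverge.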