Rapid degradation of blue organic light-emitting diodes (OLEDs) is an ongoing challenge for the display and lighting industry. Bimolecular exciton annihilation reactions are one of the leading causes of molecular degradation in these devices, but they have so far been quantified mostly by fitting data to simplified rate equation models that crudely approximate the exciton and charge carrier densities in the recombination zone while neglecting the other layers of the device entirely. Here, we implement a rigorous drift-diffusion-based degradation model and compare its predicted luminance fade and voltage rise with those of a corresponding rate-based model for a prototypical exciton-polaron-based degradation scenario. We find that the luminance fade predicted by the rate model is functionally similar to, but quantitatively different from, that of the drift-diffusion simulation, though reasonable agreement can be achieved by using effective values for the annihilation rate coefficient and the hot polaron degradation probability. Importantly, the drift-diffusion model indicates that trap-state defects formed in the emissive layer lead to only a minor voltage increase, whereas those formed in the transport layers lead to a larger increase that is on par with experiment. These results suggest that OLED luminance loss and voltage rise largely originate from different sets of defect states formed in the emissive and transport layers, respectively, and that rate-model degradation parameters fit from experiment should be viewed as effective values that do not directly correspond to the rates of the actual microscale processes occurring in the device.
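
For context, a minimal sketch of the type of simplified rate equation model referred to above is given below; this is a generic exciton-polaron annihilation form commonly used in the OLED degradation literature, not necessarily the exact equations employed in this work, and the symbols $N$, $P$, $G$, $\tau$, $K_{TP}$, $p_{\mathrm{hot}}$, and $Q$ are introduced here only for illustration:
\begin{align}
\frac{dN}{dt} &= G - \frac{N}{\tau} - K_{TP}\, N P, \\
\frac{dQ}{dt} &= p_{\mathrm{hot}}\, K_{TP}\, N P,
\end{align}
where $N$ and $P$ are the exciton and polaron densities assumed uniform over the recombination zone, $G$ is the exciton generation rate, $\tau$ is the exciton lifetime, $K_{TP}$ is the exciton-polaron annihilation rate coefficient, $p_{\mathrm{hot}}$ is the probability that an annihilation event (via a hot polaron) produces a defect, and $Q$ is the accumulated defect density. In such models, $K_{TP}$ and $p_{\mathrm{hot}}$ are typically extracted by fitting measured luminance fade, which is the procedure whose interpretation is examined here against the drift-diffusion treatment.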