The radiance temperature determined with a radiation thermometer depends on the spectral emissivity of the measured surface. For an opaque surface, it is common practice, when deriving its temperature from the signal measured by the thermometer, to use an average effective emissivity weighted over the spectral range of operation of the thermometer and to take the average effective reflectivity as its complement, so that the two values sum to one. We found that, when the spectral emissivity of the surface varies significantly over that range, this assumption can introduce noticeable deviations in the calculated radiance temperature; therefore, in addition to the average effective emissivity, a true effective reflectivity should be calculated as well. Using the real spectral emissivity values of two different surfaces, we numerically computed the deviations in the estimated radiance temperature caused by the approximation described above, for several common wavelengths and thermometer spectral responses. For a ceramic sample of known spectral emissivity, the approximation led to deviations in the estimated radiance temperature of 1.4 K at 348.15 K and 2 K at 723.15 K.
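
As a rough illustration of the distinction the abstract draws (and not the authors' actual computation), the following Python sketch contrasts the "complement" approximation, reflectivity ≈ 1 − effective emissivity, with an effective reflectivity weighted by the background radiance. The spectral emissivity model, the 8–14 µm thermometer band, the flat spectral response, and the temperatures used here are hypothetical assumptions introduced only for the example.

```python
# Illustrative sketch only: all spectral data and band limits below are
# assumed, not taken from the paper.
import numpy as np

H = 6.62607015e-34   # Planck constant, J s
C = 2.99792458e8     # speed of light, m/s
KB = 1.380649e-23    # Boltzmann constant, J/K

def planck_radiance(wl_m, temp_k):
    """Blackbody spectral radiance (Planck's law), W sr^-1 m^-3."""
    return (2.0 * H * C**2 / wl_m**5) / np.expm1(H * C / (wl_m * KB * temp_k))

# Uniform wavelength grid over an assumed 8-14 um thermometer band
wl = np.linspace(8e-6, 14e-6, 600)

# Hypothetical spectral emissivity with a strong variation across the band
eps = 0.95 - 0.25 * np.exp(-((wl - 9.5e-6) / 0.6e-6) ** 2)

# Assumed flat spectral response of the thermometer over the band
resp = np.ones_like(wl)

t_surface = 723.15      # surface temperature, K (example value)
t_background = 296.15   # assumed background temperature, K

def band_average(quantity, weight):
    """Weighted band average; the uniform grid spacing cancels in the ratio."""
    return np.sum(quantity * weight) / np.sum(weight)

# Average effective emissivity, weighted by the surface Planck radiance
eps_eff = band_average(eps, resp * planck_radiance(wl, t_surface))

# "Complement" approximation for the effective reflectivity
rho_approx = 1.0 - eps_eff

# Effective reflectivity weighted instead by the background Planck radiance
rho_eff = band_average(1.0 - eps, resp * planck_radiance(wl, t_background))

print(f"effective emissivity             : {eps_eff:.4f}")
print(f"1 - effective emissivity         : {rho_approx:.4f}")
print(f"background-weighted reflectivity : {rho_eff:.4f}")
```

Because the two weighting spectra peak at different wavelengths when the surface and background temperatures differ, a spectrally varying emissivity makes the background-weighted reflectivity deviate from the simple complement, which is the kind of discrepancy the reported 1.4 K and 2 K temperature deviations quantify.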