Diagnostics are critical on the path to commercial fusion reactors, since measurement and characterisation of the plasma are essential for sustaining fusion reactions. Gamma spectroscopy is commonly used in activation analysis to provide information about the neutron energy spectrum, from which the neutron flux and fusion power can be calculated. The detection limits for measuring the nuclear dosimetry reactions used in such diagnostics are fundamentally constrained by Compton scattering events, which form a background continuum in measured spectra. This background lies in the same energy region as peaks from low-energy gamma rays, limiting detection and characterisation. This paper presents a digital machine learning Compton suppression algorithm (MLCSA) that uses state-of-the-art machine learning techniques to perform pulse shape discrimination for high-purity germanium (HPGe) detectors. The MLCSA identifies key features of individual pulses to differentiate between those generated by photopeak events and those generated by Compton scattering events. Compton events are then rejected, reducing the low-energy background. This novel suppression algorithm improves gamma spectroscopy results by lowering minimum detectable activity (MDA) limits and thus reducing the measurement time required to reach a desired detection limit. The performance of the MLCSA is demonstrated using an HPGe detector and a gamma spectrum containing americium-241 (Am-241) and cobalt-60 (Co-60). The MDA of Am-241 improved by 51% and the signal-to-background ratio improved by 49%, while the Co-60 peaks were partially preserved (reduced by 78%). The MLCSA requires no modelling of the specific detector and so has the potential to be detector agnostic, meaning the technique could be applied to a variety of detector types and applications.