High Purity Germanium (HPGe) detectors are powerful instruments for gamma-ray spectroscopy. Their sensitivity to low-intensity gamma-ray peaks is often limited by the Compton continuum produced by gamma-rays emitted at higher energies. This study explores novel, pulse-shape-based, machine learning-assisted techniques to enhance Compton background discrimination in Broad Energy Germanium (BEGe™) detectors. We introduce two machine learning models: an autoencoder-MLP (Multilayer Perceptron) classifier, hereafter ACM, and a Gaussian Mixture Model (GMM). Both models are trained on signal waveforms produced in the detector and differentiate single-site events (SSEs) from multi-site events (MSEs). The GMM method differs from previous machine learning efforts in that it is fully unsupervised and therefore requires no data labelling during the training phase. Being both label-free and simulation-agnostic makes the unsupervised approach particularly advantageous for tasks where realistic, high-fidelity labelling is challenging or where biases introduced by simulated data must be avoided. In our analysis, the full-energy Peak-to-Compton ratio of \( ^{137}\)Cs, a radionuclide contained in a cryoconite sample, improves from 0.238 in the original spectrum to 0.547 after ACM filtering and to 0.414 after GMM filtering, demonstrating the effectiveness of these methods. The results also show an enhancement of the signal-to-background ratio across many regions of interest, enabling the detection of lower concentrations of radionuclides.
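To illustrate the label-free GMM idea at a high level, the sketch below fits a two-component Gaussian Mixture Model to simple pulse-shape features (an A/E-like amplitude ratio and a 10-90% rise time) and keeps the SSE-like cluster. The feature choices, the file name, and the scikit-learn-based implementation are illustrative assumptions, not the pipeline used in this work.

```python
# Minimal sketch (assumed, not the paper's actual pipeline): unsupervised
# separation of single-site and multi-site events with a two-component
# Gaussian Mixture Model fitted to pulse-shape features.
import numpy as np
from sklearn.mixture import GaussianMixture

def extract_features(waveforms, dt=1.0):
    """Compute simple pulse-shape features from charge waveforms.

    waveforms: (n_events, n_samples) array of baseline-subtracted charge pulses.
    Returns an (n_events, 2) feature matrix: [A/E, 10-90% rise time].
    """
    energy = waveforms.max(axis=1)                      # pulse height as energy proxy
    current = np.gradient(waveforms, dt, axis=1)        # current signal, dQ/dt
    a_over_e = current.max(axis=1) / energy             # A/E-style discriminator
    lo = (waveforms >= 0.1 * energy[:, None]).argmax(axis=1)
    hi = (waveforms >= 0.9 * energy[:, None]).argmax(axis=1)
    rise_time = (hi - lo) * dt                          # 10-90% rise time
    return np.column_stack([a_over_e, rise_time])

def fit_gmm_filter(waveforms):
    """Fit a 2-component GMM and return a boolean mask of SSE-like events.

    The component with the higher mean A/E is taken as the single-site class,
    since multi-site events spread their charge and typically show lower A/E.
    """
    X = extract_features(waveforms)
    gmm = GaussianMixture(n_components=2, covariance_type="full",
                          random_state=0).fit(X)
    sse_component = np.argmax(gmm.means_[:, 0])         # higher mean A/E -> SSE
    return gmm.predict(X) == sse_component

# Usage: keep only SSE-like events before histogramming the energy spectrum.
# waveforms = np.load("bege_waveforms.npy")             # hypothetical input file
# mask = fit_gmm_filter(waveforms)
# filtered_energies = waveforms[mask].max(axis=1)
```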