The gold (Au) mining industry requires rapid, cost-effective quantitative analysis techniques to improve detection efficiency and reduce operational costs. Compared with conventional Au detection methods, laser-induced breakdown spectroscopy (LIBS) offers distinct advantages, including rapid analysis, minimal sample preparation, and simultaneous multi-element detection, which have made it a widely adopted technique in material analysis and which meet the demand for more efficient detection in Au mining. In this study, LIBS was used to detect and quantify trace Au in gold ore samples. The characteristic Au spectral lines used in the subsequent calculations were selected during the experimental preparation stage. The experimental parameters were then systematically optimized by calculating the relative standard deviation (RSD) of the sample spectra to determine the optimal laser energy and binder ratio for effective sample breakdown. Under these conditions, the transverse magnetic field strength and spectral acquisition delay were varied, and the influence of the magnetic field strength on the signal-to-noise ratio (SNR) of the Au spectral lines at different delays was investigated. With a 0.4 T constant magnetic field and a 2.4 μs delay, the limit of detection (LOD) of Au decreased from 52.52 to 23.29 mg/kg relative to the unconfined, zero-delay condition. After hardware enhancements, including the replacement of the conventional focusing lens with a microscope objective, spectra of Au samples with a series of graded concentrations were acquired, and calculation and data fitting further reduced the LOD to 5 mg/kg. Finally, plasma temperatures and electron densities at different magnetic field strengths were calculated to verify local thermodynamic equilibrium (LTE) conditions, and the enhancement of the plasma spectral lines by magnetic confinement was explained theoretically.
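For context, the relations below are the expressions commonly used in LIBS work for the quantities named above (detection limit, plasma temperature, electron density, and the LTE check); they are given here as a general sketch of the standard approach, not as the exact formulation used in this study.

\[
\mathrm{LOD} = \frac{3\sigma_B}{S},
\qquad
\ln\!\left(\frac{I_{ki}\,\lambda_{ki}}{g_k A_{ki}}\right) = -\frac{E_k}{k_B T_e} + C,
\]
\[
n_e \simeq 10^{16}\,\frac{\Delta\lambda_{1/2}}{2w}\ \mathrm{cm^{-3}},
\qquad
n_e \gtrsim 1.6\times10^{12}\, T_e^{1/2}\,(\Delta E)^{3}\ \mathrm{cm^{-3}},
\]

where \(\sigma_B\) is the standard deviation of the background signal and \(S\) the slope of the calibration curve; \(I_{ki}\), \(\lambda_{ki}\), \(g_k\), \(A_{ki}\), and \(E_k\) are the line intensity, wavelength, upper-level statistical weight, transition probability, and upper-level energy used in the Boltzmann plot, whose slope gives the plasma temperature \(T_e\); \(\Delta\lambda_{1/2}\) is the Stark-broadened full width at half maximum and \(w\) the electron-impact broadening parameter (ionic broadening neglected); and the last inequality is the McWhirter criterion for LTE, with \(T_e\) in kelvin and \(\Delta E\) (in eV) the largest transition energy gap of the lines considered.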