In this paper, we propose two hybrid conjugate gradient algorithms for solving nonconvex optimization problems on Riemannian manifolds. The conjugate parameter of the first method extends a hybrid formula [Comput. Oper. Res. 159 (2023) 106341] from Euclidean space to Riemannian manifolds. The conjugate parameter of the second method combines the Fletcher–Reeves conjugate parameter with another flexible conjugate parameter. An adaptive restart strategy is incorporated into the search directions of both methods to improve their theoretical properties and computational efficiency. As a result, each method generates a sufficient descent direction independently of the stepsize strategy on Riemannian manifolds. Under standard assumptions, with stepsizes generated by the Riemannian weak Wolfe conditions, we establish global convergence for both methods. Numerical comparisons with existing methods on a range of Riemannian optimization problems confirm the effectiveness of the proposed methods.
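To make the setting concrete, the following sketch runs a generic Riemannian conjugate gradient iteration on the unit sphere, minimizing the Rayleigh quotient f(x) = xᵀAx. It uses the Fletcher–Reeves conjugate parameter, projection onto the tangent space as vector transport, normalization as the retraction, and a restart to the negative gradient whenever a sufficient-descent check fails. The function `sphere_cg`, the Armijo backtracking line search (a simple stand-in for the weak Wolfe conditions), and all tolerances are illustrative assumptions, not the paper's actual hybrid formulas.

```python
import numpy as np

def sphere_cg(A, x0, iters=200, tol=1e-8):
    """Generic Riemannian CG on the unit sphere for f(x) = x^T A x.
    Fletcher-Reeves parameter with a restart to the negative gradient
    when the new direction fails a sufficient-descent check."""
    proj = lambda x, v: v - np.dot(x, v) * x        # project v onto tangent space at x
    x = x0 / np.linalg.norm(x0)
    g = proj(x, 2 * A @ x)                          # Riemannian gradient of x^T A x
    d = -g
    for _ in range(iters):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking line search (stand-in for weak Wolfe)
        t, f = 1.0, x @ (A @ x)
        while True:
            y = x + t * d
            y /= np.linalg.norm(y)                  # retraction: normalize back to sphere
            if y @ (A @ y) <= f + 1e-4 * t * np.dot(g, d) or t < 1e-12:
                break
            t *= 0.5
        g_old, x = g, y
        g = proj(x, 2 * A @ x)
        beta = np.dot(g, g) / np.dot(g_old, g_old)  # Fletcher-Reeves parameter
        d = -g + beta * proj(x, d)                  # transport old direction, then combine
        if np.dot(g, d) > -1e-10 * np.dot(g, g):    # adaptive restart: enforce descent
            d = -g
    return x
```

On a diagonal matrix the iteration converges to the eigenvector of the smallest eigenvalue, the minimizer of the Rayleigh quotient on the sphere; the restart test is what guarantees every search direction is a descent direction regardless of the line search used.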