Advancements in tactile sensors and machine learning techniques open new opportunities for achieving intelligent grasping in robotics. Traditional robots are limited in their ability to perform autonomous grasping in unstructured environments. Although existing robotic grasping methods enhance a robot's understanding of its environment by incorporating visual perception, they still lack the capability for force perception and force adaptation. Therefore, tactile sensors are integrated into robot hands to enhance adaptive grasping capabilities in various complex scenarios through tactile perception. This paper primarily discusses the application of different types of tactile sensors in robotic grasping operations and the grasping algorithms based on them. Robotic grasping is divided into four stages: grasp generation, robot planning, grasp state discrimination, and grasp destabilization adjustment; tactile-based and tactile-visual fusion methods applied in each stage are then reviewed. The characteristics of these methods are comprehensively compared across multiple dimensions and indicators. Additionally, the challenges encountered in robotic tactile perception are summarized, and insights into potential directions for future research are offered. This review aims to offer researchers and engineers a comprehensive understanding of the application of tactile perception techniques in robotic grasping operations, as well as to facilitate future work that further enhances the intelligence of robotic grasping.