Several studies have reported low adherence and strong resistance among clinicians to adopting digital health technologies in clinical practice, particularly computer-based clinical decision support systems. Poor usability and lack of integration with the clinical workflow have been identified as primary barriers, yet few guidelines exist on how to analyze usability data collected for digital health technologies. In this study, we aimed to develop a coding framework, grounded in fundamental usability principles and design components, for the systematic evaluation of user feedback generated during focus groups and interview sessions with clinicians. The codebook also included a coding category to capture the clinical role associated with each piece of feedback, providing insight into role-specific challenges and perspectives, as well as the level of shared understanding across the multiple clinical roles. Furthermore, a voting system was created to quantitatively inform modifications to the digital system based on the usability data. As a use case, we applied this method to an electronic cognitive aid designed to improve coordination and communication in the cardiac operating room. The framework proved feasible and useful not only for identifying suboptimal usability aspects but also for recommending relevant modifications to the design and development of the system from multiple perspectives, including those of the clinical, technical, and usability teams. The framework described herein may be applied in other highly complex clinical settings in which digital health systems can play an important role in improving patient care and enhancing patient safety.