Many tasks in image-guided surgery require a clinician to manually position an instrument in space, with respect to a patient, with five or six degrees of freedom (DOF). Displaying the current and desired pose of the object on a 2D display such as a computer monitor is straightforward. However, providing guidance to accurately and rapidly navigate the object in 5-DOF or 6-DOF is challenging. Guidance is typically accomplished by showing distinct orthogonal viewpoints of the workspace, requiring simultaneous alignment in all views. Although such methods are commonly used, they can be quite unintuitive, and accurate 5-DOF or 6-DOF alignment can take a long time. In this article, we describe a method of visually communicating navigation instructions using translational and rotational arrow cues (TRAC) defined in an object-centric frame, while displaying a single principal view that approximates the human's egocentric view of the physical object. The target pose of the object is provided but typically is used only for the initial gross alignment. During the accurate-alignment stage, the user follows the unambiguous arrow commands. In a series of human-subject studies, we show that the TRAC method outperforms two common orthogonal-view methods (the triplanar display, and a sight-alignment method that closely approximates the Acrobot Navigation System) in terms of time to complete 5-DOF and 6-DOF navigation tasks. We also find that subjects can achieve 1 mm and 1° accuracy using the TRAC method with a median completion time of less than 20 seconds.
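The core of an object-centric cue like TRAC is expressing the pose error in the object's own frame rather than the camera's or patient's. As an illustrative sketch (not the authors' implementation), the snippet below takes current and target poses as 4x4 homogeneous transforms and returns a translation cue in the object frame plus an axis-angle rotation cue; the function name and interface are assumptions for illustration.

```python
# Hypothetical sketch of object-centric translational/rotational cues
# from a current and target pose, each a 4x4 homogeneous transform.
import numpy as np

def trac_cues(T_cur, T_tgt):
    """Return (trans_cue, axis, angle): the translation the user should
    apply, expressed in the object's own frame, and the axis-angle
    rotation carrying the current orientation to the target."""
    R_cur, t_cur = T_cur[:3, :3], T_cur[:3, 3]
    R_tgt, t_tgt = T_tgt[:3, :3], T_tgt[:3, 3]
    # Translation error rotated into the object-centric frame.
    trans_cue = R_cur.T @ (t_tgt - t_cur)
    # Relative rotation from current to target orientation.
    R_err = R_cur.T @ R_tgt
    angle = np.arccos(np.clip((np.trace(R_err) - 1.0) / 2.0, -1.0, 1.0))
    if np.isclose(angle, 0.0):
        axis = np.zeros(3)  # already rotationally aligned
    else:
        axis = np.array([R_err[2, 1] - R_err[1, 2],
                         R_err[0, 2] - R_err[2, 0],
                         R_err[1, 0] - R_err[0, 1]]) / (2.0 * np.sin(angle))
    return trans_cue, axis, angle
```

A display layer would then render `trans_cue` as straight arrows and `axis`/`angle` as a curved arrow in the single principal view, so each arrow maps directly to a hand motion.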
Motivated by the need to support those self-managing chronic pain, we report on the development and evaluation of a novel pressure-based tangible user interface (TUI) for the self-report of scalar values representing pain intensity. Our TUI consists of a conductive foam-based, force-sensitive resistor (FSR) covered in a soft rubber, with embedded signal conditioning, an ARM Cortex-M0 microprocessor, and Bluetooth Low Energy (BLE). In-lab usability and feasibility studies with 28 participants found that individuals were able to use the device to make reliable reports with four degrees of freedom, as well as to map squeeze pressure to pain level and to visual feedback. Building on insights from these studies, we further redesigned the FSR into a wearable device with multiple form factors, including a necklace, bracelet, and keychain. A usability study with an additional 7 participants from our target population, elderly individuals with chronic pain, found high receptivity to the wearable design, which offered a number of participant-valued characteristics (e.g., discreetness), and surfaced further design implications that inform the continued refinement of tangible devices supporting pain self-assessment.
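Making "reliable reports with four degrees of freedom" amounts to quantizing a continuous squeeze signal into a small number of distinct, repeatable levels. The sketch below is a hypothetical illustration (not the authors' firmware; the ADC range and function name are assumptions) of binning a raw FSR/ADC reading into one of four report levels.

```python
# Hypothetical sketch: quantize a raw FSR/ADC squeeze reading into one
# of four discrete self-report levels. A 10-bit ADC (0..1023) is assumed.
def squeeze_to_level(adc_reading, adc_max=1023, n_levels=4):
    """Map a raw ADC value in 0..adc_max to a level in 0..n_levels-1."""
    adc_reading = max(0, min(adc_reading, adc_max))  # clamp to valid range
    # Evenly sized bins; the top of the range maps to the highest level.
    level = int(adc_reading * n_levels / (adc_max + 1))
    return min(level, n_levels - 1)
```

In practice the bin boundaries would be calibrated per user (grip strength varies widely, especially among elderly users), with visual feedback closing the loop so the participant can see which level their current squeeze selects.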