Animal models are essential for assessing cardiovascular responses to novel therapeutics. Cardiovascular safety liabilities are a leading cause of drug attrition, and better preclinical measurements are essential to predict drug-related toxicities. Presently, radiotelemetric approaches recording blood pressure are routinely used in preclinical in vivo haemodynamic assessments, providing valuable information on therapy-associated cardiovascular effects. Nonetheless, this technique is largely limited to monitoring blood pressure and heart rate. Alongside these measurements, Doppler flowmetry can provide additional information on the vasculature by simultaneously measuring changes in blood flow in multiple regional vascular beds. However, because this approach is time-consuming and expensive, it is not widely used in industry. Currently, analysis of waveform data obtained from telemetry and Doppler flowmetry typically examines averages or peak values of waveforms. Subtle changes in the morphology and variability of physiological waveforms have previously been shown to be early markers of toxicity and pathology. Therefore, detailed analysis of pressure and flowmetry waveforms could enhance the understanding of toxicological mechanisms and the ability to translate preclinical observations to clinical outcomes. In this review, we give an overview of the different approaches to monitoring the effects of drugs on cardiovascular parameters (particularly regional blood flow, heart rate and blood pressure) and suggest that further development of waveform analysis could enhance our understanding of safety pharmacology, providing valuable information without increasing the number of in vivo studies needed.
Objective: To establish whether reliable measurement of cardiac time intervals from the fetal ECG can be automated, and whether this approach could be used to investigate large datasets.
Design: Retrospective observational study.
Setting: Teaching hospitals in London UK, Nottingham UK and New York USA.
Participants: Singleton pregnancies with no known fetal abnormality.
Methods: Archived fetal ECGs recorded with the Monica AN24 monitor were analysed. A single ECG (PQRST) complex was generated by signal-averaging 5000 beats, and electrical cardiac time intervals were measured both automatically and manually.
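The signal-averaging step above can be sketched in a few lines. Everything below is illustrative: the sampling rate, window length, toy waveform and noise level are assumptions rather than details from the study, and the Monica AN24 beat-alignment pipeline is not described in the abstract. The sketch only demonstrates the general principle that averaging time-aligned complexes suppresses uncorrelated noise by roughly the square root of the number of beats.

```python
import numpy as np

rng = np.random.default_rng(0)

fs = 500                                   # assumed sampling rate, Hz
t = np.arange(int(0.4 * fs)) / fs          # 400 ms window around each beat
# Toy "QRS" spike at 200 ms standing in for a real PQRST complex
template = np.exp(-((t - 0.2) ** 2) / (2 * 0.01 ** 2))

n_beats = 5000                             # number of averaged beats, per the abstract
noise_sd = 2.0                             # noise amplitude well above the signal
beats = template + rng.normal(0.0, noise_sd, size=(n_beats, t.size))

# Coherent averaging across aligned beats: noise shrinks ~1/sqrt(n_beats)
averaged = beats.mean(axis=0)
```

After averaging, the residual noise standard deviation falls from about 2.0 to roughly 2.0/√5000 ≈ 0.03, making fiducial points such as QRS onset measurable.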
Main outcome measure: Validation of a newly developed algorithm to measure the cardiac time intervals of the fetal ECG.
Results: 188/236 (79.7%) subjects with fECGs of suitable signal-to-noise ratio were included in the analysis comparing manual with automated measurement. The PR interval was measurable in 173/188 (92.0%), the QRS complex in 170/188 (90.4%) and the QT interval in 123/188 (65.4%). PR interval was 107.6 (12.07) ms [mean (SD)] manual vs 109.11 (14.7) ms algorithm. QRS duration was 54.72 (6.35) ms manual vs 58.34 (5.73) ms algorithm. QT interval was 268.93 (21.59) ms manual vs 261.63 (36.16) ms algorithm. QTc was 407.5 (32.71) ms manual vs 396.4 (54.78) ms algorithm. QRS duration increased with gestational age in both manual and algorithm measurements.
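For context, the heart-rate correction behind the QTc values can be illustrated with Bazett's formula (QTc = QT/√RR, with RR in seconds). The abstract does not state which correction formula was used, so Bazett's is an assumption here, as is the representative fetal heart rate of 140 bpm; with those assumptions the reported mean QT and QTc are mutually consistent.

```python
import math

def bazett_qtc(qt_ms: float, heart_rate_bpm: float) -> float:
    """Bazett heart-rate correction of the QT interval (assumed formula)."""
    rr_s = 60.0 / heart_rate_bpm           # RR interval in seconds
    return qt_ms / math.sqrt(rr_s)

qt_ms = 268.93                             # mean manual QT from the abstract
qtc = bazett_qtc(qt_ms, 140.0)             # 140 bpm is a typical fetal rate, assumed
print(round(qtc, 1))                       # ~411 ms, near the reported 407.5 ms
```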
Conclusion: Accurate measurement of fetal ECG cardiac time intervals can be automated with potential application to interpretation of larger datasets.
Purpose: Coagulation-fibrinolysis markers are widely used for the diagnosis of Stanford type A acute aortic dissection (SAAAD). However, the role of these markers in estimating prognosis remains unclear.
Methods: A single-center retrospective study was conducted to identify the relationship between preoperative D-dimer and fibrinogen levels and early postoperative prognosis in SAAAD.
Results: Of 238 SAAAD patients who underwent surgery between January 2012 and December 2018, 201 (84.5%) and 37 (15.5%) patients constituted the survival and non-survival groups, respectively, at 30 days after surgery. D-dimer levels in the survival and non-survival groups were 45.2 ± 74.3 vs. 91.5 ± 103.6 μg/mL (p = 0.014), and fibrinogen levels were 224.3 ± 95.6 vs. 179.9 ± 96.7 mg/dL (p = 0.012), respectively. Logistic regression analysis of 30-day mortality identified patent type (OR 10.89, 95% CI 1.66-20.31) and malperfusion (OR 4.63, 95% CI 1.74-12.32) as significant predictors. Increasing D-dimer (per +10 μg/mL) and decreasing fibrinogen (per -10 mg/dL) were significantly associated with patent type and malperfusion. Receiver operating characteristic analysis was performed to distinguish survival from non-survival. The cutoff value for D-dimer was 60 μg/mL (sensitivity 61.1%; specificity 82.5%; area under the curve [AUC] 0.713 ± 0.083) and for fibrinogen was 150 mg/dL (sensitivity 44.4%; specificity 84.0%; AUC 0.647 ± 0.092). Kaplan-Meier analysis showed that patients with D-dimer levels > 60 μg/mL and fibrinogen levels < 150 mg/dL had a significantly lower survival rate at 30 days after surgery (60.0%, p < 0.001).
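The sensitivity and specificity quoted at each cutoff follow directly from a 2×2 classification of patients above versus below the threshold. A minimal sketch, using entirely hypothetical values (only the 60 μg/mL D-dimer cutoff is taken from the abstract; the arrays below are not the study data):

```python
import numpy as np

def sens_spec(values, events, cutoff):
    """Sensitivity and specificity of 'value > cutoff' for predicting an event."""
    values = np.asarray(values, dtype=float)
    events = np.asarray(events, dtype=bool)
    positive = values > cutoff                          # test positive above cutoff
    sens = (positive & events).sum() / events.sum()     # TP / (TP + FN)
    spec = (~positive & ~events).sum() / (~events).sum()  # TN / (TN + FP)
    return sens, spec

ddimer = [20, 45, 70, 150, 30, 90, 10, 65]  # hypothetical preoperative D-dimer, ug/mL
died30 = [0,  0,  1,  1,   0,  1,  0,  0]   # hypothetical 30-day mortality outcome
print(sens_spec(ddimer, died30, cutoff=60))
```

Sweeping the cutoff over all observed values and plotting sensitivity against (1 − specificity) yields the ROC curve whose AUC the study reports.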
Conclusion: Preoperative coagulation-fibrinolysis markers may be useful for predicting early prognosis in SAAAD.
Background: Susceptibility to and severity of COVID-19 is associated with risk factors for and presence of cardiovascular disease.
Methods: We performed a 2-sample Mendelian randomization analysis to determine whether blood pressure (BP), body mass index (BMI), type 2 diabetes (T2DM) and coronary artery disease (CAD) are causally related to presentation with severe COVID-19. Variant-exposure instrumental variable associations were determined from the most recently published genome-wide association studies and meta-analyses (GWAS) with publicly available summary-level data. Variant-outcome associations were obtained from a recent GWAS meta-analysis of laboratory-confirmed COVID-19 diagnosis, with severity defined by the need for hospitalization or death. We also examined reverse causality, taking diagnosis of severe COVID-19 as the exposure and cardiovascular disease as the outcome.
Results: We found no evidence for a causal association of cardiovascular risk factors/disease with severe COVID-19 (compared with population controls), nor evidence of reverse causality. Causal odds ratios (OR, by inverse-variance-weighted regression) for BP (OR for COVID-19 diagnosis 1.00 [95% CI: 0.99-1.01, P = 0.604] per genetically predicted increase in BP) and T2DM (OR for COVID-19 diagnosis 1.02 [95% CI: 0.9-1.05, P = 0.927] for genetically predicted T2DM), in particular, were close to unity with relatively narrow confidence intervals.
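The inverse-variance-weighted (IVW) estimator named above pools per-variant Wald ratios (variant-outcome effect divided by variant-exposure effect), weighting each by the precision of its outcome association. A minimal sketch on hypothetical summary statistics follows; none of the numbers below come from the GWAS used in the study, and this fixed-effect form is only one common variant of IVW.

```python
import numpy as np

def ivw_estimate(beta_x, beta_y, se_y):
    """Fixed-effect IVW Mendelian randomization estimate from summary statistics.

    beta_x: per-variant exposure effects; beta_y/se_y: outcome effects and SEs.
    """
    beta_x, beta_y, se_y = map(np.asarray, (beta_x, beta_y, se_y))
    w = beta_x**2 / se_y**2                 # inverse-variance weights
    wald = beta_y / beta_x                  # per-variant Wald ratios
    beta_ivw = np.sum(w * wald) / np.sum(w) # precision-weighted average
    se_ivw = np.sqrt(1.0 / np.sum(w))       # fixed-effect standard error
    return beta_ivw, se_ivw

# Hypothetical data for three variants; outcome effects near zero give a
# causal log-odds ratio near zero, i.e. an OR near 1 as reported for BP/T2DM.
b, se = ivw_estimate(beta_x=[0.10, 0.08, 0.12],
                     beta_y=[0.001, -0.002, 0.000],
                     se_y=[0.01, 0.012, 0.009])
print(np.exp(b))                            # causal odds ratio per unit exposure
```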
Conclusion: The association between cardiovascular risk factors/disease and hospitalization with COVID-19 reported in observational studies could be due to residual confounding by socioeconomic factors and/or factors that influence the indication for hospital admission.
The use of intracoronary imaging with intravascular ultrasound (IVUS) or optical coherence tomography (OCT) can define vessel architecture and has an established role in the guidance and optimisation of percutaneous coronary intervention. Additionally, intracoronary imaging has an emerging role in diagnosis, afforded by its ability to depict vessel wall characteristics not seen on angiography alone. Use of intracoronary imaging is recommended by international consensus guidelines from the European Society of Cardiology and two recent expert consensus position statements from the European Association of Percutaneous Cardiovascular Interventions (EAPCI). However, uptake in contemporary practice in the United Kingdom appears to lag behind these recommendations. Imaging is particularly advantageous in complex coronary lesions (such as left main stem, bifurcation, or heavily calcified lesions) and in complex patients (acute presentations, atypical presentations, and renal dysfunction). Stent detail down to the level of individual struts can be appreciated with intracoronary imaging, which facilitates appropriate stent selection and optimisation of the final stent result. We review the evidence for and role of intracoronary imaging and highlight specific subgroups that show particular benefit from imaging-guided percutaneous coronary intervention.
The practice of interventional cardiology has changed dramatically over the four decades since Andreas Gruentzig carried out the first balloon angioplasty. Technological improvements in stent design and interventional techniques have facilitated routine treatment of a higher-risk cohort of patients, including those with complex coronary artery disease and poor left ventricular function, and increasingly in the setting of cardiogenic shock (CS) complicating acute myocardial infarction (AMI). The use of mechanical cardiac support (MCS) in these settings has been the subject of intense interest, particularly over the past decade. A number of commercially available devices now add to the interventional cardiologist's armamentarium when faced with the critically unwell or high-risk patient in the cardiac catheter laboratory. The theoretical advantage of such devices in these settings is clear: an increase in cardiac output and hence mean arterial pressure, with variable effects on coronary blood flow. In doing so, they have the potential to prevent the downward cascade of ischaemia and hypoperfusion, but there is a paucity of evidence to support their routine use in any patient subset, even those presenting with cardiogenic shock. This review will discuss the use and haemodynamic effects of MCS devices during percutaneous coronary intervention (PCI), and examine the clinical evidence for their use in patients with cardiogenic shock and those undergoing 'high-risk' PCI.

