Background: Patients with rare, pathogenic cardiomyopathy (CM) and arrhythmia variants can present with atrial fibrillation (AF). The efficacy of AF ablation in these patients is unknown.
Objective: This study tested the hypotheses that: 1) patients with a pathogenic variant in any CM or arrhythmia gene have increased recurrence following AF ablation; and 2) patients with a pathogenic variant associated with a specific gene group (arrhythmogenic left ventricular CM [ALVC], arrhythmogenic right ventricular CM, dilated CM, hypertrophic CM, or a channelopathy) have increased recurrence.
Methods: We performed a prospective, observational, cohort study of patients who underwent AF catheter ablation and whole exome sequencing. The primary outcome measure was ≥30 seconds of any atrial tachyarrhythmia that occurred after a 90-day blanking period.
Results: Among 1,366 participants, 109 (8.0%) had a pathogenic or likely pathogenic (P/LP) variant in a CM or arrhythmia gene. In multivariable analysis, the presence of a P/LP variant in any gene was not significantly associated with recurrence (HR 1.15; 95% CI 0.84-1.60; P = 0.53). P/LP variants in the ALVC gene group, predominantly LMNA, were associated with increased recurrence (n = 10; HR 3.75; 95% CI 1.84-7.63; P < 0.001), compared with those in the arrhythmogenic right ventricular CM, dilated CM, hypertrophic CM, and channelopathy gene groups. Participants with P/LP TTN variants (n = 46) had no difference in recurrence compared with genotype-negative controls (HR 0.93; 95% CI 0.54-1.59; P = 0.78).
Conclusions: Our results support the use of AF ablation for most patients with rare pathogenic CM or arrhythmia variants, including TTN. However, patients with ALVC variants, such as LMNA, may be at a significantly higher risk for arrhythmia recurrence.
The authors report, for the first time to their knowledge, implantation of a standard implantable cardioverter-defibrillator lead for permanent delivery of left bundle branch area pacing. Implantation was successful and safe in 11 of 12 patients, with adequate defibrillation testing, good electrical and electrocardiographic parameters, and uneventful device-related short-term follow-up.
Background: Holter monitoring may raise suspicion of an underlying catecholaminergic polymorphic ventricular tachycardia (CPVT) diagnosis. Although not a primary investigation for CPVT, Holter monitoring is ubiquitously used as a diagnostic tool in the heart rhythm clinic.
Objectives: The objective of this study was to explore Holter monitoring in CPVT diagnosis.
Methods: This retrospective cohort study analyzed off-therapy Holter monitoring from 13 ryanodine receptor 2-positive CPVT patients and 34 healthy control patients from the Canadian Hearts in Rhythm Organization national registry. Using the Edwards method, the ratio of ambient to maximum heart rate during Holter monitoring was correlated with exertion level to separate premature ventricular contractions (PVCs) occurring during periods of adrenergic and nonadrenergic stress. A receiver operating characteristic curve analysis determined the optimal threshold for isolating CPVT-induced PVCs during adrenergic states.
Results: PVC burden differed between groups (P = 0.001) but remained within population norms, suggesting that ambient PVCs are uncommon in CPVT. CPVT patients had higher PVC counts than healthy controls (P = 0.002), with a different distribution based on adrenergic state. The optimal threshold for separating PVCs into periods of adrenergic and nonadrenergic stress in CPVT patients was 76% of the maximum heart rate during the monitoring period. Compared with healthy controls, CPVT patients had a higher PVC count limited to periods of adrenergic stress, defined by a >76% maximum heart rate threshold (P = 0.002; area under the receiver operating characteristic curve: 0.84). Below this threshold, there was no significant difference in PVC count (P = 0.604).
Conclusions: Holter monitor PVC counts alone are inadequate for CPVT diagnosis, owing to the adrenergic nature of the disease. Quantifying PVC prevalence at heart rates >76% of maximum identified CPVT with moderate sensitivity (69%) and high specificity (94%).
Background: Atrial fibrillation (AF) is associated with impaired renal function and chronic kidney disease (CKD).
Objectives: This study assessed the effects of rhythm control on renal function compared with rate control among patients recently diagnosed with AF.
Methods: A total of 20,886 patients with AF and available baseline estimated glomerular filtration rate (eGFR) data undergoing rhythm control (antiarrhythmic drugs or ablation) or rate control therapy, initiated within 1 year of AF diagnosis in 2005 to 2015, were identified from the Korean National Health Insurance Service database. The composite outcome of ≥30% decline in eGFR, acute kidney injury, kidney failure, or death from renal or cardiovascular causes was compared between rhythm and rate control strategies with the use of propensity overlap weighting, in patients with or without significant CKD (eGFR <60 mL/min/1.73 m2).
Results: Of the included patients (median age 62 years, 32.7% female), 2,213 (10.6%) had eGFR <60 mL/min/1.73 m2. Among patients with significant CKD, early rhythm control, compared with rate control, was associated with a lower risk of the primary composite outcome (weighted incidence rate: 2.77 vs 3.92 per 100 person-years; weighted HR: 0.70; 95% CI: 0.52-0.95). In patients without significant CKD, there was no difference in the risk of the primary composite outcome between rhythm and rate control groups (weighted incidence rate: 3.41 vs 3.21 per 100 person-years; weighted HR: 1.06; 95% CI: 0.96-1.18). No differences in safety outcomes were found between rhythm and rate control strategies in patients without or with significant CKD.
Conclusions: Among patients with AF and CKD, early rhythm control was associated with lower risks of adverse renal outcomes than rate control.
Background: Most clinical trials define successful atrial fibrillation (AF) treatment as no AF episodes longer than 30 seconds. Yet, there has been minimal study of how patients define successful treatment and whether their perspectives align with trial outcomes.
Objectives: This study surveyed patients with AF to identify: 1) which aspect of AF is most important to address (frequency, duration, or severity of AF episodes); 2) what AF burden would be considered acceptable for treatment to be deemed successful; and 3) patient preferences for successful treatment thresholds on a validated patient-reported outcome (PRO) score.
Methods: We surveyed patients receiving active care for AF at a single tertiary care center, using a survey modeled after the Toronto AF Severity Scale (AFSS). The survey included current and "successful treatment" AF frequency, burden, and symptom domains, as well as baseline socioeconomic information.
Results: Of 7,000 invitations, 852 individuals completed the survey (12% response rate); mean age was 65 ± 13 years, 36.5% were female, and the mean CHA2DS2-VASc score was 2.9 ± 1.9. Overall, 114 (13%) selected a decrease in AF episode duration as their top treatment priority, 505 (59%) selected episode frequency, and 230 (27%) selected episode severity. Overall, 207 (24%) patients would only consider a treatment successful if they never had AF again, whereas 645 (76%) patients considered success to be fewer AF episodes. A total of 341 (40%) patients would only consider a treatment successful if AF episodes lasted less than a few minutes, whereas 509 (60%) patients would accept AF episodes lasting >30 minutes. An AFSS symptom score ≤5 was considered a good outcome by 80% of respondents.
Conclusions: Patients prioritize decreased AF frequency over improvements in severity or duration, and an AFSS symptom score ≤5 would be a reasonable outcome of AF treatment. Most patients would consider treatment successful even if they had more than 1 AF episode lasting longer than 30 seconds. Future clinical trials should consider patients' perspectives when designing outcomes.
Background: Although targeting atrial fibrillation (AF) drivers and substrates has been used as an effective adjunctive ablation strategy for patients with persistent AF (PsAF), it can result in iatrogenic scar-related atrial tachycardia (iAT) requiring additional ablation. Personalized atrial digital twins (DTs) have been used preprocedurally to devise ablation targeting that eliminates the arrhythmogenic propensity of the fibrotic substrate, and they could potentially be used to predict and prevent postablation iAT.
Objectives: In this study, the authors sought to explore possible alternative configurations of ablation lesions that could prevent iAT occurrence with the use of biatrial DTs of prospectively enrolled PsAF patients.
Methods: Biatrial DTs were generated from late gadolinium enhancement-magnetic resonance images of 37 consecutive PsAF patients, and the fibrotic substrate locations in the DTs capable of sustaining reentries were determined. These locations were ablated in the DTs as a single compound region of ablation at normal power (SSA), and postablation iAT occurrence was determined. At locations where iAT occurred, ablation of the same DT target was repeated, applying multiple lesions of reduced strength (MRA) instead of SSA.
Results: Eighty-three locations in the fibrotic substrates of 28 personalized biatrial DTs were capable of sustaining reentries and were thus targeted for SSA ablation. Of these ablations, 45 resulted in iAT. Repeating the ablation at these targets with MRA instead of SSA resulted in the prevention of iAT occurrence at 15 locations (18% reduction in the rate of iAT occurrence).
Conclusions: Personalized atrial DTs enable preprocedure prediction of iAT occurrence after ablation in the fibrotic substrate. These findings also suggest that MRA could be a potential strategy for preventing postablation iAT.