Background: Migraine is a highly prevalent and incapacitating neurological disorder associated with the highest global disability burden in people aged 15 to 49 years. Europe has the fourth-highest prevalence of migraine, after North America, South America, and Central America, and above Asia and Africa. Migraine leads to relatively modest direct healthcare expenditure but has substantial indirect costs due to reduced productivity. Methods: The economic burden of migraine was estimated in comparison with the general population of the United Kingdom (UK) using an analytical fiscal modeling framework applying a government cost perspective. Published measures of migraine's impact on labor participation were applied to rates of economic activity and inactivity in the general population. The model estimates lifetime changes in earnings from employment, direct and indirect taxes paid, and financial support requirements over the life course. Incremental differences between those affected and unaffected by migraine are reported as net fiscal consequences to public accounts. Fiscal costs are reported as the discounted average per capita over a 20-year time horizon and for the entire annual UK cohort with prevalent migraine. Results: People affected by migraine are more likely to be absent from work, unemployed, and disabled, and to retire early. For a 44-year-old individual, migraine was associated with £19 823 in excess fiscal costs to the UK government, or £1379 per year lived with the condition, compared with someone unaffected by the disease. Annually, migraine was estimated to cost the public economy £12.20 billion, approximately £130.63 per migraine episode. The model predicted annual productivity losses in the health and social care workforce of £2.05 billion and total annual productivity losses of over £5.81 billion.
Conclusions: This fiscal analysis monetizes the occupational consequences of migraine to the UK government, both in terms of lost tax revenue and transfer payments. The findings are substantial and useful to characterize disease severity and to inform the body of evidence considered by decision makers appraising the cost-effectiveness of health technologies.
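The fiscal modeling approach described above, summing lost tax revenue and additional transfer payments and discounting them over the time horizon, can be sketched as follows. All cash flows and the discount rate below are illustrative assumptions, not figures from the study:

```python
# Minimal sketch of a government-perspective fiscal model: the net fiscal
# consequence of a condition is the discounted sum of lost tax revenue plus
# extra transfer payments, relative to an unaffected peer.
# All inputs are illustrative assumptions, not study data.

def net_fiscal_cost(lost_taxes, extra_transfers, discount_rate=0.035, years=20):
    """Discounted per-capita excess fiscal cost over the time horizon."""
    total = 0.0
    for t in range(years):
        annual_gap = lost_taxes[t] + extra_transfers[t]  # cost to public accounts in year t
        total += annual_gap / (1 + discount_rate) ** t   # discount to present value
    return total

# Illustrative flat annual gaps (GBP): £900 lost taxes, £600 extra transfers
lost = [900.0] * 20
transfers = [600.0] * 20
print(round(net_fiscal_cost(lost, transfers), 2))
```

Incremental results per capita would then be scaled to the annual prevalent cohort to obtain a population-level figure, as the abstract reports.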
Background: Despite design enhancements in endocutters, key challenges related to limited surgical access and space can impact stapling and, potentially, surgical outcomes. Objectives: This study aimed to develop consensus statements outlining the clinical value of precise articulation and greater anatomical access in minimally invasive surgery performed by bariatric, colorectal, and thoracic surgeons. Methods: Colorectal, bariatric, and thoracic surgeons from Japan, the United States, the United Kingdom, and France participated in a 2-round modified Delphi panel. Round 1 included binary, Likert scale-type, multiple-response, and open-ended questions. These were converted to affirmative statements for round 2 if sufficient agreement was reached. Consensus was set at a predefined threshold of at least 90% of panelists across all surgical specialties and regions selecting the same option ("agree" or "disagree") for the affirmative statements. Results: Of the 49 statements in the round 2 questionnaire, panelists (n=135) reached consensus that (1) tissue slippage outside stapler jaws can occur due to limited access and space; (2) greater jaw aperture could help to manipulate thick or fragile tissue more easily; (3) articulation of an endocutter is clinically important in laparoscopic surgeries; (4) improved access to hard-to-reach targets and in limited space would improve safety; and (5) an endocutter with improved access through greater articulation would become commonly used. Discussion: By understanding user-specific challenges and needs from both specialty- and region-wide perspectives, endoscopic stapling devices can continue to be refined. Improved articulation and greater jaw aperture were the key stapler design features examined in this study; these features may mitigate the risk of instrument clashes and of intraoperative complications such as anastomotic leaks.
Conclusions: This study gained insights into surgeons' perspectives across a variety of specialties and from 3 distinct geographies. Participating surgeons reached consensus that an endocutter with greater jaw aperture and articulation may improve surgical access and has the potential to improve surgical outcomes.
Background: Patients with advanced or recurrent endometrial cancer (EC) have limited treatment options and poor prognosis following platinum-based chemotherapy. The single-arm, Phase I GARNET trial (NCT02715284) previously reported dostarlimab efficacy in mismatch repair-deficient/microsatellite instability-high advanced or recurrent EC. Objectives: The objective of this study was to compare overall survival (OS) and describe time to treatment discontinuation (TTD) for dostarlimab (GARNET Cohort A1 safety population) with an equivalent real-world external control arm receiving non-anti-programmed death (PD)-1/PD-ligand (L)1/2 treatments (constructed using data from a nationwide electronic health record-derived de-identified database, with GARNET eligibility criteria applied). Methods: Propensity scores constructed from prognostic factors, identified by literature review and clinical experts, were used for inverse probability of treatment weighting (IPTW). Kaplan-Meier curves were constructed and OS and TTD were estimated (a Cox regression model was used to estimate the adjusted OS hazard ratio). Results: Dostarlimab was associated with a 52% lower risk of death vs real-world treatments (hazard ratio, 0.48; 95% confidence interval [CI], 0.35-0.66). IPTW-adjusted median OS for dostarlimab (N=143) was not estimable (95% CI, 19.4-not estimable) versus 13.1 months (95% CI, 8.3-15.9) for real-world treatments (N=185). Median TTD was 11.7 months (95% CI, 6.0-38.7) for dostarlimab and 5.3 months (95% CI, 4.1-6.0) for the real-world cohort. Discussion: Consistent with previous analyses, patients treated with dostarlimab had significantly longer OS than patients in the US real-world cohort after adjusting for the lack of randomization using stabilized IPTW. Additionally, patients had a long TTD when treated with dostarlimab, suggesting a favorable tolerability profile.
Conclusion: Patients with advanced or recurrent EC receiving dostarlimab in GARNET had significantly lower risk of death than those receiving real-world non-anti-PD-(L)1/2 treatments.
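The stabilized IPTW adjustment described above can be sketched as follows. The propensity scores here are illustrative toy values; in the study they were estimated from prognostic factors identified by literature review and clinical experts:

```python
# Sketch of stabilized inverse-probability-of-treatment weights (IPTW),
# the adjustment used to compare a trial arm with a real-world control arm
# in the absence of randomization. Propensity scores below are toy values.

def stabilized_iptw(treated, propensity):
    """Stabilized weight: P(treated)/ps for treated, P(control)/(1-ps) for controls."""
    p_treat = sum(treated) / len(treated)  # marginal probability of treatment
    weights = []
    for z, ps in zip(treated, propensity):
        if z == 1:
            weights.append(p_treat / ps)
        else:
            weights.append((1 - p_treat) / (1 - ps))
    return weights

# Illustrative toy cohort: 2 treated, 2 controls
w = stabilized_iptw([1, 1, 0, 0], [0.8, 0.5, 0.4, 0.2])
print([round(x, 3) for x in w])
```

The weighted sample is then analyzed with weighted Kaplan-Meier curves and a weighted Cox model; stabilization keeps the weighted sample size close to the original and tempers extreme weights.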
Background: Closed claims are frequently used in outcomes research studies. Lately, the availability of open claims has increased the possibility of obtaining information faster and on a larger scale. However, because of the possibility of missing claims and duplications, these data sets have been underutilized in medical research. Objective: To compare frequently used healthcare utilization measures between closed claims and open claims, to assess whether the possibility of missing claims in open claims data creates a downward bias in the estimates. Methods: We identified 18 different diseases using 2022 data from 2 closed claims data sets (MarketScan® and PharMetrics® Plus) and 1 open claims database (Kythera). After applying an algorithm that removes possible duplications from open claims data, we compared healthcare utilization measures such as inpatient, emergency department, and outpatient use and length of stay across these 3 data sets. We applied standardized differences to compare the medians for each outcome. Results: The sample size of the open claims data set was 10 to 65 times larger than that of the closed claims data sets, depending on disease type. For each disease, the estimates of healthcare utilization were similar between the open claims and closed claims data, and the differences were not statistically significant. Conclusions: Open claims data, with larger sample sizes and more current information, provide essential advantages for healthcare outcomes research studies. Therefore, especially for new medications and rare diseases, open claims data can provide information much earlier than closed claims, which usually have a time lag of 6 to 8 months.
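A standardized difference of the kind used above to compare utilization between data sets can be sketched as follows. The study compared medians; this sketch uses the common mean-and-SD form of the statistic, and the inputs are toy values, not study results:

```python
import math

# Sketch of a standardized difference for a continuous utilization measure
# (e.g., length of stay) between two claims cohorts. A common rule of thumb
# treats |d| < 0.1 as a negligible difference. Inputs are illustrative.

def standardized_difference(mean1, sd1, mean2, sd2):
    """Difference in means divided by the pooled standard deviation."""
    pooled_sd = math.sqrt((sd1 ** 2 + sd2 ** 2) / 2)
    return (mean1 - mean2) / pooled_sd

# Illustrative: mean length of stay 4.2 vs 4.0 days, SD 2.5 in both cohorts
d = standardized_difference(4.2, 2.5, 4.0, 2.5)
print(round(d, 3))
```

Unlike a p-value, the standardized difference does not grow with sample size, which makes it suitable for comparing an open claims cohort 10 to 65 times larger than its closed claims counterpart.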
Background: Compression therapy is the gold standard for the treatment of chronic venous insufficiency (CVI). Two-layer bandage (2LB) systems have been shown to be a safe and effective treatment option. Objective: To estimate the total cost per response (CPR) for the resolution of edema and wounds in patients with CVI treated with a 2LB system as part of their overall wound healing regimen. Methods: A probabilistic decision tree model was developed to estimate the incremental CPR for a 2LB system. The model simulated 10 000 patients to estimate the CPR for the resolution of edema and wound healing. The analysis was performed using clinical data from a published single-arm, multicenter prospective study of patients with CVI indicated for compression therapy. The response outcomes of interest were resolution of edema and rate of wound healing. The follow-up time was a maximum of 6 weeks, and the perspective of the study was a US outpatient treatment center. Economic data for compression therapy were based on the public prices of a 2LB system. Dressing changes occurred per manufacturer instructions for use. Results: The study comprised 702 patients (56% female), with a total of 414 wounds. The median duration of the wounds was 42 days, and the median size at the initial visit was 3.5 cm2. Average pain, measured on a visual analog scale, decreased by 67%. Bandages were typically changed once or twice a week (51.7%). Wound healing occurred in 128 of the 414 wounds (30.9%). The expected incremental CPR of a 2LB system for the resolution of edema was $65.67 (range, $16.67-$124.32). The expected incremental CPR of a 2LB system for the healing of a wound was $138.71 (range, $35.71-$273.53). Conclusion: This economic evaluation complements previous clinical effectiveness and safety studies of 2LB systems for the treatment of CVI. The results demonstrate that the costs of incorporating 2LB into standard wound-healing protocols are negligible compared with overall treatment costs.
Two-layer bandages may be considered a cost-effective first-line system for the treatment of wounds caused by CVI.
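The cost-per-response metric reported above divides expected treatment cost by the probability of achieving the response. The sketch below illustrates that calculation; the per-patient bandage cost is an illustrative assumption, not the study's price input, while the 30.9% healing rate is the rate reported above:

```python
# Sketch of a cost-per-response (CPR) calculation: the expected cost
# incurred per responding patient. The per-patient cost below is an
# illustrative assumption; the response rate is the reported 30.9%.

def cost_per_response(cost_per_patient, response_rate):
    """Expected cost per patient achieving the response outcome."""
    return cost_per_patient / response_rate

# Illustrative: $40 of 2LB product per patient, 30.9% wound-healing rate
print(round(cost_per_response(40.0, 0.309), 2))
```

In the study's probabilistic model, the cost and response-rate inputs are drawn from distributions over 10 000 simulated patients, which is what produces the reported CPR ranges rather than a single point estimate.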