Swept-source OCT angiography (SS-OCTA) scans of eyes with age-related macular degeneration (AMD) were used to replace color, autofluorescence, infrared reflectance, and dye-based fundus angiographic imaging for the diagnosis and staging of AMD. By applying different algorithms to the SS-OCTA scans, both structural and angiographic information can be viewed and assessed using both cross-sectional and en face imaging strategies.
Presented at the 2022 Charles L. Schepens, MD, Lecture at the American Academy of Ophthalmology Retina Subspecialty Day, Chicago, Illinois, on September 30, 2022.
Patients with AMD.
Review of published literature and ongoing clinical research using SS-OCTA imaging in AMD.
Swept-source OCT angiography imaging of AMD at different stages of disease progression.
Volumetric SS-OCTA dense raster scans were used to diagnose and stage both exudative and nonexudative AMD. In eyes with nonexudative AMD, a single SS-OCTA scan was used to detect and measure structural features in the macula such as the area and volume of both typical soft drusen and calcified drusen, the presence and location of hyperreflective foci, the presence of reticular pseudodrusen, also known as subretinal drusenoid deposits, the thickness of the outer retinal layer, the presence and thickness of basal laminar deposits, the presence and area of persistent choroidal hypertransmission defects, and the presence of treatment-naïve nonexudative macular neovascularization. In eyes with exudative AMD, the same SS-OCTA scan pattern was used to detect and measure the presence of macular fluid, the presence and type of macular neovascularization, and the response of exudation to treatment with vascular endothelial growth factor inhibitors. In addition, the same scan pattern was used to quantitate choriocapillaris (CC) perfusion, CC thickness, choroidal thickness, and the vascularity of the choroid.
Compared with using several different instruments to perform multimodal imaging, a single SS-OCTA scan provides a convenient, comfortable, and comprehensive approach for obtaining qualitative and quantitative anatomic and angiographic information to monitor the onset, progression, and response to therapies in both nonexudative and exudative AMD.
Proprietary or commercial disclosure may be found in the Footnotes and Disclosures at the end of this article.
In this study, we identify risk factors that predict the progression of acquired vitelliform lesions (AVLs) over time.
Retrospective cohort study.
One hundred sixty-three eyes of 132 patients with a diagnosis of intermediate age-related macular degeneration (iAMD) with AVL.
This retrospective study evaluated consecutive eyes with AMD from a retina clinic population and included 1181 patients and 2362 eyes. After excluding cases with associated geographic atrophy, macular neovascularization (MNV), vitreomacular traction, and those with <2 years of follow-up data, the final analysis cohort consisted of 163 eyes (132 patients) with ≥1 AVL. The first available visit in which an AVL was evident was considered the baseline visit, and follow-up data were collected from a visit 2 years (± 3 months) later. Progression outcomes at the follow-up visit were classified into 6 categories: resorbed, collapsed, MNV, stable, increasing, and decreasing. Subsequently, we analyzed the baseline characteristics for each category and calculated odds ratios (ORs) to predict these various outcomes.
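The odds ratio calculation described above can be sketched from a 2 × 2 table. The counts below are hypothetical, chosen only to illustrate the cross-product formula and its Wald confidence interval; they are not the study's data:

```python
import math

# Hypothetical 2 x 2 table (illustrative counts, not study data):
# rows = SDD present / absent; columns = progressed to atrophy / resorbed
a, b = 30, 20   # SDD present: atrophy, resorbed
c, d = 25, 47   # SDD absent:  atrophy, resorbed

odds_ratio = (a * d) / (b * c)                 # cross-product ratio
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)   # standard error of ln(OR)
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
print(f"OR = {odds_ratio:.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")
```

A confidence interval for the OR is built on the log scale because ln(OR) is approximately normally distributed; the study itself reports ORs with P values rather than intervals.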
The study focused on identifying predictive factors influencing the evolution of AVL in iAMD eyes.
In total, 163 eyes with AVL had follow-up data at 2 years. The collapsed group demonstrated a significantly greater baseline AVL height and width compared with other groups (P < 0.001). With regard to qualitative parameters, subretinal drusenoid deposits (SDDs) and intraretinal hyperreflective foci (IHRF) at the eye level, AVL located over drusen, and IHRF and external limiting membrane disruption over AVL were significantly more prevalent in the collapsed group compared with other groups (P < 0.05 for all comparisons). Odds ratios for progressing to atrophy after 2 years of follow-up, compared with the resorbed group, were significant for SDD (OR, 2.82; P = 0.048) and AVL height (OR, 1.016; P = 0.006).
The presence of SDDs and greater AVL height significantly increases the risk of developing atrophy at the location of AVL after 2 years of follow-up. These findings may be of value in risk prognostication and defining patient populations for inclusion in future early intervention trials aimed at preventing progression to atrophy.
To compare 1-year outcomes of eyes with diabetic macular edema (DME) treated in routine clinical practice based on the proportion of visits at which intravitreal VEGF inhibitor injections were delivered.
Cohort study.
There were 2288 treatment-naive eyes with DME starting intravitreal VEGF inhibitor therapy from October 31, 2015 to October 31, 2021 from the Fight Retinal Blindness! international outcomes registry.
Eyes were grouped according to the proportion of visits at which an injection was received: Group A, less than the median of 67% (n = 1172), versus Group B, greater than the median (n = 1116).
Mean visual acuity (VA) change after 12 months of treatment.
The mean (95% confidence interval [CI]) VA change after 12 months of treatment was 3.6 (2.8–4.4) letters for eyes in Group A versus 5.2 (4.4–5.9) letters for eyes in Group B (P = 0.005). The mean (95% CI) central subfield thickness (CST) change was −69 (−76 to −61) μm and −85 (−92 to −78) μm for eyes in Group A versus Group B, respectively (P = 0.002). A moderate positive correlation was observed between the number of injections received over 12 months of treatment and the change in VA (P < 0.001). Additionally, eyes that received more injections had a moderately greater CST reduction.
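The injection-count versus VA-change correlation reported above is a standard Pearson coefficient, which can be sketched as follows. The paired series is invented purely to demonstrate the computation (the toy numbers happen to correlate strongly, whereas the registry reported a moderate correlation):

```python
# Hypothetical paired observations (illustrative only, not registry data):
# number of injections over 12 months vs. change in VA letters
injections = [4, 5, 6, 7, 8, 9, 10, 11]
va_change  = [1, 2, 2, 4, 4, 6, 5, 8]

n = len(injections)
mean_x = sum(injections) / n
mean_y = sum(va_change) / n

# Pearson r = covariance / (sd_x * sd_y), computed from raw sums of squares
cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(injections, va_change))
sx = sum((x - mean_x) ** 2 for x in injections) ** 0.5
sy = sum((y - mean_y) ** 2 for y in va_change) ** 0.5
r = cov / (sx * sy)
print(f"r = {r:.2f}")
```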
This registry analysis found that overall VA and anatomic outcomes tended to be better in DME eyes treated at a greater proportion of visits in the first year of intravitreal VEGF inhibitor therapy.
To compare the clinical implications of central bouquet hemorrhages (CBHs) with those of primarily subretinal hemorrhages, both occurring in the setting of pathologic myopia with lacquer crack formation.
Multicenter retrospective cohort study.
Twenty-five eyes (11 primarily subretinal hemorrhages and 14 CBH) were monitored over a median of 35 (interquartile range [IQR], 9.50–54) months.
Comprehensive ophthalmic examinations and OCT were reviewed. The study employed linear mixed-effects models to compare the impact of CBH versus primarily subretinal hemorrhages on baseline visual acuity (VA), rate of VA improvement, and final VA, adjusting for the follow-up period. Time to hemorrhage reabsorption and the rate of ellipsoid zone (EZ) layer disruption on OCT were recorded.
Eyes with CBH exhibited significantly worse baseline VA (0.93 ± 0.45 logarithm of the minimum angle of resolution [logMAR]; 20/160 Snellen vs. 0.36 ± 0.26 logMAR; 20/50 Snellen; P < 0.001), a slower rate of VA improvement (P = 0.04), and a trend toward worse final VA (0.48 ± 0.47 logMAR; 20/60 Snellen vs. 0.16 ± 0.16 logMAR; 20/30 Snellen; P = 0.06) compared with eyes with primarily subretinal hemorrhages. The CBH group experienced longer median reabsorption times (10 [IQR, 4.6–23.3] months vs. 2.3 [IQR, 2–3.2] months), and a higher prevalence of EZ layer disruption (86% vs. 0%), than the group with primarily subretinal hemorrhages. Central bouquet hemorrhage reabsorption was followed by the appearance of vertical hyperreflective lines in the central fovea in 67% of eyes, persisting for up to 6 years of follow-up.
Central bouquet hemorrhage signifies a distinct condition in pathologic myopia, characterized by worse visual outcomes, prolonged structural impact, and possible irreversible damage, compared with primarily subretinal hemorrhages. Central bouquet hemorrhage regression should be taken into account in the differential diagnosis of vertical hyperreflective lesions in the central fovea on OCT in eyes with pathologic myopia.
In this study, we aimed to characterize the frequency and distribution of ocular surgeries in patients with inherited retinal diseases (IRDs) and evaluate associated patient and disease factors.
Retrospective cohort.
Subjects aged ≥ 18 years who were followed at the Johns Hopkins Genetic Eye Disease Center.
We studied a retrospective cohort of patients with an IRD diagnosis to analyze the occurrence of laser and incisional surgeries. Subjects were categorized into 2 groups: central dysfunction (macular/cone/cone-rod dystrophy, “MCCRD group”) and panretinal or peripheral dysfunction (retinitis pigmentosa–like, “RP group”). Genetic testing status was recorded. The association of patient and disease factors on the frequency, distribution, and timing of surgeries was analyzed.
Prevalence, prevalence odds ratio (POR), hazard ratio (HR) of ophthalmic procedures by phenotype.
A total of 1472 eyes of 736 subjects were evaluated. Among them, 31.3% (n = 230) had undergone ocular surgery, and 78.3% of those (n = 180/230) had a history of more than 1 surgery. A total of 602 surgical procedures were analyzed. Cataract extraction with intraocular lens implantation (CEIOL) was the most common (51.2%), followed by yttrium aluminum garnet (YAG) laser capsulotomy, refractive surgery, retinal surgery, and others. Cataract extraction with intraocular lens implantation occurred more frequently in RP than in MCCRD subjects (POR, 2.59; P = 0.002). Subjects in the RP group underwent CEIOL at a younger age than those in the MCCRD group (HR, 2.11; P < 0.001).
Approximately one-third of patients with IRD had a history of laser or incisional surgery. Cataract extraction with intraocular lens implantation was the most common surgery; its frequency and timing may be associated with the IRD phenotype. These data may inform the design of prospective research. Such efforts may illuminate routine clinical decision-making and contribute to surgical strategy development for cell and gene therapy delivery.
To describe regional variation in microbes causing infectious endogenous endophthalmitis (EE) in the United States.
This is a retrospective analysis of the 2002–2014 National Inpatient Sample, a national inpatient database.
Using International Classification of Diseases, Ninth Revision (ICD-9) codes, we identified cases of EE. Cases were stratified by region into Northeast, South, West, or Midwest.
Unadjusted chi-square analysis followed by adjusted multivariate logistic regression was performed to evaluate variation in demographic factors, comorbidities using the Elixhauser Comorbidity Index (ECI), microbial variation, mortality, and use of vitrectomy or enucleation by region.
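The unadjusted chi-square comparison described above tests whether an outcome's distribution differs across the 4 regions. A minimal sketch, using a hypothetical 2 × 4 contingency table (the counts are illustrative, not study data):

```python
# Hypothetical 2 x 4 contingency table (illustrative counts, not study data):
# rows = vitrectomy performed yes / no
# columns = Northeast, Midwest, South, West
obs = [[120, 140, 290, 160],
       [1943, 2005, 3844, 2410]]

row_totals = [sum(row) for row in obs]
col_totals = [sum(col) for col in zip(*obs)]
grand_total = sum(row_totals)

# Pearson chi-square statistic: sum of (observed - expected)^2 / expected,
# where expected[i][j] = row_total[i] * col_total[j] / grand_total
chi2 = sum(
    (obs[i][j] - row_totals[i] * col_totals[j] / grand_total) ** 2
    / (row_totals[i] * col_totals[j] / grand_total)
    for i in range(len(obs))
    for j in range(len(obs[0]))
)
dof = (len(obs) - 1) * (len(obs[0]) - 1)  # (rows - 1) * (cols - 1) = 3
print(f"chi-square = {chi2:.2f} on {dof} degrees of freedom")
```

The statistic is then compared against the chi-square distribution with 3 degrees of freedom to obtain a P value; the study's adjusted analysis instead uses multivariate logistic regression to control for demographics and comorbidities.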
Proportion of microbes, mortality, and vitrectomy by region in addition to factors with significant odds ratios for mortality and for in-hospital vitrectomy.
A total of 10 912 patients with infectious EE were identified, with 2063 cases in the Northeast (18.9%), 2145 cases in the Midwest (19.7%), 4134 cases in the South (37.9%), and 2570 cases in the West (23.6%). Chi-square analysis indicated significant regional variation in patient demographics, microbes causing the infection, ECI, mortality, and surgical intervention. The 4 most common microbes for all regions were methicillin-sensitive Staphylococcus aureus (MSSA), Streptococcus, Candida, and methicillin-resistant Staphylococcus aureus. Methicillin-sensitive S. aureus was the most common cause of EE in all regions, although the proportion of MSSA infection did not significantly vary by region (P = 0.03). Further, there was significant regional variation in the proportion of other microbes causing the infection (P < 0.001). Higher rates of vitrectomies were seen in the South and Midwest regions than in the Northeast and West (P = 0.04).
Regional variation exists in the infectious microbes causing EE. Further studies are needed to elucidate the etiology of these variations.