Blog Archive

Αλέξανδρος Γ. Σφακιανάκης

Monday, December 28, 2020

Nursing

Building peace amid the storm
No abstract available

Drug News
No abstract available

Communicating about surgical care safety during the pandemic
No abstract available

Name That Strip
No abstract available

A student-led quality improvement project on fall prevention
No abstract available

Clinical Rounds
No abstract available

Idiopathic pulmonary fibrosis: What nurses need to know
Abstract: Idiopathic pulmonary fibrosis (IPF) is a restrictive lung disease in which the cause cannot be determined. This article discusses restrictive lung diseases that fall under the general category of interstitial lung disease with a focus on IPF—a fatal disease characterized by progressive fibrosis and interstitial pneumonia, dyspnea, and decreasing pulmonary function.

Managing pain in seriously ill patients with substance use disorders
Abstract: Managing pain can be challenging, especially in patients with serious illnesses and a history of substance use disorders. This article discusses the challenges of addressing pain in these patients and offers perspectives regarding their clinical management.


Orthopedics

Altmetrics Attention Scores for Randomized Controlled Trials in Total Joint Arthroplasty Are Reflective of High Scientific Quality: An Altmetrics-Based Methodological Quality and Bias Analysis
Introduction: The Altmetric Attention Score (AAS) has been associated with citation rates across medical and surgical disciplines. However, factors that drive a high AAS remain poorly understood, and there remain multiple pitfalls in correlating these metrics alone with the quality of a study. The purpose of the current study was to determine the relationship between methodologic and study biases and the AAS in randomized controlled trials (RCTs) published in total joint arthroplasty journals. Methods: All RCTs from 2016 published in The Journal of Arthroplasty, The Bone and Joint Journal, The Journal of Bone and Joint Surgery, Clinical Orthopaedics and Related Research, The Journal of Knee Surgery, Hip International, and Acta Orthopaedica were extracted. Methodologic bias was graded with the JADAD scale, whereas study bias was graded with the Cochrane risk of bias tool for RCTs. Publication characteristics, social media attention (Facebook, Twitter, and Mendeley), AAS, citation rates, and bias were analyzed. Results: A total of 42 articles were identified. The mean (±SD) citations and AAS per RCT were 17.8 ± 16.5 (range, 0 to 78) and 8.0 ± 15.4 (range, 0 to 64), respectively. The mean JADAD score was 2.6 ± 0.94. No statistically significant differences were observed in the JADAD score or total number of study biases when compared across the seven journals (P = 0.57 and P = 0.27). Higher JADAD scores were significantly associated with higher AAS scores (β = 6.7, P = 0.006) but not citation rate (P = 0.16). The mean number of study biases was 2.0 ± 0.93 (range, 0 to 4). A greater total number of study biases was significantly associated with lower AAS scores (β = −8.0, P < 0.001) but not citation rate (P = 0.10). The AAS was a significant and positive predictor of citation rate (β = 0.43, P = 0.019). Conclusion: High methodologic quality and limited study bias markedly contribute to the AAS of RCTs in the total joint arthroplasty literature. The AAS may be used as a proxy measure of scientific quality for RCTs, although readers should still critically appraise these articles before making changes to clinical practice.

Exploring Alternative Sites for Glenoid Component Fixation Through Three-Dimensional Digitization of the Glenoid Vault: An Anatomic Analysis
Introduction: Glenoid component loosening has remained one of the most common complications of total shoulder arthroplasty. Three-dimensional modeling of the glenoid may reveal novel information regarding glenoid vault morphology, providing a foundation for implant designs that possess the potential to extend the survivorship of the prosthesis. Methods: A three-dimensional digitizer was used to digitize the glenoids of 70 cadaveric scapulae. We identified ideal position, fit, and maximum diameter for cylinders of 5, 10, and 15 mm depths. Maximum diameter and volume were also measured at the glenoid center, and the data were compared. Results: The vault region that accommodates the greatest diameter and volume for 5, 10, and 15 mm depth cylinders was identified in the postero-inferior glenoid. Across all specimens, this region accommodated a cylinder diameter that was 24.82%, 40.45%, and 50.34% greater than that achieved at the glenoid center for 5, 10, and 15 mm depth cylinders, respectively (all, P < 0.0001). The location of this site remains reliable for each cylinder depth, regardless of sex. Discussion: This study presents novel findings pertaining to glenoid morphology through the analysis of a newly characterized glenoid vault region. This region has not been identified or described previously and has potential to serve as an alternative to the glenoid center for peg or baseplate fixation. Our method of vault analysis and findings may be used to guide further research regarding pathologic glenoid anatomy, providing a foundation for alternative approaches to glenoid prosthesis fixation in total shoulder arthroplasty and related procedures.

Prevalence and Treatment of Osteoporosis Prior to Elective Shoulder Arthroplasty
Introduction: The rate of preoperative osteoporosis in lower extremity arthroplasty is 33%. The prevalence of osteoporosis in shoulder arthroplasty patients is inadequately studied. The purpose of this study was to (1) determine the prevalence of osteoporosis in patients undergoing elective shoulder arthroplasty, (2) report the percentage of patients having dual-energy x-ray absorptiometry (DEXA) testing before surgery, and (3) determine the percentage of patients who have been prescribed osteoporosis medications within 6 months before or after surgery. Methods: This retrospective case series included all adults aged 50 years and older who underwent elective shoulder arthroplasty at a single tertiary care center over an 8-year period. National Osteoporosis Foundation (NOF) criteria for screening and treatment were applied. Results: Two hundred fifty-one patients met the inclusion criteria; 171 (68%) met the criteria for DEXA testing, but only 31 (12%) had this testing within 2 years preoperatively. Eighty patients (32%) met the NOF criteria for receipt of pharmacologic osteoporosis treatment, and 17/80 (21%) received a prescription for pharmacotherapy. Discussion: Two-thirds of elective shoulder arthroplasty patients meet the criteria to have bone mineral density measurement done, but less than 20% have this done. One in three elective shoulder arthroplasty patients meet the criteria to receive osteoporosis medications, but only 20% of these patients receive therapy.

Tönnis Angle and Acetabular Retroversion Measurements in Asymptomatic Hips Are Predictive of Future Hip Pain: A Retrospective, Prognostic Clinical Study
Background: This study evaluated the prevalence of radiographic abnormalities potentially indicative of femoroacetabular impingement on AP pelvic radiographs in asymptomatic adolescents and young adults and aimed to determine whether the abnormalities were predictive of future hip pain. Methods: AP pelvis images from scoliosis radiographs were obtained from patients 12 to 25 years of age free of any clinical hip/lower extremity symptoms between January 2006 and September 2009. The following radiographic abnormalities were collected: lateral center-edge angle of Wiberg >40° or <25°, Tönnis angle <0° or >10°, acetabular retroversion (crossover sign with a posterior wall sign), acetabular overcoverage (crossover sign without a posterior wall sign), and anterior offset alpha angle, calculated using alpha angle of Nötzli >50°. Patients were retrospectively followed (average 3.11 years) to identify those who subsequently developed hip pain. Results: Of the 233 patients (466 hips) who were asymptomatic at the time of radiographic evaluation, at least one radiographic abnormality was present in 60% (281/466) of the hips. Within that group of hips (n = 281), 69% (195/281) of hips demonstrated a single abnormality, whereas 31% (86/281) of hips were associated with multiple abnormalities. Among all hips (n = 466), a lateral center-edge angle <25° or >40° was the most common radiographic abnormality, present in 27% (127/466) of hips. Anterior offset alpha angle and acetabular overcoverage were the most common abnormalities to present together, found in 5% (25/466) of hips. In the multivariable model, a decreasing Tönnis angle (hazard ratio per 1-degree decrease: 1.25, 95% confidence interval, 1.10–1.42, P = 0.0006) and the presence of acetabular retroversion (hazard ratio: 3.55, 95% confidence interval, 1.15–10.95, P = 0.0272) were predictive of the development of future hip pain. Conclusions: Our study demonstrates a high prevalence of radiographic abnormalities indicative of femoroacetabular impingement in asymptomatic adolescents and young adults. A decrease in Tönnis angle and the presence of acetabular retroversion were predictive of future hip pain.

Risk Factors Associated With Infection in Open Fractures of the Upper and Lower Extremities
Introduction: Open fractures are associated with a high risk of infection. The prevention of infection is the single most important goal, influencing perioperative care of patients with open fractures. Using data from 2,500 participants with open fracture wounds enrolled in the Fluid Lavage of Open Wounds trial, we conducted a multivariable analysis to determine the factors that are associated with infections 12 months postfracture. Methods: Eighteen predictor variables were identified for infection a priori from baseline data, fracture characteristics, and surgical data from the Fluid Lavage of Open Wounds trial. Twelve predictor variables were identified for deep infection, which included both surgically and nonoperatively managed infections. We used multivariable Cox proportional hazards regression analyses to identify the factors associated with infection. Irrigation solution and pressure were included as variables in the analysis. The results were reported as adjusted hazard ratios (HRs), 95% confidence intervals (CIs), and associated P values. All tests were two tailed with alpha = 0.05. Results: Factors associated with any infection were fracture location (tibia: HR 5.13 versus upper extremity, 95% CI 3.28 to 8.02; other lower extremity: HR 3.63 versus upper extremity, 95% CI 2.38 to 5.55; overall P < 0.001), low energy injury (HR 1.64, 95% CI 1.08 to 2.46; P = 0.019), degree of wound contamination (severe: HR 2.12 versus mild, 95% CI 1.35 to 3.32; moderate: HR 1.08 versus mild, 95% CI 0.78 to 1.49; overall P = 0.004), and need for flap coverage (HR 1.82, 95% CI 1.11 to 2.99; P = 0.017). Discussion: The results of this study provide a better understanding of which factors are associated with a greater risk of infection in open fractures. In addition, these findings can help surgeons better counsel patients regarding prognosis and help patients understand their individual risk of infection.

Immediate Versus Delayed Hip Arthroscopy for Femoroacetabular Impingement: An Expected Value Decision Analysis
Introduction: Hip arthroscopy is an increasingly used surgical procedure for both intra- and extra-articular hip pathologies, including femoroacetabular impingement (FAI). Although the arthroscopic approach is known to be preferable to an open approach, the optimal timing of such intervention is unclear. The purpose of this study was to carry out an expected value decision analysis of immediate versus delayed hip arthroscopy for FAI. We hypothesized that immediate hip arthroscopy is the preferable treatment option. Methods: An expected value decision analysis was implemented to systematize the decision-making process between immediate and delayed hip arthroscopies. A decision tree was created with options for immediate and delayed surgeries, with utilities characterizing each state obtained from surveying 70 patients. Fold-back analysis was then carried out, calculating expected values by multiplying the utility of each health outcome by the probability of that outcome. Corresponding expected values were then summed to "fold back" the decision tree one layer at a time. This was repeated until overall expected values (0 to 100) for immediate and delayed hip arthroscopies resulted, with the higher value indicating the preferable option. Results: Fold-back analysis demonstrated that immediate hip arthroscopy is the preferred treatment for FAI over delayed, with expected values of 78.27 and 72.63, respectively. Restoration of good function after hip arthroscopy was the most notable contributor to this difference. Immediate hip arthroscopy remained superior even as vast adjustments to preoperative physical function were made in one-way sensitivity analysis. Complications of hip arthroscopy leading to total hip arthroplasty were the least notable contributors to overall expected values. Discussion: This study confirms that immediate surgery is the preferred option when using decision-making analysis combining patient-reported utilities of health outcomes and the probabilities of those outcomes from the literature. This is consistent across a range of estimates of poor function in both the delayed and immediate surgery arms.
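The fold-back step described above is simply a probability-weighted sum applied recursively from the leaves of the decision tree toward its root. Below is a minimal sketch of that computation; the tree structure, utilities, and probabilities are hypothetical placeholders, not the study's data.

```python
# Minimal sketch of fold-back (expected value) analysis on a decision tree.
# Utilities and probabilities below are hypothetical, not the study's values.

def fold_back(node):
    """Recursively fold back a decision tree.

    A node is either a terminal outcome {"utility": u} or a chance node
    {"branches": [(probability, child_node), ...]}.
    Returns the expected value on a 0-100 utility scale.
    """
    if "utility" in node:
        return node["utility"]
    # Chance node: weight each child's expected value by its probability and sum.
    return sum(p * fold_back(child) for p, child in node["branches"])

# Hypothetical outcome trees for the two strategies.
immediate = {"branches": [
    (0.80, {"utility": 90}),   # good function restored
    (0.15, {"utility": 55}),   # persistent symptoms
    (0.05, {"utility": 35}),   # complication leading to total hip arthroplasty
]}
delayed = {"branches": [
    (0.65, {"utility": 90}),
    (0.25, {"utility": 55}),
    (0.10, {"utility": 35}),
]}

if __name__ == "__main__":
    # The strategy with the higher expected value is the preferred option.
    print(f"EV immediate: {fold_back(immediate):.2f}")
    print(f"EV delayed:   {fold_back(delayed):.2f}")
```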

Rising Global Opportunities Among Orthopaedic Surgery Residency Programs
Objective: We surveyed Orthopaedic Surgery Residency (OSR) programs to determine international opportunities by the academic institutional region within the United States, location of the international experience, duration, residency program year (PGY), funding source, and resident participation to date. Design: We emailed a survey to all OSR programs in the United States to inquire about global opportunities in their residency programs. Further contact was made through an additional e-mail and up to three telephone calls. Data were analyzed using descriptive and chi-square statistics. This study was institutional review board exempt. Setting: This research study was conducted at the University of Nebraska Medical Center, a tertiary care facility in conjunction with the University of Nebraska Medical Center College of Medicine. Participants: The participants of this research study included program directors and coordinators of all OSR programs (185) across the United States. Results: A total of 102 OSR programs completed the survey (55% response rate). Notably, 50% of the responding programs offered a global health opportunity to their residents. Of the institutions that responded, those in the Midwest or South were more likely to offer the opportunity than institutions found in other US regions, although regional differences were not significant. Global experiences were most commonly: in Central or South America (41%); 1 to 2 weeks in duration (54%); and during PGY4 or PGY5 (71%). Furthermore, half of the programs provided full funding for the residents to participate in the global experience. In 33% of the programs, 10 or more residents had participated to date. Conclusions: Interest in global health among medical students is increasing. OSR programs have followed this trend, increasing their global health opportunities by 92% since 2015. Communicating the availability of and support for international opportunities to future residents may help interested students make informed decisions when applying to residency programs.

Maintaining Access to Orthopaedic Surgery During Periods of Operating Room Resource Constraint: Expanded Use of Wide-Awake Surgery During the COVID-19 Pandemic
Introduction: Wide-awake local anesthesia no tourniquet (WALANT) is a nonstandard anesthetic approach, initially described for use in hand surgery, that has gained interest and utilization across a variety of orthopaedic procedures. In response to operating room resource constraints imposed by the COVID-19 pandemic, our orthopaedic service rapidly adopted and expanded its use of WALANT. Methods: A retrospective review of 16 consecutive cases performed by 7 surgeons was conducted. Patient demographics, surgical details, and perioperative outcomes were assessed. The primary end point was WALANT failure, defined as intraoperative conversion to general anesthesia. Results: No instances of WALANT failure requiring conversion to general anesthesia occurred. In recovery, one patient (6%) required narcotics for pain control, and the average postoperative pain numeric rating scale score was 0.6. The maximum pain score experienced was 4, in the patient requiring postoperative narcotics. The average time in recovery was 42 minutes and ranged from 8 to 118 minutes. Conclusion: The WALANT technique was safely and effectively used in 16 cases across multiple orthopaedic subspecialties, including three procedures not previously described in the literature. WALANT techniques hold promise for use in future disaster scenarios and should be evaluated for potential incorporation into routine orthopaedic surgical care.

Utility of the Current Procedural Terminology Codes for Prophylactic Stabilization for Defining Metastatic Femur Disease
Introduction: Cohorts from the electronic health record are often defined by Current Procedural Terminology (CPT) codes. The error prevalence of CPT codes for patients receiving surgical treatment of metastatic disease of the femur has not been investigated, and the predictive value of coding ontologies to identify patients with metastatic disease of the femur has not been adequately discussed. Methods: All surgical cases at a single academic tertiary institution from 2010 through 2015 involving prophylactic stabilization of the femur or fixation of a pathologic fracture of the femur were identified using the CPT and International Classification of Disease (ICD) codes. A detailed chart review was conducted to determine the procedure performed as documented in the surgical note and the patient diagnosis as documented in the pathology report, surgical note, and/or office visit notes. Results: We identified 7 CPT code errors of 171 prophylactic operations (4.1%) and one error of 71 pathologic fracture fixations (1.4%). Of the 164 prophylactic operations that were coded correctly, 87 (53.0%) had metastatic disease. Of the 70 pathologic operations that were coded correctly, 41 (58%) had metastatic disease. Discussion: The error prevalence was low in both the prophylactic stabilization and pathologic fixation groups (4.1% and 1.4%, respectively). The structured data (CPT and ICD-9 codes) had a positive predictive value for metastatic disease of 53% for patients in the prophylactic stabilization group and 58% for patients in the pathologic fixation group. The CPT codes and ICD codes assessed in this analysis do provide a useful tool for defining a population in which a moderate proportion of individuals have metastatic disease in the femur at an academic medical center. However, verification is necessary.

Investigating the Bias in Orthopaedic Patient-reported Outcome Measures by Mode of Administration: A Meta-analysis
Background: Patient-reported outcome measures (PROMs) are critical and frequently used to assess clinical outcomes to support medical decision-making. Questions/Purpose: The purpose of this meta-analysis was to compare differences in the modes of administration of PROMs within the field of orthopaedics to determine their impact on clinical outcome assessment. Patients and Methods: The PubMed database was used to conduct a review of literature from 1990 to 2018 with the Preferred Reporting Items for Systematic Reviews and Meta-Analyses protocol. All articles comparing PROMs for orthopaedic procedures were included and classified by the mode of administration. Each specific survey was standardized to a scale of 0 to 100, and a repeated random effects model meta-analysis was conducted to determine the mean effect of each mode of survey. Results: Eighteen studies were initially included in the study, with 10 ultimately used in the meta-analysis that encompassed 2384 separate patient survey encounters. Six of these studies demonstrated a statistically notable difference in PROM scores by mode of administration. The meta-analysis found that the standardized mean effect size for telephone-based surveys on a 100-point scale was 71.7 (SE 5.0), which was significantly higher (P < 0.0001) than survey scores obtained via online/tech-based (65.3 [SE 0.70]) or self-administered/paper surveys (61.2 [SE 0.70]). Conclusions: Overall, this study demonstrated that a documented difference exists in PROM quality depending on the mode of administration. PROM scores obtained via telephone (71.7) are 8.9% higher than scores obtained online (65.3, P < 0.0001) and 13.8% higher than scores obtained via self-administered paper surveys (61.8, P < 0.0001). Few studies have quantified statistically notable differences between PROM scores based solely on the mode of acquisition in orthopaedics.
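As a rough illustration of the two analysis steps described above, the sketch below rescales a PROM to a common 0 to 100 scale and pools per-mode study means with a random-effects model. The DerSimonian-Laird estimator is used here as one common random-effects approach (the paper does not specify its estimator), and all input numbers are hypothetical, not the meta-analysis data.

```python
# Sketch: (1) rescale a raw PROM score to 0-100; (2) pool study means with a
# DerSimonian-Laird random-effects model. Inputs are made-up illustrative values.
import math

def rescale_0_100(score, scale_min, scale_max):
    """Map a raw PROM score onto a 0-100 scale."""
    return 100.0 * (score - scale_min) / (scale_max - scale_min)

def random_effects_pool(means, variances):
    """DerSimonian-Laird random-effects pooled mean and standard error."""
    w = [1.0 / v for v in variances]                       # fixed-effect weights
    fixed = sum(wi * m for wi, m in zip(w, means)) / sum(w)
    q = sum(wi * (m - fixed) ** 2 for wi, m in zip(w, means))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(means) - 1)) / c)            # between-study variance
    w_re = [1.0 / (v + tau2) for v in variances]           # random-effects weights
    pooled = sum(wi * m for wi, m in zip(w_re, means)) / sum(w_re)
    return pooled, math.sqrt(1.0 / sum(w_re))

if __name__ == "__main__":
    # Hypothetical per-study means (already rescaled to 0-100) and variances
    # for telephone-administered surveys.
    pooled, se = random_effects_pool([70.2, 73.5, 71.0], [4.0, 6.2, 5.1])
    print(f"Pooled telephone mean: {pooled:.1f} (SE {se:.2f})")
    print(f"Example rescaling: {rescale_0_100(42, 0, 48):.1f} on the 0-100 scale")
```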


Pediatric Gastroenterology and Nutrition

Journal of Pediatric Gastroenterology and Nutrition Editor Comment: Progress, Transition, and Thank You!
No abstract available

The World is One Family
No abstract available

Multiparametric Magnetic Resonance Imaging to Monitor Pediatric Liver Disease: Looks Interesting but More Work to Do
No abstract available

The Liver in Sickle Cell Disease
Liver involvement is found in nearly 40% of children with sickle cell disease. The most frequent complication is cholelithiasis. The most severe complication is acute hepatic crisis, with symptoms ranging from increasing jaundice to multiple organ failure and death. The emergency treatment is exchange transfusion, which is usually effective. Chronic cholangiopathy is increasingly recognized, with autoimmune features in most cases, worsened by chronic ischemia. Transfusion-related iron overload is not yet a concern in children, and hepatotoxicity of iron chelators is rare. We propose recommendations to prevent, explore, and treat these complications. We emphasize the close collaboration required between hepatologists and specialists of sickle cell disease.

Living Related Liver Transplantation for Metabolic Liver Diseases in Children
Metabolic liver diseases (MLDs) are a heterogeneous group of inherited conditions for which liver transplantation can provide definitive treatment. The limited availability of deceased donor organs means some who could benefit from transplant do not have this option. Living related liver transplant (LrLT) using relatives as donors has emerged as one solution to this problem. This technique is established worldwide, especially in Asian countries, with shorter waiting times and patient and graft survival rates equivalent to deceased donor liver transplantation. However, living donors are underutilized for MLDs in many western countries, possibly due to the fear of limited efficacy using heterozygous donors. We have reviewed the published literature and shown that the use of heterozygous donors for liver transplantation is safe for the majority of MLDs with excellent metabolic correction. The use of LrLT should be encouraged to complement deceased donor liver transplantation (DDLT) for treatment of MLDs.

Evaluation of the Effectiveness of In-line Immobilized Lipase Cartridge in Enterally Fed Patients With Cystic Fibrosis
Background: Pancreatic insufficiency occurs in most patients with cystic fibrosis (CF), contributing to malnutrition. In the United States, 3600 patients with CF require enteral feeding (EF). Oral pancreatic enzymes are commonly used with EF, despite not being designed or approved for this use. An immobilized lipase cartridge (ILC) for extracorporeal digestion of enteral feedings was developed. The sponsor provided it to patients via a structured program, which we evaluated to assess the effectiveness of the ILC on nutritional status. Methods: The program provided the ILC to patients prescribed the device while reimbursement efforts were ongoing. Baseline anthropometric data were obtained, and subsequent measurements of height, weight, and body mass index (BMI) were collected at 6 and 12 months. Results: Inclusion criteria were met by 100 patients (age 0–45 years). Over 12 months of use in patients >2 years of age (n = 93), there were significant improvements in height and weight z-scores, with a trend toward improvement in BMI. The frequency of achieving the 50th percentile increased steadily for weight and BMI from baseline to 12 months, but not for height. Conclusions: This evaluation of a program to assist patient access to the ILC demonstrates that better growth than with standard of care is possible. The association of ILC use with significant improvements in anthropometric parameters over a 12-month period in people with CF demonstrates the effectiveness of the ILC as rational enzyme therapy during enteral feedings.

Factors Associated With Nonadherence in an Emergency Department-based Multicenter Randomized Clinical Trial of a Probiotic in Children With Acute Gastroenteritis
Nonadherence in clinical trials affects safety and efficacy determinations. Predictors of nonadherence in pediatric acute illness trials are unknown. We sought to examine predictors of nonadherence in a multicenter randomized trial of 971 children with acute gastroenteritis receiving a 5-day oral course of Lactobacillus rhamnosus GG or placebo. Adherence, defined as consuming all doses of the product, was reported by the parents and recorded during daily follow-up contacts. Of 943 patients with follow-up data, 766 (81.2%) were adherent. On multivariate analysis, older age (OR 1.19; 95% CI: 1.00–1.43), increased vomiting duration (OR 1.23; 95% CI: 1.05–1.45), higher dehydration score (OR 1.23, 95% CI: 1.07–1.42), and hospitalization following ED discharge (OR 4.16, 95% CI: 1.21–14.30) were factors associated with nonadherence; however, those with highest severity scores were more likely to adhere (OR 0.87, 95% CI: 0.80–0.95). These data may inform strategies and specific targets to maximize adherence in future pediatric trials.

Fructose Malabsorption in Chilean Children Undergoing Fructose Breath Test at a Tertiary Hospital
Fructose is a highly abundant carbohydrate in the Western diet and may induce bowel symptoms in children as well as in adults. The main objective of this study is to describe the frequency of fructose malabsorption (FM) in symptomatic patients 18 years or younger undergoing fructose breath testing in a single tertiary center between 2013 and 2018, and to evaluate whether certain symptoms are related to positivity of the test. Out of 273 tests, 183 (67%) were compatible with FM. The most frequent pretest symptom in the overall study population was bloating (83%), followed by abdominal pain (73%). Patients with a positive test were younger than those with a negative test (median 5 vs 8 years, P < 0.001). In multivariate analysis, which included age, sex, and symptoms (diarrhea, abdominal pain, bloating, nausea), only age <6 years (odds ratio 2.93, 95% confidence interval 1.64–5.23) and absence of nausea (odds ratio 3.32, 95% confidence interval 1.56–7.05) were associated with FM.

Sucrase-isomaltase Gene Variants in Patients With Abnormal Sucrase Activity and Functional Gastrointestinal Disorders
Objectives: The aim of the study was to determine the prevalence and characterize sucrase-isomaltase (SI) gene variants of congenital sucrase-isomaltase deficiency in non-Hispanic white pediatric and young adult patients with functional gastrointestinal disorders (FGIDs) and abnormal sucrase activity on histologically normal duodenal biopsy. Methods: Clinical symptoms and disaccharidase activity data were collected for an abnormal (low) sucrase activity group (≤25.8 U, n = 125) and 2 normal sucrase activity groups with moderate (≥25.8–≤55 U, n = 250) and high (>55 U, n = 250) sucrase activities. SI gene variants were detected by next-generation sequencing of DNA from formalin-fixed paraffin-embedded tissues of these patients. FGID symptoms based on Rome IV criteria and the subsequent clinical management of abnormal sucrase activity cases with pathogenic SI gene variants were analyzed. Results: Thirteen SI gene variants were identified, and the proportion of cases carrying a variant was significantly higher in abnormal sucrase cases with FGID symptoms (36/125, 29%; 71% did not have a pathogenic variant) compared to the moderate normal (16/250, 6.4%, P < 0.001) or high normal (5/250, 2.0%, P < 0.001) sucrase groups. Clinical management data were available in 26 of the abnormal sucrase cases, and only 10 (38%) were correctly diagnosed and managed by the clinicians. Concomitant lactase deficiency (24%; 23/97) and pan-disaccharidase deficiency (25%; 13/51) were found in the abnormal sucrase group. Conclusions: Heterozygous and compound heterozygous mutations in the SI gene were more prevalent in cases with abnormal sucrase activity presenting with FGIDs and normal histopathology. This suggests heterozygous pathogenic variants of congenital sucrase-isomaltase deficiency may present as FGIDs. Concomitant lactase or pan-disaccharidase deficiencies were common in abnormal sucrase cases with SI gene variants.

Variants in the Enteric Smooth Muscle Actin γ-2 Cause Pediatric Intestinal Pseudo-obstruction in Chinese Patients
Objectives: Pediatric intestinal pseudo-obstruction (PIPO) is a severe gastrointestinal disorder occurring in children, leading to failure to thrive, malnutrition, and long-term parenteral nutrition dependence. Enteric smooth muscle actin γ-2 (ACTG2) variants have been reported to be related to the pathogenesis of PIPO. This study aimed to determine the presence of ACTG2 variants in Chinese PIPO patients. Methods: Whole-exome sequencing was performed using samples from 39 recruited patients, whereas whole ACTG2 Sanger sequencing was performed using samples from 2 patients. Published data were reviewed to determine the number of pathogenic variants and the genotypes related to ACTG2 variants in the Chinese population. Results: A total of 21 Chinese probands were found to carry heterozygous missense variants of ACTG2, among which 20 were de novo. Fifteen probands had p.Arg257 variants (c.770G>A and c.769C>T), and 2 probands had c.533G>A (p.Arg178His) and c.443G>T (p.Arg148Leu) variants. Four probands had the novel variants c.337C>T (p.Pro113Ser), c.588G>C (p.Glu196Asp), c.734A>G (p.Asp245Gly), and c.553G>T (p.Asp185Tyr). Conclusions: Variants affecting codon 257 of the ACTG2 protein sequence appeared to be frequent in both Chinese and Caucasian PIPO patients, whereas p.Arg178 variants were less common in Chinese patients compared with Caucasian patients. The 4 novel variants in ACTG2 were also found to be related to Chinese PIPO.


Ear and Hearing

The Effect of Hearing Loss and Hearing Device Fitting on Fatigue in Adults: A Systematic Review
Objectives: To conduct a systematic review to address two research questions: (Q1) Does hearing loss have an effect on fatigue? (Q2) Does hearing device fitting have an effect on fatigue? It was hypothesized that hearing loss would increase fatigue (H1) and hearing device fitting would reduce fatigue (H2). Design: Systematic searches were undertaken of five bibliographic databases: Embase, MEDLINE, Web of Science, PsycINFO, and the Cochrane Library. English-language peer-reviewed research articles were included from inception until present. Inclusion and exclusion criteria were formulated using the Population, Intervention, Comparison, Outcomes and Study design strategy. Results: Initial searches for both research questions produced 1,227 unique articles after removal of duplicates. After screening, the full text of 61 studies was checked, resulting in 12 articles with content relevant to the research questions. The reference lists of these studies were examined, and a final updated search was conducted on October 16, 2019. This resulted in a final total of 20 studies being selected for the review. For each study, the information relating to the Population, Intervention, Comparison, Outcomes and Study design criteria and the statistical outcomes relating to both questions (Q1 and Q2) were extracted. Evidence relating to Q1 was provided by 15 studies, reporting 24 findings. Evidence relating to Q2 was provided by six studies, reporting eight findings. One study provided evidence for both. Using the Grading of Recommendations Assessment, Development and Evaluation guidelines, the quality of evidence on both research questions was deemed to be "very low." It was impossible to perform a meta-analysis of the results due to a lack of homogeneity. Conclusions: As the studies were too heterogeneous to support a meta-analysis, it was not possible to provide statistically significant evidence to support the hypotheses that hearing loss results in increased fatigue (H1) or that hearing device fitting results in decreased fatigue (H2). Despite this, the comparative volume of positive results and the lack of any negative findings are promising for future research (particularly in respect of Q1). There was a very small number of studies deemed eligible for the review, and there was large variability between studies in terms of population and quantification of hearing loss and fatigue. The review highlights the need for consistency when measuring fatigue, particularly when using self-report questionnaires, where the majority of the current evidence was generated.

Dorsomedial Prefrontal Cortex Repetitive Transcranial Magnetic Stimulation for Tinnitus: Promising Results of a Blinded, Randomized, Sham-Controlled Study
Objectives: Tinnitus is the perception of sound in the ears or head without a corresponding external stimulus. Despite the large amount of literature concerning tinnitus treatment, there are still no evidence-based established treatments for curing tinnitus or for effectively reducing its intensity. Sham-controlled studies have revealed beneficial effects using repetitive transcranial magnetic stimulation (rTMS). Still, results show moderate, temporary improvement and high individual variability. The subcallosal area (ventral and dorsomedial prefrontal and anterior cingulate cortices) has been implicated in tinnitus pathophysiology. Our objective was to evaluate the use of bilateral, high-frequency, dorsomedial prefrontal cortex (DMPFC) rTMS in the treatment of chronic subjective tinnitus. Design: Randomized placebo-controlled, single-blinded clinical trial. Twenty sessions of bilateral, 10 Hz rTMS at 120% of the resting motor threshold of the extensor hallucis longus were applied over the DMPFC. Fourteen patients underwent sham rTMS and 15 were submitted to active stimulation. Tinnitus Handicap Inventory (THI), visual analog scale, and tinnitus loudness matching were obtained at baseline and on follow-up visits. The impact of the intervention on outcome measures was evaluated using a mixed-effects restricted maximum likelihood regression model for longitudinal data. Results: A difference of 11.53 points in the THI score was found, favoring the intervention group (p = 0.05). The difference for tinnitus loudness matching was 4.46 dB, also favoring the intervention group (p = 0.09). Conclusions: Tinnitus treatment with high-frequency, bilateral, DMPFC rTMS was effective in reducing tinnitus severity measured by the THI and matched tinnitus loudness when compared to sham stimulation.

The Influence of Forced Social Isolation on the Auditory Ecology and Psychosocial Functions of Listeners With Cochlear Implants During COVID-19 Mitigation Efforts
Objectives: To examine the impact of social distancing on communication and psychosocial variables among individuals with hearing impairment during the COVID-19 pandemic. It was our concern that patients who already found themselves socially isolated (Wie et al. 2010) as a result of their hearing loss would perhaps be more susceptible to changes in their communication habits, resulting in further social isolation, anxiety, and depression. We wanted to better understand how forced social isolation (as part of COVID-19 mitigation) affected a group of individuals with hearing impairment from an auditory ecology and psychosocial perspective. We hypothesized that the listening environments would be different as a result of social isolation when comparing subjects' responses regarding activities and participation before COVID-19 and during the COVID-19 pandemic. This change would lead to an increase in experienced and perceived social isolation, anxiety, and depression. Design: A total of 48 adults with at least 12 months of cochlear implant (CI) experience reported their listening contexts and experiences pre-COVID and during-COVID using Ecological Momentary Assessment (EMA; a methodology collecting a respondent's self-reports in their natural environments) through a smartphone-based app and six paper-and-pencil questionnaires. The smartphone app and paper-and-pencil questionnaires addressed topics related to their listening environment, social isolation, depression, anxiety, lifestyle and demand, loneliness, and satisfaction with amplification. Data from these two time points were compared to better understand the effects of social distancing on the CI recipients' communication abilities. Results: EMA demonstrated that during-COVID, CI recipients were more likely to stay home or be outdoors. CI recipients reported that they were less likely to stay indoors outside of their home relative to the pre-COVID condition. Social distancing also had a significant effect on the overall signal-to-noise ratio of the environments, indicating that the listening environments had better signal-to-noise ratios. CI recipients also reported better speech understanding, less listening effort, less activity limitation due to hearing loss, less social isolation due to hearing loss, and less anxiety due to hearing loss. Retrospective questionnaires indicated that social distancing had a significant effect on social network size, participants' personal image of themselves, and overall loneliness. Conclusions: Overall, EMA provided us with a glimpse of the effect that forced social isolation has had on the listening environments and psychosocial perspectives of a select number of CI listeners. CI participants in this study reported that they were spending more time at home in quieter environments during-COVID. Contrary to our hypothesis, CI recipients overall felt less socially isolated and reported less anxiety resulting from their hearing difficulties during-COVID in comparison to pre-COVID. This perhaps implies that having a more controlled environment with fewer speakers provided a more relaxing listening experience.

Peripheral Auditory Involvement in Childhood Listening Difficulty
Objectives: This study tested the hypothesis that undetected peripheral hearing impairment occurs in children with idiopathic listening difficulties (LiD), as reported by caregivers using the Evaluation of Children's Listening and Processing Skills (ECLiPS) validated questionnaire, compared with children with typically developed (TD) listening abilities. Design: Children with LiD aged 6–14 years old (n = 60, mean age = 9.9 yr) and 54 typical age-matched children were recruited from audiology clinical records and from IRB-approved advertisements at hospital locations and in the local and regional areas. Both groups completed standard and extended high-frequency (EHF) pure-tone audiometry, wideband absorbance tympanometry and middle ear muscle reflexes, and distortion product and chirp transient evoked otoacoustic emissions. Univariate and multivariate mixed models and multiple regression analysis were used to examine group differences and continuous performance, as well as the influence of demographic factors and pressure equalization (PE) tube history. Results: There were no significant group differences between the LiD and TD groups for any of the auditory measures tested. However, analyses across all children showed that EHF hearing thresholds, wideband tympanometry, contralateral middle ear muscle reflexes, distortion product, and transient-evoked otoacoustic emissions were related to a history of PE tube surgery. The physiologic measures were also associated with EHF hearing loss, secondary to PE tube history. Conclusions: Overall, the results of this study in a sample of children with validated LiD compared with a TD group matched for age and sex showed no significant differences in peripheral function using highly sensitive auditory measures. Histories of PE tube surgery were significantly related to EHF hearing and to a range of physiologic measures in the combined sample.

Better Hearing in Norway: A Comparison of Two HUNT Cohorts 20 Years Apart
Objective: To obtain updated robust data on the age-specific prevalence of hearing loss in Norway and determine whether more recent birth cohorts have better hearing compared with earlier birth cohorts. Design: Cross-sectional analyses of Norwegian representative demographic and audiometric data from the Nord-Trøndelag Health Study (HUNT)—HUNT2 Hearing (1996–1998) and HUNT4 Hearing (2017–2019), with the following distribution: HUNT2 Hearing (N = 50,277, 53% women, aged 20 to 101 years, mean = 50.1, standard deviation = 16.9); HUNT4 Hearing (N = 28,339, 56% women, aged 19 to 100 years, mean = 53.2, standard deviation = 16.9). Pure-tone hearing thresholds were estimated using linear and quantile regressions with age and cohort as explanatory variables. Prevalences were estimated using logistic regression models for different severities of hearing loss averaged over 0.5, 1, 2, and 4 kHz in the better ear (BE PTA4). We also estimated prevalences at the population level of Norway in 1997 and 2018. Results: Disabling hearing loss (BE PTA4 ≥ 35 dB) was less prevalent in the more recently born cohort at all ages in both men and women (p < 0.0001), with the largest absolute decrease at age 75 in men and at age 85 in women. The age- and sex-adjusted prevalence of disabling hearing loss was 7.7% (95% confidence interval [CI] 7.5 to 7.9) and 5.3% (95% CI 5.0 to 5.5) in HUNT2 and HUNT4, respectively. Hearing thresholds were better in the more recently born cohorts at all frequencies for both men and women (p < 0.0001), with the largest improvement at high frequencies in more recently born 60- to 70-year-old men (10 to 11 dB at 3 to 4 kHz), and at low frequencies among the oldest. Conclusions: The age- and sex-specific prevalence of hearing impairment decreased in Norway from 1996–1998 to 2017–2019.
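For readers unfamiliar with the BE PTA4 metric used above, the sketch below shows the computation as described: the average threshold at 0.5, 1, 2, and 4 kHz taken in the better-hearing ear, with disabling hearing loss defined as BE PTA4 of 35 dB HL or more. The audiogram values are made up for illustration.

```python
# Sketch of the better-ear pure-tone average (BE PTA4) and the disabling-hearing-loss
# cutoff used in the abstract above. Thresholds below are hypothetical.

PTA_FREQS_KHZ = (0.5, 1, 2, 4)

def pta4(thresholds_db):
    """Average the thresholds (dB HL) at 0.5, 1, 2, and 4 kHz for one ear."""
    return sum(thresholds_db[f] for f in PTA_FREQS_KHZ) / len(PTA_FREQS_KHZ)

def be_pta4(left, right):
    """Better-ear PTA4: the lower (better) of the two ears' averages."""
    return min(pta4(left), pta4(right))

def disabling_hearing_loss(left, right, cutoff_db=35.0):
    """Disabling hearing loss as defined above: BE PTA4 >= 35 dB HL."""
    return be_pta4(left, right) >= cutoff_db

if __name__ == "__main__":
    left = {0.5: 30, 1: 35, 2: 45, 4: 60}    # hypothetical audiogram, dB HL
    right = {0.5: 25, 1: 30, 2: 40, 4: 55}
    print(f"BE PTA4 = {be_pta4(left, right):.1f} dB HL, "
          f"disabling: {disabling_hearing_loss(left, right)}")
```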

Search for Electrophysiological Indices of Hidden Hearing Loss in Humans: Click Auditory Brainstem Response Across Sound Levels and in Background Noise
Objectives: Recent studies in animals indicate that even moderate levels of exposure to noise can damage synaptic ribbons between the inner hair cells and auditory nerve fibers without affecting audiometric thresholds, giving rise to the use of the term "hidden hearing loss" (HHL). Despite evidence across several animal species, there is little consistent evidence for HHL in humans. The aim of the study was to evaluate potential electrophysiological changes specific to individuals at risk for HHL. Design: Participants forming the high-risk experimental group consisted of 28 young normal-hearing adults who participated in marching band for at least 5 years. Twenty-eight age-matched normal-hearing adults who were not part of the marching band and had little or no history of recreational or occupational exposure to loud sounds formed the low-risk control group. Measurements included pure-tone audiometry at conventional and high frequencies, distortion product otoacoustic emissions, and electrophysiological measures of auditory nerve and brainstem function as reflected in the click-evoked auditory brainstem response (ABR). In experiment 1, ABRs were recorded in a quiet background across stimulus levels (30–90 dB nHL) presented in 10 dB steps. In experiment 2, the ABR was elicited by a 70 dB nHL click stimulus presented in a quiet background and in the presence of simultaneous ipsilateral continuous broadband noise presented at 50, 60, and 70 dB SPL using an insert earphone (Etymotic, ER2). Results: There were no differences between the low- and high-risk groups in audiometric thresholds or distortion product otoacoustic emission amplitude. Experiment 1 demonstrated smaller wave I amplitudes at moderate and high sound levels for the high-risk compared to the low-risk group, with similar wave III and wave V amplitudes. The amplitude ratio V/I was enhanced, particularly at a moderate sound level (60 dB nHL), suggesting central compensation for reduced input from the periphery in the high-risk group. The results of experiment 2 show that the decrease in wave I amplitude with increasing background noise level was relatively smaller for the high-risk compared to the low-risk group. However, wave V amplitude reduction was essentially similar for both groups. These results suggest that masking-induced wave I amplitude reduction is smaller in individuals at high risk for cochlear synaptopathy. Unlike previous studies, we did not observe a difference in the noise-induced wave V latency shift between low- and high-risk groups. Conclusions: Results of experiment 1 are consistent with findings in both animal studies (which suggest cochlear synaptopathy involving selective damage of low-spontaneous rate and medium-spontaneous rate fibers) and in several human studies that show changes in a range of ABR metrics suggesting the presence of cochlear synaptopathy. However, without postmortem examination of the human temporal bone (the gold standard for identifying synaptopathy) across different noise exposure backgrounds, no direct inferences can be made regarding the presence or extent of cochlear synaptopathy in a high-risk group with a history of high sound over-exposure. Results of experiment 2 demonstrate that, to the extent that response amplitude reflects both the number of neural elements responding and the neural synchrony of the responding elements, the relatively smaller change in response amplitude for the high-risk group would suggest a reduced susceptibility to masking. One plausible mechanism would be that suppressive effects that kick in at moderate to high levels differ between these two groups, particularly at moderate levels of the masking noise. Altogether, a larger-scale dataset with different noise exposure backgrounds, longitudinal measurements (e.g., changes due to recreational over-exposure, studied by following middle-school to high-school students enrolled in marching band), and an array of behavioral and electrophysiological tests are needed to understand the complex pathogenesis of sound over-exposure damage in normal-hearing individuals.

Selection Criteria for Cochlear Implantation in the United Kingdom and Flanders: Toward a Less Restrictive Standard
Objectives: The impact of the newly introduced cochlear implantation criteria of the United Kingdom and Flanders (the Dutch-speaking part of Belgium) was examined in the patient population of a tertiary referral center in the Netherlands. We compared the patients who would be included or excluded under the new versus the old criteria in relation to the actual improvement in speech understanding after implantation in our center. We also performed a sensitivity analysis to examine the effectiveness of the different preoperative assessment approaches used in the United Kingdom and Flanders. Design: The selection criteria were based on preoperative pure-tone audiometry at 0.5, 1, 2, and 4 kHz and a speech perception test (SPT) with and without best-aided hearing aids. Postoperatively, the same SPT was conducted to assess the benefit in speech understanding. Results: The newly introduced criteria in Flanders and the United Kingdom were less restrictive, resulting in a greater percentage of patients implanted with a CI (an increase of 30%) and a sensitivity increase of 31%. The preoperative best-aided SPT, used by both countries, had the highest diagnostic ability to indicate a postoperative improvement in speech understanding. We observed that patient selection was previously dominated by the pure-tone audiometry criteria in both countries, whereas speech understanding became more important in their new criteria. Among patients excluded by the new criteria (the United Kingdom and Flanders), seven of eight did exhibit improved postoperative speech understanding. Conclusions: The new selection criteria of the United Kingdom and Flanders led to increased numbers of postlingually deafened adults benefitting from CI. The new British and Flemish criteria depend on the best-aided SPT, which had the highest diagnostic ability. Notably, the new criteria still led to the rejection of candidates who would be expected to gain considerably in speech understanding after implantation.

Vestibular Function in Children With a Congenital Cytomegalovirus Infection: 3 Years of Follow-Up
Objectives: Congenital cytomegalovirus (cCMV) infection is the most common nongenetic cause of sensorineural hearing loss in children. Due to the close anatomical relationship between the auditory and the vestibular sensory organs, cCMV can also be an important cause of vestibular loss. However, the prevalence and nature of cCMV-induced vestibular impairment is still underexplored. The aim of this study was to investigate the occurrence and characteristics of vestibular loss in a large group of cCMV-infected children, representative of the overall cCMV population. Design: Ninety-three children (41 boys, 52 girls) with a confirmed diagnosis of cCMV were enrolled in this prospective longitudinal study. They were born at the Ghent University Hospital or referred from another hospital for multidisciplinary follow-up in the context of cCMV. The test protocol consisted of regular vestibular follow-up around the ages of 6 months, 1 year, 2 years, and 3 years with the video Head Impulse Test, the rotatory test, and the cervical Vestibular Evoked Myogenic Potential test. Results: On average, the 93 patients (52 asymptomatic, 41 symptomatic) were followed for 10.2 months (SD: 10.1 mo) and had 2.2 examinations (SD: 1.1). Seventeen (18%) patients had sensorineural hearing loss (7 unilateral, 10 bilateral). Vestibular loss was detected in 13 (14%) patients (7 unilateral, 6 bilateral). There was a significant association between the occurrence of hearing loss and the presence of vestibular loss (p < 0.001), with 59% (10/17) vestibular losses in the group of hearing-impaired children compared to 4% (3/76) in the group of normal-hearing subjects. In the majority of the cases with a vestibular dysfunction (85%, 11/13), both the semicircular canal system and the otolith system were affected. The remaining subjects (15%, 2/13) had an isolated semicircular canal dysfunction. Sixty-one patients already had at least one follow-up examination. Deterioration of the vestibular function was detected in 6 of them (10%, 6/61). Conclusions: cCMV can impair not only the auditory but also the vestibular function. Similar to the hearing loss, vestibular loss in cCMV can be highly variable. It can be unilateral or bilateral, limited or extensive, stable or progressive, and early or delayed in onset. As the vestibular function can deteriorate over time and even normal-hearing subjects can be affected, vestibular evaluation should be part of the standard otolaryngology follow-up in all children with cCMV.

Human Frequency Following Responses to Filtered Speech
Objectives: There is increasing interest in using the frequency following response (FFR) to describe the effects of varying different aspects of hearing aid signal processing on brainstem neural representation of speech. To this end, recent studies have examined the effects of filtering on brainstem neural representation of the speech fundamental frequency (f0) in listeners with normal hearing sensitivity by measuring FFRs to low- and high-pass filtered signals. However, the stimuli used in these studies do not reflect the entire range of typical cutoff frequencies used in frequency-specific gain adjustments during hearing aid fitting. Further, there has been limited discussion on the effect of filtering on brainstem neural representation of formant-related harmonics. Here, the effects of filtering on brainstem neural representation of speech fundamental frequency (f0) and harmonics related to first formant frequency (F1) were assessed by recording envelope and spectral FFRs to a vowel low-, high-, and band-pass filtered at cutoff frequencies ranging from 0.125 to 8 kHz. Design: FFRs were measured to a synthetically generated vowel stimulus /u/ presented in full bandwidth and in low-pass (experiment 1), high-pass (experiment 2), and band-pass (experiment 3) filtered conditions. In experiment 1, FFRs were measured to a synthetically generated vowel stimulus /u/ presented in a full bandwidth condition as well as 11 low-pass filtered conditions (low-pass cutoff frequencies: 0.125, 0.25, 0.5, 0.75, 1, 1.5, 2, 3, 4, 6, and 8 kHz) in 19 adult listeners with normal hearing sensitivity. In experiment 2, FFRs were measured to the same synthetically generated vowel stimulus /u/ presented in a full bandwidth condition as well as 10 high-pass filtered conditions (high-pass cutoff frequencies: 0.125, 0.25, 0.5, 0.75, 1, 1.5, 2, 3, 4, and 6 kHz) in 7 adult listeners with normal hearing sensitivity. In experiment 3, in addition to the full bandwidth condition, FFRs were measured to vowel /u/ low-pass filtered at 2 kHz, band-pass filtered between 2–4 kHz and 4–6 kHz in 10 adult listeners with normal hearing sensitivity. A Fast Fourier Transform analysis was conducted to measure the strength of f0 and the F1-related harmonic relative to the noise floor in the brainstem neural responses obtained to the full bandwidth and filtered stimulus conditions. Results: Brainstem neural representation of f0 was reduced when the low-pass filter cutoff frequency was between 0.25 and 0.5 kHz; no differences in f0 strength were noted between conditions when the low-pass filter cutoff condition was at or greater than 0.75 kHz. While envelope FFR f0 strength was reduced when the stimulus was high-pass filtered at 6 kHz, there was no effect of high-pass filtering on brainstem neural representation of f0 when the high-pass filter cutoff frequency ranged from 0.125 to 4 kHz. There was a weakly significant global effect of band-pass filtering on brainstem neural phase-locking to f0. A trends analysis indicated that mean f0 magnitude in the brainstem neural response was greater when the stimulus was band-pass filtered between 2 and 4 kHz as compared to when the stimulus was band-pass filtered between 4 and 6 kHz, low-pass filtered at 2 kHz, or presented in the full bandwidth condition. Last, neural phase-locking to f0 was reduced or absent in envelope FFRs measured to filtered stimuli that lacked spectral energy above 0.125 kHz or below 6 kHz.
Similarly, little to no energy was seen at F1 in spectral FFRs obtained to low-, high-, or band-pass filtered stimuli that did not contain energy in the F1 region. For stimulus conditions that contained energy at F1, the strength of the peak at F1 in the spectral FFR varied little with low-, high-, or band-pass filtering. Conclusions: Energy at f0 in envelope FFRs may arise due to neural phase-locking to low-, mid-, or high-frequency stimulus components, provided the stimulus envelope is modulated by at least two interacting harmonics. Stronger neural responses at f0 are measured when filtering results in stimulus bandwidths that preserve stimulus energy at F1 and F2. In addition, results suggest that unresolved harmonics may favorably influence f0 strength in the neural response. Lastly, brainstem neural representation of the F1-related harmonic measured in spectral FFRs obtained to filtered stimuli is related to the presence or absence of stimulus energy at F1. These findings add to the existing literature exploring the viability of the FFR as an objective technique to evaluate hearing aid fitting where stimulus bandwidth is altered by design due to frequency-specific gain applied by amplification algorithms.
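The Fast Fourier Transform step described above (measuring the strength of a response component such as f0 relative to the surrounding noise floor) can be sketched as follows. The synthetic waveform, sampling rate, windowing, and noise-floor bin choices are illustrative assumptions, not the study's analysis pipeline.

```python
# Sketch: spectral magnitude at a target frequency (e.g., f0) relative to the
# mean magnitude of neighboring "noise floor" bins. Parameters are illustrative.
import numpy as np

def component_strength_db(signal, fs, target_hz, noise_halfwidth_hz=20, exclude_hz=5):
    """Magnitude at target_hz, in dB re: the mean magnitude of nearby bins."""
    n = len(signal)
    spectrum = np.abs(np.fft.rfft(signal * np.hanning(n)))
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    target_bin = np.argmin(np.abs(freqs - target_hz))
    # Noise floor: bins within +/- noise_halfwidth_hz of the target,
    # excluding the bins within +/- exclude_hz of the target itself.
    near = (np.abs(freqs - target_hz) <= noise_halfwidth_hz) & \
           (np.abs(freqs - target_hz) > exclude_hz)
    noise_floor = spectrum[near].mean()
    return 20.0 * np.log10(spectrum[target_bin] / noise_floor)

if __name__ == "__main__":
    fs, dur, f0 = 16000, 0.25, 100.0
    t = np.arange(int(fs * dur)) / fs
    # Harmonic complex (f0 plus a few harmonics) standing in for an averaged FFR,
    # buried in broadband noise.
    ffr_like = sum(np.sin(2 * np.pi * k * f0 * t) / k for k in range(1, 6))
    ffr_like += 0.5 * np.random.default_rng(0).standard_normal(t.size)
    print(f"f0 strength: {component_strength_db(ffr_like, fs, f0):.1f} dB above noise floor")
```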

Effects of Signal Type and Noise Background on Auditory Evoked Potential N1, P2, and P3 Measurements in Blast-Exposed Veterans
Objectives: Veterans who have been exposed to high-intensity blast waves frequently report persistent auditory difficulties such as problems with speech-in-noise (SIN) understanding, even when hearing sensitivity remains normal. However, these subjective reports have proven challenging to corroborate objectively. Here, we sought to determine whether use of complex stimuli and challenging signal contrasts in auditory evoked potential (AEP) paradigms, rather than traditional use of simple stimuli and easy signal contrasts, improved the ability of these measures to (1) distinguish between blast-exposed Veterans with auditory complaints and neurologically normal control participants, and (2) predict behavioral measures of SIN perception. Design: A total of 33 adults (aged 19–56 years) took part in this study, including 17 Veterans exposed to high-intensity blast waves within the past 10 years and 16 neurologically normal control participants matched for age and hearing status with the Veteran participants. All participants completed the following test measures: (1) a questionnaire probing perceived hearing abilities; (2) behavioral measures of SIN understanding including the BKB-SIN, the AzBio presented at 0 and +5 dB signal-to-noise ratios (SNRs), and a word-level consonant-vowel-consonant test presented at +5 dB SNR; and (3) electrophysiological tasks involving oddball paradigms in response to simple tones (500 Hz standard, 1000 Hz deviant) and complex speech syllables (/ba/ standard, /da/ deviant) presented in quiet and in four-talker speech babble at an SNR of +5 dB. Results: Blast-exposed Veterans reported significantly greater auditory difficulties compared to control participants. Behavioral performance on tests of SIN perception was generally, but not significantly, poorer in the blast-exposed group than in the control group. Latencies of P3 responses to tone signals were significantly longer among blast-exposed participants compared to control participants regardless of background condition, though responses to speech signals were similar across groups. For cortical AEPs, no significant interactions were found between group membership and either stimulus type or background. P3 amplitudes measured in response to signals in background babble accounted for 30.9% of the variance in subjective auditory reports. Behavioral SIN performance was best predicted by a combination of N1 and P2 responses to signals in quiet, which accounted for 69.6% and 57.4% of the variance on the AzBio at 0 dB SNR and the BKB-SIN, respectively. Conclusions: Although blast-exposed participants reported far more auditory difficulties compared to controls, use of complex stimuli and challenging signal contrasts in cortical and cognitive AEP measures failed to reveal larger group differences than responses to simple stimuli and easy signal contrasts. Despite this, only P3 responses to signals presented in background babble were predictive of subjective auditory complaints. In contrast, cortical N1 and P2 responses were predictive of behavioral SIN performance but not subjective auditory complaints, and use of challenging background babble generally did not improve performance predictions. These results suggest that challenging stimulus protocols are more likely to tap into perceived auditory deficits but may not be beneficial for predicting performance on clinical measures of SIN understanding. Finally, these results should be interpreted with caution because blast-exposed participants did not perform significantly worse on tests of SIN perception.
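
As a rough illustration of what "accounted for X% of the variance" means in this context, the sketch below fits an ordinary least-squares model predicting a behavioral SIN score from two AEP amplitudes and reports R². All variable names and values are hypothetical placeholders, not data from this study.

import numpy as np

# Hypothetical predictors and outcome; names and values are invented.
rng = np.random.default_rng(0)
n1 = rng.normal(-4.0, 1.0, 33)                    # N1 amplitude (µV), invented
p2 = rng.normal(3.0, 1.0, 33)                     # P2 amplitude (µV), invented
sin_score = 0.5 * n1 - 0.8 * p2 + rng.normal(0, 0.5, 33)

X = np.column_stack([np.ones_like(n1), n1, p2])   # design matrix with intercept
beta, *_ = np.linalg.lstsq(X, sin_score, rcond=None)
pred = X @ beta
ss_res = np.sum((sin_score - pred) ** 2)
ss_tot = np.sum((sin_score - sin_score.mean()) ** 2)
print(1 - ss_res / ss_tot)                        # R^2: variance accounted for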



Retina

THE RAP STUDY, REPORT TWO: The Regional Distribution of Macular Neovascularization Type 3, a Novel Insight Into Its Etiology
Purpose: To explore the regional distribution of macular neovascularization type 3 (MNV3). Methods: Seventy-eight eyes of 78 patients were reviewed. We defined the location of each lesion using a modified ETDRS grid and recorded the incidence of simultaneous MNV1 or MNV2. We also investigated the distribution of MNV3 along the outline of the foveal avascular zone and in eyes in which the foveal avascular zone diameter was less than 325 µm. Results: The distribution of MNV3 was 4 lesions (5%) from the center to 500 µm, 72 (92%) from 500 µm to 1,500 µm, and 2 (3%) from 1,500 µm to 3,000 µm. With respect to the ETDRS fields, the distribution was 7 lesions (9%) nasal, 16 (20%) superior, 32 (40%) temporal, and 23 (31%) inferior. No additional MNV1 or MNV2 lesions were found elsewhere. Most lesions tended to distribute along straight bands radiating from the perifoveal area, mainly in the temporal half (72%). None of the cases had MNV3 at the boundary of the foveal avascular zone. Only five cases had a foveal avascular zone diameter of less than 325 µm; in these, the closest lesion was 425 µm from the center. Conclusion: MNV3 lesions are most likely neither symmetrical nor uniformly distributed. They show a predilection for a radial distribution in the temporal perifoveal area.
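
As an illustration of the kind of grid-based localization described above, the sketch below assigns a lesion's offset from the foveal center to a distance ring and a quadrant. The coordinate convention, quadrant boundaries, and nasal/temporal orientation are assumptions made for the example; in practice they depend on eye laterality and on the actual grid the authors applied.

import math

def grid_location(dx_um, dy_um):
    # Ring by radial distance from the foveal center, quadrant by angle.
    r = math.hypot(dx_um, dy_um)
    if r <= 500:
        ring = "center to 500 µm"
    elif r <= 1500:
        ring = "500 to 1,500 µm"
    elif r <= 3000:
        ring = "1,500 to 3,000 µm"
    else:
        ring = "outside grid"
    angle = math.degrees(math.atan2(dy_um, dx_um)) % 360
    if 45 <= angle < 135:
        quadrant = "superior"
    elif 135 <= angle < 225:
        quadrant = "temporal"   # assumed orientation; depends on eye laterality
    elif 225 <= angle < 315:
        quadrant = "inferior"
    else:
        quadrant = "nasal"
    return ring, quadrant

print(grid_location(-900, 300))   # -> ('500 to 1,500 µm', 'temporal')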

OPTICAL COHERENCE TOMOGRAPHY ANGIOGRAPHY CAN CATEGORIZE DIFFERENT SUBGROUPS OF CHOROIDAL NEOVASCULARIZATION SECONDARY TO AGE-RELATED MACULAR DEGENERATION
Purpose: Choroidal neovascularization (CNV) is a common complication in patients affected by age-related macular degeneration, showing a highly variable visual outcome. The main aim of the study was to perform a quantitative optical coherence tomography angiography (OCTA) assessment of CNV secondary to age-related macular degeneration at baseline and to assess posttreatment outcomes. Methods: Seventy-eight naïve age-related macular degeneration-related CNV patients (39 men; mean age 78 ± 8 years) were recruited and underwent complete ophthalmologic evaluation and multimodal imaging. Several OCT and OCTA parameters were collected, including vessel tortuosity and vessel dispersion (VDisp), measured for each segmented CNV. All patients underwent pro re nata (PRN) anti–vascular endothelial growth factor treatment. Vessel tortuosity and VDisp values of CNVs were tested at baseline to establish a cutoff able to distinguish clinically different patient subgroups. Results: Mean best-corrected visual acuity was 0.49 ± 0.57 (20/62) at baseline, improving to 0.31 ± 0.29 (20/41) at the 1-year follow-up (P < 0.01), with a mean of 6.4 ± 1.9 injections. Our cohort included the following CNV types: occult (45 eyes; 58%), classic (14 eyes; 18%), and mixed (19 eyes; 24%). Among the OCTA parameters, classic, mixed, and occult CNVs showed significantly different values of VDisp, with classic forms showing the highest values and occult CNVs the lowest (P < 0.01); mixed forms displayed intermediate VDisp values. ROC analysis revealed that a baseline CNV vessel tortuosity cutoff of 8.40 distinguished two patient subgroups with significantly different visual outcomes after anti–vascular endothelial growth factor treatment. Conclusion: A baseline quantitative OCTA-based parameter could provide information regarding both clinical and functional outcomes after anti–vascular endothelial growth factor treatment in age-related macular degeneration-related CNV.
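
The sketch below shows one common way a baseline cutoff of this kind can be derived from an ROC analysis: scanning candidate thresholds and keeping the one that maximizes Youden's J (sensitivity + specificity − 1). The tortuosity values and outcome labels are synthetic placeholders, and the authors' exact ROC procedure may differ.

import numpy as np

def youden_cutoff(values, good_outcome):
    # Scan candidate thresholds; keep the one maximizing sensitivity + specificity - 1.
    values = np.asarray(values, dtype=float)
    good_outcome = np.asarray(good_outcome, dtype=bool)
    best_j, best_cut = -1.0, None
    for cut in np.unique(values):
        predicted_good = values >= cut
        sens = np.mean(predicted_good[good_outcome])
        spec = np.mean(~predicted_good[~good_outcome])
        j = sens + spec - 1
        if j > best_j:
            best_j, best_cut = j, cut
    return float(best_cut), float(best_j)

tortuosity = [5.1, 6.3, 7.2, 7.9, 8.4, 9.2, 10.5, 11.0]   # placeholder values
improved   = [0,   0,   0,   0,   1,   1,   1,    1]      # placeholder outcomes
print(youden_cutoff(tortuosity, improved))                # -> (8.4, 1.0)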

FEATURES OF THE MACULAR AND PERIPAPILLARY CHOROID AND CHORIOCAPILLARIS IN EYES WITH NONEXUDATIVE AGE-RELATED MACULAR DEGENERATION
Purpose: We investigated macular and peripapillary choroidal thickness (CT) and flow voids in the choriocapillaris in eyes with nonexudative age-related macular degeneration. Methods: We retrospectively reviewed the medical records of patients with nonexudative age-related macular degeneration and classified their eyes into three categories: pachydrusen, drusen, and subretinal drusenoid deposit. Mean macular and peripapillary CT and choriocapillaris flow void area were compared among the three groups. Results: The three groups included 29, 33, and 33 patients, respectively. Mean macular and peripapillary CT were 260.64 ± 75.85 µm and 134.47 ± 46.28 µm for the pachydrusen group; 163.63 ± 64.08 µm and 93.47 ± 39.07 µm for the drusen group; and 95.33 ± 28.87 µm and 56.06 ± 11.64 µm for the subretinal drusenoid deposit group (all P < 0.001). Mean macular and peripapillary flow void areas also differed among the subretinal drusenoid deposit group (57.07 ± 6.16% and 55.38 ± 6.65%), the drusen group (58.30 ± 6.98% and 49.11 ± 9.11%), and the pachydrusen group (50.09 ± 5.77% and 45.47 ± 8.06%) (all P < 0.001). Conclusion: The peripapillary CT and flow voids in the choriocapillaris varied according to the features of drusen in eyes with nonexudative age-related macular degeneration. Greater flow voids and thinner CT in eyes with subretinal drusenoid deposits may suggest that these eyes have diffuse choroidal abnormalities both within and outside the macula.
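
As a simple illustration of the flow void metric reported above, the sketch below computes the percentage of a binarized choriocapillaris slab that falls below a flow threshold. The slab values and threshold are placeholders; the study's segmentation and thresholding method is not detailed in the abstract, so this is an assumed workflow.

import numpy as np

def flow_void_percentage(slab, threshold):
    # Fraction of pixels below the flow threshold, expressed as a percentage.
    return 100.0 * np.mean(slab < threshold)

rng = np.random.default_rng(1)
slab = rng.random((304, 304))          # placeholder decorrelation values
print(round(flow_void_percentage(slab, 0.55), 1))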

REAL-COLOR VERSUS PSEUDO-COLOR IMAGING OF FIBROTIC SCARS IN EXUDATIVE AGE-RELATED MACULAR DEGENERATION
Purpose: To compare the morphological characteristics of subretinal fibrosis in late age-related macular degeneration using multicolor (MC) imaging, color fundus photography (CFP), and ultra-widefield CFP (UWFCFP). Methods: Thirty-two eyes of 31 patients diagnosed with subretinal fibrosis complicating exudative age-related macular degeneration were included. All eyes were imaged with MC, CFP, and UWFCFP. The overall ability to visualize fibrosis, its margins, and its dissimilarity from surrounding atrophy was graded by two readers using a four-point score (0: not visible; 1: barely visible; 2: mostly visible; 3: fully visible). The area of fibrosis was calculated. Scaling, lesion colocalization across all three imaging techniques, and area measurements were performed using ImageJ. Results: Ninety-six images of 32 eyes were graded. The average area of fibrosis was 14.59 ± 8.94 mm² for MC, 13.84 ± 8.56 mm² for CFP, and 13.76 ± 8.79 mm² for UWFCFP. Fibrosis was fully visible in 87.5% of cases using MC and in 50% using CFP and UWFCFP. The margins of fibrosis were sharply defined in 40.6% of eyes with MC, compared with 15.6% with CFP and 9.4% with UWFCFP. Multicolor imaging provided superior distinction between fibrosis and atrophy (100% for MC vs. 13.4% for CFP and 33.3% for UWFCFP). Inter- and intrareader agreement was high for all measurements (P < 0.0001). Conclusion: Multicolor technology allows improved visualization and analysis of subretinal fibrosis compared with CFP and UWFCFP, especially when surrounding atrophy is present.
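
The area measurements above rely on converting traced pixel regions to square millimeters using the image scale; a minimal sketch of that conversion step is shown below, with placeholder numbers rather than values from this study.

def area_mm2(pixel_count, pixels_per_mm):
    # Pixel count divided by the square of the linear scale (pixels per mm).
    return pixel_count / (pixels_per_mm ** 2)

print(area_mm2(150000, 100))   # 150,000 px at 100 px/mm -> 15.0 mm²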

PREVALENCE AND RISK FACTORS FOR THE DEVELOPMENT OF PHYSICIAN-GRADED SUBRETINAL FIBROSIS IN EYES TREATED FOR NEOVASCULAR AGE-RELATED MACULAR DEGENERATION
Purpose: To assess the prevalence and incidence of, and risk factors for, subretinal fibrosis (SRFi) in eyes with neovascular age-related macular degeneration (nAMD) that underwent vascular endothelial growth factor inhibitor treatment for up to 10 years. Methods: A cross-sectional and longitudinal analysis was performed on data from an nAMD registry. The presence and location of SRFi were graded by the treating practitioner. Visual acuity, lesion characteristics (type, morphology, and activity), and treatment administered at each visit were recorded. Results: The prevalence of SRFi in 2,914 eyes rose from 20.4% in year interval 0 to 1 to 40.7% in year interval 9 to 10. The incidence in 1,950 eyes was 14.3% at baseline and 26.3% at 24 months. Independent characteristics associated with SRFi included poorer baseline vision (adjusted odds ratio 5.33 [95% confidence interval 4.66–7.61] for visual acuity ≤35 letters vs. visual acuity ≥70 letters, P < 0.01), baseline lesion size (adjusted odds ratio 1.08 [95% confidence interval 1.08–1.14] per 1,000 µm, P = 0.03), lesion type (adjusted odds ratio 1.42 [95% confidence interval 1.17–1.72] for predominantly classic vs. occult lesions, P = 0.02), and proportion of active visits (adjusted odds ratio 1.58 [95% confidence interval 1.25–2.01] for the group with the highest level of activity vs. the lowest level of activity, P < 0.01). Conclusion: Subretinal fibrosis was found in 40% of eyes after 10 years of treatment. High rates of lesion activity, predominantly classic lesions, poor baseline vision, and larger lesion size appear to be independent risk factors for SRFi.
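
For readers less familiar with adjusted odds ratios, the sketch below shows the underlying arithmetic: a crude odds ratio from a 2x2 table, and the exponentiated coefficient that a multivariable logistic regression reports as an adjusted odds ratio. The counts and coefficient are placeholders, not registry data.

import math

def odds_ratio(a, b, c, d):
    # a, b: outcome present/absent in the exposed group;
    # c, d: outcome present/absent in the unexposed group.
    return (a / b) / (c / d)

print(odds_ratio(120, 180, 60, 240))   # crude OR ~ 2.67 (placeholder counts)
print(math.exp(0.35))                  # exp(beta) ~ 1.42: the adjusted OR for a
                                       # hypothetical regression coefficient of 0.35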

DIAGNOSTIC CHARACTERISTICS OF POLYPOIDAL CHOROIDAL VASCULOPATHY BASED ON B-SCAN SWEPT-SOURCE OPTICAL COHERENCE TOMOGRAPHY ANGIOGRAPHY AND ITS INTERRATER AGREEMENT COMPARED WITH INDOCYANINE GREEN ANGIOGRAPHY
Purpose: To examine the characteristics of polypoidal choroidal vasculopathy using B-scan optical coherence tomography angiography (OCTA) and to determine diagnostic criteria for polypoidal choroidal vasculopathy based on OCTA. Methods: This retrospective case series included patients diagnosed with treatment-naïve polypoidal choroidal vasculopathy who underwent indocyanine green angiography (ICGA) and swept-source OCTA at baseline. We compared the characteristics of the polyps detected using B-scan OCTA and ICGA, and then evaluated the diagnostic concordance for each polypoidal lesion between ICGA and OCTA. Results: Among 54 eyes of 52 patients, all 54 eyes showed flow signals indicating polyps on both ICGA and B-scan OCTA. All polyps on B-scan OCTA were detected as round/ring-like flow signals inside pigment epithelial detachments (PEDs), incomplete round/ring-like flow signals overlaid with round/ring-like OCT structures inside PEDs, or flow signals adjacent to a PED notch. Using B-scan OCTA, an independent evaluator detected 94.7% of the polypoidal lesions, with an overall accuracy of 92.6% for counting polypoidal lesions per eye relative to ICGA and a kappa value of 0.82. Conclusion: Polyp detection on B-scan OCTA demonstrates high accuracy and is comparable to that of ICGA. B-scan OCTA could replace ICGA for the diagnosis of polypoidal choroidal vasculopathy.
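
The interrater agreement statistic cited above (kappa) can be computed as in the minimal sketch below, which compares per-eye lesion counts obtained with two methods. The counts shown are invented for illustration and are not the study's data.

import numpy as np

def cohens_kappa(rater_a, rater_b):
    # Observed agreement corrected for the agreement expected by chance.
    a, b = np.asarray(rater_a), np.asarray(rater_b)
    categories = np.union1d(a, b)
    p_observed = np.mean(a == b)
    p_expected = sum(np.mean(a == c) * np.mean(b == c) for c in categories)
    return (p_observed - p_expected) / (1 - p_expected)

icga = [1, 2, 1, 3, 1, 2, 2, 1, 1, 2]   # lesions per eye on ICGA (invented)
octa = [1, 2, 1, 3, 1, 2, 1, 1, 1, 2]   # lesions per eye on B-scan OCTA (invented)
print(round(cohens_kappa(icga, octa), 2))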

VISUAL PROGNOSIS AFTER PNEUMATIC DISPLACEMENT OF SUBMACULAR HEMORRHAGE ACCORDING TO AGE-RELATED MACULAR DEGENERATION SUBTYPES
Purpose: This study compared the visual outcome after pneumatic displacement of submacular hemorrhage among patients with different subtypes of age-related macular degeneration (AMD). Methods: We retrospectively reviewed the medical records of 67 patients (67 eyes) who underwent treatment for submacular hemorrhage associated with AMD. All patients underwent pneumatic displacement. Demographic parameters, visual acuity, and anatomical features were analyzed among the AMD subtypes: typical AMD, polypoidal choroidal vasculopathy (PCV), and retinal angiomatous proliferation (RAP). Results: Among the eyes with submacular hemorrhage, 24, 30, and 13 eyes had typical AMD, PCV, and RAP, respectively. Post-treatment best-corrected visual acuity was best in the PCV group and worst in the RAP group (P < 0.001). The proportion of eyes with improved visual acuity was highest in the PCV subtype and lowest in the RAP subtype (P = 0.044). Logistic regression analysis showed that AMD subtype (P = 0.016) and time to treatment (<7 days; P = 0.037) were associated with the final visual outcome. Conclusion: The final post-treatment visual outcome after submacular hemorrhage was best in the PCV group and worst in the RAP group. Age-related macular degeneration subtype is a significant factor associated with the visual prognosis of submacular hemorrhage.

RISK OF AGE-RELATED MACULAR DEGENERATION IN PATIENTS WITH PERIODONTITIS: A Nationwide Population-Based Cohort Study
Purpose: Periodontitis is an inflammatory disease that results in loss of connective tissue and bone support. Evidence suggests a possible relationship between periodontitis and age-related macular degeneration (AMD). Methods: This population-based cohort study was conducted using data from the National Health Insurance Research Database in Taiwan, with a 13-year follow-up, to investigate the risk of AMD in patients with periodontitis. The periodontitis cohort included patients with newly diagnosed periodontitis between 2000 and 2012. The nonperiodontitis cohort was frequency-matched with the periodontitis cohort by age and sex, with a sample size of 41,661 in each cohort. Results: Patients with periodontitis had an increased risk of developing AMD compared with individuals without periodontitis (5.95 vs. 3.41 per 1,000 person-years; adjusted hazard ratio = 1.58 [95% confidence interval, 1.46–1.70]). The risk of developing AMD remained significant after stratification by age (adjusted hazard ratio = 1.48 [1.34–1.64] for age <65 years and 1.76 [1.57–1.97] for age ≥65 years), sex (adjusted hazard ratio = 1.40 [1.26–1.55] for women and 1.82 [1.63–2.04] for men), and presence of comorbidity (adjusted hazard ratio = 1.52 [1.40–1.66] for those with comorbidity and 1.92 [1.63–2.26] for those without comorbidity). In addition, patients with periodontitis showed an increased incidence of both nonexudative AMD (5.43 vs. 3.13 per 1,000 person-years) and exudative AMD (0.52 vs. 0.28 per 1,000 person-years). Conclusion: People with periodontitis could be at greater risk of developing AMD than those without periodontitis. However, more evidence is needed to support this association.
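
The incidence figures above are rates per 1,000 person-years; the short sketch below shows the underlying arithmetic with invented counts chosen only to illustrate the calculation, not values taken from the registry.

def incidence_per_1000_person_years(events, person_years):
    # Event count divided by total follow-up time, scaled to 1,000 person-years.
    return 1000.0 * events / person_years

# Invented example: 2,480 incident AMD cases over 416,800 person-years.
print(round(incidence_per_1000_person_years(2480, 416800), 2))   # -> 5.95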

OPTICAL COHERENCE TOMOGRAPHY ANGIOGRAPHY FEATURES OF FOCAL CHOROIDAL EXCAVATION AND THE CHOROIDAL STROMA VARIATIONS WITH OCCURRENCE OF EXCAVATION
Purpose: To describe retinal and choroidal vascular changes and choroidal stroma variations occurring in focal choroidal excavation (FCE). Methods: The study design was a cross-sectional case series. Consecutive patients affected by FCE and healthy controls were recruited. All patients underwent complete ophthalmologic assessment and multimodal imaging, including structural optical coherence tomography and optical coherence tomography angiography. Choroidal thickness and stromal index were calculated from structural optical coherence tomography images. Moreover, we measured vessel density values of the superficial capillary plexus, deep capillary plexus, and choriocapillaris at the level of the macula. Results: Twenty-two patients (28 eyes; mean age 57.2 ± 16.4 years) and 28 control eyes (mean age 56.5 ± 9.8 years) were included. Five patients (23%) were asymptomatic, whereas 17 patients (77%) complained of visual symptoms. FCE was associated with choroidal neovascularization in 10 eyes (35%). The choroidal stromal component was lower in FCE patients than in controls, whereas choroidal thickness was unremarkable. Stromal index values calculated in the region proximal to the FCE were significantly lower than the values obtained from the external region. Deep capillary plexus vessel density was lower in FCE eyes than in controls. The choriocapillaris was altered in the region surrounding the FCE, whereas it was normal in the external region. Conclusion: The deep capillary plexus and choriocapillaris were significantly altered in FCE patients. Moreover, the choroidal stroma was significantly reduced in the areas closer to the FCE compared with the surrounding choroid and with healthy controls, suggesting a weakening of the architectural support that creates a more friable point, which may favor FCE development.
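
The abstract does not detail how the stromal index was computed; a commonly used approach is to binarize the choroidal region and take the stromal pixel fraction of the total choroidal area. The sketch below follows that assumed convention with placeholder data and may differ from the authors' actual method.

import numpy as np

def stromal_index(choroid_roi, luminal_threshold):
    # Brighter pixels are treated as stroma and darker pixels as vessel lumen,
    # an assumed convention borrowed from common choroidal binarization studies.
    stromal = choroid_roi >= luminal_threshold
    return stromal.mean()

rng = np.random.default_rng(2)
roi = rng.random((120, 400))            # placeholder choroidal region of interest
print(round(stromal_index(roi, 0.45), 2))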

IDIOPATHIC FOVEAL HYPOPLASIA: Quantitative Analysis Using Optical Coherence Tomography Angiography
Purpose: To evaluate vascular density (VD), fractal dimension, and skeletal density on optical coherence tomography angiography in eyes with idiopathic foveal hypoplasia (IFH). Methods: Patients presenting with IFH to Creteil University Eye Clinic between January 2015 and October 2018 and age-matched healthy controls were retrospectively evaluated. Vascular density, skeletal density, and fractal dimension were computed over the whole optical coherence tomography angiography image of the superficial capillary plexus (SCP) and deep capillary plexus (DCP) using a custom algorithm. Vascular density was also compared between the two groups in the central 1 mm² and the peripheral 8 mm². Results: Thirty-six eyes of 21 patients (18 eyes with IFH and 18 control eyes) were included. A decrease of VD at the level of the SCP and DCP was found in eyes with IFH compared with healthy control eyes (P = 0.005 for the SCP and P = 0.003 for the DCP). In the central 1 mm², SCP VD was lower in healthy eyes (32.3 ± 4.8%) than in IFH eyes (55.6 ± 46.3%) (P < 0.001). Skeletal density was decreased in IFH eyes in both the SCP and the DCP (P ≤ 0.001). Fractal dimension was lower in IFH eyes in both the SCP and the DCP (P < 0.001). Conclusion: Vascular density, skeletal density, and fractal dimension are reduced at the level of the SCP and DCP in patients with IFH compared with controls, reflecting a particular anatomical and vascular organization. Quantitative analysis using optical coherence tomography angiography could help to evaluate the severity of IFH.
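
As a rough sketch of two of the metrics named above, the code below computes vessel density and skeletal density from a binarized angiogram. It assumes scikit-image is available for skeletonization and uses a random placeholder vessel map; the authors used a custom algorithm, so this is illustrative only, and fractal dimension (typically estimated by box counting) is omitted for brevity.

import numpy as np
from skimage.morphology import skeletonize   # assumed dependency

def vessel_density(binary_vessels):
    # Fraction of the image occupied by vessel pixels.
    return binary_vessels.mean()

def skeletal_density(binary_vessels):
    # Same fraction after reducing vessels to a one-pixel-wide skeleton.
    return skeletonize(binary_vessels).mean()

rng = np.random.default_rng(3)
angiogram = rng.random((304, 304)) > 0.6     # placeholder binarized vessel map
print(vessel_density(angiogram), skeletal_density(angiogram))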

