
Clinical Research
Oakland University William Beaumont School of Medicine
586 Pioneer Drive
Rochester, MI 48309
(248) 370-3634
This section includes Class of 2025 Embark Projects within the Clinical and Translational research areas. Many of these projects were initiated by OUWB clinical faculty across a wide range of clinical practice areas.
Frailty Among Revision Total Knee Arthroplasty Recipients: Epidemiology and Propensity Score Weighted Analysis of Impact on In-hospital Postoperative Outcomes
Avianna Arapovic1, Abdul Kareem Zalikha, M.D.2, Mazen Zamzam, M.S.1, Jacob Keeley, M.S.4, Inaya Hajj Hussein, Ph.D.5, Mouhanad El-Othmani, M.D.3
1Oakland University William Beaumont School of Medicine, Rochester, MI
2Stanford Medicine, Palo Alto, CA
3Warren Alpert Medical School of Brown University, Providence, RI
4Department of Research, Oakland University William Beaumont School of Medicine, Rochester, MI
5Department of Foundational Medical Studies, Oakland University William Beaumont School of Medicine, Rochester, MI
INTRODUCTION
Frailty has been shown to correlate with worse outcomes after total knee arthroplasty (TKA), although less is known regarding its impact on revision total knee arthroplasty (rTKA). This study examines the epidemiologic characteristics and inpatient outcomes of patients with frailty undergoing rTKA.
METHODS
Discharge data from the National Inpatient Sample registry were used to identify all patients 50 or older who underwent rTKA between 2006 and 2015. Patients were stratified into frail and non-frail groups based on the presence of specific ICD-9 diagnostic codes. An analysis comparing the epidemiology, medical comorbidities, and propensity score weighted postoperative clinical and economic outcomes of the two groups was performed.
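A minimal sketch of the propensity score weighting step is shown below, written in R with hypothetical column names (frail, age, sex, comorbidity flags, any_complication); it is not the authors' code, and a full NIS analysis would also incorporate the registry's discharge weights.

```r
# Hypothetical sketch of propensity score weighting (not the study code).
# 'dat' is assumed to hold one row per rTKA discharge with a frailty flag,
# baseline covariates, and a binary outcome such as any_complication.

# 1) Model the probability of being frail from baseline characteristics
ps_model <- glm(frail ~ age + sex + chf + diabetes + renal_failure,
                data = dat, family = binomial())
dat$ps <- predict(ps_model, type = "response")

# 2) Inverse probability of treatment weights (average treatment effect form)
dat$iptw <- ifelse(dat$frail == 1, 1 / dat$ps, 1 / (1 - dat$ps))

# 3) Weighted outcome model: odds of any complication, frail vs. non-frail
out_model <- glm(any_complication ~ frail, data = dat,
                 family = quasibinomial(), weights = iptw)
exp(coef(out_model)["frail"])  # weighted odds ratio
```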
RESULTS
From 2006 to the third quarter of 2015, a total of 576,920 patients (17,727 frail) who underwent rTKA were included. The average age of the study population was 67.2 years, and 57.4% of patients were female. Frail patients were more likely to exhibit significantly higher rates of almost all Modified Elixhauser Comorbidities than their non-frail counterparts. Frail patients were also more likely to undergo different types of revision, including an increased rate of removal of the prosthesis without replacement. Additionally, frail patients displayed increased likelihood of experiencing any postoperative complication, DVT, postoperative anemia, respiratory complications, and wound dehiscence. Frail patients experienced lower rates of discharge home and a longer length of stay (LOS) than the non-frail cohort.
CONCLUSIONS
Patients with frailty undergoing rTKA are at significantly higher risk for inpatient postoperative complications and increased LOS. Understanding the implications of frailty within rTKA is essential for risk assessment and preoperative optimization for this expanding population.
Stroke Risk After Lowering Elevated Blood Pressure in Transient Ischemic Attack
Alexsandra Biel, B.A.1, Jacob Keeley, M.S.3, Brett Todd2
1Oakland University William Beaumont School of Medicine, Rochester, MI
2Emergency Medicine, Corewell Health William Beaumont University Hospital, Royal Oak, MI
Oakland University William Beaumont School of Medicine, Rochester, MI
3Department of Research, Oakland University William Beaumont School of Medicine, Rochester, MI
INTRODUCTION
Hypertension is a risk factor for developing stroke after transient ischemic attack (TIA), yet it is unknown if stroke risk is altered by emergency department (ED) antihypertensive therapy. We aimed to investigate stroke rate in a population of TIA patients presenting with elevated blood pressure in the ED, comparing those who received antihypertensive medication in the ED to those who received no treatment. Secondarily, we aimed to assess the association between ED antihypertensive therapy and intensive care unit (ICU) admit rates, hospital length of stay (LOS), and discharge disposition setting in this population.
METHODS
We conducted a retrospective cohort study evaluating adult TIA patients presenting with an elevated blood pressure (systolic ≥ 140 mm Hg or diastolic ≥ 90 mm Hg) at any of our Metro Detroit hospital system’s EDs between August 2016 and April 2022. We collected data on age, sex, race, blood pressure in the ED, ED antihypertensive therapy, stroke in the subsequent hospital stay, hospital LOS, ICU admission rates, and discharge disposition. Patient characteristics were summarized using descriptive statistics and two-sample hypothesis testing. We assessed the outcomes of antihypertensive treatment using multivariable logistic regression controlling for patient characteristics.
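As a rough illustration of the multivariable model described above, the R sketch below uses hypothetical variable names (stroke, treated, sbp_ed) and is not the study code; it shows one way an adjusted odds ratio with a 95% CI could be obtained.

```r
# Hypothetical sketch: adjusted odds of in-hospital stroke for patients who
# received ED antihypertensive therapy vs. those who did not.
fit <- glm(stroke ~ treated + age + sex + race + sbp_ed,
           data = tia, family = binomial())
exp(cbind(aOR = coef(fit), confint(fit)))["treated", ]  # aOR with 95% CI
```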
RESULTS
There were 3,095 patients included in our analysis, of which 21.0% (649) received antihypertensive treatment and 13.9% (429) suffered a stroke. There was no significant difference in stroke rate in the treatment group compared to the no treatment group (aOR, 1.12 [95% CI, 0.87 - 1.43]). There was a slightly longer hospital LOS in the treatment group (2.1 days vs 1.9 days), but no differences were seen in ICU admission or discharge disposition.
CONCLUSIONS
In TIA patients presenting with elevated blood pressure in the ED, antihypertensive therapy does not appear to be associated with decreased stroke risk in the subsequent hospital stay.
Depression Severity in Patients with Traumatic Brain Injuries: Violent vs Non-Violent Causes
Emily Brunett, B.S.1, Dwayne Baxa, Ph.D.2
1Oakland University William Beaumont School of Medicine, Rochester, MI
2Department of Foundational Medical Studies, Oakland University William Beaumont School of Medicine, Rochester, MI
INTRODUCTION
Individuals who experience traumatic brain injuries (TBI) may suffer from a variety of physical, cognitive, and psychological effects, with depression being a common complication. Those who sustain TBIs from violent causes may have additional psychological influences that could exacerbate depressive symptoms. Understanding this relationship may help physicians develop more thorough treatment plans for TBI patients. This study aims to compare depression severity, as measured by Patient Health Questionnaire-9 (PHQ-9) scores, in patients with TBIs from violent and non-violent causes.
METHODS
This is a cross-sectional, retrospective observational study utilizing data from the published 2018 TBI Model System Collaborative: Characterization and Treatment of Chronic Pain after Moderate to Severe Traumatic Brain Injury study, in which 3,804 TBI Model Systems participants 1 to 30 years post-injury completed a survey. The current study categorized those participants into violent (gunshot wounds, assaults, other violence) and non-violent (e.g., motor vehicle accidents, falls, sporting) injury groups and analyzed their PHQ-9 scores to assess their depression severity.
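The Results report Wilcoxon two-sample and chi-squared tests; a minimal R sketch of that kind of group comparison, using assumed column names (phq9, injury_group), is shown below and is not the study code.

```r
# Hypothetical sketch: comparing PHQ-9 scores between violent and
# non-violent TBI injury groups.
wilcox.test(phq9 ~ injury_group, data = tbi)              # rank-based comparison
tapply(tbi$phq9, tbi$injury_group, median, na.rm = TRUE)  # group medians
```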
RESULTS
A total of 3,543 TBI patients were included (violent: 429, non-violent: 3,114). Wilcoxon two-sample tests and chi-squared tests revealed significantly higher median PHQ-9 scores in violent injury patients (6.0, IQR 2-11) compared to non-violent injury patients (4.0, IQR 1-8) with p<0.0001. Additionally, the mean PHQ-9 score was higher in the violent injury group (7.2 ± 6.1) compared to the non-violent group (5.4 ± 5.7).
CONCLUSIONS
Patients with TBIs from violent causes reported significantly higher scores on the PHQ-9 depression scale compared to those with non-violent injuries. Although statistically significant, both groups scored within the “mild” level of depression based on the PHQ-9 scale. Addressing psychological health may be beneficial in the treatment of patients with TBIs from both violent and non-violent causes.
Manual Versus Mechanical Cardiopulmonary Resuscitation Complications After Successful Resuscitation for Out-of-Hospital Cardiac Arrest
Connie Chen, M.S.1, Jacob Keeley, M.S.3, Julian Sit, D.O.2, Robert Swor, D.O.2
1Oakland University William Beaumont School of Medicine, Rochester, MI
2Corewell Health William Beaumont University Hospital, Department of Emergency Medicine, Royal Oak, MI
3Department of Research, Oakland University William Beaumont School of Medicine, Rochester, MI
INTRODUCTION
Mechanical CPR is increasingly being used for out-of-hospital cardiac arrest (OHCA) care in the field. Existing literature does not identify a survival benefit of mechanical versus manual CPR. We hypothesized that CPR-related injury may impact patient outcomes. Our primary objective compares mechanical and manual CPR-related injury in resuscitated OHCA patients. Our secondary objective compares hospital outcomes, including length of stay (LOS) and survival, between these two CPR methods.
METHODS
We performed a retrospective study of adult OHCA patients admitted to teaching hospitals in Southeastern Michigan from 2017-2021. Resuscitated patients were matched to hospital electronic medical records (EMRs), included if they had CT imaging of chest or abdomen/pelvis, and dichotomized by CPR method. Patients with no EMR match or unknown CPR method were excluded. Hospital EMRs were queried for CT imaging results, LOS variables, and survival to hospital discharge. Injuries were identified using hospital ICD-10 codes.
RESULTS
808 patients were admitted after OHCA. 235 (132 manual, 103 mechanical) met inclusion criteria. Demographics between groups were comparable in age, gender, and body mass index (BMI). CPR-associated injury was more common with manual CPR (28.8% vs 15.5%, p=0.02), primarily due to an increased rate of rib(s), sternum, or thoracic spine fracture (27.3% vs 14.6%, p=0.02). We identified no differences in survival to hospital discharge, median hospital or ICU LOS, or ventilator time between groups. Paradoxically, amongst survivors, mechanical CPR was associated with longer LOS (15.7 vs 11.0 days, p=0.01) and a non-significant increase in ventilator time (5.5 vs. 4.1 days, p=0.23).
CONCLUSIONS
We identified a higher rate of injury with manual CPR compared to mechanical CPR. We also did not identify any association between CPR method and ICU LOS, ventilator time, or survival. Further work is needed to assess the impact of CPR method and resuscitation-associated injuries on outcomes.
Impact of Chronic Kidney Disease Stages on In-Hospital Outcomes Following Total Joint Arthroplasty
Natalie Dakki, B.S.1, Inaya Hajj Hussein, Ph.D.2, Abdul Zalikha, M.D.3, Mouhanad El-Othmani, M.D.4
1Oakland University William Beaumont School of Medicine, Rochester, MI
2Department of Foundational Medical Studies, Oakland University William Beaumont School of Medicine, Rochester, MI
3Department of Orthopaedic Surgery, Stanford University School of Medicine, Stanford, CA
4Department of Orthopaedic Surgery, Warren Alpert School of Medicine, Brown University, Providence, RI
INTRODUCTION
Past studies have shown that chronic kidney disease (CKD) may play a role in postoperative total joint arthroplasty (TJA) outcomes, though less is known about the risk associated with each CKD stage. This study aims to analyze the impact of CKD stage on in-hospital postoperative outcomes after TJA. We hypothesize that patients with more severe CKD will have a higher risk of adverse postoperative outcomes.
METHODS
Discharge data were obtained from the National Inpatient Sample, and ICD-9 codes were used to identify inpatient postoperative TJA outcomes from 2006 to 2016. An unweighted multivariable odds ratio analysis, which accounted for race, sex, age, and year, was performed, comparing various CKD stages to a control group without CKD. Outcomes included any complication, central nervous system complications, cardiac complications, deep vein thrombosis (DVT), genitourinary (GU) complications, hematoma, infection, wound dehiscence, and postoperative anemia.
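A minimal R sketch of the stage-wise odds ratio analysis is given below, with hypothetical column names (ckd_stage, any_complication) and a no-CKD reference level; it illustrates the approach, not the authors' actual code.

```r
# Hypothetical sketch: odds of a postoperative complication by CKD stage,
# relative to patients without CKD, adjusted for race, sex, age, and year.
tja$ckd_stage <- relevel(factor(tja$ckd_stage), ref = "none")
fit <- glm(any_complication ~ ckd_stage + race + sex + age + year,
           data = tja, family = binomial())
exp(coef(fit))[grep("^ckd_stage", names(coef(fit)))]  # OR for each stage vs. none
```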
RESULTS
The CKD stage 4 group was found to have the highest significant risk of any (p<0.0001), cardiac (p<0.0001), DVT (p<0.0001), infection (p<0.0001), and anemia (p<0.0001) complications. The stage 5 group had the highest significant risk of CNS (p=0.0010) and GU (p=0.0028) complications.
CONCLUSIONS
In general, patients with more advanced CKD tended to have higher rates of complications after TJA when compared to control. Interestingly, the CKD stage 5 group often had lower odds than the stage 4 group when compared to control. These results may be partially explained by a selection bias for the most optimized stage 5 patients, as patients with CKD stage 5 on dialysis may undergo a more stringent perioperative clearance and optimization process. More research is needed to clarify the outcomes and utility of perioperative optimization for these patients.
Predicting Coronary Artery Calcium Using Abdominal Aorta and Visceral Artery Calcification
Jared Dixon, B.S.1, Sayf Al-Katib, M.D.2, Kiran Nandular, M.D.2, Jacob Gannam, M.D.2, Ali Beydoun, M.D.2, Desiree Clement, M.D.2
1Oakland University William Beaumont School of Medicine, Rochester, MI
2Corewell Health William Beaumont University Hospital, Royal Oak, MI
INTRODUCTION
A positive correlation between coronary artery calcium (CAC) scoring and cardiovascular disease (CVD) is well documented in current literature. This study aims to predict CAC scoring in asymptomatic patients by grading calcification of the abdominal aorta (AA), celiac artery (CA), superior mesenteric artery (SMA), and renal arteries (RA) discovered incidentally on non-contrast computed tomography (CT) imaging of the abdomen.
METHODS
Retrospective analysis of 281 patients who underwent coronary artery calcium scoring as well as non-contrast CT of the abdomen and pelvis within one year was performed by two independent board-eligible radiologists blinded to CAC scores. Atherosclerotic disease of the abdominal aorta and visceral arteries was graded on a 0-3 numerical scale. Raw data for each artery were compiled independently. Comparative analysis of AA and abdominal visceral arterial calcification (AVAC) scores was performed in relation to CAC score from previous examinations. Inter-reader reliability analysis was also performed.
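Inter-reader reliability for two readers grading on a 0-3 scale is often summarized with Cohen's kappa; the abstract does not state which statistic was used, so the R sketch below (assumed vectors reader1_grades and reader2_grades) is only one plausible approach.

```r
# Hypothetical sketch: unweighted Cohen's kappa for two readers' 0-3 grades.
cohen_kappa <- function(r1, r2) {
  tab <- table(factor(r1, levels = 0:3), factor(r2, levels = 0:3))
  n   <- sum(tab)
  po  <- sum(diag(tab)) / n                      # observed agreement
  pe  <- sum(rowSums(tab) * colSums(tab)) / n^2  # agreement expected by chance
  (po - pe) / (1 - pe)
}
cohen_kappa(reader1_grades, reader2_grades)
```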
RESULTS
A moderate AA score (>2) and presence of calcification in any of the CA, SMA, or RA (>1) were found to be significant predictors of moderate to severe CVD risk signified by a CAC >300 (95% CI from 0.690 to 0.795). AVAC is non-inferior to moderate abdominal aortic calcification (AAC) for predicting CAC >300 (95% CI from 0.723 to 0.824). A score of zero for the AAC and zero for all abdominal visceral arteries are both predictive of CAC=0 (95% CI from 0.593 to 0.707 and 0.632 to 0.743, respectively). Visceral arteries are non-inferior to the abdominal aorta for predicting CAC=0 (p=0.4217).
CONCLUSIONS
AA calcification and AVAC scores are reliable predictors of both mild to moderate CVD and low-risk CVD within 5-10 years. This makes AA and AVAC scoring especially useful as a screening or diagnostic tool when calcification is an incidental finding or when no CAC score is available.
Water-Soluble Contrast Challenge in the Management of Pediatric Adhesive Small Bowel Obstruction: A Systematic Review
Maximilian Fliegner, B.A.1, Daniel Finn, M.D.2, Spencer Wilhelm, M.D.2, Anthony Stallion, M.D.3, Begum Akay, M.D.3, Pavan Brahmamdam, M.D.3
1Oakland University William Beaumont School of Medicine, Rochester, MI
2Department of Surgery, Corewell Health East, Royal Oak, MI
3Pediatric Surgery, Children’s Corewell Health East, Royal Oak, MI
INTRODUCTION
Adhesive small bowel obstruction (ASBO) is a common complication of intra-abdominal surgery that may be treated non-operatively. Hyperosmotic oral contrast challenges have been used in the non-surgical management of ASBO, although the pediatric literature regarding the use of contrast challenge in ASBO is limited. The purpose of our systematic literature review was to evaluate the efficacy of contrast challenge in non-operative treatment of ASBO in pediatric patients.
METHODS
We followed PRISMA guidelines for this review and performed a comprehensive search strategy of multiple databases. Studies whose patient samples involved only subjects aged 18 and younger were included. Articles were compiled, then titles and abstracts were screened by two independent reviewers with a third-party tie breaker. A comprehensive list of relevant data points was developed for full text data extraction, with emphasis on resolution of bowel obstruction as a primary outcome.
RESULTS
Five studies were included from four countries, including two retrospective reviews, one retrospective cohort, and two prospective observational studies. Gastrografin and Cystografin were both studied with age-based dosing. Serial X-rays determined resolution or failure of contrast passage. A total of 125 patients with ASBO were given water-soluble contrast with 68% resolution of bowel obstruction (Figure). Reported adverse effects were non-operative failure, diarrhea, colicky pain, and recurrence of SBO after Gastrografin, with varying rates.
CONCLUSIONS
Literature supports the use of water-soluble contrast in ASBO in children; however, the number of studies is small and heterogeneous. Future studies should focus on dosage, surgical decision-making, and adverse events.
Pilot Study: Assessing Chair Yoga Therapy Compliance in Chronic Pain Patients
Andrew Thomas Glaza, B.S.1, Merzia Subhan, B.S.1, Rebecca Clemans2
1Oakland University William Beaumont School of Medicine, Rochester, MI
2Corewell Health William Beaumont University Hospital, Royal Oak, MI
INTRODUCTION
Chronic pain affects over 30% of the global population, imposing substantial personal and economic burdens. Multifaceted treatment plans, including pharmacological, physical rehabilitation, and interventional strategies, are recommended; however, barriers to care persist. This pilot prospective cohort study evaluated the feasibility of chair yoga therapy—a nonpharmacological, breath-centric intervention—for managing chronic pain and anxiety. Patients were instructed to integrate in-clinic techniques into home practice, with outcomes measured by compliance and changes in pain and anxiety levels.
METHODS
Twelve chronic pain patients at the Corewell Health William Beaumont University Pain Center completed baseline pain (PEG) and anxiety (GAD-7) assessments before receiving tailored chair yoga instruction. Participants recorded post-session pain and anxiety levels in clinic and via Qualtrics before and after self-led home sessions. Pre- and post-session scores were analyzed using paired t-tests.
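A minimal R sketch of the paired pre/post comparison described above, using assumed column names (peg_pre, peg_post, gad7_pre, gad7_post); this is an illustration of the test, not the study code.

```r
# Hypothetical sketch: paired t-tests on pre- vs. post-session scores.
t.test(yoga$peg_post,  yoga$peg_pre,  paired = TRUE)   # pain (PEG)
t.test(yoga$gad7_post, yoga$gad7_pre, paired = TRUE)   # anxiety (GAD-7)
```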
RESULTS
Median pre-intervention PEG and GAD-7 scores were 6.2 and 6.0, respectively. Post-intervention, median PEG scores increased slightly to 6.4, while GAD-7 scores modestly decreased to 5.0. Median changes in both scores were 0.0, with no statistically significant differences (pain: p = 0.8125; anxiety: p = 0.5000).
CONCLUSIONS
Chair yoga therapy showed minimal impact on pain and anxiety levels within this cohort, likely due to low compliance with survey submissions and limited follow-up. These findings highlight challenges in engaging patients with nonpharmacological interventions and emphasize the need for improved follow-up strategies. Future research should explore alternative approaches to better evaluate the role of chair yoga in chronic pain management.
Cost-Effectiveness of Routine Collection of Sinus Contents in Skull-Base Patients
Jonathan Grey, B.S.1, Caroline Roberts1, Amanda Bachand, M.S.1, Jacob Keeley, M.S.2, Adam Folbe, M.D.3
1Oakland University William Beaumont School of Medicine, Rochester, MI
2Department of Research, Oakland University William Beaumont School of Medicine, Rochester, MI
3Department of Otolaryngology - Head and Neck Surgery, Corewell Health William Beaumont University Hospital, Royal Oak, MI
INTRODUCTION
Following endoscopic skull-base surgeries, it is common practice to send sinus contents for histopathological examination in addition to the primary pathology. The surgical approach passes through the nasal cavity and sinuses, and any removed sinus contents are often sent as a separate pathology specimen. However, this practice adds to procedural costs. We hypothesize that routinely sending sinus contents is not cost-effective and does not impact clinical management.
METHODS
We conducted a retrospective review of patients who underwent endoscopic endonasal transsphenoidal skull-base surgery performed by a single Otolaryngologist and Neurosurgeon at a tertiary referral center from January 2017 to December 2022. Data were obtained from the institution’s electronic medical record, including patient demographics, presenting symptoms, comorbidities, body mass index (BMI), primary diagnosis, and pathology results. We compared preoperative, postoperative, and histologic diagnoses, excluding cases where sinus contents were not submitted for histologic examination.
RESULTS
A total of 208 procedures were analyzed. In only three cases were sinus contents found to be abnormal. One patient was diagnosed with subclinical Chronic Lymphocytic Leukemia (CLL), which required only observation. In two other patients, fungal mycetomas were found in the sinus contents, and no further treatment was needed. The total cost to society of sending sinus contents for pathology was $29,120, while only $420 resulted in any diagnosis change. Despite this cost, no patient required additional intervention based on sinus contents pathology.
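The reported totals are internally consistent with a flat per-specimen pathology charge, as the quick arithmetic check below shows (derived from the figures above, not stated explicitly in the abstract).

```r
# Arithmetic check of the reported costs.
total_cost   <- 29120   # cost of sending sinus contents for all cases
n_procedures <- 208
n_abnormal   <- 3       # cases with any change in diagnosis
per_specimen <- total_cost / n_procedures   # $140 per specimen
per_specimen * n_abnormal                   # $420 tied to a diagnosis change
```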
CONCLUSIONS
Our hypothesis was supported: routine submission of sinus contents for pathology in skull-base cases is not cost-effective and does not alter patient management. This economic evaluation argues strongly against the practice, as it rarely uncovers another disease process or affects clinical decisions. Furthermore, given the lack of significant benefit and the associated costs, the routine collection of sinus contents for pathology should be discontinued.
Do Critically Ill Patients with Guardianships Receive More Aggressive Treatment?
Misha Haq, B.S.1, Jacob Keeley, M.S.2, Enrique Calvo-Ayala3
1Oakland University William Beaumont School of Medicine, Rochester, MI
2Department of Research, Oakland University William Beaumont School of Medicine, Rochester, MI
3Corewell Health, Royal Oak, MI
INTRODUCTION
Patients suffering from certain conditions may become unable to make their own end-of-life decisions and may require a guardian to make medical decisions on their behalf. Studies have shown that patients with guardians tend to receive more aggressive care, regardless of the benefit. The aim of this project is to compare the rates of aggressive medical procedures between patients with and without legal guardianship.
METHODS
A case-control study was carried out. All ICU admissions at William Beaumont Hospital (Royal Oak) between 1/1/2019 and 12/31/2023 were queried using EPIC. Cases were defined as patients admitted to the ICU with legal guardians; controls were defined as patients admitted to the ICU without guardians and were matched based on age, gender, and an ICU severity score. Data extracted included age, gender, length of stay in the ICU, occurrence of aggressive procedures (CPR use, tracheostomy, gastrostomy), hospice and death disposition, and presence and relation of guardian. Statistical analysis was performed using unequal variance two-sample t-tests and chi-square tests.
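A minimal R sketch of the two test families named above, with hypothetical column names (guardian, icu_los, cpr); the actual matched analysis is not reproduced here.

```r
# Hypothetical sketch of the statistical tests described above.
t.test(icu_los ~ guardian, data = icu, var.equal = FALSE)  # Welch (unequal-variance) t-test
chisq.test(table(icu$guardian, icu$cpr))                   # chi-square for CPR occurrence
```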
RESULTS
The use of CPR was higher in the control group (9.6%) compared to the case group (4.7%), p=0.001. The case group (6.7%) had higher rates of gastrostomy tube placements compared to the control group (3%), p=0.0037. There was no significant difference in placement of tracheostomies between case (5.3%) and control (5.4%) groups, p=0.8953.
CONCLUSIONS
Patients without guardianship had increased occurrences of CPR when compared to patients with guardianship. Patients with guardianship had increased placement of gastrostomy tubes compared to patients without guardianship. There was no significant difference in tracheostomy placement between patients with and without guardians who were admitted to the ICU.
Familial Chiari Malformation: Prevalence of Connective Tissue Disorders and other Comorbidities
Anne Heukwa-Tefoung, B.S.1, Holly Gilmer, M.D.2
1Oakland University William Beaumont School of Medicine, Rochester, MI
2Michigan Head and Spine Institute, Novi, MI
INTRODUCTION
Chiari malformation type I (CM-I) is an anatomical abnormality characterized by the inferior displacement of cerebellar tonsils through the foramen magnum into the upper cervical spine. Symptoms include headache, neck pain, dizziness, diplopia, swallowing difficulty, and unsteady gait. The surgical treatment consists of posterior fossa decompression, with or without duraplasty and C1 laminectomy. Connective tissue disorders are associated with increased risk of poor wound healing, progressive scoliosis, and other delayed complications. However, published data about the prevalence of connective tissue disorders in familial Chiari malformation remain limited. Thus, the goal of this study is to determine the prevalence of connective tissue disorders and other comorbidities in familial CM-I.
METHODS
890 participants diagnosed with CM-I between January 2012 and December 2022 were invited, via email, to complete an online survey composed of ten health-related questions. Data variables were assessed using the Chi-square test.
RESULTS
382 participants (54 males, 328 females) completed the survey (response rate = 42.9%). Due to some incomplete responses, 354 responses were analyzed. Data were divided into two groups. Compared to families with only one person affected, families with two or more people with CM-I were more likely to have a family history of joint replacements (39% vs. 59%), joint dislocations (15% vs. 51%), easy bruising (52% vs. 75%), and diagnosed connective tissue disorders (15% vs. 41%). Ehlers-Danlos syndrome (EDS) was the most prevalent diagnosed connective tissue disorder in familial CM-I.
CONCLUSIONS
Previous studies have shown that Chiari patients with comorbidities have a higher risk of treatment failure. Our data suggest that connective tissue disorders may be ubiquitous in patients with familial CM-I. Pre-operative identification of connective tissue disorders and other comorbidities in CM-I patients may contribute to strategies to mitigate this risk.
Clinical Patterns and Risk Factors of Twiddler’s Syndrome: A Case Series
Kendall Highhouse, B.S.1, Dwayne Baxa, Ph.D.2
1Oakland University William Beaumont School of Medicine, Rochester, MI
2Department of Foundational Medical Studies, Oakland University William Beaumont School of Medicine, Rochester, MI
INTRODUCTION
Twiddler’s syndrome classically refers to pacemaker or defibrillator malfunction due to manipulation of the device body by the patient. This manipulation causes dislodgment and/or proximal retraction of the leads. Although this condition has been underreported in the literature, it was initially thought to be most prominent in patients with dementia or other neurocognitive illness; however, there are increasing cases in patients without known neurocognitive disease. The primary goal of this study is to determine common risk factors present in Twiddler’s syndrome patients featured in published case reports. The secondary goal is to use these findings to develop recommendations for clinicians to allow for prompt recognition of patients with Twiddler’s syndrome.
METHODS
PubMed was utilized to seek out published cases of Twiddler’s syndrome. Cases selected were published from 2004-2024 in an effort to identify more recent literature while including as many cases as possible. Article type was restricted to case report. Search terms included “Twiddler’s syndrome,” “Twiddler,” “pacemaker AND malfunction.” Given that some cases provided scant information about the patient and case, articles were required to include age, sex, description of presenting symptoms, and findings from clinical workup. Seven cases were determined to meet these criteria.
RESULTS
Of the seven cases described, only two confirmed existing neurocognitive disorders in the patient. Six of the seven cases occurred in women over the age of fifty, with the remaining case occurring in a male with morbid obesity. All patients in these cases presented with sudden cardiac symptoms consistent with pacemaker misfiring and had diagnostic confirmation of Twiddler’s syndrome.
CONCLUSIONS
The results demonstrate that Twiddler’s syndrome can often occur outside of the classic scenario of neurocognitive disease and intentional pacemaker manipulation. The results show that other risk factors such as significant obesity, female sex, and age >50 may also predispose patients to ICD malpositioning.
Veracity Enhancement Using Non-Invasive Brain Stimulation - A Systematic Review
Alice Hou1, Shahrukh Naseer1, Abram Brummett, Ph.D.2
1Oakland University William Beaumont School of Medicine, Rochester, MI
2Department of Foundational Medical Studies, Oakland University William Beaumont School of Medicine, Rochester, MI
INTRODUCTION
For decades, non-invasive brain stimulation (NIBS) techniques like transcranial magnetic stimulation (TMS) and transcranial direct-current stimulation (tDCS) have been used in clinical settings as forms of neuromodulatory treatment for depression and other neurodivergent disorders. In recent years, there has been increased research in utilizing these non-invasive techniques to enhance cognition beyond one’s baseline, including memory, moral, and veracity enhancements. Due to the lack of systematic reviews in the growing field of veracity enhancement, we aim to provide a summative overview of current research studying the impacts of NIBS on truthfulness, as well as its ethical dilemmas and future directions.
METHODS
A systematic review was conducted according to PRISMA guidelines using the PubMed, Web of Science, and PsycINFO databases with the search terms: (TMS/tDCS) AND (deception OR verac* OR lie OR dishonest*).
For each study, data related to the study type, sample size, age, brain localization, stimulation intensity and duration, and findings were extracted and analyzed by two reviewers.
RESULTS
12 papers utilizing tDCS and 10 papers utilizing TMS were found to meet the inclusion criteria.
Utilizing tDCS, veracity was increased after anodal stimulation of the anterior prefrontal cortex (1/1 study), temporoparietal junction (1/1), ventrolateral prefrontal cortex (2/2), and dorsolateral prefrontal cortex (DLPFC) (4/6). Results were inconclusive in the other brain regions studied.
Utilizing TMS, studies examined differences between the left and right DLPFC, with follow-up studies focusing on the latter. Inhibition of the right DLPFC resulted in participants exhibiting more veracity, while stimulation of the right DLPFC resulted in more deception.
CONCLUSIONS
Based on this systematic review, NIBS techniques show the ability to exert distinct influences on veracity, with stimulation of different brain regions generating opposing effects. However, due to the small number of studies on this topic as well as occasionally inconsistent results, substantial further research and a more standardized methodology are still needed.
Longitudinal Study of Diabetic Retinopathy and Correlation of Renal Insufficiency with OCT and OCTA Abnormality
Fanny Huang, B.A.1, Miaomiao Yu, Ph.D.2, Ruikang Wang, M.D.3, Theodore Leng, M.D.2, Sophia Wang, M.D.2, Yaping Joyce Liao, M.D./Ph.D.2,4
1Oakland University William Beaumont School of Medicine, Rochester, MI
2Stanford University School of Medicine, Department of Ophthalmology, Stanford, CA
3University of Washington, Department of Ophthalmology, Seattle, WA
4Stanford University School of Medicine, Department of Neurology, Stanford, CA
INTRODUCTION
Diabetic retinopathy is the leading cause of blindness in the working-age population, and vision loss may be irreversible if not treated early. Vision loss manifests as microvascular retinal changes and abnormal neovascularization, which can be visualized through optical coherence tomography (OCT) and angiography (OCTA). In this study, we examined clinical laboratory biomarkers, ophthalmic biomarkers from OCTA and OCT, and visual acuity data to identify biomarkers predictive of visual acuity outcomes.
METHODS
A population of patients with diabetes mellitus (218 eyes) and healthy controls (95 eyes) was prospectively included. OCT/OCTA images were analyzed by a custom MATLAB script for six vessel parameters, and detailed phenotyping of these patients was performed by retrospective data analysis. Visual acuity of diabetic patients was obtained and analyzed from their initial visit (2014-2018) throughout their clinical care to their most recent follow-up visit (2020-2022).
RESULTS
Longitudinal evaluation over 7 years revealed that the majority of patients had relatively stable visual acuities. Cluster map analysis revealed that hyperglycemia and evidence of renal failure were more common in those with moderate to severe diabetic retinopathy. OCT and vascular parameters on OCTA correlated with severity of diabetic retinopathy and can be found even in those without clinical evidence of diabetic retinopathy. Vascular analysis using OCTA of the superficial capillary plexus revealed significant correlation of neurodegeneration of the unmyelinated optic nerve axon with 3 vascular parameters (vessel flux, vessel area density, and vessel complexity index) in the superior and inferior peripapillary quadrants – the areas most significantly affected in DM.
CONCLUSIONS
Hyperglycemia and renal dysfunction were correlated with worsening diabetic retinopathy severity. Ophthalmic imaging revealed that neurodegeneration was present even before patients presented with symptoms. Early screening for diabetic retinopathy is crucial to detect microvascular changes, which can help improve clinical diagnosis and treatment and prevent vision loss before retinal microvascular changes become irreversible.
Impact of PPROM Delivery Protocol on Adverse Neonatal Outcomes at Our Institution
Jacqueline Juarez, B.S.1, Sangeeta Kaur, M.D.2
1Oakland University William Beaumont School of Medicine, Rochester, MI
2Department of Obstetrics and Gynecology, Corewell Health William Beaumont University Hospital, Royal Oak, MI
INTRODUCTION
Traditionally, mothers diagnosed with PPROM (preterm prelabor rupture of membranes) between 34w0d and 36w6d had immediate delivery. We hypothesized that expectant management would be associated with a lower percentage of respiratory distress, less mechanical ventilation, and/or a shorter NICU length of stay; such findings could impact neonatal care and form a new standard for managing mothers diagnosed with PPROM between 34w0d and 36w6d.
METHODS
Through retrospective chart review at our institution, adverse outcomes among neonates born to mothers diagnosed with PPROM between 34w0d and 36w6d from January 1, 2020 to November 13, 2023 were compared using the following variables: rates of mechanical ventilation, rates of respiratory distress, and length of NICU stay.
RESULTS
Our data revealed a higher rate of respiratory distress in expectant management (36.0%) vs immediate delivery (15.9%), longer length of NICU stay in expectant management (1 day) vs immediate delivery (0.38 day), and higher rate of mechanical ventilation in expectant management (31.8%) vs immediate delivery (18.5%).
CONCLUSIONS
Our findings go against our hypothesis of a lower percentage of respiratory distress, mechanical ventilation, and/or length of NICU stay in expectant management. The comparison groups were unequal in size: 25 neonates born to mothers diagnosed with PPROM between 34w0d and 36w6d were managed expectantly vs 465 with immediate delivery. Immediate delivery is indicated if there is evidence of intraamniotic infection, abnormal fetal testing, and/or vaginal bleeding suggesting abruptio placentae (8), further contributing to the increased number of women in the immediate delivery group. At least half of patients deliver within 1 week of membrane rupture regardless of management; therefore, neonates born before 37 weeks carry an increased risk of respiratory distress, mechanical ventilation, and longer NICU stay, which could be contributing to the higher rates observed with expectant management compared to immediate delivery.
Procalcitonin as a Biomarker of Acute Kidney Injury in Patients with Suspected Bacterial Sepsis in the Acute Care Setting
Austin Kantola, B.S.1, Jose Navas Blanco, M.D.2, Jacob Keeley, M.S.3
1Oakland University William Beaumont School of Medicine, Rochester, MI
2Corewell Health, Royal Oak, MI
3Department of Research, Oakland University William Beaumont School of Medicine, Rochester, MI
INTRODUCTION
Acute kidney injury (AKI) is a common, life-threatening complication in patients with sepsis and septic shock. Sepsis-associated AKI raises in-hospital mortality eightfold and the risk of chronic kidney disease development threefold. Approximately one-half of patients with sepsis experience an AKI, and about one-third of all AKIs are related to sepsis. Early identification of AKI could save time and resources, lower costs, and foster more positive patient outcomes. Therefore, a biomarker for earlier AKI detection in patients with sepsis would be highly valuable. Due to the utility of procalcitonin (PCT) in monitoring bacterial sepsis severity, the authors hypothesize that its concentration in serum could serve as a biomarker for AKI risk stratification in patients suspected of bacterial sepsis.
METHODS
We present a retrospective, multi-center, cohort study of patients admitted between 2015 and 2021 with suspected bacterial sepsis (Systemic Inflammatory Response Syndrome (SIRS) criteria ≥ 2 and a blood culture ordered) and with a serum PCT measured within 24 hours of admission. The inclusion and exclusion criteria reflect data available in the acute care setting.
RESULTS
We found a significant association between serum PCT and the development of AKI in patients suspected of bacterial sepsis (Odds Ratio [OR] 1.011, p < 0.05, 95% CI 1.005 - 1.016). There was also a significant difference in serum PCT levels between patients with no AKI and those experiencing Kidney Disease: Improving Global Outcomes (KDIGO) AKI stage 1 (p < 0.0001, 95% CI 1.66 - 3.93); however, rising PCT levels were not associated with increasing KDIGO stage, nor was there a significant difference in PCT between patients who experienced no AKI and those who reached KDIGO stage 2 or 3.
CONCLUSIONS
In conclusion, although PCT is predictive of AKI, there is limited evidence to suggest serum PCT is clinically useful for AKI risk assessment.
Medication Assisted Treatment - The End in Mind
Nandita C. Kapur, B.S.1, Annas Aljassem, M.D.2
1Oakland University William Beaumont School of Medicine, Rochester, MI
2Department of Physical Medicine & Rehabilitation, Corewell Health
INTRODUCTION
Hospitals play a critical role as the first point of contact for opioid use disorder (OUD) treatment. However, stigma and skepticism about the effectiveness of medication-assisted treatment (MAT) hinder its adoption in acute care settings. Research in this care setting is currently limited, and more is needed to overcome misconceptions and enhance the efficacy of MAT in hospitals, improving stabilization and withdrawal management for OUD patients.
METHODS
Patients admitted between 2008 and 2023 under ICD-10-CM codes for OUD were screened based on predefined inclusion and exclusion criteria. Two analyses were conducted: one assessed trends in Methadone, Buprenorphine, or non-medication management following the implementation of a standardized education regimen from 2008 to 2023, while the other evaluated the impact of MAT on length of stay and readmission rates among patients from 2016.
RESULTS
Among patients who took MAT, there was no significant difference in length of stay (p=0.3469) or readmission rates at 1 (p=0.6940), 3 (p=0.4593), 6 (p=0.3320), and 12 (p=0.7262) months compared to patients who did not take any medications. Over the 16 years of MAT administration in the hospital setting, there was an increase in prescriptions when the standardized educational regimen was given to patients (p<0.0001), and a clear trend showed 2016 as the pivotal year.
CONCLUSIONS
Our findings suggest that a standardized education regimen for patients with OUD led to higher Methadone and Buprenorphine prescriptions, highlighting the role of provider knowledge in treatment access. Importantly, MAT initiation did not affect length of stay or readmission rates, countering concerns about delayed discharge or increased hospital utilization. Further research is needed to explore how housing, insurance, transportation, and structured discharge planning impact MAT continuation and readmission rates. Despite its efficacy, MAT remains underutilized in acute care due to stigma, underscoring the need for continued research to optimize OUD management.
Safety Profile of Spinal Cord Stimulator Paddle Lead Revision and Replacement
Saini Kethireddy, B.S.1, Michael D. Staudt, M.D.2, Eric R. Mong3
1Oakland University William Beaumont School of Medicine, Rochester, MI
2University Hospitals, Cleveland, OH
3Corewell Health William Beaumont University Hospital, Royal Oak, MI
INTRODUCTION
Spinal cord stimulation (SCS) is a neuromodulatory treatment for patients experiencing chronic pain resistant to conservative therapies. It is often used for post-laminectomy syndrome, complex regional pain syndrome, and neuropathic pain. SCS leads can be categorized into percutaneous leads and paddle leads, the latter requiring a laminectomy for implantation. The purported advantages of paddle leads include lower migration rates and more efficient energy delivery. Revision surgery may be required in instances of lead failure, migration, or misplacement. However, paddle lead replacement can be challenging due to extensive scar tissue formation, often requiring additional dissection. This has raised concerns about morbidity specific to paddle lead reoperation. This study aims to contribute to the understanding of SCS paddle lead revision in instances of misplaced electrodes.
METHODS
Participants who underwent revision and replacement of SCS paddle leads at Corewell Health William Beaumont University Hospital were identified retrospectively based on the appropriate billing codes. After medical record review, demographic data, pain ratings, operative factors and complications were recorded.
RESULTS
16 patients were identified who underwent lead replacement for misplaced SCS paddles. All paddles were replaced at the time of removal, and all patients required a skip laminectomy. The median age was 61 years (range 33-79). The duration between the index operation and revision was a median of 27.5 months (mean 44.8 ± 47.5 months). Operations lasted a median of 2.1 hours (mean 2.1 ± 0.4 hours). No perioperative complications were observed. All patients reported improved pain outcomes.
CONCLUSIONS
SCS paddle revision surgeries for misplaced leads can be performed safely and result in improved pain coverage and pain control.
Prostatic Urethral Length on MRI Potentially Predicts Late Genitourinary Toxicity After Prostate Cancer Radiation
Kyu Min Kim, B.S.1, Joseph Lee, M.D./Ph.D.2, Sirisha Nandalur, M.D.2, Allison Hazy, M.D.2, Sayf Al-Katib, M.D.3, Hong Ye, Ph.D.3, Nathan Kolderman, M.D.3, Abhay Dhaliwal, M.D.3, Daniel Krauss, M.D.2, Thomas Quinn, M.D.2, Kimberly Marvin2, Kiran R. Nandalur, M.D.3
1Oakland University William Beaumont School of Medicine, Rochester, MI
2Department of Radiation Oncology, Corewell Health William Beaumont University Hospital, Royal Oak, MI
3Department of Radiology and Molecular Imaging, Corewell Health William Beaumont University Hospital, Royal Oak, MI
INTRODUCTION
Radiation therapy is one of the most widely used treatments for prostate cancer. While many men tolerate treatment well, some experience late genitourinary (GU) toxicity that may impact their quality of life. The aim of this study was to evaluate baseline prostate MRI and clinical characteristics associated with GU toxicity after radiotherapy.
METHODS
This retrospective study evaluated men with baseline prostate MRIs who were treated with definitive radiotherapy for prostate cancer between 2016 and 2023. Prostate MRIs were evaluated by a single, experienced radiologist. GU toxicity was graded using CTCAE v4.0, with acute toxicity defined as ≤180 days and late toxicity as >180 days post-treatment. A multivariable logistic regression model was used for grade ≥2 acute toxicity, and Cox proportional hazards regression was used for late toxicity, adjusted for clinical factors and radiation therapy method.
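A minimal R sketch of the late-toxicity model described above, with hypothetical variable names (months_followup, late_gu_tox, prostatic_urethral_length, brachytherapy); the study's full covariate set and coding are not reproduced here.

```r
# Hypothetical sketch: Cox proportional hazards model for time to grade >=2
# late GU toxicity after radiotherapy.
library(survival)
fit <- coxph(Surv(months_followup, late_gu_tox) ~ prostatic_urethral_length +
               age + brachytherapy, data = pros)
summary(fit)$conf.int  # hazard ratios with 95% confidence intervals
```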
RESULTS
A total of 361 men were evaluated, with a median follow-up of 15 months. Brachytherapy was associated with increased odds of acute toxicity (OR: 2.9, 95% CI: 1.5–5.8, P < 0.01), while longer membranous urethral length was associated with decreased odds (OR: 0.41, 95% CI: 0.18–0.92, P = 0.03). At 3 years post-treatment, 28.0% of patients developed late GU toxicity. Longer prostatic urethral length was the only factor significantly associated with an increased risk of late GU toxicity (HR: 1.6, 95% CI: 1.2–2.1, P < 0.01), particularly urinary frequency and urgency symptoms (HR: 1.7, 95% CI: 1.3–2.3, P < 0.01).
CONCLUSIONS
Longer prostatic urethral length is associated with worse late GU toxicity after definitive prostate radiotherapy. This objective metric may be useful for treatment decision making.
Aquablation Compared with Simple Prostatectomy for Prostate Volumes >80 Grams
Joshua Kuperus, B.S.1, David Gangwish, M.D.2, Minhaj Jabeer, Ph.D.2, Bernadette M.M. Zwaans, Ph.D.3, Jason Hafron, M.D.3, Kenneth M. Peters, M.D.3, Aidan Kennedy, M.D.2, Paul Horning, M.D.2, Greg Palmateer1
1Oakland University William Beaumont School of Medicine, Rochester, MI
2Corewell Health William Beaumont University Hospital, Royal Oak, MI
3Oakland University William Beaumont School of Medicine, Rochester, MI
Corewell Health William Beaumont University Hospital, Royal Oak, MI
INTRODUCTION
Aquablation (Aqua) is a novel technique for treating benign prostatic hyperplasia and lower urinary tract symptoms. This study compares Aqua to simple prostatectomy (SP), analyzing functional urinary outcomes, adverse events (AE), and retreatment rates.
METHODS
A single-institution retrospective chart review was conducted for men undergoing open/robotic SP or Aqua from 2017 to 2023 for prostates >80 mL. Data collected included blood transfusions, AE, retreatment rates, postoperative medication use, and International Prostate Symptom Score (IPSS) with quality-of-life (QOL) indicator. Inverse probability of treatment weighting (IPTW) was applied to address differences in baseline characteristics, including prostate size. Statistical analyses were performed using R 4.4.0. Results are presented as IPTW-adjusted comparisons of SP to Aqua using Fisher’s exact test and analysis of variance, reported as beta coefficients (B) for continuous variables and odds ratios (OR) for categorical variables, with 95% confidence intervals (CI).
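The authors report using IPTW in R 4.4.0; their code is not shown, but the hypothetical base-R sketch below (assumed column names sp, ipss_1yr, retreat) illustrates the weighting idea and the two kinds of weighted comparisons (beta for continuous outcomes, OR for categorical outcomes).

```r
# Hypothetical sketch of IPTW (not the authors' code).
ps <- glm(sp ~ age + bmi + prostate_volume + retention,
          data = men, family = binomial())
p  <- predict(ps, type = "response")
men$w <- ifelse(men$sp == 1, 1 / p, 1 / (1 - p))  # inverse probability weights

# Weighted beta for a continuous outcome (1-year IPSS) ...
lm(ipss_1yr ~ sp, data = men, weights = w)
# ... and weighted odds ratio for a categorical outcome (retreatment)
exp(coef(glm(retreat ~ sp, data = men, family = quasibinomial(), weights = w)))
```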
RESULTS
In total, 172 patients were studied: 111 Aqua and 61 SP. Groups were well-matched for body mass index (Aqua 28.77 vs SP 28). Aqua patients were older (73.04 vs 68.89), had smaller prostates (135.46 vs 186.53 mL), and lower preoperative urinary retention (21.8% vs 47.5%). Baseline characteristics between groups were adjusted using IPTW. SP outperformed Aqua in 1-year IPSS scores (B=–3.4, CI:-5.7,-1.1, p=0.005), whereas QOL was comparable (B=–0.46, CI:–1.3, 0.33, p=0.2). SP patients continued alpha-blockers less often postoperatively (B=-0.27, CI:–0.39,-0.41, p=0.001). SP showed higher blood transfusion rates (OR=4.22, CI: 1.64, 13.2, p=0.006), longer hospital stays (B=1.7, CI: 1.0, 2.4, p<0.001), and longer operating times (B= 119, CI: 101, 135, p<0.001). SP had lower retreatment rates (OR=0.46, CI: 0.23, 0.87, p=0.019). AE were not significantly different (p=0.8).
CONCLUSIONS
Aqua outperformed SP for blood transfusions, hospital stay, and operative time. SP outperformed Aqua for retreatment rates, IPSS scores at 1-year follow-up, and reliance on alpha-blockers.
Safety and Utility of Robotics in Pediatric Surgery: Implementation and Initial Experience of a Pediatric Robotic Surgery Program
Luke Man, B.S.1, Kristin LeMarbe, M.D.2, Anthony Stallion, M.D.3, Nathan Novotny, M.D.3, Begum Akay, M.D.3, Pavan Brahmamdam, M.D.3
1Oakland University William Beaumont School of Medicine, Rochester, MI
2Department of Surgery, Corewell Health William Beaumont University Hospital, Royal Oak, MI
3Division of Pediatric Surgery, Corewell Health William Beaumont University Hospital, Royal Oak, MI
INTRODUCTION
This study describes the first four years of a pediatric robotic surgery program. The aim is to evaluate the safety and applicability of the robotic approach in a heterogeneous pediatric cohort with a varied array of surgical pathology and to provide a framework for integrating pediatric robotic surgery into practice at other institutions.
METHODS
This was a retrospective chart review of all pediatric robotic surgeries performed at a tertiary care hospital from February 2019 to January 2023. Data were collected on patient demographics, procedure type, indications, number of operating physicians and proctors, operative duration, intraoperative events, length-of-stay, and postoperative complications. Summary statistics were performed.
RESULTS
This study included seventy-nine patients, 68.4% female, with ages ranging from 15 months to 20 years (mean 14.2 ± 3.8 years). Twenty-two unique procedures were performed, representing nine distinct organ systems. Cholecystectomy was the most common, followed by appendectomy, then ovarian cystectomy. Proctored cases and dual-attending operations occurred more frequently in the first year. Residents were involved in all cases. There were no robotic equipment or system failures. There was one intraoperative complication that was addressed intraoperatively without clinical consequence. Postoperatively, there were seven Clavien-Dindo I and five Clavien-Dindo IIIb complications. Overall, operative duration and hospital length-of-stay were comparable to commonly reported ranges.
CONCLUSIONS
Robotic surgery in children was safe, with broad applicability, while still allowing for resident teaching. Development of a pediatric robotic surgery program is feasible and the robotic platform can be considered for both routine and complex minimally invasive cases.
Assessment of Color Match of Universal Tinted Sunscreens in Fitzpatrick Skin Phototypes I-VI
Mohsen Mokhtari, M.S.1, Indermeet Kohli, Ph.D.2
1Oakland University William Beaumont School of Medicine, Rochester, MI
2Henry Ford Health - Dermatology, Detroit, MI
INTRODUCTION
Long Wavelength Ultraviolet A1 (LWUVA1) and visible light (VL) are known to exert detrimental effects on skin health, primarily through the induction of oxidative damage and pigmentation1. The majority of currently available sunscreens fail to provide adequate protection against these specific wavelengths, and the few that do have an unaesthetic appearance. Universal tinted sunscreens are marketed to appeal to all skin types, but there is no research demonstrating the accuracy of their color matching.
METHODS
This study sought to evaluate seven brands of universal tinted sunscreens, encompassing a spectrum of price ranges. The experimental protocol involved the application of each sunscreen at two distinct concentrations: 1 and 2 mg/cm2, onto the dorsal arms of study participants representing a broad range of Fitzpatrick skin phototypes (I-VI).
RESULTS
To objectively quantify the effectiveness of each sunscreen in achieving a color match, colorimetry measurements were acquired at baseline and 15 minutes post-application. Furthermore, subjective feedback was received from participants through satisfaction surveys administered after the sunscreen application. The outcomes of the colorimeter data revealed that different products exhibited varying degrees of compatibility with different skin types. The survey responses followed a similar trend, with the majority of the products failing to universally satisfy individuals across diverse skin types. In particular, most sunscreens received a less than 20% satisfaction rating across all patients.
CONCLUSIONS
The findings derived from this study are poised to yield valuable insights for consumers seeking optimal sunscreen choices tailored to their individual skin types. Ultimately, the research outcomes serve as a resource that can inform and guide more informed sunscreen recommendations in the future, facilitating enhanced skin protection and well-being for all.
Allergies Coinciding with Central Sensitization Syndromes (ACCeSS)
Jacqueline Morey, B.A.1, Alemu Fite, Ph.D.2, Carl Lauter, M.D./Ph.D.2,3, Matthew Sims, M.D./Ph.D.1,2
1Oakland University William Beaumont School of Medicine, Rochester, MI
2Department of Internal Medicine, Section of Infectious Diseases and International Medicine, Corewell Health William Beaumont University Hospital, Royal Oak, MI, USA
3Department of Internal Medicine, Section of Allergy and Immunology, Corewell Health William Beaumont University Hospital, Royal Oak, MI, USA
INTRODUCTION
Central Sensitization Syndromes (CSS) are chronic pain conditions associated with central nervous system dysregulation. Previous research identified a link between high numbers of reported allergies, including at least one antibiotic allergy, and diagnoses of interstitial cystitis and fibromyalgia—two CSS conditions. We hypothesized that other CSS diagnoses similarly correlate with increased reported allergies, contributing to the understanding of CSS pathophysiology and its systemic manifestations.
METHODS
A retrospective study was conducted on patients admitted to three Corewell Health William Beaumont University Hospitals in 2021. The cohort included patients reporting at least one antibiotic allergy. Sixteen CSS conditions (7 defined as definite CSS, 9 as potential CSS) and 16 non-CSS control diagnoses were analyzed. For each group, prevalence rates were calculated for patients reporting 1 allergy, ≥10 allergies, and ≥20 allergies. Risk ratios comparing subgroup prevalence to all allergy reporters were generated.
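A minimal R sketch of how a risk ratio with a 95% CI can be computed from counts is shown below; the counts are purely illustrative, not the study data.

```r
# Hypothetical sketch: risk ratio of a given CSS diagnosis among patients
# reporting >=10 allergies vs. all patients reporting any allergy.
risk_ratio <- function(events_sub, n_sub, events_all, n_all) {
  rr <- (events_sub / n_sub) / (events_all / n_all)
  se <- sqrt(1/events_sub - 1/n_sub + 1/events_all - 1/n_all)  # SE of log(RR)
  c(RR = rr, lower = exp(log(rr) - 1.96 * se), upper = exp(log(rr) + 1.96 * se))
}
risk_ratio(events_sub = 40, n_sub = 200, events_all = 600, n_all = 9000)  # illustrative counts only
```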
RESULTS
Of patients with ≥10 allergies, 6 of 7 definite CSS and 6 of 9 potential CSS diagnoses had a risk ratio ≥3 (p<0.001); in the ≥20 allergies group, all 7 definite CSS and 2 potential CSS diagnoses showed risk ratios ≥6 (p≤0.01) (none of the controls exceeded these thresholds in either group). Among patients with 1 allergy, all definite and potential CSS diagnoses were negatively associated (risk ratio ≤0.6), compared to only 3 control diagnoses meeting this criterion.
CONCLUSIONS
CSS diagnoses are significantly associated with high numbers of reported allergies, distinguishing them from non-CSS conditions. These findings underscore the systemic nature of CSS and its potential diagnostic markers, contributing to better understanding and management of these conditions.
Nephrotoxic Medication Induced Acute Kidney Injury in Hospitalized Pediatric Patients
Emelie-jo Nappo, B.S.1, Neal Blatt, M.D./Ph.D.2
1Oakland University William Beaumont School of Medicine, Rochester, MI
2Department of Pediatrics, Corewell Health Children's Hospital, Royal Oak, MI
INTRODUCTION
Acute kidney injury (AKI) is a sudden decline in kidney function that can lead to long-term complications, including chronic kidney disease. In pediatric patients, nephrotoxic medication exposure is a major contributor to AKI, with up to 86% of hospitalized children receiving at least one nephrotoxic medication. Nephrotoxic-induced AKI increases hospital length of stay, healthcare costs, and morbidity. Despite these risks, AKI is underrecognized due to insufficient kidney function monitoring. This study aims to assess the incidence of nephrotoxic-induced AKI in pediatric patients at Corewell Health and evaluate adherence to creatinine monitoring.
METHODS
This retrospective study reviewed electronic health records of pediatric patients hospitalized at Corewell Health Royal Oak Hospital between May 1, 2020, and April 30, 2021. Patients under 18 years old who received three or more nephrotoxic medications or IV aminoglycosides were included. Patient demographics, length of hospitalization, creatinine trends, cystatin C measurements, urine studies, and nephrology consultations were analyzed. AKI was identified based on KDIGO criteria.
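A simplified R sketch of creatinine-based KDIGO staging is shown below for illustration only; the urine-output criteria, the 48-hour window for the 0.3 mg/dL rise, and dialysis-based stage 3 are omitted, and this is not the study's screening logic.

```r
# Hypothetical, simplified KDIGO creatinine criteria.
kdigo_stage <- function(baseline_cr, peak_cr) {
  ratio <- peak_cr / baseline_cr
  if (ratio >= 3.0 || peak_cr >= 4.0) return(3L)
  if (ratio >= 2.0)                   return(2L)
  if (ratio >= 1.5 || (peak_cr - baseline_cr) >= 0.3) return(1L)
  0L  # no creatinine-based AKI
}
kdigo_stage(baseline_cr = 0.4, peak_cr = 0.8)  # example: stage 2
```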
RESULTS
Of 189 at-risk pediatric encounters, only 69 (36.5%) had proper creatinine monitoring, while 120 (63.5%) were inadequately evaluated. Among those monitored, 24 (34.8%) met KDIGO criteria for AKI. Of the 120 encounters with inadequate assessment, 90 (75.0%) had only a single creatinine value, and 30 (25.0%) had no creatinine evaluation. Urine studies were conducted in 83.3% of AKI cases, nephrology was consulted in only 16.7%, and cystatin C was assessed in 4.2% of cases.
CONCLUSIONS
Nephrotoxic-induced AKI remains a significant, yet underrecognized issue due to inconsistent creatinine monitoring in hospitalized pediatric patients. A large proportion of at-risk patients are not properly assessed, limiting early detection and intervention. Increased adherence to kidney function monitoring guidelines and improved nephrology involvement are essential to mitigating the impact of nephrotoxic exposure and reducing AKI-related morbidity in pediatric populations.
Low CPAP Adherence Amongst the African American and Arabic Population in Patients with Obstructive Sleep Apnea
Nikki Nguyen, B.S.1, Bhavinkumar Dalal, M.D.2
1Oakland University William Beaumont School of Medicine, Rochester, MI
2Pulmonary and Sleep Medicine, Corewell Health William Beaumont University Hospital, Royal Oak, MI
INTRODUCTION
Continuous Positive Airway Pressure (CPAP) is the primary treatment for Obstructive Sleep Apnea (OSA); however, adherence remains a challenge, leading to daytime sleepiness and increased cardiovascular risks. While most CPAP adherence studies focus on African American and Caucasian populations, there is limited data on Arabic Americans. This study explores racial disparities in CPAP adherence and identifies factors contributing to low compliance.
METHODS
A retrospective chart review was conducted from adults diagnosed with moderate to severe OSA at Corewell Health Sleep Evaluation Services between 2017 and 2019. CPAP adherence was defined as over 4 hours usage per night on over 70% of nights within 30 days. Data collected included self-reported demographics, comorbidities, CPAP adherence, and baseline sleep study data. Univariate and multivariate analyses identified factors affecting CPAP adherence.
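The adherence definition above is directly computable; the following sketch is purely illustrative, with a hypothetical function name and nightly-usage record rather than actual study data.

```python
def is_cpap_adherent(nightly_hours, window=30, min_hours=4.0, min_fraction=0.70):
    """Adherence as defined in this study: usage of more than 4 hours per night
    on more than 70% of nights within a 30-day window."""
    nights = nightly_hours[:window]
    compliant_nights = sum(1 for hours in nights if hours > min_hours)
    return compliant_nights / len(nights) > min_fraction

# Hypothetical 30-night record: 25 nights above 4 hours, 5 nights below.
usage = [6.0] * 25 + [2.0] * 5
print(is_cpap_adherent(usage))  # True (25/30, about 83% of nights)
```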
RESULTS
Among 3142 eligible subjects, adherence data were available for 1699 patients, including 1199 Caucasians, 332 African Americans, and 43 Arabic Americans. Overall initial mean adherence was 76.32% across all groups. African Americans had significantly lower adherence compared to Caucasians (63.53% vs. 80.41%, p<0.0001). Arabic Americans also had lower adherence, although not statistically significant (69.65% vs. 80.41%, p=0.075). Multivariate regression analysis showed age (odds ratio=1.01, p=0.048), race (odds ratio=1.22, p=0.001), median leak (odds ratio=0.95, p<0.001), and median pressure (odds ratio=1.27, p<0.001) were significant predictors of CPAP adherence.
CONCLUSIONS
African Americans and Arabic Americans exhibit lower CPAP adherence compared to Caucasians. The difference for Arabic Americans did not reach statistical significance, likely due to the small sample size. Younger age, higher median leak, and lower median pressure were also associated with lower adherence. Addressing these factors can help in developing targeted interventions to improve adherence and reduce the associated health risks.
Second-Site Periprosthetic Joint Infection After Subsequent Primary Hip or Knee Arthroplasty
Daniel Nikolaidis, M.S.1, Devin Young, B.S.1, Andrew Steffensmeier, M.D.2, Mark Karadsheh, M.D.2, Robert Runner, M.D.2
1Oakland University William Beaumont School of Medicine, Rochester, MI
2Corewell Health William Beaumont University Hospital, Royal Oak, MI
INTRODUCTION
The risk factors for developing a periprosthetic joint infection (PJI) are well described. However, those associated with developing a second-site or metachronous PJI (MPJI) are poorly understood. The purpose of our study is to determine (1) MPJI prevalence, (2) demographic and medical risk factors, (3) characteristics of the index PJI, (4) microbiology profiles, and (5) clinical outcomes associated with a second-site PJI in patients who have a history of treated PJI and undergo a subsequent primary total hip arthroplasty (THA) or total knee arthroplasty (TKA).
METHODS
A retrospective, single-center, case-control study identified 77 patients who were treated for an index PJI (hip or knee) between 2013-2022, and who subsequently underwent another primary arthroplasty. We identified patients who developed a second-site PJI. Diagnosis was made using the 2018 Musculoskeletal Infection Society (MSIS) criteria. Minimum follow-up was 2 years. The prevalence of second-site PJI was calculated, and risk factors were assessed by comparing characteristics of patients who had a single PJI (SPJI) to those with MPJI.
RESULTS
We identified 9 patients (11.7%) who developed a second-site PJI after a subsequent primary THA (7) or TKA (2). Average time to MPJI was 56.3 weeks (range, 1.6 to 211.3) after a subsequent primary arthroplasty. Patients who developed a MPJI had significantly faster onset of index PJI compared to those with a SPJI (27.0 vs 104.1 weeks, P-value = 0.003). There were no other statistically significant risk factors for MPJI identified in this study. No single pathogen at index PJI was a risk factor for MPJI.
CONCLUSIONS
Patients who have a history of PJI are at an increased risk for developing a second-site PJI after a subsequent primary THA or TKA. Orthopedic surgeons should be aware of the prevalence and potential risk factors for metachronous PJI when considering a second hip or knee arthroplasty in this unique patient population.
Characterization of Thrombocytopenia and the Immature Platelet Fraction in Neonatal Sepsis
Emily Nolton, B.S.1, Erin Soule-Albridge, B.S.2, Henry Feldman, M.D.2, Martha Sola-Visner, M.D.2, Kyeorda Kemp, Ph.D.3, Patricia Davenport, M.D.2
1Oakland University William Beaumont School of Medicine, Rochester, MI
2Division of Newborn Medicine, Boston Children's Hospital, Boston, MA
3Department of Foundational Medical Studies, Oakland University William Beaumont School of Medicine, Rochester, MI
INTRODUCTION
Neonatal sepsis is known to be a major cause of post-natal morbidity and mortality in the NICU. Blood culture remains the gold standard for diagnosis, but multiple new parameters are under investigation to diagnose sepsis earlier, prognosticate the trajectory of illness, and predict outcomes. One new biomarker of interest is the Immature Platelet Fraction (IPF), the percentage of newly released platelets in circulation. This study aimed to characterize the time course and patterns of thrombocytopenia and platelet production (using IPF) in neonates with Gram-positive, Gram-negative, and fungal culture-proven sepsis.
METHODS
This is a retrospective chart review of patients admitted to the Boston Children’s Hospital NICU between January 2016 and December 2020. All patients with a positive blood culture for bacteria or fungi who completed at least 5 days of antibiotic/antifungal therapy were included. Exclusion criteria included infants with liver failure, congenital or acquired platelet production defects, and infants with a positive blood culture treated with antibiotics for less than 5 days. Statistical analyses included t-tests, Fisher's exact test, and the Kruskal-Wallis test.
RESULTS
Severe thrombocytopenia occurred more often in septic episodes caused by Gram-negative bacteria (p=0.008). The incidence of sepsis-induced thrombocytopenia was higher in neonates born at younger gestational ages (p=0.0018), with lower birth weights (p=0.0003), and with lower weights at the time of the septic episode (p=0.0001). Finally, the IPF response to thrombocytopenia differed by organism type, with Gram-negative infections having the lowest IPF values in the face of severe thrombocytopenia.
CONCLUSIONS
Thrombocytopenia occurred in ~60% of septic neonates in our cohort, with higher incidence of thrombocytopenia in infants of younger gestational ages, lower birth weights, and lower weights at the time of sepsis. Episodes of gram-negative sepsis had more severe thrombocytopenia and lower IPF percentages in the face of severe thrombocytopenia, suggestive of a hypo-proliferative state.
The Administration of Antenatal Corticosteroids in the Late-Preterm Period
Sarah O’Mara1, Sara Jaber, M.D., M.P.H., F.A.C.O.G.2, Kurt Wharton, M.D.2
1 Oakland University William Beaumont School of Medicine, Rochester, MI
2Department of Obstetrics & Gynecology, Corewell Health William Beaumont University Hospital, Royal Oak, MI
INTRODUCTION
The Antenatal Late Preterm Steroids trial demonstrated a significant reduction in neonatal respiratory morbidity with late-preterm corticosteroid use (34 to 36+6 weeks’ gestation). In response, the Society for Maternal-Fetal Medicine issued recommendations in August 2016 supporting corticosteroid administration in patients at risk for late-preterm delivery. Despite this, corticosteroid use in the late-preterm period has not yet been universally adopted.
METHODS
This retrospective cohort study included patients who delivered at Corewell Health William Beaumont University Hospital between January 1, 2017, and December 31, 2019, and received at least one dose of corticosteroids between 34 and 36+6 weeks’ gestation. IRB approval was obtained. Neonatal outcomes, such as APGAR scores, neonatal intensive care unit (NICU) admission, mechanical ventilation, and respiratory distress syndrome (RDS), among others, were estimated. Maternal outcomes, such as intensive care unit admission, sepsis, and stroke, among others, were estimated. Neonatal and maternal outcomes were assessed both overall and by year. Statistical significance was set at p<0.05.
RESULTS
A total of 147 patients received late-preterm steroids from 2017 to 2019. Among these, the mean gestational age at delivery was 35+5 weeks’ gestation. Two-dose completion occurred in 51% of cases. Median 1- and 5-minute APGAR scores were 8 and 9, respectively. Admission to the NICU occurred in 65% of neonates; 2.5% required mechanical ventilation, 28% developed respiratory distress syndrome, 41.6% had hyperbilirubinemia, and 23% experienced hypoglycemia. No cases of intraventricular hemorrhage or necrotizing enterocolitis were observed. Comparison across years showed no significant differences in neonatal outcomes except for a higher incidence of hyperbilirubinemia in 2019. Maternal outcomes remained unchanged across the study period.
CONCLUSIONS
Given the controversy in the evidence and the lack of long-term studies on corticosteroid use in the late-preterm period, administration should not be liberal but should instead comply with the proposed selection criteria for patients eligible to receive late-preterm corticosteroids. Our study shows that most of our patients delivered in the preterm period, with a mean gestational age of 35+5 weeks. The implementation of a late-preterm corticosteroid protocol has yielded variable neonatal outcomes, with persistent NICU admissions and respiratory morbidity. However, the absence of significant neonatal differences over time suggests that the administration of antenatal corticosteroids is likely associated with stable and consistent health outcomes. Further research is needed to expand patient recruitment and assess long-term neonatal and maternal implications.
Examining Health Disparities: An Observational Study on Peripheral Vascular Access Outcomes Among Hospitalized Patients
Charlotte O'Sullivan, B.S.1, Nicholas Mielke, M.D.2, Yuying Xing, M.D.3, Amit Bahl, M.D.4
1Oakland University William Beaumont School of Medicine, Rochester, MI
2Creighton University School of Medicine, Omaha, NE
3Corewell Health Research Institute, Royal Oak, MI
4Department of Emergency Medicine, Corewell Health William Beaumont University Hospital, Royal Oak, MI
INTRODUCTION
Placement of peripheral intravenous catheters (PIVC) is a routine procedure in hospitals. There is little research describing healthcare disparities in PIVC outcomes. The objective of this study is to explore the relationship between healthcare disparities and PIVC outcomes.
METHODS
This is an observational analysis of adults who had PIVC access established in the emergency department and required inpatient admission between January 1, 2021, and January 31, 2023, at William Beaumont Hospitals. Health disparities are defined by the National Institute on Minority Health and Health Disparities. The primary outcome is the proportion of PIVC dwell time to hospitalization length of stay, expressed as dwell time (hours) divided by hospital stay (hours) x 100%. Multivariable linear regression was used to adjust for confounders and estimate the independent effect of each variable.
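A minimal sketch of the primary outcome and an adjusted model is shown below, assuming tabular encounter-level data; the column names, values, and covariates are hypothetical and do not reproduce the study's actual model specification.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical encounter-level data; names and values are illustrative only.
df = pd.DataFrame({
    "dwell_hours": [40, 55, 30, 70, 24, 90],
    "stay_hours":  [48, 60, 40, 72, 36, 96],
    "race":        ["Black", "White", "Black", "White", "Asian", "White"],
    "sex":         ["F", "F", "M", "M", "F", "M"],
    "age":         [64, 71, 58, 80, 45, 67],
})

# Primary outcome: proportion of PIVC dwell time to hospital length of stay (x 100%).
df["dwell_pct"] = df["dwell_hours"] / df["stay_hours"] * 100

# Multivariable linear regression adjusting for race, sex, and age.
model = smf.ols("dwell_pct ~ C(race) + C(sex) + age", data=df).fit()
print(model.params)
```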
RESULTS
Our study analyzed 144,524 encounters. Racial demographics showed 67.2% White or Caucasian, 27.0% Black or African American, with the remaining identifying as Asian, American Indian or Alaska Native, or other races. The median proportion of PIVC dwell time to hospital length of stay was 0.88, with Asians having the highest ratio (0.94) and Black or African American individuals the lowest (0.82). Black females had a median dwell time to stay ratio of 0.76, significantly lower than White males at 0.93 (p<0.001). After controlling for confounder variables, a multivariable linear regression demonstrated that Black males and White males had a 10.0% and 19.6% greater proportion of dwell to stay, respectively, compared to Black females (p<0.001).
CONCLUSIONS
Black females face the highest risk of compromised PIVC functionality, resulting in approximately one full day less of reliable PIVC access than White males. To comprehensively address and rectify these disparities, further research is imperative to formulate effective strategies aimed at mitigating these disparities and ensuring equitable healthcare outcomes for all individuals.
The Relevance of Age, BMI, and Right Versus Left Ear Surgery to the Incidence of Taste Disturbance After Stapes Surgery
Jonathan Ong, B.S.1, Dennis Bojrab II, M.D.2
1Oakland University William Beaumont School of Medicine, Rochester, MI
2Corewell Health William Beaumont University Hospital, Royal Oak, MI
INTRODUCTION
Ample research has examined surgical outcomes and recovery in patients undergoing stapedectomy or stapedotomy. Although many factors, such as age and body mass index (BMI), have been shown to influence postoperative outcomes, it has yet to be determined whether these factors have any impact on the incidence of taste disturbance, a commonly reported complication of stapes surgery. This project compared self-reported incidences of postoperative taste disturbance among stapes surgery patients.
METHODS
Stapes surgery patients from 2012-2022 were retrospectively reviewed and were divided into several groups. Groups were classified based on their BMI, age, and the side of the ear operated on. Each group was independently compared using analysis of variance to determine whether these factors had any correlation with increased incidence of postoperative taste disturbance.
RESULTS
This study included 807 patients. When comparing underweight (BMI ≤18, n=15), healthy weight (BMI >18 to <25, n=320), overweight (BMI ≥25 to 29.9, n=278), and obese (BMI ≥30, n=194), there was no significant difference in incidence of taste disturbance between the groups (P=.1375). When comparing the younger age group (age ≤45, n=277) with the older age group (age >45, n=530), there was also no significant difference in incidence (P=.1036). However, when comparing whether surgery was completed on the right ear (n=416) versus the left ear (n=391), there was a significant increase in incidence in patients who had undergone surgery in the right ear (P=.004).
CONCLUSIONS
This study demonstrates a significantly higher incidence of taste disturbance after stapes surgery on the right ear compared with the left ear, whereas BMI and age do not appear to affect the incidence of taste disturbance.
Comprehensive Analysis of Racial Differences in Comorbidities and Prenatal Care Among Pregnant Women
Ryian Owusu, B.S.1, Kurt R. Wharton, M.D.2
1Oakland University William Beaumont School of Medicine, Rochester, MI
2Department of Obstetrics and Gynecology, Corewell Health, Royal Oak, MI
INTRODUCTION
Women of color face disproportionately higher risks of pregnancy-related morbidity and mortality. Factors such as healthcare access, health literacy, and education influence outcomes. Prenatal care aims to educate women on healthy lifestyles and identify high-risk pregnancies. Determining patient satisfaction with prenatal care could greatly contribute to defining a link between prenatal care services and maternal-fetal outcomes. This study aimed to evaluate differences in comorbidities, access to prenatal care, and its adequacy across racial groups.
METHODS
A cross-sectional study recruited participants using clinically relevant codes from Corewell Health locations. Eligible participants completed a 42-question Qualtrics survey assessing demographics, physical health, healthcare access, and provider interactions. Data were analyzed using Fisher’s Exact Test.
RESULTS
The study included 63 pregnant women, categorized as White (n=57) or Other (n=6), with the latter group including Black, Asian, and Hispanic individuals. Statistically significant findings included:
- Feeling down in the past month: White respondents reported 2.0% "Very Often," 6.0% "Fairly Often," 26.0% "Sometimes," 44.0% "Almost Never," and 22.0% "Never," compared to 0.0%, 33.3%, 66.7%, 0.0%, and 0.0% among the Other group (p-value = 0.0088).
- Being easily irritated in the past month: White respondents reported 8.2% "Very Often," 12.2% "Fairly Often," 44.9% "Sometimes," 28.6% "Almost Never," and 6.1% "Never," while Other respondents reported 33.3%, 33.3%, 16.7%, 0.0%, and 16.7% (p-value = 0.0338).
CONCLUSIONS
Women of color reported higher rates of mental health challenges and dissatisfaction with provider communication on topics like diet. These disparities highlight the need for targeted interventions to improve prenatal care experiences. However, the small sample size, particularly of the Other group, limits generalizability. Future studies should include larger, more diverse cohorts to explore these disparities more comprehensively.
Using Deep Learning for Digital Subtraction Angiography Based Cerebral Aneurysm Detection and Rupture Risk Assessment
Yu Rim Park, B.S.1, Muhammad Irfan, M.S.2, Khalid Mahmood Malik, Ph.D.3, Ghaus Malik, M.D.4
1Oakland University William Beaumont School of Medicine, Rochester, MI
2Department of Computer Science & Engineering, Oakland University, Rochester, MI
3Department of Computer Science, University of Michigan, Flint, MI
4Department of Neurosurgery, Henry Ford Health System, Detroit, MI
INTRODUCTION
Accurate detection of intracranial aneurysms is critical due to the high mortality associated with subarachnoid hemorrhage. Several imaging modalities are used for detection of intracranial aneurysms, but their interpretation is challenging, which can compromise aneurysm characterization. Specifically, in two-dimensional digital subtraction angiography (2D-DSA), the problem lies in the limited depth information provided, preventing accurate assessment of complex vascular morphology. Our project aims to improve the detection and segmentation of intracranial aneurysms using deep learning models on 2D DSA and to assess aneurysm rupture risk by overcoming the lack of depth inherent to 2D imaging.
METHODS
Ablation studies on deep learning models, including VGG and ResNet, were performed to develop the best-performing fusion model. A U-Net model was used for geometric feature extraction, and Tamura features were used to incorporate texture information. Following normalization, the features were fed into a convolutional neural network (CNN) that predicted rupture risk as mild, moderate, severe, or critical. A total of 569 DSA images from a single institution, labeled and classified by the neurosurgery department, were used to test clinical applicability.
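The abstract does not specify the fusion architecture in detail; the PyTorch sketch below is a hypothetical illustration of how deep, geometric, and texture feature vectors could be concatenated and passed to a small classification head for the four risk categories named above. All layer sizes, feature dimensions, and names are assumptions.

```python
import torch
import torch.nn as nn

class FusionRuptureClassifier(nn.Module):
    """Hypothetical fusion head: concatenates deep (e.g., ResNet/VGG-derived),
    geometric (e.g., U-Net-derived), and texture (e.g., Tamura) feature vectors
    and maps them to four rupture-risk categories."""
    def __init__(self, deep_dim=256, geom_dim=32, texture_dim=6, n_classes=4):
        super().__init__()
        self.head = nn.Sequential(
            nn.Linear(deep_dim + geom_dim + texture_dim, 128),
            nn.ReLU(),
            nn.Dropout(0.3),
            nn.Linear(128, n_classes),  # mild / moderate / severe / critical
        )

    def forward(self, deep_feats, geom_feats, texture_feats):
        fused = torch.cat([deep_feats, geom_feats, texture_feats], dim=1)
        return self.head(fused)

# Toy usage with random feature vectors for a batch of two images.
model = FusionRuptureClassifier()
logits = model(torch.randn(2, 256), torch.randn(2, 32), torch.randn(2, 6))
print(logits.shape)  # torch.Size([2, 4])
```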
RESULTS
We conducted an extensive study of aneurysm detection, segmentation, and rupture prediction. Our initial ablation study showed an overall accuracy of 0.92 for the combined feature set in aneurysm detection, compared to individual feature sets. For the aneurysm rupture analysis, our CNN achieved an accuracy of 0.87 and an area under the curve of 0.93, showing improved performance compared to existing methods.
CONCLUSIONS
Our results suggest that utilizing a fusion of deep, geometrical, and texture features leads to a synergistic performance in aneurysm rupture risk analysis using 2D DSA imaging. We also found that our proposed pipeline model outperforms several well-established traditional machine learning algorithms across various metrics.
Venlafaxine Exposure in Pregnancy and its Association with Pre-eclampsia Development
Jhanvi Patel, B.S.1, Jeffrey Guina, M.D.2
1Oakland University William Beaumont School of Medicine, Rochester, MI
2Department of Psychiatry, Corewell Health
INTRODUCTION
Pregnant women with psychiatric illnesses face an increased risk of hypertensive disorders, including pre-eclampsia. Venlafaxine, a serotonin-norepinephrine reuptake inhibitor (SNRI), is commonly prescribed for mood disorders but has been associated with elevated blood pressure. This study systematically reviews existing literature to evaluate whether venlafaxine use in pregnancy is linked to the development of pre-eclampsia.
METHODS
A systematic review with meta-analysis was conducted following PRISMA-P guidelines. PubMed and EMBASE were searched using Boolean operators combining “venlafaxine” with pregnancy-related hypertensive disorders. Studies reporting blood pressure outcomes in the peripartum period were included. Duplicates were removed, and articles were screened based on relevance, with data extracted and tiered by level of evidence. Statistical analysis will assess the strength of association between venlafaxine exposure and hypertensive outcomes.
RESULTS
It is hypothesized that venlafaxine exposure during pregnancy is associated with an increased risk of pre-eclampsia. A pooled analysis of existing studies may demonstrate higher rates of hypertensive disorders in pregnant women taking venlafaxine compared to controls. Additionally, SNRIs as a drug class may contribute to increased hypertensive outcomes.
CONCLUSIONS
This review provides evidence of a potential association between venlafaxine use during pregnancy and an increased risk of preeclampsia, particularly with higher doses and continued use into the third trimester. While findings across studies were not uniformly conclusive, the overall trend suggests a modest but clinically relevant elevation in risk. These results underscore the importance of individualized risk-benefit analysis when considering venlafaxine therapy during pregnancy.
Toxicity and Clinical Outcomes Following Brachytherapy Using 3-D Dosimetric Planning in Patients with Locally Advanced Cervical Cancer: An Institutional Experience
Nikita M. Patel, B.S.1, Jacob F. Oyeniyi, M.D.2, Maha Saada Jawad, M.D.3
1Oakland University William Beaumont School of Medicine, Rochester, MI
2Department of Radiation Oncology, Beaumont Health System, Royal Oak, MI
3Assistant Professor, Oakland University William Beaumont School of Medicine, Rochester, MI
Department of Radiation Oncology, Beaumont Health System, Royal Oak, MI
INTRODUCTION
Locally advanced cervical cancer (LACC) is managed primarily with concurrent chemoradiation (CRT) and a brachytherapy (BT) boost. Three-dimensional (3-D) planning of radiation treatments provides better target and normal-organ visualization and has been shown to minimize treatment toxicity. This study adds to the evidence supporting the efficacy of 3-D planning, with the expectation of reduced toxicity and improved clinical outcomes for cervical cancer patients.
METHODS
We retrospectively identified 31 consecutively treated patients from 2012 to 2019 who received pelvic external beam radiotherapy (EBRT) and a high-dose-rate (HDR) cervix boost with 3-D planning for LACC at a single institution. Local recurrence-free survival (LRFS), regional recurrence-free survival (RRFS), distant metastasis-free survival (DMFS), progression-free survival (PFS), and overall survival (OS) were analyzed. Acute (≤6 months) and late (>6 months) toxicities were reported.
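The survival endpoints listed above are typically estimated with Kaplan-Meier methods; the sketch below shows one way this could be done with the lifelines library, using entirely hypothetical follow-up times and event indicators rather than the study's data.

```python
from lifelines import KaplanMeierFitter

# Hypothetical follow-up times (years) and event indicators (1 = death observed).
durations = [1.2, 3.5, 5.1, 2.0, 4.8, 6.0, 0.9, 5.5]
events    = [0,   1,   0,   1,   0,   1,   1,   0]

kmf = KaplanMeierFitter()
kmf.fit(durations, event_observed=events, label="OS")
print(kmf.survival_function_)  # Kaplan-Meier estimate over follow-up
print(kmf.predict(5.0))        # estimated 5-year overall survival probability
```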
RESULTS
The median age was 54 years. 25 patients (81%) had squamous histology, 5 (16%) had adenocarcinoma and 1 (3.2%) had mixed histology. The median pelvic radiation therapy (RT) dose was 45 Gy, and the median BT dose was 30 Gy with a median total EQD2 of 84.3 Gy. Median follow-up was 3.2 years. The 5-year LRFS, RRFS, DMFS, PFS, and OS were 87.5%, 87.3%, 67.4%, 51.0%, and 52.1% respectively. The rates of acute/chronic grade ≥2 gastrointestinal, genitourinary, and gynecological toxicities were 6.5%/23%, 6.5%/9.7% and 19%/3.2% respectively. There was no grade 4 or 5 toxicity.
CONCLUSIONS
Our study shows favorable outcomes and toxicity profiles with 3-D brachytherapy planning as compared to historical outcomes with 2-D BT and adds to the growing evidence supporting the use of 3-D BT.
An Analysis on Gender and Race for Burnout Among Michigan Internists
Elan Pszenica, B.S.1, Jookta Basu, B.S.1, Esha Ahmed, M.D.1, Hugo Davila, M.D.2
1Oakland University William Beaumont School of Medicine, Rochester, MI
2Department of Foundational Medical Studies, Oakland University William Beaumont School of Medicine, Rochester, MI
INTRODUCTION
Burnout among physicians, particularly in the field of internal medicine, has become a significant concern in recent years. This condition is characterized by emotional exhaustion, depersonalization, and a reduced sense of personal accomplishment, which can have detrimental effects on both healthcare providers and patient care. While burnout is a recognized issue globally, its impact on internal medicine physicians in Michigan has not been comprehensively studied. Understanding the unique stressors faced by these physicians is critical for developing targeted interventions to reduce burnout rates and improve the overall healthcare environment.
METHODS
The Mini Z 2.0 survey was administered to 123 internal medicine physicians in the Michigan American College of Physicians chapter. This was done via an anonymous online survey that was sent via email to registered internists in the state of Michigan. Anonymous data was collected and a statistical analysis was performed on the de-identified data to compare demographics to levels of burnout.
RESULTS
Of the 123 respondents, 14 (11.4%) scored at high risk for burnout. Eighty-seven (70.7%) of total respondents rated their workplace as not highly supportive (subscale 1) and reported high levels of work pace and EMR stress (subscale 2). Of these low scorers, 42 (49.4%) respondents were female and 43 (50.6%) were male (p=0.3476); 6 (8.1%) were Indian/Pakistani, 51 (68.9%) were white, not of Hispanic origin, and 17 (23.0%) were of other races (p=0.0749). The vast majority of our population had a net score between 20 and 40, suggesting a moderate risk of burnout.
CONCLUSIONS
The results demonstrate a moderate risk of burnout in Michigan internists. From our data, we cannot conclude a statistically significant difference in gender or race for burnout rates for Michigan internists. A larger sample size is recommended for future studies on this topic.
Antiferromagnetic Artificial Neurons: A Platform for Neuromorphic Computing and Physiological Modeling
Lily Quach, B.S.1, Hannah Bradley, Ph.D.2, Steven Louis, Ph.D.3, Vasyl Tyberkevych, Ph.D.2
1Oakland University William Beaumont School of Medicine, Rochester, MI
2Department of Physics, Oakland University, Rochester, MI
3Department of Electrical and Computer Engineering, Oakland University, Rochester, MI
INTRODUCTION
Antiferromagnetic (AFM) auto-oscillators model biological neurons by producing ultra-fast voltage spikes in response to external stimulation. AFM neurons not only closely resemble biological neurons but also operate at faster speeds and with lower energy consumption than state-of-the-art computers available today. Dynamic modeling of biological neural circuitry with AFM neurons can propel technology in medicine to augment the degree, efficacy, and quality of healthcare.
METHODS
The physical schematic for the AFM architecture was established to be feasible, inexpensive, energy-efficient, dynamic, and compact. The selected AFM material is nickel oxide (NiO), a cost-efficient and anisotropic material. Its two magnetization sublattices cancel each other out, which allows for interconnectivity of physical neurons in a compact space. Platinum serves as the electrical conductor and was chosen for its high spin-orbit coupling. The quantitative physical parameters of measurement are established as a current (ampere) input and a voltage output.
RESULTS
AFM spiking simulations demonstrate three things. First, AFM neurons are capable of accurately modeling the functionality and characteristics of biological neurons, including response latency, refractory periods, and inhibition, which arise from an effective internal inertia. Second, our AFM artificial neuron model overcomes current AI limitations with its state-of-the-art speed of conduction, inexpensive and compact architecture, and minimal computational energy. Third, this study develops a platform for further studies of neural circuitry, such as simulations of physiological reflexes and training for robust diagnoses.
CONCLUSIONS
This artificial neuron uses antiferromagnetic (AFM) auto-oscillators to produce ultra-fast voltage spikes in response to external stimulation. AFM neurons not only closely resemble biological neurons but also operate at faster speeds and with lower energy consumption than state-of-the-art computers available today.
Comparison of Two Adjuvant High-Dose Rate Vaginal Cuff Brachytherapy Dose Fractionation Regimens for Treatment of Early-Stage Uterine Cancer
Catherine Raciti, B.S.1, Hong Ye, Ph.D.2, Allison J Hazy, M.D.2, Maha Saada Jawad, M.D.2
1Oakland University William Beaumont School of Medicine, Rochester, MI
2Department of Radiation Oncology, Corewell Health William Beaumont University Hospital, Royal Oak, MI
INTRODUCTION
Adjuvant vaginal cuff brachytherapy (VBT) is widely used for early-stage endometrial cancer, with various dose fractionation schedules endorsed by the American Brachytherapy Society (ABS). Initially, our regimen of 30Gy in 6 fractions was based on radiobiological calculations and demonstrated safety and efficacy. During the COVID pandemic, we transitioned to a shorter course of 22Gy in 4 fractions. This study compares outcomes and toxicities of these regimens.
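For context, dose-fractionation regimens such as these are commonly compared using the equieffective dose in 2-Gy fractions (EQD2) from the linear-quadratic model; the sketch below assumes an alpha/beta ratio of 3 Gy for late-responding tissue, which is a conventional choice but is not stated in the abstract.

```python
def eqd2(total_dose_gy, n_fractions, alpha_beta_gy=3.0):
    """EQD2 = D * (d + a/b) / (2 + a/b), where d is the dose per fraction."""
    d = total_dose_gy / n_fractions
    return total_dose_gy * (d + alpha_beta_gy) / (2 + alpha_beta_gy)

# Late-effect comparison of the two regimens (alpha/beta = 3 Gy assumed):
print(eqd2(30, 6))  # 48.0 Gy EQD2 for 30 Gy in 6 fractions (5 Gy/fraction)
print(eqd2(22, 4))  # 37.4 Gy EQD2 for 22 Gy in 4 fractions (5.5 Gy/fraction)
```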
METHODS
This single-institution retrospective review included Stage I-II endometrial cancer patients undergoing adjuvant VBT between 1998 and 2022. High-dose-rate VBT was delivered via a vaginal cylinder prescribed to 5 mm depth at 22Gy in 4 fractions or 30Gy in 6 fractions. Toxicities were assessed using Common Terminology Criteria for Adverse Events (CTCAE). Acute and chronic toxicity were defined as ≤6 months and >6 months post-VBT, respectively. Clinical outcomes included local recurrence (LR), regional recurrence (RR), distant metastasis (DM), cause-specific survival (CSS), disease-free survival (DFS), and overall survival (OS). P-values <0.05 were significant.
RESULTS
A total of 270 patients were included in our analysis, with a median follow-up of 4.7 years (7.7 and 1.3 years for 6 fractions and 4 fractions, respectively, p<0.001) and a median age of 66 (range, 31-89). The majority, 72%, were treated with 6 fractions, and 28% with 4 fractions. The 2-year LR, DM, CSS, and OS rates for 4/6 fractions were 2.4%/3.7%, 6.8%/3.7%, 95.8%/97.9%, and 95.8%/95.8%, respectively. There were no significant differences in clinical outcomes. There was no significant difference in acute toxicity (p=0.468). Chronic vaginal dryness (p = 0.017) and vaginal stenosis (p<0.001) were significantly less with 4 fractions compared to 6. There were no grade ≥ 3 toxicities in either group.
CONCLUSIONS
VBT with 22Gy in 4 fractions is effective and well-tolerated, with comparable outcomes to 30Gy in 6 fractions. Longer follow-up and matched-pair analyses are warranted to validate these findings further.
Pseudoporphyrias: A Systematic Review and Meta-Analysis
Madison G Romanski, B.S.1, Meghan R. Mansour, M.D.2, Nicholas Belair1, Mohsen Mokhtari1, Joseph Fakhoury, M.D.3
1Oakland University William Beaumont School of Medicine, Rochester, MI
2Henry Ford Health Transitional Year Program
3Henry Ford Health Department of Dermatology
INTRODUCTION
Pseudoporphyria, a bullous disorder visually and histologically identical to porphyria cutanea tarda, is distinguished by normal porphyrin levels on biochemical testing. Bullae, vesicles, skin fragility, and scarring arise on the skin in a photodistributed pattern from various triggers. We present a systematic review of pseudoporphyria cases across different patient demographics.
METHODS
PubMed, Scopus, and Embase databases were searched for articles regarding pseudoporphyria. Of the 477 articles screened, 138 articles describing 191 patients were analyzed. The 2009 Oxford Levels of Evidence criteria were used for quality of evidence assessment.
RESULTS
Cases of pseudoporphyria were induced by drugs (136), renal insufficiency (38), UV-exposure (14), and miscellaneous triggers (lime juice, brewer’s yeast, and Coca-Cola) (3). Most cases resolved with trigger discontinuation. Of the few treated (5.1%), photoprotective agents (4.7%) and topical steroids (3.7%) were most common. Hands were the most affected area (78.5%). In drug-induced cases, NSAIDs were most commonly implicated (57.4%), followed by antifungals (7.4%). Therapy was discontinued in the majority of drug-induced cases (90.4%), with complete symptom resolution occurring in 80.1%. For renal insufficiency-induced cases, 97.4% were on dialysis, predominantly hemodialysis. Therapy was discontinued in only 7.9%, however full resolution occurred in 73.7% of patients. Interestingly, 100% of UV radiation-induced cases were biopsy-proven.
CONCLUSIONS
While rare, pseudoporphyria is an important consideration when patients present with a vesiculobullous rash. Thorough review of comorbidities, medications, and recent exposures is essential for trigger identification. A genetic predisposition for pseudoporphyria may be considered, highlighted by a case of monozygotic twins exhibiting identical symptoms after UV-exposure. Further research regarding genetic predisposition and family history, mechanisms behind triggering agents, and diagnostic standards may aid in developing targeted preventive strategies and treatments.
Impact of Michigan Public Act 246 on Opioid Prescribing for Pediatric Surgical Patients: A Retrospective Review
Madison Saunders, B.S.1, Pavan Brahmamdam, M.D.2
1Oakland University William Beaumont School of Medicine, Rochester, MI
2Corewell Health East Department of Pediatric Surgery, Royal Oak, MI
INTRODUCTION
In December 2017, Michigan enacted Public Act 246 (P246) in an attempt to address rising opioid misuse. P246 states that, as of June 1, 2018, a prescriber of a controlled substance must discuss the risks of opioid addiction and overdose, and the dangers of taking opioids with benzodiazepines, alcohol, or any other nervous system depressant, with the minor and the minor’s parent or guardian. This study aims to evaluate the impact of this legislation by comparing opioid morphine milligram equivalents (MME) prescribed to pediatric patients undergoing common general, urological, and ENT surgeries before and after June 1, 2018.
METHODS
A single-center, retrospective chart review was performed of pediatric patients who underwent circumcision, inguinal hernia repair, umbilical hernia repair, tonsillectomy, or adenoidectomy between 2015 and 2021. MME was calculated from post-operative opioid prescriptions. Descriptive statistics and univariate and multivariable analyses were performed, examining changes in MME prescribed pre- and post-P246.
RESULTS
A total of 7,280 patients, with a mean age of 3.1 years, were included in the study: 3,512 pre-P246 and 3,786 post-P246. There was no significant difference in age between the two groups. Children undergoing surgery after implementation of the law were prescribed 4.5 mg MME less than those before implementation (p<0.0001). A multivariable linear regression model, adjusting for age, surgery date, and type of insurance, showed that all three surgical specialties had a significant reduction in prescribed MME post-P246. We found a significant decrease in MME prescribed for circumcisions and tonsillectomies performed post-P246 (p<0.0001).
CONCLUSIONS
In Michigan, Public Act 246 was significantly associated with reduced take-home opioid prescriptions for pediatric patients after minor surgeries. These findings suggest that educating guardians about the risks of opioid use can significantly influence prescribing practices across pediatric surgical specialties, demonstrating the potential of legislation in promoting opioid stewardship.
PR Novel Hematoma Prevention Device
Nolan Shoukri, B.S.1, Steven Pearl, M.D.2, Nishaki Mehta, M.D.3
1Oakland University William Beaumont School of Medicine, Rochester, MI
2Ascension Providence Rochester Hospital, Rochester, MI
3Corewell Health William Beaumont University Hospital, Royal Oak, MI
INTRODUCTION
Hematomas are a common complication of cardiac implantable electronic device (CIED) placement, occurring in up to 25% of placement procedures. A previous survey study revealed that hematomas are the most common complication following CIED placement from the perspective of both physicians and allied health professionals. A generation 2 hematoma prevention (Gen 2 PR) device, developed from physician and staff feedback, was tested on mannequins and then in a healthy volunteer population.
METHODS
The device was tested at three primary locations, and device efficacy was measured via internal and external pressure measurements. For the volunteer study, pressure measurements, participant survey data, and photographic analysis of skin changes were also collected.
RESULTS
The device showed no evidence of wear and tear in both benchtop and volunteer testing. Mannequin testing showed the device functioned as expected. Volunteer testing revealed that the device maintained adequate pressure in 100% of tests per the embedded pressure sensor. Post-use surveys reported minimal issues and side effects with device use, 100% of volunteers were content with the device, and there were no significant skin changes on photographic analysis.
CONCLUSIONS
Analysis of patients in our practice showed 56% of patients were on some type of blood thinner excluding aspirin alone. As more patients are on blood thinners, the importance of hematoma prevention increases. The Gen 2 PR device might be a safe and reliable way to reduce hematoma formation and prevent anticoagulation interruption peri-operatively.
Utilizing a Custom Targeted-Sequencing Panel to Identify Gene Variants in FEVR and Possible FEVR Patients
Rima Stepanian, B.S.1, Gabrielle Abdelmessih, B.S.1, Kim Drenser, M.D.3,4, Antonio Capone Jr, M.D.3,4, Michael T Trese, M.D.3,4, Ken Mitton, Ph.D.3,5
1Oakland University William Beaumont School of Medicine, Rochester, MI
3Oakland University Eye Research Institute, Rochester, MI
4Associated Retinal Consultants, Royal Oak, MI
5Department of Foundational Medical Studies, Oakland University William Beaumont School of Medicine, Rochester, MI
INTRODUCTION
To identify gene variants in pediatric vitreoretinal disease patients and their family members using a custom Illumina AmpliSeq orphan pediatric retinal disease gene panel. Our panel and sequencing protocol can sequence eight genes commonly associated with various inherited retinal diseases (IRDs).
METHODS
Participants for DNA-sequencing analysis were referred to Associated Retinal Consultants, LLC (ARC) of Royal Oak, Michigan, USA. Ten participants were consented under IRB approval for donation of blood to the ARC Eye Biobank and for genetic analysis at Oakland University. The targeted 8-gene panel included seven genes required for normal function of retinal vascular endothelial cells, NDP (ChrX), CTNNB1 (Chr3), TSPAN12 (Chr7), KIF11 (Chr10), FZD4 (Chr11), LRP5 (Chr11), and ZNF408 (Chr11), plus RS1 (ChrX). A total of 180 overlapping amplicons, averaging 250 base pairs (bp), targeted all exons and 25 bp of adjacent intron sequence, totaling 32,000 bp. Genomic DNA was extracted from 200 µL of frozen whole blood and quantified; Illumina AmpliSeq libraries were prepared, diluted, pooled, and sequenced on the Illumina iSeq-100. VCF files were analyzed with the Ensembl Variant Effect Predictor database.
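As an illustrative sketch only, the called variants in a VCF file can be read and restricted to the eight panel genes as below; the parser, file name, and reliance on the INFO field for gene names are our own simplifications, whereas the study's consequence annotation was performed with the Ensembl Variant Effect Predictor.

```python
TARGET_GENES = {"NDP", "CTNNB1", "TSPAN12", "KIF11", "FZD4", "LRP5", "ZNF408", "RS1"}

def read_vcf_records(path):
    """Yield (CHROM, POS, REF, ALT, INFO) tuples from a VCF file,
    skipping the '#'-prefixed header lines."""
    with open(path) as handle:
        for line in handle:
            if line.startswith("#"):
                continue
            chrom, pos, _id, ref, alt, _qual, _filt, info = line.rstrip("\n").split("\t")[:8]
            yield chrom, int(pos), ref, alt, info

# Hypothetical usage: keep records whose annotation mentions a panel gene.
for chrom, pos, ref, alt, info in read_vcf_records("sample.vcf"):
    if any(gene in info for gene in TARGET_GENES):
        print(chrom, pos, ref, alt)
```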
RESULTS
Seven of the ten samples had coding-consequence variants in genes including LRP5, CTNNB1, TSPAN12, ZNF408, and FZD4. The variants included missense, in-frame insertion, and in-frame deletion variants. Two of these variants were identified in the NCBI ClinVar database: one was classified as pathogenic, with an amino acid change of Methionine to Valine, and another as likely pathogenic, with an amino acid change of Threonine to Methionine.
CONCLUSIONS
Our panel’s sequencing coverage was of sufficient depth to detect protein-altering variants of interest in pediatric vitreoretinal disease patients. This custom targeted-sequencing approach has promising potential in broadening accessibility, adoptability, and advancement of more widespread gene sequencing in the pediatric retinal disease population.
Advances in Minimally Invasive Hepatobiliary Intervention for Non-Operative Candidates
Tulasi Talluri, MPH1, Philip Cieplinski, M.D.2, Kristian Loveridge, D.O.2
1 Oakland University William Beaumont School of Medicine, Rochester, MI
2 Corewell Health William Beaumont University Hospital, Royal Oak, MI
INTRODUCTION
Cholangioscopy has been available since the 1970s, with limited use in interventional radiology (IR) due to poor percutaneous access options, limited steerability and irrigation capabilities, and the requirement for more than one operator. Direct intraluminal visualization of the gallbladder, common bile ducts, and even the ampulla and duodenum has become more practical with the advent of single-operator disposable fiberoptic cholangioscopes. Such systems enhance possibilities in diagnosis and treatment, particularly in patients who are non-operative or poor peroral endoscopic candidates due to severity of illness or previous surgery.
METHODS
This study presents seven innovative minimally invasive percutaneous applications for the evaluation and treatment of gallbladder and biliary duct stone disease using the Spyglass cholangioscope, including electrohydraulic lithotripsy for stone removal, forceps biopsy after failed brush biopsy, ampulla sphincteroplasty, bile duct exploration, and common bile duct stent inspection. All patients selected for cholangioscopy were non-operative or had limited peroral endoscopic options.
RESULTS
Patient A: Electrohydraulic lithotripsy in a non-operative candidate with choledocholithiasis and ascending cholangitis.
Patient B: Forceps biopsy after a failed brush biopsy in a patient with an altered gastrointestinal tract due to previous surgery.
Patient C: Ampulla sphincteroplasty and common bile duct stricture dilation in a patient who underwent gastric bypass surgery.
Patient D: Hepatic duct stone retrieval with subsequent successful tube removal.
Patient E: Cholangioscopy-confirmed common duct stent patency when fluoroscopy suggested stenosis.
Patient F: Stent inspection through the cholangioscope allowing for necessary adjustments.
Patient G: Large, bulky cholelith removal.
CONCLUSIONS
Single-operator percutaneous cholangioscopy has revolutionized the evaluation and treatment of hepatobiliary diseases in non-operative or poor endoscopic candidates. By utilizing cholecystostomy and/or biliary tubes with tract maturation, we successfully visualized and treated various biliary conditions, including stone removal, biopsy, sphincteroplasty, and stent management. These minimally invasive techniques offer valuable alternatives to traditional approaches, improving patient outcomes and quality of life.
A Review of Risk Factors for Postoperative Failure after Isolated Primary Medial Patellofemoral Ligament Reconstruction
Bilal Tarar, B.S.1, Ameen Suhrawardy, M.D.2, Clark Yin, M.D.3, Leonardo Cavinatto, M.D.3, Elizabeth Dennis, M.D.4, Betina Hinckel, M.D.3
1Oakland University William Beaumont School of Medicine, Rochester, MI
2Department of Orthopaedic Surgery, Wayne State University, Detroit, MI
3Department of Orthopaedic Surgery, Corewell Health William Beaumont University Hospital, Royal Oak, MI
4Department of Orthopaedic Surgery, Mount Sinai Hospital, New York, NY
INTRODUCTION
Medial patellofemoral ligament reconstruction (MPFL-R) is the standard surgical treatment for patellofemoral instability. Postoperative recurrent instability and apprehension can occur due to preexisting anatomical risk factors that were not addressed during the initial procedure, or due to technical errors during surgery.
This review aims to identify and evaluate the significance of anatomic, technical, and demographic risk factors associated with MPFL failure in the current literature.
METHODS
A systematic literature review was performed that followed the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines. PubMed, MEDLINE, and Cochrane databases were searched, and studies were included based on their inclusion of postoperative failures after MPFL-R and evaluation of risk factors. The collective data was analyzed to determine how many knees were associated with each risk factor.
RESULTS
1628 articles were reviewed, and 28 studies were included in the final analysis, encompassing 2195 knees with 245 failures (11.2%). Failures were defined as recurrent dislocations, instability, subluxation, or apprehension. For anatomic risk factors, severe trochlear dysplasia (TD), preoperative J-sign, femoral anteversion, patella alta, and lateral quadriceps vector were most associated with failure. Nonanatomic femoral tunnel placement was the most impactful technical risk factor, associated with 69 failed reconstructions. Single bundle grafts, gracilis autograft, graft overtensioning, and modified adductor sling grafts were also associated with postoperative instability.
CONCLUSIONS
Severe TD (Dejour grades B-D) and nonanatomic femoral placement are strongly associated with recurrent patellar dislocation after MPFL-R. Gaining a more thorough understanding of the key risk factors in treating patients with patellofemoral instability can help optimize outcomes, avoid technical pitfalls associated with MPFL-R, guide decisions regarding additional procedures, and set more realistic postoperative expectations for patients.
Maladaptive Daydreaming and Academic Procrastination: The Effects on Memory
Dominic Tomasi, B.S.1, Changiz Mohiyeddini, Ph.D.2
1Oakland University William Beaumont School of Medicine, Rochester, MI
2Department of Foundational Medical Studies, Oakland University William Beaumont School of Medicine, Rochester, MI
INTRODUCTION
Maladaptive Daydreaming (MD) is defined as excessive daydreaming that results in personal distress and/or impairment in social and cognitive functioning. It is a relatively new psychological phenomenon that is under-investigated in certain populations and in its association with other psychopathological processes. One population in which MD has not yet been studied is medical students, a population in which high levels of academic and memory performance are vital to daily functioning. Our aim is to investigate whether MD is a relevant phenomenon among medical students and, if so, its association with academic procrastination and with prospective and retrospective memory.
METHODS
First- through fourth-year medical students at Oakland University William Beaumont School of Medicine were invited to participate in a cross-sectional survey consisting of well-validated and reliable instruments, including the Maladaptive Daydreaming Scale-16, the Academic Procrastination Scale – Short Form, and the Prospective and Retrospective Memory Questionnaire.
RESULTS
With n=25, a mediation analysis showed that maladaptive daydreaming had a significant positive association with academic procrastination (r=0.533, p<0.01). Further, higher endorsement of academic procrastination on survey measures was negatively associated with both prospective (r=-0.429, p<0.05) and retrospective memory (r=-0.448, p<0.05) performance.
CONCLUSIONS
These findings suggest that medical students who engage in MD behaviors are more likely to experience academic procrastination. Further, the negative correlation between academic procrastination and memory suggests that, in order to procrastinate, students must set aside elements of their prospective and retrospective memory. Given the stressful nature of procrastination in medical school, the results lend support to tailored interventions to help students manage MD behaviors, which may positively impact academic performance and memory.
Deep Learning for Segmentation of Intracranial Aneurysms on Time-of-Flight Magnetic Resonance Angiography
Tram Mai Vo, B.S.1, Khalid Mahmood Malik, Ph.D.2, Muzammal Shafique2
1Oakland University William Beaumont School of Medicine, Rochester, MI
2College of Innovation and Technology, The University of Michigan-Flint, Flint, MI
INTRODUCTION
While the vast majority of intracranial aneurysms (IAs) go unruptured, those that do rupture pose a life-threatening risk of subarachnoid hemorrhage (SAH). However, predicting the rupture risk of IAs remains challenging even for experts. Machine learning has emerged as a promising clinical tool in recent years, with the potential to help guide management of IAs. Ideally, a model that is capable of both segmenting and risk stratifying IAs would be of great benefit. In this study, we demonstrate the success of deep learning, a subset of machine learning, in segmenting intracranial aneurysms on time-of-flight magnetic resonance angiography (TOF MRA) imaging.
METHODS
Since MRA is commonly used for IA surveillance, we chose to utilize an open-source TOF MRA dataset from the OpenNeuro database. We selected established deep learning algorithms to train on this data. For this study, our outcome measures are the Dice Similarity Coefficient (DICE) and the Intersection over Union (IOU). These are metrics used in the field of machine learning to evaluate model performance by quantifying how similar the model’s prediction is to the ground truth.
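Both metrics are simple functions of the overlap between the predicted and ground-truth masks; a minimal NumPy sketch follows, with toy masks that are not drawn from the study data.

```python
import numpy as np

def dice_and_iou(pred_mask, true_mask):
    """DICE = 2|A∩B| / (|A| + |B|); IOU = |A∩B| / |A∪B| for binary masks."""
    pred = pred_mask.astype(bool)
    true = true_mask.astype(bool)
    intersection = np.logical_and(pred, true).sum()
    union = np.logical_or(pred, true).sum()
    dice = 2 * intersection / (pred.sum() + true.sum())
    iou = intersection / union
    return dice, iou

# Toy masks: the prediction recovers 3 of 4 ground-truth voxels.
pred = np.array([[1, 1, 0], [1, 0, 0]])
true = np.array([[1, 1, 0], [1, 1, 0]])
print(dice_and_iou(pred, true))  # (0.857..., 0.75)
```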
RESULTS
The Medical Transformer model performed best on segmentation of IAs on TOF MRA, with a DICE of 0.819 and IOU of 0.705. Close in performance were Residual UNet, UNet++, Attention UNet, UNet, Vision Transformer, and Dual Attention UNet with respective DICEs of 0.778, 0.776, 0.774, 0.770, 0.767, and 0.737; and IOU of 0.705, 0.652, 0.650, 0.643, 0.639, 0.645, and 0.600. Swin UNet had significantly lower performance with DICE of 0.327 and IOU of 0.232.
CONCLUSIONS
We have demonstrated the effectiveness of using deep learning, particularly Medical Transformer, to accurately segment aneurysms on TOF MRA imaging, which is important for rupture risk prediction. Future steps include expanding the capabilities of the model for time-series growth modeling to identify unstable IAs.
Does the Possibility of no Surgery in Watch and Wait for Rectal Cancer Affect Patient Decision-Making?
Joanna F. Wasvary, B.S.1, Ga-Ram Han, M.D.2, Jacob A. Applegarth, M.D.3, Matthew A. Ziegler, M.D.3
1Oakland University William Beaumont School of Medicine, Rochester, MI
2Parkview Health, Fort Wayne, IN
3Corewell Health William Beaumont University Hospital, Royal Oak, MI
INTRODUCTION
Prior studies have established that the watch and wait protocol for rectal cancer is a reasonable approach in patients with an excellent response to total neoadjuvant therapy (TNT). This study assessed all patients at our institution who underwent TNT for rectal cancer and the treatment recommendations they received. The primary aim of this study was to assess patient compliance with surgeon recommendations for residual disease after TNT. Secondary aims included an assessment of oncologic outcomes and the rate of complete pathologic response after surgery.
METHODS
A retrospective chart review of a prospectively maintained rectal cancer database of all patients who completed TNT was performed. Exclusion criteria included patients with metastatic disease, IBD, synchronous colon cancer, and those who did not complete both chemotherapy and chemoradiation. Data evaluation included demographics, comorbidities, initial staging, chemotherapy, radiation therapy, post-treatment re-staging, surveillance results with the watch and wait approach, and final pathology results after resection.
RESULTS
A total of 63 patients completed TNT at our institution within the study period. For the TNT regimen, 32% had chemoradiation first and 68% had chemotherapy first. Based on post-treatment re-staging, 41% of patients were eligible for watch and wait. Twenty percent of the patients who did not undergo surgery after TNT had evidence of residual disease. After a median follow-up of 18 months, none of the patients on the watch and wait protocol had recurrent disease. Of the patients who underwent surgical resection, 33% had a sterile specimen on final pathology.
CONCLUSIONS
The results of watch and wait protocols for rectal cancer have been very promising and a significant percentage of patients are able to avoid surgery. However, the success of watch and wait may have an unintended effect on patient compliance in the setting of residual disease after TNT. Further studies are needed to determine the long-term oncologic outcomes of the watch and wait protocol for eligible patients.
The Effect of COVID-19 on Total Joint Arthroplasty
Hailey Worstman, B.S.1, James E. Feng, M.D.2, Alex Miller, M.D.2, Drew Moore, M.D.2
1Oakland University William Beaumont School of Medicine, Rochester, MI
2Corewell Royal Oak, Department of Orthopaedic Surgery, Royal Oak, MI
INTRODUCTION
Total joint arthroplasties are common inpatient procedures to treat arthritis, but were halted in March 2020 due to the COVID-19 pandemic. Studies have found that once total joint arthroplasties resumed, there was a trend toward outpatient arthroplasty due to favorable outcomes and significantly lower costs compared to inpatient arthroplasties. Our study aims to more comprehensively evaluate the surgical outcomes of total joint arthroplasty by identifying differences, if any, between patients who had a total knee or hip arthroplasty during the COVID-19 pandemic and those who had the procedure prior to the pandemic. We evaluated surgical outcomes such as readmissions, emergency department visits, mortality, length of stay, and admission type.
METHODS
This retrospective study compared 90-day surgical outcomes between patients over the age of 18 who underwent total knee or hip arthroplasty before COVID-19 (4/01/2016-3/31/2020) versus those who underwent the procedure during COVID-19 (4/01/2020-12/31/2022). Data was obtained from the MARCQI registry to include patients who had these procedures at Royal Oak and Troy Corewell Hospitals. Data was analyzed and significance was determined with the Chi-Square p-value and equal variance two sample t-test.
RESULTS
Compared to pre-COVID total hip and knee arthroplasties, those performed during COVID showed no significant difference in 90-day mortality (hip p=0.0439, knee p=0.0561), ED visits (hip p=0.0070, knee p=0.0854), or readmissions (hip p=0.0066, knee p=0.0119). Length of stay was significantly different for hip and knee arthroplasties during COVID compared to prior (p<0.001), and admission type shifted more toward outpatient during COVID (p<0.0001).
CONCLUSIONS
The results support our hypothesis that surgical outcomes after total joint arthroplasty were minimally impacted by the pandemic due to increased patient screening and various precautions in place. Additionally, the shift towards outpatient procedures and shorter length of stay were associated with favorable surgical outcomes and continuing these practices may be financially advantageous.
Understanding the Effect of Advanced Patient Age on Outcomes and Complications of Spinal Surgery
Alexander R. Woznicki, B.S.1, Daniel K. Fahim, M.D.2
1Oakland University William Beaumont School of Medicine, Rochester, MI
2Department of Neurologic Surgery, Corewell Hospital Royal Oak, Royal Oak, MI
INTRODUCTION
In 2020, 6.7 million individuals in the United States were aged 85 or older. Elderly individuals have increased risk of developing spinal conditions such as spondylosis, fractures, disc degeneration, and spinal stenosis. Cervical spine conditions may result in upper extremity weakness, clumsiness, and gait disturbances, while lumbar spine conditions commonly present as pain. Upper extremity symptoms may impair ability to perform activities of daily living (ADLs), while gait disturbances are linked to falls, fractures, and increased mortality. Pain is correlated with functional impairment, heart disease, and increased mortality.
Elderly individuals are more likely to experience spinal conditions and may greatly benefit from surgery. However, age is a known risk factor for surgical complications, especially with spine surgeries. The question of whether advanced age results in worse outcomes from spinal surgery remains unresolved. This study examines whether super-elderly patients (aged 85+) have worse outcomes or more complications from spinal surgery compared to elderly patients (aged 75-84).
METHODS
A retrospective chart review was conducted comparing the outcomes of super-elderly and elderly patients after spinal surgery at Corewell Royal Oak. Data on pre- and post-operative neurologic exams, pain, ADLs, mobility, pain medication use, adverse events, and mortality were analyzed using Fisher's exact test and two-sample t-tests.
RESULTS
Data from 16 super-elderly and 87 elderly patients were compared. No significant differences were found in neurologic exam results, pain, ADLs, mobility, medication use, adverse events, or mortality. However, pain scores decreased significantly post-operatively in both elderly (p=0.004) and super-elderly patients (p=0.041). Super-elderly patients also showed improved mobility post-operatively (p=0.018), an improvement that was significant compared to elderly patients (p=0.0496).
CONCLUSIONS
These findings suggest that super-elderly individuals do not experience worse outcomes or increased complications from spinal surgery. Our study indicates that age alone should not be considered a contraindication for surgical intervention in the elderly.
Metachronous Periprosthetic Joint Infection in Patients with Multiple Arthroplasties at Index Infection: Risk Factor Assessment
Devin Young, B.A.1, Daniel Nikolaidis, M.S.1, Mark Karadsheh, M.D.2, Andrew Steffensmeier, M.D.2, Robert Runner, M.D.2
1Oakland University William Beaumont School of Medicine, Rochester, MI
2Corewell Health William Beaumont University Hospital, Royal Oak, MI
INTRODUCTION
The risk factors for developing an initial periprosthetic joint infection (PJI) are well described, but the risk factors for developing PJI at a second site, known as a metachronous periprosthetic joint infection (MPJI), are poorly understood. The purpose of this study was to identify (1) the incidence of MPJI, (2) the average time to MPJI, (3) the risk factors for MPJI, and (4) the risk of prosthesis laterality for MPJI.
METHODS
We retrospectively identified 205 patients who had multiple total hip or knee prostheses at the time of an initial PJI between 2013 and 2022. Patients with a follow-up of at least 2 years were assessed for the development of MPJI. Variables related to the primary prosthesis, the interim before the initial PJI, and surgical management of the initial PJI were compared between those with MPJI and those without MPJI. Analysis was performed using the chi-square or Fisher's exact test, with significance set at an alpha of 0.05.
RESULTS
An MPJI developed in 9 of 205 patients (4.3%). Average time to MPJI was 3.3 years (range, 0.26 to 6.4). Risk factors for MPJI included asthma (P = 0.02), rheumatoid arthritis (P = 0.014), immunosuppressant use (P = 0.047), history of aseptic revision (P = 0.038), and an initial PJI requiring more than one surgical treatment (P = 0.025). No specific pathogen at initial PJI was a risk factor for MPJI. A positive blood culture at the time of the initial PJI was not a significant risk factor for MPJI.
CONCLUSIONS
Patients with a history of PJI are at greater risk of developing a second-site PJI in a pre-existing prosthesis than of developing an initial PJI. Clinicians should maintain a higher index of suspicion for infection in this population and incorporate the risk factors identified in this study into their decision-making to enable earlier diagnostic intervention.
The Impact of Bone Frailty vs. BMI on In-Hospital Outcomes Following Primary Total Hip and Knee Arthroplasties
Mazen Zamzam, B.S.1, Fong Nham, M.D.2, Avianna Arapovic, M.D.1, Abdul Zalikha, M.D.2, Inaya Hajj Hussein, Ph.D.1, Mohannad Othmani, M.D.3
1Oakland University William Beaumont School of Medicine, Rochester, MI
2Detroit Medical Center, Detroit, MI
3Brown University, Providence, RI
INTRODUCTION
The United States has the highest incidence rate of Total Joint Arthroplasty (TJA), with projections showing continued growth over the next few decades. TJA remains a frequently performed procedure with consistently positive outcomes, thanks to advancements in longevity across implant generations. However, postoperative complications pose a significant financial and logistical challenge for healthcare system stakeholders. Therefore, preoperative optimization informed by comorbidities, situational variables, and patient demographics guides interventions aimed at improving quality of care. Frailty is linked to a higher risk of postoperative complications and adverse economic impacts on hospitals.
METHODS
Discharge data from the National Inpatient Sample registry were used to identify all patients 50 or older who underwent TJA between 2006 and 2015. Patients were stratified into frail-nonobese, nonfrail-obese, and frail-obese groupings, based on the presence of specific ICD-9 diagnostic coding. An analysis comparing the groups' epidemiology, medical comorbidities, and propensity score weighted postoperative clinical and economic outcomes was performed.
RESULTS
From 2006 to the third quarter of 2015, a total of 1,738,224 patients (9,554 frail-nonobese, 1,725,402 nonfrail-obese, 3,268 frail-obese) who underwent TJA were included in this study. The average age in the study's population was 63.0 years old, with a female distribution of 65.85%. Frail-nonobese patients were more likely to exhibit significantly higher rates of the majority of Modified Elixhauser Comorbidities than the other cohorts. Additionally, frail-nonobese patients displayed an increased likelihood of experiencing CNS, cardiac, GI, GU, and infectious complications postoperatively when compared to nonfrail-obese patients.
CONCLUSIONS
Patients exhibiting the defining characteristics of frailty who undergo TJA procedures are at significantly higher risk for inpatient postoperative complications than those who are obese. This suggests that, for clinical risk assessment, preoperative optimization, and perioperative planning in this expanding population, understanding the implications of frailty is even more vital than understanding those of obesity.