Magnetic Resonance Imaging Characteristics of LGI1-Antibody and CASPR2-Antibody Encephalitis.
IMPORTANCE: Rapid and accurate diagnosis of autoimmune encephalitis encourages prompt initiation of immunotherapy toward improved patient outcomes. However, clinical features alone may not sufficiently narrow the differential diagnosis, and awaiting autoantibody results can delay immunotherapy. OBJECTIVE: To identify simple magnetic resonance imaging (MRI) characteristics that accurately distinguish 2 common forms of autoimmune encephalitis, LGI1- and CASPR2-antibody encephalitis (LGI1/CASPR2-Ab-E), from 2 major differential diagnoses, viral encephalitis (VE) and Creutzfeldt-Jakob disease (CJD). DESIGN, SETTING, AND PARTICIPANTS: This cross-sectional study involved a retrospective, blinded analysis of the first available brain MRIs (taken 2000-2022) from 192 patients at Oxford University Hospitals in the UK and Mayo Clinic in the US. These patients had LGI1/CASPR2-Ab-E, VE, or CJD as evaluated by 2 neuroradiologists (discovery cohort; n = 87); findings were validated in an independent cohort by 3 neurologists (n = 105). Groups were statistically compared with contingency tables. Data were analyzed in 2023. MAIN OUTCOMES AND MEASURES: MRI findings including T2 or fluid-attenuated inversion recovery (FLAIR) hyperintensities, swelling or volume loss, presence of gadolinium contrast enhancement, and diffusion-weighted imaging changes. Correlations with clinical features. RESULTS: Among 192 participants with MRIs reviewed, 71 were female (37%) and 121 were male (63%); the median age was 66 years (range, 19-92 years). By comparison with VE and CJD, in LGI1/CASPR2-Ab-E, T2 and/or FLAIR hyperintensities were less likely to extend outside the temporal lobe (3/42 patients [7%] vs 17/18 patients [94%] with VE; P
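The abstract states that groups were compared with contingency tables; it does not name the exact test, but a 2x2 comparison such as 3/42 vs 17/18 is commonly checked with Fisher's exact test. A minimal sketch in plain Python (the function name and the choice of Fisher's method are assumptions, not taken from the paper):

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher's exact test for the 2x2 table [[a, b], [c, d]].

    Sums the hypergeometric probabilities of every table with the same
    margins that is no more likely than the observed one.
    """
    row1, row2 = a + b, c + d
    col1 = a + c
    n = row1 + row2

    def p_table(x):
        # Hypergeometric probability of x in the top-left cell, margins fixed.
        return comb(row1, x) * comb(row2, col1 - x) / comb(n, col1)

    p_obs = p_table(a)
    lo = max(0, col1 - row2)
    hi = min(row1, col1)
    # The (1 + 1e-9) factor guards against floating-point ties.
    return sum(p_table(x) for x in range(lo, hi + 1)
               if p_table(x) <= p_obs * (1 + 1e-9))

# Abstract's figures: extratemporal T2/FLAIR hyperintensity in 3/42
# LGI1/CASPR2-Ab-E patients vs 17/18 VE patients.
p = fisher_exact_two_sided(3, 39, 17, 1)
```

With counts this lopsided, the resulting p-value is far below conventional significance thresholds, consistent with the truncated "P" value in the abstract.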
Pre-Sleep Cognitive Arousal Is Negatively Associated with Sleep Misperception in Healthy Sleepers during Habitual Environmental Noise Exposure: An Actigraphy Study.
Specific noises (e.g., traffic or wind turbines) can disrupt sleep and potentially cause a mismatch between subjective sleep and objective sleep (i.e., “sleep misperception”). Some individuals are likely to be more vulnerable than others to noise-related sleep disturbances, potentially as a result of increased pre-sleep cognitive arousal. The aim of the present study was to examine the relationships between pre-sleep cognitive arousal and sleep misperception. Sixteen healthy sleepers participated in this naturalistic, observational study. Three nights of sleep were measured using actigraphy, and each 15-s epoch was classified as sleep or wake. Bedside noise was recorded, and each 15-s segment was classified as containing noise or no noise and matched to actigraphy. Participants completed measures of habitual pre-sleep cognitive and somatic arousal and noise sensitivity. Pre-sleep cognitive and somatic arousal levels were negatively associated with subjective−objective total sleep time discrepancy (p < 0.01). There was an association between sleep/wake and noise presence/absence in the first and last 90 min of sleep (p < 0.001). These results indicate that higher levels of habitual pre-sleep arousal are associated with a greater degree of sleep misperception, and even in healthy sleepers, objective sleep is vulnerable to habitual bedside noise.
Fatigue predicts quality of life after leucine‐rich glioma‐inactivated 1‐antibody encephalitis
AbstractPatient‐reported quality‐of‐life (QoL) and carer impacts are not reported after leucine‐rich glioma‐inactivated 1‐antibody encephalitis (LGI1‐Ab‐E). From 60 patients, 85% (51 out of 60) showed one abnormal score across QoL assessments and 11 multimodal validated questionnaires. Compared to the premorbid state, QoL significantly deteriorated (p < 0.001) and, at a median of 41 months, fatigue was its most important predictor (p = 0.025). In total, 51% (26 out of 51) of carers reported significant burden. An abbreviated five‐item battery explained most variance in QoL. Wide‐ranging impacts post‐LGI1‐Ab‐E include decreased QoL and high caregiver strain. We identify a rapid method to capture QoL in routine clinic or clinical trial settings.
Poststroke Executive Function in Relation to White Matter Damage on Clinically Acquired CT Brain Imaging.
BACKGROUND: Executive function (EF) impairments are prevalent post stroke and are associated with white matter (WM) damage on MRI. However, less is known about the relationship between poststroke EF and WM damage on CT imaging. OBJECTIVE: To investigate the relationship between poststroke EF and WM damage associated with stroke lesions and WM hypointensities (WMHs) on clinically acquired CT imaging. METHOD: This study analyzed data from the Oxford Cognitive Screening Program, which recruited individuals aged ≥18 years with a confirmed stroke from an acute stroke unit. The individuals completed a follow-up assessment 6 months post stroke. We included individuals with a CT scan showing a visible stroke who completed follow-up EF assessment using the Oxford Cognitive Screen-Plus rule-finding task. We manually delineated stroke lesions and quantified then dichotomized WM damage caused by the stroke using the HCP-842 atlas. We visually rated then dichotomized WMHs using the Age-Related White Matter Changes Scale. RESULTS: Among 87 stroke survivors (M age = 73.60 ± 11.75; 41 female; 61 ischemic stroke), multivariable linear regression showed that stroke damage to the medial lemniscus (B = -8.86, P < 0.001) and the presence of WMHs (B = -5.42, P = 0.005) were associated with poorer EF 6 months post stroke after adjusting for covariates including age and education. CONCLUSION: Poorer EF was associated with WM damage caused by stroke lesions and WMHs on CT. These results confirm the importance of WM integrity for EF post stroke and demonstrate the prognostic utility of CT-derived imaging markers for poststroke cognitive outcomes.
Venous thromboembolism risk in amyotrophic lateral sclerosis: a hospital record-linkage study.
BACKGROUND: Venous thromboembolism (VTE) can occur in amyotrophic lateral sclerosis (ALS) and pulmonary embolism causes death in a minority of cases. The benefits of preventing VTE must be weighed against the risks. An accurate estimate of the incidence of VTE in ALS is crucial to assessing this balance. METHODS: This retrospective record-linkage cohort study derived data from the Hospital Episode Statistics database, covering admissions to England's hospitals from 1 April 2003 to 31 December 2019 and included 21 163 patients with ALS and 17 425 337 controls. Follow-up began at index admission and ended at VTE admission, death or 2 years (whichever came sooner). Adjusted HRs (aHRs) for VTE were calculated, controlling for confounders. RESULTS: The incidence of VTE in the ALS cohort was 18.8/1000 person-years. The relative risk of VTE in ALS was significantly greater than in controls (aHR 2.7, 95% CI 2.4 to 3.0). The relative risk of VTE in patients with ALS under 65 years was five times higher than controls (aHR 5.34, 95% CI 4.6 to 6.2), and higher than that of patients over 65 years compared with controls (aHR 1.86, 95% CI 1.62 to 2.12). CONCLUSIONS: Patients with ALS are at a higher risk of developing VTE, but this is similar in magnitude to that reported in other chronic neurological conditions associated with immobility, such as multiple sclerosis, which do not routinely receive VTE prophylaxis. Those with ALS below the median age of symptom onset have a notably higher relative risk. A reappraisal of the case for routine antithrombotic therapy in those diagnosed with ALS now requires a randomised controlled trial.
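The incidence figure of 18.8/1000 person-years is a crude rate: events divided by accumulated follow-up time, scaled to 1,000 person-years. A trivial sketch of that arithmetic (the event count and person-time below are hypothetical, chosen only to reproduce the reported rate; the paper does not state them here):

```python
def incidence_per_1000_py(events, person_years):
    """Crude incidence rate per 1,000 person-years of follow-up."""
    return 1000 * events / person_years

# Hypothetical numbers that back-calculate to the abstract's 18.8/1000
# person-years; the true event count is not given in the abstract.
rate = incidence_per_1000_py(events=500, person_years=26596)
```

Note the adjusted hazard ratios in the abstract come from a regression model controlling for confounders, not from this crude rate.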
Intracortical recordings reveal vision-to-action cortical gradients driving human exogenous attention.
Exogenous attention, the process that makes external salient stimuli pop-out of a visual scene, is essential for survival. How attention-capturing events modulate human brain processing remains unclear. Here we show how the psychological construct of exogenous attention gradually emerges over large-scale gradients in the human cortex, by analyzing activity from 1,403 intracortical contacts implanted in 28 individuals, while they performed an exogenous attention task. The timing, location and task-relevance of attentional events defined a spatiotemporal gradient of three neural clusters, which mapped onto cortical gradients and presented a hierarchy of timescales. Visual attributes modulated neural activity at one end of the gradient, while at the other end it reflected the upcoming response timing, with attentional effects occurring at the intersection of visual and response signals. These findings challenge multi-step models of attention, and suggest that frontoparietal networks, which process sequential stimuli as separate events sharing the same location, drive exogenous attention phenomena such as inhibition of return.
Safer and more efficient vital signs monitoring protocols to identify the deteriorating patients in the general hospital ward: an observational study.
BACKGROUND: The frequency at which patients should have their vital signs (e.g. blood pressure, pulse, oxygen saturation) measured on hospital wards is currently unknown. Current National Health Service monitoring protocols are based on expert opinion but supported by little empirical evidence. The challenge is finding the balance between insufficient monitoring (risking missing early signs of deterioration and delays in treatment) and over-observation of stable patients (wasting resources needed in other aspects of care). OBJECTIVE: Provide an evidence-based approach to creating monitoring protocols based on a patient's risk of deterioration and link these to nursing workload and economic impact. DESIGN: Our study consisted of two parts: (1) an observational study of nursing staff to ascertain the time to perform vital sign observations; and (2) a retrospective study of historic data on patient admissions exploring the relationships between National Early Warning Score and risk of outcome over time. These were underpinned by opinions and experiences from stakeholders. SETTING AND PARTICIPANTS: Observational study: observed nursing staff on 16 randomly selected adult general wards at four acute National Health Service hospitals. Retrospective study: extracted, linked and analysed routinely collected data from two large National Health Service acute trusts; data from over 400,000 patient admissions and 9,000,000 vital sign observations. RESULTS: The observational study found a variety of practices, with two hospitals having registered nurses take the majority of vital sign observations and two favouring healthcare assistants or student nurses. However, whoever took the observations spent roughly the same length of time. The average was 5 minutes 1 second per observation over a 'round', including time to locate and prepare the equipment and travel to the patient area.
Retrospective study created survival models predicting the risk of outcomes over time since the patient was last observed. For low-risk patients, there was little difference in risk between 4 hours and 24 hours post observation. CONCLUSIONS: We explored several different scenarios with our stakeholders (clinicians and patients), based on how 'risk' could be managed in different ways. Vital sign observations are often done more frequently than necessary from a bald assessment of the patient's risk, and we show that a maximum threshold of risk could theoretically be achieved with less resource. Existing resources could therefore be redeployed within a changed protocol to achieve better outcomes for some patients without compromising the safety of the rest. Our work supports the approach of the current monitoring protocol, whereby patients' National Early Warning Score 2 guides observation frequency. Existing practice is to observe higher-risk patients more frequently and our findings have shown that this is objectively justified. It is worth noting that important nurse-patient interactions take place during vital sign monitoring and should not be eliminated under new monitoring processes. Our study contributes to the existing evidence on how vital sign observations should be scheduled. However, ultimately, it is for the relevant professionals to decide how our work should be used. STUDY REGISTRATION: This study is registered as ISRCTN10863045. FUNDING: This award was funded by the National Institute for Health and Care Research (NIHR) Health and Social Care Delivery Research programme (NIHR award ref: 17/05/03) and is published in full in Health and Social Care Delivery Research; Vol. 12, No. 6. See the NIHR Funding and Awards website for further award information.
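The conclusion endorses the existing approach in which the aggregate NEWS2 score guides observation frequency. A minimal sketch of the published Royal College of Physicians NEWS2 escalation bands (the function name and return strings are illustrative; this is a simplification of the full guidance, which also covers escalation of clinical review):

```python
def news2_minimum_obs_frequency(score, any_param_3=False):
    """Map an aggregate NEWS2 score to a minimum vital-sign observation
    frequency, following the RCP NEWS2 escalation bands.

    `any_param_3` flags a score of 3 in any single parameter, which
    triggers more frequent monitoring even at a low aggregate score.
    """
    if score >= 7:
        return "continuous monitoring of vital signs"
    if score >= 5 or any_param_3:
        return "minimum hourly"
    if score >= 1:
        return "minimum 4-6 hourly"
    return "minimum 12 hourly"
```

The abstract's finding that low-risk patients showed little change in risk between 4 and 24 hours post observation suggests the lowest band, in particular, may be more frequent than a pure risk assessment requires.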
The effects of genetic and modifiable risk factors on brain regions vulnerable to ageing and disease.
We have previously identified a network of higher-order brain regions particularly vulnerable to the ageing process, schizophrenia and Alzheimer's disease. However, it remains unknown what the genetic influences on this fragile brain network are, and whether it can be altered by the most common modifiable risk factors for dementia. Here, in ~40,000 UK Biobank participants, we first show significant genome-wide associations between this brain network and seven genetic clusters implicated in cardiovascular deaths, schizophrenia, Alzheimer's and Parkinson's disease, and with the two antigens of the XG blood group located in the pseudoautosomal region of the sex chromosomes. We further reveal that the most deleterious modifiable risk factors for this vulnerable brain network are diabetes, nitrogen dioxide - a proxy for traffic-related air pollution - and alcohol intake frequency. The extent of these associations was uncovered by examining these modifiable risk factors in a single model to assess the unique contribution of each on the vulnerable brain network, above and beyond the dominating effects of age and sex. These results provide a comprehensive picture of the role played by genetic and modifiable risk factors on these fragile parts of the brain.