Poststroke Executive Function in Relation to White Matter Damage on Clinically Acquired CT Brain Imaging.
BACKGROUND: Executive function (EF) impairments are prevalent post stroke and are associated with white matter (WM) damage on MRI. However, less is known about the relationship between poststroke EF and WM damage on CT imaging. OBJECTIVE: To investigate the relationship between poststroke EF and WM damage associated with stroke lesions and WM hypointensities (WMHs) on clinically acquired CT imaging. METHOD: This study analyzed data from the Oxford Cognitive Screening Program, which recruited individuals aged ≥18 years with a confirmed stroke from an acute stroke unit. The individuals completed a follow-up assessment 6 months post stroke. We included individuals with a CT scan showing a visible stroke who completed follow-up EF assessment using the Oxford Cognitive Screen-Plus rule-finding task. We manually delineated stroke lesions and quantified, then dichotomized, WM damage caused by the stroke using the HCP-842 atlas. We visually rated, then dichotomized, WMHs using the Age-Related White Matter Changes Scale. RESULTS: Among 87 stroke survivors (M age = 73.60 ± 11.75; 41 female; 61 ischemic stroke), multivariable linear regression showed that stroke damage to the medial lemniscus (B = -8.86, P < 0.001) and the presence of WMHs (B = -5.42, P = 0.005) were associated with poorer EF 6 months post stroke after adjusting for covariates including age and education. CONCLUSION: Poorer EF was associated with WM damage caused by stroke lesions and WMHs on CT. These results confirm the importance of WM integrity for EF post stroke and demonstrate the prognostic utility of CT-derived imaging markers for poststroke cognitive outcomes.
Venous thromboembolism risk in amyotrophic lateral sclerosis: a hospital record-linkage study.
BACKGROUND: Venous thromboembolism (VTE) can occur in amyotrophic lateral sclerosis (ALS) and pulmonary embolism causes death in a minority of cases. The benefits of preventing VTE must be weighed against the risks. An accurate estimate of the incidence of VTE in ALS is crucial to assessing this balance. METHODS: This retrospective record-linkage cohort study derived data from the Hospital Episode Statistics database, covering admissions to England's hospitals from 1 April 2003 to 31 December 2019, and included 21 163 patients with ALS and 17 425 337 controls. Follow-up began at index admission and ended at VTE admission, death or 2 years (whichever occurred first). Adjusted HRs (aHRs) for VTE were calculated, controlling for confounders. RESULTS: The incidence of VTE in the ALS cohort was 18.8/1000 person-years. The relative risk of VTE in ALS was significantly greater than in controls (aHR 2.7, 95% CI 2.4 to 3.0). The relative risk of VTE in patients with ALS under 65 years was five times higher than controls (aHR 5.34, 95% CI 4.6 to 6.2), and higher than that of patients over 65 years compared with controls (aHR 1.86, 95% CI 1.62 to 2.12). CONCLUSIONS: Patients with ALS are at a higher risk of developing VTE, but this is similar in magnitude to that reported in other chronic neurological conditions associated with immobility, such as multiple sclerosis, which do not routinely receive VTE prophylaxis. Those with ALS below the median age of symptom onset have a notably higher relative risk. A reappraisal of the case for routine antithrombotic therapy in those diagnosed with ALS now requires a randomised controlled trial.
Intracortical recordings reveal vision-to-action cortical gradients driving human exogenous attention.
Exogenous attention, the process that makes external salient stimuli pop-out of a visual scene, is essential for survival. How attention-capturing events modulate human brain processing remains unclear. Here we show how the psychological construct of exogenous attention gradually emerges over large-scale gradients in the human cortex, by analyzing activity from 1,403 intracortical contacts implanted in 28 individuals, while they performed an exogenous attention task. The timing, location and task-relevance of attentional events defined a spatiotemporal gradient of three neural clusters, which mapped onto cortical gradients and presented a hierarchy of timescales. Visual attributes modulated neural activity at one end of the gradient, while at the other end it reflected the upcoming response timing, with attentional effects occurring at the intersection of visual and response signals. These findings challenge multi-step models of attention, and suggest that frontoparietal networks, which process sequential stimuli as separate events sharing the same location, drive exogenous attention phenomena such as inhibition of return.
Safer and more efficient vital signs monitoring protocols to identify the deteriorating patients in the general hospital ward: an observational study.
BACKGROUND: The frequency at which patients should have their vital signs (e.g. blood pressure, pulse, oxygen saturation) measured on hospital wards is currently unknown. Current National Health Service monitoring protocols are based on expert opinion but supported by little empirical evidence. The challenge is finding the balance between insufficient monitoring (risking missing early signs of deterioration and delays in treatment) and over-observation of stable patients (wasting resources needed in other aspects of care). OBJECTIVE: To provide an evidence-based approach to creating monitoring protocols based on a patient's risk of deterioration and link these to nursing workload and economic impact. DESIGN: Our study consisted of two parts: (1) an observational study of nursing staff to ascertain the time to perform vital sign observations; and (2) a retrospective study of historic data on patient admissions exploring the relationships between National Early Warning Score and risk of outcome over time. These were underpinned by opinions and experiences from stakeholders. SETTING AND PARTICIPANTS: Observational study: observed nursing staff on 16 randomly selected adult general wards at four acute National Health Service hospitals. Retrospective study: extracted, linked and analysed routinely collected data from two large National Health Service acute trusts; data from over 400,000 patient admissions and 9,000,000 vital sign observations. RESULTS: The observational study found a variety of practices, with two hospitals having registered nurses take the majority of vital sign observations and two favouring healthcare assistants or student nurses. However, whoever took the observations spent roughly the same length of time. The average was 5:01 minutes per observation over a 'round', including time to locate and prepare the equipment and travel to the patient area.
The retrospective study created survival models predicting the risk of outcomes over time since the patient was last observed. For low-risk patients, there was little difference in risk between 4 hours and 24 hours post observation. CONCLUSIONS: We explored several different scenarios with our stakeholders (clinicians and patients), based on how 'risk' could be managed in different ways. Vital sign observations are often done more frequently than necessary from a bald assessment of the patient's risk, and we show that a maximum threshold of risk could theoretically be achieved with less resource. Existing resources could therefore be redeployed within a changed protocol to achieve better outcomes for some patients without compromising the safety of the rest. Our work supports the approach of the current monitoring protocol, whereby patients' National Early Warning Score 2 guides observation frequency. Existing practice is to observe higher-risk patients more frequently, and our findings have shown that this is objectively justified. It is worth noting that important nurse-patient interactions take place during vital sign monitoring and should not be eliminated under new monitoring processes. Our study contributes to the existing evidence on how vital sign observations should be scheduled. However, ultimately, it is for the relevant professionals to decide how our work should be used. STUDY REGISTRATION: This study is registered as ISRCTN10863045. FUNDING: This award was funded by the National Institute for Health and Care Research (NIHR) Health and Social Care Delivery Research programme (NIHR award ref: 17/05/03) and is published in full in Health and Social Care Delivery Research; Vol. 12, No. 6. See the NIHR Funding and Awards website for further award information.
The effects of genetic and modifiable risk factors on brain regions vulnerable to ageing and disease.
We have previously identified a network of higher-order brain regions particularly vulnerable to the ageing process, schizophrenia and Alzheimer's disease. However, it remains unknown what the genetic influences on this fragile brain network are, and whether it can be altered by the most common modifiable risk factors for dementia. Here, in ~40,000 UK Biobank participants, we first show significant genome-wide associations between this brain network and seven genetic clusters implicated in cardiovascular deaths, schizophrenia, Alzheimer's and Parkinson's disease, and with the two antigens of the XG blood group located in the pseudoautosomal region of the sex chromosomes. We further reveal that the most deleterious modifiable risk factors for this vulnerable brain network are diabetes, nitrogen dioxide - a proxy for traffic-related air pollution - and alcohol intake frequency. The extent of these associations was uncovered by examining these modifiable risk factors in a single model to assess the unique contribution of each on the vulnerable brain network, above and beyond the dominating effects of age and sex. These results provide a comprehensive picture of the role played by genetic and modifiable risk factors on these fragile parts of the brain.
Towards Uncovering the Role of Incomplete Penetrance in Maculopathies through Sequencing of 105 Disease-Associated Genes
Inherited macular dystrophies (iMDs) are a group of genetic disorders which affect the central region of the retina. To investigate the genetic basis of iMDs, we used single-molecule Molecular Inversion Probes to sequence 105 maculopathy-associated genes in 1352 patients diagnosed with iMDs. Within this cohort, 39.8% of patients were considered genetically explained by 460 different variants in 49 distinct genes, of which 73 were novel variants, with some affecting splicing. The top five most frequent causative genes were ABCA4 (37.2%), PRPH2 (6.7%), CDHR1 (6.1%), PROM1 (4.3%) and RP1L1 (3.1%). Interestingly, variants with incomplete penetrance were revealed in almost one-third of patients considered solved (28.1%), and therefore, a proportion of patients may not be explained solely by the variants reported. This includes eight previously reported variants with incomplete penetrance in addition to CDHR1:c.783G>A and CNGB3:c.1208G>A. Notably, segregation analysis was not routinely performed for variant phasing, a limitation that may also impact the overall diagnostic yield. The relatively high proportion of probands without any putative causal variant (60.2%) highlights the need to explore variants with incomplete penetrance, the potential modifiers of disease and the genetic overlap between iMDs and age-related macular degeneration. Our results provide valuable insights into the genetic landscape of iMDs and warrant future exploration to determine the involvement of other maculopathy genes.
Comparative neuroimaging of sex differences in human and mouse brain anatomy.
In vivo neuroimaging studies have established several reproducible volumetric sex differences in the human brain, but the causes of such differences are hard to parse. While mouse models are useful for understanding the cellular and mechanistic bases of sex-specific brain development, there have been no attempts to formally compare human and mouse neuroanatomical sex differences to ascertain how well they translate. Addressing this question would shed critical light on the use of the mouse as a translational model for sex differences in the human brain and provide insights into the degree to which sex differences in brain volume are conserved across mammals. Here, we use structural magnetic resonance imaging to conduct the first comparative neuroimaging study of sex-specific neuroanatomy of the human and mouse brain. In line with previous findings, we observe that in humans, males have significantly larger and more variable total brain volume; these sex differences are not mirrored in mice. After controlling for total brain volume, we observe modest cross-species congruence in the volumetric effect size of sex across 60 homologous regions (r=0.30). This cross-species congruence is greater in the cortex (r=0.33) than non-cortex (r=0.16). By incorporating regional measures of gene expression in both species, we reveal that cortical regions with greater cross-species congruence in volumetric sex differences also show greater cross-species congruence in the expression profile of 2835 homologous genes. This phenomenon differentiates primary sensory regions with high congruence of sex effects and gene expression from limbic cortices where congruence in both these features was weaker between species. These findings help identify aspects of sex-biased brain anatomy present in mice that are retained, lost, or inverted in humans. 
More broadly, our work provides an empirical basis for targeting mechanistic studies of sex-specific brain development in mice to brain regions that best echo sex-specific brain development in humans.
Multi-night cortico-basal recordings reveal mechanisms of NREM slow-wave suppression and spontaneous awakenings in Parkinson’s disease
Sleep disturbance is a prevalent and disabling comorbidity in Parkinson's disease (PD). We performed multi-night (n = 57) at-home intracranial recordings from electrocorticography and subcortical electrodes using sensing-enabled deep brain stimulation (DBS), paired with portable polysomnography, in four PD participants and one with cervical dystonia (clinical trial: NCT03582891). Cortico-basal delta activity increased and beta activity decreased during NREM (N2 + N3) sleep versus wakefulness in PD. DBS caused a further elevation in cortical delta and a decrease in alpha and low-beta compared with the DBS OFF state. Our primary outcome demonstrated an inverse interaction between subcortical beta and cortical slow-wave activity during NREM. Our secondary outcome revealed that subcortical beta increases prior to spontaneous awakenings in PD. We classified NREM vs. wakefulness with high accuracy in both traditional (30 s: 92.6 ± 1.7%) and rapid (5 s: 88.3 ± 2.1%) data epochs of intracranial signals. Our findings elucidate sleep neurophysiology and the impacts of DBS on sleep in PD, informing adaptive DBS for sleep dysfunction.
Cortical signatures of sleep are altered following effective deep brain stimulation for depression
Deep brain stimulation (DBS) of the subcallosal cingulate cortex (SCC) is an experimental therapy for treatment-resistant depression (TRD). Chronic SCC DBS leads to long-term changes in the electrophysiological dynamics measured from local field potential (LFP) during wakefulness, but it is unclear how it impacts sleep-related brain activity. This is a crucial gap in knowledge, given the link between depression and sleep disturbances, and an emerging interest in the interaction between DBS, sleep, and circadian rhythms. We therefore sought to characterize changes in electrophysiological markers of sleep associated with DBS treatment for depression. We analyzed key electrophysiological signatures of sleep—slow-wave activity (SWA, 0.5–4.5 Hz) and sleep spindles—in LFPs recorded from the SCC of 9 patients who responded to DBS for TRD. This allowed us to compare the electrophysiological changes before and after 24 weeks of therapeutically effective SCC DBS. SWA power was highly correlated between hemispheres, consistent with a global sleep state. Furthermore, SWA occurred earlier in the night after chronic DBS and had a more prominent peak. While we found no evidence for changes to slow-wave power or stability, we found an increase in the density of sleep spindles. Our results represent a first-of-its-kind report on long-term electrophysiological markers of sleep recorded from the SCC in patients with TRD, and provide evidence of earlier NREM sleep and increased sleep spindle activity following clinically effective DBS treatment. Future work is needed to establish the causal relationship between long-term DBS and the neural mechanisms underlying sleep.
Bioelectronic Zeitgebers: targeted neuromodulation to re-establish circadian rhythms.
Existing neurostimulation systems implanted for the treatment of neurodegenerative disorders generally deliver invariable therapy parameters, regardless of phase of the sleep/wake cycle. However, there is considerable evidence that brain activity in these conditions varies according to this cycle, with discrete patterns of dysfunction linked to loss of circadian rhythmicity, worse clinical outcomes and impaired patient quality of life. We present a targeted concept of circadian neuromodulation using a novel device platform. This system utilises stimulation of circuits important in sleep and wake regulation, delivering bioelectronic cues (Zeitgebers) aimed at entraining rhythms to more physiological patterns in a personalised and fully configurable manner. Preliminary evidence from its first use in a clinical trial setting, with brainstem arousal circuits as a surgical target, further supports its promising impact on sleep/wake pathology. Data included in this paper highlight its versatility and effectiveness on two different patient phenotypes. In addition to exploring acute and long-term electrophysiological and behavioural effects, we also discuss current caveats and future feature improvements of our proposed system, as well as its potential applicability in modifying disease progression in future therapies.
Multimodal neuroimaging correlates of physical-cognitive covariation in Chilean adolescents. The Cogni-Action Project.
Health-related behaviours have been linked to brain structural features. In developing settings, such as Latin America, high social inequality has been inversely associated with several health-related behaviours affecting brain development. Understanding the relationship between health behaviours and brain structure in such settings is particularly important during adolescence, when critical habits are acquired and ingrained. In this cross-sectional study, we carry out a multimodal analysis identifying a brain region associated with health-related behaviours (i.e., adiposity, fitness, sleep problems and others) and cognitive/academic performance, independent of socioeconomic status, in a large sample of Chilean adolescents. Our findings suggest that the relationship between health behaviours and cognitive/academic performance involves a particular brain phenotype that could play a mediator role. These findings fill a significant gap in the literature, which has largely focused on developed countries, and raise the possibility of promoting healthy behaviours in adolescence as a means to influence brain structure and thereby cognitive/academic achievement, independently of socioeconomic factors. By highlighting the potential impact on brain structure and cognitive/academic achievement, these findings could help policymakers design interventions that are more effective in reducing health disparities in developing countries.
A Prospective, Observational, Noninterventional Clinical Study of Participants With Choroideremia: The NIGHT Study.
PURPOSE: The NIGHT study aimed to assess the natural history of choroideremia (CHM), an X-linked inherited chorioretinal degenerative disease leading to blindness, and determine which outcomes would be the most sensitive for monitoring disease progression. DESIGN: A prospective, observational, multicenter cohort study. METHODS: Males aged ≥18 years with genetically confirmed CHM, visible active disease within the macular region, and best-corrected visual acuity (BCVA) ≥34 Early Treatment Diabetic Retinopathy Study (ETDRS) letters at baseline were assessed for 20 months. The primary outcome was the change in BCVA over time at Months 4, 8, 12, 16, and 20. A range of functional and anatomical secondary outcome measures were assessed up to Month 12, including retinal sensitivity, central ellipsoid zone (EZ) area, and total area of fundus autofluorescence (FAF). Additional ocular assessments for safety were performed. RESULTS: A total of 220 participants completed the study. The mean BCVA was stable over 20 months. Most participants (81.4% in the worse eye and 77.8% in the better eye) had change from baseline > -5 ETDRS letters at Month 20. Interocular symmetry was low overall. Reductions from baseline to Month 12 were observed (worse eye, better eye) for retinal sensitivity (functional outcome; -0.68 dB, -0.48 dB), central EZ area (anatomical outcome; -0.276 mm2, -0.290 mm2), and total area of FAF (anatomical outcome; -0.605 mm2, -0.533 mm2). No assessment-related serious adverse events occurred. CONCLUSIONS: Retinal sensitivity, central EZ area, and total area of FAF are more sensitive than BCVA in measuring the natural progression of CHM.