On a Saturday night in February 2018, 31-year-old British boxer Scott Westgarth spoke of his love of the sport as he emerged triumphant from his match against Dec Spelman in Doncaster, England.
Hours later he collapsed in his locker room and was rushed to hospital, where he tragically died of head injuries sustained during the fight.
Then in July this year came two more boxing fatalities within a few days of each other. Both the Russian boxer Maxim Dadashev, 28, and 23-year-old Argentinian fighter Hugo Alfredo Santillan died from brain injuries, prompting fresh calls for drastic changes in the sport.
Boxers know they take a risk every time they enter the ring. Success is measured by the blows to the head landed on an opponent, and the sport has therefore long been controversial among campaigners.
Brain injury charity Headway has even called for it to be banned on several occasions. But while we often hear about what can go wrong inside the ring, boxing has a hidden danger less talked about.
American pathologist Dr Harrison Martland first described a group of boxers as being “punch drunk” in 1928.
His research paper gave a name to the phenomenon of boxers with a history of repetitive head trauma developing neurological symptoms.
Today punch drunk syndrome is better known as Chronic Traumatic Encephalopathy (CTE), the degenerative brain disease increasingly linked with rugby, football and injuries sustained in military service.
CTE, currently only diagnosed after death through brain tissue analysis, has been confirmed in more than 50 former boxers, according to the Concussion Legacy Foundation in America, using data from the VA-BU-CLF Brain Bank in Boston.
“To put it simply, the more head impacts somebody gets, the greater the risk,” says neurologist Dr Charles Bernick, from his office at the Lou Ruvo Centre for Brain Health, in the self-proclaimed fight capital of the world, Las Vegas.
“It’s just that boxing is probably the sport that gets the largest number of head impacts, because that’s the whole goal.”
In 2011 Bernick and fellow researchers at the Ruvo Centre, which specialises in research and care of neurodegenerative diseases, launched the Professional Athletes Brain Health Study (PABHS).
It is now the largest longitudinal research project into the effects of repetitive impacts to the brain in a group of professional combatants.
The study currently has over 800 participants at various stages in their careers, from active fighters at different levels to athletes retired for varying lengths of time – and others who are transitioning between the two.
“CTE is a neurodegenerative disease. It’s in the same category as other disorders such as Alzheimer’s and Parkinson’s, but although we’ve known about it for almost 100 years, nobody really understood it,” says Bernick.
Symptoms of CTE don’t generally begin to appear until years after the head impacts.
So while a boxer may seem to have walked away from a fight unscathed, the signs of long-term brain damage could show up in later life.
Former British Olympic gold medallist boxer Audley Harrison has spoken recently about the long-term impact his career has had, as he battles permanent brain damage, sight and balance problems and behavioural issues such as mood swings.
Early symptoms affect the individual’s mood and behaviour, but as the disease progresses, some may experience memory loss, confusion, impaired judgment, and eventually progressive dementia.
From two experts at Boston University, Dr Ann McKee and Dr Robert Stern, we know that CTE can take two forms.
Some with CTE initially present with behavioural symptoms, usually in their late 20s or 30s, while others first show signs of cognitive problems, typically beginning in their 40s and 50s. What we don’t know is why it affects some people and not others.
“We thought this was a condition that we needed to learn more about, and being in Las Vegas, we had the means to do it,” says Bernick.
With his team, he set out to find out how CTE evolves, what the risk factors are and why only some boxers go on to develop the disease.
“We started working with key players, with the Nevada Athletic Commission (NAC) and some of the main organisations and top boxing promoters, to begin recruiting fighters, alongside a controlled group of non-fighters.
“We’ve been following these people on a yearly basis over that time, to try to really understand what happens.”
The open-ended study was set to last for at least 10 years, but as the decade draws to a close it looks likely to continue for as long as the funding is in place. Eight years in, it has made some stand-out findings.
While symptoms may not show up for many years after an athlete has left the ring, the research found that, in some cases, changes in certain areas of the brain can be detected by MRI within just a year of exposure to repetitive head impacts.
These changes correlate with a decline in performance on tests of cognitive function, such as memory and thinking tasks.
The effects also differed between active and former fighters, with some evidence to suggest the disease actually progresses more quickly once a boxer retires.
In older, former athletes, the changes were also influenced by an individual’s genetic make-up.
Bernick says: “In some individuals you can track change over time, in certain areas of the brain, and it seemed to differ between active and retired fighters. Once they retire, there’s a subset of people that have this progressive process which may affect different areas of the brain. This is really interesting stuff as we try to develop ways to identify who might be at risk of CTE.”
The study has also discovered blood markers – certain proteins released from injured brain fibres which leak out of the brain and can be measured in the blood.
These markers could be used to identify brain injury and follow recovery, alongside changes in MRI imaging that may be able to track an injury.
It is hoped this will lead them to identifying the “breaking point” – the point at which repeated head trauma begins to cause cognitive problems for a boxer, and could lead to serious brain conditions such as CTE.
Researchers are also exploring how changes in behaviour correlate with brain imaging changes. Behavioural issues are common in the early stages of CTE, with symptoms including impulse control problems, aggression, paranoia and depression.
In April 2019, Bernick and colleagues published a paper linking symptoms of depression in some athletes to structural brain changes associated with CTE, though the neurologist is careful not to present this as cause and effect.
“Depression is complicated, because there’s the issue of what happens in the brain and the issue of other surrounding factors, such as family history, life circumstances, drug and alcohol use, which makes it difficult to tease out how much is really from the head impact.
“The prevalence of depression in our group was the same as the general population of men at that age, but in those who have depression there was a correlation with smaller regional volumes in certain areas of the brain.
“It suggests that there is a relationship between what’s changing in the brain and the manifestation of depression and other behavioural changes.”
There is also preliminary evidence to suggest that symptoms of depression could even appear before noticeable cognitive changes in someone with CTE.
Far from banning boxing, however, researchers hope that the findings will help to guide new practices to improve brain health in the sport and ultimately, make it safer.
“There is no question that there is great value in boxing,” says Bernick. “It’s an outlet, and there’s a clear societal benefit to these sports.
“It’s just a matter of how we make them safe.
“In the US, even though the main risk of fighting is to the brain, there is no requirement for any brain tests other than an MRI scan.
“Looking at the risk factors, whether genetic, environmental or lifestyle, might help to protect an athlete as they play these sports.”
Few boxers have spoken publicly about CTE, perhaps for fear of giving the sport a bad name.
Yet, those in the industry have been right behind the study from the beginning, says Bernick, particularly representatives of the NAC, which provides the athletes for the study.
“The first goal of the NAC is to advocate and protect the safety of unarmed combatants,” says Dr Timothy Trainor, consulting physician to the NAC.
“When we partnered with the Ruvo Centre years ago, it was the vision of both the NAC and the Ruvo Centre to see if we could make meaningful progress in the diagnosis and treatment of CTE and brain injury. Anything we can do to promote the safety of an inherently risky sport is our first objective.
“The NAC has stressed the importance of this study to all of our licensed unarmed combatants, including boxers, mixed martial artists and kickboxers, and we are hoping the data gleaned from the study can better help us protect the athletes from harm, both short and long term.”
As a result of the study, the NAC is now looking into ways to actively protect its fighters, and identify potential problems sooner.
“We have learned that we need to focus more on functional studies as opposed to static anatomy tests like MRI/MRA,” says Trainor.
“Certainly, the study is pointing us in other directions that need further study.
“One specific test that we have looked at extensively is the ‘C-3 Test’, a cognitive function test performed on an iPad.
“We have tried to implement this in our jurisdiction; however, the logistics of conducting such a test have so far proved insurmountable.
“We are continuing to try to find tests that will not be logistically prohibitive to the athletes.”
But, he adds: “Just the fact that the studies are being performed has raised the awareness of brain health to the fighters, trainers, and all involved in these sports.”
Things in the boxing ring are certainly changing. Fighters are reportedly sparring less and choosing their opponents more carefully.
In a 1973 study by British pathologist John Corsellis, the boxers participating were exposed to between 300 and 700 bouts over the course of their careers, in addition to sparring and other training.
Today a professional boxer would rarely see upwards of 50 fights before retirement.
But as far as Bernick is concerned, this in no way means athletes today don’t run a substantial risk of suffering a neurological hangover from their careers.
“What we’ve learned from 40 to 50 years ago may not be exactly what the risk is for modern day fighters,” he says.
“But we know that the more exposure you have to head trauma, the higher the risk of CTE.”
Thankfully, many in the boxing world appear to be waking up to the dangers of the sport and taking research evidence on board.
Bernick would like to see such recognition in other sports too.
“If all sports took some responsibility for the long-term health of these athletes, not just when they are playing, I think it would be a real step forward for safety in sports.”
Study highlights cycling’s concussion blind spot
Most cyclists don’t know that helmets don’t protect against concussion, researchers have found.
Eighty-seven per cent of cyclists believe helmets can prevent concussion, according to a survey of cyclists in New Zealand.
While most participants said they wore a helmet when cycling, many misunderstood how to best use helmets to help prevent head injury.
Nine in ten agreed that a helmet should be replaced after a fall, but just over a third had not replaced their helmet after an accident and continued to use it.
Many respondents reported cracking or otherwise damaging their helmet, but did not feel at any increased risk of concussion as a result.
Reasons for not replacing a damaged helmet included the cost of buying a replacement, perceiving helmets as overrated, and seeing others riding with a damaged helmet.
The researchers, from Auckland University of Technology, found that younger cyclists and those who had previously had a concussion (59 per cent of respondents) demonstrated a better understanding of concussions than other participants.
However, of those who had experienced concussion, almost 18 per cent had never sought medical treatment for their concussion.
Participants were asked at the end of the survey if they had any further comments, and from these, the researchers concluded that concussion is underestimated and misunderstood, and that there is a need for a cycling-specific policy regarding concussion.
Most participants understood that concussions can occur without hitting their head or losing consciousness, and could recognise the symptoms associated with concussion, including dizziness, amnesia, confusion, headache, poor balance, fatigue and nausea.
However, the researchers also found confusion between the symptoms of concussion and those of more severe brain injury, which could affect whether riders seek appropriate medical help.
“The study found high to very high levels of concussion knowledge, although knowledge of the function of helmets was very low,” the paper states.
“There appeared to be discrepancies between knowledge and attitudes, and behaviour toward health care-seeking and replacing a helmet after a hit to the head, highlighting the complexities around decision-making.”
The researchers call for clear, consistent public health messaging around concussion across sports, as well as cycling-specific guidance on issues such as helmet use from cycling organisations.
“While there’s very good evidence of reduced injury severity and reduced poor outcomes, there’s no data to suggest bicycle helmets prevent concussion,” confirms Willie Stuart, clinical associate professor at the University of Glasgow’s Institute of Neuroscience & Psychology.
“They are similar design to all other protective headgear in intent, and various studies of headgear in various sports show no concussion benefit,” he says.
“Bicycle helmets are not intended for concussion protection, but are intended to mitigate against fractures, scalp injuries, penetrating injuries and so on, which can complicate traumatic brain injury.”
Helmet testing in labs is typically based on drop tests, which measure linear impact, Stuart says, whereas rotational forces are a key factor in concussion – something helmets do not help to prevent.
Scientists identify neurons that control hibernation-like behaviour
The dream of suspended animation has long captivated the human imagination, reflected in countless works of mythology and fiction, from King Arthur and Sleeping Beauty to Han Solo.
By effectively pausing time itself for an individual, a state of stasis promises to enable the repair of lethal injuries, prolong life and allow for travel to distant stars.
While suspended animation may seem a fantasy, a strikingly diverse array of life has already achieved a version of it.
Through behaviours like hibernation, animals such as bears, frogs and hummingbirds can survive harsh winters, droughts, food shortages and other extreme conditions by essentially entering into biological stasis, where metabolism, heart rate and breathing slow to a crawl and body temperature drops.
Now, Harvard Medical School neuroscientists have discovered a population of neurons in the hypothalamus that controls hibernation-like behaviour, or torpor, in mice, revealing for the first time the neural circuits that regulate this state.
The team demonstrated that when these neurons are stimulated, mice enter torpor and can be kept in that state for days. When the activity of these neurons is blocked, natural torpor is disrupted.
Another study published simultaneously by the University of Tsukuba in Japan also identified a similar population of neurons in the hypothalamus.
By better understanding these processes in mice and other animal models, the authors envision the possibility of one day working toward inducing torpor in humans—an achievement that could have a vast array of applications, such as preventing brain injury during stroke, enabling new treatments for metabolic diseases or even helping NASA send humans to Mars.
“The imagination runs wild when we think about the potential of hibernation-like states in humans. Could we really extend lifespan? Is this the way to send people to Mars?” said study co-lead author Sinisa Hrvatin, instructor in neurobiology in the Blavatnik Institute at HMS.
“To answer these questions, we must first study the fundamental biology of torpor and hibernation in animals,” Hrvatin said. “We and others are doing this—it is not science fiction.”
To reduce energy expenditure in times of scarcity, many animals enter a state of torpor. Hibernation is an extended seasonal form of this. Unlike sleep, torpor is associated with systemic physiological changes, particularly significant drops in body temperature and suppression of metabolic activity.
While common in nature, the biological mechanisms that underlie torpor and hibernation are still poorly understood.
The role of the brain, in particular, has remained largely unknown, a question that drove the research efforts of Hrvatin and colleagues, including co-lead author Senmiao Sun, a graduate student in the Harvard Program in Neuroscience, and study senior author Michael Greenberg, the Nathan Marsh Pusey Professor and chair of the Department of Neurobiology in the Blavatnik Institute at HMS.
The researchers studied mice, which do not hibernate but experience bouts of torpor when food is scarce and temperatures are low.
When housed at 22°C (72°F), fasting mice exhibited a sharp drop in core body temperature and a significant reduction in metabolic rate and movement. In comparison, well-fed mice retained normal body temperatures.
As mice began to enter torpor, the team focused on a gene called Fos—previously shown by the Greenberg lab to be expressed in active neurons. Labeling the protein product of the Fos gene allowed them to identify which neurons are activated during the transition to torpor throughout the entire brain.
This approach revealed widespread neuronal activity, including in brain regions that regulate hunger, feeding, body temperature and many other functions.
To see if brain activity was sufficient to trigger torpor, the team combined two techniques—FosTRAP and chemogenetics—to genetically tag neurons that are active during torpor. These neurons could then be re-stimulated later by adding a chemical compound.
The experiments confirmed that torpor could indeed be induced—even in well-fed mice—by re-stimulating neurons in this manner after the mice recovered from their initial bout of inactivity.
However, because the approach labelled neurons throughout the entire brain, the researchers worked to narrow in on the specific area that controls torpor. To do so, they designed a virus-based tool that they used to selectively activate neurons only at the site of injection.
Focusing on the hypothalamus, the region of the brain responsible for regulating body temperature, hunger, thirst, hormone secretion and other functions, the researchers carried out a series of painstaking experiments.
They systematically injected 54 animals with minute amounts of the virus covering 226 different regions of the hypothalamus, then activated neurons only in the injected regions and looked for signs of torpor.
Neurons in one specific region of the hypothalamus, known as the avMLPA, triggered torpor when activated. Stimulating neurons in other areas of the hypothalamus had no effect.
“When the initial experiment worked, we knew we had something,” Greenberg said. “We gained control over torpor in these mice using FosTRAP, which allowed us to then identify the subset of cells that are involved in the process. It’s an elegant demonstration of how Fos can be used to study neuronal activity and behavioural states in the brain.”
The team further analysed the neurons that occupy the region, using single-cell RNA sequencing to look at almost 50,000 individual cells representing 36 different cell types, ultimately pinpointing a subset of torpor-driving neurons, marked by the neurotransmitter transporter gene Vglut2 and the peptide Adcyap1.
Stimulating only these neurons was sufficient to induce rapid drops in body temperature and motor activity, key features of torpor. To confirm that these neurons are critical for torpor, the researchers used a separate virus-based tool to silence the activity of avMLPA-Vglut2 neurons. This prevented fasting mice from entering natural torpor, and in particular disrupted the associated decrease in core body temperature. In contrast, silencing these neurons in well-fed mice had no effect.
“In warm-blooded animals, body temperature is tightly regulated,” Sun said. “A drop of a couple of degrees in humans, for example, leads to hypothermia and can be fatal. However, torpor circumvents this regulation and allows body temperatures to fall dramatically. Studying torpor in mice helps us understand how this fascinating feature of warm-blooded animals might be manipulated through neural processes.”
The researchers caution that their experiments do not conclusively prove that one specific neuron type controls torpor, a complex behaviour that likely involves many different cell types. By identifying the specific brain region and subset of neurons involved in the process, however, scientists now have a point of entry for efforts to better understand and control the state in mice and other animal models, the authors said.
They are now studying the long-term effects of torpor on mice, the roles of other populations of neurons and the underlying mechanisms and pathways that allow avMLPA neurons to regulate torpor.
“Our findings open the door to a new understanding of what torpor and hibernation are, and how they affect cells, the brain and the body,” Hrvatin said. “We can now rigorously study how animals enter and exit these states, identify the underlying biology, and think about applications in humans. This study represents one of the key steps of this journey.”
The implications of one day being able to induce torpor or hibernation in humans, if ever realized, are profound.
“It’s far too soon to say whether we could induce this type of state in a human, but it is a goal that could be worthwhile,” Greenberg said. “It could potentially lead to an understanding of suspended animation, metabolic control and possibly extended lifespan. Suspended animation in particular is a common theme in science fiction, and perhaps our ability to traverse the stars will someday depend on it.”
Additional authors include Oren Wilcox, Hanqi Yao, Aurora Lavin-Peter, Marcelo Cicconet, Elena Assad, Michaela Palmer, Sage Aronson, Alexander Banks and Eric Griffith.
An update on Parkinson’s research
Despite the impact of COVID-19 across many sectors, Parkinson’s research continues at pace with studies across the world shining new light onto the disease, as NR Times reports.
Despite 30 years of research, not a single therapy has been found to successfully delay or stop the progression of Parkinson’s disease (PD), a slowly progressive disorder that affects movement, muscle control and balance.
It is the second most common age-related neurodegenerative disorder, affecting about three per cent of the population by age 65, and up to five per cent of individuals over the age of 85.
Each potential cure for PD has to go through three clinical trial phases to test its safety, whether it shows signs of improving PD, and whether there is any meaningful benefit to people with PD.
Running a clinical trial is a huge logistical, costly, and time-consuming undertaking. For a single new therapy this process can take the best part of a decade.
“The current way we do trials in Parkinson’s is too slow and inefficient,” explained Camille Buchholz Carroll from the Applied Parkinson’s Research Group at the University of Plymouth.
“We need to develop new ways of doing trials such as the Multi Arm Multi Stage (MAMS) trial platform, which will speed up the process and bring us closer to finding a cure, faster. We have the opportunity to learn from the experience in these other conditions and design a new trial that will work for people with Parkinson’s.”
MAMS trial platforms already exist for prostate, renal, and oropharyngeal cancer and are currently being developed within the UK for other neurodegenerative disorders such as progressive multiple sclerosis (PMS) and motor neuron disease (MND).
MAMS trials test many potential therapies in parallel (multi-arm), transitioning seamlessly through various phases (multi-stage), i.e., from a phase II safety and efficacy study to a phase III trial.
Early analyses allow unsuccessful therapies to be replaced. At the interim checkpoint, ineffective arms can be dropped and replaced by new treatment arms, thereby allowing for the continuous evaluation of interventions.
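The multi-arm, multi-stage idea can be illustrated with a toy simulation. In the sketch below, all arm names, response rates, sample sizes and the drop threshold are invented for illustration only; the point is simply that several arms run in parallel and ineffective ones are dropped at an interim checkpoint rather than running to completion.

```python
import random

random.seed(0)  # fixed seed so the illustrative run is repeatable

def run_mams_trial(arms, interim_n, final_n, drop_threshold):
    """Toy sketch of a multi-arm multi-stage (MAMS) trial.

    Each arm is a (name, true_response_rate) pair; each simulated
    patient responds (True) with that probability. At the interim
    checkpoint, arms whose observed response rate falls below
    drop_threshold are dropped; the survivors continue recruiting
    to the final analysis.
    """
    outcomes = {}
    survivors = []
    for name, effect in arms:
        interim = [random.random() < effect for _ in range(interim_n)]
        rate = sum(interim) / interim_n
        if rate < drop_threshold:
            outcomes[name] = ("dropped at interim", rate)
        else:
            survivors.append((name, effect, interim))
    for name, effect, interim in survivors:
        extra = [random.random() < effect for _ in range(final_n - interim_n)]
        responses = interim + extra
        outcomes[name] = ("completed", sum(responses) / final_n)
    return outcomes

# Hypothetical arms: a placebo, a weak candidate and a stronger one
arms = [("placebo", 0.20), ("drug_A", 0.22), ("drug_B", 0.45)]
results = run_mams_trial(arms, interim_n=100, final_n=400, drop_threshold=0.30)
print(results)
```

In a real MAMS platform the interim decision uses pre-specified statistical boundaries rather than a raw threshold, and dropped arms are replaced by new candidates, but the structure is the same: one shared infrastructure evaluating many treatments in parallel.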
Dr Carroll and colleagues explore how the challenges of drug selection, trial design, stratification and outcome measures, and the type and stage of PD to be tested, have been met in promising MAMS trials instituted to address other diseases, including the STAMPEDE prostate cancer trial, the Motor Neuron Disease Systematic Multi-Arm Adaptive Randomized Trial (MND-SMART) and the UK MS Society’s 2018-2022 Research Strategy.
“There are many promising drugs in the pipeline that have the potential to slow down the progression of PD, but putting that hypothesis to the test is still a long and cumbersome process,” notes Prof Bas Bloem, co-editor-in-chief of the Journal of Parkinson’s Disease.
“The new approach described holds great promise for facilitating this complex procedure, so that we can gather the necessary evidence for new treatments much quicker than before. Patients will certainly applaud this development as well!”
The authors stress that to maximise the potential of a MAMS platform trial running over many years and interrogating many research questions, it is crucial that there is a pipeline in place that will continuously identify and evaluate suitable drug candidates.
Furthermore, outcome measures have to be chosen that are sensitive enough to changes in disease progression over interim stages as well as the full duration of the trial.
Other studies are taking different approaches to relieve the symptoms of Parkinson’s disease. For example, biomedical engineers at Duke University have used deep brain stimulation based on light to treat motor dysfunction in an animal model of the disease.
Succeeding where earlier attempts have failed, the method promises to provide new insights into why deep brain stimulation works and ways in which it can be improved on a patient-by-patient basis.
“If you think of the area of the brain being treated in deep brain stimulation as a plate of spaghetti, with the meatballs representing nerve cell bodies and the spaghetti representing nerve cell axons, there’s a longstanding debate about whether the treatment is affecting the spaghetti, the meatballs or some combination of the two,” said Warren Grill, the Edmund T. Pratt Jr School Distinguished Professor of Biomedical Engineering at Duke.
“But it’s an impossible question to answer using traditional methods because electrical deep brain stimulation affects them both as well as the peppers, onions and everything else in the dish. Our new light-based method, however, is capable of targeting just a single ingredient, so we can now begin teasing out the individual effects of activating different neural elements.”
“Neurons being stimulated with optogenetics don’t generally respond very quickly, and it seemed to me that the researchers [in a previous study] were flashing their lights faster than the neurons could keep up with,” said Grill. “The data bore this out, as the neurons appeared to be responding randomly rather than in sync with the flashes. And previous research that we conducted showed that random patterns of deep brain stimulation are not effective at relieving symptoms.”
It took more than a decade for Grill to be able to test his theory, but two recent developments allowed him to follow his hunch. Researchers developed a faster form of optogenetics called Chronos that could keep up with the speeds traditionally used in deep brain stimulation.
And Chunxiu Yu, a research scientist with expertise in optogenetics, joined Grill’s laboratory. Also contributing to the work in Grill’s laboratory were Isaac Cassar, a biomedical engineering doctoral student, and Jaydeep Sambangi, a biomedical engineering undergraduate.
In the new paper, Yu embedded the Chronos optogenetics machinery into the subthalamic nucleus neurons of rats that have been given Parkinson’s disease-like conditions in one-half of their brains. This model helps researchers determine when a treatment is successful because the resulting physical movement symptoms only occur on one side of the rat’s body.
They then delivered deep brain stimulation using light flashes at the standard 130 flashes per second.
As Grill first suspected nearly 15 years ago, the technique worked, and the rats’ physical symptoms were substantially alleviated.
Perhaps the most important result is simply that the technique worked at all. Besides offering a much clearer look at neural activity by removing electrical artifacts, the ability to deliver deep brain stimulation to precise subsets of neurons should allow researchers to begin probing exactly which parts of the brain need to be stimulated and how therapies might be tailored to treat different motor control symptoms on a case-by-case basis.
As their next experiment in this line of research, Grill and his colleagues plan to recreate this same study but in the hyperdirect pathway – the spaghetti instead of the meatballs – to see what its individual contribution to relieving symptoms might be.
Elsewhere, Parkinson’s disease researchers have used gene-editing tools to introduce the disorder’s most common genetic mutation into marmoset monkey stem cells and to successfully tamp down cellular chemistry that often goes awry in Parkinson’s patients.
The researchers used a version of the gene-editing technology CRISPR to change a single nucleotide – one molecule among more than 2.8 billion pairs of them found in a common marmoset’s DNA – in the cells’ genetic code and give them a mutation called G2019S.
In human Parkinson’s patients, the mutation causes abnormal over-activity of an enzyme, a kinase called LRRK2, involved in a cell’s metabolism. Other gene-editing studies have employed methods in which the cells produced both normal and mutated enzymes at the same time. The new study is the first to result in cells that make only enzymes with the G2019S mutation, which makes it easier to study what role this mutation plays in the disease.
“The metabolism inside our stem cells with the mutation was not as efficient as a normal cell, just as we see in Parkinson’s,” says Marina Emborg, professor of medical physics and leader of the University of Wisconsin-Madison team, whose work is supported by the National Institutes of Health.
“Our cells had a shorter life in a dish. And when they were exposed to oxidative stress, they were less resilient to that.”
The mutated cells shared another shortcoming of Parkinson’s: lacklustre connections to other cells. Stem cells are an especially powerful research tool because they can develop into many different types of cells found throughout the body.
When the researchers spurred their mutated stem cells to differentiate into neurons, they developed fewer branches to connect and communicate with neighboring neurons.
Scientists have long known that clumps of a damaged protein called alpha-synuclein build up in the dopamine-producing brain cells of patients with Parkinson’s disease. These clumps eventually lead to cell death, causing motor symptoms and cognitive decline.
“Once these cells are gone, they’re gone. So if you are able to diagnose the disease as early as possible, it could make a huge difference,” says Cecilia Lindestam Arlehamn, PhD, research assistant professor at the La Jolla Institute for Immunology (LJI) and first author of a new study co-led by LJI scientists, which adds to growing evidence that Parkinson’s disease is partly an autoimmune disease.
The research could one day make it possible to detect Parkinson’s disease before the onset of debilitating motor symptoms – and potentially to intervene with therapies that slow its progression.
The new findings shed light on the timeline of T cell reactivity and disease progression. The researchers looked at blood samples from a large group of Parkinson’s disease patients and compared their T cells to a healthy, age-matched control group.
They found that the T cells that react to alpha-synuclein are most abundant when patients are first diagnosed with the disease.
These T cells tend to disappear as the disease progresses, and few patients still have them ten years after diagnosis.
The researchers also did an in-depth analysis of one Parkinson’s disease patient who happened to have blood samples preserved going back long before his diagnosis.
This case study showed that the patient had a strong T cell response to alpha-synuclein ten years before he was diagnosed with Parkinson’s disease. Again, these T cells faded away in the years following diagnosis.
“This tells us that detection of T cell responses could help in the diagnosis of people at risk or in early stages of disease development, when many of the symptoms have not been detected yet,” says Professor Alessandro Sette, who co-led the study.
“Importantly, we could dream of a scenario where early interference with T cell responses could prevent the disease from manifesting itself or progressing.”
Professor David Sulzer of Columbia University, who co-led the study, added: “One of the most important findings is that the flavour of the T cells changes during the course of the disease, starting with more aggressive cells, moving to less aggressive cells that may inhibit the immune response, and after about 10 years, disappearing altogether.
“It is almost as if immune responses in Parkinson’s disease are like those that occur during seasonal flu, except that the changes take place over ten years instead of a week.”
Meanwhile, neuroscientists at York University in Toronto have found five different models that use non-motor clinical variables – such as sense of smell, frequently dozing off or thrashing about during dreams – as well as biological variables to more accurately predict early-stage Parkinson’s disease.
Their five-model analysis is one of the first utilising only non-motor clinical and biologic variables. Some models performed better than others but all distinguished early stage (preclinical) Parkinson’s disease from healthy, age-matched controls, with better than 80 per cent accuracy.
The models may assist in more timely administration of future treatments as they become available, according to the study published in Frontiers in Neurology today.
In the study, two separate analyses were conducted: one for the classification of early Parkinson’s disease versus controls, and the other for classification of early Parkinson’s versus SWEDD (scans without evidence of dopamine deficit).
The term SWEDD refers to the absence, rather than the presence, of an imaging abnormality in patients clinically presumed to have Parkinson’s disease.
Facilitated and more accurate prediction of early-stage, de novo Parkinson’s can allow those positively diagnosed to adopt lifestyle changes such as regular physical exercise early on that can improve mobility and balance, says Joseph DeSouza, associate professor of the Department of Psychology at York University.
Researchers used cross-sectional, baseline data from the Parkinson’s Progression Markers Initiative (PPMI).
The PPMI data used was confined to non-motor clinical variables (e.g. sense of smell, daytime sleepiness, presence of rapid eye movement sleep behaviour disorder, age, etc.) and biologic variables (e.g. cerebrospinal fluid alpha-synuclein, tau protein, beta-amyloid 1-42, etc.).
Five different model types were “trained” to produce models that could prove useful in helping to differentiate early-stage Parkinson’s pathology.
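To make the idea concrete, the sketch below shows how a simple binary classifier can be trained to separate “early Parkinson’s” from “control” groups using non-motor scores. This is an illustrative toy only: the features (a smell-test score and a daytime-sleepiness score), the synthetic data, and the plain logistic-regression model are all assumptions for demonstration, not the actual models or PPMI data used in the York University study.

```python
import math
import random

random.seed(42)

def make_samples(n, mean_smell, mean_sleep, label):
    # Each sample: ([smell_score, sleepiness_score], label)
    # Hypothetical assumption: patients score lower on smell tests
    # and higher on daytime sleepiness than controls.
    return [([random.gauss(mean_smell, 1.0), random.gauss(mean_sleep, 1.0)], label)
            for _ in range(n)]

# Synthetic cohort: 100 "early PD" (label 1) and 100 "controls" (label 0).
data = make_samples(100, 2.0, 6.0, 1) + make_samples(100, 6.0, 2.0, 0)
random.shuffle(data)

def sigmoid(z):
    z = max(-30.0, min(30.0, z))      # clamp to avoid overflow
    return 1.0 / (1.0 + math.exp(-z))

# Train a logistic-regression model with stochastic gradient descent.
w, b, lr = [0.0, 0.0], 0.0, 0.05
for _ in range(200):
    for x, y in data:
        p = sigmoid(w[0] * x[0] + w[1] * x[1] + b)
        err = p - y                   # gradient of the log-loss
        w[0] -= lr * err * x[0]
        w[1] -= lr * err * x[1]
        b -= lr * err

# Training accuracy on this well-separated toy data should be high;
# the real study reported better than 80 per cent on held-out groups.
correct = sum((sigmoid(w[0] * x[0] + w[1] * x[1] + b) > 0.5) == bool(y)
              for x, y in data)
accuracy = correct / len(data)
print(f"training accuracy: {accuracy:.2f}")
```

The design point mirrors the study’s: no motor measurements appear anywhere in the feature set, so any separation the model achieves comes purely from non-motor signals.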