The Need to Know Sooner - Racing Against The Clock

by Women’s Brain Health Initiative

Current diagnosis of Alzheimer’s disease (AD) happens after symptoms of cognitive decline are evident, yet by that point brain damage is often already severe. The disease is known to develop over many years, perhaps decades, before symptoms appear. Researchers have been seeking an easy, accurate method for detecting Alzheimer’s disease before symptoms surface, and before so much damage is done in the brain. They are looking for biomarkers that indicate early stages of the disease. (A biomarker is a measurable substance that indicates the presence or severity of a disease; blood sugar levels, for example, are evaluated to determine whether someone has diabetes.)

BIOLOGICAL MARKERS

Looking for changes in the brain that are detectable with neuroimaging is one area of biomarker research. Three molecular imaging tracers have been developed and approved that bind to amyloid beta (Aβ) plaques in the brain so that the plaques can be seen during a positron emission tomography (PET) brain scan.

While the presence of Aβ plaques in the brain is a characteristic of Alzheimer’s disease, not all people with such plaques have the disease or ever develop it.

So, these tests are not definitive on their own for reliably diagnosing or predicting Alzheimer’s. Other biomarkers being researched are levels of Aβ and tau (the proteins associated with Alzheimer’s brain plaques and tangles) in cerebrospinal fluid (CSF), the clear liquid that cushions the brain and spinal cord. CSF can be sampled via a spinal tap, or lumbar puncture. These tests are not considered definitive on their own for diagnosing AD, and are currently used primarily in research settings. Blood tests are also being studied for their potential to diagnose or predict the risk of Alzheimer’s disease.

No biomarkers for Alzheimer’s disease have yet been validated, meaning that not enough research has been done at this point to say that any biomarker accurately and reliably indicates the presence of the disease. Validated biomarkers would be a significant benefit for researchers who are working on developing treatments that target the disease earlier in its progression. But the discovery of validated biomarkers would also open up the possibility of tests that let individuals know early that they are going to (or are likely to) get Alzheimer’s disease, and that would result in challenging personal and ethical decisions.

CURRENT PROCESS FOR ALZHEIMER’S DIAGNOSIS

Examination of brain tissue during an autopsy is currently the only way to definitively diagnose Alzheimer’s disease. There is no single test that can be used to assess whether a living person has Alzheimer’s disease. Doctors must use a variety of tools during an assessment, and they will be looking to rule out any other illness that might be causing the symptoms. According to the Alzheimer’s Association, “experts estimate a skilled physician can diagnose Alzheimer’s with more than 90 percent accuracy.”

TESTING

If a test existed that would tell you definitively that you are going to get Alzheimer’s disease in 20 years, would you want to know? What if it told you definitively 10 years in advance, or five? While the answers to those questions are likely to vary from person to person, it is easier to weigh the pros and cons of the decision if the test results are 100% accurate. In reality, that is not usually the case. More likely, tests will be developed that predict, with some degree of accuracy less than 100%, that you have a certain level of risk of getting Alzheimer’s disease.

Now consider this hypothetical situation: if a test were available that could tell you whether you are at high risk of developing Alzheimer’s, and that test was correct 97% of the time, would you want to take it? In that scenario, the benefits of “knowing” are questionable because you do not really know 100%, and the harms of “knowing” could be quite high, especially if you receive a false positive or a false negative result. Is it helpful to know you will probably get Alzheimer’s, but might not? Most people would probably not find that information helpful and, so, would opt not to take such a test. (The sketch at the end of this section illustrates why even a 97%-accurate positive result leaves real room for doubt.)

Given the uncertainty of predictive testing and the lack of a cure or the limited effectiveness of current treatments, widespread screening for Alzheimer’s disease is not recommended. Screening involves looking for signs of early disease or the presence of risk factors in healthy individuals, as is done in routine colonoscopies, for example.
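To make the false-positive concern concrete, here is a minimal sketch of the arithmetic behind a “97% accurate” result, using Bayes’ rule. The figures are illustrative assumptions only, not numbers from this article: a 10% lifetime base rate of the disease and a hypothetical test that is 97% sensitive and 97% specific.

```python
# Hedged sketch: Bayes' rule for a hypothetical "97% accurate" predictive test.
# All numbers below are illustrative assumptions, not figures from the article.

def positive_predictive_value(base_rate: float, sensitivity: float, specificity: float) -> float:
    """Probability of actually developing the disease, given a positive test result."""
    true_positives = sensitivity * base_rate            # sick people correctly flagged
    false_positives = (1 - specificity) * (1 - base_rate)  # healthy people wrongly flagged
    return true_positives / (true_positives + false_positives)

# Assumed: 10% lifetime base rate; test is 97% sensitive and 97% specific.
ppv = positive_predictive_value(base_rate=0.10, sensitivity=0.97, specificity=0.97)
print(f"Chance a positive result is correct: {ppv:.0%}")  # roughly 78%
```

Under these assumed numbers, roughly one in five positive results would be a false positive, which is why a positive result from even a highly accurate test is not the same as truly knowing.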
It is pre-symptomatic prediction, in particular, that has evoked strong criticism. David Le Couteur, FRACP, PhD, of the University of Sydney, Australia, and his colleagues argue that screening for ‘pre-dementia’ or mild cognitive impairment (before symptoms are evident) will lead to overdiagnosis and “have potential adverse consequences for individual patients, resource allocation, and research” (in their 2013 article “Political drive to screen for pre-dementia: not evidence based and ignores the harms of diagnosis,” published in BMJ). They go on to emphasize that overdiagnosis of mild cognitive impairment needs to be avoided since “only 5-10% of people with mild cognitive impairment will progress to dementia each year, and as many as 40-70% of people do not progress or their cognitive function may even improve.”

In the long run, as testing methods are further developed and, hopefully, effective treatments or a cure are discovered, the ethical questions around whether predictive testing and screening are a good idea may shift. Of course, using predictive tests for research purposes, which is already happening now, is critical, as it allows researchers to seek treatments that would address the disease early. Screening for research purposes, to find study participants, is different from widespread screening in a clinical setting.

EARLY DIAGNOSIS

While widespread screening for Alzheimer’s disease is not recommended, there is consensus that anyone who is experiencing worrisome cognitive changes should get assessed. In other words, there is consistent support in favour of being diagnosed as soon as possible. The Alzheimer’s Society of Canada, for example, has a campaign to raise awareness of the importance of getting diagnosed early, and the Alzheimer’s Society in the UK runs the Right to Know campaign (https://www.alzheimers.org.uk/righttoknow), which advocates for early detection and support after diagnosis.

Once symptoms appear, there are obvious benefits to getting medically evaluated. Many treatable conditions cause dementia-like symptoms, and addressing those, or ruling them out, as early as you can is important. If you are diagnosed with Alzheimer’s disease early, you have an opportunity to play a role in planning for the changes ahead, while you still have the mental capacity to make important decisions about your finances and long-term care. Other benefits of being diagnosed early include having an opportunity to make lifestyle changes that might delay the disease’s progression, and perhaps participating in a clinical trial to help find a cure or treatment.

Even at this later stage, when the consensus is to seek a diagnosis if you are experiencing cognitive difficulties, there can be understandable resistance. Alzheimer’s is a dreaded disease, so, of course, many people are reluctant to see their doctors when they are experiencing problems with memory. But not getting a diagnosis if you have the disease does not change the fact that you have it, and it robs you of a chance to make the most of the time you have left.

Source: MIND OVER MATTER
