This week there were two huge stories about Alzheimer's disease (AD). First was the failure of Solanezumab, an Eli Lilly-developed drug. Unlike earlier drugs, which were typically small molecules, Solanezumab is an antibody-based therapeutic designed to bind to excess amyloid beta and facilitate its removal from the brain.
There are three main reasons a drug fails to make it through clinical trials: (1) the drug does not work (that is, it fails to produce the biological effect it was designed for), (2) the drug generates untenable side effects, or (3) the drug has acceptable side effects and works as intended, but does not significantly improve the condition of the patients.
We can immediately rule out the side effect issue, as Solanezumab was tolerated reasonably well during trials. The new drug was abandoned because it failed to achieve any meaningful effect. This could stem from a technical failure in the drug's design that renders it ineffective or, more worryingly, from the drug actually removing amyloid as intended yet still producing no measurable improvement (in this case, performance on standardized memory and dementia exams).
Because this is a CNS disease, determining whether a drug is doing its job can be challenging. I'm certain the Lilly scientists tested their new therapeutic's ability to remove amyloid in nonhuman models, but that's a process optimized in other animals; the Lilly guys can't just give the drug to a human patient and then directly verify that they've removed a bunch of amyloid from a living brain.
That being said, I'm not sold on the defective-medicine excuse for the trial's failure. You see, this is the THIRD drug specifically targeting beta amyloid removal that has failed. The odds of three different programs all failing to generate a working therapeutic against the same target are really low. This leads us to the final possibility - amyloid-beta deposition in the brain has jack shit to do with AD. (Side note: I've previously written about why this may well be the case and, while it gives me no pleasure to say, the skeptics may have it right this time.)
I think even non-scientists know by now that amyloid beta deposition as a causative factor in AD has been the mainstream hypothesis for a long time. Maybe not for much longer. Eli Lilly brass put so much stock in the idea (no pun intended) that they cherry-picked results from a previous trial of this drug and ran another clinical trial looking only at people with limited evidence of dementia*. Gotta give them credit for chasing it all the way to the end of the rabbit hole but, as they say in North Carolina, that dog won't hunt.
So what's going to happen next?
For the industry part of it, I have a pretty good idea - they're getting the fuck out. Pragmatism is a little more abundant here in industry than in academia; no one is about to invest in another expensive AD boondoggle in the face of the recent results. My guess is pretty much everyone is going to stop investing in new AD programs until there are some new, viable targets to pursue.
As a corollary, those future targets I just mentioned usually come from academic research. Industry doesn't like to devote much energy to plinking around on really early-stage stuff. That's academia's job. Unfortunately, it was academic scientists who rammed home the beta amyloid thing in the first place. This creates a really interesting dynamic - on one hand, you've got a shitload of academics (and at least a few industry scientists) who have built their careers on the idea that amyloid beta is a causative factor in AD. Now we have three clinical trials saying otherwise. To accept the results of these trials, academic honchos have to admit that they were wrong and move on to the next thing. That's not something they're likely to do, and even if they did, why would industry listen to people who (potentially) pointed everyone down the wrong road for the last 30 years? Why would these same thought leaders still be credible? It's a genuine impasse, and it's natural to wonder how it will be reconciled.
Fuck if I know.
But it's not all bad news, kids. The second Alzheimer's-related news story, a study on dementia prevalence, is baffling but far more positive. One of the JAMA journals ran an article reporting changes in the prevalence of dementia among seniors. The researchers performed a (not as detailed as I'd like) survey on a population of seniors curated to accurately reflect the makeup of American society, then compared the 2012 group to a group that responded to a similar survey in 2000 (I guess it took an additional four years before someone could add all the responses up). This type of study is commonly used to track changes in disease frequency over time.
Much of the data was predictable. We're marginally fatter and more prone to diabetes than we used to be. However, we're also much less demented. The study documented a roughly 25% decrease in the proportion of seniors suffering from dementia in the twelve years between surveys.
Are. you. shitting. me? A 25% drop in dementia in only 12 years is MASSIVE. Imagine every nursing home being 25% less crowded with dangerous, hard-to-manage Alzheimer's patients and you get some idea of how big a deal this is.
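To be clear about what a "25% drop" means here, it's a relative change between the two survey waves. Here's a minimal sketch of that arithmetic; the prevalence figures are illustrative placeholders of mine, not necessarily the study's exact numbers:

```python
# Sketch of the arithmetic behind a "25% relative drop" claim.
# The prevalence values below are illustrative placeholders, NOT
# taken from the study -- they just show how a relative decrease
# between two survey waves is computed.

def relative_decrease(p_old: float, p_new: float) -> float:
    """Fractional decrease of p_new relative to p_old."""
    return (p_old - p_new) / p_old

# e.g., if ~11.6% of the 2000 cohort and ~8.8% of the 2012 cohort
# screened positive for dementia (hypothetical numbers):
drop = relative_decrease(0.116, 0.088)
print(f"relative decrease: {drop:.0%}")  # prints "relative decrease: 24%"
```

Note that a relative drop of this size is much larger than the change in absolute percentage points, which is part of why the headline number sounds so dramatic.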
On paper, the study looks legit. The researchers attributed this decrease to increased "cognitive reserve", a hypothetical buffer between normal and crazy that is established by additional education (full disclosure: I'm not convinced this actually exists). The new cohort of patients had more education than the 2000 cohort. A whopping 0.9 years more. The researchers argue that the extra year of community college may be dropping dementia by 25%. Me skeptical. But there's a good reason that the authors are stuck with such dumb theories - the limited scope of the survey didn't give them many other factors to examine.
So I'll go ahead and say what a responsible researcher can't: there's a good chance that a changed environmental factor is responsible for this change. Genetic changes aren't passed on rapidly enough to explain the drop. Other factors don't seem to be at play: there are no patient-driven changes in measured factors, no new medicines for dementia (see above). So why the sudden difference?
Hazarding a guess, I'd wager we removed an environmental toxin or risk factor that was present for the 2000 seniors but not for the 2012 group. It could be an additive effect of environmental protections that emerged over recent decades (with older people soaking up more of the hypothetical dementia-causing toxins over their lifetimes). Candidates include the elimination of leaded gasoline, the reduction in smoking, the addition of fluoride to our water supply, etc. - factors that could matter either alone or additively.
However, there's a significant problem with my guess: a gradual accumulation of environmental protections should produce a correspondingly gradual, continuous decline in dementia rates, not the relatively rapid drop observed in the study.
Here's a more aggressive hypothesis: extrapolating from the study dates, we could guess that this environmental change occurred approximately 1931-1943 (the inflection point between studies; an event in this period would affect current seniors, while the older individuals measured in 2000 would have mostly died off). Looking at those dates, it could be related to conditions created by the World Wars, or it could be something else, possibly something indirectly related to the conflicts. One possibility is widespread vaccination: the mass enlistment of Americans into the military dramatically increased vaccination rates (it was mandatory for new recruits). Neurocognitive benefits from vaccination wouldn't be apparent for years, even if anyone was looking for them. No idea how likely it is, so file it under fun theories for now.
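The date extrapolation can be made concrete with some back-of-the-envelope cohort arithmetic. The cutoffs here (65 as "senior", ~90 as an effective upper age bound) are my assumptions, not the study's, so the window comes out slightly different from the 1931-1943 guess above:

```python
# Back-of-the-envelope cohort arithmetic for the mid-century guess.
# Assumptions (mine, not the study's): "senior" means age 65+, and
# almost nobody in either survey is older than ~90. Birth years that
# appear ONLY in the 2012 senior cohort bracket the candidate window
# for a one-time environmental change.

def senior_birth_years(survey_year: int, min_age: int = 65, max_age: int = 90):
    """Approximate (earliest, latest) birth years for seniors at survey time."""
    return (survey_year - max_age, survey_year - min_age)

cohort_2000 = senior_birth_years(2000)  # (1910, 1935)
cohort_2012 = senior_birth_years(2012)  # (1922, 1947)

# Birth years present in the 2012 cohort but absent from the 2000 one:
window = (cohort_2000[1] + 1, cohort_2012[1])
print(window)  # prints "(1936, 1947)" -- in the neighborhood of the post's guess
```

Shifting the assumed age cutoffs by a few years shifts the window accordingly, which is why any date range derived this way is only a rough bracket.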
Wrapping up, it's worth stating that not all dementia is AD, so there's no direct correlation between these two stories, but the results are certainly suggestive and most definitely interesting. Moving forward, though, the two stories seem to push us toward an environment where little private-sector investment will go into AD. Maybe, if we're exceedingly lucky, rates of dementia will continue to fall, giving epidemiologists some real runway to operate on. Then again, with President-elect Trump railing against both the Affordable Care Act and the EPA, we might just learn whether my environmental hypothesis is right when the 2072 dementia survey comes back.
*Alzheimer's trials are the worst for so many reasons: it takes a long time to see clear effects, and your subjects are old people, who tend to die off during the trial (meaning you have to enroll many more patients) and who can also develop dementia-like symptoms from overmedication or other issues. Big $.
Noah's Inner Monologue
Scribblings of a man who can barely operate an idiotproof website.