Why would this be on the national news today? Is it true? Does the manufacturer fail to support their claim? Let’s find out.
Last year, I was asked to look into the claims that are made on a box of nutritional supplements. The product was Prevagen®. Here is what I found.
The manufacturer claims that the active ingredient was originally found in jellyfish. However, the active ingredient in the product you can buy in the store is grown in a lab, and it is covered by a patent.
On the box there is a claim that the product improved memory in a clinical trial. The chart on the Prevagen® box is problematic. It is labeled, “Prevagen Improves Memory”. This certainly grabs one’s interest. Is it simple eye candy? The chart appears to show a 20% improvement in memory over 90 days. Above the chart are the words, “Clinically Tested”. This is followed by “In a computer assessed, doubleblinded, placebo controlled study …”. My years of scientific training kicked in. Doubleblinded? Who writes this stuff? I needed to dig deeper.
The prevagen.com website provided a good amount of information about this study. Before we dig into the research, it is a good idea to recall some of the ground rules to doing scientific research.
1. Statistical analysis of a study has something in common with playing pool. In certain games of pool, the person preparing to take a shot must call out which pocket a preselected ball will go into.* Likewise, declaring in advance which way the observed variable will move doubles your chances of showing statistical significance, if you claim it.
2. If you cannot tell which way the study will go, the standard for identifying a statistically significant effect is p < .05. What does this mean? In the inequality p < .05, the letter p stands for the probability that the effect could have happened by chance. The value .05 is one way to write 5%. So, if there is a 5% or smaller chance that the effect occurred through random events, then it is considered interesting. Statisticians call this statistically significant.
3. If you do know which way your variable will move, then you can claim statistical significance for any result where the probability that the effect occurred by chance is less than or equal to 1 in 10. That is p < .10.
Knowing the direction of the effect also helps in planning an experiment. This is similar to the advantage in pool of knowing where the ball will fall.
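The ground rules above can be made concrete with a small sketch. The numbers below are purely illustrative (a hypothetical test statistic, not data from the Prevagen® study); the point is only that calling the direction in advance halves the p-value, just as calling the pocket halves the ways you can "win".

```python
# Illustrative one-tailed vs. two-tailed p-values for a simple z-test,
# using only the Python standard library. The observed statistic z = 1.8
# is a made-up number chosen to land between the two thresholds.
import math

def survival(z):
    # P(Z > z) for a standard normal variable, via the complementary
    # error function: P(Z > z) = 0.5 * erfc(z / sqrt(2))
    return 0.5 * math.erfc(z / math.sqrt(2))

z = 1.8  # hypothetical observed test statistic

one_tailed = survival(z)            # direction was called in advance
two_tailed = 2 * survival(abs(z))   # no direction called

print(f"one-tailed p = {one_tailed:.4f}")  # ~0.0359 -> clears p < .05
print(f"two-tailed p = {two_tailed:.4f}")  # ~0.0719 -> fails p < .05
```

The same observed effect passes the p < .05 bar one-tailed but fails it two-tailed, which is exactly why a study that cannot predict direction is held to the stricter two-tailed criterion.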
Now, here is where the red flags begin to go up. Remember the statement on the box, “In a computer assessed, doubleblinded, placebo controlled study …”. The fact that the people who planned the study could not say in advance what sort of effect this product would have on memory means that they could not claim significance at the 0.10 level. The study had to be assessed at p < .05.
With this in mind, let’s look at the difference between the placebo group and the active ingredient group. The link to this data is labeled “Madison Memory Study: A Randomized, Double-Blinded, Placebo-Controlled Trial of Apoaequorin in Community-Dwelling, Older Adults”. Fortunately, this analysis of study data is written well enough to understand what the product manufacturer based their claim on.
The Prevagen® study ran from December 2009 to April 2011. The prepared report for Quincy Bioscience is dated August 2016. The study ran for about a year and five months. Yet, the data shown on the product box represents only 90 days.
The study's post hoc design section shows that the two groups were further subdivided based on a ranking along the spectrum of Alzheimer’s Dementia (AD). Note that a low AD score suggests normal aging and not AD. Among those who scored low on the AD range, there appeared to be some positive change for some people. But this is really digging into the data. Before digging into the data, subdividing people into various groups and searching for some correlation between some participants and the active ingredient, there is an important step to take.
The first step in any analysis that compares a placebo group with an active ingredient group is to show a difference between these two groups. On page 3 of the Madison Memory Study, significance is set to a value of p < .05. This is what is called a 2-tailed criterion for statistical significance. The software used was the industry gold-standard software produced by the SAS Institute of Cary, North Carolina.
The first line of the RESULTS section on page 4 reads as follows: “While no statistically significant results were observed over the entire study population …”. This means that there was no discernible difference between the placebo and the supplement groups. All you can really do at this point is data mine the failed study and look for possible candidates for further study.
In the post hoc analysis, some statistically significant numbers were observed when two subgroups were examined. The effects, or correlations, were found among people experiencing minimal or no cognitive impairment. This sort of experience is usually referred to as Age Associated Cognitive Decline (AACD). The statistical review rightly stated that this group is precisely the sort of people who might benefit from supplementation.
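Why is this sort of after-the-fact subgroup hunting treated with suspicion? A hedged illustration (not the study's actual analysis, and the subgroup counts below are arbitrary stand-ins): if a failed study is sliced into k independent subgroups, each tested at the .05 level, the chance that at least one subgroup looks "significant" by luck alone is 1 − (1 − .05)^k, which grows quickly.

```python
# Illustrative family-wise false-positive rate for post hoc subgroup tests.
# With k independent tests at alpha = .05 and NO real effect anywhere,
# the chance of at least one spurious "hit" is 1 - (1 - alpha)**k.
alpha = 0.05
for k in (1, 5, 9):  # arbitrary illustrative numbers of subgroup tests
    p_any = 1 - (1 - alpha) ** k
    print(f"{k} subgroup tests -> {p_any:.0%} chance of a false positive")
    # prints roughly 5%, 23%, and 37% respectively
```

This is why a subgroup result pulled from a study that failed overall is, at best, a candidate for a fresh, pre-registered trial rather than evidence in itself.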
This author has no insight into exactly what the FDA is looking at when it suggests that Prevagen® does not work as advertised. I would have to say that there is a strong probability that the failure to show a statistically significant difference between the two groups is at the top of the FDA’s list of problems with this supplement.
If you have read this far, you deserve some good news. There are good supplements for mild memory problems. Among the best is one mentioned in my December 6, 2016 email. You can read about it here:
* I learned this while serving in the U.S. Air Force. In one of the headquarters buildings, there was a pool table to provide some diversion for the troops.
Gerald Epling, Ph.D.
Nationally Certified Psychologist