Does Caloric Restriction Delay Aging in Humans?
Bob Seitz, Ph. D.
September 1, 2006
(Updated: June 7, 2008)
I think that the short answer is "yes."
One of the reasons you might ask is that I'm aware of four different models that have recently been advanced by leading gerontologists that explain why, on evolutionary grounds, CR (calorie restriction) won't extend the maximum lifespans of humans. At least three of these cite different causes, so at most one or two of these theories could be correct. But before examining them and deciding which are wrong, the first question is: does CR extend the maximum lifespans of humans?
Again, I think the short answer is, "yes", to the tune of, perhaps, 11-to-21 additional years of average and maximum lifespan extensions for 28% calorie restriction, plus another 7 years for a healthy lifestyle. We've known since April, 2004 (and only since April, 2004), that calorie restriction induces the same dramatic changes in humans that it does in other animals. This startling revelation seems to me to have rendered moot any divergence of opinion about whether CR works in humans; obviously, it does. But before we get into further discussion and the basis for my projections, I want to talk about:
Reasonable Expectations for the Life Extension Benefits of Human CR If CR Works in Humans As Well As It Does in Some Animals
(1) Although CR might be able to produce a 150-year-old human, if it works as well in humans as it does in some rats, it seems to me that the chances of it happening to me personally might be vanishingly small. (One in billions? One in trillions?).
The expectation that the calorie-restricted will live to the age of 116 or beyond would probably be no greater, in my estimation (after giving extra points for a very healthy lifestyle), than 1 in 20 for someone who becomes 28% calorie restricted from the age of 20 onward. (I'm using 28% CR in my examples because that's the average level of CR given for the 18 volunteers in the original April, 2004, Washington University PNAS paper. This assumes that the control group was wolfing down 2,500 (kilo)calories a day, versus a dainty 1,800 calories a day for the CR group.)
This is important because I think that inflated expectations of what calorie restriction may be expected to do might have led to unwarranted disappointments. A little arithmetic suggests that realistic levels of CR would normally engender lifespans well within the normal human envelope.
I think that, realistically, perhaps the best that someone embracing 28% CR this year at the age of 20 can hope from CR alone is to survive to 100 to 110, living independently and driving his or her car until the last year or two--like Ralph Cornell (see below). He or she will be physiologically younger than his or her age. This might be similar to what happens to centenarians who weren't/aren't calorie restricted. For example, an acquaintance, Miss Ola Wicks, who died in a nursing home at 101, was living by herself and driving to and from church until she entered a nursing home at 99. At 101, her mind still seemed clear, and her hair remained dark. Our next-door neighbor's mother, Mrs. Thomas Locke, was working as her son's office manager until she was 98 (when he sold his business and retired). Like Ola Wicks, she also died at 101. Our friend, Ben Harris, is still riding his Harley at 93. (He plays a wicked game of dominoes.) Allen Drake is 94, but looks and acts like a healthy, vigorous 74. My Aunt Addie called me from Florida when she was 90 and said, "Bob, can you believe it? Here I am 90 years old and my hair is still jet black. There isn't a gray hair in it!" She died 10 years later at the age of 100. (Aunt Addie was overweight all the years that I was around her.) Uncle Glen was 95 when he was cut down in the prime of life by a sudden stroke. He was driving his truck around town, was cracking jokes, and was living alone in his house (after Aunt Eva died).
Of course, we all hope and expect that other mechanisms such as SENS for extending 95th-percentile human lifespans will come into being over the next few decades. And we all hope to reach "escape velocity"--the point at which the pace of aging remediation advances more rapidly than aging.
Could one of the promising gambits in this department be CR mimetics which might allow the calorie-restricted to safely deepen their levels of CR?
I'm thinking that another important point to remember is that the average and maximum life expectancies that apply to us are those of our birth cohorts... not the 77.5 years for neonates born in 2003.
For example, if you were born in 1980, your average life expectancy is about 74... 70 if you're a guy and 77.6 if you're a gal. Your 95th-percentile... one-in-20... life expectancy is about 95... 92 if you're a he and 98 if you're a she.
If you were born in 1950, your average life expectancy is about 68... 65.5 if you're a man and 71 if you're a woman. Your 95th-percentile life expectancy is about 91... 87.5 if you're a male and 94 if you're a female.
What Might We Reasonably Expect?
Based upon "classical" CR animal models, we might hope that 40% calorie restriction from infancy to death would lead to a 40% extension in average and maximum (defined as the average lifespan among the longest-lived decile) human lifespans.
These individuals would grow up stunted, and would require fewer calories to maintain their adult weights.
(1) Given our present lack of knowledge regarding the long-term effects of CR in humans, it's, perhaps, not practical to rear children on 40% calorie-restricted diets, and certainly, for those of us who are already adults, CR from infancy onward is no longer an option.
(2) Most of us may be unwilling, and may even be ill-advised at this stage of our knowledge of CR effects in humans, to restrict calories by more than about 30%.
(3) There are animal studies indicating that most of the benefits of CR can still be realized if CR is begun in early adulthood.
In that case, human CR might (hopefully) afford someone who begins 28% CR at 20 something like a 23% extension in average and maximum lifespans.
Replacing Maximum Life Expectancy With the 95th-Percentile Life Expectancy
It's customary in animal studies to define the maximum lifespan as the average lifespan of the longest-lived 10% of the animals. However, the longevity data for human populations is generally given in percentiles, so I'm going to substitute the 95th percentile lifespan as my maximum lifespan. (Because the survival curve falls off rapidly with age and is highly asymmetric, the average lifespan of the longest-lived decile would be a year or two greater than the 95th percentile lifespan.)
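The two "maximum lifespan" definitions can be compared on any list of lifespans. The sketch below is only an illustration of the two conventions on a made-up, right-skewed toy population; the numbers are not real survival data:

```python
# Compare the two "maximum lifespan" definitions:
# (a) the mean lifespan of the longest-lived decile (the animal-study convention), and
# (b) the 95th-percentile lifespan (the substitute used here for human data).

def top_decile_mean(lifespans):
    """Mean lifespan of the longest-lived 10% of the population."""
    ranked = sorted(lifespans, reverse=True)
    n = max(1, len(ranked) // 10)
    return sum(ranked[:n]) / n

def percentile_95(lifespans):
    """Age that 95% of the population does not outlive (nearest-rank method)."""
    ranked = sorted(lifespans)
    k = max(0, int(0.95 * len(ranked)) - 1)
    return ranked[k]

# A right-skewed toy population: most die in their 70s, a handful reach 100+.
sample = [75] * 85 + [90] * 5 + [95] * 5 + [100, 104, 108, 114, 122]
```

On this sample the top-decile mean (102.3) exceeds the 95th-percentile age (95), illustrating why, for an asymmetric survival curve, the decile average runs somewhat higher than the 95th percentile.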
For those of us who have initiated CR in mid- or later life, our reasonable expectations for CR-induced life extension might be, I should think, somewhat less than 23%.
For a Twenty-Year-Old Beginning CR in 2006:
The average life expectancy for all 20-year-old U. S. citizens (born in 1986) is about 76. However, this includes some Americans with pretty unhealthy life styles. What's the average life expectancy for someone with very healthy habits? I'm going to suppose an average life expectancy of 85 for a 20-year-old who embraces a healthy lifestyle in 2006 and practices it for the rest of his or her life. This is about the average life expectancy for a present-day, Californian, vegan, 7th-Day Adventist.* In that case, a CR-induced 23% extension in average lifespan would add 20 years (23% of 85) to the average age of 85, leading to a projected average lifespan for someone who starts CR in 2006 at the age of 20 of 105 years, and a date-of-death of 2091. Thus, this individual might be the longest-lived in his or her family, but by 2090, even without any anti-aging advances, centenarians should be as common as 92-year-olds are today, and he or she might not even warrant special mention in the local newspaper. (This assumes that we are able to reverse our epidemic of obesity and that 95th-percentile life spans continue to increase over time at the present rate of a decade per century.)
* - Note that the adoption of a Seventh Day Adventist healthy lifestyle is enough to increase average life expectancy by 7-or-so years... a sizable fraction of what moderate CR would do.
The 95th percentile life expectancy for all 20-year-old Americans (born in 1986) is 95. I'm going to suppose a 95th-percentile, projected lifespan for today's 20-year-old, non-calorie-restricted, Seventh-Day Adventists of 95. In other words, I'm assuming that a very healthy lifestyle will increase the average, but not the maximum life expectancy. Using the same projected, CR-induced lifespan extension of 20 years, we would arrive at a 95th percentile CRONie lifespan of 115, and a projected date of death of 2101. Today, this would qualify our 20-year-old for an entry in the global list of super-centenarians, but by 2101, even without any anti-aging advances, super-centenarians ought to be as common as centenarians are today. She might earn an article in her local newspaper, but she wouldn't make it into international studies.
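The arithmetic behind these projections is simple enough to lay out explicitly. The little function below is only a restatement of the assumptions already given (the 85- and 95-year baselines and the 23% extension are the assumptions, not facts):

```python
def project_lifespan(baseline_years, extension_years, birth_year):
    """Add an assumed CR-conferred extension to a baseline life expectancy
    and return the projected lifespan and the projected year of death."""
    lifespan = baseline_years + extension_years
    return lifespan, birth_year + lifespan

# The 20-year-old of the example, born in 1986:
# 23% of the healthy-lifestyle average baseline of 85 years is about 20 years.
extension = round(0.23 * 85)                  # -> 20
print(project_lifespan(85, extension, 1986))  # average: (105, 2091)
print(project_lifespan(95, extension, 1986))  # 95th percentile: (115, 2101)
```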
For a Fifty-Year-Old Beginning CR in 2006:
The average lifespan for all U. S. citizens born in 1956 is about 75. I'm going to suppose an average, healthy-lifestyle lifespan of 84, with an average 28%-CR-induced lifespan extension of 15 years, leading to an average CR lifespan of 99 years, and a projected date of death of 2055. I'm going to project a 95th-percentile lifespan for today's Seventh-Day-Adventist, 50-year-olds of 95 years, and a projected, 28%-CR-enhanced lifespan extension of 15 years, leading to a projected 95th percentile life expectancy of 110, and a projected date-of-death of 2066. (By 2066, 110-year-olds should be as common as 104-year-olds are today.)
I've used a 15-year, 28%-CR-conferred life extension for people starting CR at 50 because this seems to be what is turning up in the Washington University studies. However, Dr. Steve Spindler's study of 40% CR in mice started at 19 months (equivalent to about 47½ in humans?) led to an increase in maximum lifespan of about 6 months (equivalent to about 15 years in humans?). If I interpolate linearly for 28% CR, I come up with about an 11-year, rather than a 15-year, increase in maximum lifespan. In that case, the average lifespan for people starting CR at 50 would become 95, and the 95th-percentile lifespan would be about 106 (comparable to 100 today).
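The interpolation above is just a linear scaling of the 40%-CR figure down to 28% CR; a minimal sketch, assuming the extension scales proportionally with the degree of restriction (which is itself an assumption):

```python
# If 40% CR begun in middle age confers ~15 human-equivalent years
# (Spindler's murine result), linear scaling gives the 28%-CR figure.
def interpolate_extension(known_cr, known_years, target_cr):
    """Scale the lifespan extension linearly with the degree of restriction."""
    return known_years * target_cr / known_cr

ext = interpolate_extension(0.40, 15, 0.28)   # about 10.5 years, rounded to ~11 in the text
```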
Because the Washington University studies are showing 15-year rollbacks for 28% CR for all ages, I'm still entertaining a 15-year average and maximum CR-induced lifespan extension for those who start 28% CR as 50-somethings. In that case, the median age of demise for our longevinauts would be 99, and the 95th-percentile age of curtailment would be 110.
My hope is that other methods of extending average and maximum lifespan will appear before we have to put lifelong CR to the test.
These Might Be the Most Optimistic Ballpark Life Extensions We Can Practically Expect from 28% CR Begun Later in Life
Table 1, below, summarizes these guesstimates of upper bounds upon what practical CR (28%) begun in adult life might offer.
Begin 28% CR at: | Average CR-Augmented Lifespan | "Maximum" CR-Augmented Lifespan*
Age 20 | 105 years | 115 years
Age 50 | 99 years | 110 years
Age 50** | 95 years** | 106 years**
* - "Maximum" defined as the 95th-percentile lifespan.
** - Uses more-conservative numbers based upon Dr. Steve Spindler's murine studies showing that starting 40% CR in mice at 19 months led to an increase in maximum lifespan of 6 months. (This is also consistent with the observation that, within two months after beginning CR, the CR mice's mortality dropped to 1/3rd that of the control group.)
To summarize, the most optimistic lifespan extensions that we would realistically expect in humans from 28% CR alone might be of the order of 11 to 20 years, rather than the 25 to 35 years that might be theoretically possible**, and might fall unrecognized within the envelope of existing human lifespans.
** - Might such extensions be possible if we augmented CR with CR mimetics?
Is there any way to improve on these numbers?
In theory, embracing 40% CR at 20 (assuming that 40% CR didn't actually shorten one's lifespan) might add perhaps 25-28 years to one's life, leading to an average lifespan of 109-112, and a 1-in-20 lifespan of 120-123, but I would hope that other therapeutic maneuvers will afford comparable gains with lower risk (with the suggestion that CR mimetics might be a dark horse in this race).
I'm personally optimistic that the 15-year figure might be more accurate. Rodents usually die of cancer; humans usually die of circulatory failures. A 15-or-more-year rollback in circulatory-failure risks might suggest a 15-year lifespan extension.
Obviously, these are crude estimates.
Humans Are a Wild Population, and Human Longevity Is More Complicated Than That of Inbred Animals in a Controlled Environment
(1) The human population is huge, with, perhaps, a billion or more people whose records are tracked by gerontologists. Super-centenarians like Jeanne Calment, the current human longevity record holder at 122, occur only once in several hundred million people. It's tempting, but misleading, to compare human longevity with that of extreme outliers like Jeanne Calment.
(2) Presumably, lab animals aren't exposed to other lab animals with contagious diseases, whereas humans are (e. g., the flu).
(3) Lab animals aren't allowed to get drunk, smoke, and ride their Harleys the wrong way down one-way streets, nor do they fall and break their hips. Lab animals die of degenerative diseases, as do many but not all elderly human beings. (Aunt Florence died of the flu at 94. Grandpa Sheffield died at 87 of pneumonia while hospitalized for his ulcers. Mother died at 81 because of low potassium associated with the diuretic she was taking. Dad had his crippling stroke at 89 as a consequence (we speculate) of a blood clot he incurred, ultimately dying in a nursing home at 94. Medical mistakes are being cited as the number one killer of humans.) And you can't usually control what, and how much humans eat.
(4) Lab animals have no medical insurance, and aren't given individual life-saving medical attention.
(5) The food fed to lab animals can be carefully controlled, but humans operate on their own recognizance. If you binge one day and compensate the next, what effects does that have upon your "CR state"? How about overeating for a week and then undereating the following week?
(6) Because of the lack of control over human behavior, the average and 95th-percentile human lifespans would seem to me to be more difficult to ascertain than they are in animals.
The Maximum (95th-Percentile) U. S. Life Expectancy Has Increased 10 Years in the past 100 Years
The Average and 95th Percentile Human Life Expectancies
Average human lifespans have been steadily extended at a rate of about ¼ year per year for the past 160 years, presumably because of medical and public health interventions. The importance of this is dramatically demonstrated by
(1) the rise in average U. S. lifespan by about 28 years between 1900 and 2000, and
(2) the extension of 95th-percentile human lifespan from about 86 for people born in 1900 to a projected 96 for people born in 2000--an increase of about 10 years between 1900 and 2000!
The 99th-Percentile Human Life Expectancy
The 99th-percentile human life expectancy increased from about 92 for those born in 1900 to a projected 101.3 for those born in 2000... an increase of 9.3 years!
The number of centenarians was pegged at 31 per 100,000 for Americans born in 1900-1902, and is projected to rise to 2,363 per 100,000 by 2103... a 76-to-1 increase, from 0.03% to 2.4% of the population. (Like the 95th-percentile life expectancy, this increase in centenarians also corresponds to about a 10-year increase in life expectancy.)
This is quite a shift, comparable in magnitude to what moderate calorie restriction begun in middle age might afford.
Note that, beyond about the 90th life-expectancy percentile, there doesn't seem to be any fall-off in the 10-year-per-100-year rate of increase in these life expectancies as you go from the 90th-percentile frequencies to the centenarian frequencies.
Centenarian Life Expectancy
On the other hand, the life expectancy for centenarians has only risen from 1.58 years in 1900 to 2.6 years in 2003. (This seems strangely minimal, given a 76-fold increase in the frequency of centenarians.)
Is It Only the Average Life Expectancy That Has Increased?
It's often stated that maximum human lifespan hasn't changed--only the average. After all, some people lived to ripe old ages in the past. It's just that the survival curve is squaring up, with more people living longer, but with some fixed maximum age remaining unchanged.
I don't think that's the Occam's-Razor interpretation of what's happening. By any pragmatic metric, the observable maximum human lifespan is increasing, albeit at a slower rate than the average lifespan. This could be occurring because of better medical interventions or it might be a result of a better knowledge of life-prolonging lifestyles. Humans have benefited from antibiotics and vaccinations, improvements in water quality over the decades, and restorative surgical procedures. Or it could be happening because of some as-yet-unrecognized phenomenon. (Dr. Aubrey de Grey has raised the possibility that these steadily rising numbers might betoken an evolving increase in human lifespans fostered by the sheltered lives we lead that permit us, unlike wild populations, to grow old enough to die of degenerative diseases.)
This upsurge won't do us any good individually because we can't go back and be reborn, but it seems interesting to me from a theoretical standpoint.
Are These Increases in Average Life Expectancy a Result of Reduced Infant Mortality?
It's commonly claimed that the 28-year difference in the average U. S. life expectancy between 1900 and 2000 was engendered by a reduction in childhood mortality. However, a look at the remaining life expectancies for 20-year-old adults among those born in 1900 and those born in 2003 shows that only about 12 years of the 28-year difference in life expectancies between 1900 and 2003 can be attributed to deaths before the age of 20. The other 16 years occurred because of an increase in average adult lifespan.
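The split between childhood and adult mortality can be made explicit with remaining-life-expectancy figures at age 20. The sketch below uses hypothetical round numbers chosen only to match the proportions above (a 28-year total gain, ~16 years of it adult); the inputs are not the actual life-table values:

```python
def decompose_gain(e0_then, e20_then, e0_now, e20_now):
    """Split a gain in life expectancy at birth into an adult component
    (the change in expected age at death for those who reach 20) and a
    childhood component (the remainder, from reduced mortality before 20).
    e0_* are life expectancies at birth; e20_* are remaining years at age 20."""
    total = e0_now - e0_then
    adult = (20 + e20_now) - (20 + e20_then)
    return total, adult, total - adult

# Hypothetical inputs matching the article's proportions:
total, adult, childhood = decompose_gain(49, 43, 77, 59)   # -> (28, 16, 12)
```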
Dr. Jay Olshansky has predicted that human life expectancies may soon begin to fall because of the rising tide of obesity.
The bottom line: Maximum lifespan is easier to define in animal experiments than it is in human beings. However, by any reasonable measure, maximum human life expectancies have risen by about 10 years over the past century... a significant fraction of what CR might be expected to accomplish.
One take-home message from these paragraphs is that in comparing human life expectancies, we might want to remember that human life-expectancy data is changing fairly rapidly over time, and that we must be sure to consider the date associated with each life expectancy in comparing it with other life expectancies.
Medical and Lifestyle Interventions
Medical and lifestyle interventions are powerful players in the life expectancy game. The amelioration of heart disease, cerebrovascular disease, cancer, Alzheimer's disease, Parkinson's disease, respiratory diseases, type II diabetes, osteoporosis, etc., might not affect some abstract "maximum lifespan", but they would certainly increase our chances of living better longer. Heart and cardiovascular diseases are rapidly becoming preventable and treatable. (The British believe that heart disease could be nearly eliminated by 2010.) Preventive measures may exist for cancer, Alzheimer's disease, Parkinson's disease, type II diabetes, and osteoporosis, independently of CR. (Of course, avoiding smoking, radon, and certain types of pollution reduces your risk of lung cancer.) As time goes on, there will be more and more of these techniques, including, perhaps, effective stem cell treatments and organ replacements. And, of course, we hope that ultimately, there will be interventions that will go beyond what CR can accomplish in the amelioration of aging.
Focused Medical Interventions
Another key to improved individual longevity may lie in a medical program tailored to one's personal Achilles heels. Different individuals have different congenital vulnerabilities... high cholesterol levels in one, diabetic tendencies in another, and familial Alzheimer's disease in a third. These may profit from focused interventions. I think advancing medical knowledge, coupled with special attention to individual vulnerabilities, is going to lead to increases in both average and maximum lifespans. An example would be ex-President Ford's recent pacemaker and stent operations. This may keep him going for at least a few years beyond events that would have felled him fifty years ago.
So far, none of these interventions, such as antioxidants, have increased the maximum lifespans in animals, but new approaches such as Protandim offer the hope of a new dispensation in such techniques.
The bottom line: Human life expectancies are more complicated than clean, controlled lab experiments with mice.
The Case Against CR Extending Lifespans in Humans
Is it conceivable that theories explaining why CR doesn't lengthen lifespans in humans began to sprout when it was thought that CR had done next to nothing for the supposedly-20%-calorie-restricted Okinawans?
"Why Hasn't Calorie Restriction Done More for the Okinawans?"
One of the pieces of evidence that has (understandably) been cited to support the contention that calorie restriction doesn't delay aging in humans is the understanding (or at least, so I misunderstood) that calorie restriction has had a negligible effect upon the lifespans of the indigenous population of Okinawa.
The people of Okinawa were 38% calorie restricted as children and (allegedly) 20% calorie restricted as adults compared with the rest of the Japanese (who themselves are the longest-lived national population in the world), and yet, in 1995, the life expectancy of the Okinawans, at 81.2 years, was a paltry 1.3 years greater than the Japanese average of 79.9 years, and merely 4.4 years longer than Americans, at 76.8 years.
In contrast, the life expectancy of vegetarian California 7th-Day Adventists who met six criteria was, in 1976 to 1988 (1982?), 84.5 years: 83.3 for the men, and ... for the women. Since these life expectancies are for those who have already survived to the age of 30, I've endeavored to correct these numbers to life expectancies at birth. I subtracted the life expectancies at birth from the age-30 life expectancies in the latest life expectancy table for the U. S. population as a whole, and for U. S. males and U. S. females, in order to determine how much time surviving to the age of 30 added to their life expectancies. Then I subtracted these offsets from the age-30 life expectancies of the vegan Seventh-Day Adventists to estimate the life expectancies at birth for the Seventh-Day Adventists who met the six criteria (Table 1, below). This gave me an estimated 1982(?) life expectancy from birth for Seventh Day Adventists of about 83.1 years, or nearly two years more than the 1995 life expectancies of the Okinawans.
Since 1995, the life expectancies of the Okinawans have begun to fall, with men's life expectancies dropping 0.42 years over the 5 years between 1995 and 2000. Also, the longest-lived Japanese aren't the Okinawans but the residents of the island of Amami, which had 56.57 centenarians per hundred thousand inhabitants when Okinawa had 31.19 per hundred thousand. The oldest man on record, Izumi Shigechiyo, who died at 120 in 1986, was from Amami, as was Hongo Kamato, who died at 116. And these geriatric wonders attribute their storied longevities not to CR but to minerals, fish, and a sense of purpose (not to mention brown sugar)!
Table 1. 2003 Life Expectancies of the Entire U. S. Population at Birth, and 2001 Life Expectancies of Vegan Seventh Day Adventists
|2003 Life Expectancies, U. S. Population||1982(?) Life Expectancies, 7th-Day Adventists|
Comparisons between the Okinawans and the vegan Seventh Day Adventists are shown in Table 2.
What Table 2 shows us is that the average 1995 life expectancy of the Okinawans is 1.9 years less than that of the Californian, vegan Seventh Day Adventists who aren't on CR. The Okinawans eat an exemplary diet, exercise daily, and lead spiritual, low-stress lives, and (supposedly) had been calorie-restricted from birth onward, and yet, the calorie-unrestricted Seventh Day Adventists are outliving them! No wonder the five theorists are challenging the common understanding that CR delivers major lifespan extensions in humans!
But, as Mark Twain famously said, "Reports of my death have been greatly exaggerated." A not-yet-published paper in Biogerontology (thanks to Dr. Al Pater for calling attention to this paper) shows
(1) that the Okinawans were 17% calorie-restricted only until the late 1960s ("for about half their adult lives"), but apparently are no longer restricted, or at least, not at a 17% level. We know from Dr. Steve Spindler's murine CR studies that it doesn't take long (about 2 months in rodents) after the cessation of calorie restriction for the affected genes to largely revert to their pre-CR states. Even so, Willcox, Willcox, Todoriki, Curb, and Suzuki show in their June 30, 2006, not-yet-published paper that
(2) the maximum (99th-percentile) 2003 life expectancies of the Okinawans have been significantly extended (Table 3)!
You'll notice that the median life expectancy of the Japanese was 3.4 years longer than the median life expectancy for the U. S. population, but that the two countries' 99th-percentile lifespans were essentially the same (101.3 and 101.1 years, respectively). By contrast, the 99th-percentile lifespan for the Okinawans is shifted about 3.6 to 3.8 years to the right. If the Okinawans were still on calorie restriction, that would be a disappointment, but under the circumstances, it seems a happy lagniappe that even though the Okinawans are no longer 17% calorie restricted, there has, nevertheless, been a significant increase in their maximum lifespan.
Note that the 50th-percentile life expectancies in Table 3 are median life expectancies rather than average life expectancies.
You'll also notice that the Japanese live, on average, closer to their maximum lifespans than do either the Americans or the Okinawans.
Much of what we know about Okinawa has been provided by Drs. Bradley and Craig Willcox, and Dr. Makoto Suzuki.
3.6 or 3.8 years is a small fraction of the 11-to-21-year increases in the average and 95th-percentile life expectancies that I'm projecting for dedicated CRONies who start CR as adults, but these Okinawan numbers still stick out like sore thumbs above the maximum (99th-percentile) life expectancies of both the Americans and Japanese, which differ insignificantly by 0.2 years. These numbers are especially significant in that they signal either residual effects of a past CR regimen prior to 1970, or a current lowered level of CR (e. g., 5% CR?), or some combination of the two.
The five authors of the four theories that I mentioned at the beginning of this discussion may well and rightfully be expected to question these numbers.
(I think that 3.6 to 3.8 years of 99th-percentile life extension falls within the ambit of compatible calorie restriction effects in humans that one or more of the four models cited at the beginning of this paper would allow.)
It's also important to note that the 10-year rise in the 99th-percentile life expectancies in the United States is almost three times the putative increase in the life expectancies of the Okinawans. In other words, CR wouldn't have to be the only reason that the 99th-percentile life expectancy of the Okinawans has increased 3+ years.
However, in my estimation, the most significant information in this paper is the fact that the Okinawans haven't been on 20% calorie restriction, and haven't been on 17% calorie restriction for the past several decades, and that changes the whole ball game. This is important because the whole reason that the Okinawans have been referenced as an example of how CR has failed to extend human lifespans is because it was assumed that the Okinawans have been on 38% CR as children, and then on 17% CR as adults. Now we're learning that hasn't been the case since the late 60's.
There was already a hint of this in Drs. Bradley and Craig Willcox's and Makoto Suzuki's 2001 Random House book, "The Okinawa Program", pg. 23, where the authors mention that the ratio of total-cholesterol-to-HDL is 3.3 for the Okinawans, compared with my own 2.2-to-2.5 ratios, and the 2.5 ratio among the 18 calorie-restricted volunteers who participated in the bellwether April, 2004, Washington University Medical School study of calorie restriction in humans. This 3.3 TC/HDL ratio already suggests that they might not have been on full CR. Also, the Okinawans have an average BMI of 21, compared with 19.5 for the 18 CR-study participants... slim, but not as slim.
Even if CR hadn't extended the 99th-percentile life expectancies of the Okinawans, comparing the life expectancies of a third-world population to those of the United States body politic seems a bit unfair to the third-world population. As mentioned above, many factors, such as contagious diseases, sanitation, clean water supplies, and accidents play a role in human life expectancies.
The Purina Study of Labrador Retrievers
In 1987, the Purina company initiated a small study of 48 Labrador Retrievers, 24 of which were put on 25%-calorie-restricted diets starting at 8 weeks. The study lasted 15 years, until the last dog was hanged--well, OK: buried. The control group was somewhat calorie-restricted from the age of 3¼ onward to prevent obesity. The median age at death, summarized in Table 5 below, was 11.2 years for the control group, and 13 years for the CR group. The last dog in the control group died at 13.3 years, while the last dog in the CR group lived until it was 14.7 years... about 10.5% longer than the slightly restricted control dog. The median age at death for the CR group was 22 months (15%) greater than the median age for the control group. When only one member of the control group was left, 18 of the 24 members of the CR group were still running around. When the last member of the control group had died, 6 of the CR Labradors were still living. And the health of the CR group was much better than that of the control group at every age.
"AL" stands for "ad libitum", and represents the fully-fed control dogs. "CR", of course, stands for "calorie restricted".
The maximum life span, defined as the 90th percentile survival age, was a little longer in the CR group than in the control group, but the difference wasn’t statistically significant at the p<0.05 confidence level. (The oldest retriever in the CR arm outlived the oldest dog in the control group by about 17 months). Of course, the longest-lived 10% in each group consisted of 2.4 dogs, and the longest-lived dog in the control group and the longest-lived dog in the CR group consisted of one dog each.
The key issue was that although the CR group outlived the control group, they did so by 10%-to-15%, and not by the 25% that you'd have expected in mice--unless, of course, the dogs remained 25% calorie restricted after the control group became sufficiently "limited to prevent obesity". In that latter case, the CR effects might have been approximately on target, with a 15% average life extension, and a 10.5% maximum life extension beyond the corresponding slightly extended lifespans of the 10%-calorie-restricted control group.
I have the impression, though, that the control dogs at the times of their deaths were overweight and not 10% calorie restricted.
We would have expected about a 36-month lifespan extension in both the 50th- and the 90th percentile lifespans in the CR contingent of these Labs, and instead, we got 22 months and 17 months, respectively.
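The gap between expectation and observation is simple arithmetic. The sketch below (my own check, not part of the Purina study) applies the mouse-derived 25% figure to the control group's 11.2-year median, which comes out a bit under the 36 months cited above:

```python
# Back-of-the-envelope check of the expected vs. observed CR lifespan
# gains in the Purina Labrador study, using the medians quoted above.

control_median_years = 11.2   # median age at death, control group
cr_median_years = 13.0        # median age at death, CR group
mouse_extension = 0.25        # the ~25% extension typically reported in mice

# If the mouse result scaled directly to dogs:
expected_gain_months = control_median_years * mouse_extension * 12

# What the study actually observed:
observed_gain_months = (cr_median_years - control_median_years) * 12
observed_gain_pct = (cr_median_years / control_median_years - 1) * 100

print(f"expected gain: ~{expected_gain_months:.0f} months")   # ~34 months
print(f"observed gain: ~{observed_gain_months:.0f} months ({observed_gain_pct:.0f}%)")
```

However you round the baseline, the observed 22-month median gain falls well short of the 34-to-36 months a straight mouse-to-dog scaling would predict.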
Dr. de Grey's interpretation is that because weather-generated famines engender the calorie restriction phenomenon, a year or so is about the most you can expect from calorie restriction in any organism.
I don't know what happened with the Labrador retrievers. This is said to be the first formal study conducted with a longer-lived species than the mouse. As Dr. Aubrey de Grey has observed, there may be an animal-husbandry learning curve associated with different types of animals (although he notes that an improved knowledge of CR in mice has led to a perceived decrease in CR-modulated lifespans rather than an increase). One possibility might be that reducing the dogs' calorie intakes by reducing their dog chow by 25% led to malnutrition. Consuming three-quarters as much food without altering its composition means acquiring three-quarters of the nutrients--e.g., protein--that a full plate affords. Both the AL and CR groups were fed a diet that contained 21% protein as adults. However, protein turnover and absolute protein requirements rise in the calorie-restricted. This means that the CR group got 75% of the protein that the AL dogs got when they may have needed 100% to 120%. Understandably, the Purina researchers wouldn't have wanted to play fast and loose by feeding the CR arm of their kennel a different dietary composition than that of their control group. (This kind of information about absolute protein requirements may not have been available in 1987 when the Purina study began.) Is it possible that the calorie-restricted animals didn't get as much protein as they needed?
Comparing Dogs with People
Table 4, below, compares the Purina Labradors' average numbers with corresponding numbers for humans.
TABLE 4 - Dogs versus People
|Parameter|AL|CR|Dogs*|
|Time to baseline|91.2|41.4|-55%|

(Only this row of Table 4 survives here; the remaining rows, and the WU and Bio II entries for this row, were lost.)
* - percentage reduction comparing the CR dogs with the fully-fed dogs.
** - copied from Table 5.
*** - copied from the Biosphere II table.
The numbers in the AL (Ad Libitum) and CR (Caloric-Restricted) columns are for the Labrador retrievers. The numbers in the "Dogs" column represent the percentages by which the listed biomarkers for the 25%-calorie-restricted Labradors fell below those of the fully-fed canines. The numbers in the "WU" (Washington University) column show the percentages by which those same biomarkers for the eighteen 25%-calorie-restricted volunteers in the 2004 study lagged those of their control group. The numbers in the "Bio II" column are the percentage decreases in the listed parameters recorded by the 8 Biosphere II team members. I don't know their degree of CR (it was "severe"), but the team members were said to be in excellent physical condition and might have already been somewhat calorie-restricted before they began their CR regimen. The percentages are relative to their pre-CR values--i.e., the team members acted as their own controls.

(A side note on the National Institute on Aging rhesus monkeys discussed below: the control monkeys throw their food at the walls and at their caretakers and act angry and depressed. Even the calorie-restricted monkeys aren't eating all their food. Things got so bad in 2005 that Dr. Le Bourg's paper observes that CR-group mortality actually became greater than that of the controls.)
Looking at the data, two inferences suggest themselves.
(1) Every Labrador biomarker change is smaller, and in some cases drastically smaller, than the corresponding human change. In a manner similar to the Okinawans, for whom a TC/HDL ratio of 3.3 suggested less than full calorie restriction, the Labrador numbers may not quite jibe with 25% calorie restriction. They would be consistent with, maybe, 15% calorie restriction. Of course, that assumes that these biomarkers change in the same ways in Labradors as they do in humans, and that may not be the case.
(2) There doesn't appear to be any attenuation of CR effects as we go from a medium-sized mammal (the dog) to a large mammal (the human).
Perhaps, as was the case with the Okinawa studies, additional information will emerge over the coming years.
The results could also have to do with species-specific idiosyncrasies... viz., the fact that dogs are primarily carnivorous.
These Labrador retrievers lived something like 5 times longer than mice, and about 1/7th as long as humans.
The bottom line:
(1) The fact remains that the Labradors in the Purina study showed only about half the expected CR improvements in lifespans, and I don't know why.
(2) The Washington University study subjects, on 28% CR begun at mid-life, show considerably stronger calorie restriction responses than do the Labrador retrievers on 25% CR from infancy.
In 1987, Drs. George Roth, Donald Ingram and Mark Lane at the National Institute on Aging began a study of long-term calorie restriction with 30 monkeys. This has since expanded to nearly 200 monkeys with three basic age cohorts: very young, young-adult, and old (equivalent to the age of 60 in humans). From the program’s inception, the CR group has exhibited the characteristics common to other species on CR: lowered fasting insulin levels, lowered fasting blood sugar levels, lowered body temperatures, and elevated HDL and insulin sensitivity, to name a few. Dehydroepiandrosterone (DHEA) and alkaline phosphatase levels in the CR monkeys, thought to be meaningful barometers of aging, aren’t declining with age the way they are in the full-calorie group. The CR monkeys are smaller, younger-looking, and healthier than the ad libitum-fed monkeys. The CR monkeys have about 7% body fat, in comparison with 15%-20% body fat among the controls.
Some accidental deaths have occurred over the years, such as overcooked food causing deadly gastric bloat. This might muddy the waters when the time comes to crunch the final numbers.
These monkeys are now (in 2006) 19 years into the study. Rhesus monkeys have average life spans of about 27 years, and “maximum” life spans (defined as the average age at death of the longest-lived decile) of... 35(?) years (corresponding to human ages of 75 and 97?). It will be 2014 before the study's monkeys reach the 27-year mark.
At the Fourth Calorie Restriction Conference, it was revealed that the control group of monkeys in the NIA study are becoming so depressed in their middle age, cooped up in their cages, that they’re eating no more than the CR monkeys. They have to be cajoled, and fed candy to get them to eat.
The oldest member of the CR group, C58, set (I believe) a new longevity record by dying in 2005 at the age of 41. The second-oldest confirmed monkey life span was 36, with an unconfirmed report of another monkey who reached 39. These latter two monkeys weren't calorie-restricted, but they’re also the oldest monkeys on record (1 in 1,000? 1 in 10,000?) Note that C58 presumably wouldn’t have been calorie restricted until its last 18 years, when the NIA Aging in Non-Human Primates program began. That means it would have been about 23, equivalent to 69(?) in human terms, when its calorie restriction began.
National Institute on Aging researchers Don Ingram, George Roth, Matt Lane, M. A. Ottinger, S. Zou, R. de Cabo, and J. A. Mattison have just pre-published a Biogerontology paper in which they observe that "Evidence emerging from studies in rhesus monkeys suggests that their response to CR parallels that observed in rodents."
"Based on results emerging from long-term studies of dietary restriction in rhesus monkeys, we offer our views regarding whether dietary restriction can increase longevity in humans. Because lifespan data in monkeys remain inconclusive currently, we respond that 'we do not know for sure.' Based on the vast literature regarding the effects of healthy, low calorie diets on health and longevity in a wide range of species, including humans, and based on data emerging from monkey studies suggesting that dietary restriction improves markers of disease risk and health, we respond that 'we think so.'"
As was the case with the Purina chow experiment with Labradors, these researchers are plowing new ground. We are in the earliest stages of investigating CR in large mammals. Understandably, we are all eager to mine what meager data there is for whatever might be buried in it, but the earliest experiments are not always dependable. For example, the first experiments in which CR was initiated in mid-life reduced the lifespans of mice rather than extending them, but later studies showed that if CR began gradually, the lives of the mice were amply extended. If we had been hoping to utilize CR in humans at that time, we would have been disheartened over what turned out to be a false alarm.
There is one previous study of calorie restriction in monkeys, but there were only 8 monkeys in the calorie restricted group.
The Vallejo Study
Sixty healthy individuals aged 65 and up (with an average age of 72), living in a religious home for the elderly, were put on alternate-day calorie restriction, ingesting 2,300 calories on one day, and 900 calories (of fruit and milk) on the next. This CR contingent was compared with a matched control group of 60 normally fed residents who received the full 2,300 calories every day. At the end of three years, 6 (10%) of the calorie-restricted subjects had died, compared with 13 (22%) of the control group. In other words, the mortality rate in the CR group was about half that of the control group, but the difference wasn't statistically significant, not because the effect was slight, but because the numbers of deaths were so small (6 and 13). The calorie-restricted group had registered a little over half the trips to the hospital experienced by the control group. That difference was statistically significant, because the numbers of occurrences were much larger.
I inferred that the take-home message from this experiment was supposed to be that the subjects on the calorie-restricted diet didn’t live 10 or 15 years longer than the controls. But the study only lasted for three years. The transition from an ad libitum diet to a calorie restricted diet was presumably abrupt. The Vallejo subjects probably didn't get adequate nutrition in the light of what we know today about human nutrition. The subjects might have had occult pre-existing conditions such as atherosclerosis and incubating cancers. It would have taken a certain amount of time for the full effects of CR to kick in. Yet in spite of these limitations, during the three years when the experiment was running, CR roughly halved the mortality rate, and cut the hospital visits in half.
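Whether a 6-versus-13 split in groups of 60 clears the significance bar can be checked directly. The sketch below (my own, not part of the Vallejo analysis) computes the two-sided Fisher's exact p-value using only the standard library:

```python
# Fisher's exact test (two-sided) applied to the Vallejo mortality figures:
# 6 of 60 deaths in the CR group vs. 13 of 60 deaths among the controls.
# Standard library only: hypergeometric table probabilities via math.comb.
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher's exact p-value for the 2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    row1 = a + b          # size of the first group
    col1 = a + c          # total number of "events" (deaths)
    denom = comb(n, row1)

    def p_table(k):
        # Probability of observing k events in the first group.
        return comb(col1, k) * comb(n - col1, row1 - k) / denom

    p_obs = p_table(a)
    lo, hi = max(0, row1 + col1 - n), min(row1, col1)
    # Sum the probabilities of all tables at least as extreme as the observed one.
    return sum(p_table(k) for k in range(lo, hi + 1)
               if p_table(k) <= p_obs * (1 + 1e-9))

# CR group: 6 deaths, 54 survivors; controls: 13 deaths, 47 survivors.
p = fisher_exact_two_sided(6, 54, 13, 47)
print(f"two-sided p = {p:.3f}")  # comes out above the 0.05 threshold
```

So the roughly two-to-one mortality ratio, striking as it looks, doesn't reach significance with only 19 deaths between the two groups, which is exactly the point made above.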
The Washington University Medical School Studies
In April, 2004, a Washington University Medical School research team, funded by the National Institute on Aging, published a "bombshell" paper in the Proceedings of the National Academy of Sciences. The Washington University research team had conducted testing upon 18 volunteers who had been practicing calorie restriction for three to fifteen years, with an average calorie restriction duration of six years. The calorie restriction practitioners ranged in age (in 2004) from 35 to 82, with an average age of 50, and an average degree of calorie restriction of 28%.
Table 5, below, compares a subset of the 12 calorie-restricted subjects who had prior records of relevant parameters before they began calorie restriction with a set of age-matched controls. As Table 5 reveals, prior to the initiation of calorie restriction, the experimental group had atherosclerotic risk factors that are, perhaps, typical of those of the general U. S. population.
The original Washington University study published in April, 2004, as well as other short-term experiments, has shown that calorie restriction begun in adult life, and even beyond middle age, produces dramatic improvements in human health parameters (see Table 5) that are congruent in character and degree with those that caloric restriction induces in animals in which CR shifts the survival curve to the right.
Table 5: Risk Factors for Atherosclerosis
(Only the row labels of this table survive: Body Mass Index (BMI), kg/m²; Blood Pressure (presumably systolic and diastolic), mm Hg; Fasting Glucose, mg/dl; Fasting Insulin, mIU/ml. Of the table's values, only a single entry, -27%, remains.)
These include drastic reductions in fasting insulin levels (1.4, down from 5.1), inflammatory biomarkers (0.3 mg./dl., down from 1.4), a 15-year rollback in diastolic function of the heart, and lowering of white counts, insulin resistance, blood pressure, and core body temperatures, among other metabolic indicators. Calorie restriction in humans lowers blood pressure by reducing the arterial stiffening that builds up with age. The average thickness of the carotid artery intima (a measure of atherosclerosis) in the calorie restricted group was 40% less than the average intima thickness of the carotid arteries of the control group.
It may be that the ratio of total cholesterol to HDL in the calorie restricted subset was low enough that their arteries had been partially cleared of atherosclerotic plaque, in the light of the measured 40% average reduction in the average intima thickness of the CR group's carotid arteries. In addition to their improved blood lipid profiles, the CR practitioners also exhibited low levels of C-reactive protein and other inflammatory cytokines associated with inflammatory disease processes, including coronary artery disease and cancer.
A recently completed Washington University study, Calorie Restriction Appears Better Than Exercise At Slowing Primary Aging, reveals results that are, perhaps, more fundamental and important than the above-described cardiovascular improvements in calorie restricted humans that Washington University researchers published in 2004. This latest study showed that the calorie-restricted have lower levels of tumor necrosis factor (a biomarker of inflammation) and T3 (triiodothyronine)--though not of thyroxine (T4) and thyroid stimulating hormone (TSH)--and much lower levels of Insulin-like Growth Factor 1. Lower levels of T3 mean lower body temperatures, lower body weights, and to some extent, lower levels of free radicals. These effects are found in calorie restricted animals with delayed aging, and suggest reductions in the rates of primary aging. This latest Washington University study also examined endurance athletes with the same body fat percentages as those on calorie restriction. The endurance athletes ingested not-quite 2,800 calories a day versus not-quite 1,800 calories a day for the calorie-restricted group. In spite of the fact that the endurance athletes had the same body fat levels as the CRANies, their average level of T3, while lower than the average for the control group, was closer to the average T3 level of the control group than to that of the calorie restricted group. The same situation occurred with respect to triglycerides and HDL. C-reactive protein, fasting glucose, and blood pressure levels were also much lower only in the calorie restricted.
A very important result of this particular study is that it has demonstrated that the limited and partial aging-reversal effects of calorie restriction are a consequence of reduced caloric intake rather than a concomitant of low body fat.
Perhaps more important, the changes in measured biomarkers in the CR group mirror those found in calorie-restricted animals, whereas the changes in those same measured biomarkers in athletes by and large do not.
To say it another way, the Washington University studies point toward an average 15-year rollback in various biomarkers that, taken in their aggregate, suggest to me a 15-year increase in both average and maximum lifespan. This has the characteristics one would expect from CR: a delay in aging rather than a slowing of the rate of aging.
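The distinction between delaying aging and slowing it can be pictured with a toy Gompertz mortality model. This is my own illustration, not anything from the Washington University work, and the parameters A and G below are assumed round numbers, not fitted to any data set:

```python
# A toy Gompertz-mortality sketch contrasting two ideas: *delaying* aging
# (shifting the mortality curve to the right by d years) versus *slowing*
# aging (reducing the Gompertz rate parameter G).
from math import exp

A, G = 1e-4, 0.085   # assumed baseline Gompertz parameters, for illustration only

def mortality(t, delay=0.0, rate_scale=1.0):
    """Annual mortality rate at age t under a delayed or slowed Gompertz law."""
    return A * exp(G * rate_scale * max(t - delay, 0.0))

age = 80
baseline = mortality(age)
delayed = mortality(age, delay=15)       # aging delayed by 15 years
slowed = mortality(age, rate_scale=0.8)  # rate of aging slowed by 20%

# A 15-year delay gives an 80-year-old the mortality rate of a 65-year-old:
assert delayed == mortality(65)

print(f"baseline: {baseline:.4f}, delayed: {delayed:.4f}, slowed: {slowed:.4f}")
```

Under a pure delay, the whole survival curve shifts to the right, which is the signature read into the CR data here; under a slowdown, the curve stretches out instead of shifting.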
I'm toying with the speculation that, possibly, CR isn't primarily about delaying aging but about revving up an organism's health to give it a better chance to survive hard times. The delay in aging might be a by-product of this quantum-jump in health rather than a delay in aging per se. Could it be that half a billion years ago, nature might have employed a partial reversal in one of the genes that mediate aging (SIRT1?) as an existing single-gene mechanism to abet an organism's survival during hard times?
Note also that, with continued CR, this 15-year-or-more rollback can be maintained in humans and animals over a period of many years. It doesn't fade away with continued caloric restriction.
Although the CR-induced changes that the Washington University researchers first studied were related to cardiovascular disease and diabetes, I see no reason to think that CR isn't also protective against cancer, Alzheimer's Disease, Parkinson's Disease, and other degenerative diseases, as it is in animals, and, presumably, among the Okinawans. We know from the Purina studies described above that CR also delayed and mitigated osteoarthritis in the 24 Labrador retrievers that comprised the CR arm of their study, suggesting that degenerative joint diseases may be ameliorated by CR. (But in the interest of full disclosure, we need to note that CR didn't prevent Dr. Roy Walford's death from amyotrophic lateral sclerosis--Lou Gehrig's Disease--at 79.)
The Smoking Gun
To me, the Washington University findings are "the smoking gun". If CR weren't going to work in humans, I would expect it to induce few, if any, of the CR effects found in animals--nothing beyond reducing weight, lowering blood pressure somewhat, etc.
There would be less improvement in health-related biomarkers than would be found in athletes with the same fat levels.
I'm inclined to apply the famous duck test (a corollary of Occam's Razor): "If it looks like a duck, quacks like a duck, and waddles like a duck, then we can probably safely assume it's a duck."
Had the landmark 2004 Washington University Medical School study shown no improvement, or at most a few years' improvement, in their CR subjects' age-related biomarkers, then I think CR in humans would be open to question, and we might need to debate which one or two of the four models mentioned at the beginning of this discussion are correct. But instead, the changes are startling and dramatic, and hew very closely to the responses to CR found in animals, and to the three biomarkers associated with longevity in the Baltimore Longitudinal Study: durable levels of DHEA, lower body temperature, and lower insulin levels.
To me, it's simple. If CR isn't going to work, then it won't work, and the CR footprint will be weak or non-existent. If CR is going to work, then the characteristic CR syndrome will be present, and the strength of the CR effects will be emblematic of how strongly CR is working.
And think about the implications of the converse!
If CR engenders the same robust changes in humans that it produces in other animals, and yet fails to increase the human lifespan, then we must conclude that these CR-induced changes in rodents have no relationship to the CR-induced extensions in rodents' maximum lifespans, and that some other mechanisms that researchers have failed to recognize for the past 70+ years must be responsible for the extensions in maximum lifespan in rodents! (Obviously, if we register the same experimental changes in both rodents and humans, while CR extends maximum lifespans in rodents but not significantly in humans, then these changes can't be used to predict extensions of maximum lifespans. Some other occult changes must be postulated to explain the calorie-restriction-induced extensions in maximum lifespans in animals.)
But it doesn't stop there.
If CR doesn't work in humans, the sponsors of the Baltimore Longitudinal Study must also be declared to be in error. These researchers have observed that their longer-lived subjects are characterized by lower body temperatures, more durable levels of DHEA, and lowered insulin levels. Since these characteristics are also distinctively present in the calorie-restricted, it follows that if CR doesn't work in humans, these Baltimore-Longitudinal-Study longevity markers can't be valid predictors of longevity.
These aren't just interesting changes, these are dramatic improvements in the health parameters associated with aging. And although I don't know whether such studies have been performed yet on humans, humans will presumably evince the same CR-induced genetic changes that have been found in mice introduced to CR in mid-life.
The bottom line: Whether CR works in dogs or monkeys or whether it doesn't, it obviously works in humans, and may reasonably be expected to produce the same standard life-extending effects that it has conferred upon other animals. (Or at least, that's how it seems to me.)
I think this concern haunts us because, of the many interventions that will extend the average lifespans of animals, only CR increases the maximum lifespan. When someone argues that CR won't increase our maximum (95th- or 99th-percentile) life expectancies, it triggers self-doubts about whether CR is really going to extend our maximum lifespans (and even whether or not it will extend our average lifespans). Then, too, there's no sign that lights up and says, "Calorie restriction is working, and it's going to add 15 or 20 years to your healthspan." Also, there has been the perception that CR would probably have occurred naturally over the years, and yet we haven't seen >120-year lifespans associated with low caloric intake. But as mentioned at the beginning of this discussion, CR, as practiced under the most optimistic of practical regimens, wouldn't have been expected to add more than, maybe, 15 or 20 years to the healthy-lifestyle longevity of the average person born in 1900, enabling them to live, perhaps, into their 90's or 100. This probably wouldn't have been noticed.
Incidentally, I've been impressed with the youthful appearances of those I see at the Calorie Restriction Conferences who have been practicing CR for 10 years or longer. Of course, we all know many who aren't on CR who are very young-looking for their ages, and it's hard to know what fraction of this youthful appearance is a legacy of CR and what fraction is a consequence of genetic endowments. However, without mentioning names, I know of one long-term CR practitioner whose sister now looks like her aunt and whose nephew now looks like her brother.
I think that the skepticism that CR's extension of maximum lifespans in humans is receiving is a kind of rite of passage in a new field. It certainly happened with space flight in the early fifties, and to a lesser extent, throughout Robert Goddard's research career. At first, a new field is too obscure to warrant serious critique, but as it begins to approach the mainstream, experts will shake it and rattle it. And this is what science is all about. It's the critic's duty to independently challenge and assess the claims made by the protagonists for any new field or discovery.
Still, to me, it's, perhaps, a little hard to understand challenging CR in humans after the Washington University discoveries and the revelation that the Okinawans have been on mild CR for decades and exhibit a 95th-percentile lifespan that is 3.6 to 3.8 years longer than that of the Americans or the Japanese (and presumably, all other national groups).
Calorie Restriction Responses Appear to Be As Dramatic in Septuagenarians as They Are In Twenty-Somethings
At the CR III Conference, Dr. Luigi Fontana showed a viewgraph comparing ultrasound images of one of the carotid arteries of a 77-year-old runner with one of the carotid arteries of the 82-year-old CRONIE who was the oldest subject among the 18 volunteers in the original, 2004, Washington University Medical School investigation of the long-term calorie-restricted. There was quite a bit of plaque in the carotid artery of the 77-year-old runner, and zero plaque in the carotid artery of the 82-year-old CRONIE. Since the longest period of CR among the members of the CRONIE arm was 15 years, this individual must have been at least 67 when he began calorie restriction.
The significance of this is that it suggests (at least, to me),
(1) that a dramatic calorie restriction response can be elicited in the aged, and
(2) that it was still in effect years later when the subject had reached the age of 82.
My Own Experience With CR
I've never had bad health habits, and I've been on a good, largely vegetarian diet since 1979. I've always gotten daily exercise, swimming, jogging, or cycling half an hour to an hour a day. So I began my CR career in excellent health and, probably, from a starting point similar to that of a Seventh Day Adventist.
I began calorie restriction in August, 2003, when, at the age of 74, I entered into a weight-loss program in an effort to lower my blood pressure. Six months later, an SMA administered during my annual physical revealed that my HDL had jumped 40%, from 51 to 71 mg./dl., and my total cholesterol had dropped 6%, from 192 to 180 mg./dl. My triglycerides had fallen to 56, and my fasting blood sugar was 81. My ratio of total cholesterol to HDL had dropped from 3.85-to-1 to 2.5-to-1.
Six months after that, in August 2004, my fasting glucose level had fallen a little further to 78.
After noticing no perceptible changes from various hyped anti-aging supplements (such as vitamin E and selenium), I was most impressed when, after six months of slow weight loss, my own lipid profile came back with such remarkable improvements (in line with the changes experienced by the eight calorie-restricted participants in the Biosphere II experiment, and later, with the 18 CR volunteers in the Washington University study). I am even more impressed, now that 2¼ more years have passed, that these numbers have persisted and even improved somewhat. Assuming that other CR-enhanced age-related biomarkers are congruent with those of which I'm aware, my personal lifespan is probably being extended beyond what it would otherwise have been.
In March, 2006, I paid for a blood lipid panel and basic SMA to see if these improvements in serum parameters had been maintained over the next two years. This time, my HDL had risen to 84, my total cholesterol was 204, my triglycerides were 67, and the TC/HDL ratio was 2.4-to-1. (I had begun eating eggs every morning for breakfast just before this testing was carried out.) I quit eating egg yolks, and two months later, I paid for a third round of tests to see whether laying off the egg yolks would lower my cholesterol levels. This time, my HDL level was 78, my TC level was 174, my triglycerides were 48, and my TC/HDL ratio was 2.2-to-1. (I have never taken any medications or food supplements for cholesterol or other cardiac-related conditions.)
Thanks to statins, it's not that difficult these days to lower one's levels of LDL and total cholesterol, but 50%-to-60% increases in HDL with simultaneous reductions in total cholesterol are, I think, not easily effected. The Life Extension Foundation says of this, "It is not easy, however, to significantly elevate HDL levels. Even with the proper drugs and supplements, it is extremely difficult to raise HDL more than 27%. In some people, it is hard to get HDL levels to nudge upward at all."
I have read that in the Framingham long-term study of heart disease, no one with a TC/HDL ratio of 2.5-to-1 or less has ever died of a heart attack.
My temperature, averaged over a period of two months using three independent thermometers for cross-checking, was 96.1 °F.
Assuming that other CR-affected parameters such as homocysteine and C-Reactive protein levels have also declined in me, as they are wont to do for others on CR, I am probably at very low risk for a heart attack or related vascular trauma. This means that my blood lipid rollback has been considerably greater than 15 years. Most twenty-somethings could envy us CRONIES our blood lipid profiles.
I don't feel any different than I did in my thirties. In February, 2004, I began running 2½ miles a day. (I've since backed off to 2 miles three times a week because of concerns that the extra calories burned in running may diminish the benefits of CR.) Running fast for sizable distances must make demands on various physical subsystems--heart, lungs, joints. Of course, there's no ready way to know whether I could function as I do if I weren't on CR, but many of my age peers are hobbling around or falling over. I may have been developing osteoarthritis in my right knee before I embarked on my CR regimen in 2003, but if so, all traces of it have vanished.
I haven't contracted a head cold, a stomach bug, or other infectious disease since I embarked upon my weight loss program in August, 2003. Prior to August, 2003, I typically experienced two or three such illnesses a year.
What's important about both of our cases--the 82-year-old CRONIE's and mine--is that they suggest that, at least in humans, CR can
(1) evoke the same kinds of dramatic results in the elderly that they engender in the youthful, and
(2) that these biomarkers translate into structural improvements in vital organs.
What I consider important about this is that the standard panoply of CR effects seems to show no signs of attenuating with age. One would expect that eventually CR effects must fall off, since CR slows or delays aging but doesn't eliminate it, but at least in our two instances, the CR effects don't seem to have become seriously attenuated.
This raises interesting questions
The fact that CR seems to produce the same changes in the aged that it does in the young raises interesting questions. If blood lipid, fasting insulin, fasting blood sugar, C-reactive protein, and other age-related biomarker levels return to youthful values, what impact does this have upon mortality? We know that CR won't prevent aging or death, but will, at best, delay it. What happens if someone 85 years old or 95 years old adopts a CR program? Will their deaths be postponed by CR? If so, how much? Does this mean that adopting CR at 75 is as effective in extending lifespan as adopting CR at 25?
Intuitively, the answer to this latter question is "no", but this is a subject that invites closer examination. (If you're 25 and questioning whether you should begin CR now or wait until you're 75, I'd observe that even if CR were as effective at delaying death when begun at 75 as it is at 25--and we don't know that it is--you still might prefer to delay aging during your younger years rather than during your later years. Then there's the matter of surviving to 75.)
For me personally, it's imperative that I correctly assess the capabilities and limitations of CR. At my age and stage, I can't afford pratfalls. If CR won't work in humans, I need to explore other options. But given the full-throated response to CR that humans (including me) evince, it's my considered opinion that CR will deliver as promised.
Two swallows do not a summer make: this phenomenon suggests a closer look at other superannuated CRONIES.
The Relationship Between Health and Longevity
Suppose that in a group of present-day centenarians, with an average life expectancy of 2 years, we could restore all their age-related biomarkers (including immunocompetency) to the levels of a neonate. Would they still have an average life expectancy of 2 years? I think the answer must surely be, "no". Of what will they die?
The point is: a dramatic improvement in the health of a nonagenarian should lead to a dramatic reduction in mortality, and an increase in lifespan.
CRANie of Them All?
In 1956, at the age of 53, Ralph Cornell, of Massillon, Ohio, began skipping lunch. From then on, he became voluntarily calorie restricted. On March 13, 2006, Mr. Cornell celebrated his 103rd birthday. He continued to drive every day to and from his real estate business until he was 101. At that time, he fell on his front porch and broke his hip (osteoporosis?). He then retired from his real estate operations. Ralph has now outlived all his siblings, including his sister, Edna, who died recently at 93. (Many members of Ralph's family have lived into their 90's.) In addition to his broken hip, Ralph has also suffered a serious heart attack. Still, he couldn't have known until fairly recently what's now known about the nutritional requirements unique to CR. Also, his heart disease might have been preventable given early attention.
Note that Ralph is reading. If you're 25, you may think, "So what?", but if you're 80 or 90, you'll realize that most people at 103 can't read.
Ralph is the Neil Armstrong of calorie restriction. While we can't draw too much from what happens to one man, or two or three, it's encouraging that what is happening is consistent with what we might expect CR to accomplish in humans.
Ralph has been a close friend of Paul McGlothin and Meredith Averill for many years.
June 7, 2008 Update: Ralph Cornell celebrated his 104th birthday on March 13, 2007, but I don't see any indications that he was still alive for his 105th birthday in 2008. Still, he had quite a life, and was still enjoying it at 104 (when he attended a football game). (He certainly doesn't look like the typical centenarian in this picture.) He had a very serious heart attack a few years ago. It's my impression that he avoided doctors, and of course, some of the recent ideas regarding nutrition for the calorie-restricted weren't available to him for most of his life. CR begun at the age of 53 probably added 10 years to his life span and allowed him to drive to and from his office until he was 101.
He may not have had the advantage of some of the newer nutraceuticals such as green tea and high-dose resveratrol.
The Earliest CRONie of Them All?
Luigi Cornaro was born in 1464 at the peak of the Italian Renaissance. It was a lusty age, and by 1500, Signor Cornaro was suffering from such disorders of excess as gout, chronic fevers, stomach distress, and perpetual thirst (for wine?). Luigi was advised by his physicians that his only recourse was to renounce his licentious ways and convert to a "sober and regular life". Otherwise, within a few months, he could expect to transition to a more spiritual existence.
Luigi decided that this was an offer he couldn't afford to refuse, and he began rising from the table "with a disposition to eat and drink more". In other words, he ate until he was 75% or 80% full, a lifestyle upon which he expatiates in his "Discourse on the Sober Life". His diet consisted of 12 ounces of bread, meat, the yolk of an egg, and soup, washed down with 14 ounces of wine. His health soon returned, and he cleaved to his temperate lifestyle until he was in his seventies(?). At that time, his friends enjoined him to increase his food intake by two more ounces a day and his wine intake by two more ounces a day. He eventually acquiesced. Straightaway, he became "peevish and melancholy", and on the twelfth day waxed deathly ill. But he backed away from the extra food and wine, recovered, and went on to live to be 102, dying in 1566.
Centenarians must have been unusual in Luigi's day.
He would have initiated CR in his latter thirties.
Q: Could CR markedly improve certain aspects of human health, but fail to significantly extend the average lifespan beyond the 84 years or so attributable to a Seventh-Day Adventist lifestyle?
A: This could happen if:
(1) The CR effects faded away as we approached the average lifespan.
But the (admittedly limited) available evidence points toward a continuation of full CR effects beyond the average lifespan.
(2) There are countervailing morbidities that fully offset the life-extending health improvements that have now been documented in humans on CR.
In all other animals, CR extends the lifespan in proportion to the degree of CR. Why would we look for offsetting morbidities? And what would they be? (Discussed further below.)
Q: Could it be that CR markedly improves human (and, maybe, non-human-primate) health, increasing their average, but not their maximum life expectancies? Couldn't the benefits of CR fade away as we approach the maximum lifespan?
A: With respect to maximum life expectancies, I can't rule out that possibility, but I think it highly unlikely, given that the same 15-year-or-more rollback in the CR-induced biomarker profile seems to have occurred at least up to my age (77), and up to the age of the 82-year-old (now 84) who was chronicled in the April, 2004, Washington University National Academy of Sciences paper. However, this might be a suitable topic for close scrutiny. We need to see whether this kind of profound "quasi-rejuvenation" happens with most older adults or whether it has occurred in the two of us through chance alone. (Don Dowden would be another long-term septuagenarian CRONie--one who looks shockingly young for his age.) We also need to see what CR does for the lifespans of octogenarians and nonagenarians. Here, we may have evidence within a few years revealing whether, and by how much, CR late in life extends lifespans.
In any case, as we've seen, some of the evidence marshaled to show that CR has a minor effect in humans has now been shown to be incorrect--e.g., among the older Okinawans.
Ralph Cornell's and Luigi Cornaro's centenarian status might also signal some delay in senescence.
Of course, CR delays but doesn't eliminate aging. Given CR alone, one or another of our subsystems is eventually going to age enough to fail.
Q: Granted that CR improves cardiovascular disease and type 2 diabetes, what about cancer and Alzheimer's disease?
A: Since Alzheimer's disease has recently been linked to type 2 diabetes and to blood sugar elevations over an extended period of time, eliminating type 2 diabetes should reduce the risk of Alzheimer's disease. In fact, age-related cognitive impairments of all types are substantially lower among the quondam-calorie-restricted Okinawans. As for cancer, CR extends the lifespans of mice primarily by delaying the onset of cancer. The human volunteers have remarkably low levels of C-reactive protein and other inflammatory cytokines that are thought to help initiate neoplastic transformations.
Okinawans have 18% of the (presumably age-adjusted) heart attack rate of Americans, 14% of the prostate cancer incidence, 18% of the breast cancer rate, 43% of the colon cancer rate, and 43% of the ovarian cancer rate. Of course, this isn't necessarily a dividend of CR. Drs. Willcox, Willcox, and Suzuki have attributed it to the bioflavonoids and anthocyanins that the Okinawans ingest through their plant-based diets. Also, the green tea and turmeric tea might help. But in animal models, CR lowers the age-adjusted rates of cancer.
And why would we expect CR to benefit cardiovascular disease and type 2 diabetes and nothing else? By now, the WU researchers have shown that CR influences other, non-cardiovascular aging indices. I expect it to be only a matter of time until the various wiggle-holes are closed off.
Q: Could it be that CR will improve health in many ways but that it increases the risk of death from other causes such as infections, or autoimmune disorders? I've read that white counts are lower in people on CR. Also, there are studies showing that as you grow older, life expectancies rise with BMIs above 25. What about osteoporosis? And would elderly CR habitués have enough body fat to carry them through a several-week illness?
A: Since Nature's "purpose" for the CR response is, presumably, to improve an organism's chances of surviving tough times, it would be surprising if CR rendered the organism more susceptible to infections. That would tend to undermine the interpretation that CR is a survival mechanism. In my own case, I haven't had a cold or stomach bug (or any other illness) since I embarked on CR three years ago, and this seems to be a common experience among CRONies. But even if it were true, between vaccines and antibiotics, infection is one vulnerability for which society can provide some external amelioration. Also, medical science is learning ways to reduce the risk of osteoporosis. With respect to healthy BMIs among the elderly, the 1964-2003 Honolulu Heart study of 8,004 Japanese-Americans found a minimum mortality at an average caloric intake of 1,900 calories a day. Similar results were found in the Harvard-conducted Nurses' Health Study. (One factor that might enter into these studies is the loss of appetite by the very old.) We know what has happened with the Okinawans.
As for a body fat reserve, one has to ask, "What kind of several-week illness would we be talking about?" If it's a frankly terminal illness, who wants to linger longer anyway? Then, too, with most illnesses, you can still eat. And this isn't the 19th century. There are always glucose drips and stomach tubes.
I'm carrying something like 10 pounds--36,000 (kilo)calories--of fat.
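The fat-reserve figure above follows from the common rule of thumb of roughly 3,600 (kilo)calories per pound of stored body fat. A back-of-the-envelope sketch (the constant and the function names are my own, chosen to match the 36,000-kcal figure; some sources use ~3,500 kcal/lb instead):

```python
# Assumption (not from the text): ~3,600 kcal of usable energy per pound
# of stored body fat -- a common rule-of-thumb conversion factor.
KCAL_PER_LB_FAT = 3_600

def fat_reserve_kcal(pounds_of_fat: float) -> float:
    """Usable energy (kcal) stored in a given weight of body fat."""
    return pounds_of_fat * KCAL_PER_LB_FAT

def days_of_reserve(pounds_of_fat: float, daily_deficit_kcal: float) -> float:
    """Days the reserve could cover a given daily calorie shortfall."""
    return fat_reserve_kcal(pounds_of_fat) / daily_deficit_kcal

print(fat_reserve_kcal(10))        # 10 lb of fat -> 36,000 kcal
print(days_of_reserve(10, 1_800))  # 20 days at an 1,800 kcal/day shortfall
```

So even a lean 10-pound fat reserve would cover an 1,800-calorie daily shortfall for about three weeks, which bears on the "several-week illness" question above.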
Is the optimum BMI 23?
With respect to the "controversial" CDC-sponsored study that showed that as you grow older, life expectancies rise with BMIs above 25, two larger and more recent studies have shown the opposite. A Forbes review of the CDC study notes that, "Since then, CDC chief Dr. Julie Gerberding distanced herself from the report and acknowledged potential flaws in the study that included people with health problems who tend to weigh less."
The two new studies arrived at J-curves for mortality versus BMI, with the minimum mortality lying at about 22 for women and about 23 for men. Mortality rose steeply as the BMI dropped below the optimum level. The risk of death among Koreans with BMIs below 18.5 was 29% higher than for those at the ideal weight, while for Americans, it was 50% higher. The Korean study found that underweight Koreans tended to die of respiratory diseases, while the overweight had increased rates of atherosclerotic cardiovascular and cancer deaths. (The Koreans have a relatively high rate of tuberculosis among those with low BMIs.)
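The BMI figures in these studies come from the standard formula: weight in kilograms divided by height in meters squared. A minimal sketch, using the thresholds cited above (the function names and the 68 kg / 1.75 m example are illustrative, not from the studies themselves):

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index: weight in kilograms over height in meters, squared."""
    return weight_kg / height_m ** 2

def band(b: float) -> str:
    # Thresholds taken from the J-curve discussion in the text.
    if b < 18.5:
        return "below 18.5 (sharply elevated mortality in both studies)"
    if b < 25:
        return "18.5-25 (includes the ~22-23 mortality minimum)"
    return "25 or above (rising mortality on the J-curve)"

b = bmi(68.0, 1.75)   # e.g., 68 kg at 1.75 m -> about 22.2
print(round(b, 1), "->", band(b))
```

Note that the studies' 18.5 cutoff and 22-23 optimum are population averages; as discussed below, a low BMI reached through CR with good nutrition may behave quite differently from a low BMI reached by accident.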
You wonder why excess mortality was 29% among Koreans but 50% among Americans for BMIs below 18.5. One possibility might be that Asians have a higher level of body fat at a given BMI than Westerners do. Perhaps the Americans with BMIs below 18.5 had less body fat.
So why do mortality rates rise as the BMI drops below 23? Well, how many of these low-BMI study subjects do you think were on CR? None, maybe? One? Two? How many get all the nutrients they need? As caloric intake drops, it becomes more difficult to ensure good nutrition. How many are slim because they don't have hearty appetites? How many subsist on junk foods? How many are alcoholics or drug addicts? One factor among the congenital ectomorphs could be smaller lungs. How many eat lots of calories but still don't put on weight because of poor absorption or for some other reason? (One slimbo I asked told me that he couldn't put on weight if he ate buckshot. And for that matter, I couldn't put on weight until I hit my latter twenties, even though I ate like the Paddy Pig that brought the famine to Ireland.) (My natural slimness in my youthful days of indiscretion didn't protect me from contagious diseases the way CR has since I've been practicing it.) We know that the incidence of heart attacks among the very slim is lower than it is among the buxom, but it won't hold a candle to the rate among us CRONies.
I think that CR is a different breed of cat than natural slimness. I think these BMI results reinforce the idea that CR is about more than absolute BMI values: it's about lowering your calorie intake below what it would be if you were fully fed.
Having said all this, I have to confess that these solid epidemiological studies, backed up with other data such as life insurance tables, are something I take seriously. I don't want my own BMI to drop below about 20, with 7%-to-8% body fat. As long as this is sufficient to produce the kinds of improvements in aging biomarkers that I'm experiencing, I see no need to push it further, at least at this time.
Summing It Up
Studies of CR in humans are in their infancy. If you're currently practicing CR, I think you would be well-advised to stay the course. The theorists have fired their salvoes. Now researchers are beginning to respond to their challenges. (I don't mean to portray this in a combative framework. I think that the theorists who formulated the four models are "the loyal opposition", critiquing our tacit assumptions about CR in humans and ensuring that future models of human CR are built upon bedrock rather than sand. This needs to be done in science, and I think we owe a debt of gratitude to the framers of these models for taking the time and investing the effort to construct them.)
In the meantime, hang in there.
Appendix: Research Studies That Should Cost Little or No Money
"Purity" of Results. The dosage dilemma for CR mimetics.
Metformin and Resveratrol: Do they mimic CR in non-CR subjects? Do they deepen CR in CR practitioners? What about people already on metformin? How does metformin affect HDL levels? What's the duration of the response to metformin? All this information must already be in the literature.
Protandim: Awaits reinstatement of TBARS test.
Late-Life Induction of CR: Lifespan Studies
One gambit that's been approached with understandable trepidation is that of the induction of CR late in life. There are a priori questions about whether the very elderly can handle the stresses of CR induction.
The flip side is that we antediluvian fossils have more to gain and less to lose than our less-rickety compeers. For an 85-year-old guy, the average life expectancy in 2003 was 6 years; for a 95-year-old, it was 3.2 years. "Shucks!", as guys in our generation are often inclined to intone, "when the odds get down to 'slim' or 'none', I think I'll take 'slim'." ("If I can't take it with me, I don't want to go.") And so it is that many of us have already voted with our feet, entering into calorie restriction with highly positive results.
Some have inferred that mid-life induction of CR leads to a lifespan extension that is proportional to the age at which CR began. However, this may not be the way it works. How does lifespan extension vary with induction age? Measuring it will take years, but not half a lifetime; perhaps four or five years will be sufficient to indicate the efficacy of the treatment.
Different People Have Different Inherited Vulnerabilities
One move that I should think would represent an advance in public health would be a program aimed at mapping out for each individual the familial vulnerabilities that he or she might have inherited.
Variations in Individual Responses to CR
So far, pilot studies of responses to CR have, justifiably, been directed toward the average responses of small groups of 18 to 25 people. Neither the funding nor the justification has existed for the investigation of individual, heritable health vulnerabilities. For example, some individuals are genetically predisposed to type 2 diabetes. Others have familial hyperlipidemia problems--e.g., the E4 allele of the apo-E gene. This may be an area in which we can examine the biomarker profiles and histories of existing members of the CR Society.
We might conceivably publish peer-reviewed journal articles concerning these findings, but even if we didn't take that final step, we could marshal the data that would support such publications.