I think part of the issue with the Intermittent Fasting (IF) data is that there are many different interpretations and definitions of IF.
You mention that " 1. Animals that do IF donât live longer than those that do normal CR."
which I suspect is true, but if you can achieve lifespan and healthspan extension anywhere close to that of 40% CR with a mimetic like rapamycin (without the pain and suffering of CR), then most people would agree that it's a great accomplishment.
You mention: "Humans who practice IF don't live longer."
- probably true, but the data is very spotty, and the definition of "IF" is again an issue. What level of autophagy are they stimulating in their version of IF?
You mention "There are no more people doing IF among centenarians than among the general population."
And how do they define IF in these studies? I doubt it's 2+ days of fasting every week, for decades, so the data is not comparable to rapamycin dosing. And the centenarian data is famously sparse and of low quality (especially when it comes to diet and fasting over the past 50 years of their lives).
You say: "RCTs on IF vs. CR don't conclude that IF is superior."
Perhaps that is true, but again, if I can get even close to CR with a weekly rapamycin tablet without the pain and suffering of CR or IF, I'm happy.
Short-term IF (i.e., a 16-hour fast with an 8-hour eating window) will not, I suspect, induce autophagy to any significant degree, and thus will not provide the benefits.
And the CR typically studied in animals is much more severe (i.e., a greater reduction in calories, typically 25% to 40% below an ad libitum diet) than the caloric restriction achieved by the IF protocols seen in human practice. So their comparison is apples and oranges.
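To put rough, purely illustrative numbers on that (assuming an ad libitum intake of ~2,500 kcal/day, a round figure chosen just for the example): 25% to 40% CR means a sustained intake of only ~1,500 to ~1,875 kcal every single day, for life. A 16:8 eating window with no deliberate calorie target may trim only a few hundred kcal per day, if that. The two regimens are nowhere near equivalent in "dose."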