And how much of its advice gets recapitulated in LLMs? (e.g. medical advice/longevity advice). Can people who ask for health advice on GPT5/mixtral get content from rapamycin.news, rather than RLHF’d “normie” health advice for the extremely risk-averse?
The versions of ChatGPT 3.5 and 4 accessible via the web (i.e., not the versions accessible via the API) report knowledge cutoffs of January 2022 and April 2023, respectively, when asked.
However, I doubt that their training preserves enough detail of the discussions that have taken place here for it to be useful.
ChatGPT 4 has access to a web browser and could visit rapamycin.news directly; I think that would be a more reliable way to bring it to bear on the information here than relying on whatever survived training.
Can people who ask for advice on GPT5 get content from rapamycin.news?
I don’t think it’s a good idea to take health advice, especially controversial health advice, from an LLM, whether it’s been regularized by human feedback or not.