Summary

In this session, Terrie Moffitt, professor at Duke University, and Daniel Belsky, assistant professor at Columbia University, dove into the details of a new approach to measuring aging using DNA methylation. Their Pace of Aging approach, on which the second-generation DunedinPACE algorithm was built, uses multiple timepoints to assess how fast or slowly a person is aging (like a speedometer), whereas previous methylation-based diagnostics work more like a clock. They discuss the advantages of this new method and announce its availability to both researchers and the public. The Dunedin longitudinal study underpins the approach, so it features prominently in the discussion.

Presenters

Presentation: Terrie Moffitt

Transcript
  • We are probably all familiar with this chart showing the accumulation of multimorbid diseases across the lifespan. If prevention is our aim, then this chart is a good argument for picking the 30’s, 40’s, and 50’s as the target ages for intervention.
  • Because of that, measuring aging in young people is essential and possible. Efforts to slow down aging in younger people have got to have a way to measure aging in order to quantify how well the treatments work. Waiting for the onset of disease or death as the treatment outcome is ineffective and costly. Investigators like Steve Horvath or Morgan Levine have been making groundbreaking progress with DNA methylation clocks as measures for this purpose. 
  • Our team at Duke University took a different approach: we tried to directly measure the pace of aging itself. I’ll tell you our story, and after that Daniel Belsky will tell you about a DNA methylation version of our measure that you can use.
  • We’ve been studying aging in a cohort of over 1000 people, all born in 1972. When we started, NIH reviewers scoffed at us, believing there would be no variation in aging-relevant measurements in people that young. But what we observed first was that there is definitely variation in aging many years before people become older adults.
  • You can see composite photos of a part of our cohort, all aged 45.
  • This is a different part of our cohort, also a composite image from people all aged 45, they are the fastest aging part of our cohort.
  • All the people in this cohort were born in the same year – 1972. You can see there is obviously an enormous variation in our participants’ pace of aging, as visible in their faces. We wanted to quantify this biological aging.
  • This is our primary research tool.
  • Here’s the design of the study. All babies were enrolled in 1972-1973, born in one city, of ethnically white European descent. There have been 13 data collection events and assessments since infancy, each a full day in a clinical setting, plus an overnight stay and half the next day for scanning. At the bottom you can see the last completed assessment, at age 45 – 94% of the initial cohort took part. When the cohort was in their 30’s, we wanted to convert the study into a study of aging, so we began collecting measures used every day in geriatric medicine.
  • Here are a few photos from the assessment, so you have an idea. We repeat these any time we assess the study members.
  • Because of all that data, we were interested in what geroscience is saying about the operational definition of aging.
  • We have been tracking 19 biomarkers since the study’s 1000 participants were in their 20’s. The question is whether these biomarkers show evidence of mean-level decline across the cohort’s middle decades of life.
  • Here are a few examples – VO2max, measuring cardio fitness on an exercise bike, declines steadily.
  • Mean lung function also declines.
  • Mean blood pressure rose steadily.
  • Pre-diabetes testing shows the same pattern.
  • I’m not gonna show all 19 of the biomarkers one by one, but this slide gives you an eyeball look at them – virtually all show gradual, progressive, coordinated worsening of physiological and functional integrity across multiple bodily systems. But some of the cohort members are declining really fast and others are hardly declining at all.
  • So to capture these differences between individuals, we calculated 19 growth curves. One for each of these repeated biomarker measures. And we then combined the curves to model what we call the “Pace of Aging”. Each individual in the cohort ends up with a score that represents the gradual change in each of the 19 biomarkers that is synchronized with the change in the other 18. The green histogram shows you the distribution of the Pace of Aging scores in the cohort. It’s plotted in years of biological change in a chronological year. On the left you see some individuals who show almost no biological aging between age 26 and 45. But on the right you see some of the cohort members have biologically aged almost 2 years per chronological year over the 20 years that we followed their biomarkers.
  • Next we wanted to validate the Pace of Aging score by testing whether it predicted cohort members’ outcomes on the cognitive, perceptual and sensorimotor kinds of measures used in geriatric medicine. The full report was in Nature Aging this summer.
  • I’ll give you just a few examples – here you can see that the fast agers already have poorer balance at age 45 than the slow agers. Each circle represents 20 of the ~1000 cohort members who were assessed at age 45.
  • Fast agers also had declined faster in tested cognitive function.
  • And even their faces were rated as older looking at age 45.
  • We looked inside the brain for these validations as well. A faster Pace of Aging score was associated with thinner cortex and smaller surface area of the brain as assessed by MRI at age 45.
  • So now that we have a measure of Pace of Aging, we need to move RCTs from an exclusive focus on disease endpoints to tracking the Pace of Aging. A good measure has to be sensitive to change in trials of anti-aging therapies.
  • We’ve translated our Pace of Aging measure into an epigenetic methylation signature, recently named DunedinPACE, so now you can measure Pace of Aging in your own research participants even if you don’t have a longitudinal study or a lot of biomarkers (through TruDiagnostic). At this point I want to hand off to Daniel Belsky, and he’s going to tell you how this methylation measure has been faring in research.
  • Thank you very much!

Presentation: Daniel Belsky

  • Disclosure and early thank you to all of the funders who supported the work.
  • Just to recap, here’s a little geroscience cartoon recapitulating the paradigm.
  • We think, based on animal experiments, that interventions on these molecular changes can delay or prevent the decline in system integrity and extend healthy lifespan.
  • These are some of the limitations of the “biological age” construct that motivated us to seek the Pace of Aging and DunedinPACE method.
  • If we are trying to measure biological age from comparisons of old people and young people, the older people already overrepresent the survivors.
  • Differences in the exposure histories of people born at different times in history. The 70-year-old survivor has had a history characterized by a very different exposure profile than the young people we might be comparing them to.
  • And this third one is probably the biggest limitation – we don’t know when the advancement or delay in a person’s biological age occurred. So when we observe a difference, we don’t know when it happened, and therefore we don’t know the extent to which it might be amenable to intervention that aims to slow the process of biological aging.
  • So this is a biological clock.
  • And this is where we’re at with Pace of Aging as a metaphor.
  • This was the first generation of the algorithm, where we translated Pace of Aging into a DNA methylation measure. It accurately predicted functional decline, morbidity and mortality, but it had some limitations.
  • The initial Pace of Aging had limited followup – a lot of data, but just 3 timepoints, which for growth modelling is the bare minimum; more timepoints give you more precision in your estimates. The second challenge, recently discussed also by Morgan Levine, is the reliability of DNA methylation data – it’s not great. It is OK, similar to other clocks, but if you want to test your interventions and compare baseline pre-treatment to post-treatment values, that’s not gonna cut it. You will deal with a lot more error and will need a much larger effect as a result.
  • So we undertook efforts to refine it – and the result is DunedinPACE. The first refinement was a substantial extension of followup – 4 timepoints and an additional decade of aging were incorporated into the DunedinPACE measure.
  • The second is the integration of results from Karen’s work identifying the relatively more reliable subset of probes on the Illumina array. Modelling those subsets of probes, we arrived at the novel DunedinPACE algorithm, which consists of 173 CpG sites and, when weighted, produces a normally distributed cohort value around 1 year of physiological change per chronological year. That scaling unit is going to stay with us – it corresponds to the expected rate of aging over the 20-year interval from the mid-20’s to the mid-40’s. So this is the normal Pace of Aging, and we’re asking whether a person ages faster or slower.
  • The algorithm has excellent reliability. It shows superior test-retest reliability to most of the other tests in the field and what we think is adequate for testing change from pre-treatment baseline to post-treatment followup. The ICC is 0.96.
  • The next thing to ask is whether this measure gives us information about the Pace of Aging that’s consistent with what we know from other sources. We know from demography that the Pace of Aging accelerates as you grow older, the mortality rate increases, and that’s what we see with the Pace of Aging from the Understanding Society study. This is not a clock, we are not measuring how old they are, but how fast they are aging – and that rate is accelerating across the lifespan.
  • The second thing we did is we looked at the people who are measured to be aging faster and looked older by existing technologies. Here you see the correlations between DunedinPACE measure and age acceleration residuals from these 4 benchmark DNA methylation clocks. The correlation is the strongest with GrimAge – the one most predictive of morbidity and mortality in prior work.
  • Here’s the data. You can see effect sizes for mortality on the right-hand side and survival curves on the left-hand side for DunedinPACE. Again you can see effect sizes similar to GrimAge (*GrimAge was developed within the Framingham Offspring study data, so its performance there is probably a bit overestimated).
  • Here’s some data on chronic disease incidence and prevalence – so here we are predicting forwards to the onset of chronic diseases. DunedinPACE is performing as well as or better than the other benchmark clocks.
  • Finally, some not-yet-published data from Karen’s analysis of Alzheimer’s Disease Neuroimaging Initiative data, showing that the DunedinPACE algorithm distinguishes dementia and MCI from cognitively normal adults. So although it characterizes something different from current clocks – Pace of Aging, not biological age – it is similarly or more predictive of clinically relevant endpoints.
  • However, what most of us are ultimately interested in and concerned with is whether it is actionable – whether, from the DunedinPACE test, we will be able to figure out if an intervention is going to slow down biological aging.
  • Now I’m going to show data from the CALERIE trial – a long-term caloric restriction trial. Full disclosure – CALERIE prescribed 25% caloric restriction in the intervention group, but that is hard to achieve, and the average restriction in the treatment group was about half the prescribed dose. Nevertheless, it is established to have benefits to cardiometabolic health. We previously published an analysis of blood-chemistry biological age algorithms indicating that CALERIE slowed aging. This work was designed to see whether we can see the same phenomenon at the molecular level with DunedinPACE.
  • Here is the data for those benchmark DNA methylation clocks.
  • And here’s the data for Pace of Aging. What we expect is relatively flat curves for the control group, since you don’t expect a change there, but change in the CR group, where the Pace of Aging should be slower. We see evidence of that with both measurements, and a much stronger signal for the new DunedinPACE algorithm. It’s hard to compare the data between clocks and Pace of Aging here, because they have different standard errors of measurement, so I’m going to show a standardized scale next.
  • These are treatment effects scaled by standard deviations of the measure computed from the baseline pre-treatment data, so we can interpret them like Cohen’s d treatment effects. What you see are effects in the expected direction for GrimAge and Pace of Aging, but only the DunedinPACE algorithm gives us a signal reliably different from 0 that the Pace of Aging was slowed.
  • To explain a bit more what this chart means: the challenge is that epigenetic clocks appear to have a very straightforward effect-size metric – years of biological age. But the clocks are not actually equivalent to each other. The standard deviation of the age-acceleration residual (the part of the clock left after you take out chronological age) is in the neighborhood of 5 for Horvath’s and Hannum’s clocks; GrimAge has a standard deviation of 3, so maybe it is a more accurate predictor. The point is that a year isn’t a year isn’t a year. So we need some kind of standardized scaling metric if we’re going to compare these measures head to head in a single trial.
  • And a measure commonly used in the behavioral sciences, where there are no natural scales, is the standard-deviation unit. We know that the age-acceleration residuals from the clocks are normally distributed, because regression residuals are normally distributed by construction, and work has been done to show that the Pace of Aging is normally distributed as well. So we can use the standard deviation – a typical description of a normal distribution – as a scaling metric to quantify these treatment effects. The question is how much aging was slowed if we have a treatment effect of, let’s say, 0.5 – what does that mean? It means – if we take the metric of Pace of Aging literally – that the Pace of Aging was slowed by a couple of percent in these people. Is that a lot or a little? I don’t know, but that is what I can say about the magnitude of these effects.
  • Some closing remarks.
  • Thanks!
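The standard-deviation scaling described in the last few slides – expressing a between-arm treatment effect in units of the baseline pre-treatment distribution, interpretable like Cohen’s d – can be sketched as follows. This is a minimal illustration: the variable names and the synthetic numbers are assumptions, not CALERIE data.

```python
import numpy as np

def standardized_effect(change_treat, change_control, baseline):
    """Between-arm difference in pre-to-post change, scaled by the
    standard deviation of the pre-treatment baseline (Cohen's d style)."""
    diff = np.mean(change_treat) - np.mean(change_control)
    return diff / np.std(baseline, ddof=1)

rng = np.random.default_rng(1)
baseline = rng.normal(1.00, 0.10, 200)   # pre-treatment pace scores
control = rng.normal(0.00, 0.05, 100)    # change in the control arm
treated = rng.normal(-0.03, 0.05, 100)   # change in the restriction arm

d = standardized_effect(treated, control, baseline)
print(f"standardized effect: {d:.2f}")   # negative = aging slowed
```

Because both arms are measured on the same baseline scale, the same formula can be applied to any of the clocks or to Pace of Aging, which is what makes the head-to-head comparison in the slide possible.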

Q&A

How correlated were the different biomarkers with respect to slower / faster aging across the patients? If they are all strongly correlated, there is an argument that all you’d need is one of them to become a standard for all trials – which could be especially useful if that one is extremely easy and inexpensive to collect, such as longitudinal face data.

  • They were not that strongly correlated; the numbers are in the paper. We did “leave one out” analyses, which showed they all contribute something, but no single biomarker is more strongly determinant than the others. In fact, the strongest is dental caries! In the end, with the DNA methylation version, it shouldn’t be difficult or costly.
  • The main predictors of the dental data are genetics (hardness of dental enamel and other genetically determined factors) and oral health behaviours (brushing, flossing, all that yuk). And yes, sugar.
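The “leave one out” check mentioned above can be sketched as recomputing a composite score with each biomarker dropped in turn and seeing how much the composite shifts. This uses synthetic standardized slopes with a shared aging signal – a simplified stand-in for the paper’s actual analysis, not its data.

```python
import numpy as np

rng = np.random.default_rng(2)
z = rng.normal(0.0, 1.0, (1000, 19))         # standardized biomarker slopes
z += rng.normal(0.0, 1.0, (1000, 1)) * 0.5   # shared aging signal

full = z.mean(axis=1)                         # composite using all 19
for j in range(z.shape[1]):
    loo = np.delete(z, j, axis=1).mean(axis=1)     # drop biomarker j
    r = np.corrcoef(full, loo)[0, 1]
    print(f"biomarker {j:2d}: r(full, leave-one-out) = {r:.3f}")
```

If every leave-one-out composite still correlates highly with the full score, no single biomarker dominates – the pattern the speakers describe.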


How did you choose your biomarkers?

  • How did we choose? We needed at minimum 3 timepoints for each biomarker to make the modelling possible. Since we began recording biomarkers when the cohort was 26, most of them are standard clinical biomarkers, nothing special. At that time, 20 years ago, we never thought we would be studying aging! But we did strive to represent as many organ systems as possible.
  • We started a long time ago with kids – we were studying asthma, and then drug abuse and things like that when they were teenagers – so we were taking some biomarkers then. When they were in their 20’s, we got serious about wanting to study their health outcomes, but we were not at all thinking we would ever study aging. It was only really after Horvath’s clock came out that we thought this was important and started to model it. So we had to look back through the archives of the study to see what biomarkers we happened to have. We had a lot, but we had to take those with at least 3 waves, because that’s the minimum requirement for statistical modelling of a growth curve – so the selection was really made on a statistical basis. When we started collecting biomarkers, the main objective was just to represent the health of all the different organ systems in their 20’s, and we’ve added more biomarkers as the study has gone along.
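The growth-curve modelling described here – one curve per person per biomarker, then combined into a single Pace of Aging score – can be sketched with synthetic data. This is a minimal illustration under stated assumptions: an ordinary-least-squares slope stands in for the study’s latent growth-curve models, and the rescaling constant is purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n_people, n_biomarkers = 1000, 19
ages = np.array([26.0, 32.0, 38.0, 45.0])  # assessment waves

# Synthetic panel: each person has a latent pace that drives
# coordinated worsening across all 19 biomarkers, plus noise.
latent_pace = rng.normal(1.0, 0.3, n_people)
data = (latent_pace[:, None, None] * (ages - ages[0])
        + rng.normal(0.0, 2.0, (n_people, n_biomarkers, ages.size)))

# 1) One growth curve (here just an OLS slope) per person per biomarker.
x = ages - ages.mean()
slopes = (data * x).sum(axis=2) / (x ** 2).sum()   # (people, biomarkers)

# 2) Standardize each biomarker's slopes so different units are comparable.
z = (slopes - slopes.mean(axis=0)) / slopes.std(axis=0)

# 3) Combine: average the 19 standardized slopes and centre the cohort at
#    1 "year of physiological change per chronological year".
pace = 1.0 + 0.3 * z.mean(axis=1)   # 0.3 is an illustrative spread
```

With a shared latent pace driving all biomarkers, `pace` comes out approximately normally distributed around 1, mirroring the green histogram shown in the talk.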


Do you have any sense of how the predictive accuracy of each biomarker scales with more data points? For example with the face: it is easy to scale that to one data point per day (or even more frequently) – so the question is: how much more accurate would that make the facial biomarker? And is that scaling factor different for each marker? My hunch is that with much more frequent data certain biomarkers may rise in utility sharply, whereas others might not be affected.

  • Adding data waves does improve the precision of DunedinPACE. We modelled with just 3, the minimum, but then added a 4th wave after we got the age 45 biomarkers, and the new 4-wave version is more reliable and slightly more predictive of outcomes like dementia.
  • Regarding the face picture being taken every day – that’s an interesting idea, maybe I’ll start doing that for myself. The idea behind modelling multiple waves of data across time was that any one of the biomarkers could spike into an abnormal range in the short term, reflecting just a temporary health problem (like CRP when you have an infection at the time we assess you). But if we assess you on multiple occasions, that temporary noise from passing health problems or illnesses gets averaged out and we get a better long-term trend.
  • The question of the optimal density of measurement for something like this is a good one. It’s going to depend on the kind of measures you’re taking. For some measures, an ultra-dense measurement design will contribute mostly noise, not a lot of additional signal, and it could be quite expensive. For something like these facial images – low-cost, passively accumulated measurement – there is a lot of potential, but we’ve got to show it means what you think it means. Correlation with these surrogates is good; prediction of hard outcomes is better. We already know from facial recognition technology that there’s a lot of subtlety in what makes these algorithms work the way they do – so what gets detected as aging may or may not be this underlying process of aging, even if it is something that ultimately makes you ill and kills you. And I think that matters if we want to slide it in as a surrogate endpoint for an intervention designed to modify the biological processes of aging specifically, rather than, for example, skin rejuvenation or more sleep.

If compliance with caloric restriction is very hard, what about doing an intermittent fasting version, or a carbohydrate restriction version, both of which are much easier than full-on caloric restriction?

  • It’s full speed ahead – the NIA is on its way, and our measure will hopefully be there.


Do you expect an intervention to slow the rate of aging to be more effective among “fast agers” versus “slow agers”?

  • I can make predictions both ways.
  • People who are already aging slowly could be at a ceiling effect where they might not be able to age any slower. So if you include slow agers in your clinical trial, you might be sort of handicapping yourself and you wouldn’t be able to get much of an effect size for the outcome.
  • On the other hand you might say that the fast agers have already gone too far in the direction of organ damage and therefore it might be difficult to reverse that damage that has already happened. Therefore you might want to start your trial with medium agers and see if you can slow them down.
  • It might be that the very slow and very fast don’t show much of an effect size, so you could preload your trial with average agers and exclude the extreme agers. If I were running a trial, that might be my strategy. But as long as we don’t know before the trial starts whether somebody is already aging faster or slower, we can’t pre-register that interaction between the treatment and the initial Pace of Aging. The FDA takes a dim view of post-hoc investigations of treatment-sensitive subgroups; it’s much better if you can pre-register those kinds of analyses before the trial starts. Whether it is Pace of Aging, clocks, or other biomarkers, it would be a good idea for trials to collect pre-trial data so that moderation hypotheses can be pre-registered.


What would be a good time window to get good pre-trial data? One year, three years?

  • The hypothesis that is motivating the development of these methylation algorithms is that you could observe them at pre-treatment baseline or at enrollment screening and stratify that way. If we are thinking about actual repeated measures, longitudinal followup, physiology, that’s a different question – we have some evidence that a decade is enough time. If the question is whether we could do it faster, I don’t know.
  • People and reviewers always ask why we didn’t measure these people every year, annually, so that we could have a series of 20 years of annual biomarkers. And we say that it’s a great idea but neither the study members nor the funding agency is fond of the idea. We see them as often as we can.

In the chart comparing DunedinPACE to chronological age – is the increase you are showing in the Pace of Aging over lifespan monotonic or is there an inflection point especially around the age of 60? Is something special happening there?

  • People are certainly interested in this question; there seems to be no obvious inflection point in these data around age 60 that I can see. Since it is only 1000 people, our statistical power to identify subtle shifts in the variance is limited. But this is the kind of question that, once we accumulate more datasets, we should be able to go after. I should note that because of mortality selection, our expectation is that the expanding variance with advancing chronological age is going to be limited by the fact that you just can’t be aging that fast and be that old at the same time, apart from a few outliers. What we are also seeing in data from different studies is that the variance winnows at the oldest ages. People’s Pace of Aging is still faster, but not quite as fast as in those a bit younger, maybe because the carrying capacity for rapid aging is eroded by the aging process itself. So the people who would be aging that fast at that old age are just not in the population anymore.
  • It is good to note here again that DunedinPACE was developed and modeled on data from this cohort of 1000 people at ages 26-45, so there is no healthy-survivor bias, because they haven’t started dying yet, and they don’t yet have many diseases like heart attack or cancer. That is one advantage of doing this research in younger cohorts: you capture the true population variation.


You mentioned Morgan Levine’s preprint on making test-retest consistency higher. My memory is that the main way to modify a methylome clock to do that was to use more sites in the array, not just to pick the individual sites that have more self-consistency. Maybe consider doing that, analogous to the way her group modified the existing clocks.

  • This is a good suggestion. Although, since we are already at an ICC near the ceiling and the correlation between the DNAm algorithm and the observed Pace of Aging is high, there is a question about the marginal benefit. But this is definitely worth doing.


Have you pushed the data on decline in functional measures in the 20’s and 30’s? It seems super important for people to understand that aging really starts in early adulthood – something not widely recognized by the public, I think. Professional athletes know this very well, but the general public seems much more oblivious to it.


Can you talk a bit more about the idea that pace of aging (speedometer) might be more useful than an instant timepoint (clock)? I think we can dismiss the first generation of methylation clocks, which were trained just to predict chronological age – your points about survivorship bias are excellent. But with the second-generation clocks, the idea is that they are predicting time till death. So why is pace of aging a better concept on which to test interventions than time till death – if you really want to see whether you are reversing the current state of biological age (which my intuition says is what you want to do)?

  • Well, let’s say you have a dataset with people aged 18-80, you ascertain their biomarkers so you have a record of their current health, and you train your methylation measure on that. It mixes up temporary sickness (not aging) with actual aging effects. So there is a source of noise that you can’t exclude with one-wave data. That’s the beauty of multiple waves.
  • Another beauty of our study is that all the people were born in the same year – there isn’t any way that their physiology and the function of their organ systems could have been differentially affected by things in their environment while they were children. Take, for example, the datasets used to train the biological-age variables in previous clocks. People in those datasets who were in their 80’s had quite different childhoods in terms of exposure to lead, antibiotics, vaccinations, air conditioning, etc. So there has been a lot of environmental exposure for the older people that younger people in the same dataset didn’t have. I don’t consider environmental exposure aging, so it adds noise. We have these sources of noise and we don’t know how noisy or bad they are, because the research hasn’t been done to evaluate them, so they could be a problem for interpreting clock age as aging per se.
  • But the real challenge is the uncertain-timing issue when it comes to thinking about whether this is appropriate for an intervention trial. If you are designing a trial to turn back the clock, to rejuvenate with a big effect, then probably you are fine – in fact you might even want something like a clock that tells you biological age, because that’s what is going to be sensitive to a reversal of aging by, say, 10 years. But if you think your intervention is going to have a more modest effect – that it’s going to slow the accumulation of the molecular changes that drive aging – then you want a measurement focused on the rate of change, of deterioration in the organism, ongoing during the intervention, because that’s what the intervention is going to modulate.
  • So ultimately, it depends on what you want to test. My intuition about the interventions that are going to be feasible to give to humans in large numbers, is that they are going to slow the pace of aging, and to the extent they rejuvenate, the rejuvenation will be moderate. But I understand there are many others out there who have a different view and there is some animal data that the alternate view is at least possible.


Your data could be used to create a point-in-time measure that predicts morbidity. You don’t have a big enough cohort yet for mortality, but you could train a clock using the GrimAge methodology specifically on the decline in function over the decade-long timeframe. Would that clock be much different from what you ended up with in DunedinPACE? And is that a worthwhile thing to do or not?

  • It’s good food for thought. We have all the variables that would be needed, for example pack-years of smoking (one of the variables in GrimAge). Our way of thinking (untested, so it might be wrong) is that once those pack-years are behind you, you’re not changing them. So we wanted things that were more fluid.


Have you had a chance to test children with progeria or other fast-aging diseases? That is probably a good model for cardiovascular or skin aging, for example. And are you able to separate out facets – there is probably a cardiovascular age, an immunological age, a cancer age? Are you able to sort things out at all?

  • We haven’t thought about the progerias – sounds like a good idea worth investigating. One of the things we are proposing to do with the Dunedin study is to focus on some of the things we’ve been measuring for some time and have enough waves to look at carefully – say, sexual aging; we have enough variables there that have been measured repeatedly (from the 20’s to the 50’s). Or inflammaging: we have enough inflammatory biomarkers, and we’re going to do some in-vitro work on cellular-level response to inflammatory challenge. So yes, we’re focusing on breaking this down into different parts. A good way forward would be to import DunedinPACE into medical research on things like progeria, as you said.
  • We just haven’t got that export going yet. But now that we’ve made the Pace of Aging from 4 waves and it is out there – TruDiagnostic has licensed it for commercial purposes, and we’ll make the algorithm available to other research teams – I’m hoping that it will spread. It’s available in the Health and Retirement Study, for example, so you can download their dataset and look at it in that context.


Where are you with respect to understanding the basis of slow and fast pace of aging (e.g. genetics, lifestyle)? I remember reading about the Dunedin study for the first time: there were 3 individuals who did not age over the period of the study, and I just wanted to know who these individuals are, what they are doing, where they live, and all that. Obviously there are privacy concerns, but at least I could follow what they are doing. Has there been any investigation around that?

  • We haven’t published any papers on slow agers, but it is something we’re always kind of looking at. What we’re looking for is something about them that is not just on a continuum from fast to slow, but something special, discontinuous. We need to set our minds to really doing that. We’re hoping that with the four waves, extending to five with the next followup at age 52, we’ll have enough precision to identify quite a substantial group of slow agers, so it’s definitely on our radar. But I can tell you, since it is 1000 people from 1 city, it represents people from all walks of life. We have a few Olympic athletes in there with truly amazing lifestyles, for example.


You have data from young ages, during childhood development (let’s say 5-20), but applying the DunedinPACE biomarker to that data was missing from your slides, so I wonder why. How does it behave at those ages, and what do you think it means in terms of the nature of aging?

  • I can give you a sneak peek of a finding – we haven’t published or written it up yet, but we have looked back, because so many people ask us this question. We identified the things that are available in the dataset (we don’t have all the same biomarkers going back to birth), but we have things like intrauterine growth retardation. We don’t have methylation for those younger ages, though we do have a sample from when they were 5-day-old newborns, so there are some things we can do. What we can leak to you is that if you look back at timepoints of maturation – when they first walked, first talked, adiposity rebound, age at puberty, onset of menarche, menopause, things like that – the Pace of Aging is related to some of those developmental milestones very early in life. So that is interesting. We would of course like to measure the pace of aging during the early phases better, but there isn’t a lot of blood data from pediatric cohorts. Other tissues are collected more often, which is why a different lab built a clock from buccal cells. It’s also why many groups are trying to translate these blood-based assessments to saliva, which includes leukocytes, the same cells we get DNA from in blood. That would expand the opportunities to ask those questions in young children.


What is your challenge for the longevity field?

  • There are numerous DNA methylation-based clocks for measuring age in clinical trial participants, and DunedinPACE is a DNA methylation-based tool for measuring the current pace of aging in trial participants. But whole-genome methylation is still expensive. What’s the solution for translating methylation measures into something cheaper that is feasible for repeat administration throughout the course of a gero-protective clinical trial?
  • Aging measures need to be validated as sensitive to therapeutic change by applying them to participants in drug or behavioural treatments already proven effective at slowing aging. How can we get this work going? What are the barriers?


What could this group and people who read the summary or watch on Youtube do to help your work?

  • To the many folks who are actually in the field doing RCTs – taking measurements like ours and testing them in your studies is really what we need. There are many randomized trials of interventions known to extend lifespan when effective that haven’t been examined with the kinds of aging biomarkers we are developing. The Look AHEAD Trial is one example. Those trials provide a setting in which it might be feasible to test whether these biomarkers are in fact sensitive to change.
  • To those of you who are bioinformaticians and molecular biologists dealing with the measurement of molecular quantities: efforts to translate these array-based measures into sequencing-based approaches are another frontier, where we could potentially improve the reliability of measurement (sequencing seems to fare better in that respect) and also reduce cost, because you don’t have to sequence the whole genome – you just need targeted sequencing of the identified sites.
  • And the third thing is the cross-tissue consistency of these measurements. Those are the things we are working on, but they will take participation from the full research community, not just a handful of labs.
  • We also need more research in African American, Hispanic, Asian, and Native American populations. We have no idea how measures of aging work across different ethnic groups, and we need to expand our horizons in that way. If you want to learn more about this, you can start with this paper.


Seminar summary by Bolek Kerous.