
Metabolomics and Machine Learning for PMI Estimation: UHPLC–QTOF-MS Profiling of Postmortem Biomarkers
Key Takeaways
- UHPLC–QTOF-MS and machine learning were used to identify stable biomarkers for PMI estimation, achieving 3–6 hour accuracy.
- Identified metabolites include amino acids, nucleosides, and nucleotides; all but anserine are also found in humans, giving them potential for forensic application.
Ida Marie Marquart Løber explores postmortem interval estimation using UHPLC–QTOF-MS and machine learning, profiling rat tissue metabolites and outlining pathways for human adaptation toward objective, AI-driven forensic timelines.
Ida Marie Marquart Løber of Aarhus University discusses her recent work on advancing postmortem interval (PMI) estimation through untargeted ultrahigh-pressure liquid chromatography–quadrupole time-of-flight mass spectrometry (UHPLC–QTOF-MS) coupled with machine learning. In her study, metabolic profiles were generated from multiple rat tissues across defined decomposition time points, followed by rigorous data curation to retain only reproducible molecular features. Løber also outlines the steps required to adapt these models to human casework and reflects on how AI-driven approaches may reshape forensic timelines into quantitative, objective tools.
You have recently published a paper detailing a machine learning (ML) approach for accurate PMI estimation (1). How was untargeted ultrahigh-pressure liquid chromatography–quadrupole time-of-flight mass spectrometry (UHPLC–QTOF-MS) used in this study to profile postmortem metabolites, and how was it combined with ML for PMI estimation?
Think of UHPLC–QTOF-MS as a super-sensitive chemical “camera.” Instead of taking photos, it captures thousands of tiny molecules floating around in tissues. In our study, we allowed 52 rats to decompose for up to four days and collected blood, brain, muscle, and even eye fluid at set time points (~0 h, 24 h, 48 h, 72 h, and 96 h). By running these samples through UHPLC–QTOF-MS, we obtained a comprehensive snapshot of how the chemical landscape, and with it the biomarkers, changes over time after death.
To improve data reliability, we ran multiple injections of the same samples (technical replicates), which let us identify the molecular features that were measured reproducibly and filter out those that were just noise, reducing the variability in the data set while preserving the biological signal. After this filtering, we kept the 200 most stable molecular features per tissue.
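As an illustration of this kind of replicate-based stability filtering, the short Python sketch below ranks features by their coefficient of variation across technical replicate injections and keeps the most stable ones. The matrix layout, the `stable_features` helper, and the toy data are hypothetical choices for the example, not the published pipeline.

```python
# Hypothetical sketch of replicate-based feature filtering: keep the features
# whose intensities are most reproducible across technical replicate injections.
import numpy as np
import pandas as pd

def stable_features(intensities: pd.DataFrame, sample_of: pd.Series,
                    n_keep: int = 200) -> pd.Index:
    """Rank features by their median coefficient of variation (CV) across
    technical replicates and keep the n_keep most stable ones.

    intensities : injections x features matrix of peak intensities
    sample_of   : maps each injection (row) to its biological sample ID
    """
    grouped = intensities.groupby(sample_of)
    # CV of each feature within each group of technical replicates
    cv_per_sample = grouped.std(ddof=1) / grouped.mean()
    # A feature's overall stability score: its median CV across all samples
    median_cv = cv_per_sample.median(axis=0)
    return median_cv.nsmallest(n_keep).index

# Toy usage: 6 injections = 3 biological samples x 2 technical replicates
rng = np.random.default_rng(0)
X = pd.DataFrame(rng.lognormal(10, 0.3, size=(6, 500)),
                 columns=[f"feature_{i}" for i in range(500)])
sample_ids = pd.Series(["s1", "s1", "s2", "s2", "s3", "s3"], index=X.index)
kept = stable_features(X, sample_ids, n_keep=200)
print(len(kept), "features retained")
```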
That’s still a lot of data, so this is where ML stepped in. With real-world forensic applicability in mind, and using algorithms like Lasso regression (for simple, linear patterns) and Random Forests (for more complex, non-linear patterns), we narrowed it down to just 15 biomarkers per tissue. Those 15 were enough to build predictive models that estimated the postmortem interval (PMI) with an accuracy of about 3–6 h. To ensure this wasn’t just a fluke, we validated the models on a completely independent group of rats sampled a year later, still with great results. This independent validation confirmed the predictive reliability of the selected biomarker panels.
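To make the modeling step concrete, the hypothetical Python sketch below applies the two strategies mentioned, Lasso for sparse linear selection and a Random Forest for non-linear patterns, reduces each to a 15-feature panel, and scores the result on held-out samples. The synthetic data, variable names, and scoring choices are placeholders, not the study’s actual code or results.

```python
# Hypothetical sketch: select 15-feature panels with Lasso and Random Forest,
# then evaluate PMI prediction error on held-out samples.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LassoCV
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

# Placeholder data: 200 stable features per tissue, PMI (in hours) as the target
X, y = make_regression(n_samples=60, n_features=200, noise=5.0, random_state=1)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1)

# Linear route: Lasso shrinks most coefficients to zero; keep the 15 largest
lasso = LassoCV(cv=5, random_state=1).fit(X_train, y_train)
lasso_panel = np.argsort(np.abs(lasso.coef_))[::-1][:15]

# Non-linear route: Random Forest importances give an alternative 15-feature panel
forest = RandomForestRegressor(n_estimators=500, random_state=1).fit(X_train, y_train)
forest_panel = np.argsort(forest.feature_importances_)[::-1][:15]

# Refit on each panel and report error on held-out samples (a stand-in for
# the independent rat cohort collected a year later)
for name, panel in [("Lasso panel", lasso_panel), ("Forest panel", forest_panel)]:
    model = RandomForestRegressor(n_estimators=500, random_state=1)
    model.fit(X_train[:, panel], y_train)
    err = mean_absolute_error(y_test, model.predict(X_test[:, panel]))
    print(f"{name}: MAE = {err:.1f}")
```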
Which types of metabolites were identified as consistent biomarkers for postmortem interval estimation, and what is their significance?
The identified metabolites included amino acids, nucleosides, nucleotides, and various breakdown products—the “biological rubble” left as tissues degrade. All selected biomarkers changed in concentration in predictable ways: some increased, others decreased or followed more complex fluctuating patterns. That kind of consistency is exactly what makes them useful as biological indicators of elapsed time since death.
Most of these molecules are also found in humans, which makes them good candidates for translation into forensic practice. The one exception is anserine, but its close human analogue carnosine could serve the same role. What really surprised us was finding lactoylated phenylalanine (Lac-Phe). Normally, this molecule appears in athletes during intense exercise because it’s linked to anaerobic metabolism. We saw it accumulate after death, reflecting the oxygen-deprived state of tissues—an observation not previously reported in a forensic context.
Given that your models were developed using rat tissues, what are the key steps for validating and adapting these models for human forensic cases?
First, we need to see if the same biomarkers behave similarly in human tissues over different PMIs. Then, we need to stress-test the models under real-world conditions: outdoor vs. indoor settings, hot vs. cold climates, and different causes of death. That’s where things get messy, because unlike lab rats, human cases come with huge variability.
To deal with this, we’ll need large, diverse training data sets from human autopsies covering multiple tissues and lots of metadata (case information). Only then can we adapt, retrain, and calibrate models for forensic casework. We have already initiated the first steps towards translating our findings from animals to humans, in collaboration with The National Board of Forensic Medicine (Rättsmedicinalverket, Sweden) and Linköping University.
What are the critical methodological considerations highlighted in the study for ensuring reliable metabolite detection and accurate modeling?
We took contamination seriously. Each rat was sampled only once to avoid microbial interference. We included quality control samples and frequent instrument calibration to make sure the UHPLC–QTOF-MS system behaved consistently. Technical replicates helped us identify reproducible features, and we applied robust normalization (ignoring extreme outlier peaks) to minimize batch effects, which are notorious in LC–MS data.
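The sketch below illustrates one simple form of robust, per-sample normalization of the kind described, scaling each injection by a trimmed mean so that a few extreme peaks cannot dominate the scaling factor. The trimming fraction, log transform, and toy batch data are illustrative assumptions rather than the published settings.

```python
# Hypothetical robust normalization: scale each injection by a trimmed summary
# of its peak intensities so extreme peaks cannot drive batch-to-batch offsets.
import numpy as np
import pandas as pd
from scipy.stats import trim_mean

def robust_normalize(intensities: pd.DataFrame, trim: float = 0.1) -> pd.DataFrame:
    """Divide each sample (row) by its trimmed mean intensity, then log-transform."""
    scale = intensities.apply(lambda row: trim_mean(row, trim), axis=1)
    normalized = intensities.div(scale, axis=0)
    return np.log2(normalized + 1)

# Toy usage: two batches with a deliberate 2x intensity offset
rng = np.random.default_rng(0)
batch_a = rng.lognormal(10, 0.2, size=(5, 300))
batch_b = 2.0 * rng.lognormal(10, 0.2, size=(5, 300))
data = pd.DataFrame(np.vstack([batch_a, batch_b]))
print(robust_normalize(data).mean(axis=1))  # batch offset largely removed
```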
On the modeling side, we tested both linear (Lasso) and non-linear (Random Forest) approaches, because some biomarkers change in straight-line patterns while others curve or fluctuate. We didn’t just rely on cross-validation—we went the extra mile with a completely independent validation set collected a year later. That’s rare in metabolomics, and it gave us confidence in the reproducibility of our findings.
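A minimal sketch of that two-stage check, assuming a generic regression data set: internal cross-validation on the original cohort, followed by a single score on a held-out cohort standing in for the rats sampled a year later. The synthetic data and 15-feature panel are placeholders for the real biomarker measurements.

```python
# Hypothetical validation scheme: cross-validation within the original cohort,
# then one final evaluation on a completely separate cohort.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

# One synthetic data set, with the last 20 animals reserved as the stand-in
# "independent cohort collected a year later"
X, y = make_regression(n_samples=70, n_features=15, noise=4.0, random_state=2)
X_orig, y_orig = X[:50], y[:50]
X_indep, y_indep = X[50:], y[50:]

model = RandomForestRegressor(n_estimators=500, random_state=2)

# Internal check: 5-fold cross-validation, reported as mean absolute error
cv_mae = -cross_val_score(model, X_orig, y_orig, cv=5,
                          scoring="neg_mean_absolute_error")
print(f"Cross-validated MAE: {cv_mae.mean():.1f}")

# External check: train on the full original cohort, score on the independent one
model.fit(X_orig, y_orig)
indep_mae = np.mean(np.abs(y_indep - model.predict(X_indep)))
print(f"Independent-cohort MAE: {indep_mae:.1f}")
```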
In what ways do these metabolomics and machine learning approaches advance current methods of PMI estimation?
Classical tools like Henssge’s nomogram can only get you so far, often with error margins of many hours (or more), and they depend heavily on the examiner’s judgment of decomposition signs. Our models cut that uncertainty down to around 3–6 h and remove much of the subjectivity.
Because we tested multiple tissues, the approach is flexible: pathologists can use whichever tissue is available in each case. And unlike categorical methods (“12–24 h after death”), our regression models estimate PMI as a continuous variable, giving a much finer timeline that’s more useful in legal investigations.
How do you think ML and AI will shape the future of forensic applications?
ML and AI aren’t a magic wand, but when paired with high-quality data, they’re a game changer. In forensics, and especially in PMI estimation, they can help us make sense of incredibly complex biological signals that humans alone could never fully interpret.
The future isn’t about AI replacing forensic scientists; it’s about empowering them. I see AI helping us connect the dots across multiple types of evidence, from chemistry and biology to environmental data, so we can build clearer, stronger cases. That’s where we’re headed, but only if we keep prioritizing data quality, careful study design, and rigorous ML and AI implementation. If we do that, I believe that ML and AI can help us transform forensic timelines from rough estimates into precise, objective science.
Reference
(1) Løber, I. M. M.; Hedemann, M. S.; Villesen, P.; Nielsen, K. L. Untangling the Postmortem Metabolome: A Machine Learning Approach for Accurate PMI Estimation. Anal. Chem. 2025, 97, 16123–16132. DOI: 10.1021/acs.analchem.4c05796
Ida Marie Marquart Løber is a PhD candidate in forensic medicine at Aarhus University with an M.Sc. in medicinal chemistry. She has developed deep expertise in analytical chemistry, metabolomics, and high-resolution mass spectrometry, alongside method development and machine learning for biomarker discovery. Her research is dedicated to translating complex data into practical forensic tools, bridging the gap between advanced science and real-world application. Through international collaborations in Denmark and Sweden, she emphasizes reproducibility and cross-disciplinary knowledge exchange. Beyond her technical skills, Ida thrives in collaborative environments, leading projects from experimental design through data analysis, and is passionate about transforming complex data sets into actionable insights that push forensic science forward.