Prins Hendrikstraat 1
5611HH Eindhoven
The Netherlands
info@clinlabint.com
Dyslipidemia is one of the major risk factors for the development of cardiovascular disease (CVD). However, which lipoproteins to measure and what cut-off points to use in order to accurately assess this risk remain debatable.
by Mohamed S. Elgendy and Dr Mohamed B. Elshazly
Cardiovascular disease (CVD) mortality in the US in 2011 was estimated at 786,641 deaths, representing approximately 33% of total annual deaths [1]. It remains the leading cause of mortality and morbidity in the developed world. Decades of study have identified dyslipidemia as one of the major modifiable risk factors for CVD, amenable to both behavioral change and medication.
Lipoproteins
Lipoproteins are small particles formed of lipids and proteins, which play an important role in the transport and metabolism of cholesterol. Based on their relative density, they are divided into five major categories: high-density lipoprotein (HDL), low-density lipoprotein (LDL), intermediate-density lipoprotein (IDL), very low-density lipoprotein (VLDL), and chylomicrons. LDL carries 60–70% of total serum cholesterol, HDL carries 20–30%, and VLDL carries 10–15% [2]. The remaining lipoproteins, namely triglyceride-rich lipoproteins such as VLDL, remnants and IDL, in addition to lipoprotein(a), carry a relatively small fraction of total cholesterol. Numerous studies have shown that LDL is the most atherogenic lipoprotein particle, and lowering its levels has been the cornerstone of dyslipidemia management and CVD risk reduction in recent years. However, there is emerging evidence indicating that other lipoproteins also play a significant role in the process of atherogenesis [23].
Relationship between lipoproteins and CVD risk
Several studies have reported a continuous relationship between LDL reduction and CVD risk reduction [3]. No threshold has been identified below which a lower LDL concentration is not associated with lower risk [4]. For example, in the recent IMPROVE-IT trial, the incidence of CVD morbidity and mortality was lower in the ezetimibe/simvastatin group (median follow-up LDL-C of 53.7 mg/dL) than in the simvastatin-alone group (median follow-up LDL-C of 69.5 mg/dL) [5]. In another study, individuals with hypobetalipoproteinemia, who have LDL-C levels below 70 mg/dL, showed prolonged longevity and very low rates of myocardial infarction [6]. All of this supports the notion of ‘lower is better’.
LDL-C levels in the range of 25–60 mg/dL are considered physiologically adequate [7]. Even levels below 25 mg/dL have failed to show any adverse effects in recent trials [8, 9]. Although adverse effects of very low LDL, such as hemorrhagic stroke and neurocognitive deficits, have been reported in some studies, they were neither significant nor consistent [10, 11]. Therefore, the benefits of achieving very low LDL levels appear to outweigh the risks. On the one hand, the lack of randomized clinical trials comparing outcomes at different LDL goals has made it difficult to reach a consensus among guidelines on the optimal goals for high-risk patients or those with coronary disease equivalents, with the commonly used target still being <70 mg/dL [12–14]. On the other hand, the most recent American College of Cardiology (ACC)/American Heart Association (AHA) guidelines abandoned targets altogether, on the premise that the benefit of statin therapy is independent of LDL level [15]. Despite these differences, we believe the weight of evidence suggests that LDL can never be too low, although data examining patients with extremely low levels (<25 mg/dL) are still limited. The potential establishment of new, even lower LDL targets in upcoming guidelines will require careful examination of data from proprotein convertase subtilisin kexin-9 (PCSK-9) trials to identify specific LDL levels below which risk outweighs benefit.
Other factors contribute to total atherogenic risk
Despite the established recognition of LDL as the most atherogenic lipoprotein, it is not representative of total atherogenic risk. Elevated triglycerides have been found to be associated with increased CVD risk, suggesting that triglyceride-rich lipoproteins (TGRLs), especially the remnants, are atherogenic. These lipoproteins include VLDL, IDL, and chylomicrons (the latter only in the non-fasting state). As standard LDL measurement by the Friedewald formula [Total cholesterol – HDL – triglycerides/5] [45] only includes LDL-C and lipoprotein(a), non-HDL has been proposed as a more inclusive parameter of atherogenic risk because it also incorporates VLDL-C, IDL-C and remnants in addition to LDL-C. In fact, several studies have demonstrated that non-HDL-C is more strongly associated with CVD than LDL-C and is a more powerful risk predictor [16–21]. Moreover, non-HDL measurement comes at no extra cost, as it is calculated from the standard lipid profile by subtracting HDL from total cholesterol, and it does not require prior fasting. Nevertheless, because fewer studies have examined non-HDL as a target of therapy than have examined LDL, most current guidelines recommend non-HDL only as a secondary target of therapy [2, 12, 14, 22]. Only the National Lipid Association recommends non-HDL as a primary target of therapy alongside LDL [22]. We believe this situation represents a transitional phase toward using non-HDL as a primary target of therapy, much like the past transition from total cholesterol to LDL-C. This is most important when discordance exists between LDL and non-HDL levels within individuals, a relatively common finding, particularly in patients with low LDL and high triglyceride levels [23]. The currently recommended non-HDL treatment goal is 30 mg/dL higher than that of LDL-C, based on the rationale that VLDL-C is ‘normal’ (<30 mg/dL) when the triglyceride level is <150 mg/dL [2].
However, in a recent study of 1.3 million US adults, a non-HDL level of 93 mg/dL was percentile-equivalent to an LDL of 70 mg/dL [23], suggesting that a lower non-HDL goal should be targeted.
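Both calculated parameters discussed above come directly from the standard lipid panel. The short Python sketch below shows the arithmetic; the function names are illustrative rather than taken from any clinical library, and the TG > 400 mg/dL validity limit of the Friedewald estimate is a widely known caveat added here for safety, not a point made in the text.

```python
# Calculated lipid parameters from a standard lipid profile (all values in mg/dL).
# Function names are illustrative, not from any clinical library.

def friedewald_ldl_c(total_c: float, hdl_c: float, triglycerides: float) -> float:
    """Estimate LDL-C by the Friedewald formula: TC - HDL - TG/5."""
    if triglycerides > 400:
        # The Friedewald estimate is generally considered unreliable above this level.
        raise ValueError("Friedewald estimate invalid for TG > 400 mg/dL")
    return total_c - hdl_c - triglycerides / 5

def non_hdl_c(total_c: float, hdl_c: float) -> float:
    """Non-HDL-C = TC - HDL; captures LDL-C plus VLDL-C, IDL-C and remnants."""
    return total_c - hdl_c

# Example panel: TC 200, HDL 50, TG 150
print(friedewald_ldl_c(200, 50, 150))  # 120.0
print(non_hdl_c(200, 50))              # 150
```

Note how non-HDL-C exceeds the Friedewald LDL-C by the estimated VLDL-C (TG/5), which is the arithmetic behind the 30 mg/dL offset between the two treatment goals.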
Particle-based measures such as apolipoprotein B (Apo-B) and LDL particle concentration (LDL-P) also have the potential to replace cholesterol-based measures such as LDL or non-HDL as predictors of risk and targets of therapy. Apo-B constitutes the protein component of almost all the known atherogenic lipoproteins: VLDL, IDL, and LDL; therefore, Apo-B measurement has been suggested to better estimate particle concentration, a more accurate reflection of subendothelial atherogenesis. Apo-B has been shown to be a better risk marker than LDL in multiple studies [17, 21, 24–29]. Many guidelines currently recommend Apo-B as an optional risk marker and target of therapy [12, 14, 22, 30]. Similarly, almost all studies comparing LDL-P to LDL-C have shown the superiority of particle concentration for CVD risk assessment [31–34]. In the LUNAR trial and the Framingham Offspring Study, Apo-B and LDL-P, respectively, correlated strongly with non-HDL, suggesting that non-HDL, available from the standard lipid profile, can be used satisfactorily for risk assessment [31, 35], keeping in mind that Apo-B may be superior in instances where discordance exists [36].
Whereas individual lipid parameters are important in risk prediction, summary estimates that assess the ratio of pro-atherogenic to anti-atherogenic lipoproteins also add important prognostic information regarding CVD risk. Of the ratios that have been considered, the total cholesterol to HDL cholesterol ratio (TC/HDL) and Apo-B/A1 are the most promising. Despite TC/HDL's strong association with CVD risk [37–43], some have argued that the ratio adds little, given that its two variables already enter the Friedewald estimation of LDL, the calculation of non-HDL-C and CVD risk estimation scores, and given the contentiousness of HDL-raising therapeutic strategies. However, a recent study of 1.3 million individuals documented significant patient-level discordance of TC/HDL in relation to LDL and non-HDL [44, 45]. This implies that TC/HDL may carry additional information reflecting atherogenic particle size and concentration [44, 45]. Notably, a TC/HDL ratio of 2.6 was percentile-equivalent to an LDL level of 70 mg/dL (Table 1). Outcome data examining the clinical impact of TC/HDL discordance are still in progress, and thus current guidelines do not recommend using TC/HDL.
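To make the notion of patient-level discordance concrete, the sketch below flags a hypothetical patient against the percentile-equivalent cut-off points quoted above (LDL-C 70 mg/dL, non-HDL-C 93 mg/dL, TC/HDL 2.6). The function name and the example values are invented for illustration.

```python
# Flag whether each lipid marker is below its percentile-equivalent cut-off.
# Cut-off values (70, 93, 2.6) are those quoted in the text; the function
# name and the patient values are purely illustrative.

def lipid_flags(total_c: float, hdl_c: float, triglycerides: float) -> dict:
    ldl = total_c - hdl_c - triglycerides / 5  # Friedewald estimate
    non_hdl = total_c - hdl_c
    ratio = total_c / hdl_c
    return {
        "ldl_below_70": ldl < 70,
        "non_hdl_below_93": non_hdl < 93,
        "tc_hdl_below_2.6": ratio < 2.6,
    }

# Low LDL-C but high triglycerides: the markers disagree (discordance).
print(lipid_flags(total_c=160, hdl_c=40, triglycerides=300))
# LDL-C = 60 (below its cut-off), but non-HDL-C = 120 and TC/HDL = 4.0 are not.
```

This is exactly the situation described in the text: an apparently reassuring LDL-C coexisting with elevated non-HDL-C and TC/HDL, so relying on LDL-C alone would understate the atherogenic burden.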
Summary
There is no doubt that the field of dyslipidemia management has been one of the most dynamic fields in cardiology over the last three decades. With the recent advent of PCSK-9 inhibitors, we need to re-evaluate our understanding of lipoprotein reduction and ask ourselves important questions: Should guidelines re-establish treatment targets? What is the best lipoprotein parameter for predicting risk? Is one parameter superior, or is the input of multiple parameters required? What do we do when discordance exists between lipid parameters within individuals? Although much of the data necessary to answer these questions is still being gathered, recent data can already provide some insightful answers. First, LDL-C is not the optimal marker for total atherogenic risk. Second, instead of evaluating the performance of individual lipid parameters at a population level, we should evaluate their performance at an individual level, where identifying discordance within individuals is key to understanding which marker may be superior. Third, particle-based measures such as Apo-B and LDL-P may be superior to cholesterol-based measures; however, summary estimates such as the TC/HDL or Apo-B/A1 ratios also add significant information to individual parameters. Fourth, identifying new lipoprotein treatment goals depends on identifying the lipoprotein levels below which risk may outweigh benefit. It therefore seems likely that, in the future, very low percentile-equivalent cut-off points of several lipoprotein parameters and ratios will be set as simultaneous treatment goals.
References
1. Roger VL, Go AS, et al. Heart disease and stroke statistics–2012 update. Circulation 2012; 125(1): e2–220.
2. NCEP Expert Panel. Third report of the National Cholesterol Education Program (NCEP) Expert Panel on Detection, Evaluation, and Treatment of High Blood Cholesterol in Adults (Adult Treatment Panel III) final report. Circulation 2002; 106(25): 3143–3421.
3. Boekholdt SM, Hovingh GK, et al. Very low levels of atherogenic lipoproteins and the risk for cardiovascular events: a meta-analysis of statin trials. J Am Coll Cardiol. 2014; 64(5): 485–94.
4. Law MR, Wald NJ, et al. By how much and how quickly does reduction in serum cholesterol concentration lower risk of ischaemic heart disease? BMJ 1994; 308(6925): 367–72.
5. Giugliano RP, Blazing MA. IMProved Reduction of Outcomes: Vytorin Efficacy International Trial. American College of Cardiology 2015; http://www.acc.org/latest-in-cardiology/clinical-trials/2014/11/18/16/25/improve-it
6. Glueck CJ, Gartside P, et al. Longevity syndromes: familial hypobeta and familial hyperalpha lipoproteinemia. J Lab Clin Med. 1976; 88(6): 941–957.
7. Brown MS, Goldstein JL. A receptor-mediated pathway for cholesterol homeostasis. Science 1986; 232(4746): 34–47.
8. Robinson JG, Farnier M, et al. Efficacy and safety of alirocumab in reducing lipids and cardiovascular events. N Engl J Med. 2015; 372(16): 1489–1499.
9. Horton JD, Cohen JC, et al. PCSK9: a convertase that coordinates LDL catabolism. J Lipid Res. 2009; 50(Supplement): S172–177.
10. Law MR, Thompson SG, et al. Assessing possible hazards of reducing serum cholesterol. BMJ 1994; 308(6925): 373–379.
11. Hsia J, MacFadyen JG, et al. Cardiovascular event reduction and adverse events among subjects attaining low-density lipoprotein cholesterol <50 mg/dl with rosuvastatin: The JUPITER Trial (Justification for the use of statins in prevention: an intervention trial evaluating rosuvastatin). J Am Coll Cardiol. 2011; 57(16): 1666–1675.
12. Genest J, McPherson R, et al. 2009 Canadian Cardiovascular Society/Canadian guidelines for the diagnosis and treatment of dyslipidemia and prevention of cardiovascular disease in the adult – 2009 recommendations. Can J Cardiol. 2009; 25(10): 567–579.
13. Grundy SM, Cleeman JI, et al. Implications of recent clinical trials for the National Cholesterol Education Program Adult Treatment Panel III guidelines. Circulation 2004; 110(2): 227–239.
14. European Association for Cardiovascular Prevention & Rehabilitation, Reiner Ž, Catapano AL, et al. ESC/EAS Guidelines for the management of dyslipidaemias. Eur Heart J. 2011; 32(14): 1769–1818.
15. Stone NJ, Robinson JG, et al. 2013 ACC/AHA guideline on the treatment of blood cholesterol to reduce atherosclerotic cardiovascular risk in adults: a report of the American College of Cardiology/American Heart Association Task Force on Practice Guidelines. Circulation 2014; 129(25 Suppl 2): S1–45.
16. Bittner V, Hardison R, et al. Non-high-density lipoprotein cholesterol levels predict five-year outcome in the Bypass Angioplasty Revascularization Investigation (BARI). Circulation 2002; 106(20): 2537–2542.
17. Boekholdt S, Arsenault BJ, et al. Association of LDL cholesterol, non–HDL cholesterol, and apolipoprotein B levels with risk of cardiovascular events among patients treated with statins: a meta-analysis. JAMA 2012; 307(12): 1302–1309.
18. Li C, Ford ES, et al. Serum non-high-density lipoprotein cholesterol concentration and risk of death from cardiovascular diseases among U.S. adults with diagnosed diabetes: the Third National Health and Nutrition Examination Survey linked mortality study. Cardiovasc Diabetol. 2011; 10: 46.
19. Liu J, Sempos CT, et al. Non–high-density lipoprotein and very-low-density lipoprotein cholesterol and their risk predictive values in coronary heart disease. Am J Cardiol. 2006; 98(10): 1363–1368.
20. Robinson JG, Wang S, et al. Meta-analysis of the relationship between non–high-density lipoprotein cholesterol reduction and coronary heart disease risk. J Am Coll Cardiol. 2009; 53(4): 316–322.
21. Sniderman AD, Williams K, et al. A meta-analysis of low-density lipoprotein cholesterol, non-high-density lipoprotein cholesterol, and apolipoprotein B as markers of cardiovascular risk. Circ Cardiovasc Qual Outcomes 2011; 4(3): 337–345.
22. Jacobson TA, Maki KC, et al. National Lipid Association recommendations for patient-centered management of dyslipidemia: part 2. J Clin Lipidol. 2015; http://linkinghub.elsevier.com/retrieve/pii/S1933287415003803
23. Elshazly MB, Martin SS, et al. Non–high-density lipoprotein cholesterol, guideline targets, and population percentiles for secondary prevention in 1.3 million adults: The VLDL-2 Study (very large database of lipids). J Am Coll Cardiol. 2013; 62(21): 1960–1965.
24. Jiang R, Schulze MB, et al. Non-HDL cholesterol and apolipoprotein B predict cardiovascular disease events among men with type 2 diabetes. Diabetes Care 2004; 27(8): 1991–1997.
25. Shai I, Rimm EB, et al. Multivariate assessment of lipid parameters as predictors of coronary heart disease among postmenopausal women: potential implications for clinical guidelines. Circulation 2004; 110(18): 2824–2830.
26. Sniderman A, Williams K, et al. Non-HDL C equals apolipoprotein B: except when it does not! Curr Opin Lipidol. 2010; 21(6): 518–524.
27. Talmud PJ, Hawe E, et al. Nonfasting apolipoprotein B and triglyceride levels as a useful predictor of coronary heart disease risk in middle-aged UK men. Arterioscler Thromb Vasc Biol. 2002; 22(11): 1918–1923.
28. Walldius G, Jungner I. Apolipoprotein B and apolipoprotein A-I: risk indicators of coronary heart disease and targets for lipid-modifying therapy. J Intern Med. 2004; 255(2): 188–205.
29. Walldius G, Jungner I, et al. High apolipoprotein B, low apolipoprotein A-I, and improvement in the prediction of fatal myocardial infarction (AMORIS study): a prospective study. Lancet 2001; 358(9298): 2026–2033.
30. Grundy SM, Arai H, et al. An International Atherosclerosis Society position paper: global recommendations for the management of dyslipidemia – full report. J Clin Lipidol. 2014; 8(1): 29–60.
31. Cromwell WC, Otvos JD, et al. LDL particle number and risk of future cardiovascular disease in the Framingham Offspring Study – implications for LDL management. J Clin Lipidol. 2007; 1(6): 583–592.
32. El Harchaoui K, van der Steeg WA, et al. Value of low-density lipoprotein particle number and size as predictors of coronary artery disease in apparently healthy men and women: the EPIC-Norfolk Prospective Population Study. J Am Coll Cardiol. 2007; 49(5): 547–553.
33. Mora S, Otvos JD, et al. Lipoprotein particle profiles by nuclear magnetic resonance compared with standard lipids and apolipoproteins in predicting incident cardiovascular disease in women. Circulation 2009; 119(7): 931–939.
34. Otvos JD, Mora S, et al. Clinical implications of discordance between LDL cholesterol and LDL particle number. J Clin Lipidol. 2011; 5(2): 105–113.
35. Ballantyne CM, Pitt B, et al. Alteration of relation of atherogenic lipoprotein cholesterol to apolipoprotein B by intensive statin therapy in patients with acute coronary syndrome (from the Limiting UNdertreatment of lipids in ACS With Rosuvastatin [LUNAR] trial). Am J Cardiol. 2013; 111(4): 506–509.
36. Mora S. Advanced lipoprotein testing and subfractionation are not (yet) ready for routine clinical use. Circulation 2009; 119(17): 2396–2404.
37. Prospective Studies Collaboration, Lewington S, Whitlock G, et al. Blood cholesterol and vascular mortality by age, sex, and blood pressure: a meta-analysis of individual data from 61 prospective studies with 55,000 vascular deaths. Lancet 2007; 370(9602): 1829–1839.
38. Ingelsson E, Schaefer EJ, et al. Clinical utility of different lipid measures for prediction of coronary heart disease in men and women. JAMA 2007; 298(7): 776–785.
39. Manickam P, Rathod A, et al. Comparative prognostic utility of conventional and novel lipid parameters for cardiovascular disease risk prediction: do novel lipid parameters offer an advantage? J Clin Lipidol. 2011; 5(2): 82–90.
40. Kastelein JJP, Steeg WA van der, et al. Lipids, apolipoproteins, and their ratios in relation to cardiovascular events with statin treatment. Circulation 2008; 117(23): 3002–3009.
41. McQueen MJ, Hawken S, et al. Lipids, lipoproteins, and apolipoproteins as risk markers of myocardial infarction in 52 countries (the INTERHEART study): a case-control study. Lancet 2008; 372(9634): 224–233.
42. Mora S, Otvos JD, et al. Lipoprotein particle profiles by nuclear magnetic resonance compared with standard lipids and apolipoproteins in predicting incident cardiovascular disease in women. Circulation 2009; 119(7): 931–939.
43. Ridker PM, Rifai N, et al. Non-HDL cholesterol, apolipoproteins A-I and B100, standard lipid measures, lipid ratios, and CRP as risk factors for cardiovascular disease in women. JAMA 2005; 294(3): 326–333.
44. Elshazly MB, Quispe R, et al. Patient-level discordance in population percentiles of the total cholesterol to high-density lipoprotein cholesterol ratio in comparison with low-density lipoprotein cholesterol and non–high-density lipoprotein cholesterol: The Very Large Database of Lipids Study (VLDL-2B). Circulation 2015; 132(8): 667–676.
45. Friedewald WT, Levy RI, et al. Estimation of the concentration of low-density lipoprotein cholesterol in plasma, without use of the preparative ultracentrifuge. Clin Chem. 1972; 18(6): 499–502.
The authors
Mohamed S. Elgendy1 and Mohamed B. Elshazly*2 MD
1Kasr Al Ainy School of Medicine, Cairo University, Cairo, Egypt
2Cleveland Clinic, Heart and Vascular Institute, Cleveland, OH 44195, USA
*Corresponding author
E-mail: elshazm@ccf.org
The application of metabolic profiling of human biofluids to the prediction of drug efficacy, pharmacokinetics, metabolism and/or toxicity forms a paradigm known as pharmacometabonomics. Pharmacometabonomics holds out promise for the improved delivery of personalized medicine in the future, as it takes into account both genetic and environmental factors, including diet, drug intake and most notably, the status of the gut microbiome, in deriving predictions. Pharmacometabonomics is thus complementary to pharmacogenomics and in some instances the two technologies can be synergistically used together. This article introduces pharmacometabonomics and covers current important application areas.
by Dr Dorsa Varshavi, Dorna Varshavi and Prof. Jeremy Everett
Introduction
In 21st century medicine, a major goal is to develop personalized medicine for selected groups of patients in order to reduce the likelihood of adverse drug reactions and to maximize the desired therapeutic effect [1]. Until recently, personalized drug therapy was delivered almost exclusively by pharmacogenomics (PG), where an individual’s genetic makeup is used to predict the outcome of drug treatment [2]. The best-recognized examples of PG involve drug effect predictions made by analysis of genetic polymorphisms in drug-metabolizing enzymes such as the cytochrome P450 isoenzymes [3].
Although genetic variation is an important determinant of individual variability in drug response, it is now well recognized that personalized drug therapy cannot always be attained using genetic knowledge alone. This is because inter-individual variation in drug response is a consequence of multiple factors, including genetic and epigenetic factors and in addition, environmental factors such as nutritional and health status, the condition of the microbiome, exposure to environmental toxins, and co- or pre-administration of other drugs, including alcohol. These environmental factors can strongly affect drug absorption, distribution, metabolism and excretion and thereby cause inter-individual variation in drug efficacy and safety.
Metabonomics is defined as: ‘The study of the metabolic response of organisms to disease, environmental change, or genetic modification’ [4]. In a metabonomics experiment, changes in the levels of biofluid or tissue metabolites, before and after an intervention, such as drug administration, are measured using analytical technologies such as nuclear magnetic resonance (NMR) spectroscopy or mass spectrometry (MS). The alternative term metabolomics is also used and although its definition is observational, rather than interventional, the two terms are now used interchangeably.
Pharmacometabonomics is a recent development from metabonomics and is defined as ‘the prediction of the outcome (for example, efficacy or toxicity) of a drug or xenobiotic intervention in an individual based on a mathematical model of pre-intervention metabolite signatures’ [5–7]. In contrast to a metabonomics experiment, where the effect of an intervention is assessed based on changes in metabolite profiles post-intervention, in a pharmacometabonomics experiment the effect of the intervention is predicted based on the pre-intervention metabolite profiles.
Although first demonstrated in animals [5], pharmacometabonomics was quickly also demonstrated in humans, in a study where an individual's pre-dose urinary endogenous metabolite profile was used to predict the metabolism of the analgesic paracetamol [6]. NMR-based analyses showed that individuals excreting relatively high levels of the microbial co-metabolite para-cresol sulfate in their pre-dose urine excreted less paracetamol sulfate and more paracetamol glucuronide post-dose than individuals excreting low pre-dose amounts of para-cresol sulfate (Figs 1 & 2). Para-cresol sulfate is produced by hepatic sulfation of para-cresol, which is itself generated by gut bacteria, particularly Clostridium species. Paracetamol and para-cresol have similar molecular structures and compete for limited human sulfation capacity via the same sulfotransferase enzymes, particularly SULT1A1. Thus, individuals whose microbiome produces large amounts of para-cresol consume a large fraction of their sulfation capacity in metabolizing this toxin to para-cresol sulfate, and a subsequent dose of paracetamol is metabolized to a greater degree by glucuronidation. This study was important for two key reasons: (1) it was the first demonstration of pharmacometabonomics in humans and (2) it was the first demonstration of the influence of the gut microbiome on human drug metabolism; that the key biomarker in a study of human drug metabolism was a bacterially derived molecule came as a surprise. The findings also have implications for other drugs for which sulfation is important, as well as for diseases such as autism in which abnormal paracetamol metabolism has been observed.
Pharmacometabonomics has also been used to predict individual responses to therapy, including patient responses to treatment with the statin simvastatin. Statins reduce low-density lipoprotein cholesterol (LDL-C) and are therefore used in the treatment of cardiovascular disease. Kaddurah-Daouk et al. demonstrated that pre-dose plasma levels of the phosphatidylcholine metabolite PC18:2n6, the cholesterol ester CE18:1n7 and the free fatty acid FA18:3n3 were positively correlated with the magnitude of simvastatin-induced reduction in LDL-C in 36 good responders and 36 poor responders [8]. A targeted pre-dose plasma analysis by MS then demonstrated (amongst other results) a strong correlation between the degree of reduction in LDL-C and higher pre-dose concentrations of three secondary, bacteria-derived bile acids: lithocholic acid (LCA), taurolithocholic acid (TLCA) and glycolithocholic acid (GLCA), as well as coprostanol (COPR) [9]. This study further supported the contribution of the microbiome to variability in drug response.
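At their core, analyses of this kind relate pre-dose metabolite levels to a post-dose response. As a minimal sketch of that idea, the snippet below computes a Pearson correlation on entirely synthetic data; the numbers and names are invented for illustration and do not come from the studies cited above, where multivariate models over full NMR/MS profiles are used rather than a single pairwise correlation.

```python
# Core pharmacometabonomic computation: correlate a pre-dose metabolite level
# with a post-dose response. All data below are synthetic and illustrative.

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    mx = sum(xs) / len(xs)
    my = sum(ys) / len(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = (sum((x - mx) ** 2 for x in xs) * sum((y - my) ** 2 for y in ys)) ** 0.5
    return num / den

# Hypothetical pre-dose plasma metabolite levels (arbitrary units) and the
# corresponding percentage reduction in LDL-C after treatment.
pre_dose_metabolite = [1.2, 2.5, 0.8, 3.1, 1.9, 2.8]
ldl_c_reduction = [22, 38, 18, 45, 30, 41]

r = pearson_r(pre_dose_metabolite, ldl_c_reduction)
print(f"r = {r:.2f}")  # strongly positive for this synthetic example
```

A strong positive correlation like this, if it held in a real pre-dose cohort, is what would allow the baseline metabolite to serve as a predictive biomarker of response.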
Pharmacometabonomics studies can be pursued jointly with, or followed up by, pharmacogenetics studies. A good example of this so-called ‘pharmacometabonomics-informed pharmacogenomics’ approach is an MS-based study which demonstrated that pre-dose plasma levels of glycine, a central nervous system inhibitory neurotransmitter, were associated with rates of response or remission during citalopram/escitalopram treatment in patients with major depressive disorder (MDD) in the Mayo Clinic–NIH Pharmacogenetics Research Network (PGRN) Citalopram/Escitalopram Pharmacogenomics (Mayo–PGRN SSRI) study [10]. Tag single-nucleotide polymorphism (SNP) genotyping of the genes encoding enzymes in the glycine pathway was then completed for 529 patients enrolled in the Mayo–PGRN SSRI study. A series of SNPs in the gene encoding glycine dehydrogenase (GLDC) was found to be significantly associated with disease remission, with the rs10975641 SNP showing the strongest association. This study demonstrated that pharmacometabonomics data can inform and complement pharmacogenomics data and, when combined, the two can provide improved insight into the mechanisms underlying variability in drug response.
There are now many examples of the use of pharmacometabonomics for the prediction of human drug efficacy, toxicity, metabolism and pharmacokinetics and recent reviews are available [7, 11].
Conclusion and future prospects
Since its initial discovery [5], pharmacometabonomics has been increasingly applied in both preclinical and clinical studies to predict drug safety, efficacy, metabolism and pharmacokinetics. Pharmacometabonomics has an important advantage over pharmacogenomics in that it takes into account both genetic and environmental influences on drug response. Pharmacometabonomics is itself just one specific example of a broader class of approaches known as predictive metabonomics, in which pre-intervention metabolite profiles are used to predict clinical responses to other types of intervention, including diet, exercise, or even just the passage of time. A good example of predictive metabonomics is the recent study by Wang-Sattler et al. [12], who demonstrated that low baseline levels of glycine and lysophosphatidylcholine were predictive of the development of impaired glucose tolerance and/or type-2 diabetes in hundreds of subjects from the population-based Cooperative Health Research in the Region of Augsburg (KORA) cohort. Another emerging area in which pharmacometabonomics holds promise is the monitoring of patients over time as they progress through therapies such as cancer chemotherapy or surgery, a paradigm called longitudinal pharmacometabonomics [13]. This approach involves the metabolic profiling of patients before, during and after clinical therapy, in order to predict responses to future treatments and thus choose the optimal treatment regimen. PG is now over 50 years old and its impact on the practice of medicine is still limited. Pharmacometabonomics is much younger, and it will take time for it to make an impact in the clinical arena. We predict that in the near future, personalized medicine will be conducted with assistance from both PG and pharmacometabonomics.
References
1. Pokorska-Bocci A, Stewart A, Sagoo GS, Hall A, Kroese M, Burton H. ‘Personalized medicine’: what’s in a name? Personalized Medicine 2014; 11(2): 197–210.
2. Jorgensen JT. A challenging drug development process in the era of personalized medicine. Drug Discov Today 2011; 16(19–20): 891–897.
3. Pirmohamed M. Personalized pharmacogenomics: predicting efficacy and adverse drug reactions. Ann Rev Genomics Hum Genet. 2014; 15: 349–370.
4. Lindon J, Nicholson J, Holmes E, Everett J. Metabonomics: Metabolic processes studied by NMR spectroscopy of biofluids. Concepts Magn Reson. 2000; 12(5): 289–320.
5. Clayton T, Lindon J, Cloarec O, Antti H, Charuel C, Hanton G, Provost JP, Le Net JL, Baker D, Walley RJ, Everett JR, Nicholson JK. Pharmaco-metabonomic phenotyping and personalized drug treatment. Nature 2006; 440(7087): 1073–1077.
6. Clayton TA, Baker D, Lindon JC, Everett JR, Nicholson JK. Pharmacometabonomic identification of a significant host-microbiome metabolic interaction affecting human drug metabolism. Proc Natl Acad Sci U S A 2009; 106(34): 14728–14733.
7. Everett JR. Pharmacometabonomics in humans: a new tool for personalized medicine. Pharmacogenomics 2015; 16(7): 737–754.
8. Kaddurah-Daouk R, Baillie RA, Zhu HJ, Zeng ZB, Wiest MM, Nguyen UT, Watkins SM, Krauss RM. Lipidomic analysis of variation in response to simvastatin in the Cholesterol and Pharmacogenetics Study. Metabolomics 2010; 6(2): 191–201.
9. Kaddurah-Daouk R, Baillie RA, Zhu H, Zeng ZB, Wiest MM, Nguyen UT, Wojnoonski K, Watkins SM, Trupp M, Krauss RM. Enteric microbiome metabolites correlate with response to simvastatin treatment. PLoS One 2011; 6(10): e25482.
10. Ji Y, Hebbring S, Zhu H, Jenkins GD, Biernacka J, Snyder K, Drews M, Fiehn O, Zeng Z, Schaid D, Mrazek DA, Kaddurah-Daouk R, Weinshilboum RM. Glycine and a glycine dehydrogenase (GLDC) SNP as citalopram/escitalopram response biomarkers in depression: pharmacometabolomics-informed pharmacogenomics. Clin Pharmacol Ther. 2011; 89(1): 97–104.
11. Everett JR. NMR-based pharmacometabonomics: a new approach to personalized medicine. In: Everett JR, Harris RK, Lindon JC, Wilson ID. (eds) NMR in Pharmaceutical Sciences, pp 359–372. Wiley 2015.
12. Wang-Sattler R, Yu Z, Herder C, Messias AC, Floegel A, He Y, Heim K, Campillos M, Holzapfel C, Thorand B, Grallert H, Xu T, et al. Novel biomarkers for pre-diabetes identified by metabolomics. Mol Syst Biol. 2012; 8: 615.
13. Nicholson JK, Everett JR, Lindon JC. Longitudinal pharmacometabonomics for predicting patient responses to therapy: drug metabolism, toxicity and efficacy. Expert Opin Drug Metab Toxicol. 2012; 8(2): 135–139.
The authors
Dorsa Varshavi PhD, Dorna Varshavi MSc, Jeremy Everett* PhD
Medway Metabonomics Research Group, University of Greenwich, Chatham, Kent ME4 4TB, UK
*Corresponding author
E-mail: j.r.everett@greenwich.ac.uk
Liquid chromatography–tandem mass spectrometry (LC-MS/MS) is an analytical chemistry technique that combines the physicochemical separation capabilities of liquid chromatography (conventional chromatography within a column) with the analytical power of mass spectrometry. It allows the user to determine the mass-to-charge ratio (m/z) of the individual analytes present in a chromatographic peak. The high-throughput capability of this technique brings value to the clinical lab, where the time taken to analyse samples is paramount. Bringing LC-MS/MS testing into the clinical setting has been a slow process; however, the medical device industry is on the verge of a fundamental breakthrough that could help drive the adoption of this technique.
LC-MS/MS is used primarily for the identification and quantification of particular molecules within a substance, and its application in diagnostics is promising because of its potential to increase throughput and streamline the processes involved, so that patient data can be analysed quickly and accurately to provide improved patient care. Broadly speaking, the methodology can be divided into three parts. First, sample preparation is undertaken: whether the specimen is whole blood, plasma, saliva or urine, it must be prepared to remove the large proteins and salts that could foul the instrumentation. Conventionally, this phase has been carried out manually, which can be time-consuming and prone to human error, so this step needs to be automated to improve efficiency and reliability before LC-MS/MS is widely adopted by the clinical laboratory. Once sample preparation is complete, the liquid chromatography and mass spectrometry steps can take place, in which the sample is separated and analysed respectively.
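To make the m/z measurement concrete, the value a mass analyser reports for a protonated ion can be estimated from the neutral mass and the charge state. A minimal sketch, assuming a hypothetical analyte mass (the value below is purely illustrative, not data from this article):

```python
# Sketch: estimating the m/z values a mass spectrometer would report for
# an analyte observed at several charge states. The neutral mass used
# here is an arbitrary illustrative value.
PROTON_MASS = 1.007276  # monoisotopic mass of a proton, in daltons

def mz(neutral_mass: float, charge: int) -> float:
    """m/z of an [M + zH]^z+ ion: add z protons, divide by the charge."""
    return (neutral_mass + charge * PROTON_MASS) / charge

neutral_mass = 1000.0  # hypothetical analyte mass in Da
for z in (1, 2, 3):
    print(f"z={z}: m/z = {mz(neutral_mass, z):.4f}")
```

The same neutral molecule therefore appears at several different m/z positions depending on how many protons it picks up during ionization, which is part of what lets the mass analyser identify it so specifically.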
LC-MS/MS and the clinical laboratory
Although adoption of LC-MS/MS in the clinical laboratory has been slow but steady, the technique has demonstrated vast improvements in analytical specificity compared with conventional immunoassays. Mass spectrometry’s strength lies in its ability to be extremely specific to the target analyte, because it avoids the cross-reactivity that can be common in antibody-based immunoassay (IA) methods. Even so, uptake by clinical labs has not been as rapid as expected, with many choosing to continue using immunoassay-based methods instead.
A number of factors make clinical labs cautious about mainstream use of LC-MS/MS systems. There are numerous LC and MS systems to choose from, which in itself can seem overwhelming to a clinical scientist who is not an LC-MS/MS expert. In addition, there is a range of calibrators and controls available, and labs need the internal expertise to develop and validate methods and to set up and run the instruments. A final factor is often cost: the investment in such systems is commonly high, especially when taking into account the automated components required to reduce the labour needed for sample preparation, and finance options are often limited. Combined, these factors can make immunoassay analysers seem like the simpler option.
The emergence of connected components
Although used in many clinical labs, immunoassay techniques are not always accurate. For example, small-molecule biomarkers such as steroid hormones prove challenging because of the lack of antibody specificity for the binding sites on small molecules, a fact that many clinical scientists are all too aware of. Recent improvements to LC-MS/MS systems have focused on advancing both ease of use and efficacy, essentially to make them a viable alternative to IA methods. Laboratory managers can find ample published evidence of the benefits of LC-MS/MS systems used in place of IAs. For instance, a study by Nigel W. Brown and colleagues demonstrated that LC-MS/MS was far more precise than a microparticle enzyme immunoassay (MEIA), which was ‘significantly affected by patient cohort’ (Brown NW et al. Clinical Chemistry 2005; 51(3): 586–593).
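Precision comparisons of this kind are usually quantified as a coefficient of variation (CV) across replicate measurements of the same sample. A minimal sketch with invented replicate values (these are not the figures from the cited study):

```python
# Sketch: comparing assay precision via the coefficient of variation (CV%).
# The replicate values below are hypothetical, chosen only to illustrate
# a tighter spread for one method than the other.
from statistics import mean, stdev

def cv_percent(replicates: list[float]) -> float:
    """Sample coefficient of variation as a percentage: 100 * SD / mean."""
    return 100.0 * stdev(replicates) / mean(replicates)

lcms_replicates = [10.1, 9.9, 10.0, 10.2, 9.8]         # hypothetical, tight spread
immunoassay_replicates = [9.0, 11.5, 10.2, 8.7, 11.0]  # hypothetical, wider spread

print(f"LC-MS/MS CV:    {cv_percent(lcms_replicates):.1f}%")
print(f"Immunoassay CV: {cv_percent(immunoassay_replicates):.1f}%")
```

A lower CV means the method returns more consistent values for the same specimen, which is the sense in which the study found LC-MS/MS more precise.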
Clinical laboratories face increasing complexity in their daily workflows, and there is pressure to provide detailed analyses of patient samples using streamlined, well-coordinated practices. The need for efficient turnaround on samples is also growing. System manufacturers are therefore looking to help laboratories improve efficiency through compatible technologies, such as combinations of stand-alone elements (automated sample handlers, LC-MS/MS reagent kits, and software) supplied together to better manage workflows. These connected component-based systems, in which the different components of the LC-MS/MS workflow (sample preparation, liquid chromatography, and mass spectrometry) are placed in tandem with each other, are a big step in the right direction: they increase productivity and efficiency while reducing the number of decisions the lab needs to make. However, there are still improvements to be made. Connected components are not the same as a fully integrated, automated system with dedicated, regulatory-compliant assays and diagnostic kits. The development of properly synergized components can truly simplify the decisions faced by clinical scientists and enable LC-MS/MS to become an integral part of the clinical laboratory.
The needs of the lab
Clinical labs require a high level of automation across many of their systems, owing to the high turnover rate demanded by patient care. Easy-to-use technologies that allow walk-away operation are essential, and are considered commonplace by clinical scientists, given the multitude of responsibilities placed on laboratory personnel. These busy labs require built-for-purpose, fully integrated analysers that greatly reduce installation, validation, and training times, leaving the system ready to operate in a matter of weeks rather than months. Streamlining the procedure without compromising the quality of the analysis, through better integrated systems, can be considered an essential next step for the medical device industry. Furthermore, results obtained from these systems need not exist in isolation: standardization between laboratories using the same system becomes achievable through dedicated test kits that are fully validated and ready for use with the analyser. The ideal next-generation system for the clinical laboratory will encompass every step, including automated sample preparation, handling and LC-MS, in one unit. Moreover, it will be labelled as a medical device, have dedicated assay kits, and be produced, serviced, and supported by a single manufacturer. Finally, such a device would ideally connect bi-directionally with the laboratory information system (LIS) and, beyond that, with the laboratory automation system (LAS).
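Analyser-to-LIS connectivity of the kind described above commonly runs over HL7 v2 messaging, in which an instrument reports results as an ORU^R01 (unsolicited result) message. A minimal sketch of such a message; every identifier, test code and value below is invented for illustration, and a real interface would follow the site's agreed HL7 profile:

```python
# Sketch: building a minimal HL7 v2 ORU^R01 (result) message of the kind
# an analyser could send to a LIS. All identifiers, codes and values are
# hypothetical placeholders, not a vendor's actual interface spec.
SEGMENT_SEP = "\r"  # HL7 v2 segments are separated by carriage returns

def build_oru_message(patient_id: str, test_code: str, test_name: str,
                      value: str, units: str, ref_range: str) -> str:
    segments = [
        # MSH: message header (sender, receiver, timestamp, message type)
        "MSH|^~\\&|ANALYSER|LAB|LIS|HOSPITAL|20250101120000||ORU^R01|MSG0001|P|2.5",
        # PID: patient identification
        f"PID|1||{patient_id}",
        # OBR: the observation request the result answers
        f"OBR|1||ORD0001|{test_code}^{test_name}",
        # OBX: one numeric (NM) result, flagged final (F)
        f"OBX|1|NM|{test_code}^{test_name}||{value}|{units}|{ref_range}||||F",
    ]
    return SEGMENT_SEP.join(segments)

msg = build_oru_message("PAT12345", "TESTO", "Serum testosterone",
                        "12.4", "nmol/L", "8.6-29.0")
print(msg.replace("\r", "\n"))  # newlines only for readable display
```

Bi-directional connectivity would add the reverse path, with the LIS sending order messages down to the instrument, so that worklists and results flow without manual transcription.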
Ultimately, technologies that advance the state of play for laboratory sample analysis are required so that laboratory personnel can be confident in the analyses they are making. Beyond connected components, the introduction of integrated LC-MS/MS systems into the laboratory could bring a paradigm shift in the specificity of small-molecule analysis that clinical scientists expect. Systems that deliver better quality of care for patients and improved analysis for physicians will help healthcare systems operate more efficiently.
The author
Sarah Robinson, PhD,
Market Development Specialist,
Thermo Fisher Scientific
& Expert Consultant to the EFLM
Working Group on Test Evaluation
November 2025