Porphyrias: clinical and diagnostic aspects
Porphyrias are a group of disorders of the heme biosynthetic pathway which clinically manifest with acute neurovisceral attacks and cutaneous lesions. Diagnosis of porphyrias is based on the accurate and precise measurement of various porphyrins and precursor molecules in a range of samples. In addition, molecular diagnostic assays can provide definitive diagnosis.
by Dr Vivion E. F. Crowley, Nadia Brazil, and Sarah Savage
What are porphyrias?
Porphyrias are a group of rare disorders, each of which results from a deficiency of an individual enzyme within the heme biosynthetic pathway (Fig. 1) [1–3]. With the exception of an acquired form of porphyria cutanea tarda (PCT), all porphyrias are inherited as monogenic autosomal dominant, autosomal recessive or X-linked genetic disorders, with varying degrees of penetrance and expressivity, which impacts on the prevalence and incidence of clinically manifest porphyrias [4]. The biochemical consequence of each porphyria is the overproduction within the heme biosynthetic pathway of specific porphyrin intermediates and/or the porphyrin precursor molecules delta-aminolevulinic acid (ALA) and porphobilinogen (PBG) [2, 3]. This in turn has implications for the clinical manifestation of these disorders, their overall classification and their diagnosis (see Table 2).
Clinical presentation
Porphyrias may present clinically with either or both of two symptom patterns. The first is the acute neurovisceral attack, a potentially life-threatening episode related to excessive hepatic generation of ALA and PBG, which is a feature only of acute intermittent porphyria (AIP), variegate porphyria (VP), hereditary coproporphyria (HCP) and the very rare ALA dehydratase deficiency porphyria (ADP) [5–7]. These attacks are characterized principally by autonomic dysfunction, including non-specific but severe abdominal pain, constipation, diarrhea, nausea, vomiting, tachycardia, hypertension or, occasionally, postural hypotension. Other features may include a predominantly motor peripheral neuropathy which, if left undiagnosed, may progress to respiratory failure reminiscent of Guillain–Barré syndrome, as well as cerebral dysfunction, which can vary from subtle alterations in mental state to posterior reversible encephalopathy syndrome (PRES). Hyponatremia, most likely due to the syndrome of inappropriate antidiuretic hormone secretion (SIADH), may also contribute to CNS-related morbidity. The complex neuropathic manifestations appear to be primarily related to axonal degeneration due to direct neurotoxicity of ALA, which structurally resembles the neurotransmitter gamma-aminobutyric acid (GABA) [3, 5–7].
The second clinical presentation paradigm is cutaneous photosensitivity, caused by the interaction of ultraviolet light with photoactive porphyrins in the skin, resulting in the production of reactive oxygen species (ROS) and an associated inflammatory response [3]. In PCT, VP and HCP the skin lesions typically occur post-pubertally and consist of skin fragility, vesicles, bullae, hyperpigmentation and hypertrichosis affecting sun-exposed areas, most usually the face and dorsum of the hands [1–3]. In erythropoietic protoporphyria (EPP) and X-linked protoporphyria (XLP), which may present in childhood, there is usually no blistering; instead, erythema, edema and purpura feature in the more acute setting, with subsequent chronic skin thickening. In contrast, congenital erythropoietic porphyria (CEP) is characterized by severe cutaneous photosensitivity, often occurring in early infancy, with bullae and vesicles rupturing and being prone to secondary infection, with resultant scarring, bone resorption, deformation and mutilation of sun-exposed skin [1, 2, 8].
Classification
The classification of porphyrias (Table 1) has traditionally been determined either on the basis of clinical manifestations, i.e. acute or non-acute (cutaneous), or on the primary organ of porphyrin overproduction, i.e. hepatic or erythropoietic [1, 3, 8]. A combined classification has recently been proposed which takes account of both of these elements [2]. Whichever classification is adopted, however, it should be recognized that VP, and to a lesser extent HCP, can manifest with both acute and cutaneous features, either simultaneously or separately.
Clinical and biochemical diagnosis
The clinical manifestations of porphyrias, particularly the acute hepatic porphyrias, are protean and consequently, patients with a clinically active porphyria may initially present to a relatively wide spectrum of clinical specialties, including gastroenterology, acute medicine, dermatology, neurology, endocrinology and hematology, amongst others [2]. In general, cutaneous porphyrias should not pose a diagnostic difficulty for an experienced dermatologist used to investigating photosensitive skin disorders, but biochemical testing is still required to define the type of porphyria present. In contrast, definitive diagnosis of an initial acute hepatic porphyria attack is critically dependent on biochemical testing, as symptoms are often non-specific in nature (Tables 1 & 2).
The diagnosis of an acute hepatic porphyria attack is founded on demonstrating an increase in urine PBG levels in direct temporal association with the characteristic acute symptom complex, the minimum level of increase being between 2- and 5-fold [9, 10]. The urine PBG may be measured either on a random sample, where it should be reported as a urine PBG to creatinine ratio, or on a 24-hour urine collection, where total PBG is reported. The former has proven to be clinically efficacious and has the advantages of timeliness, reduced within-subject variation and convenience over a 24-hour urine collection [9]. If the urine PBG is not elevated this effectively rules out an acute porphyria attack at the time of sampling; however, there are certain caveats. It is important to note that if specific treatment with either heme preparations or carbohydrate loading has been instigated prior to the test, these interventions could reduce the urine PBG level significantly, even to normal [3]. Furthermore, if the measurement of urine PBG is delayed or undertaken at a time removed from the actual acute clinical presentation, e.g. by weeks or months, then the finding of a normal urine PBG at that later stage cannot effectively rule out acute porphyria [3]. In this author's experience another important caveat concerns patients with a previously confirmed diagnosis of acute porphyria who present with symptoms suggestive of a recurrent acute attack. In many instances these patients have a perpetually elevated urine PBG, even between attacks, and therefore an elevated urine PBG cannot effectively guide diagnosis. In these situations a decision to treat as an acute attack has to be made on the basis of clinical findings.
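To make this decision rule concrete, the short Python sketch below encodes the logic and caveats just described. It is illustrative only: the creatinine-normalized upper reference limit used here is a hypothetical placeholder, since reference intervals are assay- and laboratory-specific, and no sketch can substitute for a validated, quality-assured method with expert interpretation.

```python
# Illustrative sketch of the urine PBG decision rule described above.
# The upper limit of normal (ULN) below is a hypothetical placeholder:
# laboratories must apply their own validated, creatinine-normalized range.

PBG_ULN_UMOL_PER_MMOL_CREAT = 1.5  # assumed ULN, for illustration only

def interpret_urine_pbg(pbg_ratio: float,
                        on_heme_or_carbohydrate_therapy: bool = False,
                        known_acute_porphyria: bool = False) -> str:
    """Interpret a random urine PBG/creatinine ratio during a suspected attack."""
    if known_acute_porphyria:
        # PBG may stay elevated between attacks; treat on clinical grounds.
        return "Interpret clinically: baseline PBG may be persistently raised"
    fold_increase = pbg_ratio / PBG_ULN_UMOL_PER_MMOL_CREAT
    if fold_increase >= 2.0:  # minimum diagnostic rise is 2- to 5-fold
        return "Consistent with an acute porphyria attack; confirm quantitatively"
    if on_heme_or_carbohydrate_therapy:
        return "Normal PBG unreliable: prior treatment may mask elevation"
    return "Acute porphyria attack effectively excluded at the time of sampling"
```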
Therefore, a clinically effective service for acute porphyria diagnosis requires that a timely, quality-assured laboratory method for urine PBG be available [11]. Although a qualitative method for urine PBG may suffice for the purposes of establishing a diagnosis, this should be supported by the availability of a confirmatory quantitative method. The lack of availability of a urine PBG assay is very often the basis for misdiagnosis, or indeed delayed diagnosis, of acute porphyria attacks [10].
In conjunction with PBG, urine ALA is often measured simultaneously and, although also elevated, it does not tend to reach the levels of PBG in acute porphyrias. The one exception is the extremely rare instance of autosomal recessive ADP due to defective ALA dehydratase (ALAD) activity, where markedly elevated urine ALA levels are reported while PBG may be normal or only slightly elevated [2, 3]. In addition, a similar pattern of urine ALA predominance relative to PBG (although not as elevated) may be observed in the context of lead poisoning, wherein patients may also present with abdominal pain and neuropathy [1, 3].
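As a simple illustration, this precursor pattern can be expressed as a triage rule; in the Python sketch below the 2-fold thresholds are assumptions for demonstration, not validated decision limits.

```python
# Illustrative triage of the urine precursor pattern described above.
# The 2-fold-above-ULN thresholds are assumptions for this sketch only.

def precursor_pattern(ala_fold_uln: float, pbg_fold_uln: float) -> str:
    """Interpret urine ALA and PBG, each expressed as fold above the upper limit."""
    if pbg_fold_uln >= 2.0:
        return "PBG-predominant rise: typical of an AIP/VP/HCP acute attack"
    if ala_fold_uln >= 2.0:
        return "ALA-predominant rise with near-normal PBG: consider ADP or lead poisoning"
    return "No significant precursor elevation"
```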
Once the diagnosis of acute porphyria has been made on the basis of the urine PBG, the next phase involves determining the type of porphyria present. This is very much dependent on the specific pattern of porphyrin overproduction observed in samples of urine, feces, plasma and erythrocytes. It is critically important that the laboratory analytical methods available extend beyond the sole measurement of total porphyrin levels [10–12]. In particular, it is essential that individual porphyrin analysis and isomer fractionation in both urine and feces are available to facilitate the identification of the porphyria-specific patterns of porphyrin overproduction [10–12]. In many instances non-porphyria disorders affecting the gastrointestinal and hepatobiliary systems, or certain dietary factors, may cause non-specific secondary elevations in porphyrins, e.g. coproporphyrinuria, which can be diagnostically misleading [3]. In such cases urine PBG levels will not be elevated and the pattern of porphyrins observed will not be indicative of any one of the specific porphyrias per se. Therefore, it is important to realize that a finding of elevated porphyrin levels does not automatically equate to a diagnosis of underlying porphyria. This further highlights the importance of developing specialist porphyria centres to ensure that the appropriate repertoire of quality assured testing, expert interpretation and support is available for diagnosis and management of porphyria patients [11, 13].
The diagnosis of cutaneous (non-acute) porphyrias is also very much based on the specific patterns of porphyrins observed in urine and feces. In addition, the pattern of free and zinc protoporphyrin in erythrocytes can be useful in the diagnosis of CEP, EPP and the related disorder, XLP. Moreover, the identification of the porphyria subtype, either acute or cutaneous, may also be enhanced by identifying characteristic plasma porphyrin fluorescence emission peaks, e.g. VP emission peak between 625 and 628 nm [1–3]. Finally, it is essential that all samples for porphyrin and precursor measurement are protected from light prior to analysis.
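For illustration, plasma scan interpretation can be sketched as a simple lookup. Note that only the VP window (625–628 nm) comes from the text above; the other wavelength windows are approximate values assumed here for demonstration and would require local validation.

```python
# Illustrative lookup for plasma fluorescence emission scanning. Only the
# VP window (625-628 nm) is taken from the text; the other windows are
# assumed approximations, for illustration only.

def classify_plasma_peak(peak_nm: float) -> str:
    """Suggest a porphyria pattern from the plasma emission peak wavelength."""
    if 625 <= peak_nm <= 628:
        return "Peak consistent with variegate porphyria (VP)"
    if 615 <= peak_nm < 625:   # assumed window for uro-/coproporphyrin peaks
        return "Uro-/coproporphyrin-type peak: correlate with urine and fecal porphyrins"
    if 628 < peak_nm <= 636:   # assumed window for protoporphyrin (EPP/XLP)
        return "Protoporphyrin-type peak: check erythrocyte protoporphyrins"
    return "No diagnostic peak detected"
```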
Role of genetic diagnosis
Given the heritable nature of porphyrias it is not surprising that molecular genetic analysis has also become an important diagnostic adjunct. There is an extensive allelic heterogeneity of pathogenic mutations among the implicated genes for each porphyria disorder, which means that most mutations are uniquely confined to one or at most a few kindreds. There are, however, a few exceptions to this trend, most notably in relation to founder mutations among the Swedish population and the Afrikaner population in South Africa. The general approach in the application of genetic diagnostic strategies is firstly to characterize the causative mutation in a known affected individual (proband) using a mutation scanning approach [14]. Once a putative mutation has been identified its pathogenicity for a particular porphyria should be affirmed and then more extensive family cascade genetic screening can be organized based on the analysis of this kindred-specific mutation [14].
This approach has important implications in the diagnosis of porphyria susceptibility, particularly for the autosomal dominant acute hepatic porphyrias, where both penetrance and expressivity of the disorders are low [3, 4]. Thus the penetrance among AIP, VP and HCP is between 10 and 40%, implying that the majority of patients with an autosomal dominant acute hepatic porphyria will not manifest an acute attack (or indeed cutaneous lesions in the case of VP and HCP) in their lifetime [3, 4]. Moreover, this lack of penetrance may also extend to the absence of subclinical biochemical abnormalities indicative of an underlying autosomal dominant acute porphyria, demonstrating the limited sensitivity of biochemical testing in identifying asymptomatic family members.
Currently there is no clear-cut mechanism for discriminating between those who will manifest a clinical and/or biochemical phenotype and those who will not. While the role of environmental precipitating factors, e.g. porphyrinogenic medications, stress, prolonged fasting and menstruation [1–3], has long been recognized in triggering acute porphyria attacks, it is the presence of a pathogenic mutation which remains the single most important factor determining overall susceptibility for an acute porphyria episode. Therefore, all patients carrying a pathogenic mutation should be regarded as pre-symptomatic carriers, i.e. capable of developing an acute attack, and one of the key applications of genetic analysis in this area is in identifying pre-symptomatic carriers to allow for appropriate counselling and management advice to prevent attacks [3, 14].
In this author’s experience another useful role for molecular diagnostics in porphyrias is in relation to those patients with an historic diagnosis of acute hepatic porphyria in whom the biochemical abnormalities have normalized over the years. In such instances genetic analysis can provide a definitive diagnosis of the type of porphyria and facilitate a more extensive family screening programme for potential pre-symptomatic carriers.
The current methods of genetic analysis vary but usually involve a confirmatory step using direct nucleotide sequencing of the putative pathogenic variants as the gold standard. Moreover, the emergence of next-generation sequencing platforms has further expanded the diagnostic possibilities in this area. Overall, in autosomal dominant acute hepatic porphyrias, approximately 95% of mutations are identifiable [3, 14]. This sensitivity includes the application of additional methods, such as multiplex ligation-dependent probe amplification (MLPA) and gene dosage analysis, for identifying complex mutations, such as large gene deletions, which may not be detected using standard sequencing-based approaches [14].
In the autosomal recessive porphyrias, including ADP, CEP and EPP, the clinical penetrance approaches 100%. These disorders also display a degree of genetic heterogeneity. In the case of EPP, the presence of a relatively common low-expression single nucleotide polymorphism (SNP) located in the ferrochelatase gene, FECH (IVS3-48C), appears to be essential for the clinical expression of the cutaneous phenotype in the vast majority of cases [15].
The application of molecular genetics has provided a means of establishing definitive porphyria susceptibility; however, as with biochemical testing services, any genetic diagnostic service in this area must be quality assured to a high standard and must adopt appropriate mutation scanning assay validation protocols in accordance with international standards and best-practice recommendations [11–14].
References
1. Puy H, Gouya L, Deybach JC. Porphyrias. Lancet 2010; 375(9718): 924–937.
2. Balwani M, Desnick RJ. The Porphyrias: advances in diagnosis and treatment. Blood 2012; 120: 4496–4504.
3. Badminton MN, Elder GH. The porphyrias: inherited disorders of haem synthesis. In: Marshall W, Lapsley M, Day A, Ayling R, editors. Clinical Biochemistry Metabolic and Clinical Aspects. Churchill Livingstone Elsevier 2014; pp. 533–549.
4. Elder G, Harper P, Badminton M, Sandberg S, Deybach JC. The incidence of inherited porphyrias in Europe. J Inherit Metab Dis. 2013; 36: 849–857.
5. Simon NG, Herkes GK. The neurologic manifestations of the acute porphyrias. J Clin Neurosci. 2011; 18: 1147–1153.
6. Sonderup MW, Hift RJ. The neurological manifestations of the acute porphyrias. S Afr Med J. 2014; 104: 285–286.
7. Crimlisk HL. The little imitator-porphyria: a neuropsychiatric disorder. J Neurol Neurosurg Psychiatry. 1997; 62: 319–328.
8. Siegesmund M, van Tuyll van Serooskerken AM, Poblete-Gutierrez P, Frank J. The acute hepatic porphyrias: current status and future challenges. Best Pract Res Clin Gastroenterol. 2010; 24: 593–605.
9. Aarsand AK, Petersen PH, Sandberg S. Estimation and application of biological variation of urinary delta-aminolevulinic acid and porphobilinogen in healthy individuals and in patients with acute intermittent porphyria. Clin Chem. 2006; 52: 650–656.
10. Kauppinen R, von und zu Fraunberg M. Molecular and biochemical studies of acute intermittent porphyria in 196 patients and their families. Clin Chem. 2002; 48: 1891–1900.
11. Aarsand AK, Villanger JH, Støle E, Deybach JC, Marsden J, To-Figueras J, Badminton M, Elder GH, Sandberg S. European specialist porphyria laboratories: diagnostic strategies, analytical quality, clinical interpretation and reporting as assessed by an external quality assurance programme. Clin Chem. 2011; 57: 1514–1523.
12. Whatley S, Mason N, Woolf J, Newcombe R, Elder G, Badminton M. Diagnostic strategies for autosomal dominant acute porphyrias: Retrospective analysis of 467 unrelated patients referred for mutational analysis of HMBS, CPOX or PPOX gene. Clin Chem. 2009; 55: 1406–1414.
13. Tollånes MC, Aarsand AK, Villanger JH, Støle E, Deybach JC, Marsden J, To-Figueras J, Sandberg S; European Porphyria Network (EPNET). Establishing a network of specialist porphyria centres – effects on diagnostic activities and services. Orphanet J Rare Dis. 2012; 7: 93.
14. Whatley SD, Badminton MN. The role of genetic testing in the management of patients with inherited porphyria and their families. Ann Clin Biochem. 2013; 50: 204–216.
15. Gouya L, Puy H, Robreau AM, Bourgeois M, Lamoril J, Da Silva V, Grandchamp B, Deybach JC. The penetrance of dominant erythropoietic protoporphyria is modulated by expression of wildtype FECH. Nat Genet. 2002; 30: 27–28.
The authors
Vivion E. F. Crowley*1 MB MSc FRCPath FFPath(RCPI) FRCPI, Nadia Brazil2 BA (Mod) FAMLS, Sarah Savage3 BSc MSc
1Consultant Chemical Pathologist, Head of Department, Biochemistry Department, St James’s Hospital, Dublin 8, Ireland
2Porphyrin Laboratory, Biochemistry Department, St James’s Hospital, Dublin 8, Ireland
3Molecular Diagnostic Laboratory, Biochemistry Department, St James’s Hospital, Dublin 8, Ireland
*Corresponding author
E-mail: vcrowley@stjames.ie
Diagnosis of diabetes mellitus
Diabetes is characterized by hyperglycemia, but diagnosis no longer depends exclusively on plasma glucose measurements. The endorsement of glycated hemoglobin as a diagnostic test for diabetes has seen its widespread adoption for this purpose: it is vital that its application in this role is appropriate and its limitations understood.
by Dr Shirley Bowles
Introduction
The term diabetes mellitus encompasses several diseases of abnormal carbohydrate metabolism that are characterized by hyperglycemia associated with relative or absolute defects in insulin secretion and varying degrees of peripheral resistance to its action [1]. Diabetes is the most common metabolic disorder: in 2014, 422 million people in the world had diabetes, a prevalence of 8.5% in the adult population [2].
The fact that various pathogenetic processes may be involved in the development of diabetes is illustrated by the etiological classification outlined in Table 1 but, in fact, the vast majority of cases are categorized as either Type 1 (5–10%) or Type 2 (90–95%). Type 1 diabetes is usually due to cellular-mediated autoimmune destruction of the pancreatic β-cells, with absolute loss of insulin secretion, and, although the rate of cell destruction is variable, most individuals will ultimately become dependent on exogenously administered insulin for survival, and are at risk of ketoacidosis. In contrast, patients with Type 2 diabetes often have insulin levels that appear normal, or even elevated, but secretion is considered defective because it is insufficient to compensate for varying degrees of insulin resistance, which may be attributable to the obesity found in most of these patients. In Type 2 diabetes, treatment with insulin is not essential for survival, although it may eventually prove necessary to achieve glycemic control [3].
The majority of individuals with Type 2 diabetes are largely asymptomatic, diagnosed only after laboratory evaluation, whereas those with Type 1 are more likely to present with the classical symptoms of hyperglycemia: polyuria, polydipsia, blurred vision and weight loss. There may also be acute life-threatening consequences of uncontrolled diabetes: diabetic ketoacidosis in Type 1 diabetes and non-ketotic hyperosmolar syndrome in Type 2 [1]. Both forms are associated with a number of characteristic long-term complications, usually considered to be a consequence of microvascular disease, including retinopathy, with potential loss of vision; nephropathy, leading to kidney failure; peripheral and autonomic neuropathy. However, in reality, the major determinant of the reduced life expectancy seen in diabetes is the significantly increased incidence of macrovascular atherosclerotic disease, which causes myocardial infarction or angina, stroke or peripheral vascular disease [4].
Diagnostic criteria
1. Blood glucose measurements
For decades, the diagnosis of diabetes was based exclusively on glucose measurements but, as blood glucose is a continuous variable, cut-off points for diagnosis are necessarily somewhat arbitrary, and information derived from research and clinical practice has prompted periodic re-evaluation of the diagnostic criteria. Until 1997, the diagnosis of diabetes, as defined by the World Health Organization (WHO), required a fasting plasma glucose (FPG) of ≥7.8 mmol/L or a plasma glucose (PG) of at least 11.1 mmol/L, in either a random blood specimen or in one collected 2 hours after a standard 75-g glucose load, as part of an oral glucose tolerance test (OGTT). An ‘at-risk’ category was also recognized: impaired glucose tolerance (IGT), which was identified on the basis of an OGTT 2-hour PG of 7.8–11.0 mmol/L. These values were chosen based on the risk of future symptoms of uncontrolled hyperglycemia [5].
As the major objective of diagnosing diabetes is to intervene so as to prevent premature mortality and morbidity, it seemed logical to consider diagnosis in terms of risk of complications: following the recommendations of the Expert Committee on the Diagnosis and Classification of Diabetes Mellitus in 1997 [6], the WHO revised the diagnostic threshold for fasting glucose, based on the observed association between glucose levels and the risk of developing the microvascular complication of retinopathy. The OGTT 2-hour PG of ≥11.1 mmol/L closely approximates to a point at which the prevalence of microvascular complications increases dramatically. However, only approximately 25% of those who exceed this 2-hour threshold will also have a FPG ≥7.8 mmol/L, whereas almost all individuals with FPG ≥7.8 mmol/L have a 2-hour OGTT level ≥11.1 mmol/L. Thus, this earlier FPG cut-off defined a greater degree of hyperglycemia, a discrepancy that was considered undesirable: both fasting and 2-hour cut-off points should reflect a similar degree of hyperglycemia and risk of adverse outcomes. In addition, because of the inconvenience of undertaking OGTTs, the FPG alone was often performed, meaning that a substantial number of individuals who were at increased risk of microvascular complications would not have been detected. The revised FPG cut-off of ≥7.0 mmol/L was shown to have a similar predictive value for adverse outcomes as the 11.1 mmol/L 2-hour OGTT threshold, which validated the use of this simpler test for diagnostic purposes.
It was at this stage that a secondary criterion for the ‘at-risk’ category was recognized: impaired fasting glycemia (IFG), a FPG of 6.1–6.9 mmol/L. Both IFG, and the previously described IGT, have been referred to as ‘pre-diabetes’, indicating a relatively high risk of future diabetes. Studies have demonstrated an approximately 5–10% annualized risk of progression to diabetes in individuals with either IFG or IGT and 10–15% in those with both abnormalities [7].
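For clarity, the glucose-based criteria described above can be collected into a short Python sketch (values in mmol/L). This is illustrative only; as discussed below, a diagnostic result in an asymptomatic patient must be confirmed by repeat testing on a separate day.

```python
# Sketch of the WHO glucose-based categories discussed above (mmol/L).
# Illustrative only: asymptomatic patients require confirmatory repeat testing.
from typing import Optional

def classify_glucose(fpg: Optional[float] = None,
                     ogtt_2h_pg: Optional[float] = None) -> str:
    """Classify glycemic status from fasting and/or 2-hour OGTT plasma glucose."""
    if (fpg is not None and fpg >= 7.0) or \
       (ogtt_2h_pg is not None and ogtt_2h_pg >= 11.1):
        return "diabetes"
    if ogtt_2h_pg is not None and 7.8 <= ogtt_2h_pg < 11.1:
        return "impaired glucose tolerance (IGT)"
    if fpg is not None and 6.1 <= fpg < 7.0:
        return "impaired fasting glycemia (IFG)"
    return "normoglycemia on the criteria tested"
```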
2. Glycated hemoglobin
Glycated hemoglobin (HbA1c), formed as a consequence of a non-enzymatic, irreversible reaction between glucose and the N-terminal valine residue of the β globin chains of hemoglobin, reflects average blood glucose levels over the preceding 8–12-week period (a window determined by the lifespan of the red blood cell), and its potential as an indicator of glycemic control was recognized in 1977 [8]. Over the intervening years, supported by evidence from the Diabetes Control and Complications Trial (Type 1 diabetes) [9] and the United Kingdom Prospective Diabetes Study (Type 2 diabetes) [10], which validated the direct relationship between glycated hemoglobin levels and clinical outcomes, it has had a vital role in monitoring diabetes. With respect to the diagnosis of diabetes, however, although epidemiological studies also showed a clear relationship between HbA1c and retinopathy, variation in methodology and standardization, and concern about the confounding effect of factors affecting erythrocyte turnover, seemed to preclude its use for this purpose [8]. This situation has changed in recent years, as a result of a number of HbA1c standardization programmes, culminating in the work of the IFCC Working Group on Standardization of HbA1c, which established true International Reference Methods for HbA1c and provided a preparation of pure HbA1c, against which manufacturers could standardize their calibrators [11].
In 2011, in response to this global standardization of HbA1c methods, the WHO stated that HbA1c could be used as a diagnostic test for diabetes mellitus, “provided that stringent quality assurance methods are in place, assays are standardized to criteria aligned to the international reference values and there are no conditions present that preclude its accurate measurement” [12]. Based on the DETECT-2 pooled data analysis, which examined the association between diabetes-specific retinopathy and glycemic measures, an HbA1c of 48 mmol/mol was recommended as the cut-off point for diagnosing diabetes [13]. As with glucose measurements, there is a range of HbA1c levels below this diagnostic value, which indicates an increased risk of future diabetes and/or cardiovascular disease: a systematic review indicated that HbA1c values between 37 and 48 mmol/mol are associated with a substantially increased risk of diabetes [14]. The WHO did not provide specific guidance on HbA1c criteria for ‘pre-diabetes’ but the 2009 International Expert Committee concluded that individuals with an HbA1c of 42–47 mmol/mol should be considered at high risk of progression to diabetes [15] (estimated 5-year risk of 25–50% [14]), a range that was endorsed by a UK Expert Position Statement [16].
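The corresponding HbA1c cut-offs can be sketched in the same way (IFCC units, mmol/mol). Again this is illustrative only, and it assumes the WHO proviso that no condition precluding accurate HbA1c measurement is present.

```python
# Sketch of the HbA1c cut-offs discussed above (IFCC units, mmol/mol).
# Illustrative only: HbA1c must not be used where conditions preclude its
# accurate measurement (e.g. altered erythrocyte turnover).

def classify_hba1c(hba1c: float, assay_reliable: bool = True) -> str:
    """Classify an HbA1c result against the WHO diagnostic cut-offs."""
    if not assay_reliable:
        return "HbA1c inappropriate here: use glucose-based criteria"
    if hba1c >= 48:
        return "diabetes (repeat to confirm if asymptomatic)"
    if 42 <= hba1c <= 47:
        return "high risk of diabetes ('pre-diabetes')"
    return "below the high-risk range"
```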
Current recommendations
The current criteria for the diagnosis of diabetes and ‘pre-diabetes’, in accordance with WHO recommendations, are summarized in Table 2. OGTTs, which are time-consuming, inconvenient and show poor reproducibility, are increasingly confined to the diagnosis of gestational diabetes. HbA1c confers definite advantages over FPG (and OGTT): no patient preparation; lower biological variation; less fluctuation in acute stress and illness, and standardization of measurement is now better than for glucose, which has no internationally recognized reference method. However, there are a number of situations, in which the use of HbA1c for diagnosis is not appropriate (Table 2): as a measure of chronic hyperglycemia, HbA1c should not be used where rapidly developing hyperglycemia is suspected and results will be unreliable in the presence of any factors affecting erythrocyte lifespan [12].
Regardless of the test used, in an asymptomatic patient, a diagnostic result should be confirmed by repeat testing on a separate day, preferably using the same test, in order to increase the likelihood of concordance. In the same way that there is less than 100% concordance between the results of FPG and 2-hour OGTT PG, there is not full concordance between HbA1c and glucose measurements: these three different measures of glycemia represent different physiological processes and, therefore, inevitably, they identify somewhat different populations of patients [17]. In fact, although HbA1c performs equally well as a predictor of retinopathy risk, in most populations, its use results in a lower diabetes prevalence (the OGTT 2-hour PG is the most sensitive test). A study including 6890 adults from the US National Health and Nutrition Examination Survey (1999–2006) indicated that the prevalence of undiagnosed diabetes was 2.3% using HbA1c, compared to 3.6% using FPG [18]. Other studies have confirmed this discrepancy although, in fact, the magnitude of the difference appears to vary between populations, perhaps reflecting geographical or ethnic differences in hemoglobin glycation rates or the distribution of certain forms of anemia or hemoglobinopathy. It is anticipated that, in practice, the lower sensitivity of HbA1c will be mitigated by its ease of use, which will facilitate its wider application [3].
For those individuals with ‘pre-diabetes’, structured lifestyle intervention, aimed at increasing physical activity and achieving a loss of body weight, may prevent, or at least delay, the development of diabetes. Within this category, for all three tests, the risk of future diabetes is curvilinear, extending below the lower limit of the range and becoming disproportionately greater at the higher end: accordingly, intervention and follow-up should be most aggressive for those considered at particularly high risk [3]. The associated increased risk of cardiovascular disease should also be targeted, with appropriate management of other relevant risk factors (smoking, lipids, blood pressure).
From glucose measurements to HbA1c in the diagnosis of diabetes mellitus: one UK laboratory’s experience of the change in clinical practice
Guidance, outlining the WHO’s position on the use of HbA1c in the diagnosis of diabetes, was issued to local clinicians in 2012. Subsequently, in September 2014, updated guidance was disseminated, advocating the use of HbA1c as a diagnostic test for diabetes mellitus, except where inappropriate, and providing advice on follow-up. This was supported by modification of the requesting process, which allowed a distinction to be made between HbA1c requests made for monitoring established diabetes (designated HbA1cM) and those being used for diagnosis (designated HbA1cD). This facilitated the provision of additional targeted guidance in the form of interpretative comments and, importantly, for HbA1cD requests, allowed flagging, as abnormal, results that indicated ‘pre-diabetes’ (42–47 mmol/mol).
The pattern of fasting glucose, OGTT (excluding those from maternity services) and HbA1c requesting between April 2012 and March 2016 is summarized in Figure 1. Between late 2012 and September 2014, there was a steady increase in HbA1c requests, which was mirrored by a decrease in the number of fasting glucoses requested and OGTTs performed. Since the introduction of the two separate requests, HbA1cD and HbA1cM, in September 2014, the number of monitoring requests has remained at around 2200 per month, about 10% higher than the number being performed early in 2012 (when all such requests were for this purpose). In contrast, requests for diagnostic purposes increased rapidly and, since late 2015, the number of HbA1cD requests has been similar to the total number of HbA1c requests per month in 2014.
Summary
Local experience indicates an enthusiastic uptake in the use of HbA1c for diagnosing diabetes and a concurrent fall in glucose measurements (FPG and 2-hour OGTT PG) for this purpose. As anticipated, the convenience of this test has led to increased screening for diabetes but there is concern that this ease of use may mean that the limitations of HbA1c as a diagnostic test are overlooked, resulting in its application in circumstances when glucose measurements would, in fact, be indicated. There is a clear role for laboratory staff in the provision of ongoing education of clinicians, in order to ensure the appropriate use and interpretation of these tests.
References
1. McCulloch DK. Clinical presentation and diagnosis of diabetes mellitus in adults. UpToDate. (http://uptodate.com/contents/clinical-presentation-and-diagnosis-of-diabetes-mellitus)
2. Global Report on Diabetes. World Health Organization 2016. (http://apps.who.int/iris/bitstream/10665/204871/1/9789241565257_eng.pdf)
3. American Diabetes Association Position Statement. Diagnosis and classification of diabetes mellitus. Diabetes Care 2011; 34(Suppl 1): S62–S69.
4. Report of a WHO Consultation. Definition, diagnosis and classification of diabetes mellitus and its complications. World Health Organization 1999. (https://www.staff.ncl.ac.uk/philip.home/who_dmg.pdf)
5. Definition and diagnosis of diabetes mellitus and intermediate hyperglycemia. World Health Organization 2006. (http://www.who.int/diabetes/publications/Definition%20and%20diagnosis%20of%20diabetes_new.pdf)
6. Expert Committee on the Diagnosis and Classification of Diabetes Mellitus. Report of the Expert Committee on the diagnosis and classification of diabetes mellitus. Diabetes Care 1997; 20: 1183–1197.
7. Inzucchi SE. Diagnosis of diabetes. N Engl J Med. 2012; 367(6): 542–550.
8. Day A. HbA1c and diagnosis of diabetes. The test has finally come of age. Ann Clin Biochem. 2012; 49: 7–8.
9. The Diabetes Control and Complications Trial Research Group. The effect of intensive treatment of diabetes on the development and progression of long-term complications in insulin-dependent diabetes. N Engl J Med. 1993; 329: 977–986.
10. United Kingdom Prospective Diabetes Study (UKPDS) Group. Intensive blood glucose control with sulphonylureas or insulin compared with conventional treatment and risk of complications in patients with type 2 diabetes (UKPDS 33). Lancet 1998; 352: 837–853.
11. The American Diabetes Association, European Association for the Study of Diabetes, International Federation of Clinical Chemistry and Laboratory Medicine and the International Diabetes Federation Consensus Committee. Consensus statement on the worldwide standardisation of the HbA1c measurement. Diabetologia 2007; 50(10): 2042–2043.
12. Use of glycated haemoglobin (HbA1c) in the diagnosis of diabetes mellitus. Abbreviated report of a WHO consultation. World Health Organization 2011. (http://www.who.int/diabetes/publications/report-hba1c_2011.pdf)
13. Colagiuri S, Lee CMY, Wong TW, Balkau B, Shaw JE, Borch-Johnsen K. Glycemic thresholds for diabetes-specific retinopathy: implications for diagnostic criteria for diabetes. Diabetes Care 2011; 34: 145–150.
14. Zhang X, Gregg EW, Wiliamson DF, Barker LE, Thomas W, Imperatore G, Williams DE, Albright AL. A1c level and future risk of diabetes: a systematic review. Diabetes Care 2010; 33(7): 1665–1673.
15. International Expert Position Report on the role of the A1C assay in the diagnosis of diabetes. Diabetes Care 2009; 32: 1327–1334.
16. Expert Position Statement: Use of HbA1c in the diagnosis of diabetes mellitus in the UK. The implementation of World Health Organization guidance 2011. Diabetic Medicine 2012; 29: 1350–1357.
17. American Diabetes Association. Classification and diagnosis of diabetes. Diabetes Care 2015; 38(Suppl 1): S8–S16.
18. Carson AP, Reynolds K, Fonseca VA, Muntner P. Comparison of A1C and fasting glucose criteria to diagnose diabetes among U.S. adults. Diabetes Care 2010; 33: 95–97.
The author
Shirley A. Bowles MB ChB, MSc, FRCPath
Department of Blood Sciences, Countess of Chester Hospital NHS Foundation Trust, Chester, UK
E-mail: shirleybowles@nhs.net
Type 2 diabetes – biomarker models promise new means to predict risk
Considerable rewards could be obtained from early identification of Type 2 diabetes mellitus (T2DM). One of the most obvious, as suggested in a recent report on diabetes’ global burden, would be better disease management. The report, by the University of East Anglia in the UK, concludes that “early investments into prevention and disease management may therefore be particularly worthwhile.”
Risk factors
Such perspectives are strengthened by evidence that the onset of T2DM can be delayed by behaviour modification. A study in the ‘British Medical Journal’ in 2007 noted that lifestyle changes could be “at least as effective as drug treatment” in slowing the onset of diabetes. It concluded that the only barrier to the effectiveness of such a strategy was identifying diabetes quickly enough.
Much is now known about the risk factors associated with T2DM such as parental history, age, body mass index and elevated blood glucose levels. Combining these with measurable indicators of metabolic syndrome – high blood pressure, LDL and HDL cholesterol and excess triglyceride – can result in a credible degree of prediction. However, there are several barriers to the process.
Fasting glucose and oral glucose tolerance
The typical method for assessing T2DM risk is to measure fasting plasma glucose (FPG). However, the test’s specificity is poor. Two decades ago, the so-called Hoorn study at Amsterdam warned about significant levels of variation in blood glucose levels. Although many individuals are identified as having impaired fasting glucose (IFG), their absolute risk of conversion to diabetes is a mere 5 to 10% per year.
Over this period, differences have also emerged about how best to measure glucose. In the year 2000, while some experts (including the American Diabetes Association) recommended the use of fasting plasma glucose (FPG) alone, others noted that many diabetic subjects would have been classified as non-diabetic on the FPG test. As a result, they recommended use of the two-hour oral glucose tolerance test (OGTT). Nevertheless, in spite of its greater accuracy, OGTT is rarely used since it requires two hours to perform and is an unpleasant experience for the patient.
Glucose tolerance only one risk indicator
The above factors have provoked a search for new approaches to predicting T2DM. Some beliefs about the OGTT have been brought into question, too. In 2002, clinical epidemiologists at the University of Texas Health Science Center at San Antonio published the results of a prospective cohort study to identify people at high risk of T2DM.
The results were unequivocal. Impaired glucose tolerance was only one indicator of risk. Persons at high risk for T2DM, the study concluded, were “better identified by using a simple prediction model than by relying exclusively on the results of a 2-hour oral glucose tolerance test.”
Predictive models
Subsequent years have seen significant efforts to develop and refine predictive models for T2DM. However, five years after the San Antonio study, the choices are still less than wholly clear.
In 2007, the Framingham Offspring study in the US estimated seven-year T2DM risk based on a pyramid of metrics consisting – at the base – of age, sex, parental history and body mass index. This was followed by the inclusion of simple clinical measurements on metabolic syndrome traits, and thereafter, the 2-hour post-oral glucose tolerance test, fasting insulin and C-reactive protein levels. At its most complex, the model used the Gutt insulin sensitivity index or a homoeostasis model of insulin resistance.
For proponents of new alternatives to impaired glucose tolerance, the conclusions of the Framingham study were stark. Complex clinical models, it stated, were not superior to the simple one, and in spite of the definite existence of T2DM prediction rules, “we lack consensus for the most effective approach.”
The limitations of biotech
More recently, investigations at the frontiers of biotech have also faced challenges to clear-cut answers. Although it is clear that multiple genetic loci are associated with the risk of T2DM, researchers have not managed to connect the genetics underlying a family history of diabetes with predictability.
In 2008, researchers at Harvard/Massachusetts General and Emory University published results of a study on 18 single-nucleotide polymorphisms (SNPs) known to have associations with the risk of T2DM, to predict new cases in a large, prospectively examined, community-based cohort. However, the outcome, in terms of risk prediction, was less than encouraging. In reality, it proved to be only slightly better at making a prediction than did traditional risk factors on their own. The authors concluded: “Our findings underscore the view that identification of adverse phenotypic characteristics remains the cornerstone of approaches to predicting the risk of type 2 diabetes.”
Adiponectin and ferritin
Meanwhile, the effort to identify and validate alternative biomarkers for prediction and screening continues. Two especially promising ones appear to be adiponectin, an adipocyte-derived, insulin-sensitizing peptide, and ferritin, a protein that binds iron and accounts for most of the iron stored in the body.
Studies in the early 2000s in the US and Germany confirmed that adiponectin was independently associated with a reduced risk of type 2 diabetes.
Interest in this area goes back a long time, to a cross-sectional and longitudinal study of Arizona’s Pima Indians, who have the world’s highest reported prevalence and incidence of non-insulin-dependent diabetes mellitus (NIDDM). The study dates to the early 1980s when it sought to document the sequence of metabolic events occurring with “the transition from normal to impaired glucose tolerance and then to diabetes.”
In 2004, a prospective study within the US Nurses’ Health Study investigated iron storage, given that T2DM can be a manifestation of iron overload, as in hemochromatosis. The researchers established that a higher iron store (reflected by an elevated ferritin concentration and a lower ratio of transferrin receptors to ferritin) is associated with increased T2DM risk in healthy women, independent of known diabetes risk factors.
However, there still are reasons for caution. In July 2014, or more than a decade after the US Nurses’ Health Study, a meta-analysis of T2DM risk and ferritin in the journal ‘Diabetes/Metabolism Research and Reviews’ warned that though evidence suggested a causal link, “publication bias and unmeasured confounding cannot be excluded.”
Nevertheless, ferritin and adiponectin do appear to play a key role in predicting T2DM when combined with other selected biomarkers.
The Danish model
One predictive model that has emerged in Denmark selected a panel of six biomarkers out of a total of 64, to assess T2DM risk. The selected biomarkers include adiponectin and ferritin, as well as four of their more common counterparts: glucose and insulin, as well as the inflammation markers C-reactive protein (CRP) and interleukin-2 receptor A (IL2RA).
The model was developed by a research team from Copenhagen’s Glostrup Hospital and Steno Diabetes Centre, along with the Copenhagen and Aarhus universities, and Tethys Bioscience of the US.
The researchers used the so-called Inter99 cohort, a study of about 6,600 Danes with the primary outcome of 5-year conversion to T2DM, to select 160 individuals who developed T2DM and 472 who did not. They carefully measured several clinical variables and candidate biomarkers from a multitude of diabetes-associated pathways, using an ultrasensitive immunoassay microsample molecular counting technology.
Their effort ultimately led to six biomarkers that gave a Diabetes Risk Score. This, they concluded in a July 2009 issue of ‘Diabetes Care’, provided “an objective and quantitative estimate of the 5-year risk of developing type 2 diabetes, performs better than single risk indicators and a noninvasive clinical model, and provides better stratification than fasting plasma glucose alone.”
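To show the general shape of such a model (not the actual Diabetes Risk Score, whose coefficients are not reproduced here), the Python sketch below combines the six biomarkers in a logistic model with invented placeholder weights.

```python
# Purely illustrative sketch of a six-biomarker logistic risk model of the
# kind described above. The weights and intercept are invented placeholders;
# they are NOT the published Diabetes Risk Score coefficients.
import math

HYPOTHETICAL_WEIGHTS = {
    "glucose": 0.9, "insulin": 0.4, "crp": 0.3,
    "il2ra": 0.3, "ferritin": 0.2, "adiponectin": -0.5,  # higher = protective
}
HYPOTHETICAL_INTERCEPT = -4.0

def five_year_risk(z_scores: dict) -> float:
    """Estimate 5-year T2DM risk from standardized biomarker values (illustrative)."""
    logit = HYPOTHETICAL_INTERCEPT + sum(
        weight * z_scores.get(name, 0.0)
        for name, weight in HYPOTHETICAL_WEIGHTS.items())
    return 1.0 / (1.0 + math.exp(-logit))
```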
Expert acclaim
The researchers who developed the Danish Diabetes Risk Score are modest in their claims. In an appendix to their report in ‘Diabetes Care’, they point out that their selection process for biomarkers may not have identified the best possible model, but do state that they identified a ‘good’ model.
Some outside observers are however less circumspect, given what many acknowledge to be one of the most exhaustive and profound selection efforts to date. James Meigs of Harvard Medical School calls the Danish Diabetes Risk Score “the most robust multimarker prediction model possible.”
Beyond Europeans to Chinese
A major caveat in the Danish effort concerned demographics. The report on the Danish model in ‘Diabetes Care’ noted that it “may only apply to white Northern Europeans enrolled in a lifestyle intervention trial” and that it was an open question whether the model “would produce the same biomarkers or discriminate well in race/ethnicity populations that are differentially affected by diabetes.”
Answers to these are still emerging. In 2013, a study on 2,198 community-living Chinese by the Shanghai Institutes for Biological Sciences endorsed the use of ferritin as a biomarker. Though the focus of the research was on iron storage, two of three other biomarkers used in the effort were the same as those in the Danish study, namely adiponectin and CRP (the fourth was γ-glutamyltransferase).
Biomarker search continues
Meanwhile, the search for T2DM biomarkers continues.
Two endothelial dysfunction biomarkers being investigated for T2DM risk are E-selectin and ICAM-1. The Nurses’ Health Study mentioned above also found that significantly elevated levels of the latter predicted incident diabetes in women independent of traditional risk factors such as BMI, family history, diet and activity. In addition, adjustment for baseline levels of C-reactive protein, fasting insulin and hemoglobin A1c did not alter these associations.
Incretins and melatonin
Incretins, metabolic hormones which lower blood glucose by causing an increase in insulin after eating, are another potentially significant biomarker. An ‘incretin effect’ is associated with the fact that oral glucose elicits a higher insulin response than does intravenous glucose. There are two hormones responsible for the incretin effect: glucose-dependent insulinotropic polypeptide (GIP) and glucagon-like peptide-1 (GLP-1).
In patients with type 2 diabetes, the incretin effect is reduced. In addition, about half of the first-degree relatives of patients with T2DM show reduced responses toward GIP, without any significant change in GIP or GLP-1 secretion after oral glucose. To some researchers, this opens the possibility that a reduced responsiveness to GIP is an early step in the pathogenesis of type 2 diabetes.
Variation in the circadian system has also drawn a great deal of attention.
Reverse transcription polymerase chain reaction (RT-PCR) analyses, led by a team at the University of Lille in France, investigated melatonin receptor 2 (MT2 transcripts) in neural tissues and MT2 expression in human pancreatic islets and beta cells. Their findings suggest a link between circadian rhythm regulation and glucose homoeostasis through the melatonin signalling pathway.
The use of point-of-care ketone meters to diagnose and monitor diabetic ketoacidosis in pediatric patients
Children presenting with diabetic ketoacidosis (DKA) require prompt assessment and treatment initiation to prevent serious complications. The use of point-of-care (POC) analysers to assess blood ketones is beginning to replace the traditional analysis of urine ketones, but some questions remain as to their optimal utilization.
by Dr A.M. Ferguson, Dr J. Michael, Prof. S. DeLurgio and Dr M. Clements
Introduction
Diabetic ketoacidosis (DKA) is an acute complication of uncontrolled diabetes mellitus resulting from insulin deficiency. It is biochemically defined as hyperglycemia (blood glucose >200 mg/dL) with metabolic acidosis (venous pH <7.3 or bicarbonate <15 mmol/L), ketonemia, and ketonuria [1]. The clinical picture of the patient can include fatigue, polydipsia, polyuria, dehydration, abdominal pain, vomiting and altered mental status (Box 1). DKA can occur in known diabetics and can be the presenting symptom prior to diagnosis. Children who are on insulin pump therapy, who have unstable family situations, or have limited access to healthcare are at an increased risk of DKA [1], and DKA is the most common cause of diabetes-related mortality in children.
Assessing urine ketones has been part of the standard practice when assessing if a patient has DKA, but this has multiple issues. There are three types of ketones: acetoacetate, acetone, and β-hydroxybutyrate (BHB). BHB is the predominant ketone produced during DKA and can be present at up to 10 times the amount of acetoacetate. The urine dipsticks that are commonly used to assess ketonuria utilize a nitroprusside reagent that reacts with acetoacetate and acetone but not at all with BHB. This is problematic because the major ketone produced in DKA is not detected, which can lead to false negative urine ketone testing. Additionally, as ketosis resolves, BHB is converted to acetoacetate, increasing urine ketones during the recovery phase, potentially leading the clinician to believe that the ketosis is worsening instead of resolving. An added obstacle is the difficulty of getting a urine specimen from a young child, especially one in nappies. Measuring serum ketones, specifically BHB, is a solution to both of these issues.
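As a minimal illustration, the biochemical definition above can be expressed as a short Python check; units follow the text, and the ketosis input deliberately abstracts over the urine-versus-serum question discussed above and in the next section.

```python
# Sketch of the biochemical DKA definition given above; units follow the
# text (glucose in mg/dL, bicarbonate in mmol/L). How ketosis is established
# (urine dipstick versus serum BHB) is addressed separately in the article.

def meets_dka_criteria(glucose_mg_dl: float, venous_ph: float,
                       bicarbonate_mmol_l: float, ketosis_present: bool) -> bool:
    """Apply the biochemical DKA definition from reference [1]."""
    hyperglycemia = glucose_mg_dl > 200
    acidosis = venous_ph < 7.3 or bicarbonate_mmol_l < 15
    return hyperglycemia and acidosis and ketosis_present
```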
Clinical measurement of serum ketones
As the methodology for measuring serum BHB became more automated, the test moved from being used only on a research basis to being available for clinical use. Initial studies were done to see how serum BHB functioned for the diagnosis of DKA. A large retrospective study looking at simultaneous measurements of BHB and bicarbonate found that BHB levels of ≥3 and ≥3.8 mmol/L in children and adults, respectively, could be used to diagnose DKA and provide a more specific assessment of DKA than bicarbonate alone [2].
When assessing patients for DKA, it is critical to make the diagnosis as quickly as possible to initiate treatment and prevent the patient from decompensating further. The commercial availability of point-of-care (POC) meters to assess serum ketones allows the patient to be tested immediately on presentation at the bedside. Multiple studies performed in adults have shown that use of POC BHB meters in the emergency room can aid in the diagnosis and treatment of DKA. Arora et al. compared POC BHB and urine ketone dipstick results in 54 patients with DKA presenting to the emergency department [3]. They found that both methods were equally sensitive for detecting DKA at 98.1%, but that BHB with a cut-off of ≥1.5 mmol/L is more specific for DKA than urine dipsticks (78.6 vs 35.1%) and could cut down on unnecessary DKA work-ups in hyperglycemic patients. Another study found that a BHB value of 3.5 mmol/L yielded 100% sensitivity and specificity for the diagnosis of DKA [4].
Use of POC testing in pediatrics
Fewer studies have been done in pediatric patients. One such study by Ham et al. determined that using a POC meter in the hospital setting could aid in monitoring the resolution of DKA in pediatric patients [5]. The BHB values from the POC meter correlated with BHB values from the laboratory for most of the meter’s measurement range. Use of the meter had both a strong positive predictive value (PPV, 0.85) and negative predictive value (NPV, 1.0) for indicating the presence or absence of DKA at a meter value of 1.5 mmol/L [5].
Noyes et al. used POC ketone testing to identify the endpoint of an integrated care pathway when treating DKA in children [6]. They compared their current treatment endpoint of pH >7.3 and no presence of urine ketones with an endpoint defined by pH >7.3 and two successive POC ketone measurements of <1 mmol/L (a rule sketched in code at the end of this section). The study measured time of treatment in 35 patient episodes in children ranging in age from 1 to 14 years. The time to completion of treatment using POC ketone measurement was 17 hours, compared to 28 hours using measurement of urine ketones to end treatment [6]. They found that occasionally a value below 1 mmol/L would be followed by a value above 1 mmol/L, but this never occurred after two successive values under 1 mmol/L, leading them to recommend waiting for the two successive low values before ending treatment. In addition to allowing an earlier treatment endpoint, this approach means less time spent in the ICU, with decreased treatment costs.
Using a POC ketone meter can also result in fewer tests being ordered overall. Rewers and colleagues asked whether monitoring serum BHB values at the bedside could result in a decrease in laboratory testing in pediatric patients [7]. Their results indicated that the real-time changes observed in POC serum BHB values correlated strongly with changes in pH, bicarbonate, and pCO2 and also had good correlation with the laboratory BHB method. While initial measurement of pH, bicarbonate and pCO2 is encouraged, following up the patient with POC BHB can replace serial laboratory measurements of those analytes and decrease the amount of laboratory testing [7]. Similarly, a separate study showed that use of a POC BHB meter at home decreased diabetes-related hospital visits and hospitalizations of pediatric diabetics when compared to urine ketone testing, by allowing earlier identification of ketosis and initiation of treatment [8].
Most of the studies mentioned are close to 10 years old, but measuring serum BHB to diagnose DKA or monitor its resolution has not become standard practice. A recent review of the standard treatment guidelines for DKA in children and adolescents raises the question of whether blood ketones should be evaluated during management of DKA [9]. The authors recommend using serum BHB measurement, either from the laboratory or at the point of care, to both diagnose DKA and monitor treatment. Despite the inaccuracies of POC meters seen at high BHB values [5–7], use of a diagnostic cut-off of >3 mmol/L is well within the accurate range of the meters and can be used to confidently diagnose DKA and monitor the patient’s response to treatment.
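The Noyes et al. endpoint rule lends itself to a compact sketch; the Python below is illustrative only and simply restates the published rule (pH >7.3 plus two successive POC ketone readings <1 mmol/L).

```python
# Sketch of the Noyes et al. endpoint rule described above: intravenous
# insulin therapy can end once pH > 7.3 and the two most recent POC ketone
# readings are both < 1 mmol/L (a single low value may be followed by a
# rebound above 1 mmol/L, so two successive low values are required).

def treatment_endpoint_reached(ph: float, poc_bhb_readings: list) -> bool:
    """Return True when the integrated care pathway endpoint is met."""
    if ph <= 7.3 or len(poc_bhb_readings) < 2:
        return False
    return all(value < 1.0 for value in poc_bhb_readings[-2:])
```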
Conclusions
Despite the increasing body of knowledge indicating that measurement of serum BHB can aid in both diagnosis and management of DKA, a study conducted in 2014 indicated that although 89% of pediatric emergency medicine and critical care providers responding to a survey stated that they had a DKA protocol at their institution, 67% perceived no clinical advantage in the use of serum ketone measurements [10]. This suggests that evaluation of serum ketone monitoring during DKA management from a quality improvement and research perspective may be necessary before clinical adoption is widespread. The next iteration of DKA management guidelines should address the potential utility of serum ketone monitoring.
References
1. Wolfsdorf J, Craig ME, et al. Diabetic ketoacidosis in children and adolescents with diabetes. Pediatr Diabetes 2009; 10(Suppl 12): 118–133.
2. Sheikh-Ali M, Karon BS, et al. Can serum beta-hydroxybutyrate be used to diagnose diabetic ketoacidosis? Diabetes Care 2008; 31(4): 643–647.
3. Arora S, Henderson SO, et al. Diagnostic accuracy of point-of-care testing for diabetic ketoacidosis at emergency-department triage: {beta}-hydroxybutyrate versus the urine dipstick. Diabetes Care 2011; 34(4): 852–854.
4. Charles RA, Bee YM, et al. Point-of-care blood ketone testing: screening for diabetic ketoacidosis at the emergency department. Singapore Med J. 2007; 48(11): 986–989.
5. Ham MR, Okada P, White PC. Bedside ketone determination in diabetic children with hyperglycemia and ketosis in the acute care setting. Pediatr Diabetes 2004; 5(1): 39–43.
6. Noyes KJ, Crofton P, et al. Hydroxybutyrate near-patient testing to evaluate a new end-point for intravenous insulin therapy in the treatment of diabetic ketoacidosis in children. Pediatr Diabetes 2007; 8(3): 150–156.
7. Rewers A, McFann K, Chase HP. Bedside monitoring of blood beta-hydroxybutyrate levels in the management of diabetic ketoacidosis in children. Diabetes Technology & Therapeutics 2006; 8(6): 671–676.
8. Laffel LM, Wentzell K, et al. Sick day management using blood 3-hydroxybutyrate (3-OHB) compared with urine ketone monitoring reduces hospital visits in young people with T1DM: a randomized clinical trial. Diabet Med. 2006; 23(3): 278–284.
9. Wolfsdorf JI. The International Society of Pediatric and Adolescent Diabetes guidelines for management of diabetic ketoacidosis: Do the guidelines need to be modified? Pediatr Diabetes 2014; 15(4): 277–286.
10. Clark MG, Dalabih A. Variability of DKA management among pediatric emergency room and critical care providers: a call for more evidence-based and cost-effective care? J Clin Res Pediatr Endocrinol. 2014; 6(3): 190–191.
The authors
Angela M. Ferguson*1 PhD, DABCC, FACB; Jeffery Michael1 D.O., FAAP; Stephen DeLurgio2 PhD; Mark Clements1 MD, PhD, CPI
1Children’s Mercy Hospital, Kansas City, MO, USA
2Bloch School, University of Missouri, Kansas City, MO, USA
*Corresponding author
E-mail: amferguson@cmh.edu
Porphyrias: clinical and diagnostic aspects
Classification
The classification of porphyrias (Table 1) has traditionally been based either on the clinical manifestations, i.e. acute or non-acute (cutaneous), or on the primary organ of porphyrin overproduction, i.e. hepatic or erythropoietic [1, 3, 8]. A combined classification that takes account of both of these elements has recently been proposed [2]. Whichever classification is adopted, however, it should be recognized that VP, and to a lesser extent HCP, can manifest with both acute and cutaneous features, either simultaneously or separately.
Clinical and biochemical diagnosis
The clinical manifestations of porphyrias, particularly the acute hepatic porphyrias, are protean; consequently, patients with a clinically active porphyria may initially present to a relatively wide spectrum of clinical specialties including gastroenterology, acute medicine, dermatology, neurology, endocrinology and hematology, amongst others [2]. In general, the cutaneous porphyrias should not pose a diagnostic difficulty for an experienced dermatologist used to investigating photosensitive skin disorders, but biochemical testing is still required to define the type of porphyria present. Definitive diagnosis of an initial acute hepatic porphyria attack, however, is critically dependent on biochemical testing, as symptoms are often non-specific in nature (Tables 1 & 2).
The diagnosis of an acute hepatic porphyria attack is founded on demonstrating an increase in urine PBG in direct temporal association with the characteristic acute symptom complex, the minimum increase being between 2- and 5-fold [9, 10]. Urine PBG may be measured either on a random sample, where it should be reported as a urine PBG-to-creatinine ratio, or on a 24-hour urine collection, where total PBG is reported. The former has proven clinically efficacious and has the advantages of timeliness, reduced within-subject variation and convenience over a 24-hour urine collection [9]. If the urine PBG is not elevated, this effectively rules out an acute porphyria attack at the time of sampling; however, there are certain caveats. First, if specific treatment with either heme preparations or carbohydrate loading has been instigated before the test, these interventions can reduce the urine PBG level significantly, even to the point of normalization [3]. Furthermore, if the measurement of urine PBG is delayed or undertaken at a time removed from the actual acute clinical presentation, e.g. by weeks or months, a normal urine PBG at that later stage cannot effectively rule out acute porphyria [3]. In this author's experience, another important caveat concerns patients with a previously confirmed diagnosis of acute porphyria who present with symptoms suggestive of a recurrent acute attack. In many instances these patients have a persistently elevated urine PBG, even between attacks, so an elevated urine PBG cannot effectively guide diagnosis; in these situations, the decision to treat as an acute attack has to be made on the basis of clinical findings.
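As a rough illustration of how this first-line decision step might be encoded, the sketch below assumes a placeholder upper reference limit for the PBG:creatinine ratio; the limit, units and function names are all assumptions, and the clinical caveats above always take precedence over any numeric rule.

```python
# Illustrative sketch of first-line urine PBG interpretation for a suspected
# acute attack. The upper reference limit (URL) is an assumed placeholder;
# each laboratory must apply its own validated limit, and the caveats in the
# text (prior heme/carbohydrate therapy, delayed sampling, persistently
# raised PBG in known porphyria) override this logic.

ASSUMED_PBG_URL = 1.5  # µmol PBG per mmol creatinine (placeholder value)

def interpret_random_urine_pbg(pbg_umol_l: float, creatinine_mmol_l: float) -> str:
    """Report a random urine PBG as a PBG:creatinine ratio and interpret it
    against the 2- to 5-fold minimum increase described in the text."""
    ratio = pbg_umol_l / creatinine_mmol_l
    if ratio >= 5 * ASSUMED_PBG_URL:
        return f"{ratio:.1f} µmol/mmol: markedly raised; consistent with an acute attack"
    if ratio >= 2 * ASSUMED_PBG_URL:
        return f"{ratio:.1f} µmol/mmol: raised; supports an acute attack, confirm quantitatively"
    return f"{ratio:.1f} µmol/mmol: not raised; acute attack unlikely at time of sampling"
```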
A clinically effective service for acute porphyria diagnosis therefore requires the availability of a timely, quality-assured laboratory method for urine PBG [11]. Although a qualitative urine PBG method may suffice for establishing a diagnosis, it should be supported by a confirmatory quantitative method. Lack of access to a urine PBG assay is very often the basis for misdiagnosis, or indeed delayed diagnosis, of acute porphyria attacks [10].
Urine ALA is often measured in conjunction with PBG; although it is also elevated in acute porphyrias, it does not tend to reach the levels attained by PBG. The one exception is the extremely rare autosomal recessive ADP, due to defective ALA dehydratase (ALAD) activity, in which markedly elevated urine ALA levels are reported while PBG may be normal or only slightly elevated [2, 3]. A similar pattern of urine ALA predominance relative to PBG (although not as markedly elevated) may also be observed in lead poisoning, wherein patients may likewise present with abdominal pain and neuropathy [1, 3].
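The relative pattern of ALA and PBG thus acts as a simple triage signal. A minimal sketch, with invented names and inputs expressed as fold-elevations over each analyte's upper reference limit, might look as follows.

```python
def ala_pbg_pattern(ala_fold_url: float, pbg_fold_url: float) -> str:
    """Qualitative triage of the urine ALA/PBG pattern described in the text.
    Inputs are fold-elevations over each analyte's upper reference limit;
    thresholds and wording are illustrative only."""
    if pbg_fold_url >= 2:
        return "PBG-predominant: typical acute hepatic porphyria (AIP/VP/HCP)"
    if ala_fold_url >= 2:
        return ("ALA-predominant with normal or near-normal PBG: "
                "consider the very rare ADP or lead poisoning")
    return "no significant elevation of ALA or PBG"
```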
Once the diagnosis of acute porphyria has been made on the basis of urine PBG, the next phase involves determining the type of porphyria present. This depends on the specific pattern of porphyrin overproduction observed in samples of urine, feces, plasma and erythrocytes. It is critically important that the laboratory analytical methods available extend beyond the sole measurement of total porphyrin levels [10–12]. In particular, it is essential that individual porphyrin analysis and isomer fractionation in both urine and feces are available to facilitate identification of the porphyria-specific patterns of porphyrin overproduction [10–12]. In many instances, non-porphyria disorders affecting the gastrointestinal and hepatobiliary systems, or certain dietary factors, may cause non-specific secondary elevations in porphyrins, e.g. coproporphyrinuria, which can be diagnostically misleading [3]. In such cases urine PBG levels will not be elevated and the pattern of porphyrins observed will not be indicative of any one of the specific porphyrias per se. It is therefore important to realize that a finding of elevated porphyrin levels does not automatically equate to a diagnosis of underlying porphyria. This further highlights the importance of developing specialist porphyria centres to ensure that the appropriate repertoire of quality-assured testing, expert interpretation and support is available for the diagnosis and management of porphyria patients [11, 13].
The diagnosis of the cutaneous (non-acute) porphyrias is likewise based on the specific patterns of porphyrins observed in urine and feces. In addition, the pattern of free and zinc protoporphyrin in erythrocytes can be useful in the diagnosis of CEP, EPP and the related disorder XLP. Identification of the porphyria subtype, whether acute or cutaneous, may also be aided by characteristic plasma porphyrin fluorescence emission peaks, e.g. the VP emission peak between 625 and 628 nm [1–3]. Finally, it is essential that all samples for porphyrin and precursor measurement are protected from light prior to analysis.
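At its simplest, interpretation of a plasma fluorescence emission scan amounts to matching the peak wavelength against characteristic windows, as in the sketch below. Only the VP window is taken from the text; the other windows are approximate values that should be treated as assumptions, with each laboratory applying its own validated ranges.

```python
def classify_plasma_peak(peak_nm: float) -> str:
    """Suggest a porphyria subtype from the plasma porphyrin fluorescence
    emission peak. Only the VP window (625–628 nm) comes from the text;
    the other windows are approximate, laboratory-dependent assumptions."""
    if 625 <= peak_nm <= 628:
        return "suggests variegate porphyria (VP)"
    if 618 <= peak_nm < 625:
        return "suggests a water-soluble porphyrin peak (e.g. PCT, AIP; approximate window)"
    if 628 < peak_nm <= 636:
        return "suggests a protoporphyrin-type peak (e.g. EPP/XLP; approximate window)"
    return "no characteristic peak window matched"
```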
Role of genetic diagnosis
Given the heritable nature of porphyrias, it is not surprising that molecular genetic analysis has become an important diagnostic adjunct. There is extensive allelic heterogeneity of pathogenic mutations among the genes implicated in each porphyria, meaning that most mutations are confined to one or at most a few kindreds. There are, however, a few exceptions to this trend, most notably the founder mutations among the Swedish population and the Afrikaner population of South Africa. The general genetic diagnostic strategy is first to characterize the causative mutation in a known affected individual (the proband) using a mutation scanning approach [14]. Once a putative mutation has been identified and its pathogenicity for the particular porphyria affirmed, more extensive family cascade genetic screening can be organized based on analysis of this kindred-specific mutation [14].
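This proband-first, kindred-specific strategy can be outlined in a few lines of code. The sketch below is purely illustrative: the types and the stand-in for a targeted single-variant assay are hypothetical and do not represent any laboratory's actual pipeline (the gene symbol HMBS for AIP is real; the workflow detail is not from the cited sources).

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Variant:
    gene: str                     # e.g. "HMBS" in AIP (gene symbol is real)
    description: str              # kindred-specific change; hypothetical here
    pathogenicity_confirmed: bool # must be affirmed before cascade testing

def cascade_screen(proband_variant: Variant, relatives: list[str],
                   targeted_assay: Callable[[str, Variant], bool]) -> list[str]:
    """Return the relatives who carry the kindred-specific variant.
    `targeted_assay` stands in for a single-variant genetic test."""
    if not proband_variant.pathogenicity_confirmed:
        raise ValueError("affirm pathogenicity before family cascade screening")
    return [r for r in relatives if targeted_assay(r, proband_variant)]
```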
This approach has important implications for the diagnosis of porphyria susceptibility, particularly in the autosomal dominant acute hepatic porphyrias, where both penetrance and expressivity are low [3, 4]. Penetrance in AIP, VP and HCP is between 10 and 40%, implying that the majority of patients with an autosomal dominant acute hepatic porphyria will never manifest an acute attack (or, in the case of VP and HCP, cutaneous lesions) in their lifetime [3, 4]. Moreover, this lack of penetrance may also extend to the absence of subclinical biochemical abnormalities indicative of an underlying autosomal dominant acute porphyria, demonstrating the limited sensitivity of biochemical testing for identifying asymptomatic family members.
Currently there is no clear-cut means of discriminating between those who will manifest a clinical and/or biochemical phenotype and those who will not. While the role of environmental precipitating factors, e.g. porphyrinogenic medications, stress, prolonged fasting and menstruation [1–3], has long been recognized in triggering acute porphyria attacks, the presence of a pathogenic mutation remains the single most important factor determining overall susceptibility to an acute porphyria episode. Therefore, all individuals carrying a pathogenic mutation should be regarded as pre-symptomatic carriers, i.e. capable of developing an acute attack, and one of the key applications of genetic analysis in this area is in identifying pre-symptomatic carriers to allow appropriate counselling and management advice to prevent attacks [3, 14].
In this author's experience, another useful role for molecular diagnostics in porphyrias relates to patients with an historic diagnosis of acute hepatic porphyria in whom the biochemical abnormalities have normalized over the years. In such instances genetic analysis can provide a definitive diagnosis of the type of porphyria and facilitates a more extensive family screening programme for potential pre-symptomatic carriers.
Current methods of genetic analysis vary but usually involve a confirmatory step using direct nucleotide sequencing of the putative pathogenic variant as the gold standard, and the emergence of next-generation sequencing platforms has further expanded the diagnostic possibilities in this area. Overall, in the autosomal dominant acute hepatic porphyrias, approximately 95% of mutations are identifiable [3, 14]. This detection rate includes the application of additional methods, such as multiplex ligation-dependent probe amplification (MLPA) and gene dosage analysis, for identifying complex mutations such as large gene deletions, which may not be detected using standard sequencing-based approaches [14].
In the autosomal recessive porphyrias, including ADP, CEP and EPP, clinical penetrance approaches 100%. These disorders also display a degree of genetic heterogeneity. In the case of EPP, the presence of a relatively common low-expression single nucleotide polymorphism (SNP) located in the ferrochelatase gene, FECH (IVS3-48C), appears to be essential for clinical expression of the cutaneous phenotype in the vast majority of cases [15].
The application of molecular genetics has thus provided a means of establishing definitive porphyria susceptibility. However, as with biochemical testing services, genetic diagnostic services in this area must be quality assured to a high standard and should adopt appropriate mutation-scanning assay validation protocols in accordance with international standards and best-practice recommendations [11–14].
References
1. Puy H, Gouya L, Deybach JC. Porphyrias. Lancet 2010; 375(9718): 924–937.
2. Balwani M, Desnick RJ. The Porphyrias: advances in diagnosis and treatment. Blood 2012; 120: 4496–4504.
3. Badminton MN, Elder GH. The porphyrias: inherited disorders of haem synthesis. In: Marshall W, Lapsley M, Day A, Ayling R, editors. Clinical Biochemistry Metabolic and Clinical Aspects. Churchill Livingstone Elsevier 2014; pp. 533–549.
4. Elder G, Harper P, Badminton M, Sandberg S, Deybach JC. The incidence of inherited porphyrias in Europe. J Inherit Metab Dis. 2013; 36: 849–857.
5. Simon NG, Herkes GK. The neurologic manifestations of the acute porphyrias. J Clin NeuroSci. 2011; 18: 1147–1153.
6. Sonderup MW, Hift RJ. The neurological manifestations of the acute porphyrias. S Afr Med J. 2014; 104: 285–286.
7. Crimlisk HL. The little imitator-porphyria: a neuropsychiatric disorder. J Neurol Neurosurg Psychiatry. 1997; 62: 319–328.
8. Siegesmund M, van Tuyll van Serooskerken AM, Poblete-Gutierrez P, Frank J. The acute hepatic porphyrias: current status and future challenges. Best Pract Res Clin Gastroenterol. 2010; 24: 593–605.
9. Aarsand AK, Petersen PH, Sandberg S. Estimation and application of biological variation of urinary delta-aminolevulinic acid and porphobilinogen in healthy individuals and in patients with acute intermittent porphyria. Clin Chem. 2006; 52: 650–656.
10. Kauppinen R, von und zu Fraunberg M. Molecular and biochemical studies of acute intermittent porphyria in 196 patients and their families. Clin Chem. 2002; 48: 1891–1900.
11. Aarsand AK, Villanger JH, Støle E, Deybach JC, Marsden J, To-Figueras J, Badminton M, Elder GH, Sandberg S. European specialist porphyria laboratories: diagnostic strategies, analytical quality, clinical interpretation and reporting as assessed by an external quality assurance programme. Clin Chem. 2011; 57: 1514–1523.
12. Whatley S, Mason N, Woolf J, Newcombe R, Elder G, Badminton M. Diagnostic strategies for autosomal dominant acute porphyrias: Retrospective analysis of 467 unrelated patients referred for mutational analysis of HMBS, CPOX or PPOX gene. Clin Chem. 2009; 55: 1406–1414.
13. Tollånes MC, Aarsand AK, Villanger JH, Støle E, Deybach JC, Marsden J, To-Figueras J, Sandberg S; European Porphyria Network (EPNET). Establishing a network of specialist porphyria centres – effects on diagnostic activities and services. Orphanet J Rare Dis. 2012; 7: 93.
14. Whatley SD, Badminton MN. The role of genetic testing in the management of patients with inherited porphyria and their families. Ann Clin Biochem. 2013; 50: 204–216.
15. Gouya L, Puy H, Robreau AM, Bourgeois M, Lamoril J, Da Silva V, Grandchamp B, Deybach JC. The penetrance of dominant erythropoietic protoporphyria is modulated by expression of wildtype FECH. Nat Genet. 2002; 30: 27–28.
The authors
Vivion E. F. Crowley*1 MB MSc FRCPath FFPath(RCPI) FRCPI, Nadia Brazil2 BA (Mod) FAMLS, Sarah Savage3 BSc MSc
1Consultant Chemical Pathologist, Head of Department, Biochemistry Department, St James’s Hospital, Dublin 8, Ireland
2Porphyrin Laboratory, Biochemistry Department, St James’s Hospital, Dublin 8, Ireland
3Molecular Diagnostic Laboratory, Biochemistry Department, St James’s Hospital, Dublin 8, Ireland
*Corresponding author
E-mail: vcrowley@stjames.ie