Inherited skin diseases can be difficult to assess clinically and often diagnosis relies on multiple laboratory investigations. Traditionally, examination of skin biopsies is followed by biochemical testing and Sanger sequencing of genomic DNA. This approach is labour-intensive, costly and time-consuming. The advent of next-generation sequencing (NGS) methods provides an alternative or complementary approach to making highly accurate diagnoses, but is not without its own challenges.
by J. Lee, Dr A. Salam, Dr T. Takeichi and Prof. J. A. McGrath
Background
The identification of pathogenic mutations in monogenic diseases represents one of the major challenges, and fundamental goals, of early 21st Century human genetics. Most genetic diseases are rare, clinically heterogeneous, and difficult to diagnose – a task made more challenging by disparities in genotype–phenotype correlations, inter- and intra-familial variability, as well as mosaic patterns of disease. It is these hurdles that have led to the advent of Next-Generation DNA Sequencing (NGS): a group of technologies that can improve the speed, accuracy, and cost-efficiency of genetic sequencing, while simultaneously mapping normal variation, and thus furthering our understanding of human genetics in both health and disease. Inherited skin diseases encompass a collection of over 500 clinical entities – with variable structural or inflammatory manifestations that can also affect hair, nails, teeth and certain mucosal surfaces [1]. Individually these disorders are uncommon, but collectively they generate a significant health burden and many diagnostic conundrums.
Traditional approaches to the diagnosis of inherited skin diseases
For patients with inherited skin disorders, the traditional approach to diagnosis is to document a comprehensive patient history, including recording accurate family pedigrees, and noting any consanguinity. The clinician will then go on to perform a physical examination, take clinical photographs, and order laboratory investigations, which often include a skin biopsy. Light microscopy is usually uninformative, and the skin may need to be examined by transmission electron microscopy and immunohistochemistry. Additional blood or urine samples may be needed for further diagnostic biochemical studies. Changes in skin structure or protein expression may provide clues to candidate genes, for which polymerase chain reaction primers can be designed and used for Sanger sequencing of genomic DNA. This ‘candidate gene’ approach has proved very useful for several autosomal recessive inherited skin diseases, but is typically unhelpful in most dominant diseases or in those with more subtle changes in skin morphology. Cue the advent of NGS technologies and a different approach to diagnostics, where the challenge in genetic discovery shifts away from the generation of data, to the filtering of relevant data [2, 3].
The impact of NGS
NGS encompasses a number of new technologies that vary in their sequencing protocols and thus in the type of data produced. The approaches differ in template preparation, sequencing and imaging, and genome alignment and assembly methods. The methodology is also known as high-throughput or massively parallel sequencing, because of the ability of NGS to process large volumes of genetic data in a short time, in stark contrast to individual gene screening with Sanger sequencing. Whole-genome sequencing (WGS) and whole-exome sequencing (WES) are the two most commonly used NGS techniques. WGS can sequence an individual’s entire genome, but at the expense of speed and cost. In contrast, WES uses an array to capture the protein-coding regions of the human genome, encompassing ~21,000 genes, which make up less than 2% of the genome. Compared with the 2–3 million variants generated by WGS, the data from WES typically reveal around 25,000 variants. Nevertheless, WES is a more economical option than WGS because ~85% of the pathogenic mutations in monogenic diseases are predicted to lie in exons. The plethora of data then has to be filtered, and evidence for causality established for any potentially disease-causing variant (Fig. 1). This process often involves filtering variants against databases of previously identified sequences and cross-referencing with known biological or genetic databases, for which considerable bioinformatics support is required: a single WES run can generate one terabyte of data.
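As an illustration of this filtering stage, a few lines of code suffice. The variant records, field names and thresholds below are toy assumptions for demonstration, not those of any real diagnostic pipeline.

```python
# Illustrative sketch of exome variant filtering. Field names, thresholds
# and the example records are hypothetical, not from a real pipeline.

def filter_variants(variants, max_pop_freq=0.01):
    """Keep rare, protein-altering variants not already classified as benign."""
    damaging = {"missense", "nonsense", "frameshift", "splice_site"}
    kept = []
    for v in variants:
        if v["pop_freq"] > max_pop_freq:   # too common in reference populations
            continue
        if v["effect"] not in damaging:    # e.g. synonymous or deep intronic
            continue
        if v.get("known_benign"):          # previously classified variant
            continue
        kept.append(v)
    return kept

raw = [
    {"gene": "COL7A1", "effect": "missense",   "pop_freq": 0.0001},
    {"gene": "ABCA4",  "effect": "synonymous", "pop_freq": 0.0002},
    {"gene": "TTN",    "effect": "missense",   "pop_freq": 0.15},
]
candidates = filter_variants(raw)
print([v["gene"] for v in candidates])  # → ['COL7A1']
```

Real workflows apply many more layers (inheritance-model filtering, in silico pathogenicity scores, segregation in the family), but each is essentially a further pass of the same kind.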
Whole-exome sequencing: the possible advantages
The challenge
The key questions for WES in the diagnosis of inherited skin diseases are as follows. (1) Are the new technologies better than what already exists for diagnosing known diseases? (2) Can the new technologies be helpful in resolving unknown diagnoses or discovering new clinical entities? (3) Can the new technologies be introduced into clinical work and overcome any practical obstacles? Emerging data indicate a resounding yes to the first two questions, although the third remains a work in progress [4].
Breadth of cover
WES encompasses most of the coding regions of the genome, whereas Sanger sequencing targets a predetermined gene, or part of a gene, between specially designed primers. WES is also efficient for sequencing large genes, such as COL7A1, which encodes type VII collagen. This gene, which is mutated in the blistering disease dystrophic epidermolysis bullosa, is composed of 118 exons. Conventional Sanger sequencing approaches are based on designing ~72 primer pairs to amplify the COL7A1 exons and flanking introns. The Sanger sequencing approach is therefore laborious and expensive, particularly as COL7A1 contains few recurrent mutations and the gene needs to be screened in its entirety to identify pathogenic mutations.
Genetic diagnosis
WES has emerged as an invaluable tool where a patient’s clinical diagnosis is unclear or erroneous. In this situation, Sanger sequencing of multiple candidate genes is destined to fail and to exhaust both time and resources. WES, on the other hand, can identify known variants in order to make a genetic diagnosis that was not initially considered, as has been demonstrated for subtypes of epidermolysis bullosa and other inherited skin diseases [5, 6]. Indeed, WES has been used to accurately diagnose inherited skin diseases without any a priori clinical information [7]. The rationale is that the more accurate and timely diagnoses offered by WES will allow for earlier targeted therapy and ultimately improved patient care.
Genetic discovery
The value of WES in genetic discovery is evident in the number of inherited skin diseases whose genetic basis has been uncovered by WES. Recent examples include the discovery of inherited skin and bowel inflammation resulting from mutations in ADAM17 and EGFR [8, 9]. Given the protean nature of inherited skin diseases, many mutations cannot be anticipated based on clinical phenotype and initial investigations, leaving no candidate gene targets for Sanger sequencing. One pertinent example of a completely unexpected candidate gene is the identification of mutations in EXPH5 [10], which encodes a GTPase effector protein, exophilin-5, in a form of intra-epidermal epidermolysis bullosa – a disease that usually arises as a genetic disorder of keratin. WES is therefore superior to Sanger sequencing in the diagnosis of both novel and genetically heterogeneous conditions.
Cost efficiency
The cost of DNA sequencing has reduced by around 100,000-fold over the last 20 years. Although the technique remains relatively expensive at present (~£900 per sample at King’s College London, 2014 prices), further cost reductions are expected that will soon make WES a more economically viable option than Sanger sequencing, for all but a few disorders in which there are recurrent mutations in a small number of small genes. Even at current costs, however, WES already has advantages over Sanger sequencing for some genes, such as COL7A1, for which the cost of Sanger sequencing is ~£1000 (or greater) in the small number of laboratories that undertake sequencing of this gene.
Considering the patient
The diagnosis of many inherited skin disorders often relies on invasive investigations such as sampling a small piece of skin (punch or ellipse biopsy) (Fig. 2). The procedure involves injection of local anaesthetic, which can be painful, and the wound usually heals with a small but evident scar. Occasionally, skin biopsy sites can be complicated by bleeding or infection. WES can be performed using DNA extracted from blood, saliva or tissue samples, and although Sanger sequencing can also be performed on similar templates, for many patients a skin biopsy would have been necessary to determine the gene(s) for sequencing. Thus WES typically offers a less-invasive approach for the patient.
Variant mapping
Aside from discovering genes and pinpointing mutations in inherited skin diseases, WES also generates a huge amount of other data that can be used to map genetic variation. In the longer term, the dissection of bioinformatics data will lead to a better understanding of the implications of certain variants, refining genotype–phenotype correlation, thus providing insight into individual prognosis, and allowing stratified or personalized medicine and therapeutics.
Whole-exome sequencing: the possible disadvantages
Data analysis
The large quantity of sequencing data generated by WES is potentially also a disadvantage. Before WES can be used in routine clinical practice, fast and efficient filtering techniques must exist to allow clinicians and non-geneticists to interpret WES data and to extract the relevant information in order to manage their patient’s needs. But the plethora of data generated by WES also provides considerably more information beyond the pathogenic mutation itself, including several coincidental, potentially damaging mutations (known as ‘incidental findings’) that are completely unconnected to the primary disease being investigated. What should diagnosticians do with this information? Does it make a difference if the implications are clinically actionable or not? There are clearly several unresolved issues.
Accuracy of data
Given the volume of data produced by WES, it is inevitable that some false positive variants are identified. Most laboratories therefore still elect to confirm mutations via an alternative sequencing platform, generally Sanger sequencing, which is therefore a significant barrier to the routine use of WES in diagnostics. From a technical perspective, NGS methods still need to be improved to cover important regulatory elements such as promoters and enhancers, and poorly annotated parts of the genome. Moreover, if WES is to become a routine diagnostic technique, standardized operating procedures and protocols must be created and implemented. For inherited skin disease diagnostics there would also need to be a realignment of technical wet lab skills (skin microscopy) in favour of computer database and in silico work.
Time to diagnosis
Perhaps the biggest challenge for WES, however, lies in the time it takes to process and analyse a case. For many inherited skin diseases, a rapid diagnosis is often very important to optimize clinical management, for example in neonates with suspected epidermolysis bullosa. The diagnostic approach using skin biopsy assessment followed by Sanger sequencing of candidate genes (implicated by skin biopsy) allows for possible diagnoses to be made within 2 to 3 days. In contrast, the quickest time that WES could be completed (at present) would be a minimum of 5 days, although in practice WES often takes considerably longer to complete and analyse. New platforms to shorten WES protocols are in development, but only when more rapid sample analysis is feasible in a diagnostic lab setting can one really begin to think about wholesale change of diagnostic practice.
Conclusion
Since 2011, WES has proven to be a valuable asset in the diagnosis and discovery of inherited skin diseases, but its adoption into clinical diagnostics is still being refined and piloted. WES techniques are constantly being improved to become more accurate, quicker and cost-effective, while enrichment methodologies and sequencing technology become more reproducible and standardized. This progress may allow WES to function as a stand-alone diagnostic and discovery tool in genetics, negating the need for Sanger sequencing to confirm WES findings. However, as our understanding of the role of non-coding DNA in molecular biology grows, and as WGS is further refined, WES is at risk of being superseded by newer NGS techniques for genetic discovery, diagnostics and prognostics. Innovation looms, as it ever has in molecular genetics.
References
1. Leech SN, Moss C. Br J Dermatol. 2007; 156: 1115–1148.
2. Metzker ML. Genome Res. 2005; 15: 1767–1776.
3. Metzker ML. Nat Rev Genet. 2010; 11: 31–46.
4. Cho RJ, et al. J Invest Dermatol. 2012; 132(E1): E27–28.
5. Takeichi T, et al. Br J Dermatol. 2014; doi: 10.1111/bjd.13190. [Epub ahead of print]
6. Salam A, et al. Matrix Biol. 2013; 33: 35–40.
7. Takeichi T, et al. Exp Dermatol. 2013; 22: 825–831.
8. Blaydon DC, et al. N Engl J Med. 2011; 365: 1502–1508.
9. Campbell P, et al. J Invest Dermatol. 2014; doi: 10.1038/jid.2014.164. [Epub ahead of print].
10. McGrath JA, et al. Am J Hum Genet. 2012; 91: 1115–1121.
The authors
John Lee, Amr Salam BSc, MBChB, MRCP(UK), Takuya Takeichi MD PhD, John A McGrath* MD FRCP
St John’s Institute of Dermatology, King’s College London (Guy’s Campus), London, UK.
*Corresponding author
E-mail: john.mccgrath@kcl.ac.uk
Viral hepatitis, the silent epidemic
Globally, as many people (1.5 million) die each year from viral hepatitis as from HIV/AIDS, but whereas the latter viral disease attracts government and international action and funding, the former is comparatively neglected. It was for this reason that the WHO initiated World Hepatitis Day four years ago, to be observed on 28 July each year, and the lack of awareness about the repercussions of viral hepatitis was reflected in this year’s theme of ‘Hepatitis: think again’. So far five hepatitis viruses have been identified, though Hepatitis D is only found as a co-infection with B. Whilst the acute infections that food- and water-borne Hepatitis A and E cause are not insignificant in terms of their incidence, morbidity and mortality, it is Hepatitis B and C (HBV and HCV) that are generating a global public health crisis.
These two viral infections have major characteristics in common with HIV/AIDS. The acute infection, acquired by exposure to infectious blood and other body fluids as well as by sexual and vertical transmission, is frequently asymptomatic in the case of HCV. Acute infections can be followed by a period of clinical latency and thus the unwitting transmission of the virus to others. Though such chronic infections with HBV are very uncommon in healthy adults, they occur in over half of young children infected; 75–85% of people infected with HCV develop a chronic infection. After years or even several decades of chronic, asymptomatic infection, cirrhosis of the liver and hepatocellular carcinoma can result. The WHO estimates that there are around 780,000 deaths from acute and chronic HBV infection, and more than 350,000 from chronic HCV infection annually. Even more alarming is that currently 500 million people are chronically infected with either HBV or HCV.
As is the case with HIV/AIDS, avoiding exposure to infectious blood and semen and diagnostic testing of asymptomatic people can help to contain the global viral hepatitis epidemic. However, now that the pertinent characteristics of the disease have been elucidated, it should be far more feasible to control viral hepatitis than HIV/AIDS, a disease for which there is no vaccine and no drugs that actually eradicate the virus. There is a highly effective vaccine for HBV, and although approved drugs help prevent serious liver damage, they do not eliminate the virus. Drugs are now available that can eradicate HCV, and clinical trials are currently testing a vaccine for chronically infected people.
“Hepatitis: think again”. With appropriate education and adequate national and international funding, this looming global health crisis could be averted.
Liquid chromatography-tandem mass spectrometry: an introduction
The use of liquid chromatography-tandem mass spectrometry for clinical analysis is on the increase. This article describes what it is, why it can offer significant improvements over traditional assays and the limitations to be aware of.
by Dr N. Homer
Introduction
Clinical biochemistry laboratories frequently use radioimmunoassays (RIA) and enzyme-linked immunosorbent assays (ELISA) for analysis of blood and urine. However, these techniques are plagued by issues of cross-reactivity and are only suited to measuring one analyte at a time [1]. The use of mass spectrometry (MS) techniques has increased since 2007, when the American Endocrine Society recognized the importance of tandem MS and issued a statement recommending the use of liquid chromatography-tandem mass spectrometry (LC-MS/MS) for the determination of endogenous steroid hormones over more traditional technologies such as immunoassays [2]. This has led to the widespread adoption of LC-MS/MS in clinical biochemistry laboratories.
What is liquid chromatography-mass spectrometry?
Mass spectrometry is a technique that measures charged molecules or ions in the gaseous state. Samples are introduced into an ion source, ionized and then separated in a mass analyser according to their mass-to-charge ratio (m/z) and then characterized by their relative abundances. Coupled to chromatographic separation techniques such as gas chromatography (GC) or liquid chromatography (LC), MS is considered to be the ‘gold standard’ for validation of quantitative analytical assays. An overview of how a typical chromatograph-mass spectrometer is set up is shown in Figure 1.
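The mass-to-charge ratio itself is simple arithmetic. As a minimal sketch, assuming positive-mode ionization by protonation, the m/z of an [M+zH]z+ ion is the neutral mass plus z proton masses, divided by z:

```python
PROTON = 1.00728  # mass of a proton, Da

def mz(neutral_mass, charge):
    """m/z of a positive ion formed by protonation: [M + zH]^z+."""
    return (neutral_mass + charge * PROTON) / charge

# Cortisol has a monoisotopic mass of ~362.21 Da
print(round(mz(362.21, 1), 2))  # [M+H]+  → 363.22
print(round(mz(362.21, 2), 2))  # [M+2H]2+ → 182.11
```

For small molecules such as steroids the singly protonated ion dominates; multiply charged ions matter mainly for peptides and proteins.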
Following separation by a chromatography system the sample is introduced into an ion source at the front end of the mass spectrometer. Ionization modes include atmospheric pressure ionization (API), such as electrospray ionization (ESI) and atmospheric pressure chemical ionization (APCI), and matrix-assisted laser desorption/ionization (MALDI). ESI is most typically used to ionize the biomolecules encountered in clinical samples.
Once ionized the mass analyser separates the ions according to their m/z. Mass analysers include magnetic or electric sectors, time-of-flight (ToF) tubes, quadrupoles and two-dimensional and three-dimensional ion traps. Softer ionization techniques, which generally leave the molecule intact, such as ESI, have led to the use of quadrupole mass analysers. These consist of four parallel rods or poles, generally of hyperbolic cross-section, through which ions are passed and separated.
Tandem mass spectrometry (often termed MS/MS) technology increases the specificity of MS significantly. There are a number of modes that a tandem mass spectrometer can be operated under, depending on the requirement of the experiment (Fig. 2). Tandem MS requires two or more mass analysers to be placed in sequence and the ions are fragmented in a collision cell to give structural information. Trace analysis of complex biological matrices is ideally suited to tandem MS instruments, operated in selected reaction monitoring (SRM) mode. In addition, linear ion traps as the third mass analyser are also increasing in popularity as they offer additional structural identification and specificity.
Sample preparation and liquid chromatography method development
Clinical samples are complex biological matrices and contain interferences that can lead to so-called matrix effects within the mass spectrometer. For validated assays, samples are prepared by addition of an internal standard followed by extraction to remove as much of the interferences as possible. The internal standard is either a closely related analogue of the compound of interest, or a stable isotope labelled version of the compound, enriched with at least two atoms of 13C, 2H or 15N.
Sample preparation methods commonly applied to clinical samples include protein precipitation with an organic solvent, liquid–liquid extraction (LLE) or solid-phase extraction (SPE). If sample clean-up is not sufficient it can lead to matrix effects, including ion suppression of the analyte, usually observed as a loss of response. This affects the detection limits, accuracy and precision of the assay. Various ion suppression tests have been developed and these are an important part of the method validation set-up required for clinical MS assays. The two most effective ways of avoiding ion suppression are improved sample extraction and optimized chromatographic selectivity.
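Once an internal standard is in place, quantification typically reduces to a response-ratio calculation against a linear calibration, which also compensates for moderate ion suppression affecting analyte and internal standard equally. The peak areas and calibration parameters below are hypothetical:

```python
def concentration(analyte_area, istd_area, slope, intercept=0.0):
    """Back-calculate concentration from the analyte/internal-standard
    peak-area ratio, using a linear calibration: ratio = slope*conc + intercept."""
    ratio = analyte_area / istd_area
    return (ratio - intercept) / slope

# Hypothetical calibration: ratio = 0.02 * concentration (ng/mL)
print(concentration(analyte_area=15000, istd_area=50000, slope=0.02))  # → 15.0
```

Because the stable-isotope internal standard co-elutes with the analyte and experiences the same matrix, the ratio is far more robust than the raw analyte peak area alone.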
On-line multidimensional chromatography technology allows an unextracted sample to be introduced into the chromatography apparatus and can lead to faster analysis. These systems generally consist of multi-channel switching valves, on-line SPE cartridges and analytical columns ahead of ESI-LC-MS/MS. Steroids and isoprostanes are often analysed in this manner [3, 4].
Liquid chromatography
Once prepared, a sample is introduced into the LC system which consists of a pump and an analytical column. The purpose of the chromatography system is to separate the components of the sample as much as possible, before introduction into the mass spectrometer. Analytical LC columns are stainless steel tubes that are packed with tiny silica beads. The type of LC used in clinical analysis is usually reversed-phase chromatography as the silica beads are generally chemically modified. Typically, samples are introduced onto the column in a highly aqueous phase, the analytes associate with the chemically-modified packed silica beads and are washed off the column with a high organic solvent such as methanol or acetonitrile.
Once the ionization and mass spectrometer parameters have been optimized, much of the method development falls to the chromatography, and the importance of this stage should not be underestimated. It is imperative that co-eluting compounds do not interfere with the analytical peaks of interest. In recent years there has been a trend towards fast analysis in LC-MS/MS; however, this does not always give a robust assay. In addition, it is important to be aware of isobaric compounds (same mass) and [M+2] isotope peaks. An example is the stress hormone cortisol (m/z 363) and its inactive form cortisone (m/z 361). In an LC-MS/MS assay for cortisol it is essential to have two separate peaks for cortisol and cortisone; otherwise the [M+2] isotope peaks of cortisone would contribute to the cortisol signal, leading to an over-estimation of cortisol in the sample (Fig. 3).
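This interference can be anticipated arithmetically: at the unit resolution of a typical quadrupole, the [M+2] isotope peak of one analyte will overlap any precursor two mass units higher. A minimal sketch, with an assumed selection window:

```python
def m_plus_2_overlaps(mz_a, mz_b, tolerance=0.5):
    """Does the [M+2] isotope peak of compound A fall within the
    unit-resolution selection window around compound B's precursor m/z?
    The tolerance of +/-0.5 is an assumed quadrupole window width."""
    return abs((mz_a + 2.0) - mz_b) <= tolerance

# cortisone [M+H]+ ~361, cortisol [M+H]+ ~363
print(m_plus_2_overlaps(361.0, 363.0))  # → True: must be separated chromatographically
```

A check like this over the full analyte panel flags which pairs genuinely require baseline chromatographic separation.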
There are a number of parameters that can be altered in LC, and these in turn alter the selectivity of the column, that is, the order and rate at which the components elute. Parameters that can be adjusted include column temperature; mobile phase pH, composition and flow rate; column dimensions; particle size; and, of course, the nature of the chemical modification of the particles.
Analytical LC column technology is continuously improving. The better the resolution – which is simply how well separated each peak is – the better the assay. Sub-2 µm particles have been introduced in the past decade, which generate sharp peaks and excellent resolution with improved capacity over the more traditional 3–5 µm particles. However, the smaller particle size leads to high backpressure and requires specific LC pumps that can withstand these ultra-high pressures (UHPLC, ultra-high performance LC). To reduce the need for new instrumentation, LC columns packed with ~2.5 µm fused-core particles have been developed, allowing separation comparable to sub-2 µm particles. The backpressure generated by these fused-core particles is significantly less than that of sub-2 µm particle columns, eliminating the requirement for high-pressure-capable LC pumps and fittings.
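The backpressure penalty of smaller particles can be estimated from the familiar scaling for packed beds: at fixed column length and flow rate, the pressure drop grows roughly as the inverse square of particle diameter (a Kozeny–Carman approximation, so the figures below are rough guides only):

```python
def relative_backpressure(dp_new_um, dp_ref_um):
    """Approximate backpressure of a column with particle diameter dp_new_um,
    relative to a reference column of dp_ref_um, at fixed length and flow rate.
    Packed-bed pressure drop scales ~1/dp^2 (Kozeny-Carman approximation)."""
    return (dp_ref_um / dp_new_um) ** 2

print(round(relative_backpressure(1.7, 5.0), 1))  # sub-2 um vs 5 um → ~8.7x
print(round(relative_backpressure(2.5, 5.0), 1))  # fused-core ~2.5 um → ~4.0x
```

This is why sub-2 µm columns demand UHPLC hardware while ~2.5 µm fused-core columns typically stay within conventional pump pressure limits.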
Considerations when establishing an LC-MS/MS clinical biochemistry method
As with all techniques, there are drawbacks to LC-MS/MS. The instrumentation and software can be complex and require regular maintenance, although manufacturers are addressing this perception by introducing simpler software interfaces with dedicated instrument support for method development, and even fool-proof methods guaranteed by the provider. Also, some compounds are not amenable to ionization due to their chemical nature, but chemical modification before analysis can improve ionization efficiency, so all is not lost.
Summary
The benefits that MS offers over other traditional assay techniques have seen an increase in the number of assays using this methodology. The analysis of steroid hormones by MS is a well-documented area. Other commonly encountered uses include newborn screening for congenital metabolic diseases such as aminoacidopathies and fatty acid oxidation disorders, multi-analyte therapeutic drug monitoring, oncology drugs, anti-virals, toxicants and drugs of abuse screening and analysis of endogenous peptides [3, 4, 5].
One area that is continuing to gain interest in clinical research is high-resolution MS (HRMS) [5]. This allows for accurate mass determination over a defined mass range, which differs from the targeted analysis approach used by triple quadrupole MS. With technological improvement in the linear range of HRMS instruments to match that of triple quadrupoles, it seems likely that the benefits of HRMS will also be exploited by the clinical biochemistry field, in addition to LC-MS/MS analysis.
The range of clinical applications of MS outlined is broad and constantly expanding. Much research is being conducted in the pioneering fields of proteomics and metabolomics. In recent years the emergence of imaging mass spectrometry also offers exciting possibilities for the future and there is no doubt that MS will continue to feature heavily in the clinical biochemistry laboratory and function as an important clinical research tool.
References
1. Penning TM, et al. Liquid chromatography-mass spectrometry (LC-MS) of steroid hormone metabolites and its applications. J Ster Biochem Mol Biol. 2010; 121: 546–555.
2. Rosner W, et al. Position statement: utility, limitations, and pitfalls in measuring testosterone: an Endocrine Society position statement. J Clin Endocrinol Metab. 2007; 92: 405–413.
3. Shushan B. A review of clinical diagnostic applications of liquid chromatography-tandem mass spectrometry. Mass Spectrom Rev. 2010; 29: 930–944.
4. Chace DH, et al. Use of tandem mass spectrometry for multianalyte screening of dried blood specimens from newborns. Clin Chem. 2003; 49: 1797–1817.
5. Jiwan J-LH, et al. HPLC-high resolution mass spectrometry in clinical laboratory? Clin Biochem. 2011; 44: 136–147.
The author
Natalie Homer PhD
CRF Mass Spectrometry Core Laboratory, Queen’s Medical Research Institute, University of Edinburgh
E-mail: n.z.m.homer@ed.ac.uk
Measuring cigarette smoke exposure: quantification of cotinine by LC-MS/MS
Smoking is a major cause of morbidity and mortality worldwide. The adverse health effects of chronic cigarette smoke exposure are widely known. Active smoking increases the risk of developing several pathologies including pulmonary disease, cardiovascular disease and cancer. Importantly, the sequelae of smoking also extend to non-smokers via frequent passive inhalation. Accurate measures of cigarette smoke exposure are therefore required to draw meaningful conclusions about the healthcare risks to both smokers and non-smokers. Cotinine is the major primary metabolite of nicotine and is the biochemical marker of choice for measuring exposure to cigarette smoke.
by Dr A. Dunlop, Dr B. L. Croal and J. Allison
Background
Chronic exposure to tobacco products is amongst the leading causes of preventable morbidity and mortality worldwide, being responsible for approximately 6 million deaths per annum [1]. Typically this involves inhalation of cigarette smoke which contains in excess of 5000 different chemicals; many of these are known toxins and carcinogens [2]. Upon inhalation of cigarette smoke, nicotine is transported to the lungs within tar droplets, dissolving in the alveolar fluid, and is then absorbed into the bloodstream. Following entry into the pulmonary circulation, nicotine quickly travels to the brain – within a matter of seconds – and exerts its pharmacological effects [3].
Nicotine is the addictive component of tobacco products, stimulating dopamine release in the brain and leading to heightened feelings of pleasure and reward [4]. In active smokers this nicotine dependence sustains chronic exposure to the toxins present in cigarette smoke [5]. Active smokers are therefore at increased risk of developing multiple pathologies including pulmonary disease, cardiovascular disease and cancer [6, 7]. Importantly, non-smokers are also at increased risk via involuntary or passive/second-hand smoke (SHS) exposure [8, 9]. Children are particularly susceptible to involuntary exposure, mainly occurring in enclosed spaces such as the parental home/car, via maternal smoking or passive exposure during pregnancy [10]. The adverse health effects of SHS exposure in children include increased risk of miscarriage, sudden infant death syndrome, lower respiratory tract infections, asthma and invasive meningococcal disease [10].
In addition, an emerging area of interest surrounds involuntary exposure via so-called third-hand smoke (THS). THS is a term used to describe the deposits of tobacco smoke that accumulate on surfaces, objects and in dust particles, persisting long after the dispersal of cigarette smoke. There is some evidence to suggest that atmospheric reactions may lead to re-release of smoke-derived toxins into the environment [11]. However, the health risks of THS are not yet known and remain the subject of ongoing research [12].
Assessing cigarette smoke exposure
The healthcare risks associated with cigarette smoking and SHS exposure mean that smoking status should always be included in any routine clinical assessment. Monitoring of smoking status may also be indicated in specific circumstances, such as epidemiological studies, smoking cessation programmes, lung transplant patients, and employee and health/life insurance screening. The most convenient and cost-effective means of assessing cigarette smoke exposure is by self-report, either during face-to-face consultation with healthcare professionals or as part of a generic healthcare questionnaire. However, self-report is frequently unreliable in estimating smoking status [13].
Moreover, the risk and extent of SHS exposure to non-smokers cannot be adequately assessed using these methods. For example, self-report cannot reliably quantify exposure in those who co-habit and/or socialise with smokers nor can it inform on fetal exposure in maternal smoking. Consequently, cigarette smoke exposure should be accurately quantified by measuring biomarkers to draw meaningful conclusions between smoking status and health outcomes [14, 15].
Biomarkers of cigarette smoke exposure
Numerous biomarkers have been examined in the analysis of cigarette smoke exposure, e.g. carbon monoxide, carboxyhaemoglobin, thiocyanate and polycyclic aromatic hydrocarbons [4]. However, many are non-specific for tobacco use, and contribution from other environmental or dietary sources can cause interference [4]. In contrast, nicotine is a more specific marker of cigarette smoke exposure, being derived solely from tobacco [3]. Biochemical measurements of nicotine and its metabolites are therefore typically used to provide reliable measures of cigarette smoke exposure. Nicotine largely undergoes hepatic metabolism (with a half-life of approximately 2 h) and the plasma of active smokers typically contains 10–50 ng/mL of nicotine [3]. Cotinine is the major breakdown product of nicotine, accounting for around 80% of all metabolites [3]. The half-life of cotinine, at around 16 h, is substantially longer than that of nicotine, and plasma levels in active smokers are approximately 250–300 ng/mL [4]. Consequently, cotinine is the preferred biomarker for measuring cigarette smoke exposure.
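The practical consequence of the half-life difference can be shown with simple first-order decay arithmetic; the starting concentrations below are taken from the typical ranges quoted above:

```python
def remaining(c0, half_life_h, hours):
    """Concentration after `hours` of first-order elimination."""
    return c0 * 0.5 ** (hours / half_life_h)

# 24 h after the last cigarette, for typical active-smoker starting levels:
print(round(remaining(30, 2, 24), 3))    # nicotine  (t1/2 ~2 h)  → 0.007 ng/mL
print(round(remaining(275, 16, 24), 1))  # cotinine (t1/2 ~16 h) → 97.2 ng/mL
```

A day after exposure, nicotine has fallen far below practical detection limits while cotinine remains comfortably measurable, which is why cotinine is the marker of choice.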
Quantifying cotinine in biological matrices
A variety of methods have been developed for quantification of cotinine in several biological matrices including urine, blood, saliva and hair [14, 15]. There is good agreement between cotinine levels in plasma/serum and saliva, whilst levels in urine are typically higher [15].
Immunoassay methods have traditionally been used for the detection of cotinine in urine, offering rapid turnaround with minimal sample preparation. In addition, commercially available immunoassay kits are easily integrated into most core automated analysers found in modern clinical laboratories. However, reagent costs are typically high, and immunoassays are susceptible to cross-reactivity with other nicotine- and cotinine-derived metabolites, which can compromise accuracy [16, 17].
Gas chromatography–mass spectrometry (GC-MS) methods are also available, although sample preparation is typically labour-intensive and time-consuming, making them impractical for high sample throughput. Not surprisingly, liquid chromatography–tandem mass spectrometry (LC-MS/MS) methods have emerged as the method of choice for quantification of cotinine in biological fluids.
LC-MS/MS analyses
Liquid chromatography–tandem mass spectrometry (LC-MS/MS) affords the requisite specificity and sensitivity to detect and quantify cotinine at levels encountered throughout the spectrum of cigarette smoke exposure. The majority of recently published methods routinely quote lower limits of quantification (LLOQ) below 0.5 ng/mL in both plasma/serum and urine [15]. Cut-points to distinguish smokers from non-smokers have been variously proposed, from 12 ng/mL down to 3 ng/mL depending on the population [15]. Nevertheless, regular active smokers can be expected to have serum/plasma cotinine levels well in excess of 100 ng/mL, whereas non-smokers are usually comfortably below 10 ng/mL. The majority of LC-MS/MS methods for cotinine have been developed in-house, an important advantage compared with immunoassay techniques. This not only affords flexibility in the choice of matrix to be analysed but also permits the inclusion of more than one analyte in the assay. Thus nicotine, cotinine and various metabolites thereof may be detected in a multiplexed assay. Published guidelines are also widely available to assist in the development and validation of LC-MS/MS methods [18]. The advent of enhanced chromatographic separation techniques, such as ultra-performance liquid chromatography (UPLC), has significantly shortened run times, thereby facilitating higher sample throughput. The development of uncomplicated sample preparation procedures has further simplified analyses. For example, in our own laboratory we recently developed a rapid and straightforward UPLC-MS/MS protocol for the determination of cotinine in plasma (Fig. 1) [19]. Analytical run time was 4 min per sample with an LLOQ of 0.2 ng/mL, and the assay was linear from 0.5 to 1000 ng/mL, comfortably covering the concentration range of active and non-smokers (Fig. 2).
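The quantification step behind any such linear assay is a straight-line calibration followed by back-calculation of unknowns. The sketch below illustrates the arithmetic only; the standards and detector responses are invented for illustration and are not taken from our published method:

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit of y = slope * x + intercept."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    return slope, intercept

# Hypothetical calibration standards (ng/mL) and detector responses.
standards = [0.5, 5, 50, 500, 1000]
responses = [0.012, 0.10, 1.01, 9.98, 20.1]

slope, intercept = fit_line(standards, responses)

def concentration(response):
    """Back-calculate a concentration from a measured detector response."""
    return (response - intercept) / slope
```

In practice an internal standard (typically deuterated cotinine) normalizes the response before this step, but the calibration arithmetic is the same.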
A simple five-step automated SPE process was also developed, permitting minimal sample handling and using only water and methanol, both cheap and readily available. To date we have successfully deployed this method for the analysis of two large patient cohorts (each comprising several hundred samples) in independent epidemiological studies. Although the initial outlay for equipment is high, LC-MS/MS assays can thereafter be run relatively cheaply using readily available, inexpensive solvents. Furthermore, sample preparation procedures can usually be streamlined and therefore easily adapted for high-throughput analyses [19]. Matrix effects, chiefly ion suppression, are a particular disadvantage of LC-MS/MS techniques; however, careful consideration and troubleshooting during method development can often overcome this issue [20].
Conclusions and future directions
Despite widespread awareness of the adverse effects of tobacco use and increasing public health initiatives to combat this, cigarette smoking continues to be a major global cause of morbidity and mortality and is likely to remain so for the foreseeable future. Accurate quantification of cigarette smoke exposure via biomarkers is therefore an important measure in stratifying the risk of both active and non-smokers.
The need to quantify ever-decreasing amounts of nicotine, cotinine and their metabolites in monitoring exposure to tobacco products ensures that LC-MS/MS techniques, and modifications thereof, remain at the forefront of detection methods in this field. Similarly, as new biomarkers that inform on the detrimental health effects of smoking become available, these methods are ideally placed to keep pace, both in research and in clinical laboratories.
The recent emergence of electronic cigarette devices (e-cigarettes) is currently the subject of much debate. E-cigarettes typically deliver nicotine in a vapour generated by heating a liquid that also contains propylene glycol and other additives, e.g. flavouring [21]. Proponents present e-cigarettes as a safer alternative to smoking tobacco and promote their benefits for smoking cessation. However, some healthcare professionals believe that, even if e-cigarettes are safer, they may still act as a gateway to smoking or as a way of prolonging or even enhancing dependency on nicotine. In addition, the long-term health effects of these products are unknown, as is the need to monitor biomarkers such as nicotine and/or cotinine in so-called ‘e-smokers’.
References
1. WHO report on the global tobacco epidemic 2013. http://www.who.int/tobacco/global_report/2013/en/
2. Talhout R, et al. Hazardous compounds in tobacco smoke. Int J Environ Res Public Health 2011; 8(2): 613–628.
3. Hukkanen J, et al. Metabolism and disposition kinetics of nicotine. Pharmacol Rev. 2005; 57(1): 79–115.
4. Benowitz NL, et al. Nicotine chemistry, metabolism, kinetics and biomarkers. Handb Exp Pharmacol. 2009; 192(192): 29–60.
5. Berrendero F, et al. Neurobiological mechanisms involved in nicotine dependence and reward: participation of the endogenous opioid system. Neurosci Biobehav Rev. 2010; 35(2): 220–231.
6. Doll R, et al. Mortality in relation to smoking: 50 years’ observations on male British doctors. BMJ 2004; 328(7455): 1519.
7. Jha P. Avoidable global cancer deaths and total deaths from smoking. Nat Rev Cancer 2009; 9(9): 655–664.
8. Scientific Committee on Tobacco and Health. Secondhand smoke: Review of evidence since 1998. Update of evidence on health effects of secondhand smoke. Department of Health, UK 2004. http://www.smokefreeengland.co.uk/files/scoth_secondhandsmoke.pdf
9. Vardavas CI, Panagiotakos DB. The causal relationship between passive smoking and inflammation on the development of cardiovascular disease: a review of the evidence. Inflamm Allergy Drug Targets 2009; 8(5): 328–333.
10. Action on Smoking and Health. Research Report. Secondhand smoke: the impact on children. March 2014. http://www.ash.org.uk/files/documents/ASH_596.pdf
11. Sleiman M, et al. Formation of carcinogens indoors by surface-mediated reactions of nicotine with nitrous acid, leading to potential thirdhand smoke hazards. PNAS 2010; 107(15): 6576–6581.
12. Matt GE, et al. Thirdhand tobacco smoke: emerging evidence and arguments for a multidisciplinary research agenda. Environ Health Perspect. 2011; 119(9): 1218–1226.
13. Connor Gorber S, et al. The accuracy of self-reported smoking: a systematic review of the relationship between self-reported and cotinine-assessed smoking status. Nicotine Tob Res. 2009; 11(1): 12–24.
14. Florescu A, et al. Methods for quantification of exposure to cigarette smoking and environmental tobacco smoke: focus on developmental toxicology. Ther Drug Monit. 2009; 31(1): 14–30.
15. Avila-Tang E, et al. Assessing secondhand smoke using biological markers. Tob Control 2013; 22: 164–171.
16. Schepers G, Walk RA. Cotinine determination by immunoassays may be influenced by other nicotine metabolites. Arch Toxicol. 1988; 62(5): 395–397.
17. Tate J, Ward G. Interferences in immunoassay. Clin Biochem Rev. 2004; 25(2): 105–120.
18. Honour JW. Development and validation of a quantitative assay based on tandem mass spectrometry. Ann Clin Biochem. 2011; 48(2): 97–111.
19. Dunlop AJ, et al. Determination of cotinine by LC-MS-MS with automated solid-phase extraction. J Chromatogr Sci. 2014; 52(4): 351–356.
20. Matuszewski BK, et al. Strategies for the assessment of matrix effect in quantitative bioanalytical methods based on HPLC-MS/MS. Anal Chem. 2003; 75(13): 3019–3030.
21. Grana R, et al. E-cigarettes: a scientific review. Circulation 2014; 129(19): 1972–1986.
The authors
Allan Dunlop1* PhD, Bernard Croal2 MD and James Allison2 BSc
1Department of Clinical Biochemistry Laboratory, Southern General Hospital, Glasgow G51 4TF, UK
2Department of Clinical Biochemistry, Aberdeen Royal Infirmary, Aberdeen AB25 2ZD, UK
*Corresponding author
E-mail: allandunlop@nhs.net
Next-generation sequencing in clinical diagnostics and genomics research
The UK prime minister recently announced an investment package worth £300 million for genomic research. This will include the sequencing of 100,000 genomes by 2017. The project, driven by Genomics England, will have a major impact on many areas of healthcare. Next-generation sequencing (NGS) technology is the method by which this sequencing will be achieved. NGS is currently being used in many healthcare services.
by Dr K. Gilmour
Background
Sequencing of the first human genome took over 10 years to complete at a cost of around USD 3 billion. Although genomics has been recognized and hailed as the future of medicine, the costs associated with sequencing were considered prohibitive. Scientists proposed that large-scale projects would be required to decipher the secrets within each genome and how they interconnect with disease susceptibility, progression and treatment. In 2005 next-generation sequencing (NGS) became commercially available and in the 9 years since has transformed genomics beyond all recognition. Large-scale projects are now financially feasible and the potential of genomics and its link with healthcare can finally be realized.
Different NGS technologies are commercially available, with Illumina and Ion Torrent™ (Life Technologies) probably considered the market leaders. Some NGS instruments can generate a terabase of sequence data in a single run. This equates to around 500 human genomes a week, each costing close to USD 1000 in reagents, a figure long hailed as the ultimate goal. NGS is faster, more sensitive and far higher in throughput than traditional Sanger sequencing and will contribute directly to improvements in diagnostic medicine, personalized medicine and medical research.
An overview of NGS technology
The details of the NGS workflow differ from technology to technology but the main principle remains the same. Extracted DNA from human, animal or microbial sources is turned into a ‘library’ of DNA. This usually involves making the large pieces of DNA smaller (fragmenting) and then adding special handles known as ‘adapter DNA’ to the ends of each of the DNA fragments (Fig. 1). Adapters are merely small pieces of DNA of known sequence, which can be used to manipulate the fragments of DNA in order to sequence them. This manipulation includes tethering the individual fragments to either a slide or a tiny bead, on which each fragment is clonally amplified to produce millions of DNA molecules all of the same sequence. The whole library of different clonally amplified fragments is then sequenced simultaneously. NGS sequencing chemistry produces a detectable ‘signal’. This signal is often fluorescent, so each time a single nucleotide (A, G, C or T) is incorporated into a DNA molecule a tiny amount of light is emitted and detected. The individual sequence produced is known as a ‘read’ and once the millions of short reads in the reaction have been generated they are aligned and assembled via computer algorithms into much longer sequences. Because millions of reads are generated, even molecules of low abundance can be sampled, making this technique extremely sensitive. Large sequencers able to generate hundreds of human genome sequences a week can be used in high-throughput research projects. Small, fast bench-top sequencers are also available and are highly suited to the demands of a clinical laboratory.
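The align-and-assemble step described above can be caricatured in a few lines. The toy sketch below uses naive exact matching in place of a real aligner, and the reference and reads are invented; it simply shows how overlapping reads are piled up and a consensus base called at each position:

```python
from collections import Counter

def align_and_call(reference, reads):
    """Place each read at its exact-match position in the reference,
    then call a consensus base wherever reads overlap (a toy version
    of the align-and-assemble step; real aligners tolerate mismatches)."""
    piles = [Counter() for _ in reference]
    for read in reads:
        pos = reference.find(read)        # naive exact-match 'alignment'
        if pos == -1:
            continue                      # unmapped read is discarded
        for i, base in enumerate(read):
            piles[pos + i][base] += 1
    # 'N' marks positions with no read coverage.
    return "".join(pile.most_common(1)[0][0] if pile else "N"
                   for pile in piles)

reference = "ACGTACGGTCA"
reads = ["ACGTA", "TACGG", "GGTCA"]
consensus = align_and_call(reference, reads)
```

In a real pipeline the per-position pile-up also carries base qualities, which feed the variant-calling statistics discussed later.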
Human genomics
Identifying the genes involved in rare disorders can help doctors to diagnose and understand the underlying cause and nature of the disease and in turn determine what treatment a patient requires. Genomics offers a global look at all genes and how they interact, instead of focusing on specific genes and biochemical pathways. Sequencing the exomes (the protein-coding parts of the genome) of only a few people with a rare genetic disorder can locate the mutated gene involved [1]. Genome-wide association studies (GWAS) are also allowing researchers to identify genes associated with many common diseases, helping to predict how likely people are to develop specific diseases, such as Parkinson’s disease, in their lifetime [2].
NGS in non-invasive prenatal diagnosis
The sensitivity of NGS makes it ideal for non-invasive prenatal diagnosis of fetal aneuploidies. Maternal blood often contains cell-free fetal DNA at very low concentrations. NGS can be used to pick up anomalies in this DNA and so a simple blood test can replace invasive techniques [3].
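One widely described counting approach to aneuploidy detection compares the fraction of reads mapping to the chromosome of interest against a reference set of known-euploid pregnancies. The sketch below is a deliberately simplified caricature of that idea: all read fractions are invented, and real pipelines add GC correction and other refinements:

```python
def aneuploidy_z_score(chr_fraction, euploid_fractions):
    """Z-score of a test sample's chromosome-21 read fraction against
    a reference set of euploid pregnancies (illustrative method only)."""
    n = len(euploid_fractions)
    mean = sum(euploid_fractions) / n
    var = sum((f - mean) ** 2 for f in euploid_fractions) / (n - 1)
    return (chr_fraction - mean) / var ** 0.5

# Invented chromosome-21 read fractions for six euploid reference samples.
reference_set = [0.0130, 0.0131, 0.0129, 0.0132, 0.0130, 0.0128]

# A trisomic pregnancy shifts the fraction upwards because of the
# extra fetal chromosome 21 material in maternal plasma.
z = aneuploidy_z_score(0.0140, reference_set)
```

A z-score above roughly 3 is the commonly quoted threshold for flagging a sample for confirmatory testing.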
Personalized medicine
The ability to stratify patient responses to drugs based on an individual’s genetic make-up has revolutionized how drug trials are performed and the speed at which new drugs reach the manufacturing stage. In cancer medicine, determining the genetic profile of a patient’s tumour can predict which drugs the tumour is likely to respond to, reducing the likelihood of exposure to a drug with severe side effects and no clinical benefit [4]. Currently, tumours of many cancer types are regularly tested for individual gene mutations, the results of which determine the treatment. As research reveals further biomarkers of drug response, multiple genes will need to be tested. It is no longer cost-effective to test for each of these biomarkers individually, and NGS offers the ability to sequence all or part of the tumour genome. The sensitivity of NGS allows mutations to be detected in tissue that contains only a small number of tumour cells. In most hospitals tumour tissue is formalin fixed and paraffin embedded (FFPE) before being sectioned and mounted on slides for histopathology review. This process can often lead to DNA damage, including fragmentation, rendering the DNA useless for some molecular techniques. As NGS relies on short DNA fragments, FFPE-extracted DNA can still be used [5].
NGS in microbiology
In order to prescribe the correct anti-retroviral drugs, the resistance genes of the HIV strain a patient carries are often sequenced. Sanger sequencing requires a drug-resistance mutation to be present in around 20% of the viral population before it can be detected. ‘Deep sequencing’, in which the genome is sequenced many times over using NGS, can detect resistance genes present in less than 1% of the viral population [6]. Outbreaks of dangerous Escherichia coli strains can now be detected early, and their spread prevented, because of the speed at which sequencing and reconstruction of the relationships between isolated strains can be achieved [7]. NGS continues to grow as the technology of choice in microbiology.
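The advantage of deep sequencing can be quantified with a simple binomial model: the chance of sampling a variant present in 1% of the viral population at least a handful of times depends heavily on read depth. The depths and detection threshold below are illustrative choices, not values from the cited studies:

```python
from math import comb

def prob_detect(depth, variant_freq, min_reads):
    """Probability of sampling the variant at least `min_reads` times
    among `depth` reads covering the position (binomial model)."""
    p_miss = sum(
        comb(depth, k) * variant_freq ** k * (1 - variant_freq) ** (depth - k)
        for k in range(min_reads)
    )
    return 1 - p_miss

# A variant carried by 1% of the population, requiring 3 supporting reads:
shallow = prob_detect(depth=50, variant_freq=0.01, min_reads=3)    # ~1%
deep = prob_detect(depth=5000, variant_freq=0.01, min_reads=3)     # ~certain
```

At 50-fold coverage a 1% variant is almost always missed; at several-thousand-fold coverage detection is effectively guaranteed, which is the statistical basis of 'deep sequencing'.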
Possible problems with NGS
With any new technology or venture on the scale of the Genomics England ‘100,000 Genomes Project’ there are potential problems.
Data analysis
The availability of small bench-top sequencers means that even small diagnostic labs will be able to use NGS. Different NGS platforms generate different types of data with differing degrees of quality. Because of the inherent errors of enzyme-driven sequencing and the variability in the sequencing signals generated, a host of computer algorithms is needed to determine the likelihood of every base in the sequence being correct. The algorithms used for these analyses are either sold packaged as software or analysis pipelines, or designed by in-house bioinformaticians. With the misinterpretation of sequence data carrying such dire consequences, robust data analysis is paramount. Illumina will be the technology used for all the sequence data generated by the 100,000 Genomes Project, so all data will likely be handled, processed and analysed in a very similar manner, leading to reproducible and robust results. Other clinical laboratories entering the sequencing revolution face a wide choice of technologies and analysis methods. Clinical laboratories in most countries adhere to a set of rigorous assessments and standards, and all clinical tests must be fully validated. Validation of NGS is complicated, but best-practice guidelines aim to simplify the process. ‘Targeted sequencing’, where panels of only a few to a few hundred clinically relevant genes are sequenced, makes validation and analysis easier. Unifying analysis processes will remain an important consideration in the future.
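The per-base confidence referred to above is conventionally expressed as a Phred quality score, Q = -10 log10(p_error), attached to every base in a read. A minimal conversion sketch:

```python
from math import log10

def phred_to_error_prob(q):
    """Probability that a base call with Phred quality Q is wrong."""
    return 10 ** (-q / 10)

def error_prob_to_phred(p):
    """Inverse conversion: error probability to Phred quality score."""
    return -10 * log10(p)

# Q30, a common quality threshold, means a 1-in-1000 chance of a wrong base;
# Q20 means 1-in-100.
q30_error = phred_to_error_prob(30)
q20_error = phred_to_error_prob(20)
```

Downstream variant callers combine these per-base probabilities across the read pile-up to decide whether an apparent variant is real or a sequencing error.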
Data storage and security
The 100,000 Genomes Project will produce petabytes of data, and even small diagnostic labs will be producing large quantities. Targeted gene panels will help, but data storage could still be an issue. NGS generates sequence files and associated raw data files, and deciding what should be stored and what discarded is debated. The Royal College of Pathologists guidelines recommend that data and records pertaining to pathology tests be retained for a minimum of 25 years. DNA sequence is highly sensitive in nature: even without patient details attached, it contains all the information needed to link it to the individual from whom it was taken. Secure storage of DNA sequence, with compression and encryption, is an important consideration. The Medical Research Council in the UK has earmarked £24 million of the Genomics England funding for computing power, including analysis and secure storage.
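To give a feel for the compression involved: a plain A/C/G/T string costs one byte per base, but two bits per base suffice, a fourfold saving before any general-purpose compression is applied. An illustrative (not production-grade) packer:

```python
_CODE = {"A": 0, "C": 1, "G": 2, "T": 3}
_BASE = "ACGT"

def pack(seq):
    """Pack an A/C/G/T string into bytes, four bases per byte."""
    out = bytearray()
    for i in range(0, len(seq), 4):
        group = seq[i:i + 4]
        byte = 0
        for base in group:
            byte = (byte << 2) | _CODE[base]
        # Left-align a final partial group so unpacking stays simple.
        byte <<= 2 * (4 - len(group))
        out.append(byte)
    return bytes(out)

def unpack(data, length):
    """Recover the original sequence; its length is stored alongside."""
    bases = []
    for byte in data:
        for shift in (6, 4, 2, 0):
            bases.append(_BASE[(byte >> shift) & 0b11])
    return "".join(bases[:length])
```

Real formats (e.g. CRAM) go much further, also compressing quality scores and read names, but the two-bit encoding above is the core trick for the bases themselves.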
Ethical implications
The mainstream adoption of any new technology has ethical implications. While sequencing a patient’s tumour to determine a cancer treatment plan, another gene mutation could be identified that is unrelated to the condition being treated. In the UK all patients must consent to any germ-line genetic test. Genetic counselling is offered and patients are helped to come to terms with the implications of the findings. Serendipitous discoveries have the potential to create many ethical dilemmas for clinicians.
The future: a learning healthcare system
Although powerful, medical genomics has so far not had the major impact on healthcare predicted at the time of the release of the first human genome. The 100,000 Genomes Project will change that. The project hopes to link genomic data with the medical records of each patient. This means that research data can be actively generated as the project progresses. Every person consenting to the project will be a walking research project from which we can learn important lessons about treatment and response [8]. This could transform the UK healthcare system into a learning environment like no other in the world. It will generate the evidence on which future improvements can be made. With strong collaborative partnerships established with Illumina, the Wellcome Trust Sanger Institute, the Medical Research Council and Cancer Research UK, to name but a few, the Genomics England project has the potential to be a great success.
So-called ‘third generation sequencing’ technology is already a reality and NGS sequencing chemistries are continually evolving and improving. Although it is unlikely in the very near future that every person in the country will have their genome sequenced, NGS is still contributing massively to healthcare improvements in genomics and other clinical diagnostic areas.
References
1. Boycott KM, Vanstone MR, Bulman DE, MacKenzie AE. Rare-disease genetics in the era of next-generation sequencing: discovery to translation. Nat Rev Genet. 2013; 14(10): 681–691.
2. Nalls MA, Pankratz N, Lill CM, Do CB, Hernandez DG, Saad M, DeStefano AL, Kara E, Bras J, et al. Large-scale meta-analysis of genome-wide association data identifies six new risk loci for Parkinson’s disease. Nat Genet. 2014; doi: 10.1038/ng.3043. [Epub ahead of print].
3. Nepomnyashchaya YN, Artemov AV, Roumiantsev SA, Roumyantsev AG, Zhavoronkov A. Non-invasive prenatal diagnostics of aneuploidy using next-generation DNA sequencing technologies, and clinical considerations. Clin Chem Lab Med. 2013; 51(6): 1141–1154.
4. Jackson SE, Chester JD. Personalised cancer medicine. Int J Cancer 2014; doi: 10.1002/ijc.28940. [Epub ahead of print].
5. Fairley JA, Gilmour K, Walsh K. Making the most of pathological specimens: molecular diagnosis in formalin-fixed, paraffin embedded tissue. Curr Drug Targets 2012; 13(12): 1475–1487.
6. Gibson RM, Schmotzer CL, Quiñones-Mateu ME. Next-generation sequencing to help monitor patients infected with HIV: ready for clinical use? Curr Infect Dis Rep. 2014; 16(4): 401.
7. Veenemans J, Overdevest IT, Snelders E, Willemsen I, Hendriks Y, Adesokan A,Doran G, Bruso S, Rolfe A, Pettersson A, Kluytmans JA. Next-generation sequencing for typing and detection of resistance genes: performance of a new commercial method during an outbreak of extended-spectrum-beta-lactamase-producing Escherichia coli. J Clin Microbiol. 2014; 52(7): 2454–2460.
8. Ginsburg G. Medical genomics: gather and use genetic data in health care. Nature 2014; 508(7497): 451–453.
The author
Katelyn Gilmour PhD
Molecular Pathology, Dept. Laboratory Medicine, Royal Infirmary of Edinburgh, Edinburgh EH16 4SA, UK
*Corresponding author
E-mail: Katelyn.gilmour@nhslothian.scot.nhs.uk
Next generation sequencing and metagenomics – fighting the scourge of antibiotic resistance
The emergence of drug-resistant bacteria is one of the biggest public health challenges today. In April 2014, the World Health Organization (WHO) warned that a “post-antibiotic era – in which common infections and minor injuries can kill” is far from “an apocalyptic fantasy” and has instead become “a very real possibility”. Two months later, the newly formed World Alliance Against Antibiotic Resistance (with over 700 members in 55 countries) launched an urgent appeal to label antibiotic resistance a “grave global threat” and asked that antibiotics be declared a cultural heritage deserving legal protection.
The ‘horror’ of carbapenem resistance
The march of bacterial resistance seems unremitting, even as the pipeline of new antibiotics is drying up. What is especially worrying for public health authorities, however, is a new “horror”: the growth of bacterial strains resistant to carbapenem, the ‘last resort’ antibiotic for patients who do not respond to other treatments.
In several countries, carbapenem does not work in more than half of patients infected with Klebsiella pneumoniae, a major cause of hospital-acquired respiratory tract infections.
During the month of June 2014 alone, the US saw its first case of carbapenem-resistant (CR) Pseudomonas aeruginosa. In Canada, routine testing of raw squid in Saskatoon revealed a bacterial strain resistant to carbapenem. The case was the first of its kind in a food store, and demonstrates an enhancement of exposure risk “from a relatively small slice of the public” – such as travellers to risk zones or those recently hospitalized – to a much larger sector, according to Joseph Rubin, an assistant professor at the University of Saskatchewan.
Resistance genes are the real challenge
The Canadian case also illustrates the real, long-term challenge facing microbiologists. The bacterium in question, Pseudomonas fluorescens, is not risky for people with healthy immune systems. However, it carries a gene to produce carbapenemase, the enzyme inducing resistance to carbapenem.
Indeed, the real problem is less the spread of antibiotic-resistant bacteria than of antibiotic resistance genes. Bacteria swap small pieces of DNA that carry genes such as those for carbapenem resistance, and can quickly pass them on to other species through this gene exchange.
A ticking time bomb
The implications are stark, and pose Scylla and Charybdis scenarios for public health authorities.
For example, third-generation cephalosporins have been shown to fail in treating gonorrhea in much of Europe, Australia and Japan. Given that gonorrhea causes tens of millions of new infections globally every year, this is clearly a ticking time bomb.
Nevertheless, the use of carbapenems against gonorrhea seems ill-advised. As the Canadian Medical Association Journal noted in 2011, this is because “gonorrhea readily shares its antibiotic resistance genes”, and using carbapenems “would invariably result in increased resistance … in other microbes. This life-saving antimicrobial would then have been ‘wasted’ on non–life-threatening gonococcal infections.”
Genomics and molecular methods
The above challenges necessitate a sophisticated arsenal of tools in the microbiology lab. Fortunately, breakthroughs in biotechnology, especially DNA sequencing and genomics, have shown some paths to the future.
In 1995, clinical microbiology witnessed the launch of the genomic era when the first bacterial genome, that of Haemophilus influenzae, was sequenced. So far, “over 1,000 bacterial genomes and 3,000 viral genomes, including representatives of all significant human pathogens,” have been sequenced – leading in turn to “unprecedented advances in pathogen diagnosis and genotyping and in the detection of virulence and antibiotic resistance.”
Sophisticated genomic tools based on molecular array techniques have been key to this success. Backed by high-speed computing, automated platforms and bioinformatics software, they provide labs with a host of new capabilities.
Molecular methods principally address a major limitation of previous phenotypic methods: the long period (weeks or even months) required to isolate some slow-growing bacteria. Indeed, sequencing the Haemophilus influenzae genome in 1995 had taken over a year, and although the Sanger sequencing used for this had long been considered a ‘gold standard’, it soon faced “inherent limitations in throughput, scalability, speed and resolution.”
NGS offers exponential leap in sequencing
One of the most promising new molecular techniques is next-generation sequencing (NGS), which is also referred to as second-generation sequencing or SGS.
In terms of its basic principle, NGS parallels the capillary electrophoresis (CE) used in Sanger sequencing, with DNA fragments identified by signals as they are resynthesized from a template strand. However, while CE is limited to one or a few DNA fragments, NGS uses massively parallel processes to cover millions, and can sequence several human genomes in a single run within days. This allows identification of DNA base pairs across whole genomes, and comparison of genetic differences down to the resolution of a single base pair.
The latest NGS systems can generate over 300 Gb per flow cell, discover SNPs and chromosomal rearrangements, conduct transcriptome analysis, generate expression profiles, detect splice variants and quantify protein-DNA interactions.
Predicting antibiotic resistance, rapidly
NGS has already been used as a frontline weapon to identify bacterial pathogens and conduct epidemiological typing to define transmission pathways and support outbreak investigations – including cholera in Haiti in 2010 and E. coli O104:H4 in Germany in 2011.
The technique is also seen as a way to predict antibiotic resistance, complementing phenotypic tests with the high-speed investigation of anomalies in results. It also resolves some of the biggest hurdles confounding traditional techniques, which require isolating resistance from environmental samples by PCR amplification or the cloning of cultured bacteria.
Both techniques, however, ignore large reservoirs of potential antibiotic resistance. PCR is incapable of broad-spectrum screening and is generally limited to known resistance genes. A bigger problem is that several bacteria are simply not culturable. This is an especially major challenge because most antibiotics are produced by environmental microorganisms, so most antibiotic resistance genes are likely to have emerged outside a clinical setting.
Along with the new field of metagenomics, NGS has been harnessed to directly address these limitations.
The promise of metagenomics
Metagenomics was developed in the late 1990s for the function-based analysis of mixed environmental DNA, including the detection of genes or genetic variants that confer resistance. It was aimed directly at the limitations of culturing and PCR amplification.
In its early years, metagenomics was principally targeted at the recovery of novel biomolecules from environmental samples. The emergence of NGS, however, opened up wholly new frontiers – above all, allowing the DNA in a sample to be sequenced directly, without cloning.
Metagenomics has been used to identify a range of antibiotic resistance genes, including those conferring resistance to tetracyclines, aminoglycosides and bleomycin, as well as β-lactamase genes.
The fight against resistant bacteria is likely to intensify in the years to come. In May 2014, Current Biology published the results of “the largest metagenomic search for antibiotic resistance genes in the DNA sequences of microbial communities from around the globe.” The findings illustrate the scope of the challenge: “bacteria carrying those vexing genes turn up everywhere in nature that scientists look for them.”
Challenges ahead
NGS is no doubt going to play a major role in this process. In spite of its novelty, it has already proven to be user-friendly for clinical laboratories. Last year, researchers showed that whole genome sequencing could be integrated into the routine, daily workflow of a laboratory and could rapidly resolve complex bacterial populations. NGS succeeded in identifying all ‘mixed-sample’ organisms taken from primary isolation plates, including strains represented by only scarce reads, notwithstanding an extremely low depth of coverage across their genomes.
However, there is still some way to go. Key issues include the need to establish a reference genome database to archive, access and exchange the massive amount of data which is still to be generated. Some experts have called for the database to be open and accessible to the global scientific community.
A related challenge is that of interdisciplinary skills and collaboration. The analysis of NGS data requires a variety of specialists, namely “clinical and biomedical informaticians, computational biologists, molecular pathologists, programmers, statisticians, biologists, as well as clinicians.” Given that laboratories or other single institutions are unlikely to have all these skills in-house, an open database would again be the best means to find collaborative solutions.
Limits to Moore’s Law
The greatest driver of NGS adoption will be cost. The Human Genome Project cost around USD 3 billion. NGS can reduce this to “a few thousand dollars”, and achieve it much faster.
So far, NGS is one of a handful of technologies whose progress has outpaced Moore’s Law, which describes a long-term trend of computing power doubling every two years. Compared with the $95.2 million cost per genome estimated in autumn 2001, the figure has since fallen roughly 24,000-fold. The decline has not, however, been uniform. Costs fell more than tenfold between 2001 and 2007, but in 2008 they dropped massively, to $342,500 per genome in October compared with $7.1 million a year earlier. This step change was wholly due to the move from Sanger-based to NGS sequencing technologies.
Costs per genome were estimated to be just over $4,000 in January 2014.
NGS, of course, has a variety of uses. It is seen as a tool that will revolutionize molecular biology and molecular epidemiology, and allow the evolution of bacteria to be predicted. However, the fight against the menace of antibiotic resistance is likely to be one of its highest-profile uses in the years to come.
NGS for inherited skin diseases
Inherited skin diseases can be difficult to assess clinically and often diagnosis relies on multiple laboratory investigations. Traditionally, examination of skin biopsies is followed by biochemical testing and Sanger sequencing of genomic DNA. This approach is labour-intensive, costly and time-consuming. The advent of next-generation sequencing (NGS) methods provides an alternative or complementary approach to making highly accurate diagnoses, but is not without its own challenges.
by J. Lee, Dr A. Salam, Dr T. Takeichi and Prof. J. A. McGrath
Background
The identification of pathogenic mutations in monogenic diseases represents one of the major challenges, and fundamental goals, of early 21st-century human genetics. Most genetic diseases are rare, clinically heterogeneous, and difficult to diagnose – a task made more challenging by disparities in genotype–phenotype correlations, inter- and intra-familial variability, as well as mosaic patterns of disease. It is these hurdles that have led to the advent of next-generation DNA sequencing (NGS): a group of technologies that can improve the speed, accuracy and cost-efficiency of genetic sequencing while simultaneously mapping normal variation, thus furthering our understanding of human genetics in both health and disease. Inherited skin diseases encompass a collection of over 500 clinical entities – with variable structural or inflammatory manifestations that can also affect hair, nails, teeth and certain mucosal surfaces [1]. Individually these disorders are uncommon, but collectively they generate a significant health burden and many diagnostic conundrums.
Traditional approaches to the diagnosis of inherited skin diseases
For patients with inherited skin disorders, the traditional approach to diagnosis is to document a comprehensive patient history, including recording accurate family pedigrees and noting any consanguinity. The clinician will then perform a physical examination, take clinical photographs and order laboratory investigations, which often include a skin biopsy. Light microscopy is usually uninformative, and the skin may need to be examined by transmission electron microscopy and immunohistochemistry. Additional blood or urine samples may be needed for further diagnostic biochemical studies. Changes in skin structure or protein expression may provide clues to candidate genes, for which polymerase chain reaction primers can be designed and used for Sanger sequencing of genomic DNA. This ‘candidate gene’ approach has proved very useful for several autosomal recessive inherited skin diseases, but is typically unhelpful in most dominant diseases or in those with more subtle changes in skin morphology. Cue the advent of NGS technologies and a different approach to diagnostics, where the challenge in genetic discovery shifts away from the generation of data to the filtering of relevant data [2, 3].
The impact of NGS
NGS encompasses a number of new technologies that vary in their sequencing protocols, which in turn determine the type of data produced. The approaches differ in template preparation, sequencing and imaging, and genome alignment and assembly methods. The methodology is also known as high-throughput or massively parallel sequencing because of the ability of NGS to process large volumes of genetic data in a short time, in stark contrast to individual gene screening with Sanger sequencing. Whole-genome sequencing (WGS) and whole-exome sequencing (WES) are the two most commonly used NGS techniques. WGS can sequence an individual’s entire genome, but at the expense of speed and cost. In contrast, WES uses an array to capture the protein-coding regions of the human genome, encompassing ~21,000 genes, which make up less than 2% of the genome. Compared with the 2–3 million variants generated by WGS, the data from WES typically reveal around 25,000 variants. Nevertheless, WES is a more economical option than WGS because ~85% of the pathogenic mutations in monogenic diseases are predicted to lie in exons. The plethora of data then has to be filtered, and evidence for causality established for any potentially pathogenic variant (Fig. 1). This process often involves filtering variants against databases of previously identified sequences and cross-referencing with known biological or genetic databases, for which considerable bioinformatics support is required: a single WES run can generate one terabyte of data.
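As a rough illustration of this filtering step, the sketch below discards common and non-protein-altering variants to leave a short candidate list. The variants, frequency threshold and consequence labels are hypothetical simplifications; real pipelines operate on annotated VCF files and population-frequency databases rather than in-memory lists.

```python
# Minimal sketch of exome variant filtering, assuming hypothetical
# pre-annotated variants. Consequence classes treated as potentially
# protein-damaging:
DAMAGING = {"missense", "nonsense", "frameshift", "splice_site"}

def filter_candidates(variants, max_pop_freq=0.01):
    """Keep rare, protein-altering variants as candidate pathogenic calls."""
    candidates = []
    for v in variants:
        if v["pop_freq"] > max_pop_freq:      # common polymorphism: discard
            continue
        if v["consequence"] not in DAMAGING:  # e.g. synonymous: discard
            continue
        candidates.append(v)
    return candidates

# Hypothetical annotated variants from one exome:
variants = [
    {"gene": "COL7A1", "consequence": "nonsense",   "pop_freq": 0.0001},
    {"gene": "FLG",    "consequence": "synonymous", "pop_freq": 0.0002},
    {"gene": "KRT5",   "consequence": "missense",   "pop_freq": 0.15},
]
print([v["gene"] for v in filter_candidates(variants)])  # ['COL7A1']
```

In practice the surviving candidates are then checked against the family pedigree and inheritance model before causality is pursued experimentally.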
Whole-exome sequencing: the possible advantages
The challenge
The key questions for WES in the diagnosis of inherited skin diseases are as follows. (1) Are the new technologies better than what already exists for diagnosing known diseases? (2) Can the new technologies help to resolve unknown diagnoses or discover new clinical entities? (3) Can the new technologies be introduced into clinical work and overcome any practical obstacles? Emerging data indicate a resounding yes to the first two questions, although the third remains a work in progress [4].
Breadth of cover
WES encompasses most of the coding regions of the genome, whereas Sanger sequencing targets a predetermined gene, or part of a gene, between specially designed primers. WES is also efficient for sequencing large genes, such as COL7A1, which encodes type VII collagen. This gene, which is mutated in the blistering disease dystrophic epidermolysis bullosa, is composed of 118 exons. Conventional Sanger sequencing approaches are based on designing ~72 primer pairs to amplify the COL7A1 exons and flanking introns. The Sanger approach is therefore laborious and expensive, particularly as COL7A1 contains few recurrent mutations and the gene needs to be screened in its entirety to identify pathogenic mutations.
Genetic diagnosis
WES has emerged as an invaluable tool where a patient’s clinical diagnosis is unclear or erroneous. In this situation, Sanger sequencing of multiple candidate genes is destined to fail and to exhaust both time and resources. WES, on the other hand, can identify known variants in order to make a genetic diagnosis that was not initially considered, as has been demonstrated for subtypes of epidermolysis bullosa and other inherited skin diseases [5, 6]. Indeed, WES has been used to accurately diagnose inherited skin diseases without any a priori clinical information [7]. The rationale is that the more accurate and timely diagnoses offered by WES will allow earlier targeted therapy and ultimately improved patient care.
Genetic discovery
The value of WES in genetic discovery is evident in the number of inherited skin diseases whose genetic basis has been uncovered through WES. Recent examples include the discovery of inherited skin and bowel inflammation resulting from mutations in ADAM17 and EGFR [8, 9]. Given the protean nature of inherited skin diseases, many mutations cannot be anticipated from clinical phenotype and initial investigations, leaving no candidate gene targets for Sanger sequencing. One pertinent example of a completely unexpected candidate gene is the identification of mutations in EXPH5 [10], which encodes a GTPase effector protein, exophilin-5, in a form of intra-epidermal epidermolysis bullosa – a disease that usually arises as a genetic disorder of keratin. WES is therefore superior to Sanger sequencing in the diagnosis of both novel and genetically heterogeneous conditions.
Cost efficiency
The cost of DNA sequencing has reduced by around 100,000-fold over the last 20 years. Although the technique remains relatively expensive at present (~£900 per sample at King’s College London, 2014 prices), further cost reductions are expected that will soon make WES a more economically viable option than Sanger sequencing, for all but a few disorders in which there are recurrent mutations in a small number of small genes. Even at current costs, however, WES already has advantages over Sanger sequencing for some genes, such as COL7A1, for which the cost of Sanger sequencing is ~£1000 (or greater) in the small number of laboratories that undertake sequencing of this gene.
Considering the patient
The diagnosis of many inherited skin disorders often relies on invasive investigations such as sampling a small piece of skin (punch or ellipse biopsy) (Fig. 2). The procedure involves injection of local anesthetic, which can be painful, and the wound usually heals with a small but evident scar. Occasionally, skin biopsy sites can be complicated by bleeding or infection. WES can be performed using DNA extracted from blood, saliva or tissue samples, and although Sanger sequencing can also be performed on similar templates, for many patients a skin biopsy would still be necessary to determine which gene(s) to sequence. Thus WES typically offers a less invasive approach for the patient.
Variant mapping
Aside from discovering genes and pinpointing mutations in inherited skin diseases, WES also generates a huge amount of other data that can be used to map genetic variation. In the longer term, the dissection of bioinformatics data will lead to a better understanding of the implications of certain variants, refining genotype–phenotype correlation, thus providing insight into individual prognosis, and allowing stratified or personalized medicine and therapeutics.
Whole-exome sequencing: the possible disadvantages
Data analysis
The large quantity of sequencing data generated by WES is potentially also a disadvantage. Before WES can be used in routine clinical practice, fast and efficient filtering techniques must exist to allow clinicians and non-geneticists to interpret WES data and extract the information relevant to managing their patients’ needs. But the plethora of data generated by WES also provides considerably more information beyond the pathogenic mutation itself, including coincidental, potentially damaging mutations (known as ‘incidental findings’) that are completely unconnected to the primary disease being investigated. What should diagnosticians do with this information? Does it make a difference if the implications are clinically actionable or not? There are clearly several unresolved issues.
Accuracy of data
Given the volume of data produced by WES, it is inevitable that some false positive variants are identified. Most laboratories therefore still elect to confirm mutations on an alternative sequencing platform, generally Sanger sequencing, which remains a significant barrier to the routine use of WES in diagnostics. From a technical perspective, NGS methods still need to be improved to cover important regulatory elements such as promoters and enhancers, as well as poorly annotated parts of the genome. Moreover, if WES is to become a routine diagnostic technique, standardized operating procedures and protocols must be created and implemented. For inherited skin disease diagnostics there would also need to be a realignment of technical wet-lab skills (skin microscopy) in favour of computer database and in silico work.
Time to diagnosis
Perhaps the biggest challenge for WES, however, lies in the time it takes to process and analyse a case. For many inherited skin diseases, a rapid diagnosis is often very important to optimize clinical management, for example in neonates with suspected epidermolysis bullosa. The diagnostic approach using skin biopsy assessment followed by Sanger sequencing of candidate genes (implicated by skin biopsy) allows for possible diagnoses to be made within 2 to 3 days. In contrast, the quickest time that WES could be completed (at present) would be a minimum of 5 days, although in practice WES often takes considerably longer to complete and analyse. New platforms to shorten WES protocols are in development, but only when more rapid sample analysis is feasible in a diagnostic lab setting can one really begin to think about wholesale change of diagnostic practice.
Conclusion
Since 2011, WES has proven to be a valuable asset in the diagnosis and discovery of inherited skin diseases, but its adoption into clinical diagnostics is still being refined and piloted. WES techniques are constantly being improved to become more accurate, quicker and cost-effective, while enrichment methodologies and sequencing technology become more reproducible and standardized. This progress may allow WES to function as a stand-alone diagnostic and discovery tool in genetics, negating the need for Sanger sequencing to confirm WES findings. However, as our understanding of the role of non-coding DNA in molecular biology grows, and as WGS is further refined, WES is at risk of being superseded by newer NGS techniques for genetic discovery, diagnostics and prognostics. Innovation looms, but so it ever was in molecular genetics.
References
1. Leech SN, Moss C. Br J Dermatol. 2007; 156: 1115–1148.
2. Metzker ML. Genome Res. 2005; 15: 1767–1776.
3. Metzker ML. Nat Rev Genet. 2010; 11: 31–46.
4. Cho RJ, et al. J Invest Dermatol. 2012; 132(E1): E27–28.
5. Takeichi T, et al. Br J Dermatol. 2014; doi: 10.1111/bjd.13190. [Epub ahead of print]
6. Salam A, et al. Matrix Biol. 2013; 33: 35–40.
7. Takeichi T, et al. Exp Dermatol. 2013; 22: 825–831.
8. Blaydon DC, et al. N Engl J Med. 2011; 365: 1502–1508.
9. Campbell P, et al. J Invest Dermatol. 2014; doi: 10.1038/jid.2014.164. [Epub ahead of print].
10. McGrath JA, et al. Am J Hum Genet. 2012; 91: 1115–1121.
The authors
John Lee, Amr Salam BSc, MBChB, MRCP(UK), Takuya Takeichi MD PhD, John A McGrath* MD FRCP
St John’s Institute of Dermatology, King’s College London (Guy’s Campus), London, UK.
*Corresponding author
E-mail: john.mccgrath@kcl.ac.uk
Assessment of tumour markers on the Maglumi 2000 Chemiluminescence Immunoassay System
Tumour markers have been widely used in clinical settings for early cancer detection, diagnosis, prognosis and recurrence surveillance. Given this growing usage, it is vitally important to assess the performance of common tumour markers on in vitro diagnostic instruments. In this study, the most commonly used tumour markers were selected to evaluate the performance of the SNIBE Maglumi 2000 chemiluminescence immunoassay system by comparison with our reference methods.
by Dr Xiao Hu, Dr Sheng Kang, Zhiyun Duan and Professor Guichen Zhang
Background
Tumour markers are substances that rise abnormally in the body when cancer is present. They are useful indicators for cancer risk determination, screening, diagnosis, prognosis, post-treatment surveillance and recurrence monitoring [1]. Alpha-fetoprotein (AFP) is a well-established marker for liver cancer diagnosis and post-treatment monitoring [2]. Another well-studied tumour marker, prostate-specific antigen (PSA), is recommended for the screening of prostate cancer in men over 50 years of age [3]. Carcinoembryonic antigen (CEA) is particularly used as a tumour marker for bowel cancer, to measure the response to treatment and to monitor whether the disease has recurred [4]. Elevated serum ferritin has been found in patients with pancreatic cancer, breast cancer, colon cancer, non-small-cell lung cancer, hepatocellular carcinoma and Hodgkin’s lymphoma [5]. Cancer antigen 125 (CA 125) is a marker commonly used to follow up patients with ovarian cancer after treatment [6], while cancer antigen 15-3 (CA 15-3) is widely used in breast cancer management [7]. Cancer antigen 19-9 (CA 19-9) is the best validated marker for pancreatic cancer post-treatment evaluation [8]. Cytokeratin 19 fragment (CYFRA 21-1) and squamous cell carcinoma antigen (SCCA) are useful markers for lung cancer diagnosis in combination with other markers [9, 10]. This study evaluated the performance of ten tumour markers on the SNIBE Maglumi 2000 chemiluminescence immunoassay system.
Precision
Following the principle and method of the CLSI EP5-A2 guideline [11], with some adjustments, we evaluated the precision of the ten tumour markers. Intra-assay precision was evaluated on three different levels of serum samples; each sample was measured 20 times in the same run to calculate the coefficient of variation (CV%). Inter-assay precision was assessed by repeatedly measuring three levels of samples over 10 days with the same batch of kit. Samples were run in duplicate, two runs per day separated by at least 3 hours. The results are presented as mean value and CV%. Table 1 lists the precision results of the ten tumour markers: the intra- and inter-assay CVs were less than 4.12% and 6.67%, respectively.
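The CV% reported above is simply the sample standard deviation of the replicates expressed as a percentage of their mean. A minimal sketch, using hypothetical replicate values rather than the study's data:

```python
import statistics

def cv_percent(values):
    """Coefficient of variation: sample SD as a percentage of the mean."""
    return statistics.stdev(values) / statistics.mean(values) * 100

# 20 hypothetical intra-assay replicates of one serum level (arbitrary units)
replicates = [5.02, 4.98, 5.10, 4.95, 5.05, 5.00, 4.97, 5.08, 5.03, 4.99,
              5.01, 5.06, 4.96, 5.04, 5.00, 5.02, 4.98, 5.07, 4.94, 5.05]
print(round(cv_percent(replicates), 2))
```

For the inter-assay figure, the same calculation is applied to the results pooled across runs and days rather than within a single run.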
Method comparison
Serum samples from patients with conditions ranging from benign diseases to various cancers were provided by the clinical laboratory of our hospital. Patient names were coded to maintain confidentiality. The samples were measured on both our reference system and the SNIBE Maglumi 2000 to generate correlation dot plots, and concordance between the SNIBE Maglumi 2000 and the reference system was analysed for each tumour marker. For each tested marker, the number of serum samples ranged from 166 to 460.
Compared with our reference methods, good correlations were observed between the SNIBE Maglumi 2000 and the ROCHE Cobas e601 or the ABBOTT Architect i2000. The slopes for all markers were between 0.853 and 1.361, while the intercepts ranged from -2.515 to +5.138 (Figure 1A-J). Total PSA showed the highest correlation between the SNIBE Maglumi 2000 and the ROCHE Cobas e601, while the lowest correlation (R² = 0.981) was seen for CA 19-9 between the same two systems (Figure 1). The total coincidence rate ranged from 93.7% (Figure 1I) to 99.6% (Figure 1B).
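The slope, intercept and R² values above come from fitting a straight line to paired results from the two systems. A sketch of that calculation with ordinary least squares, on hypothetical paired results (not the study's data); dedicated method-comparison statistics such as Passing–Bablok or Deming regression are often preferred in validation work, since both methods carry measurement error:

```python
def linear_fit(x, y):
    """Return OLS slope, intercept and R^2 for paired measurements."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((yi - slope * xi - intercept) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return slope, intercept, 1 - ss_res / ss_tot

# Hypothetical paired results: reference system (x) vs candidate system (y)
ref = [1.2, 3.5, 8.0, 15.2, 42.7, 88.1]
new = [1.4, 3.8, 8.3, 15.9, 44.5, 91.0]
slope, intercept, r2 = linear_fit(ref, new)
print(round(slope, 3), round(intercept, 3), round(r2, 4))
```

A slope near 1 and an intercept near 0, with R² close to 1, indicate that the candidate system agrees with the reference across the measuring range.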
Conclusion
In this study, we have evaluated the performance of the SNIBE Maglumi 2000 chemiluminescence immunoassay system using ten tumour markers. The intra- and inter-assay precision for all markers examined is highly acceptable. Compared with our reference methods, a high correlation was shown for all markers tested on the SNIBE Maglumi 2000 system, and the total coincidence rate is within the acceptable range for all markers examined. In conclusion, the SNIBE Maglumi 2000 system is reliable for the measurement of tumour markers in clinical use.
References
1. Sturgeon CM, Hoffman BR, Chan DW, et al. National academy of clinical biochemistry laboratory medicine practice guidelines for use of tumour markers in clinical practice: Quality requirements. Clin Chem. 2008; 54 (8): e1-e10.
2. Manini MA, Sangiovanni A, Fornari F, et al. Clinical and economical impact of 2010 AASLD guidelines for the diagnosis of hepatocellular carcinoma. J Hepatol. 2014; 60 (5): p995-1001.
3. Qaseem A, Barry MJ, Denberg TD, et al. Screening for prostate cancer: a guidance statement from the clinical guidelines committee of the American College of Physicians. Ann Intern Med. 2013; 158 (10): 761-769.
4. Labianca R, Nordlinger B, Beretta GD, et al. Primary colon cancer: ESMO Clinical Practice Guidelines for diagnosis, adjuvant treatment and follow-up. Ann Oncol. 2010; 21 (S5): v70-v77.
5. Alkhateeb AA, Connor JR. The significance of ferritin in cancer: Anti-oxidation, inflammation and tumorigenesis. BBA. 2013; 1836 (2): 245-254.
6. Forstner R, Sala E, Kinkel K, et al. ESUR guidelines: ovarian cancer staging and follow-up. Eur Radiol. 2010; 20: 2773-2780.
7. Sandri MT, Salvatici M, Botteri E, et al. Prognostic role of CA15.3 in 7942 patients with operable breast cancer. Breast Cancer Res Treat. 2012; 132:317–326.
8. Duffy MJ, Sturgeon C, Lamerz R, et al. Tumor markers in pancreatic cancer: a European Group on Tumor Markers (EGTM) status report. Ann Oncol. 2010; 21: 441-447.
9. Molina R, Auge JM, Escudero JM, et al. Mucins CA 125, CA 19.9, CA 15.3 and TAG-72.3 as tumour markers in patients with lung cancer: comparison with CYFRA 21-1, CEA, SCC and NSE. Tumour Biol. 2008; 29:371–380.
10. Chu XY, Hou XB, Song WA, et al. Diagnostic values of SCC, CEA, Cyfra21-1 and NSE for lung cancer in patients with suspicious pulmonary masses: A single center analysis. Cancer Biol Ther. 2011; 11(12): 995-1000.
11. Tholen DW, Kallner A, Kennedy JW, et al. Evaluation of precision performance of quantitative measurement methods; approved guidelines-second edition, EP5 A2. 2004; 24(25).
The authors
Xiao Hu* MD, Sheng Kang PhD, Zhiyun Duan MSc, Dept of Clinical Laboratory, Shenzhen Sixth People’s Hospital, Shenzhen, Guangdong 518052, China
Guichen Zhang Professor, PhD, MD, Medical College, Shenzhen University, Shenzhen, Guangdong 518052 China
(*Corresponding author: xiao121386@163.com)
Mass spectrometry: the gold standard in clinical routine
Mass spectrometric methods were first introduced into laboratory medicine approximately 40 years ago, and the application of the technique has evolved considerably since [1]. The recent popularity of clinical mass spectrometry can be attributed to its high specificity, accuracy and reliability, which stem from the direct analysis of ions without the risk of cross-reactivity described for antibody detection in immunoassays [2], as well as to the ability to detect multiple analytes in a single run. Initially, GC-MS was used for biological analysis; however, this method requires volatile analytes, demanding extensive extraction and derivatization steps for the nonvolatile and thermally unstable compounds typically encountered in clinical analysis. This is not particularly attractive in a clinical setting, in contrast to LC-MS/MS, which offers the advantages of mass spectrometry combined with simpler sample preparation.
by Dr Nihâl Yüksekdag, Dr Marc Egelhofer and Dr Richard Lukacin
One such example is the analysis of methylmalonic acid (MMA), an important biomarker for the identification of vitamin B12 deficiency which, if left untreated, can lead in the long term to permanent neurological damage and/or to hematological and gastroenterological diseases. The sole determination of holoTC, the active form of vitamin B12, does not have the same diagnostic significance as the combined measurement of holoTC and MMA, as the MMA concentration can indicate a possible vitamin B12 deficiency even before the actual vitamin level decreases [3]. Traditionally, the reference method for this parameter in plasma/serum is GC-MS which, as mentioned above, requires an extremely complex sample preparation that can take several hours [4]. In contrast, sample preparation for the LC-MS/MS assay from Chromsystems is much simpler and, with just a few minutes of processing time, considerably faster, while requiring only one quarter of the sample material (Table 1).
Furthermore, data from plasma and urine MMA determinations by the reference GC-MS method and the new LC-MS/MS technology show a strong correlation and excellent agreement (Fig. 1). Therefore, the described LC-MS/MS technique represents a fast, reliable and robust method for routine analysis, achieving a higher throughput and higher efficiency.
Sample preparation as a pivotal step
The correct analytical procedure, from extraction and sample preparation through to the chromatography and MS setup, is a prerequisite for achieving optimal results by mass spectrometry and fulfilling the requirements of clinical diagnostics. The development of an appropriate sample preparation procedure can be complicated and time-consuming, requiring considerable work to embed it sensibly in the overall analytical procedure. The ultimate goal is enrichment of the molecule of interest with simultaneous elimination of compounds that cause ion suppression or enhancement effects. Components from plastics, chemicals such as salts or, in particular, the human matrix (whole blood, serum, plasma, urine) that potentially co-elute from the LC system can compete with the analytes during the ionization process. This changes compound ionization and consequently alters the MS signal at the detector [5]. This process is called “ion suppression”; Bonfiglio et al. [6] systematically analysed these effects and found, not surprisingly, that they depend on the sample preparation technique used as well as on the compound to be analysed. More polar analytes also showed stronger effects than less polar ones. Short-term variations in ionization can also compromise the accuracy of analyses if the method is not sufficiently robust; if these variations have a differential impact on the target analyte and internal standard, the overall analysis is affected [7]. The authors also concluded that calibration material needs to be as similar as possible to the sample matrix. In addition, the choice of an appropriate internal standard helps to reduce matrix effects; whenever possible, an isotopically labelled version of the analyte is the ideal choice.
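The reason an isotopically labelled internal standard (IS) is the ideal choice can be shown in a short numerical sketch. Because the analyte and its labelled IS co-elute, matrix suppression scales both peak areas by the same factor, which cancels in the area ratio used for quantification. All peak areas and concentrations below are hypothetical:

```python
# Hedged sketch of internal-standard quantification in LC-MS/MS.

def response_factor(cal_area_analyte, cal_area_is, cal_conc):
    """Analyte/IS area ratio per unit concentration, from one calibrator."""
    return (cal_area_analyte / cal_area_is) / cal_conc

def quantify(area_analyte, area_is, rf):
    """Sample concentration from the analyte/IS area ratio."""
    return (area_analyte / area_is) / rf

rf = response_factor(cal_area_analyte=50_000, cal_area_is=25_000, cal_conc=1.0)

# Suppose 40% ion suppression hits analyte and co-eluting IS alike
# (both areas scaled by 0.6): the ratio, and the result, are unchanged.
print(quantify(area_analyte=30_000 * 0.6, area_is=25_000 * 0.6, rf=rf))  # 0.6
```

If the IS were a structural analogue eluting at a different retention time, suppression could affect the two peaks unequally and the ratio would no longer cancel the effect, which is the differential impact described in [7].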
Depending on sample specimens and analyte characteristics, sample preparation techniques can encompass liquid-liquid extraction, solid phase extraction or protein precipitation and are also crucial for the removal of materials that may contaminate the column, trap-column or the analytical system.
Considering all of these factors, successful method development where all parameters work well within at least acceptable levels of CVs, recovery and appropriate limits of quantification (LOQs) can be very challenging. Furthermore, full establishment of a method that is comprehensively validated in the laboratory is a laborious process. The use of commercially available kits, like the one mentioned above for MMA, which have gone through numerous optimization, verification and validation processes from sample preparation through to MS analysis represents a secure, robust and time-saving alternative for clinical laboratories.
Multi-analyte determination
The capability of LC-MS/MS systems for the analysis of several compounds in a single run sounds efficient and relevant, e.g. for the simultaneous analysis of drugs and their metabolites, but may not be as easy as it seems. Every single analyte in a patient sample may possess different chemical and physical properties that affect its recovery in the sample preparation procedure. Consequently, some compounds may be extracted more efficiently than others. Therefore, it can be a highly complex task with a significant amount of work to develop a general sample preparation procedure for quantification of numerous drugs and metabolites, with many of them being analysed in a single run (see Fig. 2), aimed at simplifying the laboratory workflow.
Automation for a higher throughput
One of the major challenges clinical laboratories have been facing is the simplification and acceleration of sample preparation for LC-MS/MS. By using an automated workflow, potential pipetting errors can be minimized and, in parallel, throughput can be drastically increased. This is relevant, for example, to large transplant centres that analyse a high number of patient samples for immunosuppressive drugs but nevertheless need to achieve fast and reliable results by LC-MS/MS. To date, there is only one system on the market (MassSTAR) that allows a fully automated CE-IVD workflow for immunosuppressants, including sample tracking, LIMS connectivity and clotting detection. The automated method offers a time saving of approximately 80% compared with manual preparation. A comparison between manual and automated sample preparation and measurement techniques for the four immunosuppressants cyclosporine A, everolimus, sirolimus and tacrolimus showed very high correlations (Fig. 3). Automated and manual preparation procedures therefore produce almost the same results, with automation reducing the time needed for sample extraction while also increasing sample throughput. These automation options are also provided by Chromsystems for other parameters, such as vitamin D3/D2, the immunosuppressant mycophenolic acid and antiepileptics, for which comparable correlations between the manual and automated methods have also been shown.
A gold standard in routine
LC-MS/MS is a valuable technique that is often used in reference methods for a wide range of parameters. The main drivers of its growth in clinical laboratories are the limitations of immunoassays for low molecular weight compounds, together with easier workflows and higher throughput [8]. However, certain drawbacks need to be addressed, the most critical in clinical mass spectrometry being the application of an appropriate sample preparation procedure that is robust and reliably fulfils analytical requirements. A number of proven, CE-IVD approved LC-MS/MS kits for sample preparation from Chromsystems are available and simplify the workflow in the laboratory. Furthermore, automation is possible for a range of parameters, reducing hands-on time and increasing throughput for those laboratories that need it.
References
1. Vogeser M, Kirchhoff F (2011) Progress in automation of LC-MS in laboratory medicine. Clin Biochem 44(1): 4-13.
2. Korecka M, Shaw L, (2009) Review of the newest HPLC methods with mass spectrometry detection for determination of immunosuppressive drugs in clinical practice. Ann. Transplant 14(2): 61-72.
3. Obeid R, (2014) Methylmalonic acid – a biomarker for vitamin B12 deficiency. DIALOG 1/2014.
4. Obeid R, Geisel J, Kirchhoff F, Bernhardt K, Ranke D, Lukačin R. (2014) External validation of a novel commercially available mass spectrometry kit for MMA in serum/plasma and urine. Poster presented at the congress of the International Federation of Clinical Chemistry and Laboratory Medicine (IFCC) WorldLab, Istanbul, Turkey.
5. Schneider H, Steimer W. (2006) Tandem mass spectrometry in drug monitoring: experience and pitfalls in application. J Lab Med 30(6): 428-437.
6. Bonfiglio R, King RC, Olah TV, Merkle K. (1999) The effects of sample preparation methods on the variability of the electrospray ionization response for model drug compounds. Rapid Commun Mass Spectrom 13(12): 1175-1185.
7. Vogeser M, Seger C. (2010) Pitfalls associated with the use of liquid chromatography-tandem mass spectrometry in the clinical laboratory. Clin Chem 56(8): 1234-1244.
8. Grebe S, Singh R. (2011) LC-MS/MS in the Clinical Laboratory – Where to From Here? Clin Biochem Rev 32: 5-31.
The authors
Nihâl Yüksekdağ PhD, Marc Egelhofer PhD*, and Richard Lukačin PhD.
Chromsystems Instruments & Chemicals GmbH, Am Haag 12, 82166 Gräfelfing, Germany
*Corresponding author, egelhofer@chromsystems.de