Viral hepatitis, the silent epidemic

Globally, as many people (1.5 million) die each year from viral hepatitis as from HIV/AIDS, but whereas the latter disease attracts government and international action and funding, the former is comparatively neglected. It was for this reason that the WHO initiated World Hepatitis Day four years ago, to be observed on 28 July each year; the lack of awareness about the repercussions of viral hepatitis was reflected in this year’s theme of ‘Hepatitis: think again’. So far five hepatitis viruses have been identified, though hepatitis D is found only as a co-infection with hepatitis B. Whilst the acute infections caused by food- and water-borne hepatitis A and E are not insignificant in terms of their incidence, morbidity and mortality, it is hepatitis B and C (HBV and HCV) that are generating a global public health crisis.
These two viral infections share major characteristics with HIV/AIDS. The acute infection, acquired by exposure to infectious blood and other body fluids as well as by sexual and vertical transmission, is frequently asymptomatic, particularly in the case of HCV. Acute infection can be followed by a period of clinical latency and thus the unwitting transmission of the virus to others. Though chronic infection with HBV is very uncommon in healthy adults, it occurs in over half of young children infected; 75–85% of people infected with HCV develop a chronic infection. After years or even decades of chronic, asymptomatic infection, cirrhosis of the liver and hepatocellular carcinoma can result. The WHO estimates that there are around 780,000 deaths from acute and chronic HBV infection, and more than 350,000 from chronic HCV infection, annually. Even more alarming is that currently 500 million people are chronically infected with either HBV or HCV.
As is the case with HIV/AIDS, avoiding exposure to infectious blood and semen, together with diagnostic testing of asymptomatic people, can help to contain the global viral hepatitis epidemic. Now that the pertinent characteristics of the disease have been elucidated, however, it should be far more feasible to control viral hepatitis than HIV/AIDS, a disease for which there is no vaccine and no drugs that actually eradicate the virus. There is a highly effective vaccine for HBV, and although approved drugs help prevent serious liver damage, they do not eliminate the virus. Drugs are now available that can eradicate HCV, and clinical trials are currently testing a vaccine for chronically infected people.
“Hepatitis: think again”. With appropriate education and adequate national and international funding, this looming global health crisis could be averted.
 


Liquid chromatography-tandem mass spectrometry: an introduction

The use of liquid chromatography-tandem mass spectrometry for clinical analysis is on the increase. This article describes what it is, why it can offer significant improvements over traditional assays and the limitations to be aware of.

by Dr N. Homer

Introduction
Clinical biochemistry laboratories frequently use radioimmunoassays (RIA) and enzyme-linked immunosorbent assays (ELISA) for analysis of blood and urine. However, these techniques are plagued by cross-reactivity issues and are suited to measuring only one analyte at a time [1]. The use of mass spectrometry (MS) techniques has increased since 2007, when the American Endocrine Society recognized the importance of tandem MS and issued a statement recommending liquid chromatography-tandem mass spectrometry (LC-MS/MS) over more traditional technologies such as immunoassays for the determination of endogenous steroid hormones [2]. This recommendation has led to the widespread adoption of LC-MS/MS in clinical biochemistry laboratories.

What is liquid chromatography-mass spectrometry?
Mass spectrometry is a technique that measures charged molecules or ions in the gaseous state. Samples are introduced into an ion source, ionized and then separated in a mass analyser according to their mass-to-charge ratio (m/z) and then characterized by their relative abundances. Coupled to chromatographic separation techniques such as gas chromatography (GC) or liquid chromatography (LC), MS is considered to be the ‘gold standard’ for validation of quantitative analytical assays. An overview of how a typical chromatograph-mass spectrometer is set up is shown in Figure 1.
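To make the mass-to-charge relationship concrete, the sketch below (illustrative Python; the function name is ours) computes the m/z expected for a protonated ion of the kind generated by positive-mode ESI. The cortisol mass used in the example is its approximate monoisotopic mass.

```python
# Illustrative m/z calculation for positive-mode ESI, where a molecule of
# monoisotopic mass M acquires n protons to form an [M+nH]n+ ion.
PROTON_MASS = 1.007276  # mass of a proton in daltons

def mz(monoisotopic_mass: float, charge: int) -> float:
    """m/z observed for an [M+nH]n+ ion."""
    return (monoisotopic_mass + charge * PROTON_MASS) / charge

# Cortisol (C21H30O5, monoisotopic mass ~362.209 Da) as a singly protonated ion:
print(round(mz(362.209, 1), 3))  # 363.216
```

Multiply charged ions, common for peptides and proteins in ESI, divide the proton-adjusted mass by the charge, which is why even large biomolecules fall within the m/z range of a quadrupole.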

Following separation by a chromatography system the sample is introduced into an ion source at the front end of the mass spectrometer. Ionization modes include atmospheric pressure ionization (API), such as electrospray ionization (ESI) and atmospheric pressure chemical ionization (APCI), and matrix-assisted laser desorption/ionization (MALDI). ESI is most typically used to ionize the biomolecules encountered in clinical samples.

Once ionized, the mass analyser separates the ions according to their m/z. Mass analysers include magnetic or electric sectors, time-of-flight (ToF) tubes, quadrupoles and two-dimensional and three-dimensional ion traps. Soft ionization techniques such as ESI, which generally leave the molecule intact, have favoured the use of quadrupole mass analysers. These consist of four parallel rods or poles, generally of hyperbolic cross-section, through which ions are passed and separated.

Tandem mass spectrometry (often termed MS/MS) significantly increases the specificity of MS. Tandem MS requires two or more mass analysers placed in sequence, with the ions fragmented in a collision cell to give structural information, and the instrument can be operated in a number of modes depending on the requirements of the experiment (Fig. 2). Trace analysis of complex biological matrices is ideally suited to tandem MS instruments operated in selected reaction monitoring (SRM) mode. In addition, the use of a linear ion trap as the third mass analyser is increasing in popularity, as it offers additional structural identification and specificity.


Sample preparation and liquid chromatography method development

Clinical samples are complex biological matrices and contain interferences that can lead to so-called matrix effects within the mass spectrometer. For validated assays, samples are prepared by addition of an internal standard followed by extraction to remove as much of the interferences as possible. The internal standard is either a closely related analogue of the compound of interest, or a stable isotope labelled version of the compound, enriched with at least two atoms of 13C, 2H or 15N.

Sample preparation methods commonly applied to clinical samples include protein precipitation with an organic solvent, liquid–liquid extraction (LLE) or solid-phase extraction (SPE). If sample clean-up is not sufficient it can lead to matrix effects, including ion suppression of the analyte, usually observed as a loss of response. This affects the detection limits, accuracy and precision of the assay. Various ion suppression tests have been developed and these are an important part of the method validation set-up required for clinical MS assays. The two most effective ways of avoiding ion suppression are improved sample extraction and optimized chromatographic selectivity.
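One widely used way of quantifying matrix effects during validation is to compare the analyte response in post-extraction spiked matrix with that in a neat standard. A minimal sketch of that calculation in Python (the peak areas and the function name are hypothetical):

```python
def matrix_effect_percent(area_matrix_spike: float, area_neat: float) -> float:
    """Matrix effect (%): analyte peak area in post-extraction spiked matrix
    relative to the same concentration in neat solvent.
    100% means no matrix effect; values below 100% indicate ion suppression."""
    return 100.0 * area_matrix_spike / area_neat

# Hypothetical areas: 7.2e5 counts in spiked extract vs 9.0e5 in neat standard
print(round(matrix_effect_percent(7.2e5, 9.0e5), 1))  # 80.0, i.e. ~20% suppression
```

A value well below 100% at the analyte's retention time is a flag that the extraction or the chromatographic selectivity needs further work.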

On-line multidimensional chromatography technology allows an unextracted sample to be introduced into the chromatography apparatus and can lead to faster analysis. These systems generally consist of multi-channel switching valves, on-line SPE cartridges and analytical columns ahead of ESI-LC-MS/MS. Steroids and isoprostanes are often analysed in this manner [3, 4].

Liquid chromatography
Once prepared, a sample is introduced into the LC system, which consists of a pump and an analytical column. The purpose of the chromatography system is to separate the components of the sample as much as possible before introduction into the mass spectrometer. Analytical LC columns are stainless steel tubes packed with tiny silica beads. The type of LC used in clinical analysis is usually reversed-phase chromatography, in which the silica beads are chemically modified with hydrophobic groups. Typically, samples are introduced onto the column in a highly aqueous mobile phase; the analytes associate with the chemically modified silica beads and are then eluted with a highly organic solvent such as methanol or acetonitrile.

Once the ionization and mass spectrometer parameters have been optimized, much of the method development falls to the chromatography, and the importance of this stage should not be underestimated. It is imperative that co-eluting compounds do not interfere with the analytical peaks of interest. In recent years there has been a trend towards fast analysis in LC-MS/MS; however, this does not always give a robust assay. In addition, it is important to be aware of isobaric compounds (same mass) and [M+2] isotopomers. An example is the stress hormone cortisol (m/z 363) and its inactive form cortisone (m/z 361). In an LC-MS/MS assay for cortisol it is essential to resolve cortisol and cortisone as two separate peaks; otherwise the [M+2] isotopomers of cortisone would contribute to the cortisol signal, leading to over-estimation of cortisol in the sample (Fig. 3).
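The scale of the cortisone interference can be estimated from natural isotope abundances. As a rough sketch in Python (carbon-only contribution, ignoring the additional M+2 contribution from 18O), the fraction of cortisone molecules carrying two 13C atoms, and therefore appearing at m/z 363 alongside cortisol, is a simple binomial calculation:

```python
from math import comb

def m_plus_2_carbon_fraction(n_carbons: int, p13c: float = 0.0107) -> float:
    """Fraction of molecules containing exactly two 13C atoms, i.e. the
    carbon-only contribution to the M+2 isotope peak (binomial model,
    ~1.07% natural 13C abundance)."""
    return comb(n_carbons, 2) * p13c**2 * (1 - p13c)**(n_carbons - 2)

# Cortisone has 21 carbons; its M+2 isotopologue falls at m/z 363,
# exactly where protonated cortisol is monitored.
print(f"{m_plus_2_carbon_fraction(21):.2%}")  # roughly 2%
```

Even a ~2% spillover matters when cortisone is abundant relative to cortisol, which is why chromatographic separation of the two peaks is essential.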

There are a number of parameters that can be altered in LC, and these in turn alter the selectivity of the column, that is, the order and rate at which the components elute. These parameters include column temperature; mobile phase pH, composition and flow rate; column dimensions; particle size; and, of course, the chemical modification of the particles.

Analytical LC column technology is continuously improving. The better the resolution, which is simply how well separated the peaks are, the better the assay. Sub-2 µm particles, introduced in the past decade, generate sharp peaks and excellent resolution with improved capacity over the more traditional 3–5 µm particles. However, the smaller particle size leads to high backpressure and requires specific LC pumps that can withstand these ultra-high pressures (UHPLC, ultra-high performance LC). To reduce the need for new instrumentation, LC columns packed with fused-core particles of ~2.5 µm have been developed that allow separation comparable to sub-2 µm particles. The backpressure generated by these fused-core particles is significantly lower than that of sub-2 µm particle columns, eliminating the requirement for high-pressure-capable LC pumps and fittings.
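Resolution itself has a simple quantitative definition. The sketch below (illustrative Python; the retention times and baseline peak widths are hypothetical) uses the standard formula Rs = 2(t2 - t1)/(w1 + w2), with Rs of 1.5 or above conventionally taken as baseline separation:

```python
def resolution(t1: float, t2: float, w1: float, w2: float) -> float:
    """Chromatographic resolution Rs = 2*(t2 - t1)/(w1 + w2), using
    retention times (t) and baseline peak widths (w) in the same units."""
    return 2 * (t2 - t1) / (w1 + w2)

# Hypothetical peak pair: retention times 2.0 and 2.3 min, widths 0.2 min each
print(round(resolution(2.0, 2.3, 0.2, 0.2), 2))  # 1.5 -> just baseline-resolved
```

Smaller particles sharpen the peaks, shrinking w1 and w2 and raising Rs without any change in retention times, which is the practical benefit of the sub-2 µm and fused-core columns described above.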

Considerations when establishing an LC-MS/MS clinical biochemistry method
As with all techniques, there are drawbacks to LC-MS/MS. The instrumentation and software can be complex and require regular maintenance, although manufacturers are addressing this by introducing simpler software interfaces, dedicated instrument support for method development and even provider-guaranteed methods. In addition, some compounds are not readily ionized because of their chemical nature, but chemical modification before analysis can improve ionization efficiency.

Summary
The benefits that MS offers over other traditional assay techniques have seen an increase in the number of assays using this methodology. The analysis of steroid hormones by MS is a well-documented area. Other commonly encountered uses include newborn screening for congenital metabolic diseases such as aminoacidopathies and fatty acid oxidation disorders, multi-analyte therapeutic drug monitoring, oncology drugs, anti-virals, toxicants and drugs of abuse screening and analysis of endogenous peptides [3, 4, 5].

One area that is continuing to gain interest in clinical research is high-resolution MS (HRMS) [5]. This allows for accurate mass determination over a defined mass range, which differs from the targeted analysis approach used by triple quadrupole MS. With technological improvement in the linear range of HRMS instruments to match that of triple quadrupoles, it seems likely that the benefits of HRMS will also be exploited by the clinical biochemistry field, in addition to LC-MS/MS analysis.

The range of clinical applications of MS outlined is broad and constantly expanding. Much research is being conducted in the pioneering fields of proteomics and metabolomics. In recent years the emergence of imaging mass spectrometry also offers exciting possibilities for the future and there is no doubt that MS will continue to feature heavily in the clinical biochemistry laboratory and function as an important clinical research tool.

References
1. Penning TM, et al. Liquid chromatography-mass spectrometry (LC-MS) of steroid hormone metabolites and its applications. J Steroid Biochem Mol Biol. 2010; 121: 546–555.
2. Rosner W, et al. Position statement: utility, limitations, and pitfalls in measuring testosterone: an Endocrine Society position statement. J Clin Endocrinol Metab. 2007; 92: 405–413.
3. Shushan B. A review of clinical diagnostic applications of liquid chromatography-tandem mass spectrometry. Mass Spectrom Rev. 2010; 29: 930–944.
4. Chace DH, et al. Use of tandem mass spectrometry for multianalyte screening of dried blood specimens from newborns. Clin Chem. 2003; 49: 1797–1817.
5. Jiwan J-LH, et al. HPLC-high resolution mass spectrometry in clinical laboratory? Clin Biochem. 2011; 44: 136–147.

The author

Natalie Homer PhD
CRF Mass Spectrometry Core Laboratory, Queen’s Medical Research Institute, University of Edinburgh
E-mail: n.z.m.homer@ed.ac.uk


Measuring cigarette smoke exposure: quantification of cotinine by LC-MS/MS

Smoking is a major cause of morbidity and mortality worldwide. The adverse health effects of chronic cigarette smoke exposure are widely known. Active smoking increases the risk of developing several pathologies including pulmonary disease, cardiovascular disease and cancer. Importantly, the sequelae of smoking also extend to non-smokers via frequent passive inhalation. Accurate measures of cigarette smoke exposure are therefore required to draw meaningful conclusions about the healthcare risks to both smokers and non-smokers. Cotinine is the major primary metabolite of nicotine and is the biochemical marker of choice for measuring exposure to cigarette smoke.

by Dr A. Dunlop, Dr B. L. Croal and J. Allison

Background
Chronic exposure to tobacco products is amongst the leading causes of preventable morbidity and mortality worldwide, being responsible for approximately 6 million deaths per annum [1]. Typically this involves inhalation of cigarette smoke, which contains in excess of 5000 different chemicals, many of which are known toxins and carcinogens [2]. Upon inhalation of cigarette smoke, nicotine is transported to the lungs within tar droplets, dissolving in the alveolar fluid, and is then absorbed into the bloodstream. Following entry into the pulmonary circulation, nicotine quickly travels to the brain – within a matter of seconds – and exerts its pharmacological effects [3].

Nicotine is the addictive component of tobacco products, stimulating dopamine release in the brain and leading to heightened feelings of pleasure and reward [4]. In active smokers this nicotine dependence sustains chronic exposure to the toxins present in cigarette smoke [5]. Active smokers are therefore at increased risk of developing multiple pathologies including pulmonary disease, cardiovascular disease and cancer [6, 7]. Importantly, non-smokers are also at increased risk via involuntary or passive/second-hand smoke (SHS) exposure [8, 9]. Children are particularly susceptible to involuntary exposure, mainly occurring in enclosed spaces such as the parental home/car, via maternal smoking or passive exposure during pregnancy [10]. The adverse health effects of SHS exposure in children include increased risk of miscarriage, sudden infant death syndrome, lower respiratory tract infections, asthma and invasive meningococcal disease [10].

In addition, an emerging area of interest surrounds involuntary exposure via so-called third-hand smoke (THS). THS is a term used to describe the deposits of tobacco smoke that accumulate on surfaces, objects and in dust particles, persisting long after the dispersal of cigarette smoke. There is some evidence to suggest that atmospheric reactions may lead to re-release of smoke-derived toxins into the environment [11]. However, the health risks of THS are not yet known and remain the subject of ongoing research [12].

Assessing cigarette smoke exposure
The healthcare risks associated with cigarette smoking and SHS exposure mean that smoking status should always be included in any routine clinical assessment. Monitoring of smoking status may also be indicated in specific circumstances, such as epidemiological studies, smoking cessation programmes, lung transplant patients, and employee and health/life insurance screening. The most convenient and cost-effective means of assessing cigarette smoke exposure is by self-report. This may occur either during face-to-face consultation with healthcare professionals or, often, as part of a generic healthcare questionnaire. However, self-report is frequently unreliable in estimating smoking status [13].

Moreover, the risk and extent of SHS exposure to non-smokers cannot be adequately assessed using these methods. For example, self-report cannot reliably quantify exposure in those who co-habit and/or socialise with smokers nor can it inform on fetal exposure in maternal smoking. Consequently, cigarette smoke exposure should be accurately quantified by measuring biomarkers to draw meaningful conclusions between smoking status and health outcomes [14, 15].

Biomarkers of cigarette smoke exposure
Numerous biomarkers have been examined in the analysis of cigarette smoke exposure, e.g. carbon monoxide, carboxyhaemoglobin, thiocyanate and polycyclic aromatic hydrocarbons [4]. However, many are non-specific for tobacco use and contribution from other environmental or dietary sources can cause interference [4]. In contrast, nicotine is a more specific marker of cigarette smoke exposure, being derived solely from tobacco [3]. Biochemical measurements of nicotine and its metabolites then are typically used to provide reliable measures of cigarette smoke exposure. Nicotine largely undergoes hepatic metabolism (with a half-life of approximately 2 h) and the plasma of active smokers typically contains 10–50 ng/mL of nicotine [3]. Cotinine is the major breakdown product of nicotine accounting for around 80% of all metabolites [3]. The half-life of cotinine, at around 16 h, is substantially longer than nicotine and plasma levels in active smokers are approximately 250–300 ng/mL [4]. Consequently, cotinine is the preferred biomarker for measuring cigarette smoke exposure.
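The advantage of cotinine's longer half-life is easy to quantify. Assuming simple first-order elimination and the half-lives quoted above (~2 h for nicotine, ~16 h for cotinine), a short Python sketch shows how much of each marker survives an overnight gap before sampling:

```python
def fraction_remaining(hours: float, half_life_h: float) -> float:
    """First-order elimination: fraction of a marker remaining after `hours`."""
    return 0.5 ** (hours / half_life_h)

# 12 h after last exposure, e.g. an overnight gap before a clinic visit:
print(f"nicotine: {fraction_remaining(12, 2):.1%}")   # 1.6% remains
print(f"cotinine: {fraction_remaining(12, 16):.1%}")  # 59.5% remains
```

Nicotine has all but disappeared while most of the cotinine is still present, which is precisely why cotinine is the preferred biomarker.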

Quantifying cotinine in biological matrices
A variety of methods have been developed for quantification of cotinine in several biological matrices including urine, blood, saliva and hair [14, 15]. There is good agreement between cotinine levels in plasma/serum and saliva, whilst levels in urine are typically higher [15].

Immunoassay methods have traditionally been used for the detection of cotinine in urine, offering rapid turnaround with minimal sample preparation. In addition, commercially available immunoassay kits are easily integrated into most core automated analysers available in modern clinical laboratories. However, reagent costs are typically high, and immunoassays may be susceptible to cross-reactivity with other nicotine- and cotinine-derived metabolites and thus of questionable accuracy [16, 17].

Gas chromatography–mass spectrometry (GC-MS) methods are also available, although sample preparation is typically labour-intensive and time-consuming, making them impractical for high sample throughput. Not surprisingly, liquid chromatography-tandem mass spectrometry (LC-MS/MS) methods have emerged as the method of choice for quantification of cotinine in biological fluids.

LC-MS/MS analyses
Liquid chromatography–tandem mass spectrometry (LC-MS/MS) affords the requisite specificity and sensitivity to detect and quantify cotinine at levels encountered throughout the spectrum of cigarette smoke exposure. The majority of recently published methods now routinely quote lower limits of quantification (LLOQ) of <0.5 ng/mL in both plasma/serum and urine [15]. Cut-points to distinguish smokers from non-smokers have been variously proposed, from 12 ng/mL down to 3 ng/mL, depending on the population [15]. Nevertheless, regular active smokers can be expected to have serum/plasma cotinine levels well in excess of 100 ng/mL, whereas non-smokers are usually comfortably below 10 ng/mL.

The majority of LC-MS/MS methods for cotinine have been developed in-house, an important advantage compared with immunoassay techniques. This not only affords flexibility in the choice of matrix to be analysed but also permits the inclusion of more than one analyte in the assay. Thus nicotine, cotinine and various metabolites thereof may be detected in a multiplexed assay. Published guidelines are also widely available to assist in the development and validation of LC-MS/MS methods [18].

The advent of enhanced chromatographic separation techniques, such as ultra-performance liquid chromatography (UPLC), has significantly shortened run times, thereby facilitating higher sample throughput. Development of uncomplicated sample preparation procedures has further simplified analyses. For example, in our own laboratory we recently developed a rapid and straightforward UPLC-MS/MS protocol for the determination of cotinine in plasma (Fig. 1) [19]. Analytical run time was 4 min per sample with an LLOQ of 0.2 ng/mL, and the assay was linear from 0.5 to 1000 ng/mL, comfortably covering the concentration range of active and non-smokers (Fig. 2).
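Interpreting a measured concentration against a cut-point reduces to a one-line rule. The sketch below is purely illustrative: the function name is ours, and the 10 ng/mL default is an assumption sitting within the 3–12 ng/mL range of proposed cut-points, not a validated threshold:

```python
def classify_smoking_status(cotinine_ng_ml: float, cut_point: float = 10.0) -> str:
    """Illustrative serum/plasma cotinine classification. The cut-point is
    population-dependent (values from 3 to 12 ng/mL have been proposed);
    10 ng/mL here is an assumption for the sketch, not a validated threshold."""
    return "smoker" if cotinine_ng_ml >= cut_point else "non-smoker"

print(classify_smoking_status(250.0))  # typical active smoker -> smoker
print(classify_smoking_status(1.2))    # -> non-smoker
```

In practice the cut-point must be validated for the matrix and population under study, since passive exposure can push non-smokers' levels towards the lower cut-points.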
A simple 5-step automated SPE process was also developed, permitting minimal sample handling and using only water and methanol, both cheap and readily available. To date we have successfully deployed this method for the analyses of two large patient cohorts (each comprising several hundred samples) associated with independent epidemiological studies. Although the initial outlay for equipment is high, thereafter LC-MS/MS assays can be run relatively cheaply using readily available, inexpensive solvents. Furthermore, sample preparation procedures can usually be streamlined and therefore easily adapted for high-throughput analyses [19]. Matrix effects, chiefly ion suppression, are a particular disadvantage of LC-MS/MS techniques; however, careful consideration and troubleshooting during method development can often overcome this issue [20].

Conclusions and future directions
Despite widespread awareness of the adverse effects of tobacco use and increasing public health initiatives to combat this, cigarette smoking continues to be a major global cause of morbidity and mortality and is likely to remain so for the foreseeable future. Accurate quantification of cigarette smoke exposure via biomarkers is therefore an important measure in stratifying the risk of both active and non-smokers.

The need to quantify ever decreasing amounts of nicotine, cotinine and their metabolites in monitoring exposure to tobacco products ensures that LC-MS/MS techniques and modifications thereof remain at the forefront of detection methods in this field. Similarly, as new biomarkers become available which inform on the detrimental health effects of smoking these methods are ideally placed to keep pace, both in research and in clinical laboratories.

The recent emergence of electronic cigarette devices (e-cigarettes) is currently the subject of much debate. E-cigarettes typically deliver nicotine in a vapour generated by heating a liquid that also contains propylene glycol and other additives, e.g. flavouring [21]. Proponents present e-cigarettes as a safer alternative to smoking associated with tobacco combustion and promote their benefits for smoking cessation. However, some healthcare professionals believe that, while e-cigarettes may be safer, they could still act as a gateway to smoking or prolong or even enhance dependency on nicotine. In addition, the long-term health effects of these products are unknown, as is whether biomarkers such as nicotine and/or cotinine will need to be monitored in so-called ‘e-smokers’.

References
1. WHO report on the global tobacco epidemic 2013. http://www.who.int/tobacco/global_report/2013/en/
2. Talhout R, et al. Hazardous compounds in tobacco smoke. Int J Environ Res Public Health. 2011; 8(2): 613–628.
3. Hukkanen J, et al. Metabolism and disposition kinetics of nicotine. Pharmacol Rev. 2005; 57(1): 79–115.
4. Benowitz NL, et al. Nicotine chemistry, metabolism, kinetics and biomarkers. Handb Exp Pharmacol. 2009; 192(192): 29–60.
5. Berrendero F, et al. Neurobiological mechanisms involved in nicotine dependence and reward: participation of the endogenous opioid system. Neurosci Biobehav Rev. 2010; 35(2): 220–231.
6. Doll R, et al. Mortality in relation to smoking: 50 years’ observations on male British doctors. BMJ. 2004; 328(7455): 1519.
7. Jha P. Avoidable global cancer deaths and total deaths from smoking. Nat Rev Cancer. 2011; 9(9): 655–664.
8. Scientific Committee on Tobacco and Health. Secondhand smoke: Review of evidence since 1998. Update of evidence on health effects of secondhand smoke.  Department of Health, UK 2004. http://www.smokefreeengland.co.uk/files/scoth_secondhandsmoke.pdf
9. Vardavas CI, Panagiotakos DB. The causal relationship between passive smoking and inflammation on the development of cardiovascular disease: a review of the evidence. Inflamm Allergy Drug Targets. 2009; 8(5): 328–333.
10. Action on Smoking and Health. Research Report. Secondhand smoke: the impact on children. March 2014. http://www.ash.org.uk/files/documents/ASH_596.pdf
11. Sleiman M, et al. Formation of carcinogens indoors by surface-mediated reactions of nicotine with nitrous acid, leading to potential thirdhand smoke hazards. PNAS 2010; 107(15): 6576–6581.
12. Matt GE, et al. Thirdhand tobacco smoke: emerging evidence and arguments for a multidisciplinary research agenda. Environ Health Perspect. 2011; 119(9): 1218–1226.
13. Connor Gorber S, et al. The accuracy of self-reported smoking: a systematic review of the relationship between self-reported and cotinine-assessed smoking status. Nicotine Tob Res. 2009; 11(1): 12–24.
14. Florescu A, et al. Methods for quantification of exposure to cigarette smoking and environmental tobacco smoke: focus on developmental toxicology. Ther Drug Monit. 2009; 31(1): 14–30.
15. Avila-Tang E, et al. Assessing secondhand smoke using biological markers. Tob Control. 2013; 22: 164–171.
16. Schepers G, Walk RA. Cotinine determination by immunoassays may be influenced by other nicotine metabolites. Arch Toxicol. 1988; 62(5): 395–397.
17. Tate J, Ward G. Interferences in immunoassay. Clin Biochem Rev. 2004; 25(2): 105–120.
18. Honour JW. Development and validation of a quantitative assay based on tandem mass spectrometry. Ann Clin Biochem. 2011; 48(2): 97–111.
19. Dunlop AJ, et al. Determination of cotinine by LC-MS-MS with automated solid-phase extraction. J Chromatogr Sci. 2014; 52(4): 351–356.
20. Matuszewski BK, et al. Strategies for the assessment of matrix effect in quantitative bioanalytical methods based on HPLC-MS/MS. Anal Chem. 2003; 75(13): 3019–3030.
21. Grana R, et al. E-cigarettes: a scientific review. Circulation. 2014; 129(19): 1972–1986.

The authors
Allan Dunlop1* PhD, Bernard Croal2 MD and James Allison2 BSc
1Department of Clinical Biochemistry Laboratory, Southern General Hospital, Glasgow G51 4TF, UK
2Department of Clinical Biochemistry, Aberdeen Royal Infirmary, Aberdeen AB25 2ZD, UK
*Corresponding author
E-mail: allandunlop@nhs.net


Next-generation sequencing in clinical diagnostics and genomics research

The UK prime minister recently announced an investment package worth £300 million for genomic research. This will include the sequencing of 100,000 genomes by 2017. The project, driven by Genomics England, will have a major impact on many areas of healthcare. Next-generation sequencing (NGS) technology is the method by which this sequencing will be achieved, and NGS is already being used in many healthcare services.

by Dr K. Gilmour

Background
Sequencing of the first human genome took 10 years to complete at a cost of around USD 3 billion. Although genomics has been hailed as the future of medicine, the costs associated with sequencing were considered prohibitive. Scientists proposed that large-scale projects would be required to decipher the secrets within each genome and how they connect with disease susceptibility, progression and treatment. In 2005 next-generation sequencing (NGS) became commercially available and in the 9 years since has transformed genomics beyond all recognition. Large-scale projects are now financially feasible and the potential of genomics and its link with healthcare can finally be realized.
Different NGS technologies are commercially available, with Illumina and Ion Torrent™ (Life Technologies) probably considered the market leaders. Some NGS instruments can generate a terabase of sequence data in a single run. This equates to around 500 human genomes a week, each costing close to the USD 1000 mark in reagents, a figure long hailed as the ultimate goal. NGS is faster, vastly higher-throughput and much more sensitive than traditional Sanger sequencing and will contribute directly to improvements in diagnostic medicine, personalized medicine and medical research.

An overview of NGS technology
The details of the NGS workflow differ from technology to technology but the main principle remains the same. Extracted DNA from human, animal or microbe sources is turned into a ‘library’ of DNA. This usually involves making the large pieces of DNA smaller (fragmenting) and then adding special handles known as ‘adapter DNA’ to the ends of each of the DNA fragments (Fig. 1). Adapters are merely small pieces of DNA of known sequence, which can be used to manipulate the fragments of DNA in order to sequence them. This manipulation includes tethering the individual fragments to either a slide or a tiny bead, onto which the fragment is clonally amplified, producing millions of DNA molecules all of the same sequence. The whole library of different clonally amplified fragments is then sequenced simultaneously.

NGS sequencing chemistry produces a detectable ‘signal’. This signal is often fluorescent, so each time a single nucleotide (A, G, C or T) is incorporated into a DNA molecule a tiny amount of light is emitted and detected. The individual sequence produced is known as a ‘read’, and once the millions of small reads in the reaction have been generated they are aligned and assembled via computer algorithms into much longer sequences. Because millions of reads are generated, even molecules of low abundance can be sampled, making this technique extremely sensitive.

Large sequencers able to generate hundreds of human genome sequences a week can be used in high-throughput research projects. Small, fast bench-top sequencers are also available and are highly suited to the demands of a clinical laboratory.
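The throughput figures above follow from simple arithmetic: mean sequencing depth (x-fold coverage) is the total number of bases read divided by the genome size. A minimal sketch with hypothetical run numbers:

```python
def mean_depth(n_reads: int, read_length_bp: int, genome_size_bp: float) -> float:
    """Mean sequencing depth (x-fold coverage) = total bases read / genome size."""
    return n_reads * read_length_bp / genome_size_bp

# Hypothetical run: 2 billion reads of 150 bp over a ~3.1 Gb human genome
print(round(mean_depth(2_000_000_000, 150, 3.1e9), 1))  # ~96.8x mean coverage
```

High mean depth is what lets low-abundance molecules be sampled reliably, which underlies the sensitivity described above.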

Human genomics
Identifying the genes involved in rare disorders can help doctors to diagnose and understand the underlying cause and nature of the disease and, in turn, determine what treatment a patient requires. Genomics offers a global look at all genes and how they interact, instead of focusing on specific genes and biochemical pathways. Sequencing the exomes (the parts of the genome that encode genes) of only a few people with a rare genetic disorder can locate the mutated gene involved [1]. Genome-wide association studies (GWAS) are also allowing researchers to identify genes associated with many common diseases, helping to predict how likely people are to suffer from specific diseases, such as Parkinson’s disease, in their lifetime [2].

NGS in non-invasive prenatal diagnosis
The sensitivity of NGS makes it ideal for non-invasive prenatal diagnosis of fetal aneuploidies. During pregnancy, maternal blood contains cell-free fetal DNA at very low concentrations. NGS can be used to pick up anomalies in this DNA, so a simple blood test can replace invasive techniques [3].
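One common counting-based approach — sketched here with invented numbers, not a validated clinical pipeline — compares the fraction of reads mapping to the chromosome of interest against a reference set of euploid (chromosomally normal) pregnancies; a trisomy shows up as a statistically significant over-representation:

```python
import statistics

def aneuploidy_z_score(chr_fraction, euploid_fractions):
    """Z-score of a test sample's per-chromosome read fraction against
    a reference set of euploid pregnancies; |z| > 3 is a common flag."""
    mean = statistics.mean(euploid_fractions)
    sd = statistics.stdev(euploid_fractions)
    return (chr_fraction - mean) / sd

# Hypothetical numbers: fraction of all reads mapping to chromosome 21
# in six euploid reference samples, then in the test sample.
euploid = [0.01310, 0.01305, 0.01298, 0.01312, 0.01301, 0.01308]
test_sample = 0.01330  # slight over-representation of chr21
z = aneuploidy_z_score(test_sample, euploid)
print(f"z = {z:.1f}, flagged: {z > 3}")
```

Because the fetal DNA is such a small fraction of the total, the excess signal from an extra chromosome copy is tiny — which is exactly why the depth and sensitivity of NGS are needed.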

Personalized medicine
The ability to stratify patient responses to drugs based on an individual’s genetic make-up has revolutionised how drug trials are performed and the speed at which new drugs reach the manufacturing stage. In cancer medicine, determining the genetic profile of a patient’s tumour can predict which drugs the tumour will potentially respond to, reducing the likelihood of exposure to a drug with severe side effects and no clinical benefit [4]. Currently, tumours of many cancer types are regularly tested for individual gene mutations, the results of which determine the treatment. As research reveals further biomarkers of drug response, multiple genes will need to be tested. It is no longer cost effective to test for each of these biomarkers individually, and NGS offers the ability to sequence all or part of the tumour genome. The sensitivity of NGS allows mutations to be detected in tissue that contains only a small number of tumour cells. In most hospitals tumour tissue is formalin fixed and embedded in paraffin (FFPE) before being sectioned and mounted on slides for histopathology review. This process can often lead to DNA damage, including fragmentation, rendering the DNA useless for some molecular techniques. As NGS relies on short DNA fragments, FFPE-extracted DNA can still be used [5].
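The sensitivity argument can be made concrete with a toy variant-allele-frequency (VAF) calculation — the read counts below are invented for illustration, not from the cited study:

```python
def variant_allele_frequency(ref_count, alt_count):
    """Fraction of reads at a position carrying the variant allele."""
    total = ref_count + alt_count
    return alt_count / total if total else 0.0

# Hypothetical pileup at a mutation hotspot in a low-purity FFPE
# sample: 940 reads match the reference, 60 carry the mutation.
vaf = variant_allele_frequency(ref_count=940, alt_count=60)
print(f"{vaf:.1%}")  # 6.0%
```

A 6% variant fraction like this is readily detectable by deep NGS because each individual read either carries the mutation or it does not, whereas bulk methods see only the averaged signal.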

NGS in microbiology
In order to prescribe the correct antiretroviral drugs, the resistance genes of the HIV strain a patient carries are often sequenced. Sanger sequencing can only detect a drug-resistance mutation if it is present in at least around 20% of the viral population. ‘Deep sequencing’, in which each region of the genome is sequenced many times over using NGS, can detect resistance genes present in less than 1% of the viral population [6]. Outbreaks of dangerous Escherichia coli strains can now be detected early, and their spread prevented, because of the speed at which the sequencing and reconstruction of the relationships between isolated strains can be achieved [7]. NGS continues to grow as the technology of choice in microbiology.
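Detection of a 1% minority variant depends directly on sequencing depth. A simple binomial sketch (the thresholds here — at least 5 supporting reads, 99% detection probability — are illustrative choices, not figures from the cited work) estimates how deep one must sequence:

```python
from math import comb

def prob_at_least_k(depth, freq, k):
    """P(at least k reads carry the variant) under binomial sampling."""
    return 1.0 - sum(comb(depth, i) * freq**i * (1 - freq)**(depth - i)
                     for i in range(k))

def min_depth(freq, k=5, target=0.99):
    """Smallest depth giving probability >= target of observing
    at least k variant-supporting reads."""
    depth = k
    while prob_at_least_k(depth, freq, k) < target:
        depth += 1
    return depth

# Depth needed to reliably see a variant present in 1% of the population:
print(min_depth(0.01))
```

The answer runs to over a thousand reads covering the position — trivial for NGS 'deep sequencing', but far beyond what a single Sanger trace can offer.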

Possible problems with NGS
With any new technology or venture on the scale of the Genomics England ‘100,000 Genomes Project’ there are potential problems.

Data analysis
The availability of small bench-top sequencers means that even small diagnostic labs will be able to use NGS. Different NGS platforms generate different types of data with differing degrees of quality. Because of the inherent errors of enzyme-driven sequencing and the variability in the sequencing signals generated, a host of clever computer algorithms is needed to determine the likelihood that every base in the sequence is correct. The algorithms used for these analyses are often sold packaged as software or analysis pipelines, or are designed by in-house bioinformaticians. With the misinterpretation of sequence data carrying such dire consequences, robust data analysis is paramount. Illumina technology will be used for all the sequence data generated by the 100,000 Genomes Project, so all data will likely be handled, processed and analysed in a very similar manner, leading to reproducible and robust results. Other clinical laboratories entering the sequencing revolution will be bombarded with options for technology as well as analysis methods. Clinical laboratories in most countries adhere to a set of rigorous assessments and standards, and all clinical tests must be fully validated. Validation of NGS is complicated, but best-practice guidelines aim to simplify the process. ‘Targeted sequencing’, in which panels of only a few to a few hundred clinically relevant genes are sequenced, makes validation and analysis easier. Unifying analysis processes will remain an important consideration in the future.
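The per-base ‘likelihood of being correct’ mentioned above is conventionally expressed as a Phred quality score, Q = −10·log₁₀(P_error), attached to every base in the output files. A minimal sketch of the standard conversion:

```python
from math import log10

def phred(p_error):
    """Phred quality score: Q = -10 * log10(P_error)."""
    return -10 * log10(p_error)

def error_prob(q):
    """Invert: probability the base call is wrong, given quality Q."""
    return 10 ** (-q / 10)

# Q30 - a common benchmark for a good base call - corresponds to a
# 1-in-1000 chance that the base is wrong; Q20 to 1-in-100.
print(round(phred(0.001), 1))   # 30.0
print(error_prob(20))           # 0.01
```

Variant-calling algorithms combine these per-base probabilities across all the reads covering a position, which is why read depth and base quality together determine how confidently a mutation can be reported.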

Data storage and security
The 100,000 Genomes Project will produce petabytes of data, but even small diagnostic labs will be producing large quantities. Targeted gene panels will help, but data storage could still be an issue. NGS generates sequence files and associated raw data files, and what should be stored and what discarded is still debated. The Royal College of Pathologists guidelines recommend that data and records pertaining to pathology tests be retained for a minimum of 25 years. DNA sequence is of a highly sensitive nature: even without patient details attached, it contains all the information needed to link it to the individual from whom it was taken. Secure storage of DNA sequence, with compression and encryption, is an important consideration. The Medical Research Council in the UK has earmarked £24 million of the Genomics England funding for computing power, including analysis and secure storage.
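As a rough illustration of why compression matters for sequence files (the FASTQ-style record below is invented), even Python's standard gzip module achieves large savings on the highly repetitive text that sequencers emit; real archival pipelines would layer encryption on top, for example with a dedicated cryptography library:

```python
import gzip

# A single hypothetical FASTQ record: read ID, bases, '+', qualities.
record = (b"@read_001\n"
          b"ACGTACGTACGTACGTACGTACGTACGTACGTACGT\n"
          b"+\n"
          b"IIIIIIIIIIIIIIIIIIIIIIIIIIIIIIIIIIII\n")

fastq = record * 1000          # stand-in for a real sequence file
packed = gzip.compress(fastq)
print(len(fastq), len(packed))  # compressed size is a small fraction
```

Sequence data compresses well because it uses a four-letter alphabet with long repeated runs, which is why gzipped FASTQ is the de facto interchange format; what to keep — raw signals, reads, or only the final variant calls — remains the harder storage question.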

Ethical implications
The mainstream adoption of any new technology has ethical implications. Whilst sequencing a patient’s tumour to determine a cancer treatment plan, another gene mutation could be identified that is unrelated to the condition being treated. In the UK all patients must consent to any germ-line genetic test. Genetic counselling is offered, and patients are helped to come to terms with the implications of the findings. Serendipitous discoveries have the potential to create many ethical dilemmas for clinicians.

The future: a learning healthcare system
Although powerful, medical genomics has so far not had the major impact on healthcare predicted at the time of the release of the first human genome sequence. The 100,000 Genomes Project hopes to change that by linking genomic data with the medical records of each patient. This means that research data can be actively generated as the project progresses. Every person consenting to the project will be a walking research project from which we can learn important lessons about treatment and response [8]. This could transform the UK healthcare system into a learning environment like no other in the world, generating the evidence on which future improvements can be made. With strong collaborative partnerships set up with Illumina, the Wellcome Trust Sanger Institute, the Medical Research Council and Cancer Research UK, to name but a few, the Genomics England project has the potential to be a great success.
So-called ‘third-generation sequencing’ technology is already a reality, and NGS chemistries are continually evolving and improving. Although it is unlikely that every person in the country will have their genome sequenced in the very near future, NGS is already contributing massively to healthcare improvements in genomics and other clinical diagnostic areas.

References
1. Boycott KM, Vanstone MR, Bulman DE, MacKenzie AE. Rare-disease genetics in the era of next-generation sequencing: discovery to translation. Nat Rev Genet. 2013; 14(10): 681–691.
2. Nalls MA, Pankratz N, Lill CM, Do CB, Hernandez DG, Saad M, DeStefano AL, Kara E, Bras J, et al. Large-scale meta-analysis of genome-wide association data identifies six new risk loci for Parkinson’s disease. Nat Genet. 2014; doi: 10.1038/ng.3043. [Epub ahead of print].
3. Nepomnyashchaya YN, Artemov AV, Roumiantsev SA, Roumyantsev AG, Zhavoronkov A. Non-invasive prenatal diagnostics of aneuploidy using next-generation DNA sequencing technologies, and clinical considerations. Clin Chem Lab Med. 2013; 51(6): 1141–1154.
4. Jackson SE, Chester JD. Personalised cancer medicine. Int J Cancer 2014; doi: 10.1002/ijc.28940. [Epub ahead of print].
5. Fairley JA, Gilmour K, Walsh K. Making the most of pathological specimens: molecular diagnosis in formalin-fixed, paraffin embedded tissue. Curr Drug Targets 2012; 13(12): 1475–1487.
6. Gibson RM, Schmotzer CL, Quiñones-Mateu ME. Next-generation sequencing to help monitor patients infected with HIV: ready for clinical use? Curr Infect Dis Rep. 2014; 16(4): 401.
7. Veenemans J, Overdevest IT, Snelders E, Willemsen I, Hendriks Y, Adesokan A, Doran G, Bruso S, Rolfe A, Pettersson A, Kluytmans JA. Next-generation sequencing for typing and detection of resistance genes: performance of a new commercial method during an outbreak of extended-spectrum-beta-lactamase-producing Escherichia coli. J Clin Microbiol. 2014; 52(7): 2454–2460.
8. Ginsburg G. Medical genomics: gather and use genetic data in health care. Nature 2014; 508(7497): 451–453.

The author
Katelyn Gilmour PhD
Molecular Pathology, Dept. Laboratory Medicine, Royal Infirmary of Edinburgh, Edinburgh EH16 4SA, UK
*Corresponding author
E-mail: Katelyn.gilmour@nhslothian.scot.nhs.uk