Ovarian reserve and beyond: AMH’s role in women’s reproductive health

The human gene for anti-Müllerian hormone (AMH) was isolated and sequenced in 1986 [1], and the first immunoassays were developed in 1990 [2, 3]. Since then, our understanding of this hormone has grown considerably, with most clinical use today focused on women’s reproductive health. Because AMH reflects the number of small antral and pre-antral follicles present in the ovaries, and therefore the ovarian reserve, its measurement has found a wide array of clinical applications.

One of the first was as a tumour marker in the diagnosis and follow-up of women with ovarian granulosa cell tumours (GCT) [4, 5]. More recently, with the dramatic improvements in the treatment of childhood cancers, attention has focused on AMH to assess the likelihood of gonadal damage and infertility after treatment. It is also being used to investigate the toxicity of different therapeutic regimens, to guide the choice of those treatments, and to predict (and potentially preserve) fertility in young women and children following cancer therapy.

Sensitive diagnostic marker for GCT
GCT accounts for 2–3% of all ovarian tumours and occurs in two distinct forms: juvenile and adult. The more common adult form generally presents in women around 50 years of age. A majority have endocrine manifestations as a direct consequence of hormone secretion by the tumour [6].

GCTs have the potential to secrete estradiol, inhibin (A and B) and AMH. Inhibin and AMH are the more useful biomarkers, since estradiol is produced in only 50–60% of GCT patients and depends on stimulation by testosterone from adjacent theca cells. While serum total inhibin is secreted in almost all GCTs and has been shown to successfully detect recurrence following surgery, it is also increased in some epithelial ovarian tumours and fluctuates significantly within the menstrual cycle. AMH is more specific to GCT, as its expression is limited to ovarian granulosa cells and it does not change substantially over the menstrual cycle.

Although GCT is extremely rare, it is noted for its late recurrence, usually within four to six years but sometimes 10–20 years after removal of the primary tumour. AMH disappears within days of removal of the ovaries [7] and, following tumour resection, a rise in AMH precedes clinical detection, making it an extremely sensitive marker for the early detection of tumour recurrence.

Lane’s 1999 study followed 56 patients postoperatively and showed that AMH was useful in evaluating the completeness of tumour removal [4]. In addition, serial AMH measurements detected recurrence on average three months before clinical detection. A second study, which followed 31 patients for up to seven years, confirmed these observations [5]. This group used an AMH assay 20 times more sensitive than those previously used and, on comparing the two assays, found discrepant values in six of the 31 patients. The more sensitive assay accurately reflected the clinical situation and was elevated up to 16 months earlier in patients with tumour recurrence.

However, there is still insufficient published information with which to assess the sensitivity and specificity of AMH for the diagnosis of GCT. This is due to small patient numbers, the insensitivity of older assays and the lack of solid reference values in pre-menopausal women and children. The advent of more sensitive, fully automated assays will facilitate more robust studies.

Assessment of ovarian damage
The relationship between AMH and the number of small growing follicles (and therefore the number of primordial follicles, or ovarian reserve) makes it useful for assessing the gonadal toxicity of cancer therapy and the resulting loss of ovarian reserve. Levels fall rapidly with the onset of cancer treatment, with subsequent recovery dependent on the degree of ovarian damage. AMH appears to identify which treatments may spare the ovaries, or are most toxic to them, and may give clinicians additional information to direct therapeutic choices in children and women of childbearing age with cancer.

Radiotherapy is a well-known cause of ovarian damage, even at low radiation levels. Women who have undergone pelvic or total body irradiation are likely to have low or undetectable AMH levels [9, 10]. The gonadal toxicity of alkylating agents is also well established. In a study involving young women with lymphoma, those receiving alkylating agents showed little or no recovery in AMH levels following treatment, whereas those receiving alternative chemotherapy showed good recovery.

Childhood cancer and fertility
Childhood cancer treatment has improved dramatically, with survival rates of more than 90%. However, the consequences of treatment may include permanent damage to the ovaries, affecting fertility. AMH is detectable in females of all ages, rising steadily throughout childhood. Several studies have confirmed its role as a clinically useful marker for assessing impairment of ovarian reserve in those receiving treatment for cancer [11, 12, 13].

Brougham et al. showed that AMH decreased during chemotherapy in both prepubertal and pubertal girls, becoming undetectable in 50% of patients; recovery occurred in the low- and medium-risk groups after completion of treatment, yet AMH remained undetectable in the high-risk group [13]. Inhibin B was undetectable in most patients before treatment, and FSH showed no relationship with treatment. AMH thus provides a more useful assessment of residual ovarian reserve, revealing partial loss or ovarian failure.

It is clear that a woman can suffer a significant loss of ovarian reserve without any lasting effect on her fertility, for example following removal of an ovary. For survivors of childhood cancer this may mean that only a substantial loss of ovarian reserve would have a clinical impact. Indeed, recent work has shown a high number of successful pregnancies in lymphoma survivors despite low AMH levels [14], and in a study of 84 childhood cancer survivors, pregnancy rates were similar to those of controls despite impaired ovarian reserve [15]. However, a 10-year follow-up study of childhood cancer survivors, now in their 30s, showed that the percentage of childless women in this group was greater than in the normal Danish population, particularly among women who had received the most gonadotoxic treatment burden; their pregnancy rate and outcomes were especially poor [16]. The truth is difficult to discern from current evidence, and more work is required on long-term follow-up, with fertility and age at menopause as endpoints.

The real value of measuring AMH in young women surviving cancer would be to forecast long-term reproductive outcome and take steps to preserve their fertility.

Reproductive outcomes in adult women
The same fertility concerns exist for women of childbearing age. Using AMH values to assess ovarian reserve and individualize risk, more invasive methods of fertility preservation may be appropriate for women with low AMH, while those with high values for their age may decide to start cancer treatment without delay.

Most evidence comes from breast cancer studies and is based on the assumption that a woman with higher pre-treatment AMH will be more likely to retain ovarian function after chemotherapy. A prospective study in women with newly diagnosed breast cancer linked high AMH levels before treatment with retention of long-term ovarian function five years after surgery [17]: pre-treatment serum AMH was markedly higher in women who continued to have menses. The predictive value of AMH for post-chemotherapy ovarian function has since been confirmed, allowing the development of prediction tools combining age and AMH [18].

Individualizing breast cancer adjuvant chemotherapy
Adjuvant endocrine therapy has been shown to reduce the likelihood of recurrence and improve overall survival in hormone receptor-positive (HR-positive) breast cancer. However, ovarian function after chemotherapy has direct implications for the choice of therapy. Aromatase inhibitors (AIs) are more effective than tamoxifen in postmenopausal women [19]. In premenopausal women, however, AIs may cause a rise in estrogen levels owing to reactivation of ovarian function. Consequently, even in women who have developed chemotherapy-induced ovarian failure, tamoxifen is the standard of care [20, 21].

It has been suggested that all women who are premenopausal before chemotherapy, even those in their late 40s and early 50s, should be treated with adjuvant tamoxifen or, if they are to receive an aromatase inhibitor, should have their ovaries removed or chemically suppressed [22]. For the latter group, these strategies are invasive and associated with increased side effects. Being able to predict permanent ovarian failure using information other than the patient’s age is therefore relevant.

Data from recent studies [8, 17] suggest that pre-chemotherapy assessment of serum AMH concentrations, possibly in combination with inhibin B, may provide important information about the likelihood of developing permanent ovarian failure with chemotherapy. This could also help identify a patient population in which upfront AI monotherapy would be safe. The growing number of available studies adds to our understanding of the role of AMH in ovarian function, its ability to predict a woman’s ovarian reserve and fertility, and the impact of cancer treatment on reproductive health.

References
1. Cate RL, Mattaliano RJ, Hession C, et al.  Isolation of the bovine and human genes for Müllerian inhibiting substance and expression of the human gene in animal cells. Cell 1986; 45, 685-698.
2. Hudson PL, Dougas I, Donahoe PK, et al. An immunoassay to detect human Müllerian inhibiting substance in males and females during normal development. J Clin Endocrinol Metab. 1990; 70, 16-22.
3. Josso N, et al. An enzyme-linked immunoassay for anti-Müllerian hormone: a new tool for the evaluation of testicular function in infants and children. J Clin Endocrinol Metab 1990; 70, 23–27.
4. Lane AH, Lee MM, Fuller AF Jr, et al. Diagnostic utility of Müllerian inhibiting substance determination in patients with primary and recurrent granulosa cell tumors. Gynecol Oncol 1999; 73, 51–55.
5. Long WQ, Ranchin V, Pautier P, et al. Detection of minimal levels of serum anti-Müllerian hormone during follow-up of patients with ovarian granulosa cell tumor by means of a highly sensitive enzyme-linked immunosorbent assay. J Clin Endocrinol Metab 2000; 85, 540–544.
6. Bjorkholm E, Silfversward C. Prognostic factors in granulosa-cell tumors. Gynecol Oncol. 1981;11, 261–274.
7. LaMarca A, De Leo V, Giulini S, et al. Anti-Müllerian hormone in premenopausal women and after spontaneous or surgically induced menopause. J Soc Gynecol Invest 2005; 12, 545–548.
8. Henry NL, Xia R, Schott AF, et al. Prediction of postchemotherapy ovarian function using markers of ovarian reserve. The Oncologist 2014; 19, 68–74.
9. Lie Fong S, Laven JS, Hakvoort-Cammel FG, et al. Assessment of ovarian reserve in adult childhood cancer survivors using anti-Müllerian hormone. Hum Reprod 2009; 24, 982–990.
10. Gracia CR, Sammel MD, Freeman E, et al. Impact of cancer therapies on ovarian reserve. Fertil Steril 2012; 97, 134–140 e131.
11. Bath LE, Wallace WH, Shaw MP, et al. Depletion of ovarian reserve in young women after treatment for cancer in childhood: detection by anti-Müllerian hormone, inhibin B and ovarian ultrasound. Hum Reprod 2003; 18, 2368–2374.
12. van Beek RD, van den Heuvel-Eibrink MM, Laven JS, et al. Anti-Müllerian hormone is a sensitive serum marker for gonadal function in women treated for Hodgkin’s lymphoma during childhood. J Clin Endocrinol Metab 2007; 92, 3869–3874.
13. Brougham MF, Crofton PM, Johnson EJ, et al. Anti-Müllerian hormone is a marker of gonadotoxicity in pre- and postpubertal girls treated for cancer: a prospective study. J Clin Endocrinol Metab 2012; 97, 2059–2067.
14. Janse F, Donnez J, Anckaert E, et al. Limited value of ovarian function markers following orthotopic transplantation of ovarian tissue after gonadotoxic treatment. J Clin Endocrinol Metab 2011; 96, 1136–1144.
15. Dillon KE, Sammel MD, Ginsberg JP, et al. Pregnancy after cancer: results from a prospective cohort study of cancer survivors. Pediatr Blood Cancer 2013; 60(12), 2001–2006.
16. Nielsen SN, Andersen AN, Schmidt KT, et al. A 10-year follow up of reproductive function in women treated for childhood cancer. Reprod Biomed Online 2013; 27, 192–200.
17. Anderson RA, Cameron DA. Pretreatment serum anti-müllerian hormone predicts long-term ovarian function and bone mass after chemotherapy for early breast cancer. J Clin Endocrinol Metab 2011; 96, 1336–1343.
18. Anderson RA, Rosendahl M, Kelsey TW, et al. Pretreatment anti-Müllerian hormone predicts for loss of ovarian function after chemotherapy for early breast cancer. Eur J Cancer 2013;49, 3404–3411.
19. Burstein HJ, Prestrud AA, Seidenfeld J et al. American Society of Clinical Oncology clinical practice guideline: Update on adjuvant endocrine therapy for women with hormone receptor-positive breast cancer. J Clin Oncol 2010; 28, 3784–3796.
20. Smith IE, Dowsett M, Yap Y-S et al. Adjuvant aromatase inhibitors for early breast cancer after chemotherapy-induced amenorrhoea: Caution and suggested guidelines. J Clin Oncol 2006; 24, 2444–2447.
21. Burstein HJ, Mayer E, Patridge AH et al. Inadvertent use of aromatase inhibitors in patients with breast cancer with residual ovarian function: Cases and lessons. Clin Breast Cancer 2006;7, 158–161.
22. Henry NL, Xia R, Banerjee M et al. Predictors of recovery of ovarian function during aromatase inhibitor therapy. Ann Oncol 2013; 24, 2011–2016.

The author

Sherry Faye, PhD
Director, Global Scientific Affairs,
Beckman Coulter Diagnostics
Brea, CA, USA

Clinical application of NGS – ensuring quality

Advances in next-generation sequencing (NGS) are bringing much higher throughput and rapidly reducing costs, whilst facilitating new approaches to disease prediction. Consequently, the clinical applications of NGS technologies continue to develop, with the potential to change the face of genetic medicine [1].

by Hannah Murfet (BSc, PCQI), Product Quality Manager, Horizon Discovery

Applications of NGS in a clinical context are varied, and may include interrogation of known disease-related genes as part of targeted gene panels, exome sequencing, or genome sequencing of both coding and non-coding regions. However, as NGS moves further into the clinic, care must be taken to maintain high levels of quality assurance, rigorous validation, data recording, quality control and reporting [1, 2].
Guidelines specific to NGS are beginning to emerge and to be adopted by clinical laboratories working with these technologies, in addition to those mandated by clinical accreditation and certification programmes. In this article we give an overview of the specific guidance set out by the American College of Medical Genetics and Genomics in its September 2013 report ‘ACMG clinical laboratory standards for next-generation sequencing’, and by the New York State Department of Health in its January 2014 document ‘Next Generation Sequencing (NGS) guidelines for somatic genetic variant detection’.

Quality Assurance
Quality assurance (QA) in the clinical context comprises maintenance of a desired level of quality for laboratory services. Quality management systems typically take a three-tier hierarchy. At the highest level, policies define the organisation’s strategy and focus. Underneath these sit the procedures, which define and document instructions for performing business, quality-management or technical activities. Underpinning both of these tiers are accurate records.
The New York State Department of Health guidelines place a clear focus on the requirement for SOPs, which can be broken down into two levels. Level one procedures state the required flow of information, demonstrating the sequence of events and the associated responsibilities or authorities. They are best kept at a relatively high level, and may reference more specific and detailed level two processes.
Testing sequences may be incorporated into one or more level one processes, depending on the complexity of the clinical laboratory’s operations. An overview of the typical testing sequence is shown in the figure below.
Level two processes are best documented as clear ‘how to’ guides, detailing all responsibilities, materials and procedures necessary to complete the activity. For laboratory-focused activities, validation study inputs and outputs can establish clear and consistent protocols, supporting training and laboratory operation.
Accurate record keeping should include which instruments were used in each test, as well as documentation of all reagent lot numbers. Any deviations from standard procedures should be recorded, including any corrective measures [1]. Templates may be generated to ensure consistency in output records for both testing and reporting.
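As an illustration only, the sketch below shows one way such a run record might be structured in code; every field name here is hypothetical rather than taken from any guideline.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class TestRunRecord:
    """Illustrative record for a single NGS test run; field names are hypothetical."""
    run_id: str
    run_date: date
    instrument_id: str                 # which instrument performed the test
    reagent_lots: dict                 # reagent name -> lot number
    operator: str
    deviations: list = field(default_factory=list)  # deviations from SOP, with corrective measures

record = TestRunRecord(
    run_id="RUN-0042",
    run_date=date(2014, 10, 1),
    instrument_id="SEQ-01",
    reagent_lots={"library_prep_kit": "LOT-7781", "sequencing_kit": "LOT-9032"},
    operator="tech_a",
)
record.deviations.append("Extended lysis incubation by 5 min; corrective measure documented, QC passed")
```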
In addition to documented processes, predetermined checkpoints or key performance indicators should be implemented to permit the monitoring of QA over time. Once established, these may flag assay drift, operator variability or equipment issues.
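A minimal sketch of such a checkpoint, assuming the laboratory tracks a numeric KPI (here, mean on-target coverage per run, an invented example) against a validated baseline:

```python
import statistics

def kpi_drift_alert(baseline, current, n_sd=2.0):
    """Flag a run whose KPI falls outside baseline mean +/- n_sd standard deviations.

    A simple Levey-Jennings-style trigger; the 2 SD threshold is illustrative
    and would be fixed during validation.
    """
    mean = statistics.mean(baseline)
    sd = statistics.stdev(baseline)
    return abs(current - mean) > n_sd * sd

baseline_coverage = [512.0, 498.0, 505.0, 520.0, 509.0, 515.0]  # validated runs
print(kpi_drift_alert(baseline_coverage, 430.0))  # True -> investigate assay drift
```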
In the US, compliance with HIPAA (the Health Insurance Portability and Accountability Act) must be ensured to protect patient data and maintain its traceability, and many authorities mandate record retention periods, including CLIA, which requires that records and test reports be stored for at least two years [1].
Clinical laboratories may look to further certification to ensure tight QA, such as implementation of ISO 15189, especially in countries where no formal accreditation schemes are in place [3].

Validation
Validation involves the in-depth assessment of protocols, tests, materials and platforms, providing confidence that critical requirements are being met. Test development and platform optimization should include factors such as determination of sample-pooling parameters and the use of synthetic variants to create a strong data set with which to compare tools and optimize the workflow. Each entire test should be validated under set conditions for sensitivity, specificity, robustness and reproducibility. It should be noted that the first test developed may naturally carry a higher validation burden than subsequent tests developed for the same platform. Platform validation and quality management are also vital [1, 2].
Specific validation requirements for NGS, as set out by the New York State Department of Health, are listed below. These guidelines may be used as a basic checklist for coverage, or to supplement more general accreditation or certification requirements, e.g. those of CLIA or ISO 15189 [1].

  • Each reportable variant does not require confirmation every time it is encountered, as long as the variant and the target area (gene) containing it have been rigorously validated
  • The accuracy and validity of the bioinformatics must be demonstrated
  • Anything not exclusively based on an FDA-approved assay is considered a laboratory-developed test and requires full validation rather than verification
  • Commercially available materials with no clinical indications for use must be validated by the laboratory before use as a diagnostic tool
  • A single version of all analysis software must be validated
  • Performance characteristics must be established for each sample type (e.g. FFPE)
  • Performance characteristics must be established for each type of variant in the assay, and each type of detection should be validated separately (e.g. SNVs or structural variants)

Data
NGS can generate huge amounts of data, making accurate and efficient systems for data storage and collection more essential than ever. Data protocols are generally established during the validation stages, then monitored at predetermined checkpoints with key performance indicators to ensure consistency and accuracy of service provision.
The list below gives an overview of NGS-specific data requirements from the New York State Department of Health [1].

Accuracy

  • Validation must include a minimum of 50 patient samples, with representation of each material type (e.g. FFPE) and of variants across target areas, confirmed by an independent reference method (a sketch of such a completeness check follows this list)
  • A minimum of 10 positive samples is required for each type of variant
  • Recommended approach: sequence a well-characterised reference sample to determine specificity
  • If rigorous validation of reported variants has not been completed in the original studies, ongoing confirmation by independent reference methods must be performed until at least 10 reference points have been independently validated
  • A disclaimer must be used where incidental findings of unknown significance are included and there is no established confirmatory assay; the disclaimer must clearly state that the variant has not been verified
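Purely as a sketch, the function below checks a validation sample set against the two minimum counts stated above; the input layout is an assumption made for the example.

```python
from collections import Counter

MIN_TOTAL_SAMPLES = 50      # minimum patient samples in the validation set
MIN_PER_VARIANT_TYPE = 10   # minimum positive samples per variant type

def accuracy_validation_gaps(samples):
    """Return a list of gaps against the minimum accuracy requirements.

    Each sample is assumed to be a dict such as:
    {"sample_id": "S1", "variant_types": ["SNV", "indel"], "confirmed": True}
    """
    gaps = []
    if len(samples) < MIN_TOTAL_SAMPLES:
        gaps.append(f"only {len(samples)} samples (minimum {MIN_TOTAL_SAMPLES})")
    counts = Counter(vt for s in samples for vt in s["variant_types"])
    for vt, n in sorted(counts.items()):
        if n < MIN_PER_VARIANT_TYPE:
            gaps.append(f"{vt}: {n} positive samples (minimum {MIN_PER_VARIANT_TYPE})")
    unconfirmed = [s["sample_id"] for s in samples if not s["confirmed"]]
    if unconfirmed:
        gaps.append(f"not confirmed by an independent method: {unconfirmed}")
    return gaps

samples = [{"sample_id": f"S{i}", "variant_types": ["SNV"], "confirmed": True}
           for i in range(12)]
print(accuracy_validation_gaps(samples))
# ['only 12 samples (minimum 50)']
```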

Robustness

  • Robustness is the likelihood of assay success. Adequate quality control measures must be in place to determine success of techniques such as extraction, library preparation or sequencing

Precision

  • Precision relates to within-run controls
  • For each type of variant, a minimum of 3 positive samples containing variants near the stated sensitivity of the assay must be analysed in triplicate in the same run, using different barcodes
  • Renewable reference samples can be used to determine the analytical validity of the test. These can establish baseline data to which future modifications can be compared

Repeatability and Reproducibility

  • Repeatability and reproducibility relate to between-run controls, determining the ability to return identical results under identical (repeatability) or changed (reproducibility) conditions
  • For each type of variant, a minimum of 3 positive samples containing variants near the stated sensitivity of the assay must be analysed in three separate runs, using different barcodes, on different days and, where possible, by two different technologists
  • If multiplexing samples with distinct barcodes, it must be verified that there is no cross-talk and that all target areas and variants are reproducible, independent of which patient/barcode combination is used
  • It is useful to consider instrument-to-instrument as well as inter-operator variability. Parameters for expected reproducibility should be established, typically around 95–98%; a sketch of a between-run concordance check follows this list
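To make the 95–98% figure concrete, here is a hedged sketch of a pairwise between-run concordance check; the variant string encoding is assumed, and a validated protocol would define concordance more precisely.

```python
def run_concordance(run_a, run_b):
    """Fraction of variant calls shared between two runs of the same sample.

    Inputs are sets of variants, assumed to be encoded as strings such as
    'chr7:140453136:A>T'. A simple Jaccard-style measure for illustration.
    """
    union = run_a | run_b
    return len(run_a & run_b) / len(union) if union else 1.0

run1 = {"chr7:140453136:A>T", "chr12:25398284:C>T", "chr17:7577120:G>A"}
run2 = {"chr7:140453136:A>T", "chr12:25398284:C>T", "chr17:7577120:G>A"}
conc = run_concordance(run1, run2)
print(f"{conc:.1%}", "PASS" if conc >= 0.95 else "FAIL: below the ~95-98% expectation")
```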

Analytical Sensitivity and Specificity

  • Sensitivity and specificity refer to positive and negative percentage agreement, respectively, when compared with the gold standard
  • All types of variants should be interrogated in three target areas with consistently poor coverage, as well as in three target areas with consistently good coverage. These can be established with defined mixtures of cell line DNA (not plasmids), but must be verified with 3–5 patient samples
  • The limit of detection should be established
  • Confidence intervals for variant types must be determined

A minimum data set is expected in order to establish key performance characteristics, including base calling, read alignment, variant calling and variant annotation.
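As an illustration of how the sensitivity, specificity and confidence-interval requirements above might be computed once validation calls have been scored against the gold standard (the counts below are invented):

```python
import math

def wilson_ci(successes, n, z=1.96):
    """95% Wilson score interval for a proportion."""
    if n == 0:
        return (0.0, 1.0)
    p = successes / n
    denom = 1 + z ** 2 / n
    centre = (p + z ** 2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z ** 2 / (4 * n ** 2)) / denom
    return (centre - half, centre + half)

tp, fn = 98, 2    # reference-positive variants: detected / missed
tn, fp = 195, 5   # reference-negative positions: correctly negative / falsely called

ppa = tp / (tp + fn)   # positive percentage agreement (analytical sensitivity)
npa = tn / (tn + fp)   # negative percentage agreement (analytical specificity)
lo, hi = wilson_ci(tp, tp + fn)
print(f"PPA {ppa:.1%} (95% CI {lo:.1%}-{hi:.1%})")
lo, hi = wilson_ci(tn, tn + fp)
print(f"NPA {npa:.1%} (95% CI {lo:.1%}-{hi:.1%})")
```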

Quality Control

In contrast to quality assurance, where the infrastructure for quality is established to maintain the right service, quality control addresses testing and sampling to confirm outputs against requirements. Quality control applies across all aspects of a process, from the reagents used to software and in-assay controls.
Quality control of reagent lots is best implemented at the point of goods inspection: a clear label should be placed on the reagent under inspection, and testing performed to confirm analytical sensitivity. Quality control of software updates can be handled through a version-control and impact-assessment process. All re-validation must be clearly documented and must demonstrate consistency in analytical sensitivity.
Sample identity confirmation is essential, especially if samples are pooled. Proficiency testing protocols must be established to allow for execution as required by clinical accreditation bodies (such as CLIA). Quality control stops may be added to the laboratory process before the sequencing run, during the run itself, and at the end before data analysis.
The use of control materials/reagents at all stages of the sequencing procedure supports quality control. No Template Controls (NTC) should be used at all amplification steps; a negative control should be used at initial validation and periodically thereafter; and a positive/sensitivity control should be used in each sequencing run [1].
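A minimal sketch of a run-level gate over these controls; the thresholds and parameter names are assumptions made for the example, and each laboratory would set its own during validation.

```python
def run_controls_pass(ntc_variant_count, positive_control_detected,
                      positive_control_vaf, min_vaf=0.05):
    """Decide whether a sequencing run may proceed to data analysis.

    Checks the No Template Control and the positive/sensitivity control;
    the 5% variant allele fraction floor is illustrative only.
    """
    failures = []
    if ntc_variant_count > 0:           # NTC must stay clean at every amplification step
        failures.append("NTC contamination suspected")
    if not positive_control_detected:   # sensitivity control variant must be called
        failures.append("positive/sensitivity control not detected")
    elif positive_control_vaf < min_vaf:
        failures.append(f"positive control below {min_vaf:.0%} variant allele fraction")
    return (not failures, failures)

ok, reasons = run_controls_pass(0, True, 0.08)
print("run released for analysis" if ok else f"run held: {reasons}")
```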
Several different QC protocols may need to be followed, and the quality control measures applied can vary with the chosen methods and instrumentation, but they should always include procedures to identify sample-preparation failures and failed sequencing runs. QC protocols are best documented in detail in the relevant SOP.

Reports
Specific requirements around the generation, approval, issue and re-issue of reports are included as part of accreditation programmes, such as CLIA, and standards certifications, such as ISO 15189. The most essential reporting requirements related to NGS are as follows [1,2]:

  • The laboratory director is responsible for describing the advantages and limitations of test offerings, ensuring healthcare providers can make informed decisions
  • Turnaround times for reports should be clinically appropriate, with clear requirements for NGS test prioritisation
  • All detected somatic variants should be recorded in the report, identifying each variant’s significance
  • Incidental findings, including their clinical relevance, should be recorded
  • Limitations of the assay should be identified and reported, including the target areas for which the assay lacked sufficient coverage to confidently determine mutational status
  • Information comparing the coverage of exome or genome sequencing with an available disease-specific panel test should be included


Conclusions

While the clinical implications of some variants are not yet fully understood, there are clear prospects for NGS to support the further development and adoption of companion diagnostics. As the overall picture for NGS evolves, well-defined guidelines are being developed for everything from quality assurance to reporting. Guidance and certification can be expected to develop further as NGS becomes an ever more common technology within the clinical laboratory.

References
1. New York State Department of Health. (2014, January). “Next Generation” Sequencing (NGS) guidelines for somatic genetic variant detection.
2. American College of Medical Genetics and Genomics. (2013, September). ACMG clinical laboratory standards for next-generation sequencing.
3. Horizon Discovery. (n.d.). ISO 15189: A Standard of Yin and Yang.

www.horizondx.com

The clinical lab and pharmacogenomics – bridging the last mile

Pharmacogenomics analyses the response of individual patients to a medicinal product, with the aim of optimizing therapy: obtaining maximum efficacy while minimizing side effects.
Pharmacogenomics is now increasingly accepted to encompass pharmacogenetics, which focuses only on heritable biomarkers. Unlike the latter, pharmacogenomics also includes the study of proteins and enzymes as biomarkers.
The proponents of pharmacogenomics believe it holds the key to personalized medicine, in which drugs are tailored to a patient’s unique genetic profile.

Roots of pharmacogenomics in Human Genome Project
Pharmacogenomics is among the first clinical applications of the ambitious Human Genome Project, which was completed in 2003. It has already begun to make an impact on clinical medicine, and promises much more as new pharmacogenomic biomarkers are identified through increasingly versatile approaches such as single nucleotide polymorphism (SNP) analysis and small nuclear RNA (snRNA)-mediated techniques.

ADRs are a research priority
Pharmacogenomic biomarkers are essentially DNA or RNA characteristics that reflect normal biologic and pathogenic processes, as well as the pharmacologic response to drug intervention.
The highest priority of pharmacogenomic research is to identify biomarkers for adverse drug reactions (ADRs), which account for one-fifth of all hospital readmissions and 4% of withdrawals of new medicines.
ADRs are among the leading causes of death, accounting for as many as 100,000 deaths a year in the US.

Pharmacogenomic labelling of drugs
There are already over 120 drugs in the US whose labels include pharmacogenomic biomarkers. In Europe the number is smaller, at about 35. One reason is that the European Medicines Agency, the pan-EU regulator, has limited authority in this area, because a large number of drugs have been approved by Member States (rather than by the Agency), with updating of the drug label seen as their responsibility. However, “relabelling to include pharmacogenomic data does not seem to be a priority issue” for the regulatory agencies of individual EU Member States.
Pharmacogenomic labelling of drugs has nevertheless been standardized in both Europe and the US under three categories: ‘mandatory’, ‘recommended’ and ‘informative’. Mandatory pharmacogenomic labelling is required where clinical trials have established the basis for response; in the ‘recommended’ category, there have so far been no clinical trials.
Typically, biomarker labelling covers the following subjects: drug exposure and clinical response variability, risk of adverse events, genotype-specific dosing, polymorphic drug target and disposition genes.

Considerable attention has been given to biomarkers for a range of widely-used oncology products. Apart from trastuzumab, they include tamoxifen (for breast cancer therapy), irinotecan (metastatic colorectal cancer), panitumumab and cetuximab (colon cancer).
Pharmacogenomic research is also focused on a host of other drugs and drug classes: allopurinol (anti-inflammatories), flucloxacillin and amoxicillin clavulanate (anti-infectives), as well as statins and immunosuppressants.

Companion diagnostics: measuring response to therapy
Biomarkers have made it possible to sell so-called companion diagnostics alongside expensive drugs, so as to direct therapy to the most responsive patients. One of the most prominent examples is the HER-2 test, accompanying Herceptin (trastuzumab), used to fight metastatic gastric cancer. The drug costs €42,000 for a year’s treatment.
Companion diagnostics also enable identification of potential ADRs, for example tests for the HLA-B*5701 allele accompanying the anti-HIV drug abacavir and for HLA-B*1502 with the anti-epileptic carbamazepine. The latter poses a recently confirmed risk of Stevens-Johnson syndrome and toxic epidermal necrolysis (TEN) in Han Chinese and other Asians.

Drug development and relaunch
Pharmacogenomic biomarkers are becoming integrated tools in drug development, to assess pathways encoded by polymorphic genes and to identify the enzymes which lead to the formation of an active drug metabolite, before entering clinical trials. 
One new application of pharmacogenomic data is the relaunch of drugs that have been withdrawn because of adverse events. Novartis, for example, applied to the European Medicines Agency in 2009 to use lumiracoxib in genetically selected populations. Lumiracoxib, a prostaglandin endoperoxide synthase 2 inhibitor, was approved to treat osteoarthritis but withdrawn in 2005 because of cases of drug-induced liver injury (DILI). Although retrospective genetic analyses revealed that variants of the HLA-DQ allele could predict elevated aminotransferase levels and identify patients susceptible to DILI, Novartis withdrew its application in 2011 because it was unable to provide additional data within the timeframe specified by the Agency.

Generic drugs and pharmacogenomics
Pharmacogenomics is proving to be a weapon against generic drug imports, especially from large, low-cost producers in countries like India.
In its first-ever Recommendation, the European Society of Pharmacogenomics and Theranostics (ESPT) has called for “a harmonized approach to an updatable drug labelling of generic versions for pharmacogenomic information, as is the case for the original drug.” The ESPT cites the case of Plavix (clopidogrel), used for dual antiplatelet therapy and once the world’s second bestselling drug. Pharmacogenomic information on Plavix, it states, “reveals that genetic polymorphisms of CYP enzymes … contribute to variation in the response of individual patients.” It concludes that pharmacogenomic labelling “should be extrapolated to all medications which are marketed as both branded and generic versions.”

Clinical labs have been late entrants
The role of the clinical laboratory in pharmacogenomics broadly encompasses the following components:

  • New and expanded pharmacogenetic tests
  • Chemopredictive testing
  • Disease and risk profiling

In spite of being a frontline player in the application of pharmacogenomics, the clinical lab’s role has been relatively muted and unrecognised.
In 2000, a feature article noted that the “clinical lab has rarely been discussed within the context of pharmacogenomics.” However, it argued that, in the future, “clinical labs will be looked to for genetic test development and validation, and for high-throughput genotyping of patients in clinical trials and routine testing.” It urged “both the labs themselves and the industry as a whole” to take cognisance of the fact.

Different from classical genetic testing
Lab techniques for pharmacogenomics differ significantly from classical genetic testing by chromosome analysis. Although state-of-the-art microarrays can interrogate and evaluate vast numbers of alleles, the interpretation of test results into clinically meaningful data is complex, sometimes bewilderingly so.
This is because a particular gene mutation does not always result in a predictable phenotypic effect. A host of non-genetic factors can also play an influential role. Included here are the age, gender and ethnicity of a patient; so too are interactions with other drugs he or she is taking, and above all, any impairment in areas such as liver or renal function.
Usable and actionable information on such diverse factors may only emerge after adequate throughput of clinical data and the establishment of correspondences between genotypic and phenotypic markers. The sole entity capable of operating at such a scale is the clinical laboratory.
Meanwhile, physicians too are overloaded by new diagnostic information which emerges by the day, and need guidance from laboratories on how best to interpret and use the information contained in tests.

Unadopted pharmacogenomic tests
One factor that could accelerate the need for greater clinical lab involvement is the set of pharmacogenomic tests that have not been adopted despite evidence that they work. The best examples are VKORC1 and CYP2C9 testing for the anticoagulant warfarin, and UGT1A1 for the anti-cancer drug irinotecan. In such cases, there have been concerns that diagnostic test costs could overwhelm the healthcare system without demonstrable benefit.
At the moment, it is principally academic groups which are addressing such challenges. The price paid here is an acceptance of the fact that pharmacogenomics will only be “adopted slowly as risk-benefit data demonstrate the value of testing.”
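To illustrate why such tests are regarded as actionable, the sketch below gives a deliberately coarse mapping from VKORC1/CYP2C9 results to a warfarin sensitivity category. It is a simplification of the published associations, not a dosing algorithm; validated tools also weigh age, weight, interacting drugs and liver function.

```python
# Coarse, illustrative mapping only -- not a clinical dosing algorithm.
REDUCED_FUNCTION_CYP2C9 = {"*1/*2", "*1/*3", "*2/*2", "*2/*3", "*3/*3"}

def warfarin_sensitivity(cyp2c9, vkorc1_1639):
    """Classify expected warfarin sensitivity from two common pharmacogenomic results.

    cyp2c9:       diplotype such as '*1/*1' or '*1/*3'
    vkorc1_1639:  genotype at VKORC1 -1639G>A, e.g. 'GG', 'GA' or 'AA'
    """
    score = 0
    if cyp2c9 in REDUCED_FUNCTION_CYP2C9:
        score += 1   # reduced-function alleles slow warfarin metabolism
    if vkorc1_1639 in {"GA", "AA"}:
        score += 1   # the A allele reduces VKORC1 expression
    return ("typical", "increased", "high")[score] + " sensitivity"

print(warfarin_sensitivity("*1/*3", "AA"))  # high sensitivity -> lower starting dose
```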

Lab tests and healthcare spending
There is a heated debate underway in the US about laboratory testing as a source of healthcare spending growth. In November 2013, researchers at Beth Israel Deaconess Medical Center (BIDMC) announced the results of a review of more than 1.6 million results from 46 of the 50 most common lab tests: nearly one-third of all blood tests ordered were unnecessary.
Some experts believe that a solution to this problem might be to increase the share of ‘useful’ tests performed by laboratories, above all those for pharmacogenomic biomarkers. Even if the growth of personalized medicine increases laboratory testing, they argue, it will improve a physician’s ability to make highly targeted decisions about patient treatment. This, in turn, may well reduce overall healthcare spending.
Such perspectives were suggested by Ramy Arnaout, Assistant Professor of Pathology at Harvard Medical School, and lead author of the BIDMC study. He argues that “lab tests are inexpensive. Ordering one more test or one less test isn’t going to ‘bend the curve,’ even if we do it across the board. It’s everything that happens next – the downstream visits, the surgeries, the hospital stays – that matters to patients and to the economy and should matter to us.”