Labs working to combat Covid-19 will benefit from this initiative, as CytoSMART aims to reduce the huge workload currently facing researchers on projects vital to controlling the disease.
CytoSMART’s unique and compact live-cell microscope films living cell cultures without disturbing their growth or behaviour. The device operates from inside cell culture incubators and is accessible from an online environment. This enables researchers to analyse their cell cultures remotely and assess, for example, the cytopathic effect caused by virus replication. Using the CytoSMART Lux2, researchers will know when to proceed to the next step and harvest the virus.
“We aim to do our part to assist researchers in minimizing the time they have to spend in high-containment labs, by providing them with remote video access to evaluate the status of their cell cultures. The video data is used to remotely monitor the cytopathic effect, so researchers know when it’s the right time to harvest the virus.” – Joffry Maltha, CEO at CytoSMART Technologies.
According to guidelines from the CDC and the WHO, isolation and characterization of the virus that causes Covid-19 should be performed in BSL-3 laboratories. Performing research in Biosafety Level 3 and 4 laboratories (BSL-3 or BSL-4) means working in a highly controlled area. Many precautionary measures must be taken to ensure the safety of researchers and help prevent the diseases they are working with from spreading outside the lab. Removing and replacing the protective clothing and apparatus can be time-consuming and expensive, so entering the lab should ideally only occur when absolutely necessary.
Maltha commented: “We need to help scientists who are working in BSL-3 and BSL-4 laboratories to combat Covid-19. We know that our system can help researchers in monitoring cell growth and deciding when they need to go to the high containment labs and run further experiments.”
The advent of molecular biology techniques has revolutionized disease diagnosis. CLI discussed with Dr Chandrasekhar Nair from Molbio Diagnostics the benefits that these techniques have brought, how they are being adapted for point-of-care use to enable rapid diagnosis, and how they can benefit rural populations.
What has the impact of molecular biology been on disease diagnosis and treatment?
Accurate and timely diagnosis of infectious diseases is essential for proper medical management of patients. Early detection of the causative agent also enables care providers to intervene in a precise rather than presumptive manner and institute adequate measures to interrupt transmission to the susceptible population in the hospital or community.
The conventional diagnostic model for clinical microbiology has been labour and infrastructure intensive and frequently requires days to weeks before test results are available. Moreover, because of the complexity and length of such testing, this service was usually directed at the hospitalized patient population. Bacterial/viral culture has been – and continues to be – the gold standard for detection. However, the time taken for some pathogens to grow, coupled with the difficulty of culturing others, has created demand for alternative techniques that allow rapid, direct pathogen detection in clinical samples.
The application of engineering techniques to the technological revolution in molecular biology has greatly improved the diagnostic capabilities of modern clinical microbiology laboratories. In particular, rapid techniques for nucleic acid amplification and characterization, combined with automation and user-friendly software, have significantly broadened the diagnostic arsenal. Among the molecular techniques, PCR-based methods have gained popularity because they allow rapid detection of unculturable or fastidious microorganisms directly from clinical samples.
Clinical laboratories are increasingly finding utility of molecular techniques in diagnosis and monitoring of disease conditions. Nucleic acid amplification tests are becoming very popular in the diagnosis and management of viral infections [hepatitis B and C viruses (HBV, HCV), human immunodeficiency virus (HIV), influenza virus, etc] because they allow determination of the viral load. In most cases, they are now considered a reference, or gold standard, method for diagnostic practices such as screening donated blood for transfusion-transmitted viruses [cytomegalovirus (CMV), HIV, HCV, etc]. Another important case is the use of molecular tests for the detection of the tuberculosis (TB)-causing bacterium Mycobacterium tuberculosis (MTB). Considering the limited sensitivity of smear microscopy, coupled with the steady rise in drug-resistant MTB, rapid molecular tests appear promising.
What are the challenges of implementing molecular diagnostic techniques in developing countries?
For a long time, the field of molecular diagnostics has been limited to the domain of large centralized laboratories because of its dependency on complex and expensive infrastructure, highly skilled manpower and special storage conditions. This investment has also resulted in the need for batch testing to make such facilities affordable. As a result, patients and samples need to travel long distances for a test to be conducted and results are delayed, leading to loss of patients to follow-up. These factors have led to a concentration of such facilities in urban centres, and poor reach of molecular diagnostic techniques, particularly in low- and middle-income countries (LMICs). The poor testing rates in the current COVID-19 pandemic are evidence of such dependence on centralized facilities, which limits the ability to test on demand and take appropriate action.
The lack of timely access to good diagnostics, resulting in either delayed or inaccurate diagnosis by other methods, has increasingly led to the spread of disease and poor treatment outcomes.
How can these challenges be overcome?
We need to increase the reach of molecular diagnostic techniques. Given the economic constraints in LMICs, point-of-care technology (POCT) holds a lot of promise and several major global initiatives are devoted to providing such devices. Facilities for testing that can be deployed, set up and run quickly, at affordable cost, with minimal infrastructure requirements and training are critical to the success of efforts to increase reach. Mobile data coverage, which exists with reasonable density in LMICs, could also be leveraged for better programme management and hotspot detection.
The success of these technologies also depends on uncompromised performance and adherence to quality standards.
Furthermore, designers of POCT devices need to focus on key user requirements which include: (1) simplicity of use; (2) robustness of reagents and consumables; (3) operator safety; and (4) easy maintainability.
What is Molbio Diagnostics doing to meet these demands?
The Truelab® Real Time Quantitative micro PCR System from Molbio Diagnostics brings PCR technology right to the point of care – in laboratory and non-laboratory settings, primary centres, in the field and near the patient – essentially at all levels of healthcare, thereby decentralizing and democratizing access to molecular diagnostics. With a large and growing menu of assays for infectious diseases, this rapid, portable technology enables early and accurate diagnosis and initiation of correct treatment right at the first point of contact. The platform is infrastructure independent and provides a complete end-to-end solution for disease diagnosis. With proven ability to work even at primary health centres and with wireless data transfer capability, this game-changing technology brings a paradigm shift to the global fight to control and manage devastating infectious diseases.
Under the aegis of the Council of Scientific and Industrial Research and New Millennium Indian Technology Leadership Initiative partnership, Bigtec Labs (research and development wing of Molbio Diagnostics Pvt. Ltd.) has developed a portable and battery-operated micro PCR system that has since been extensively validated [under the Department of Biotechnology and Indian Council of Medical Research (DBT & ICMR)]. Bigtec has also developed various tests and nucleic acid preparation devices to facilitate ‘sample to result’ molecular diagnostics in resource limited settings. The micro PCR system has since been launched in India through the parent company, Molbio Diagnostics, which has its manufacturing and marketing base in Goa, India.
The system works on disease specific Truenat™ microchips for conducting a real-time PCR. The sample preparation (extraction and purification) is done on a fully automated, cartridge-based Trueprep® AUTO sample prep device. The purified nucleic acids are further amplified on the Truelab® Real Time Quantitative micro PCR System which enables molecular diagnostics for infectious diseases at the point of care.
This compact battery-operated system has single testing capability and provides sample to result within 1 hour. Hence, it enables same-day reporting and initiation of evidence-based treatment for the patient. It also has real-time data transfer capability (through SMS/email) for immediate reporting of results in emergency cases. Physicians benefit from this technology by having a definitive diagnosis, early in the infection cycle, without patients/samples having to travel extensively to centralized facilities.
The Truelab® Real Time Quantitative micro PCR System from Molbio Diagnostics is a cost-effective and sensitive device that can detect diseases accurately with high specificity. The device is battery-operated and portable. This offers the additional advantage of placing the device in almost any kind of laboratory setting, unlike other devices that require uninterrupted power supply, elaborate infrastructure and air-conditioning.
Considering our platform’s potential to perform molecular diagnostics for infectious diseases at the point of care, India has initiated screening for COVID-19 using the Truenat™ Beta CoV test available on the Truelab® Real Time Quantitative micro PCR System. This will allow same-day testing, reporting, and initiation of patient isolation, if required – thereby reducing the risk of infection spreading while waiting for results.
The successful translation of our innovative concept into a product was made possible by Molbio’s multi-disciplinary workforce – with a constant mission to enable better medicine through precise, faster, cost-effective diagnosis at the point of care; to provide every patient access to the best healthcare through cutting edge technologies. Molbio aims to be a leading global player in the point-of-care diagnostics arena by continuing to innovate and bring new technologies for social betterment.
The company is based in India – how does this affect what you do, how is the clinical lab diagnostics industry developing in India, and does it create more opportunities for you?
In India, we have between 45,000 and 50,000 in vitro diagnostic laboratories – every one of which uses routine conventional diagnostic methods. Only a handful of them have adopted molecular diagnostic testing, for the reasons mentioned above. But this is changing with the advent of Molbio’s Truelab® platform, with regular standalone laboratories that until now outsourced molecular testing starting to perform the tests themselves. In the short span of a few years, Molbio has established itself as a company focused on making a significant impact in aiding infectious disease diagnostics worldwide with our extensive testing menu.
Our test range covers infectious diseases such as TB, the entire hepatitis range, high-risk HPV and H1N1, along with the recent addition of tests for COVID-19, catering to a large population base and addressing diseases with very significant global mortality. Our rapid test development for Nipah virus and the leptospirosis-causing Leptospira bacteria shows our commitment to neglected tropical diseases. Going forward, Molbio will continue to expand the assay range to meet the needs of the global LMIC geography.
The Truenat™ MTB and MTB-RIF tests have started playing a significant role in India’s mission to become TB-free by 2025. We would be happy to partner with other National TB Programmes in achieving sustainable development goals well before 2030.
Our vision has always been ‘innovate to have a real impact’ and hence Molbio will continue to bring in newer POCT platforms so that the benefits of science and technology reach the masses.
The interviewee
Dr Chandrasekhar Nair, BE, PhD, chief technical officer, Molbio Diagnostics
For further information visit Molbio Diagnostics (http://www.molbiodiagnostics.com)
by Dr Jacqueline Gosink
Gastrointestinal complaints are very common and can be difficult to diagnose. Among the many causes are genetic deficiencies in digestive enzymes. Molecular genetic analysis of polymorphisms in the patient’s DNA can determine if inborn enzyme deficits are behind the digestive problems, aiding differential diagnostics. Primary lactose intolerance, for example, is associated with polymorphisms in the regulatory region of the lactase gene (LCT), whereas hereditary fructose intolerance (HFI) is caused by mutations in the aldolase B gene (ALDOB). A PCR-based DNA microarray provides parallel determination of the two main lactose intolerance-associated polymorphisms (LCT‑13910C/T and LCT‑22018G/A) as well as the four HFI-associated mutations (A149P, A174D, N334K and del4E4). The fast and simple determination includes fully automated data evaluation, ensuring highly standardized results.
Lactose intolerance
Primary lactose intolerance is a genetically caused deficiency of lactase, the enzyme responsible for splitting lactose into its constituent sugars glucose and galactose. In affected patients, undigested lactose is fermented in the ileum and large intestine, producing by-products such as short-chain fatty acids, methane and hydrogen, which cause the typical symptoms of abdominal pain, nausea, meteorism and diarrhea. Secondary manifestations include deficiencies, for example of vitamins, and as a result unspecific symptoms such as fatigue, chronic tiredness and depression.
Lactose intolerance represents the natural state in mammals. Lactase activity decreases after weaning and in adulthood is often only a fraction of the activity in infancy. Some humans, however, retain the ability to metabolize lactose into adulthood due to specific genetic variants. The frequency of lactase persistence is around 35% worldwide, although it varies greatly between different population groups. It is prevalent in regions with a long tradition of pastoralism and dairy farming, for example in Europe and in populations of European descent. In large parts of eastern Asia, on the other hand, almost 100% of the population is lactose intolerant.
In addition to the primary genetically caused form of lactose intolerance there is also the secondary acquired form. This develops as a result of damage to the intestine, for example from other gastrointestinal diseases such as Crohn’s disease, coeliac disease, infectious enteritis or injury from abdominal surgery. The two forms need to be distinguished diagnostically because they require different treatment regimes. Whereas individuals with primary lactose intolerance must adhere to a lactose-free or low-lactose diet for life or alternatively take lactase supplements, those with secondary lactose intolerance need only restrict their dairy intake until the intestinal epithelium has regenerated through treatment of the underlying cause.
Diagnostics of lactose intolerance
Classic diagnostic tests for lactose intolerance are the hydrogen breath test and blood glucose tests, which examine the patient’s ability to metabolize lactose. However, these tests have low specificity and sensitivity and are influenced by individual factors such as the composition of the intestinal flora, colonic pH, gastrointestinal motility and sensitivity to lactose fermentation products. Moreover, they cannot distinguish between the primary and secondary forms of lactose intolerance. Molecular genetic testing complements these methods, enabling verification or exclusion of primary lactose intolerance with high probability, as well as differentiation of the primary and secondary forms. Genetic testing is, moreover, a non-invasive and more comfortable examination, which does not carry the risk of provoking symptoms of lactose intolerance in non-lactase-persistent individuals.
LCT polymorphisms
The main mutations associated with lactase persistence are LCT‑13910C>T and LCT‑22018G>A, which are located in the regulatory region of the lactase gene. According to current knowledge, homozygous carriers of the wild-type variants LCT‑13910CC and LCT‑22018GG develop lactose intolerance, while heterozygous carriers of the variants LCT‑13910CT and LCT‑22018GA only show corresponding symptoms in stress situations or with intestinal infections. Homozygous carriers of the mutant variants LCT‑13910TT and LCT‑22018AA are lactose tolerant as adults. The two polymorphisms are tightly linked.
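For illustration only, the genotype-to-interpretation rules described above can be expressed as a short sketch. The function name and report wording below are hypothetical labels of our own and are not part of any assay software.

```python
# Illustrative sketch of the LCT genotype interpretation rules described above.
# Genotype strings and report wording are hypothetical, not vendor output.

def interpret_lct(lct_13910: str, lct_22018: str) -> str:
    """Map LCT -13910C/T and -22018G/A genotypes to a lactase-persistence interpretation."""
    if lct_13910 == "CC" and lct_22018 == "GG":
        return "Homozygous wild type: consistent with primary lactose intolerance"
    if lct_13910 == "TT" and lct_22018 == "AA":
        return "Homozygous variant: lactase persistence expected (lactose tolerant as adult)"
    if lct_13910 == "CT" or lct_22018 == "GA":
        return ("Heterozygous: usually tolerant; symptoms possible under stress "
                "or with intestinal infection")
    return "Unexpected or discordant genotype combination: review raw data"

print(interpret_lct("CC", "GG"))  # consistent with primary lactose intolerance
```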
Hereditary fructose intolerance
HFI is caused by mutations in the gene for aldolase B, an enzyme essential for fructose metabolism. The mutations result in a reduction or loss of the activity or stability of aldolase B, which is responsible for catalysing the breakdown of fructose-1-phosphate (F-1-P) to dihydroxyacetone phosphate and glyceraldehyde. The toxic intermediate F-1-P then accumulates in the body, causing symptoms such as nausea, vomiting and digestive disorders and, in the longer term, liver damage. HFI is a rare disease, occurring, for example, with a prevalence of 1 in 20,000 in Europe. It manifests in childhood but may remain undiagnosed owing to patients’ natural aversion to sweets, fruits and vegetables.
In addition to HFI, intolerance to fructose can also be caused by deficits in the transport of fructose into the enterocytes. This form is known as intestinal fructose intolerance or fructose malabsorption. It is much more common than HFI, occurring with a prevalence of about 30%. It is important to distinguish HFI from fructose malabsorption because of the resulting difference in dietary requirements. Patients with HFI must completely eliminate fructose and its precursors (e.g. sucrose, sorbitol) from their diet to prevent damage to their organs. Patients with fructose malabsorption, however, need only follow a fructose-restricted diet.
Diagnostics of HFI
Intolerance to fructose is usually diagnosed by means of the hydrogen breath test, in which a defined amount of fructose is ingested and the amount of hydrogen in the exhaled air is then measured. In patients with HFI, however, the intake of fructose carries the risk of a severe hypoglycaemic reaction. Therefore, a molecular genetic test for HFI should always be performed before a fructose load test. Early diagnosis of HFI is particularly important to avoid permanent damage to the liver, kidney and small intestine.
ALDOB mutations
In Europe the most frequent mutations associated with HFI are the amino acid substitutions A149P, A174D and N334K (in Human Gene Mutation Database nomenclature) and the deletion del4E4 in the aldolase B gene. For HFI to manifest, both alleles of an individual’s DNA must be affected by a mutation. In homozygous genotypes, the two alleles carry the same mutation (paternal and maternal inheritance). If the two alleles carry different mutations, this is referred to as a compound heterozygous HFI genotype.
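A minimal sketch of this biallelic logic, assuming per-mutation zygosity calls as input, is shown below. The function and labels are hypothetical; two heterozygous calls are treated here as a compound heterozygous genotype, which in practice may need confirmation that the mutations lie on different alleles.

```python
# Illustrative sketch of the ALDOB/HFI genotype logic described above.
# Zygosity codes ('hom', 'het', 'neg') and report wording are hypothetical.

HFI_MUTATIONS = ("A149P", "A174D", "N334K", "del4E4")

def classify_aldob(zygosity: dict) -> str:
    """zygosity maps each tested ALDOB mutation to 'hom', 'het' or 'neg'."""
    hom = [m for m in HFI_MUTATIONS if zygosity.get(m) == "hom"]
    het = [m for m in HFI_MUTATIONS if zygosity.get(m) == "het"]
    if hom:
        return f"Homozygous HFI genotype: {hom[0]} on both alleles"
    if len(het) >= 2:
        # Two different heterozygous mutations: assumed to lie on different alleles.
        return f"Compound heterozygous HFI genotype: {' + '.join(het)}"
    if len(het) == 1:
        return f"Heterozygous carrier of {het[0]}: HFI not expected to manifest"
    return "None of the tested ALDOB mutations detected"

print(classify_aldob({"A149P": "het", "del4E4": "het", "A174D": "neg", "N334K": "neg"}))
```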
Parallel genetic analysis
Molecular genetic determination of the polymorphisms associated with lactose intolerance and HFI enables diagnosis of these genetic conditions with high certainty. The EUROArray Lactose/Fructose Intolerance Direct enables simultaneous detection of the lactose-intolerance-associated polymorphisms ‑13910C/T and ‑22018G/A and the HFI-associated mutations A149P, A174D, N334K and del4E4. Thus, the two genetically caused metabolic disorders can be assessed with a single test.
The test can be performed on whole blood samples, eliminating the need for costly and time-consuming DNA isolation. In the test procedure (Fig. 1), the sections of DNA containing the alleles are first amplified by multiplex PCR using highly specific primers. During this process the PCR products are labelled with a fluorescent dye. The PCR mixture is then incubated with a microarray slide containing immobilized DNA probes. The PCR products hybridize with their complementary probes and are subsequently detected via the emission of fluorescence signals. The data are evaluated fully automatically using the EUROArrayScan software (Fig. 2), and in the case of positive results, homozygous and heterozygous states are differentiated. Numerous integrated controls ensure high reliability of results, for example by verifying that there are no other rare mutations in direct proximity to the tested positions which could interfere with the analysis.
Studies on blood donors
The performance of the EUROArray was investigated using 116 precharacterized samples from blood donors in Germany and from quality assessment schemes. The EUROArray revealed a sensitivity of 100% and a specificity of 100% with respect to the reference molecular genetic method.
Conclusions
Diagnosis of gastrointestinal disorders often involves a long and challenging process of diagnostic tests and restrictive diets. Since lactose and fructose are widely consumed in many diets, it is important to consider intolerance to these sugars during the diagnostic work-up. Simple genetic analysis enables primary lactose intolerance and HFI to be confirmed or excluded as the cause of gut problems. The parallel analysis offered by the EUROArray enables especially fast and effective diagnostics. Patients diagnosed with these genetic conditions can promptly adapt their diets to ease their symptoms. If the analysis is negative, the physician can focus on searching for other causes of the digestive complaints. The molecular genetic analysis thus provides valuable support for the gastroenterology clinic.
The author
Jacqueline Gosink PhD, EUROIMMUN AG, 23560 Lübeck, Germany
by Prof. Godfrey Grech, Dr Stefan Jellbauer and Dr Hilary Graham
Understanding the molecular characteristics of tumour heterogeneity and the dynamics of disease progression requires the simultaneous measurement of multiple biomarkers. In colorectal cancer, for instance, clinical decisions are taken on the basis of tumour stage and grade, resulting in highly variable clinical outcomes. Molecular classification using sensitive and precise multiplex assays is required. In this article we explain the use of innovative methodologies based on signal amplification and bead-based technologies as a solution to this unmet clinical need.
Introduction
Cancer is the leading cause of death globally, accounting for 9.6 million deaths in 2018, with 70% of cancer-related mortality occurring in low- and middle-income countries. In 2017, only 26% of low-income countries provided evidence of full diagnostic services in the public sector, contributing to late-stage presentation [1]. There are various aspects that negatively affect the survival rate of patients, including but not limited to: (a) highly variable clinical outcomes, mainly due to the lack of molecular classification; (b) treatment of advanced-stage disease, mainly due to the lack of, or reluctance to join, screening programmes, resulting in treatment of symptomatic disease that is already advanced; (c) tumour heterogeneity that goes undetected in the representative biopsies used at primary diagnosis; and (d) lack of patient surveillance to detect early disease progression and metastasis, mainly due to clinically inaccessible tumour tissue and the need for sensitive technologies to measure early metastatic events.
Colorectal cancer (CRC) represents the second most common cause of cancer-related deaths, with tumour metastasis accounting for the majority of cases. To date, treatment decisions in CRC are based on cancer stage and tumour location, resulting in highly variable clinical outcomes. Only recently, a system of consensus molecular subtypes (CMS) was proposed based on gene expression profiling of primary CRC samples [2]. Organoid cultures derived from CRC samples have been used in various studies to adapt the CMS signature (CMS1–CMS4) to preclinical models, to study heterogeneity and to measure response to therapies. Of interest, epidermal growth factor receptor (EGFR) and receptor tyrosine-protein kinase erbB-2 (HER2) inhibitors were selective and had strong inhibitory activity on CMS2, indicating that subtyping provides information on potential first-line treatment [3]. In CRC, copy number variations are associated with the adenoma-to-carcinoma progression, metastatic potential and therapy resistance [4]. Our recent studies using primary and matched metastatic tissue showed that TOP2A (encoding DNA topoisomerase II alpha) and CDX2 (encoding caudal type homeobox 2) gene amplifications are associated with disease progression and metastasis to specific secondary sites. Hence, introducing robust and clinically friendly molecular assays that enable measurement of multiple biomarkers in matched resected material and in tumour-derived cells or cell vesicles in blood, during therapy and beyond, has become a necessity to reduce this deadly toll. In addition, to support diagnostics in remote settings, the assays should allow measurement in low-input, low-quality tissue material.
To enable precise future diagnosis, patient classification and surveillance, we developed innovative methodologies (Innoplex assays) measuring the expression of multiple marker panels that represent primary tumour heterogeneity and the dynamic changes associated with disease progression. We optimized these methodologies for multiplex digitalized readout using various sample sources, ranging from archival formalin-fixed paraffin-embedded (FFPE) tissues to blood-derived exosomes for the characterization of gene amplifications. In this article we summarize the Innoplex assays based on the xMAP Luminex Technology and the Invitrogen QuantiGene™ Plex Assay, the research outputs from the University of Malta in terms of the biomarker panels, and the commercialization of the assays through Omnigene Medical Technologies Ltd.
Molecular profiling technology and workflow
The Innoplex multiplex assays are based on two components, namely (a) the integration of the Invitrogen QuantiGene™ Plex Assay (Thermo Fisher Scientific) and the xMAP Luminex technology, enabling multiplexing of the technique, and (b) the novel panel of biomarkers developed by the Laboratory of Molecular Oncology at the University of Malta, headed by Professor Godfrey Grech. Together, the technologies and the research output provide the versatility of the assays. To date, a breast cancer molecular classification panel and a CRC metastatic panel have been developed and are currently being optimized for the clinical workflow by Omnigene Medical Technologies Ltd through the miniaturization and automation of the RNA-bead plex assay.
The Innoplex RNA-bead plex assays use the QuantiGene branched-DNA technology that runs on the Luminex xMAP technology. Specific probes are conjugated to paramagnetic microspheres (beads) that are internally infused with specific proportions of red and infrared fluorophores, used by the Luminex optics (first laser/detector) to identify the specific beads known to harbour specific probes. The QuantiGene branched-DNA technology builds a molecular scaffold on the specifically bound probe–target complex to amplify the signal, which is read by a second laser/LED [5].
The workflow of the assay can be divided into a pre-analytical phase, involving the lysis/homogenization of the tissue or cells, and the analytical phase, which involves hybridization, pre-amplification and signal amplification, with a total hands-on time of 2 h. This is comparable to the time required to prepare a 5-plex quantitative real-time (qRT)-PCR reaction. Increased multiplexing within a reaction will increase the hands-on time for qRT-PCR, whereas the same 2 h is retained for the Innoplex assays. As shown by Scerri et al. [5], a 40-plex qRT-PCR reaction requires 9 h to prepare, compared with the bead-based assay, which retains a 2 h workflow. Hence, the bead-based assays have the advantage for high-throughput analysis in multiplex format.
Performance and applications
We have shown in previous studies, using breast cancer patient material, that gene expression can be measured with our RNA-based multiplex assays in FFPE patient archival material of low quality and low input [6]. Using a 22-plex assay, inter-run regression analysis using RNA extracted from cell lines performed well, with an r²>0.99 in our hands. These assays were also evaluated by other groups using snap-frozen and FFPE tissues derived from patient and xenograft samples. In comparison with the reference methods, the bead-based multiplex assays outperformed qRT-PCR when using FFPE-tissue-derived RNA, giving reliability coefficients of 99.3–100% compared with 82.4–95% for qPCR results, indicating a lower assay variance [5].
One main advantage of the Innoplex assays is the direct measurement of gene expression on lysed/homogenized tissues and cells, providing a simplified workflow without RNA extraction, cDNA synthesis and target amplification. In addition, owing to its chemistry and use of beads, gene expression can be measured in a multiplex format (up to 80 genes) using low-input and low-quality material. This enables the use of the assay in remote laboratories, and, as detailed below, for stained microdissected material and for measuring multiple markers in low-abundance material, such as blood-derived circulating tumour cells.
Comparison of gene expression data from homogenized and lysed patient tissue derived from either unstained or hematoxylin and eosin (H&E)-stained sections shows a high correlation (r²>0.98). This provides an advantage when studying heterogeneous tumours that are microdissected from H&E-stained slides. In fact, using this methodology, an estrogen-receptor-positive tumour was analysed and one of the tumour foci was found to be more advanced, expressing the mesenchymal marker FN1 (fibronectin). This was only possible by running a 40-plex assay on minimal input material (microdissected from a 20 μm section) representing markers for molecular classification, epithelial-to-mesenchymal transition and proliferation [7]. A recent audit of breast cancer diagnoses clearly indicates that heterogeneous cases characterized using the bead-based multiplex assays on resected tumour samples are not represented in the matched biopsies used for patient diagnosis. In fact, only 3.5% of 97 intra-tumour heterogeneous cases were detected at diagnosis in a cohort of 570 patients. The advantage of the digitalized result of the Innoplex assays is that it avoids increasing the workload of pathologists when resected samples are re-analysed to characterize multiple sites within a tumour.
Multiplexing provides both sensitivity and versatility in biomarker validation and was instrumental in our hands in measuring gene amplifications in cancer-derived exosomes (tumour-derived vesicles in blood) using plasma from CRC patients. Of interest, these methods have been optimized using cancer cell lines to measure RNA transcripts in cells at low abundance, mimicking the isolation of circulating tumour cells from blood [5]. In this study we show that measurement of transcripts of EPCAM (encoding epithelial cell adhesion molecule), KRT19 (encoding keratin, type I cytoskeletal 19), ERBB2 (encoding HER2) and FN1 maintains a linear signal down to 15 cells or fewer. In addition, the simple workflow with direct measurement of lysed cells enables this assay to be translated more efficiently to the clinical setting. Methods for absolute quantification of transcripts present alternative endpoints to the Invitrogen QuantiGene™ Plex Assay. Droplet digital PCR (dPCR) and NanoString’s nCounter® technology are precise and sensitive methods. However, multiplexing in dPCR is limited and RNA studies are hindered by reverse-transcription inefficiency. The nCounter® technology requires multiple target enrichment (PCR-based pre-amplification) to measure low-input RNA, which introduces amplification bias and a risk of false positive results.
Summary
In conclusion, these innovative multiplex assays indicate a shift from reactive medicine (treating patients based on average risks) towards predictive, precise and personalized treatment that takes into account the heterogeneity of the primary tumour, the progression of the tumour during therapy and the metastatic surveillance of the individual patient. The versatility of the method allows the development of various assays to support different applications (Figs 1 & 2). Our first innovative methods were developed for the molecular classification of luminal and basal breast cancer and to predict sensitivity to specific therapy in the triple-negative breast cancer subtype [8]. As discussed above, the multiplex assays have a wide range of possible applications in the diagnosis of tumours and their surveillance during therapy. The main advantages of these methods are: (a) implementation of high-throughput analysis, which has a positive impact on remote testing and on the use of such assays in patient surveillance and clinical trials; (b) a digitalized result that excludes the subjectivity and equivocal interpretation common in image-based measurements, and also eliminates the need for highly specialized facilities and human resources; (c) accurate and precise detection of multiple targets in one assay, minimizing the use of precious patient samples; and (d) measurement of gene expression in heterogeneous tumours and in low-input/low-quality patient material. The method is streamlined with current pathology laboratory practices, resulting in a workflow that is cost-effective and has minimal turnaround time.
The authors
Godfrey Grech*1,2 PhD, Stefan Jellbauer3 PhD, Hilary Graham4 PhD
1 Department of Pathology, Faculty of Medicine & Surgery, University of Malta
2 Scientific Division, Omnigene Medical Technologies Ltd, Malta
3 Thermo Fisher Scientific, Carlsbad, CA 92008, USA
4 Licensed Technologies Group, Luminex Corporation, Austin, TX, USA
by Dr Allison B. Chambliss
The diagnosis of acute pancreatitis has long relied on elevations in serum amylase or lipase. Recent test utilization efforts have called for the discontinuation of amylase testing in acute pancreatitis, favouring the higher specificity and longer elevation of lipase. However, neither biomarker correlates with disease severity, and early recognition of severe cases remains a diagnostic challenge.
Introduction to acute pancreatitis
Acute pancreatitis (AP) is one of the most common gastrointestinal-related causes of hospital admission. AP refers to an inflammatory condition of the pancreas commonly associated with severe, rapid-onset abdominal pain. Patients may also experience other non-specific symptoms, including fever, tachycardia, nausea and vomiting. AP may be classified as mild, moderate or severe based on the degree of organ failure and systemic complications, a system referred to as the revised Atlanta classification (Table 1) [1].
The most frequent cause of AP is gallstones, which are hardened deposits of bile. Gallstones may account for 40–70% or more of AP cases, depending on the geographic region [2]. Gallstone pancreatitis typically resolves upon spontaneous or endoscopic removal of the stone. Once recovered, gallstone pancreatitis patients typically undergo cholecystectomy, the surgical removal of the gallbladder, to prevent recurrent AP episodes. Alcohol abuse is typically ranked as the second most frequent cause of AP (25–35% of cases), followed by a variety of other rarer causes such as metabolic abnormalities, drugs and toxins, and trauma.
Treatment for most patients involves supportive care, including fluid resuscitation, pain control and monitoring. Although patients with mild disease may recover within a few days without complications, the most severe cases may involve systemic inflammatory response syndrome with the failure of multiple organs, including acute respiratory failure, shock and/or renal failure. Rapid diagnosis of AP and assessment of the risk of severe disease, both of which rely on laboratory testing, are critical to guide patient management. Recurrent episodes of AP may progress to chronic pancreatitis.
Increases in disease prevalence
The annual incidence of AP is estimated at 20–40 per 100,000 worldwide [3]. Interestingly, the incidence has increased over the past few decades, particularly in Western countries [4]. One study found a 13.2% increase in AP-related hospital admissions in 2009–2012 compared with 2002–2005 across the USA [5]. Although these epidemiological trends are not entirely understood, several reasons for the overall increasing incidence of AP have been proposed. One hypothesis is the global epidemic of obesity, which may promote gallstone formation. Increases in alcohol consumption could also play a role in some countries. Other experts suggest that the wider availability and increased frequency of laboratory testing may be major factors. This latter concept is in line with the fact that although cases of AP have risen, the mortality rate of the disease has in fact declined [5]. Nevertheless, mortality remains high in the severe category.
Biomarkers for AP
Serum amylase and lipase are well established as the primary biomarkers for the diagnosis of AP. Both amylase and lipase are digestive enzymes; amylase hydrolyses complex carbohydrates to simple sugars, and lipase catalyses the hydrolysis of triglycerides. Although lipase is synthesized predominantly by the pancreas, amylase is produced both by the pancreas (P-type) and the salivary glands (S-type) and is found in several other organs and tissues. Both enzymes are released into the circulation at the onset of AP, and elevations of both are typically observed within 3–6 h [6, 7]. Multiple clinical societies and guidelines recommend a serum amylase or lipase result greater than three times the upper reference limit as a diagnostic criterion for AP, in addition to characteristic symptoms and imaging findings [2, 8]. Both biomarkers are widely measured by automated enzymatic methods and are thus commonly available in routine hospital laboratories, permitting rapid diagnoses. Notably, most routine assays do not distinguish between P-type and S-type amylase; this distinction requires analysis of amylase isoenzymes, which is typically limited to reference laboratories.
Questioning the value of amylase
In contrast to amylase, lipase is reabsorbed by the tubules of the kidney and is not excreted into the urine. Thus, lipase tends to remain elevated for longer than amylase, which may allow a longer diagnostic window for AP. This advantage, in addition to lipase’s higher specificity for the pancreas, has led some organizations to recommend lipase over amylase for the diagnosis of AP. The American Board of Internal Medicine Foundation’s Choosing Wisely® campaign, in collaboration with the American Society for Clinical Pathology, has recommended: “Do not test for amylase in cases of suspected acute pancreatitis. Instead, test for lipase” [9].
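For illustration, the enzyme criterion can be sketched as a simple check. The reference limits below are hypothetical placeholders (actual limits are assay- and laboratory-specific), and, as noted above, a diagnosis of AP also requires characteristic symptoms and/or imaging findings.

```python
# Illustrative sketch of the ">3x the upper reference limit" criterion discussed above.
# Upper reference limits (ULN) here are hypothetical placeholders in U/L.

LIPASE_ULN = 60.0
AMYLASE_ULN = 100.0

def enzyme_criterion_met(lipase: float, amylase: float | None = None) -> bool:
    """Return True if lipase (or amylase, when measured) exceeds 3x its upper reference limit."""
    if lipase > 3 * LIPASE_ULN:
        return True
    return amylase is not None and amylase > 3 * AMYLASE_ULN

print(enzyme_criterion_met(lipase=450.0))                  # True with these placeholder limits
print(enzyme_criterion_met(lipase=120.0, amylase=250.0))   # False: neither exceeds 3x its ULN
```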
Despite these recommendations, many hospital laboratories still maintain assays for amylase. We performed a retrospective audit at our institution to determine the ordering patterns of amylase relative to lipase in cases of AP. We found that in a cohort of 438 consecutive patients admitted with AP, lipase was ordered for all patients, while amylase was only ordered for 12% of patients [10]. We observed that most of the amylase orders stemmed from patients with gallstone pancreatitis who were referred for laparoscopic cholecystectomy procedures and who were under the care of the surgical team. We speculated that amylase may have been co-ordered with lipase in this subgroup of patients to check for biomarker normalization. Laparoscopic cholecystectomy is ideally performed as early as possible once gallstone AP resolves, and normalization of amylase or lipase may be used to document that resolution. Because amylase is believed to fall more rapidly than lipase after AP, trending amylase over time could allow quicker documentation of biomarker normalization. However, our study also showed no significant difference between amylase and lipase in the time taken for the biomarker to fall below three times the upper reference limit. These observations led us to further question the added value of amylase relative to lipase alone in the diagnosis and management of AP.
Lipase does have limitations that may preclude it from being the AP biomarker of choice in some cases. Lipase may be elevated in non-pancreatic conditions such as renal insufficiency and cholecystitis (Table 2). Both amylase and lipase may rarely be non-specifically elevated owing to complexes with immunoglobulins, termed macroamylasemia and macrolipasemia. Further, amylase may be useful in the work-up of other pancreatic diseases and, unlike lipase, can be measured in the urine. Quantitation of amylase in body fluids, such as pancreatic fluid and peritoneal fluid, can aid in the evaluation of pancreatic cysts and pancreatic ascites [11]. For these reasons, many laboratories choose to maintain amylase assays.
An unmet need for biomarkers of AP severity
Although AP may be readily diagnosed from elevations in amylase or lipase, there is an unmet need for biomarkers or algorithms that can identify severe forms of AP early in the disease course. Twenty to thirty percent of AP patients develop a moderate or severe form of the disease involving single or multiple organ dysfunction or failure and requiring intensive care. Identifying severe cases early, so that treatment can be tailored to minimize complications, remains one of the major challenges of AP. Risk factors such as old age and obesity often correlate with disease severity. However, neither amylase nor lipase levels correlate with disease severity, and no other laboratory tests are consistently accurate in predicting severity in patients with AP.
In 2019, the World Society of Emergency Surgery (WSES) published guidelines for the management of severe AP [12]. These guidelines indicate that C-reactive protein (CRP), an acute phase reactant synthesized by the liver and a non-specific indicator of inflammation, may have a role as a prognostic factor for severe AP. However, CRP may not reach peak levels for 48 to 72 h, limiting it as an early severity indicator. Specifically, WSES recommended that a CRP result greater than or equal to 150 mg/L on the third day after AP onset could be used as a prognostic factor for severe disease. Elevated or rising blood urea nitrogen, hematocrit, lactate dehydrogenase, and procalcitonin have also demonstrated predictive value for pancreatic necrosis infections.
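As a minimal sketch, the WSES threshold cited above could be expressed as follows. The function is hypothetical, treats day 3 or later identically (a simplifying assumption), and is not a validated severity score.

```python
# Minimal sketch of the WSES day-3 CRP rule cited above; not a validated severity score.

CRP_THRESHOLD_MG_L = 150.0  # WSES: CRP >= 150 mg/L on the third day after AP onset

def crp_suggests_severe_ap(crp_mg_l: float, days_since_onset: int) -> bool:
    """Flag possible severe AP when CRP reaches the WSES threshold on or after day 3."""
    return days_since_onset >= 3 and crp_mg_l >= CRP_THRESHOLD_MG_L

print(crp_suggests_severe_ap(crp_mg_l=210.0, days_since_onset=3))  # True
```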
Other biomarkers have been investigated to distinguish mild from non-mild forms of AP. Interleukin-6 has shown good discriminatory capability in combination with CRP [13]. Resistin is a more recently discovered peptide hormone that was first described as a contributor to insulin resistance (hence the name). Resistin is secreted by adipocytes and may play a role in obesity, hypertriglyceridemia, and inflammatory cytokine reactions. A prospective observational study found that resistin levels were better than CRP for predicting severe AP on the third day and for predicting the development of necrosis [14]. However, more studies are needed before resistin can be recommended as a prognostic indicator, and clinical resistin testing is not widely available. Thus, there remains a need for prognostic severity biomarkers that rise early (prior to 48 h) in the course of AP.
The author
Allison B. Chambliss PhD, DABCC, Department of Pathology, Keck School of Medicine of the University of Southern California, Los Angeles, CA 90033, USA
E-mail: abchambl@usc.edu
by Dr Huub H. van Rossum
Recently, significant improvements have been made in understanding and applying moving average quality control (MA QC) that enable its practical implementation. These include the description of new, laboratory-specific MA QC optimization and validation methods, their online availability, insights into operational requirements, and demonstrations of practical implementation.
Introduction
Moving average quality control (MA QC) is the process of algorithmically averaging obtained test results and using that average for (analytical) quality control purposes. MA QC is generally referred to as patient-based real-time quality control (PBRTQC) because it is one of various methods (e.g. limit checks, delta checks, etc) that use patient results for (real-time) quality control. MA QC was first described over half a century ago as the ‘average of normals’ [1]. Since then, it has evolved into a more general MA QC concept not necessarily based on mean calculations of the obtained ‘normal’ test results [2]. Although MA QC has been available for a few decades, its adoption by laboratories has been limited by the complexity of setting up the necessary procedures, operational challenges and a lack of evidence to justify its application and demonstrate its value. During the past 5 years, however, significant improvements have been made in the field of MA QC, and research studies have addressed all these issues. Consequently, true practical application of validated MA QC procedures to support analytical quality control in medical laboratories is now possible. Furthermore, the recent improvements may well change the way daily analytical quality control is performed in medical laboratories in the near future.
MA QC optimization and validation
The recent significant improvements in the field of MA QC include, first and foremost, the description of new methods to design and optimize laboratory-specific MA QC procedures and to validate their actual error-detection performance [2–5]. These methods use realistic MA QC simulations based on laboratory-specific datasets and thus provide objective insights into MA QC error detection [2]. To enable practical implementation, the requirement that the number of MA QC alarms must be manageable is now acknowledged as essential and is fulfilled when setting up MA QC [2, 6]. The newly developed methods use a novel metric to express error-detection performance: the mean or median number of test results needed for error detection. One of the new methods presents these simulation results as bias-detection curves, so that the optimal MA QC procedure can be selected on the basis of its overall error-detection performance [5]. An example of a bias-detection curve and its application is presented in Figure 1. After selecting the optimal MA QC settings, an MA validation chart can be used to obtain objective insight into the overall error-detection performance and its uncertainty. This chart can therefore be seen as a validation of the MA QC procedure. An example of an MA validation chart is presented in Figure 2; it shows that the MA QC procedure will almost always (with 97.5% probability) detect a systematic error of −4% (or larger negative errors) within 20 test results.
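A deliberately simplified simulation along these lines illustrates how the "number of results needed for error detection" metric can be estimated. The analyte, window size, control limits and bias below are arbitrary examples and do not reflect the MA Generator algorithms or any validated laboratory settings.

```python
# Deliberately simplified simulation of moving-average QC bias detection.
# All parameters are arbitrary examples, not validated MA QC settings.
import random
from statistics import mean, median

random.seed(1)

TARGET_MEAN, SD = 140.0, 2.0               # hypothetical sodium-like analyte (mmol/L)
WINDOW = 10                                # number of consecutive patient results averaged
LOWER_LIMIT, UPPER_LIMIT = 138.5, 141.5    # hypothetical MA control limits

def results_until_detection(bias_percent: float, max_results: int = 500):
    """Simulate biased patient results and return how many are needed before the
    moving average of the last WINDOW results breaches a control limit."""
    window = []
    for n in range(1, max_results + 1):
        value = random.gauss(TARGET_MEAN, SD) * (1 + bias_percent / 100)
        window.append(value)
        if len(window) > WINDOW:
            window.pop(0)
        if len(window) == WINDOW:              # evaluate once the window is full
            ma = mean(window)
            if ma < LOWER_LIMIT or ma > UPPER_LIMIT:
                return n                       # results needed to detect this bias
    return None                                # bias not detected within max_results

# Repeat the simulation to estimate the median number of results needed for a -4% bias.
runs = [results_until_detection(-4.0) for _ in range(200)]
print("median results needed to detect a -4% bias:", median(r for r in runs if r is not None))
```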
Importantly, this method has become available to laboratories via the online MA Generator application, enabling them to design their own optimized and validated MA QC procedures [7]. Laboratories can now upload their own datasets of historical results, study potential MA QC settings using this simulation analysis and obtain their own laboratory-specific MA QC settings and MA validation charts. Several laboratories have demonstrated that this tool has enabled them to obtain relevant MA QC settings and thus implement MA QC [8, 9].
Integration of MA QC with internal QC
Measurement of internal quality control (iQC) samples is still the cornerstone of analytical quality control as performed in medical laboratories. For many tests, iQC alone is sufficient to assure and control the quality of obtained test results. For some tests, however, iQC by itself is insufficient. The reasons for this are related to certain fundamental characteristics of iQC, which include: lack of available (stable) control materials, its scheduled character, the risk of using non-commutable control samples and tests with a sigma metric score of ≤4. For several reasons, PBRTQC or, more specifically, MA QC is a particularly valuable and powerful way to support quality assurance in such cases.
First, if no (stable) QC materials are available it is impossible, or it becomes complicated, to use iQC. This is, for example, relevant for the erythrocyte sedimentation rate, serum indices or hemocytometry tests including erythrocyte mean corpuscular volume in particular. MA QC is possible as long as patient results are available. Second, the scheduled character of iQC becomes a limitation and a risk when temporary assay failures or rapid onset of critical errors occur between scheduled iQC. Because a new MA QC value can be calculated for each newly obtained test result, MA QC can be designed as a continuous and real-time QC tool. In this context, detection of temporary assay failure by MA QC between scheduled iQC has been demonstrated for a sodium case [10], and several examples of MA QC detection of rapid onset of critical errors have been published for both chemistry and hematological tests [11]. Third, because PBRTQC methods such as MA QC use obtained patient results, by design there is no commutability issue. Fourth, and finally, for some tests iQC is intrinsically limited in its ability to detect relevant clinical errors, due to the low ratio of biological variations to analytical variations, as reflected in low sigma metric values. Such tests require frequent iQC analysis and application of stringent control rules. However, even with such an intensive and strict iQC set-up, the probability of detecting clinically relevant errors remains limited [12]. In contrast, MA QC has the best error-detection performance for tests with a low sigma value [13].
For all these reasons, MA QC is ideal for supplementing analytical quality control by iQC. Recently, an approach was presented that integrated MA QC into the QC plan when iQC was found to be insufficient [9]. This approach was based on first determining whether one of the abovementioned iQC limitations applied to a test. If so, then iQC alone was considered insufficient and MA QC was studied, using the online MA Generator tool (www.huvaros.com) to obtain optimal MA QC settings and MA QC procedures to support the analytical quality control [7, 9]. The MA QC error-detection performance was validated using MA validation charts. These latter insights into MA QC error detection also enabled iQC measurements to be reduced. The MA QC procedures alone provided significant error-detection performance, so running iQC measures multiple times a day would add only limited error-detection performance. Therefore, it was decided to run the iQC only once a day and add the obtained MA QC procedures to the QC plan.
Others have taken this a step further and studied MA QC not only for tests with limited iQC performance but also for a much larger test selection, in order to reduce the number of iQC measurements and to schedule and apply iQC more efficiently [4]. This approach has been shown to be successful for a large commercial laboratory with high production numbers. Since MA QC error-detection performance improves with an increasing number of test results and benefits from a small number of pathological test results, this approach may be particularly valuable to larger commercial laboratories. For such an approach, the key is objective insight into the error-detection performance of MA QC procedures, such as that obtained using MA validation charts.
Implementation and application of MA QC for real-time QC in medical laboratories
The final aspect in which there have been significant improvements in recent years relates to the practical application of MA QC in medical laboratories. Recently, an International Federation of Clinical Chemistry and Laboratory Medicine working group was founded that summarized medical laboratories’ experiences of practically applying MA QC and formulated several recommendations for both MA QC software suppliers and medical laboratories that are working on, or are interested in, implementation of MA QC [14, 15]. Also, a step-by-step roadmap has recently been published to enable MA QC implementation [9]. The first two steps of this roadmap – i.e. selection of tests and obtaining MA QC settings for them – were discussed in the previous two paragraphs.
The next step would be to set up and configure the software used to implement MA QC in medical laboratories. If you are interested in applying MA QC in your laboratory, it is important to review the available software (e.g. analyser, middleware, LIS, third party) and to decide which will be used to run and apply MA QC. Your decision depends not only on the availability of suitable software in or for the laboratory, but also on the actual MA QC functionality present in the software packages.
The minimum software features that are necessary to enable practical implementation have been formulated [2, 15]. In my view, key elements would be that the software supports: exclusion of specified samples (non-patient materials, QC results, extreme results, etc), calculation of relevant MA QC algorithms, applying SD-based as well as non-statistical control limits (plain lower and upper control limits), proper real-time alarming and – depending on the MA QC optimization method – presentation of MA QC in a Levey–Jennings or accuracy graph. Figure 3 presents an example of MA QC in an accuracy graph as operated for real-time QC in my laboratory. To enable effective implementation of MA QC, all of these software features should be configured.
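To make this feature list concrete, the sketch below shows what a minimal real-time MA QC routine with result exclusion and plain control limits might look like. The class name, parameters and thresholds are hypothetical and would in practice come from the validated, laboratory-specific MA QC settings and the configured middleware or LIS; graphing (Levey–Jennings or accuracy charts) is not shown.

```python
# Illustrative sketch of a real-time MA QC check with result exclusion and plain
# lower/upper control limits. All names, parameters and thresholds are hypothetical.
from collections import deque
from statistics import mean

class MovingAverageQC:
    def __init__(self, window, lower_limit, upper_limit, exclude_below, exclude_above):
        self.values = deque(maxlen=window)
        self.lower_limit, self.upper_limit = lower_limit, upper_limit
        self.exclude_below, self.exclude_above = exclude_below, exclude_above

    def add_result(self, value, is_patient_sample=True):
        """Add a new result; return an alarm message if the moving average breaches a limit."""
        # Exclusion step: ignore QC material, other non-patient samples and extreme results.
        if not is_patient_sample or not (self.exclude_below <= value <= self.exclude_above):
            return None
        self.values.append(value)
        if len(self.values) < self.values.maxlen:
            return None                                   # window not yet full
        ma = mean(self.values)
        if not (self.lower_limit <= ma <= self.upper_limit):
            return f"MA QC alarm: moving average {ma:.2f} outside [{self.lower_limit}, {self.upper_limit}]"
        return None

# Hypothetical sodium-like configuration: 10-result window, plain control limits,
# and exclusion of results outside a plausible physiological range.
qc = MovingAverageQC(window=10, lower_limit=138.5, upper_limit=141.5,
                     exclude_below=110, exclude_above=170)
results = [139.8, 140.2, 141.0, 139.5, 250.0, 140.1, 139.9, 140.4, 139.7, 140.0, 140.3,
           144.0, 143.5, 144.2, 143.8, 144.5]   # 250.0 is excluded; a positive shift follows
for result in results:
    alarm = qc.add_result(result)
    if alarm:
        print(alarm)   # fires once the sustained shift pushes the moving average over the limit
```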
The final implementation step I wish to address here is the design of laboratory protocols for working up MA QC alarms, which determines the extent to which an error detected by an MA QC alarm is acknowledged. An important requirement is that all MA QC alarms should be worked up by means of this protocol.
As previously indicated, because MA QC can generate many more QC results and alarms than iQC, a critical requirement of every MA QC procedure is a manageable number of alarms. When this requirement is met, an MA QC alarm carries a reasonable chance of reflecting a genuine error.
A first common action as part of the MA QC alarm protocol would be to run iQC. This provides quick insight into the size of the error and enables rapid confirmation of large errors. As a second step, re-running recently analysed samples (in addition to running iQC) enables temporary assay failures to be detected and can confirm or exclude errors that are not necessarily detectable by iQC. Finally, a review of recently analysed test results to identify a pre-analytical cause or a single patient with extreme but valid results is often a very useful part of the MA QC alarm protocol. All these aspects have recently been discussed in greater detail [10, 14].
Conclusions
Altogether, the recent developments in the field of PBRTQC and, more specifically, MA QC now – finally – enable true practical implementation of MA QC in medical laboratories and allow more effective and efficient QC plans to be designed.
The author
Huub H. van Rossum1,2 PhD
1 Department of Laboratory Medicine, The Netherlands Cancer Institute, Amsterdam, The Netherlands
2 Huvaros, Amsterdam, The Netherlands
E-mail: h.v.rossum@nki.nl
Better health care through mass spectrometry – better mass spectrometry through standardization
by Prof. Michael Vogeser, Dr Judy Stone and Prof. Alan Rockwood
While analytical standardization and metrological traceability are well-defined terms, ‘methodological standardization’ in clinical mass spectrometry is still in a developing stage. We propose a framework that facilitates the widespread implementation of this highly complex and very powerful technology, based on two pillars: standardization of the description of LC-MS/MS methods, and standardization of the release of clinical test results as a three-step sequence of method validation, batch validation and validation of individual measurements.
Mass spectrometry in the clinical laboratory
Mass spectrometry (MS)-based methods now play an important role in many clinical laboratories worldwide. To date, areas of application have focused especially on screening for hereditary metabolic diseases, therapeutic drug monitoring, clinical toxicology and endocrinology. These techniques offer significant advantages over immunoassays and photometry as the basic standard technologies in clinical chemistry: high analytical selectivity through true molecular detection; a wide range of applications without the need for specific molecular features (as in UV detection or specific epitopes); high multiplexing capacity and information-rich detection; and, in many cases, matrix-independent analyses, thanks to the principle of isotope dilution [1].
Various MS technologies – in particular tandem MS (MS/MS-coupling with molecular fragmentation), time-of-flight (TOF) MS and Orbitrap-MS – with front-end fractionation technologies such as HPLC or UPLC potentially allow very reliable analysis, but the technology itself is no guarantee of this: these techniques have a very high complexity and a wide range of potential sources of error [2] which require comprehensive quality assurance [3–5]. Indeed, the high degree of complexity is still the main hurdle for the application of MS in the special environment of clinical laboratories. Specific challenges of this type of laboratory – in contrast to research and development laboratories – include: heterogeneous mix of staff qualifications; requirement for maximum handling safety when operating a large number of analysis platforms; work around the clock; and direct impact on the outcome of the individual patient.
Indeed, after more than two decades of commercial availability of LC-MS/MS instruments, their application in a global perspective has remained very limited. The translation of MS into fully automated ‘black box’ instruments is underway, but still far from being realized on a large scale [6], with laboratory developed tests (LDTs) still dominating the field of clinical MS applications. Kit solutions for specific analytes provided by the in vitro diagnostics (IVD) industry are becoming increasingly available, but their application also requires a very high level of skills and competence from laboratories.
Two main differences of MS-based LDTs as opposed to standard ‘plug-and-play’ analysis systems in today’s clinical laboratories can be identified: first, the high heterogeneity of device configurations and second, the handling of large amounts of data, from sample list structures to technical metadata analysis.
In fact, the random access working mode is now so widespread that the ‘analytical batch’ is no longer standard in clinical laboratories. In the same way, modern analytical instruments no longer challenge the end users with extensive metadata (such as reaction kinetics or calibration diagrams). To achieve the goal of making the extraordinary and disruptive analytical power of MS fully usable for medicine to an appropriate extent, approaches to master the heterogeneity of platform configurations and to regulate the handling of batches and metadata are urgently needed – and standardization efforts seem to be crucial in this context.
Standardization of the method description
IVD companies manufacture many different instrument platforms, but each of these platforms is very homogeneous worldwide and is produced in large quantities for years. In contrast, MS platforms in clinical laboratories have to be individually assembled from a very large number of components from many manufacturers (sample preparation modules, autosamplers, high performance pumps, switching valves, chromatography columns, ion sources, mass analysers, vacuum systems, software packages, etc). As a result, hardly any two instrument configurations in different laboratories correspond completely with each other. This makes handling very demanding for operators, maintenance personnel, and service engineers.
Methods implemented on these heterogeneous platforms (e.g. instruments from various vendors) are in turn characterized by a very considerable number of variables, e.g. chromatographic gradients, labelling patterns of internal standards, purity of solvents, dead volume of flow paths, etc.
Taken together, the variety of assays referring to an identical analyte (such as tacrolimus or testosterone) is enormous, with an almost astronomical combinatorial complexity.
However, method publications are still traditionally written more or less as case reports: the feasibility and performance of a method are demonstrated for one individual system configuration. It is usually not clear which features are really essential to the method, which features may vary between implementations, and when a second implementation can still be considered ‘the same’ method. The question of the true ‘identity’ of a method has thus not yet been addressed in depth by application notes or publications in scientific journals, and the level of abstraction required here is missing.
In an attempt to standardize the description of MS/MS-based methods, we selected a set of 35 characteristics that are defined as essential for a method (see Table 1) [7], for example, main approach of sample preparation (e.g. protein precipitation with acetonitrile), main technique of ionization (e.g. electrospray ionization in negative mode); molecular structure of the internal standard; mass transitions; calibration range. In addition, we define 15 characteristics of a method that cannot or should not be realistically standardized in time and space (examples: manufacturer and brand of the MS detector; dead volume of the flow path; lot of analytical columns and solvents). These characteristics – identified as variable – should be documented in the internal report files.
We found it feasible to describe several exemplary MS/MS methods using this scheme and a corresponding matrix. On the basis of this matrix, the method transfer to different platforms and laboratories will be much easier and more reliable. Specifying the identity of a method in the proposed way has the essential advantage that a method revalidation can be transparently triggered by defined criteria, e.g. the use of a novel internal standard with a different labelling pattern.
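As a rough illustration of how such a method-description matrix could be represented in software, the Python sketch below separates essential from variable characteristics and flags when a proposed change would trigger revalidation. The field names and values are hypothetical examples, not the actual characteristic sets defined in the cited publication [7].

```python
# Illustrative method descriptor; the real sets of 35 essential and 15 variable
# characteristics are defined in the cited publication, not reproduced here.
ESSENTIAL = {   # fixed: a change in any of these triggers method revalidation
    "analyte": "example analyte",
    "sample_preparation": "protein precipitation with acetonitrile",
    "ionization": "electrospray ionization, negative mode",
    "internal_standard": "stable-isotope-labelled analogue (13C3)",
    "mass_transitions": ["500.3 -> 200.1", "500.3 -> 155.0"],
    "calibration_range": (1.0, 100.0),      # placeholder units
}

VARIABLE = {    # documented in internal report files, not standardized
    "ms_detector_brand": "vendor X",
    "flow_path_dead_volume_uL": 35,
    "column_lot": "LOT-2020-0417",
    "solvent_lot": "LOT-ACN-0311",
}

def requires_revalidation(current: dict, proposed: dict) -> bool:
    """True if any essential characteristic differs between two implementations,
    e.g. an internal standard with a different labelling pattern."""
    return any(current[key] != proposed.get(key, current[key]) for key in current)
```

A transfer to another laboratory would then only require revalidation when requires_revalidation returns True for the essential characteristics, while differences confined to the variable characteristics would simply be documented.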
The proposed scheme for method description may also be the basis of a comprehensive traceability report for any result obtained by an MS-based method in the clinical laboratory.
Standardization of batch release (Table 2)
While today’s routine analyser platforms essentially provide unambiguous final results for each sample, the process of generating quantitative results from primary data in MS is open and transparent. Primary data in MS are the peak areas of the target analyte observed in diagnostic samples. In addition to these primary data, a range of metadata is provided (e.g. internal standard area, peak height-to-area, peak skewness, qualifier peak area; metadata related to analytical batches, e.g. coefficient of variation (CV) of internal standard areas). This transparency and abundance of data is a cornerstone of the high potential reliability of MS-based assays and therefore their interpretation is very important [8, 9].
However, the evaluation of this metadata – related to individual samples and batches – is nowadays done very heterogeneously from laboratory to laboratory [10]; this applies to LDTs as well as to commercially available kit products. The structure of analytical batches is also very variable and there is no generally accepted standard (number and sequence of analysis of calibration samples in relation to patient and quality control samples, blank injections, zero samples, etc).
While the validation of methods – which is performed before a method is introduced into the diagnostic routine – is discussed in detail in the literature (and in practice), the procedures applied to primary data before release for laboratory reporting have not yet been standardized. Validation is generally defined as the process of testing whether predefined performance specifications are met. Therefore, quality control and release of analytical batches and patient results should also be considered a process of validation, and criteria for the acceptance or rejection of results should be predefined.
A three-step approach to validation, covering the entire life cycle of methods in the clinical laboratory, can be conceptualized: dynamic validation should integrate validation of methods, validation of analytical batches and validation of individual test readings. We believe that standardization of this process of batch and sample result validation and release is needed as a guide for developers of methods, medical directors, and technicians.
In a recent article published in Clinical Mass Spectrometry [11], we propose a list of characteristics that should be considered for batch and sample release. In that article we only mention figures of merit and issues to be addressed, and do not claim to have specific numerical acceptance criteria. This generic list of items is therefore intended as a framework for the development of an individual batch and sample validation plan in a laboratory. Furthermore, we consider this list to be a living document, subject to further development and standardization as the field matures.
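The sketch below illustrates, in Python, how two of the figures of merit mentioned above – the CV of internal standard areas across a batch and the recovery of QC samples – could be checked against predefined acceptance criteria before batch release. The numerical limits are placeholders chosen for illustration, not criteria proposed by the cited article.

```python
import statistics

def validate_batch(internal_standard_areas, qc_recoveries_pct,
                   max_is_cv_pct=20.0, qc_tolerance_pct=15.0):
    """Batch-release sketch: reject the batch if the internal standard area CV
    or any QC recovery exceeds its predefined (here: placeholder) limit."""
    failures = []

    is_cv = 100 * statistics.stdev(internal_standard_areas) / statistics.mean(internal_standard_areas)
    if is_cv > max_is_cv_pct:
        failures.append(f"internal standard area CV {is_cv:.1f}% exceeds {max_is_cv_pct}%")

    for i, recovery in enumerate(qc_recoveries_pct, start=1):
        if abs(recovery - 100.0) > qc_tolerance_pct:
            failures.append(f"QC sample {i} recovery {recovery:.1f}% outside 100 +/- {qc_tolerance_pct}%")

    return len(failures) == 0, failures
```

Individual-sample release would add analogous per-sample checks (for example, internal standard area within an expected range, qualifier ratio within tolerance) as a third validation layer.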
We believe that it is essential to include basic batch and sample release requirements as essential characteristics in the description of a method [7]. Therefore, we believe that efforts to standardize method description and batch/sample release should be synergistically linked to facilitate the use of MS in routine laboratories.
The approach proposed for clinical MS in these two companion articles [7, 11] can be the basis for discussion and eventually for the development of official standards for these areas by the Clinical and Laboratory Standards Institute (CLSI) and/or the International Organization for Standardization (ISO). We believe that these documents can provide a solid basis for internal and external audits of LC-MS/MS-based LDTs, which will become particularly relevant in the context of the IVD Regulation (EU) 2017/746 in the European Union [12].
Both approaches – standardized description of MS methods and standardization of batch release – aim at implementing methodological traceability. This corresponds to the analytical standardization and metrological traceability of measurements to higher order reference materials [13, 14].
Future perspectives
In the future, a commercialization of MS-based black-box instruments on a larger scale is expected. However, LC-MS/MS will remain a critical technique for LDTs, and the flexibility of MS to develop tests on demand – independent of the IVD industry, on fully open LC-MS/MS platforms – will remain a key pillar of laboratory medicine.
Both publications, which this article puts into context [7, 11], have been published in Clinical Mass Spectrometry, the first and only international journal dedicated to the application of MS methods in diagnostic tests including publications on best practice documents. Both articles are freely available.
Clinical Mass Spectrometry is the official journal of MSACL (The Association for Mass Spectrometry: Applications to the Clinical Laboratory; www.msacl.org). MSACL organizes state-of-the-art congresses that focus on translating MS from clinical research to diagnostic tests (i.e. bench to clinic).
In summary, we advocate innovative approaches to methodological standardization of LC-MS/MS methods to master the complexity of this powerful technology and to facilitate and promote its safe application in clinical laboratories worldwide.
The authors
Michael Vogeser*1 MD, Judy Stone2 PhD, Alan Rockwood3 PhD
1 Hospital of the University of Munich (LMU), Institute of Laboratory Medicine, Munich, Germany
2 University of California, San Francisco Medical Center, Laboratory Medicine, Parnassus Chemistry, San Francisco, CA, USA
3 Rockwood Scientific Consulting, Salt Lake City, UT, USA
Volunteer laboratory network launched in UK to expand Covid-19 testing
The UK-based Covid-19 Volunteer Testing Network launched April 9 to provide essential additional testing capacity to front-line workers. The project, started by Mike Fischer CBE, helps small laboratories convert to run critical antigen testing and identify Covid-19 cases among local healthcare workers – at no cost to Government.
The UK has thousands of small laboratories with the right equipment, personnel and processes to run Covid-19 testing. Although some of the critical RT-PCR machines in university and healthcare settings have already been requisitioned by central Government, thousands of others are currently sitting idle in small, ‘long-tail’ facilities up and down the United Kingdom.
Fischer set up SBL, a non-profit medical research laboratory in Oxfordshire, which is already running 250-500 tests a week for 10 GP surgeries in the local area.
“Although our facility is small – with just three full-time staff, two containment hoods and two real-time machines – we were quickly able to convert to Covid-19 testing using the Centre for Disease Control protocols and are now running up to 500 tests a week for the staff at 10 local GP surgeries on a same-day basis,” said Fischer.
“If other labs could join the effort we could quickly scale to providing tens of thousands of tests a day in complement to the central program.”
“If we are going to beat this pandemic, we need to employ every resource we can to make sure that our essential health care workers can go to work safely. Even at our small facility, we have been able to run up to 500 tests a week for NHS staff on a same-day basis. By creating an emergency network of volunteer laboratories like ours across the UK, we can quickly and efficiently create the capacity we need to deliver tens of thousands of additional tests every day.”
The Covid-19 Volunteer Testing Network is being coordinated on an entirely voluntary basis and is looking for further labs to join the effort. “We hope existing equipment can be used in situ with qualified staff volunteering to conduct the tests. We are able to provide guidance, protocols, documentation and reporting,” Fischer added.
The Fischer Family Trust has also made £1 million in funding available to support the purchase of consumables for the tests if labs are unable to cover these.
For more information about the Covid-19 Volunteer Testing Network, visit: www.covid19-testing.org
NanoPass shares proprietary MicronJet microneedle to assist in development of a Covid-19 vaccine
NanoPass is sharing its proprietary MicronJet microneedle device with leading vaccine and immunotherapy companies around the world to assist in development of a Covid-19 vaccine.
The NanoPass device targets immune cells of the skin by harnessing the skin’s potent immune system to improve vaccines and/or to dramatically reduce the dose while achieving the same immunity.
“The human skin is our first layer of defence against many infectious diseases,” says Yotam Levin, MD, CEO of NanoPass. “The skin contains specialized Dendritic Cells that process and induce strong immune responses – that’s why microneedle injections enable reduction of vaccine doses by five-fold, thereby reducing overall cost, required capacity and production time. We believe a reliable injection into the skin is critical for successful activation of broad and effective immune responses, which should be explored for most injectable vaccines.”
The company’s technology is supported by more than 55 completed/ongoing clinical studies with various vaccines and vaccine platforms, including H1N1, H5N1 and live attenuated VZV vaccine, that have shown improved immunogenicity and significant dose-sparing. Pre-clinical evidence with mRNA and DNA vaccines showed promising results.
NanoPass has previously supported the US CDC in a Phase 3 infant polio vaccination trial; worked with the ITRC on PPD skin testing; contributed to Type 1 diabetes immunotherapy work; supported the NIAID with devices to evaluate the immunogenicity of a pandemic influenza vaccine; and has supplied devices to multiple vaccine pharmaceutical companies.
NanoPass Technologies’ flagship product, the 0.6 mm MicronJet, is the first true (<1 mm) microneedle to receive FDA clearance as an intradermal delivery device for substances approved for delivery below the surface of the skin. It is supported by extensive clinical data and regulatory approvals in most major markets including the US, Europe, China and Korea.
IVD assay iAMP Covid-19 Detection Kit receives CE Mark
Fujirebio Europe has received the CE mark for the molecular IVD assay iAMP Covid-19 Detection Kit from its partner Atila Biosystems. The qualitative detection kit is based on real-time fluorescent reverse transcription isothermal amplification, eliminating the need for RNA extraction.
The detection kit was also granted Emergency Use Authorization by the US Food and Drug Administration on April 10.
The iAMP COVID-19 Detection Kit can be run on the Real-Time PCR PowerGene 9600 Plus or on any other qPCR instrument capable of measuring fluorescence in the FAM/HEX channels in real time.
The new iAMP COVID-19 molecular assay complements the existing panel of biomarkers available on the LUMIPULSE® G System for infection (PCT, Ferritin), inflammation (IL-6) and epithelial lung injury (KL-6) to predict disease severity in patients infected with SARS-CoV-2.
Products from Atila Biosystems are available through Fujirebio’s European affiliates and through a large portion of Fujirebio’s existing or new European distribution network.
For more information, visit: www.fujirebio.com/en/contact
Benefits of molecular biology in clinical diagnostics
The application of engineering techniques to the technological revolution in molecular biology has greatly improved the diagnostic capabilities of modern clinical microbiology laboratories. In particular, rapid techniques for nucleic acid amplification and characterization combined with automation and user-friendly software have significantly broadened the diagnostic arsenal. Among the molecular techniques, applicability of PCR-based methods has gained popularity as it allows for rapid detection of unculturable or fastidious microorganisms directly from clinical samples.
Clinical laboratories are increasingly finding utility of molecular techniques in diagnosis and monitoring of disease conditions. Nucleic acid amplification tests are becoming very popular in the diagnosis and management of viral infections [hepatitis B and C viruses (HBV, HCV), human immunodeficiency virus (HIV), influenza virus, etc] because they allow determination of the viral load. In most cases, they are now considered a reference, or gold standard, method for diagnostic practices such as screening donated blood for transfusion-transmitted viruses [cytomegalovirus (CMV), HIV, HCV, etc]. Another important case is the use of molecular tests for the detection of the tuberculosis (TB)-causing bacterium Mycobacterium tuberculosis (MTB). Considering the limited sensitivity of smear microscopy, coupled with the steady rise in drug-resistant MTB, rapid molecular tests appear promising.
What are the challenges of implementing molecular diagnostic techniques in developing countries?
For a long time the field of molecular diagnostics has been limited to the domain of large centralized laboratories because of its dependency on complex and expensive infrastructure, highly skilled manpower and special storage conditions. This investment has also resulted in the need for batch testing to make such facilities affordable. As a result, patients and samples need to travel long distances for a test to be conducted and results are delayed, resulting in patients being lost to follow-up. These factors have led to a concentration of such facilities in urban centres, and poor reach of molecular diagnostic techniques, particularly in low and middle income countries (LMICs). The poor testing rates in the current COVID-19 pandemic are evidence of such dependence on centralized facilities, limiting the ability to test on demand and take appropriate action.
The lack of timely access to good diagnostics, and the delayed or inaccurate diagnoses made by other methods as a result, increasingly lead to the spread of disease and poor treatment outcomes.
How can these challenges be overcome?
We need to increase the reach of molecular diagnostic techniques. Given the economic constraints in LMICs, point-of-care technology (POCT) holds a lot of promise, and several major global initiatives are devoted to providing such devices. Testing facilities that can be deployed, set up and run quickly, at affordable cost, with minimal infrastructure requirements and training are critical to the success of efforts to increase reach. Mobile data coverage, which exists with reasonable density in LMICs, could also be leveraged for better programme management and hotspot detection.
The success of these technologies also depends on uncompromised performance and adherence to quality standards.
Furthermore, designers of POCT devices need to focus on key user requirements which include: (1) simplicity of use; (2) robustness of reagents and consumables; (3) operator safety; and (4) easy maintainability.
What is Molbio Diagnostics doing to meet these demands?
The Truelab® Real Time Quantitative micro PCR System from Molbio Diagnostics brings PCR technology right to the point of care – in laboratory and non-laboratory settings, primary centres, the field and near-patient locations – essentially at all levels of healthcare, thereby decentralizing and democratizing access to molecular diagnostics. With a large and growing menu of assays for infectious diseases, this rapid, portable technology enables early and accurate diagnosis and initiation of correct treatment right at the first point of contact. The platform is infrastructure independent and provides a complete end-to-end solution for disease diagnosis. With proven ability to work even at primary health centres and with wireless data transfer capability, this game-changing technology brings a paradigm shift to the global fight to control and manage devastating infectious diseases.
Under the aegis of the Council of Scientific and Industrial Research and New Millennium Indian Technology Leadership Initiative partnership, Bigtec Labs (research and development wing of Molbio Diagnostics Pvt. Ltd.) has developed a portable and battery-operated micro PCR system that has since been extensively validated [under the Department of Biotechnology and Indian Council of Medical Research (DBT & ICMR)]. Bigtec has also developed various tests and nucleic acid preparation devices to facilitate ‘sample to result’ molecular diagnostics in resource limited settings. The micro PCR system has since been launched in India through the parent company, Molbio Diagnostics, which has its manufacturing and marketing base in Goa, India.
The system works on disease specific Truenat™ microchips for conducting a real-time PCR. The sample preparation (extraction and purification) is done on a fully automated, cartridge-based Trueprep® AUTO sample prep device. The purified nucleic acids are further amplified on the Truelab® Real Time Quantitative micro PCR System which enables molecular diagnostics for infectious diseases at the point of care.
This compact battery-operated system has single-test capability and provides a sample-to-result time within 1 hour. Hence, it enables same-day reporting and initiation of evidence-based treatment for the patient. It also has real-time data transfer capability (through SMS/email) for immediate reporting of results in emergency cases. Physicians benefit from this technology by having a definitive diagnosis, early in the infection cycle, without patients/samples having to travel extensively to centralized facilities.
The Truelab® Real Time Quantitative micro PCR System from Molbio Diagnostics is a cost-effective and sensitive device that can detect diseases accurately with high specificity. The device is battery-operated and portable. This offers the additional advantage of placing the device in almost any kind of laboratory setting, unlike other devices that require uninterrupted power supply, elaborate infrastructure and air-conditioning.
Considering our platform’s potential to perform molecular diagnostics for infectious diseases at the point of care, India has initiated screening for COVID-19 using the Truenat™ Beta CoV test available on the Truelab® Real Time Quantitative micro PCR System. This will allow same-day testing, reporting, and initiation of patient isolation, if required – thereby reducing the risk of infection spreading while waiting for results.
The successful translation of our innovative concept into a product was made possible by Molbio’s multi-disciplinary workforce – with a constant mission to enable better medicine through precise, faster, cost-effective diagnosis at the point of care, and to provide every patient access to the best healthcare through cutting-edge technologies. Molbio aims to be a leading global player in the point-of-care diagnostics arena by continuing to innovate and bring new technologies for social betterment.
The company is based in India – how does this affect what you do, how is the clinical lab diagnostics industry developing in India, and does it create more opportunities for you?
In India, we have between 45,000 and 50,000 in vitro diagnostic laboratories – every one of which uses routine conventional diagnostic methods. Only a handful of them have adopted molecular diagnostic testing, for the reasons mentioned above. But this is changing with the advent of Molbio’s Truelab® platform, with regular standalone laboratories that were, up to now, outsourcing molecular testing starting to perform the tests themselves. In the short span of a few years, Molbio has established itself as a company focused on making a significant impact in aiding infectious disease diagnostics worldwide with our extensive testing menu.
Our test range covers infectious diseases such as TB, the entire hepatitis range, high-risk HPV and H1N1, along with the recent addition of tests for COVID-19, catering to a large population base and addressing diseases with very significant global mortality. Our rapid test development for Nipah virus and the leptospirosis-causing Leptospira bacteria shows our commitment to neglected tropical diseases. Going forward, Molbio will continue to expand the assay range in line with the needs of the global LMIC geography.
The Truenat™ MTB and MTB-RIF tests have started playing a significant role in India’s mission to becoming TB-free by 2025. We would be happy to partner with other National TB Programmes in achieving sustainable development goals well before 2030.
Our vision has always been ‘innovate to have a real impact’ and hence Molbio will continue to bring in newer POCT platforms so that the benefits of science and technology reach the masses.
The interviewee
Dr Chandrasekhar Nair, BE, PhD, chief technical officer, Molbio Diagnostics
For further information visit Molbio Diagnostics (http://www.molbiodiagnostics.com)
Parallel genetic testing for primary lactose intolerance and hereditary fructose intolerance
by Dr Jacqueline Gosink
Gastrointestinal complaints are very common and can be difficult to diagnose. Among the many causes are genetic deficiencies in digestive enzymes. Molecular genetic analysis of polymorphisms in the patient’s DNA can determine if inborn enzyme deficits are behind the digestive problems, aiding differential diagnostics. Primary lactose intolerance, for example, is associated with polymorphisms in the regulatory region of the lactase gene (LCT), whereas hereditary fructose intolerance (HFI) is caused by mutations in the aldolase B gene (ALDOB). A PCR-based DNA microarray provides parallel determination of the two main lactose intolerance-associated polymorphisms (LCT‑13910C/T and LCT‑22018G/A ) as well as the four HFI-associated mutations (A149P, A174D, N334K and del4E4). The fast and simple determination includes fully automated data evaluation, ensuring highly standardized results.
Lactose intolerance
Primary lactose intolerance is a genetically caused deficiency of lactase, the enzyme responsible for splitting lactose into its constituent sugars glucose and galactose. In affected patients, undigested lactose is fermented in the ileum and large intestine, producing by-products such as short-chain fatty acids, methane and hydrogen, which cause the typical symptoms of abdominal pain, nausea, meteorism and diarrhea. Secondary manifestations include deficiencies, for example of vitamins, and as a result unspecific symptoms such as fatigue, chronic tiredness and depression.
Lactose intolerance represents the natural state in mammals. Lactase activity decreases after weaning and in adulthood is often only a fraction of the activity in infancy. Some humans, however, retain the ability to metabolize lactose into adulthood due to specific genetic variants. The frequency of lactase persistence is around 35% worldwide, although it varies greatly between different population groups. It is prevalent in regions with a long tradition of pastoralism and dairy farming, for example in Europe and in populations of European descent. In large parts of eastern Asia, on the other hand, almost 100% of the population is lactose intolerant.
In addition to the primary genetically caused form of lactose intolerance there is also the secondary acquired form. This develops as a result of damage to the intestine, for example from other gastrointestinal diseases such as Crohn’s disease, coeliac disease, infectious enteritis or injury from abdominal surgery. The two forms need to be distinguished diagnostically because of the need for different treatment regimes. Whereas individuals with primary lactose intolerance must adhere to a lactose-free or low-lactose diet for life or alternatively take lactase supplements, those with secondary lactose intolerance need only restrict their dairy intake until the intestinal epithelium has regenerated through treatment of the underlying cause.
Diagnostics of lactose intolerance
Classic diagnostic tests for lactose intolerance are the hydrogen breath test and blood glucose tests, with which the patient’s ability to metabolize lactose is examined. However, these tests have a low specificity and sensitivity and are influenced by individual factors such as the composition of intestinal flora, colonic pH, gastrointestinal motility and sensitivity to lactose fermentation products. Moreover, they cannot distinguish between the primary and secondary forms of lactose intolerance. Molecular genetic testing complements these methods, enabling verification or exclusion of primary lactose intolerance with high probability, as well as differentiation of the primary and secondary forms. Genetic testing is, moreover, a non-invasive and more comfortable examination, which does not carry the risk of provoking symptoms of lactose intolerance in non-lactase-persistent individuals.
LCT polymorphisms
The main mutations associated with lactase persistence are LCT‑13910C>T and LCT‑22018G>A, which are located in the regulatory region of the lactase gene. According to current knowledge, homozygous carriers of the wild-type variants LCT‑13910CC and LCT‑22018GG develop lactose intolerance, while heterozygous carriers of the variants LCT‑13910CT and LCT‑22018GA only show corresponding symptoms in stress situations or with intestinal infections. Homozygous carriers of the mutant variants LCT‑13910TT and LCT‑22018AA are lactose tolerant as adults. These two polymorphisms are strongly coupled.
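The genotype-to-phenotype interpretation described above can be summarized in a short lookup, shown here as an illustrative Python sketch (not the reporting logic of any commercial software):

```python
def interpret_lct(genotype_13910: str, genotype_22018: str) -> str:
    """Map LCT regulatory-region genotypes to the interpretation given in the
    text; any other combination should prompt review of the raw data."""
    interpretations = {
        ("CC", "GG"): "wild type at both positions: primary lactose intolerance expected",
        ("CT", "GA"): "heterozygous: symptoms mainly in stress situations or with intestinal infections",
        ("TT", "AA"): "homozygous variant: lactase persistence (lactose tolerant as an adult)",
    }
    return interpretations.get((genotype_13910, genotype_22018),
                               "unexpected or discordant genotype combination: review required")

print(interpret_lct("CC", "GG"))
```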
Hereditary fructose intolerance
HFI is caused by mutations in the gene for aldolase B, an enzyme essential for fructose metabolism. The mutations result in a reduction or loss of activity or stability of aldolase B, which is responsible for catalysing the breakdown of fructose-1-phosphate (F-1-P) to dihydroxyacetone phosphate and glyceraldehyde. The toxic intermediate F-1-P then accumulates in the body, causing symptoms such as nausea, vomiting and digestive disorders and, in the longer term, liver damage. HFI is a rare disease, occurring, for example, with a prevalence of 1 in 20,000 in Europe. It already manifests in childhood, but may remain undiagnosed due to patients’ natural dislike of sweets, fruits and vegetables.
In addition to HFI, intolerance to fructose can also be caused by deficits in the transport of fructose into the enterocytes. This form is known as intestinal fructose intolerance or fructose malabsorption. It is much more common than HFI, occurring with a prevalence of about 30%. It is important to distinguish HFI from fructose malabsorption, because of the resulting difference in dietary requirements. Patients with HFI must completely eliminate fructose and its precursors (e.g. sucrose, sorbitol) from their diet to prevent damage to their organs. Patients with fructose malabsorption, however, should follow a fructose-restricted diet.
Diagnostics of HFI
Intolerance to fructose is usually diagnosed by means of the hydrogen breath test, in which a defined amount of fructose is ingested and then the amount of hydrogen in the exhaled air is measured. In patients with HFI, however, the intake of fructose carries the risk of a severe hypoglycaemic reaction. Therefore, a molecular genetic test for HFI should always be performed before a fructose load test. Early diagnosis of HFI is particularly important to avoid permanent damage to the liver, kidney and small intestine.
ALDOB mutations
In Europe the most frequent mutants associated with HFI are the amino acid substitutions A149P, A174D, N334K (in Human Gene Mutation Database nomenclature) and the deletion del4E4 in the aldolase B gene. For HFI to manifest, both alleles of an individual’s DNA must be affected by a mutation. In homozygous genotypes, the two alleles contain the same mutation (paternal and maternal inheritance). If the two alleles exhibit different mutations, this is referred to as a compound heterozygous HFI genotype.
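The distinction between homozygous and compound heterozygous HFI genotypes can likewise be expressed as a small sketch; the mutation panel follows the text, while the function itself is purely illustrative:

```python
from typing import Optional

HFI_MUTATIONS = {"A149P", "A174D", "N334K", "del4E4"}   # panel described in the text

def classify_aldob_genotype(allele1: Optional[str], allele2: Optional[str]) -> str:
    """HFI manifests only when both alleles carry a mutation; identical mutations
    give a homozygous genotype, different ones a compound heterozygous genotype."""
    hit1 = allele1 in HFI_MUTATIONS
    hit2 = allele2 in HFI_MUTATIONS
    if hit1 and hit2:
        return "homozygous HFI genotype" if allele1 == allele2 else "compound heterozygous HFI genotype"
    if hit1 or hit2:
        return "heterozygous carrier: HFI not expected from the panel mutations"
    return "no panel mutation detected (rare mutations outside the panel are not excluded)"

print(classify_aldob_genotype("A149P", "del4E4"))   # compound heterozygous HFI genotype
```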
Parallel genetic analysis
Molecular genetic determination of the polymorphisms associated with lactose intolerance and HFI enables diagnosis of these genetic conditions with high certainty. The EUROArray Lactose/Fructose Intolerance Direct enables simultaneous detection of the lactose-intolerance-associated polymorphisms ‑13910C/T and ‑22018G/A and the HFI-associated mutations A149P, A174D, N334K and del4E4. Thus, the two genetically caused metabolic disorders can be assessed with a single test.
The test can be performed on whole blood samples, eliminating the need for costly and time-consuming DNA isolation. In the test procedure (Fig. 1), the sections of DNA containing the alleles are first amplified by multiplex PCR using highly specific primers. During this process the PCR products are labelled with a fluorescent dye. The PCR mixture is then incubated with a microarray slide containing immobilized DNA probes. The PCR products hybridize with their complementary probes and are subsequently detected via the emission of fluorescence signals. The data are evaluated fully automatically using EUROArrayScan software (Fig. 2), and in the case of positive results, homozygous and heterozygous states are differentiated. Numerous integrated controls ensure high reliability of results, for example by verifying that there are no other rare mutations in direct proximity to the tested positions which could interfere with the analysis.
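Purely to illustrate the principle of differentiating homozygous and heterozygous states from paired probe signals (the EUROArrayScan evaluation itself is automated and validated, and its algorithm is not reproduced here), a generic zygosity call might look like this, with an arbitrary placeholder threshold:

```python
def call_zygosity(wild_type_signal: float, mutant_signal: float,
                  positive_threshold: float = 500.0) -> str:
    """Generic illustration: both probe signals positive -> heterozygous;
    only one positive -> homozygous for that allele. The threshold is an
    arbitrary placeholder, not a value used by the commercial software."""
    wt_positive = wild_type_signal >= positive_threshold
    mut_positive = mutant_signal >= positive_threshold
    if wt_positive and mut_positive:
        return "heterozygous"
    if mut_positive:
        return "homozygous for the mutant allele"
    if wt_positive:
        return "homozygous wild type"
    return "invalid: no signal, check the integrated controls"
```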
Studies on blood donors
The performance of the EUROArray was investigated using 116 precharacterized samples from blood donors in Germany and from quality assessment schemes. The EUROArray revealed a sensitivity of 100% and a specificity of 100% with respect to the reference molecular genetic method.
Conclusions
Diagnosis of gastrointestinal disorders often involves a long and challenging process of diagnostic tests and restrictive diets. Since lactose and fructose are widely consumed in many diets, it is important to consider intolerance to these sugars during the diagnostic work-up. Simple genetic analysis enables primary lactose intolerance and HFI to be confirmed or excluded as the cause of gut problems. The parallel analysis offered by the EUROArray enables especially fast and effective diagnostics. Patients diagnosed with these genetic conditions can promptly adapt their diets to ease their symptoms. If the analysis is negative, the physician can focus on searching for other causes of the digestive complaints. The molecular genetic analysis thus provides valuable support for the gastroenterology clinic.
The author
Jacqueline Gosink PhD
EUROIMMUN AG, 23560 Lübeck, Germany
Sensitive and precise multiplex assays enable accurate classification and surveillance of tumours
by Prof. Godfrey Grech, Dr Stefan Jellbauer and Dr Hilary Graham
Understanding the molecular characteristics of tumour heterogeneity and the dynamics of progression of disease requires the simultaneous measurement of multiple biomarkers. Of interest, in colorectal cancer, clinical decisions are taken on the basis of staging and grade of the tumour, resulting in highly variable clinical outcomes. Molecular classification using sensitive and precise multiplex assays is required. In this article we shall explain the use of innovative methodologies using signal amplification and bead-based technologies as a solution to this unmet clinical need.
Introduction
Cancer is a leading cause of death globally, accounting for 9.6 million deaths in 2018, with 70% of cancer-related mortality occurring in low- and middle-income countries. In 2017, only 26% of low-income countries provided evidence of full diagnostic services in the public sector, contributing to late-stage presentation [1]. There are various aspects that negatively affect the survival rate of patients, including but not limited to:
(a) highly variable clinical outcome mainly due to lack of molecular classification;
(b) treatment of advanced stage of the disease mainly due to lack of, or reluctance to, screening programmes, resulting in treatment of symptomatic disease that is already in advanced stage;
(c) heterogeneity of the tumours that are undetected using representative biopsies of the tumour at primary diagnostics; and
(d) lack of surveillance of patients to detect early progression of disease and metastasis, mainly due to clinically inaccessible tumour tissue and the need of sensitive technologies to measure early metastatic events.
Colorectal cancer (CRC) represents the second most common cause of cancer-related deaths, with tumour metastasis accounting for the majority of cases. To date, treatment decisions in CRC are based on cancer stage and tumour location, resulting in highly variable clinical outcomes. Only recently, a system of consensus molecular subtypes (CMS) was proposed based on gene expression profiling of primary CRC samples [2]. Organoid cultures derived from CRC samples were used in various studies to adapt the CMS signature (CMS1–CMS4) to preclinical models, to study heterogeneity and to measure response to therapies. Of interest, epidermal growth factor receptor (EGFR) and receptor tyrosine-protein kinase erbB-2 (HER2) inhibitors were selective and showed strong inhibitory activity on CMS2, indicating that subtyping provides information on potential first-line treatment [3]. In CRC, copy number variations are associated with the adenoma-to-carcinoma progression, metastatic potential and therapy resistance [4]. Our recent studies using primary and matched metastatic tissue showed that TOP2A (encoding DNA topoisomerase II alpha) and CDX2 (encoding caudal type homeobox 2) gene amplifications are associated with disease progression and metastasis to specific secondary sites. Hence, introducing robust and clinically practical molecular assays that enable measurement of multiple biomarkers in matched resected material and in tumour-derived cells or cell vesicles in blood, during therapy and beyond, has become a necessity to reduce this deadly toll. In addition, to support diagnostics in remote settings, the assays should allow measurement in low input, low quality tissue material.
To enable precise future diagnosis, patient classification and surveillance, we developed innovative methodologies (Innoplex assays) measuring expression of multiple marker panels representing the primary tumour heterogeneity and the dynamic changes associated with disease progression. We optimized these methodologies for multiplex digitalized readout using various sample sources, ranging from archival formalin-fixed paraffin-embedded (FFPE) tissues to blood-derived exosomes for characterization of gene amplifications. In this article we summarize the Innoplex assays based on the xMAP Luminex Technology and the Invitrogen QuantiGene™ Plex Assay, the research outputs from the University of Malta in terms of the biomarker panels, and the commercialization of the assays through Omnigene Medical Technologies Ltd.
Molecular profiling technology and workflow
The Innoplex multiplex assays are based on two components, namely (a) the integration of the Invitrogen QuantiGene™ Plex Assay (Thermo Fisher Scientific) and the xMAP Luminex technology, enabling multiplexing of the technique, and (b) the novel panel of biomarkers developed by the Laboratory of Molecular Oncology at the University of Malta, headed by Professor Godfrey Grech. Together, the technologies and the research output provide the versatility of the assays. To date, a breast cancer molecular classification panel and a CRC metastatic panel have been developed and are currently being optimized for the clinical workflow by Omnigene Medical Technologies Ltd through the miniaturization and automation of the RNA-bead plex assay.
The Innoplex RNA-bead plex assays use the QuantiGene branched-DNA technology that runs on the Luminex xMAP technology. Specific probes are conjugated to paramagnetic microspheres (beads) that are internally infused with specific proportions of red and infrared fluorophores, used by the Luminex optics (first laser/detector) to identify the specific beads known to harbour specific probes. The QuantiGene branched-DNA technology builds a molecular scaffold on the specifically bound probe–target complex to amplify the signal, which is read by a second laser/LED [5].
The workflow of the assay can be divided into a pre-analytical phase involving the lysis/homogenization of the tissue or cells, and the analytical phase that involves hybridization, pre-amplification and signal amplification with a total hands-on time of 2 h. This is comparable to the time required to prepare a 5-plex quantitative real-time (qRT)-PCR reaction. Increased multiplexing within a reaction will result in an increase in hands-on time for qRT-PCR, while the same 2 h are retained for the Innoplex assays. As shown by Scerri et al. [5], qRT-PCR 40-plex reactions will require 9 h to prepare as compared to the bead-based assay which retains a 2 h workflow. Hence, the bead-based assays have the advantage for high-throughput analysis in multiplex format.
Performance and applications
We have shown in previous studies, using breast cancer patient material, that gene expression can be measured using our RNA-based multiplex assays in FFPE patient archival material that was of low quality and low input [6]. Using a 22-plex assay, inter-run regression analysis using RNA extracted from cell lines performed well with an r2>0.99 in our hands. These assays were also evaluated by other groups using snap-frozen and FFPE tissues derived from patient and xenograft samples. In comparison with the reference methods, the bead-based multiplex assays outperformed the qRT-PCR when using FFPE-tissue-derived RNA, giving reliability coefficients of 99.3–100% as compared to 82.4–95% for qPCR results, indicating a lower assay variance [5].
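For readers who want to reproduce this kind of inter-run comparison, the short Python sketch below computes the coefficient of determination (r2) between paired signal values from two runs of the same multiplex panel; the function is generic and not part of the assay software, and the example values are hypothetical.

```python
def r_squared(run1, run2):
    """Coefficient of determination between paired values from two runs,
    i.e. the squared Pearson correlation used to assess inter-run agreement."""
    n = len(run1)
    mean_x = sum(run1) / n
    mean_y = sum(run2) / n
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(run1, run2))
    sxx = sum((x - mean_x) ** 2 for x in run1)
    syy = sum((y - mean_y) ** 2 for y in run2)
    return (sxy * sxy) / (sxx * syy)

# Hypothetical signals for a small panel measured in two runs:
run1 = [120.0, 340.0, 95.0, 560.0, 210.0]
run2 = [118.0, 352.0, 90.0, 548.0, 215.0]
print(f"inter-run r2 = {r_squared(run1, run2):.3f}")
```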
One main advantage of the Innoplex assays is the direct measurement of gene expression on lysed/homogenized tissues and cells, providing a simplified workflow without RNA extraction, cDNA synthesis and target amplification. In addition, due to its chemistry and use of beads, gene expression can be measured in a multiplex format (up to 80 genes) using low input and low quality material. This enables the use of the assay in remote laboratories and, as detailed below, for stained microdissected material and for measuring multiple markers in low abundance material, such as blood-derived circulating tumour cells.
Comparison of gene expression data from homogenized and lysed patient tissue derived from either unstained or hematoxylin and eosin (H&E)-stained sections shows a high correlation (r2>0.98). This provides an advantage when studying heterogeneous tumours that are microdissected from H&E-stained slides. In fact, using this methodology, an estrogen-receptor-positive tumour was analysed and one of the tumour foci had a more advanced tumour expressing the mesenchymal marker FN1 (fibronectin). This was only possible by running a 40-plex assay on minimal input material (microdissected from a 20 μm section) representing markers for molecular classification, epithelial-to-mesenchymal transition, and proliferation [7]. A recent audit on breast cancer diagnosis clearly indicates that heterogeneous cases characterized using the bead-based multiplex assays on resection tumour samples are not represented in matched biopsies used for patient diagnosis. In fact, only 3.5% of 97 intra-tumour heterogeneous cases were detected in a cohort of 570 patients at diagnosis. The advantage of the digitalized result of the Innoplex assays is to avoid increasing the workload of pathologists when resected samples are re-analysed to characterize multiple sites within a tumour.
Multiplexing provides both sensitivity and versatility in biomarker validation and was instrumental in our hands to measure gene amplifications in cancer-derived exosomes (tumour-derived vesicles in blood) using plasma from CRC patients. Of interest, these methods have been optimized using cancer cell lines to measure RNA transcripts in cells at low abundance, mimicking the isolation of circulating tumour cells from blood [5]. In this study we show that measurement of transcripts of EPCAM (encoding epithelial cell adhesion molecule), KRT19 (encoding keratin, type I cytoskeletal 19), ERBB2 (encoding HER2) and FN1 maintains a linear signal down to 15 cells or fewer. In addition, the simple workflow with direct measurement using lysed cells enables this assay to be translated more efficiently to the clinical setting. Absolute quantification of transcripts presents alternative endpoint methods to the Invitrogen QuantiGene™ Plex Assay. Droplet digital PCR (dPCR) and NanoString’s nCounter® technology are precise and sensitive methods. However, multiplexing in dPCR is limited and RNA studies are hindered by reverse transcription inefficiency. The nCounter® technology requires multiple target enrichment (PCR-based pre-amplification) to measure low input RNA, which introduces amplification bias and a risk of false positive results.
Summary
In conclusion, the innovative multiplex assays indicate a shift from reactive medicine (treating patients based on average risks) towards predictive, precise and personalized treatment that takes into account the heterogeneity of the primary tumour, tumour progression during therapy and metastatic surveillance of the individual patient. The versatility of the method allows the development of various assays to support different applications (Figs 1 & 2). Our first innovative methods were developed for the molecular classification of luminal and basal breast cancer and to predict sensitivity to specific therapy in the triple-negative breast cancer subtype [8]. As discussed above, the multiplex assays have a wide range of possible applications in the diagnosis of tumours and the surveillance of tumours during therapy. The main advantages of these methods include:
(a) implementation of high-throughput analysis which has a positive impact on remote testing and implementation of such assays in patient surveillance and clinical trials;
(b) the digitalized result excludes subjectivity and equivocal interpretation, which are common events in image-based measurements, and also eliminates the need for highly specialized facilities and human resources;
(c) accurate and precise detection of multiple targets in one assay, minimizing the use of precious patient samples; and
(d) measurement of gene expression in heterogeneous tumours and in low-input/low-quality patient material. The method is streamlined with current pathology laboratory practices, resulting in a cost-effective workflow with minimal turnaround time.
The authors
Godfrey Grech*1,2 PhD, Stefan Jelbauer3 PhD, Hilary Graham4 PhD
1 Department of Pathology, Faculty of Medicine & Surgery, University of Malta
2 Scientific Division, Omnigene Medical Technologies Ltd, Malta
3 Thermo Fisher Scientific, Carlsbad, CA 92008, United States
4 Licensed Technologies Group, Luminex Corporation, Austin, Texas
*Corresponding author
E-mail: godfrey.grech@um.edu.mt
Acute pancreatitis biomarkers: too many or too few?
by Dr Allison B. Chambliss
The diagnosis of acute pancreatitis has long relied on elevations in serum amylase or lipase. Recent test utilization efforts have called for the discontinuation of amylase in acute pancreatitis, favouring the higher specificity and longer elevation of lipase. However, neither biomarker correlates with disease severity, and early recognition of severe cases remains a diagnostic challenge.
Introduction to acute pancreatitis
Acute pancreatitis (AP) represents one of the most common gastrointestinal-related causes for hospital admissions. AP refers to an inflammatory condition of the pancreas commonly associated with a severe, rapid onset of abdominal pain. Patients may also experience other non-specific symptoms, including fever, tachycardia, nausea and vomiting. AP may be classified as mild, moderate or severe based on the degree of organ failure and systemic complications, a system referred to as the revised Atlanta classification (Table 1) [1].
The most frequent cause of AP is gallstones, which are hardened deposits of bile. Gallstones may account for 40–70% or more of AP cases, depending on the geographic region [2]. Gallstone pancreatitis typically resolves upon spontaneous or endoscopic removal of the stone. Once recovered, gallstone pancreatitis patients typically undergo cholecystectomy, the surgical removal of the gallbladder, to prevent recurrent AP episodes. Alcohol abuse is typically ranked as the second most frequent cause of AP (25–35% of cases), followed by a variety of other rarer causes such as metabolic abnormalities, drugs and toxins, and trauma.
Treatment for most patients involves supportive care, including fluid resuscitation, pain control and monitoring. Although patients with mild disease may recover within a few days without complications, the most severe cases may involve systemic inflammatory response syndrome with the failure of multiple organs, including acute respiratory failure, shock, and/or renal failure. Rapid diagnosis of AP and assessment of risk for disease severity, both of which rely on laboratory testing, are critical to guide patient management. Recurrent episodes of AP may progress to chronic pancreatitis.
Increases in disease prevalence
The annual incidence of AP is estimated at 20–40 per 100 000 worldwide [3]. Interestingly, the incidence has increased over the past few decades, particularly in Western countries [4]. One study found an increase of 13.2% in AP-related hospital admissions in 2009–2012 compared to 2002–2005 across the USA [5]. Although these epidemiological trends are not entirely understood, several reasons for the overall increasing incidence of AP have been proposed. One hypothesis is the global epidemic of obesity, which may promote gallstone formation. Increases in alcohol consumption could also play a role in some countries. Other experts suggest that the wider availability and increased frequency of laboratory testing may be major factors. This latter concept is in alignment with the fact that although cases of AP have risen, the mortality rate of the disease has, in fact, declined [5]. Nevertheless, mortality remains high in the severe case category.
Biomarkers for AP
Serum amylase and lipase are well-established as the primary biomarkers for the diagnosis of AP. Both amylase and lipase are digestive enzymes; amylase hydrolyses complex carbohydrates to simple sugars, and lipase catalyses the hydrolysis of triglycerides. Although lipase is synthesized predominantly by the pancreas, amylase is produced both by the pancreas (P-type) and the salivary glands (S-type) and is found in several other organs and tissues. Both enzymes are released into the circulation at the onset of AP, and elevations of both are typically observed within 3–6 h [6, 7]. Multiple clinical societies and guidelines recommend a serum amylase or lipase test result greater than three times the upper reference limit as a diagnostic criterion for AP, in addition to characteristic symptoms and imaging findings [2, 8]. Both biomarkers are widely measured by automated enzymatic methods and are thus commonly found in routine hospital laboratories, permitting rapid diagnoses. Notably, most routine assays do not distinguish between P-type and S-type amylase. This distinction requires the analysis of amylase isoenzymes, which is typically limited to reference laboratories.
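As a simple illustration of this biochemical criterion, the rule can be expressed in a few lines of Python; the example values are hypothetical and upper reference limits are assay- and laboratory-specific.

```python
def meets_enzyme_criterion(result: float, upper_reference_limit: float) -> bool:
    """Return True if a serum amylase or lipase result exceeds three times the
    assay's upper reference limit, the biochemical criterion for AP (which is
    applied alongside characteristic symptoms and imaging findings)."""
    return result > 3 * upper_reference_limit

# Illustrative values only: a lipase of 720 U/L against an upper limit of 160 U/L
print(meets_enzyme_criterion(result=720, upper_reference_limit=160))  # True
```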
Questioning the value of amylase
In contrast to amylase, lipase is reabsorbed by the tubules of the kidney and is not excreted into the urine. Thus, lipase tends to remain elevated for longer than amylase, which may allow for a longer diagnostic window for AP. This advantage, in addition to lipase’s higher specificity for the pancreas, has led some organizations to recommend lipase over amylase for the diagnosis of AP. The American Board of Internal Medicine Foundation’s Choosing Wisely® campaign, in collaboration with the American Society for Clinical Pathology, has recommended: “Do not test for amylase in cases of suspected acute pancreatitis. Instead, test for lipase” [9].
Despite these recommendations, many hospital laboratories still maintain assays for amylase. We performed a retrospective audit at our institution to determine the ordering patterns of amylase relative to lipase in cases of AP. We found that in a cohort of 438 consecutive patients admitted with AP, lipase was ordered for all patients, while amylase was only ordered for 12% of patients [10]. We observed that most of the amylase orders stemmed from patients with gallstone pancreatitis who were referred for laparoscopic cholecystectomy procedures and who were under the care of the surgical team. We speculated that amylase may have been co-ordered with lipase in this subgroup of patients to check for biomarker normalization. Laparoscopic cholecystectomy should ideally be performed as soon as possible after gallstone AP resolves, and normalization of amylase or lipase may be used to document that resolution. Because amylase is believed to fall more rapidly than lipase after AP, trending amylase over time could possibly allow for quicker documentation of biomarker normalization. However, our study also showed that there was no significant difference between amylase and lipase in the time taken for the biomarker to fall below three times the upper reference limit. These observations led us to further question the added value of amylase relative to lipase alone in the diagnosis and management of AP.
Lipase does have limitations that may preclude it from being the AP biomarker of choice in some cases. Lipase may be elevated in non-pancreatic conditions such as renal insufficiency and cholecystitis (Table 2). Both amylase and lipase may rarely be non-specifically elevated due to complexes with immunoglobulins, termed macroamylasemia and macrolipasemia. Further, amylase may be useful in the workup of other pancreatic diseases and, unlike lipase, can be measured in the urine. Quantitation of amylase in body fluids, such as pancreatic fluid and peritoneal fluid, can aid in the evaluation of pancreatic cysts and pancreatic ascites [11]. For these reasons, many laboratories choose to maintain amylase assays.
An unmet need for biomarkers for AP severity
Although AP may be easily diagnosed with elevations in amylase or lipase, there is an unmet need for biomarkers or algorithms that can specifically identify severe forms of AP early in the disease course. Twenty to thirty percent of AP patients may develop a moderate or severe form of the disease involving single or multiple organ dysfunction or failure and requiring intensive care. Identifying the severe cases early such that treatment may be tailored to minimize complications remains one of the major challenges of AP. Risk factors such as old age and obesity often correlate with disease severity. However, neither amylase nor lipase levels correlate with disease severity, and no other laboratory tests are consistently accurate to predict severity in patients with AP.
In 2019, the World Society of Emergency Surgery (WSES) published guidelines for the management of severe AP [12]. These guidelines indicate that C-reactive protein (CRP), an acute phase reactant synthesized by the liver and a non-specific indicator of inflammation, may have a role as a prognostic factor for severe AP. However, CRP may not reach peak levels for 48 to 72 h, limiting it as an early severity indicator. Specifically, WSES recommended that a CRP result greater than or equal to 150 mg/L on the third day after AP onset could be used as a prognostic factor for severe disease. Elevated or rising blood urea nitrogen, hematocrit, lactate dehydrogenase and procalcitonin have also demonstrated predictive value for infected pancreatic necrosis.
Other biomarkers have been investigated to distinguish mild from non-mild forms of AP. Interleukin-6 has shown good discriminatory capability in combination with CRP [13]. Resistin is a more recently discovered peptide hormone that was first described as a contributor to insulin resistance (hence the name). Resistin is secreted by adipocytes and may play a role in obesity, hypertriglyceridemia and inflammatory cytokine reactions. A prospective observational study found that resistin levels were better than CRP for predicting severe AP on the third day and for predicting the development of necrosis [14]. However, more studies are needed before resistin can be recommended as a prognostic indicator, and clinical resistin testing is not widely available. Thus, there remains a need for prognostic severity biomarkers that rise early (prior to 48 h) in the course of AP.
The authors
Allison B. Chambliss PhD, DABCC
Department of Pathology, Keck School of Medicine of the University of Southern California, Los Angeles, CA 90033, USA
E-mail: abchambl@usc.edu
Consider moving average quality control when internal control is insufficient or inefficient – the time is now!
by Dr Huub H. van Rossum
Recently, significant improvements have been made in understanding and applying moving average quality control (MA QC) that enable its practical implementation. These include the description of new and laboratory-specific MA QC optimization and validation methods, the online availability thereof, insights into operational requirements, and demonstration of practical implementation.
Introduction
Moving average quality control (MA QC) is the process of algorithmically averaging obtained test results and using that average for (analytical) quality control purposes. MA QC is generally referred to as patient-based real-time quality control (PBRTQC) because it is one of various methods (e.g. limit checks, delta checks, etc) that use patient results for (real-time) quality control. MA QC was first described over half a century ago as ‘average of normals’ [1]. Since then, it has evolved into a more general MA QC concept not necessarily based on using mean calculations of the obtained ‘normal’ test results [2]. Although MA QC has been available for a few decades, its adoption by laboratories has been limited due to the complexity of setting up the necessary procedures, operational challenges and a lack of evidence to justify its application and demonstrate its value. During the past 5 years, however, significant improvements have been made in the field of MA QC, and research studies have addressed all these issues. Consequently, true practical application of validated MA QC procedures to support analytical quality control in medical laboratories is now possible. Furthermore, the recent improvements may well change the way we perform daily analytical quality control in medical laboratories in the near future.
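To make the concept concrete, the following Python sketch shows a minimal ‘average of normals’-style moving average with truncation (exclusion) limits; the window size and limits are placeholders chosen for illustration, not recommended settings, and real MA QC implementations offer far more functionality.

```python
from collections import deque

def moving_average_qc(results, window=20, lower=130.0, upper=150.0):
    """Minimal 'average of normals'-style moving average: results outside the
    truncation limits (lower/upper) are excluded, and a new mean over the last
    `window` accepted results is emitted once the window has filled."""
    buffer = deque(maxlen=window)
    averages = []
    for value in results:
        if lower <= value <= upper:        # exclude extreme/pathological results
            buffer.append(value)
            if len(buffer) == window:      # only report once the window is full
                averages.append(sum(buffer) / window)
    return averages

# Invented sodium-like values (mmol/L); the 175 is excluded by the truncation limits
print(moving_average_qc([139, 141, 138, 175, 140, 142] * 5, window=10))
```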
MA QC optimization and validation
The recent significant improvements in the field of MA QC include, first and foremost, the description of new methods to design and optimize laboratory-specific MA QC procedures and to enable validation of their actual error-detection performance [2–5]. These methods use realistic MA QC simulations based on laboratory-specific datasets and thus provide objective insights into MA QC error detection [2]. To enable practical implementation, the requirement that the number of MA QC alarms must be manageable is now acknowledged as essential and is explicitly addressed when setting up MA QC [2, 6]. The newly developed methods use a novel metric to determine the error-detection performance: that is, the mean or median number of test results needed for error detection. One of the new methods presents these simulation results in bias-detection curves so that the optimal MA QC procedure can be selected, based on its overall error-detection performance [5]. An example of a bias-detection curve and its application is presented in Figure 1. After selecting the optimal MA QC settings, an MA validation chart can be used to obtain objective insights into the overall error-detection performance and the uncertainty thereof. Therefore, this chart can be seen as a validation of the MA QC procedure. An example of an MA validation chart is presented in Figure 2 and shows that the MA QC procedure will almost always (with 97.5% probability) detect a systematic error of −4% (or larger negative errors) within 20 test results.
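The principle behind such simulations can be illustrated with a toy example. The sketch below (a deliberate simplification, not the algorithm used by the MA Generator application) injects a fixed bias into a series of historical results at a random onset point and counts how many affected results are processed before the moving average leaves its control limits, repeating this to obtain a median number of results needed for error detection; all settings and values are hypothetical.

```python
import random
import statistics

def results_to_detection(history, bias, window=20,
                         control_limits=(138.0, 142.0),
                         truncation=(120.0, 160.0), n_sim=200):
    """Toy simulation of MA QC error detection: a systematic bias is added from
    a random onset point and we count how many affected results are processed
    before the moving average leaves the control limits. Returns the median
    count over n_sim runs (None if the error is never detected)."""
    counts = []
    for _ in range(n_sim):
        onset = random.randrange(len(history) // 2)
        buffer, detected_at, n_after_onset = [], None, 0
        for i, value in enumerate(history):
            if i >= onset:                  # the error affects all later results
                value += bias
                n_after_onset += 1
            if truncation[0] <= value <= truncation[1]:   # exclude extreme results
                buffer = (buffer + [value])[-window:]
                ma = sum(buffer) / len(buffer)
                if (len(buffer) == window and i >= onset
                        and not (control_limits[0] <= ma <= control_limits[1])):
                    detected_at = n_after_onset
                    break
        counts.append(detected_at)
    detected = [c for c in counts if c is not None]
    return statistics.median(detected) if detected else None

# Invented, roughly stable historical values (sodium-like, mmol/L)
history = [140 + (i % 7) - 3 for i in range(400)]
print(results_to_detection(history, bias=-5.0))
```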
Importantly, this method has become available to laboratories via the online MA Generator application, enabling them to design their own optimized and validated MA QC procedures [7]. Laboratories can now upload their own datasets of historical results, study potential MA QC settings using this simulation analysis and obtain their own laboratory-specific MA QC settings and MA validation charts. Several laboratories have demonstrated that this tool has enabled them to obtain relevant MA QC settings and thus implement MA QC [8, 9].
Integration of MA QC with internal QC
Measurement of internal quality control (iQC) samples is still the cornerstone of analytical quality control as performed in medical laboratories. For many tests, iQC alone is sufficient to assure and control the quality of obtained test results. For some tests, however, iQC itself is insufficient. The reasons for this are related to certain fundamental characteristics of iQC, including the lack of available (stable) control materials, its scheduled character, the risk of using non-commutable control samples and its limited error detection for tests with a sigma metric score of ≤4. For several reasons, PBRTQC or, more specifically, MA QC is a particularly valuable and powerful way to support quality assurance in such cases.
First, if no (stable) QC materials are available, it is impossible or at least complicated to use iQC. This is relevant, for example, for the erythrocyte sedimentation rate, serum indices and hemocytometry tests, the erythrocyte mean corpuscular volume in particular. MA QC is possible as long as patient results are available. Second, the scheduled character of iQC becomes a limitation and a risk when temporary assay failures or rapid onset of critical errors occur between scheduled iQC. Because a new MA QC value can be calculated for each newly obtained test result, MA QC can be designed as a continuous and real-time QC tool. In this context, detection of temporary assay failure by MA QC between scheduled iQC has been demonstrated for a sodium case [10], and several examples of MA QC detection of rapid onset of critical errors have been published for both chemistry and hematological tests [11]. Third, because PBRTQC methods such as MA QC use obtained patient results, by design there is no commutability issue. Fourth, and finally, for some tests iQC is intrinsically limited in its ability to detect clinically relevant errors, due to the low ratio of biological variation to analytical variation, as reflected in low sigma metric values. Such tests require frequent iQC analysis and application of stringent control rules. However, even with such an intensive and strict iQC set-up, the probability of detecting clinically relevant errors remains limited [12]. In contrast, MA QC has the best error-detection performance for tests with a low sigma value [13].
For all these reasons, MA QC is ideal for supplementing analytical quality control by iQC. Recently, an approach was presented that integrated MA QC into the QC plan when iQC was found to be insufficient [9]. This approach was based on first determining whether one of the abovementioned iQC limitations applied to a test. If so, then iQC alone was considered insufficient and MA QC was studied, using the online MA Generator tool (www.huvaros.com) to obtain optimal MA QC settings and MA QC procedures to support the analytical quality control [7, 9]. The MA QC error-detection performance was validated using MA validation charts. These insights into MA QC error detection also enabled iQC measurements to be reduced: because the MA QC procedures alone provided significant error-detection performance, running iQC measurements multiple times a day would add only limited additional error detection. Therefore, it was decided to run iQC only once a day and add the obtained MA QC procedures to the QC plan.
Others have taken this a step further and studied MA QC not only for tests with limited iQC performance but also for a much larger test selection, in order to reduce the number of iQC measurements and to schedule and apply iQC more efficiently [4]. This approach has been shown to be successful for a large commercial laboratory with high production numbers. Since MA QC error-detection performance improves with an increasing number of test results and benefits from a low proportion of pathological test results, this approach may be particularly valuable to larger commercial laboratories. For such an approach, the key is objective insight into the error-detection performance of MA QC procedures, such as that obtained using MA validation charts.
Implementation and application of MA QC for real-time QC in medical laboratories
The final aspect in which there have been significant improvements in recent years relates to the practical application of MA QC in medical laboratories. Recently, an International Federation of Clinical Chemistry and Laboratory Medicine working group was founded that summarized medical laboratories’ experiences of practically applying MA QC and formulated several recommendations for both MA QC software suppliers and medical laboratories that are working on, or are interested in, implementation of MA QC [14, 15]. Also, a step-by-step roadmap has recently been published to enable MA QC implementation [9]. The first two steps of this roadmap – i.e. selection of tests and obtaining MA QC settings for them – were discussed in the previous two paragraphs.
The next step would be to set up and configure the software used to implement MA QC in medical laboratories. If you are interested in applying MA QC in your laboratory, it is important to review the available software (e.g. analyser, middleware, LIS, third party) and to decide which will be used to run and apply MA QC. Your decision depends not only on the availability of suitable software in or for the laboratory, but also on the actual MA QC functionality present in the software packages.
The minimum software features that are necessary to enable practical implementation have been formulated [2, 15]. In my view, key elements would be that the software supports: exclusion of specified samples (non-patient materials, QC results, extreme results, etc), calculation of relevant MA QC algorithms, applying SD-based as well as non-statistical control limits (plain lower and upper control limits), proper real-time alarming and – depending on the MA QC optimization method – presentation of MA QC in a Levey–Jennings or accuracy graph. Figure 3 presents an example of MA QC in an accuracy graph as operated for real-time QC in my laboratory. To enable effective implementation of MA QC, all of these software features should be configured.
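Purely as an illustration (this is not vendor software or a recommended configuration), these features can be pictured as a small monitor object that excludes non-patient and extreme results, recalculates the moving average for every accepted result and raises an alarm when a control limit is crossed. In practice, such functionality would live in the analyser, middleware or LIS software rather than in laboratory-written scripts.

```python
class MAQCMonitor:
    """Toy real-time MA QC monitor illustrating the software features discussed:
    exclusion of non-patient and extreme results, a simple moving-average
    algorithm, plain lower/upper control limits and an alarm callback."""

    def __init__(self, window, truncation, control_limits, on_alarm=print):
        self.window = window
        self.truncation = truncation          # (low, high): results outside are excluded
        self.control_limits = control_limits  # (low, high): MA outside triggers an alarm
        self.buffer = []
        self.on_alarm = on_alarm

    def add_result(self, value, is_patient_sample=True):
        """Feed one new result; returns the current moving average (or None)."""
        if not is_patient_sample:             # exclude QC material, calibrators, etc.
            return None
        low, high = self.truncation
        if not (low <= value <= high):        # exclude extreme results
            return None
        self.buffer.append(value)
        self.buffer = self.buffer[-self.window:]
        ma = sum(self.buffer) / len(self.buffer)
        cl_low, cl_high = self.control_limits
        if len(self.buffer) == self.window and not (cl_low <= ma <= cl_high):
            self.on_alarm(f"MA QC alarm: moving average {ma:.2f} outside {self.control_limits}")
        return ma
```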
The final implementation step I wish to address here is the design of laboratory protocols for working up MA QC alarms; this protocol determines how an error signalled by an MA QC alarm is confirmed or excluded. An important requirement is that all MA QC alarms should be worked up by means of this protocol.
As previously indicated, because MA QC can generate many more QC results and alarms than iQC, a critical requirement of every MA QC procedure is a manageable number of alarms. As a result, when an MA QC alarm does occur, there is a reasonable chance that a true error is present.
A first common action as part of the MA QC alarm protocol would be to run iQC. This provides a quick insight into the size of the error and enables rapid confirmation of large errors. As a second step, re-running of recently analysed samples (in addition to running iQC) enables temporary assay failures to be detected and can confirm or exclude errors not necessarily detectable by iQC. Finally, a review of recently analysed test results to identify a pre-analytical cause or a single patient with extreme but valid test results is often very useful as part of the MA QC alarm protocol. All these aspects have recently been discussed in greater detail [10, 14].
Conclusions
Altogether, the recent developments in the field of PBRTQC and, more specifically, MA QC now – finally – enable true practical implementation of MA QC in medical laboratories and allow more effective and efficient QC plans to be designed.
The authors
Huub H. van Rossum1,2 PhD
1 Department of Laboratory Medicine, The Netherlands Cancer Institute, Amsterdam, The Netherlands
2 Huvaros, Amsterdam, The Netherlands
E-mail: h.v.rossum@nki.nl
Better health care through mass spectrometry – better mass spectrometry through standardization
by Prof. Michael Vogeser, Dr Judy Stone and Prof. Alan Rockwood
While analytical standardization and metrological traceability are well-defined terms, ‘methodological standardization’ in clinical mass spectrometry is still in a developing stage. We propose a framework that facilitates the widespread implementation of this highly complex and very powerful technology and is based on two pillars – standardization of the description of LC-MS/MS methods and standardization of the release of clinical test results as a three-step sequence of method validation, batch validation and validation of individual measurements.
Mass spectrometry in the clinical laboratory
Mass spectrometry (MS)-based methods now play an important role in many clinical laboratories worldwide. To date, areas of application have focused especially on screening for hereditary metabolic diseases, therapeutic drug monitoring, clinical toxicology and endocrinology. In fact, these techniques offer significant advantages over immunoassays and photometry as basic standard technologies in clinical chemistry: high analytical selectivity through true molecular detection; wide range of applications without the need for specific molecular features (as in UV detection or specific epitopes); high multiplexing capacity and information-rich detection; and, in many cases, matrix-independent analyses, thanks to the principle of isotope dilution [1].
Various MS technologies – in particular tandem MS (MS/MS-coupling with molecular fragmentation), time-of-flight (TOF) MS and Orbitrap-MS – with front-end fractionation technologies such as HPLC or UPLC potentially allow very reliable analysis, but the technology itself is no guarantee of this: these techniques have a very high complexity and a wide range of potential sources of error [2] which require comprehensive quality assurance [3–5]. Indeed, the high degree of complexity is still the main hurdle for the application of MS in the special environment of clinical laboratories. Specific challenges of this type of laboratory – in contrast to research and development laboratories – include: heterogeneous mix of staff qualifications; requirement for maximum handling safety when operating a large number of analysis platforms; work around the clock; and direct impact on the outcome of the individual patient.
Indeed, after more than two decades of commercial availability of LC-MS/MS instruments, their application in a global perspective has remained very limited. The translation of MS into fully automated ‘black box’ instruments is underway, but still far from being realized on a large scale [6], with laboratory developed tests (LDTs) still dominating the field of clinical MS applications. Kit solutions for specific analytes provided by the in vitro diagnostics (IVD) industry are becoming increasingly available, but their application also requires a very high level of skills and competence from laboratories.
Two main differences between MS-based LDTs and the standard ‘plug-and-play’ analysis systems in today’s clinical laboratories can be identified: first, the high heterogeneity of device configurations and, second, the handling of large amounts of data, from sample list structures to technical metadata analysis.
In fact, the random access working mode is now so widespread that the ‘analytical batch’ is no longer standard in clinical laboratories. In the same way, modern analytical instruments no longer challenge the end users with extensive metadata (such as reaction kinetics or calibration diagrams). To achieve the goal of making the extraordinary and disruptive analytical power of MS fully usable for medicine to an appropriate extent, approaches to master the heterogeneity of platform configurations and to regulate the handling of batches and metadata are urgently needed – and standardization efforts seem to be crucial in this context.
Standardization of the method description
IVD companies manufacture many different instrument platforms, but each of these platforms is very homogeneous worldwide and is produced in large quantities for years. In contrast, MS platforms in clinical laboratories have to be individually assembled from a very large number of components from many manufacturers (sample preparation modules, autosamplers, high performance pumps, switching valves, chromatography columns, ion sources, mass analysers, vacuum systems, software packages, etc). As a result, hardly any two instrument configurations in different laboratories correspond completely with each other. This makes handling very demanding for operators, maintenance personnel, and service engineers.
Methods implemented on these heterogeneous platforms (e.g. instruments from various vendors) are in turn characterized by a very considerable number of variables, e.g. chromatographic gradients, labelling patterns of internal standards, purity of solvents, dead volume of flow paths, etc.
Taken together, the variety of assays referring to an identical analyte (such as tacrolimus or testosterone) is enormous, with an almost astronomical combinatorial complexity.
However, method publications are still traditionally written more or less as case reports: the feasibility and performance of a method realization is demonstrated for one individual system configuration. It is usually not clear which features are really essential for the method and which features can vary between different implementations – and when a second implementation can still be considered ‘the same’ method. This means that the question of the true ‘identity’ of a method has not yet been addressed in depth by application notes or publications in scientific journals; the level of abstraction required here is therefore missing.
In an attempt to standardize the description of MS/MS-based methods, we selected a set of 35 characteristics that are defined as essential for a method (see Table 1) [7]; examples include the main approach of sample preparation (e.g. protein precipitation with acetonitrile), the main technique of ionization (e.g. electrospray ionization in negative mode), the molecular structure of the internal standard, the mass transitions and the calibration range. In addition, we define 15 characteristics of a method that cannot or should not realistically be standardized in time and space (examples: manufacturer and brand of the MS detector; dead volume of the flow path; lot of analytical columns and solvents). These characteristics – identified as variable – should be documented in the internal report files.
We found it feasible to describe several exemplary MS/MS methods using this scheme and a corresponding matrix. On the basis of this matrix, the method transfer to different platforms and laboratories will be much easier and more reliable. Specifying the identity of a method in the proposed way has the essential advantage that a method revalidation can be transparently triggered by defined criteria, e.g. the use of a novel internal standard with a different labelling pattern.
The proposed scheme for method description may also be the basis of a comprehensive traceability report for any result obtained by an MS-based method in the clinical laboratory.
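Purely to illustrate the idea (the published scheme defines 35 essential and 15 variable characteristics, whereas the items and values below are abbreviated and hypothetical), such a description can be captured as a structured record that separates identity-defining characteristics from those merely documented per implementation.

```python
from dataclasses import dataclass, field

@dataclass
class LCMSMethodDescription:
    """Illustrative record (not the published 35/15-item scheme) separating
    characteristics that define a method's identity from those documented as
    variable between implementations."""
    essential: dict = field(default_factory=dict)   # changing these triggers revalidation
    variable: dict = field(default_factory=dict)    # documented in internal report files

# Hypothetical example entries, abbreviated for illustration
method = LCMSMethodDescription(
    essential={
        "sample_preparation": "protein precipitation with acetonitrile",
        "ionization": "electrospray ionization, negative mode",
        "internal_standard": "stable-isotope-labelled analogue (structure specified)",
        "mass_transitions": ["403.2 -> 97.0", "403.2 -> 80.0"],   # invented values
        "calibration_range": "0.5-50 ng/mL",                      # invented values
    },
    variable={
        "ms_detector_brand": "documented per laboratory",
        "column_lot": "documented per batch",
        "flow_path_dead_volume": "documented per instrument",
    },
)
```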
Standardization of batch release (Table 2)
While today’s routine analyser platforms essentially provide unambiguous final results for each sample, the process of generating quantitative results from primary data in MS is open and transparent. Primary data in MS are the peak areas of the target analyte observed in diagnostic samples. In addition to these primary data, a range of metadata is provided (e.g. internal standard area, peak height-to-area, peak skewness, qualifier peak area; metadata related to analytical batches, e.g. coefficient of variation (CV) of internal standard areas). This transparency and abundance of data is a cornerstone of the high potential reliability of MS-based assays and therefore their interpretation is very important [8, 9].
However, the evaluation of this metadata – related to individual samples and batches – is nowadays done very heterogeneously from laboratory to laboratory [10]; this applies to LDTs as well as to commercially available kit products. The structure of analytical batches is also very variable and there is no generally accepted standard (number and sequence of analysis of calibration samples in relation to patient and quality control samples, blank injections, zero samples, etc).
While the validation of methods – which is performed before a method is introduced into the diagnostic routine – is discussed in detail in the literature (and in practice), the procedures applied to primary data before release for laboratory reporting have not yet been standardized. Validation is generally defined as the process of testing whether predefined performance specifications are met. Therefore, quality control and release of analytical batches and patient results should also be considered a process of validation, and criteria for the acceptance or rejection of results should be predefined.
A three-step approach to validation, covering the entire life cycle of methods in the clinical laboratory, can be conceptualized: dynamic validation should integrate validation of methods, validation of analytical batches and validation of individual test readings. We believe that standardization of this process of batch and sample result validation and release is needed as a guide for developers of methods, medical directors, and technicians.
In a recent article published in Clinical Mass Spectrometry [11], we propose a list of characteristics that should be considered for batch and sample release. In this article we only mention figures of merit and issues to be addressed, and we do not claim to have specific numerical acceptance criteria. This generic list of items is therefore intended as a framework for the development of a laboratory’s own batch and sample-release validation plan. Furthermore, we consider this list to be a living document, subject to further development and standardization as the field matures.
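To give a flavour of what a laboratory’s batch validation plan might automate (the checks and acceptance values below are placeholders chosen for illustration, not criteria recommended in the cited article), a batch-release step could be sketched as follows.

```python
import statistics

def validate_batch(samples, is_area_cv_max=0.25, qualifier_ratio_tol=0.20):
    """Toy batch-release check over the kinds of metadata discussed above;
    acceptance criteria are placeholders. `samples` is a list of dicts with
    keys 'id', 'is_area', 'quant_area' and 'qual_area'."""
    failures = []

    # Batch-level check: coefficient of variation of internal standard areas
    is_areas = [s["is_area"] for s in samples]
    cv = statistics.stdev(is_areas) / statistics.mean(is_areas)   # needs >= 2 samples
    if cv > is_area_cv_max:
        failures.append(f"IS area CV {cv:.2f} exceeds {is_area_cv_max}")

    # Sample-level check: qualifier/quantifier ion ratio versus the batch mean ratio
    ratios = [s["qual_area"] / s["quant_area"] for s in samples]
    mean_ratio = statistics.mean(ratios)
    for s, ratio in zip(samples, ratios):
        if abs(ratio - mean_ratio) / mean_ratio > qualifier_ratio_tol:
            failures.append(f"sample {s['id']}: ion ratio {ratio:.2f} deviates from batch mean")

    return failures   # an empty list means the batch passes these two checks
```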
We believe that it is essential to include basic batch and sample release requirements as essential characteristics in the description of a method [7]. Therefore, we believe that efforts to standardize method description and batch/sample release should be synergistically linked to facilitate the use of MS in routine laboratories.
The approach proposed for clinical MS in these two companion articles [7, 11] can be the basis for discussion and, eventually, for the development of official standards in these areas by the Clinical and Laboratory Standards Institute (CLSI) and/or the International Organization for Standardization (ISO). We believe that these documents can provide a solid basis for internal and external audits of LC-MS/MS-based LDTs, which will become particularly relevant in the context of the In Vitro Diagnostic Regulation (EU) 2017/746 in the European Union [12].
Both approaches – standardized description of MS methods and standardization of batch release – aim at implementing methodological traceability. This corresponds to the analytical standardization and metrological traceability of measurements to higher order reference materials [13, 14].
Future perspectives
In the future, a commercialization of MS-based black-box instruments on a larger scale is expected. However, LC-MS/MS will remain a critical technique for LDTs, and the flexibility of MS to develop tests on demand – independent of the IVD industry on fully open LC-MS/MS platforms – will remain a key pillar of laboratory medicine.
Both publications, which this article puts into context [7, 11], have been published in Clinical Mass Spectrometry, the first and only international journal dedicated to the application of MS methods in diagnostic tests including publications on best practice documents. Both articles are freely available.
Clinical Mass Spectrometry is the official journal of MSACL (The Association for Mass Spectrometry: Applications to the Clinical Laboratory; www.msacl.org). MSACL organizes state-of-the-art congresses that focus on translating MS from clinical research to diagnostic tests (i.e. bench to clinic).
In summary, we advocate innovative approaches to methodological standardization of LC-MS/MS methods to master the complexity of this powerful technology and to facilitate and promote its safe application in clinical laboratories worldwide.
The authors
Michael Vogeser*1 MD, Judy Stone2 PhD, Alan Rockwood3 PhD
1 Hospital of the University of Munich (LMU), Institute of Laboratory Medicine, Munich, Germany
2 University of California, San Francisco Medical Center, Laboratory Medicine, Parnassus Chemistry, San Francisco, CA, USA
3 Rockwood Scientific Consulting, Salt Lake City, UT, USA
* Corresponding author
E-mail: Michael.Vogeser@med.uni-muenchen.de
Volunteer laboratory network launched in UK to expand Covid-19 testing
The UK-based Covid-19 Volunteer Testing Network launched April 9 to provide essential additional testing capacity to front-line workers. The project, started by Mike Fischer CBE, helps small laboratories convert to run critical antigen testing and identify Covid-19 cases among local healthcare workers – at no cost to Government.
The UK has thousands of small laboratories with the right equipment, personnel and processes to run Covid-19 testing. Although some of the critical RT-PCR machines in university and healthcare settings have already been requisitioned by central Government, thousands of others are currently sitting idle in small, ‘long-tail’ facilities up and down the United Kingdom.
Fischer set up SBL, a non-profit medical research laboratory in Oxfordshire, which is already running 250–500 tests a week for 10 GP surgeries in the local area.
“Although our facility is small – with just three full-time staff, two containment hoods and two real-time machines – we were quickly able to convert to Covid-19 testing using the Centre for Disease Control protocols and are now running up to 500 tests a week for the staff at 10 local GP surgeries on a same-day basis,” said Fischer.
“If other labs could join the effort we could quickly scale to providing tens of thousands of tests a day in complement to the central program.”
“If we are going to beat this pandemic, we need to employ every resource we can to make sure that our essential health care workers can go to work safely. Even at our small facility, we have been able to run up to 500 tests a week for NHS staff on a same-day basis. By creating an emergency network of volunteer laboratories like ours across the UK, we can quickly and efficiently create the capacity we need to deliver tens of thousands of additional tests every day.”
The Covid-19 Volunteer Testing Network is being coordinated on an entirely voluntary basis and is looking for further labs to join the effort. “We hope existing equipment can be used in situ with qualified staff volunteering to conduct the tests. We are able to provide guidance, protocols, documentation and reporting,” Fischer added.
The Fischer Family Trust has also made £1 million in funding available to support the purchase of consumables for the tests if labs are unable to cover these.
For more information about the Covid-19 Volunteer Testing Network, visit: www.covid19-testing.org
NanoPass shares proprietary MicronJet microneedle to assist in development of a Covid-19 vaccine
NanoPass is sharing its proprietary MicronJet microneedle device with leading vaccine and immunotherapy companies around the world to assist in development of a Covid-19 vaccine.
The NanoPass device targets immune cells of the skin by harnessing the skin’s potent immune system to improve vaccines and/or to dramatically reduce the dose while achieving the same immunity.
“The human skin is our first layer of defence against many infectious diseases,” says Yotam Levin, MD, CEO of NanoPass. “The skin contains specialized Dendritic Cells that process and induce strong immune responses – that’s why microneedle injections enable reduction of vaccine doses by five-fold, thereby reducing overall cost, required capacity and production time. We believe a reliable injection into the skin is critical for successful activation of broad and effective immune responses, which should be explored for most injectable vaccines.”
The company’s technology is supported by more than 55 completed/ongoing clinical studies with various vaccines and vaccine platforms, including H1N1, H5N1 and live attenuated VZV vaccines, which have shown improved immunogenicity and significant dose-sparing. Pre-clinical evidence with mRNA and DNA vaccines has shown promising results.
NanoPass has previously supported the US CDC in a Phase 3 infant polio vaccination trial, the ITRC on PPD skin testing, work on Type 1 diabetes immunotherapy and the NIAID with devices to evaluate the immunogenicity of a pandemic flu vaccine, as well as multiple vaccine pharmaceutical companies.
NanoPass Technologies’ flagship product, the 0.6 mm MicronJet, is the first true (<1 mm) microneedle to receive FDA clearance as an intradermal delivery device for substances approved for delivery below the surface of the skin. It is supported by extensive clinical data and regulatory approvals in most major markets, including the US, Europe, China and Korea.
IVD assay iAMP Covid-19 Detection Kit receives CE Mark
Fujirebio Europe has received the CE mark for the molecular IVD assay iAMP Covid-19 Detection Kit from its partner Atila Biosystems. The qualitative detection kit is based on real-time fluorescent reverse transcription isothermal amplification, eliminating the need for RNA extraction.
The detection kit was also granted Emergency Use Authorization by the US Food and Drug Administration on April 10.
The iAMP COVID-19 Detection Kit can be run on the PowerGene 9600 Plus real-time PCR system or any other qPCR instrument capable of measuring fluorescence in the FAM and HEX channels in real time.
The new iAMP COVID-19 molecular assay complements the existing panel of biomarkers available on the LUMIPULSE® G System for infection (PCT, Ferritin), inflammation (IL-6) and epithelial lung injury (KL-6) to predict disease severity in patients infected with SARS-CoV-2.
Products from Atila Biosystems are available through Fujirebio’s European affiliates and through a large portion of Fujirebio’s existing or new European distribution network.
For more information, visit: www.fujirebio.com/en/contact