Abingdon Health to expand manufacturing capacity on back of increased demand for rapid tests

Abingdon Health will expand its York headquarters in the UK, following further investment in state-of-the-art lateral flow automation. This will substantially increase its manufacturing footprint, resulting in Europe’s largest capacity for rapid test manufacturing, according to the company.
Abingdon Health is a technology-enabled lateral flow diagnostics company providing innovative rapid testing solutions to a multi-industry, global client base. The company provides specialist assay development and smartphone reader solutions alongside its lateral flow test manufacturing capacity.
The announcement of the expansion comes weeks after the UK Government announced Abingdon Health as one of the leading members of the UK Rapid Test Consortium.
Michael Hunter, Operations Director of Abingdon Health, commented: “The additional footprint and automation come at a timely moment as demand for rapid tests is growing rapidly, with the market likely to exceed US$10bn globally. Our precision automation and multi-site approach means we can adapt to meet the varying manufacturing needs of our growing global client base.”
Earlier this year, Abingdon Health announced a preliminary round of expansion in York after 90% revenue growth in 2019, thanks to new assay developments and assay manufacturing contract wins, and the introduction of its AppDx Smartphone reader software. In April 2020, growth continued with the acquisition of a new lateral flow manufacturing facility in Doncaster, UK. This latest expansion and investment in equipment comes as 2020 sees continuing high demand for Abingdon Health’s services.
Abingdon Health’s two manufacturing sites in York and Doncaster have the capacity to produce millions of rapid tests per month. This adaptable, dual-site approach provides a peace-of-mind solution, assuring customers of product consistency and security of supply both during routine scheduling and during spikes in demand.

Base Genomics launches to commercialise ground-breaking epigenetic technology

Epigenetics company Base Genomics has launched with a team of leading scientists and clinicians with the aim of setting a new gold standard in DNA methylation detection. The company has closed an oversubscribed seed funding round of US$11 million to accelerate development of its TAPS technology, initially focusing on developing a blood test for early-stage cancer and minimal residual disease. The funding round was led by Oxford Sciences Innovation.
DNA methylation is an epigenetic mechanism involved in gene regulation and has been shown to be one of the most promising biomarkers for detecting cancer through liquid biopsy. However, the existing industry standard for mapping DNA methylation degrades DNA and reduces sequence complexity, limiting scientific discovery and clinical sensitivity. Base Genomics’ new technology, TAPS, overcomes these issues and generates significantly more information from a given sample, creating new opportunities in research and clinical application.
Dr Anna Schuh, CMO, Base Genomics, commented: “In order to realize the potential of liquid biopsies for clinically meaningful diagnosis and monitoring, sensitive detection and precise quantification of circulating tumour DNA is paramount. Current approaches are not fit for purpose to achieve this, but Base Genomics has developed a game-changing technology which has the potential to make the sensitivity of liquid biopsies a problem of the past.”
First developed at the Ludwig Institute for Cancer Research branch at the University of Oxford, TAPS is a novel chemical reaction that converts methylated cytosine to thymine under mild conditions. Unlike the industry-standard technology, bisulfite sequencing, TAPS does not degrade DNA, meaning that significantly more DNA is available for sequencing. TAPS also better retains sequence complexity, cutting sequencing costs in half and enabling simultaneous epigenetic and genetic analysis.
Dr Vincent Smith, CTO, Base Genomics said: “[TAPS] has the potential to have an impact on epigenetics similar to that which Illumina’s SBS chemistry had on Next Generation Sequencing.”
Base Genomics is led by a highly experienced team of scientists and clinicians, including Dr Smith, a world-leader in genomic product development and former Illumina VP; Dr Schuh, Head of Molecular Diagnostics at the University of Oxford and Principal Investigator on over 30 clinical trials; Drs Chunxiao Song and Yibin Liu, co-inventors of TAPS at the Ludwig Institute for Cancer Research, Oxford; and Oliver Waterhouse, previously an Entrepreneur in Residence at Oxford Sciences Innovation and founding team member at Zinc VC.
Waterhouse, founder and CEO, Base Genomics, said: “The ability to sequence a large amount of high-quality epigenetic information from a simple blood test could unlock a new era of preventative medicine. In the future, individuals will not just be sequenced once to determine their largely static genetic code, but will be sequenced repeatedly over time to track dynamic epigenetic changes caused by age, lifestyle, and disease.”

BBI Group acquires DIARECT AG

BBI Group, the world’s leading independent provider of immuno-diagnostic reagents and contract services, has acquired DIARECT, a leading supplier of autoimmune antigen products.
The acquisition enhances BBI’s portfolio and position as a ‘complete’ immunoassay reagent supplier and boosts BBI’s position as the world’s largest diagnostics components company with a market leading antigen portfolio.
The UK-based business, which has manufacturing sites and sales offices across four continents, will integrate DIARECT’s state-of-the-art manufacturing facility in Germany to establish a Centre of Excellence for recombinant antigen development and manufacturing in Freiburg.
The combined capability enables BBI to further service high-growth disease segments such as cancer, cardiac conditions and diabetes, while also enabling BBI to drive geographic expansion for DIARECT, which will benefit from a bigger sales force.

Beckman Coulter’s SARS-CoV-2 IgG antibody test now available in markets accepting CE Mark

Beckman Coulter’s Access SARS-CoV-2 IgG assay is now available in markets accepting the CE Mark, the company said in a statement on 15 June. It has already shipped tests to more than 400 hospitals, clinics and diagnostics laboratories in the United States and has begun shipping to customers globally. Beckman Coulter has more than 16,000 immunoassay analysers worldwide and has increased manufacturing to deliver more than 30 million tests a month.
Many of Beckman Coulter’s analysers can deliver up to 400 routine tests an hour. The Access SARS-CoV-2 IgG test can also be run on Beckman Coulter’s Access 2 analyser, a compact table-top analyser enabling high-quality serology testing to be carried out in small hospitals and clinics.
The Access SARS-CoV-2 IgG assay is a qualitative immunoassay that detects IgG antibodies directed against the receptor-binding domain of the spike protein of the novel coronavirus that is driving the ongoing global pandemic. It is believed that these antibodies have the potential to be neutralizing antibodies and may play a role in lasting immunity. The test has a confirmed specificity of 99.8% and a sensitivity of 100% at 18 days after a PCR-confirmed positive test. The assay uses immobilized virus antigens on magnetic particles to capture IgG antibodies from patient serum or plasma samples and reveals them using labelled anti-IgG antibodies.
Commenting on the assay, Shamiram R. Feinglass, M.D., MPH, Chief Medical Officer, Beckman Coulter, said: “An IgG antibody assay such as the test Beckman Coulter has developed can provide valuable information regarding community levels of immunity that will inform public health decision making and rollout of a vaccine when one does become available. The very high sensitivity and specificity of this assay provides a high positive predictive value, even when the overall incidence of disease is low. Additionally, since our assay can be run on multiple different types of analysers, it can be adapted to a variety of healthcare settings to best meet the needs of each community.”
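The claim that high specificity preserves a high positive predictive value (PPV) even at low disease incidence follows directly from Bayes’ theorem. The short Python sketch below illustrates it using the sensitivity (100%) and specificity (99.8%) quoted above; the prevalence values are purely illustrative assumptions, not figures from the company.

```python
# Positive predictive value (PPV) from sensitivity, specificity and
# prevalence via Bayes' theorem. Sensitivity and specificity are the
# figures quoted in the article; the prevalence values are assumed.

def ppv(sensitivity: float, specificity: float, prevalence: float) -> float:
    """Probability that a positive result reflects true past infection."""
    true_pos = sensitivity * prevalence
    false_pos = (1.0 - specificity) * (1.0 - prevalence)
    return true_pos / (true_pos + false_pos)

for prev in (0.01, 0.05, 0.20):  # 1%, 5%, 20% seroprevalence (assumed)
    print(f"prevalence {prev:4.0%}: PPV = {ppv(1.00, 0.998, prev):.1%}")
```

Even at an assumed 1% seroprevalence this sketch yields a PPV above 80%, rising above 96% at 5%, which is the behaviour described in the quote.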

Insight into serology testing

During the course of the current coronavirus pandemic we have all become aware of the urgent need for nucleic acid testing to identify people currently infected with SARS-CoV-2. The second test that is needed, the serology test to identify who has had the virus, is much more complex to produce. Dr Andy Lane, commercial director of The Native Antigen Company, discusses adaptive immunity and the production of antigens and antibodies for the creation of immunoassays that can be used for in vitro diagnostics.
What is The Native Antigen Company and what does it do?
The Native Antigen Company was founded in Oxford, UK, in 2010, with the goal of developing native viral and bacterial antigens to support the in vitro diagnostics (IVD) industry. The company was the first to release highly pure Zika virus NS1 antigens for the development of specific diagnostics in 2016, and has since built experience and capabilities to support the research community in pandemic scenarios. In February 2020, the company became one of the first recognized suppliers of antigens for SARS-CoV-2 (the virus that causes COVID-19), and has continued to develop a broad and expanding range of coronavirus reagents. Additionally, we offer a wide variety of native and recombinant antigens for over 60 infectious diseases and provide custom and contract services to the life sciences and biotechnology industries.
Our reagents are used by a wide range of researchers working in infectious diseases, but are predominantly sold into two major markets: the IVD industry, who use antigens and antibodies to develop immunoassays for serological diagnosis of infection, and the vaccine industry, who use antigens and antibodies to develop immunoassays for the qualification and quantification of animal and patient vaccine responses in clinical trials.
Briefly, how is immunity generated in response to infection?
It goes without saying that the human immune system is highly complex, but it can generally be broken down into the innate and adaptive immune responses. Innate immunity is our first line of defence. It provides a rapid, but somewhat makeshift response that is largely preoccupied with trying to kill infectious agents from the moment they enter the body, with a broad array of non-specific cells, proteins and biochemicals. While this is ongoing, the innate response alerts the adaptive response. Adaptive immunity (overview in Fig. 1) represents the elite troops of the immune system, which launch an attack that is specifically adapted to the infectious agent using more sophisticated weapons to mediate powerful downstream responses. The hallmark of the adaptive response is clonal expansion, where B and T lymphocytes that are able to recognize a pathogen will be positively selected for to rapidly build their numbers. Once these cells reach significant levels, the body is much better equipped to detect and clear the invading pathogen, and tends to form a long-lasting ‘memory’ of the pathogen to better prepare itself for future encounters.
After some viral infections, we develop lifelong immunity; however, after others we are only protected for a short period of time – why does this difference arise?
There are two major reasons for reinfection by a virus shortly after initial exposure. The first is due to the ability of viruses to mutate, which occurs via the natural accumulation of genetic changes over time (antigenic drift) or recombination of a virus’s genome with a related strain, causing it to rapidly mutate into a novel form (antigenic shift). These processes allow a virus to change its ‘appearance’, such that it is no longer recognizable by our immune system, and makes our previous exposure to the original virus of little use. This is best exemplified by the influenza A virus, which is notorious for mutating its surface proteins (hemagglutinins and neuraminidases) to evade immune recognition, resulting in a perpetual game of cat and mouse that requires the development of new vaccine formulations every flu season.
The second reason for ineffective immune responses is a bit more complex and tends to occur as a result of waning memory cell levels in the host’s immune system following initial infection. However, the cause of short-lived immunity is not entirely clear and largely depends on the virus in question as well as myriad influencing factors, such as genetics, age and previous exposure to pathogens. Very relevant examples are the endemic coronaviruses, such as OC43-CoV and 229E-CoV, whose infections may result in only a few months of immunity. A study in the early 90s, for example, showed that exposure to 229E-CoV only one year after initial infection resulted in reinfection in the majority of patients and correlated with declining antibody titres [1]. The reason for the decline in immune memory is not entirely clear but is often attributed to the mild pathogenicity of such viruses eliciting a somewhat lacklustre immune response in the first place.
Given the short-lived immunity of some coronaviruses, COVID-19 immunity has been a hot topic. Most patients have shown quite potent and lasting antibody responses, while some have little-to-no detectable antibodies following infection [2]. While we are not yet sure whether this is an immune phenomenon or an issue of poor assay sensitivity, it will take some time before we are able to truly understand the human body’s response to this disease.
Serology testing is of great importance in clinical diagnostics. When doing serology testing to see if a person has had a disease, what exactly is being detected and how is this usually achieved?
By definition, serology is the scientific examination of blood serum and its components. However, in the context of the clinical diagnosis of infectious disease, it generally refers to the use of immunoassays that measure antigens or antibodies. Immunoassays are found in a wide variety of formats but are best exemplified by the enzyme-linked immunosorbent assay (ELISA), which uses plastic titer plates to bind antigens or antibodies from patient samples and produce a detectable signal.
The second major immunoassay format is the lateral flow assay (LFA), which uses an absorbent pad to absorb an analyte and run it through a series of specific antibodies to produce a detectable signal. These assays have the advantage of being inexpensive and portable and can typically provide results within minutes.
Emerging studies suggest that the serology of SARS-CoV-2 is highly complex and differs significantly from other betacoronaviruses. Antibody responses to SARS-CoV-2 appear to occur later and be of lower titres than are typically observed for viral infections, influencing the way in which assays are designed to diagnose both acute and historic infections. Another important consideration is the potential for antibody cross-reactivity to other co-circulating coronaviruses, requiring close attention to the binding specificity of antigens used.
In the current COVID-19 pandemic, serology testing will be crucial for discovering much about the disease – what will we be hoping to learn from this?
From the outset of the pandemic, the reverse-transcriptase polymerase chain reaction (RT-PCR) has been the predominant means of diagnosing active infection. However, as molecular methods rely on the presence of viral nucleic acids, they are limited to a narrow window during the acute phase of infection when the virus is present in the respiratory tract. This has left a major gap in the ability to detect previous cases and to understand the transmission dynamics of this disease. Antibodies to SARS-CoV-2, however, may last for some time after infection, allowing retrospective diagnosis once patients have recovered. This is particularly useful for multiple reasons.
First, as governments ease lockdown restrictions, high-quality epidemiological data is vital for keeping an eye on temporal and geographical disease dynamics, which will require frequent sampling of antibodies in populations (serosurveys). There is also a clear advantage in using serology tests for diagnosis at the point of care. Unlike high-throughput RT-PCR or ELISAs, LFAs present a highly practical and rapid alternative for acute-phase diagnosis and will be crucial in identifying asymptomatic carriers and infected individuals to ensure they are isolated from the general population.
Another major role of serology is in vaccine testing. So far, there are over 130 vaccine candidates in the pipeline [3]. While these vaccines are based on a wide range of platforms (including mRNA, DNA, nanoparticles, subunits, synthetic peptides and virus-like particles, to name a few), it can be said with near certainty that a SARS-CoV-2 vaccine will elicit immune responses to the spike protein. However, considering that vaccine-induced anti-spike IgG levels may be indistinguishable from those conferred by natural infection, alternative antigens will be needed to design vaccine-specific assays. These assays will also be very useful in assessing the potential risk of vaccine-induced antibody-dependent enhancement, in which antibodies produced in response to a vaccine facilitate a more aggressive pathogenesis when a patient is later infected with SARS-CoV-2.
How do you go about preparing reagents for a serology test for a new pathogen such as SARS-CoV-2 and why is it important that these reagents are ‘native-like’?
When developing any immunoassay, the most important components are the antigens and antibodies used to design it. The considerations for choosing these reagents are wide-ranging: antigens should include the most appropriate epitopes to facilitate high sensitivity and antibodies should be tested for high affinity to the antigen in question. When considering specificity, it is crucial to ensure that detector antibodies do not bind to the cross-reactive epitopes that are often found on more conserved regions of viral antigens.
To modulate the sensitivity and specificity of an assay, specific portions of a protein can also be used. In the case of SARS-CoV-2, researchers are investigating various regions of its spike protein for use in immunoassays. The S1 and S2 subunits of the spike are a popular choice for the development of immunoassays, as they are highly exposed to the virus’s external environment and can readily induce potent antibody responses. The receptor-binding domain (RBD) of S1 mediates cell-surface attachment and internalization by binding human ACE2 receptors. Given this role in host-cell entry, antibodies that bind the RBD may be able to neutralize the virus by blocking its binding to ACE2, making the RBD a popular target for vaccine development. The RBD also shows high sequence divergence from other coronavirus spike proteins, making it a popular antigen for the development of sensitive and specific immunoassays. Likewise, the N-terminal domain of the SARS-CoV-2 spike protein shows the highest sequence variability across the coronavirus family, making it a popular choice of antigen for maximizing the specificity of diagnostic assays.
Given the biosafety implications of handling a live virus, recombinant antigens expressed from other organisms are the go-to for developing assays. However, not all expression systems are created equal. Simple organisms like Escherichia coli are easy to genetically manipulate but lack the post-translational machinery needed to glycosylate proteins. Notably, each SARS-CoV-2 spike trimer carries up to 66 glycans, which facilitate folding and mediate viral tropism, amongst other things. From the perspective of assay development, these glycans constitute many of the key surface epitopes recognized by detector antibodies, and the use of unglycosylated spike risks the binding of non-specific, cross-reacting antibodies that can reduce diagnostic specificity.
To ensure that spike is produced with its full glycosylation pattern and is properly folded, more complex systems need to be used. At The Native Antigen Company, we use our VirtuE mammalian (HEK293) system that has been developed for the bespoke purpose of expressing high-quality antigens with proper folding and full glycosylation.
What’s your vision for the future for The Native Antigen Company and its collaboration with OXGENE?
After the SARS-CoV-2 genome was published in early January, it was an all-out race to develop and release reagents. After a tremendous effort by our R&D team, we managed to produce our first batch of S1 antigens in early February and began to ship them to customers around the globe. However, the next challenge was manufacturing capacity. Given the demand from the IVD and vaccine industries, we soon began to struggle to meet it. Fortunately, we were able to reach out to manufacturers who could support us with scaled production.
Our first partner, OXGENE™, has been using its Protein Machine Technology to develop stable cell lines for the production of spike antigens. The technology uses a proprietary adenoviral vector to carry SARS-CoV-2 DNA into human cells and deliver it to the nucleus for stable integration. From here, cell lines can be cultured en masse to produce large quantities of protein without the inherent yield limitations of transient expression. Work is still ongoing to optimize expression, but we’re hoping for some positive data in the coming weeks.

Enabling innovation: designing research facilities

by Dr Tolga Durak
Around the world, organizations are building next-generation research facilities intended to encourage communication, collaboration and creativity. However, these new spaces must overcome a wide range of complex challenges to meet the needs of researchers today and in the future. This article explores five important questions that should be considered in order to build an innovation space that is safe, successful and productive.

Fundamental questions for good design

Research facility design and construction is evolving rapidly, as organizations around the world strive to create work environments that meet the needs of today’s scientists. Whether these new spaces are relatively small-scale makerspaces, large pharmaceutical manufacturing plants, or tightly regulated high-containment laboratories, they are being built to foster communication, collaboration and innovation, often in ways that depart significantly from the traditional R&D rubric. As a result, every stage of the process – from initial site assessment, architectural design and construction all the way through to ongoing maintenance and operation – must be approached with fresh eyes. To get started, design and construction teams must consider the following five fundamental questions.
1. Who is going to work in the facility and what will they need to be successful?
Most research projects now span multiple disciplines, and laboratory spaces often need to accommodate the varied needs of biologists, chemists, engineers, physicists and/or others – all working together but with different methods. Research facility design must accommodate each specialty’s unique requirements across a wide spectrum that includes equipment, infrastructure (electrical, ventilation, etc), information technology (IT), workflow and compliance. In addition, designers must factor in flexibility, so workspaces can adapt as the research advances and needs change.
2. What is required for compliance?
Navigating regulatory boards and obtaining approvals can be a complex, time-consuming, and expensive process, especially for clinical research facilities. Typically, these structures must be constructed in compliance with Good Laboratory Practice (GLP) regulations, Good Manufacturing Practice (GMP) regulations, and other guidelines and mandates from local, state and federal jurisdictions. In addition, laboratories that research or use infectious agents or other biological hazards must comply with regulations based on the degree of the health-related risk associated with the work being conducted. The four biosafety levels (BSLs) of containment – BSL-1, BSL-2, BSL-3, and BSL-4 – aim to safeguard against the accidental release of pathogenic organisms and other biohazards and may involve airflow systems, containment rooms, sealed container storage, waste management, decontamination procedures, and security capabilities. Clearly, the challenges of compliance need to be tackled early in the design process because meeting all of the requirements can take years, which increases the risk that research priorities change and/or that key staff move on to other projects.
3. How sustainably can we build it?
When people think about sustainable research facility design, they usually focus on power and water consumption. Granted, researchers typically use lots of heat-generating equipment (which then requires complementary cooling solutions). Their labs also generally need extensive ventilation, sophisticated sensor networks, uninterrupted power supplies – as well as back-up redundancies for all of these systems. However, in a broader sense, sustainable research facility design also addresses the health and well-being of the workforce. That means air quality, natural light, workflow and productivity considerations, material selection, and all related aesthetics can drive design and construction processes as well.
4. How will the needs of this facility change?
Science is constantly evolving, and research priorities will shift over time. Likewise, technology, regulations and workforce needs will change too. Flexibility and adaptability need to be key considerations of every plan, and designers and developers have to strike a balance between short- and long-term needs. In some cases, permanent or portable modular components may be the most efficient and cost-effective options.
5. Is building the best business decision?
For some organizations, the best business decision may be to share laboratory space, rather than to build their own. Entering into a partnership, collaboration or lease agreement with an organization that is already operating a facility can expedite research results, reduce costs, ease the burden of meeting compliance requirements and even stimulate innovation. Of course, benefits like those must be weighed against potential disadvantages, such as the lack of customization, loss of control and the risks associated with failure to protect intellectual property.

Summary

Thoughtful consideration of these five key questions will help you create an innovation space that will meet your research needs today and for years to come. As you work through your answers to each one, be sure to solicit input from architects, engineers, builders and others who have the experience and expertise to guide you in the process. Adopting a team approach is essential to building a next-generation innovation space that is safe, successful and productive.
The author
Tolga Durak PhD
Environment, Health and Safety Office, Professional Education,
MIT, Cambridge, MA 02139, USA
E-mail: tdurak@mit.edu

Advantages of multimodal imaging by elemental and molecular mass spectrometry

by Dr Ann-Christin Niehoff
Multimodal imaging by mass spectrometry offers a spatially resolved analysis of tissue sections as an additional tool in clinical research. Here, matrix-assisted laser desorption/ionization mass spectrometry for molecular imaging and laser ablation inductively coupled plasma mass spectrometry for elemental imaging are used to tackle two drug applications.
Mass spectrometry imaging
In recent years, mass spectrometry imaging (MSI) has gained more and more interest in the fields of clinical, biological and pharmaceutical research. In contrast to hyphenated chromatographic techniques (e.g. LC-MS or GC-MS), MSI provides spatially resolved information while maintaining high sensitivity. With today’s techniques, spatial resolution down to the low micrometre range can be achieved, making MSI a good complement to existing clinical imaging approaches from pathology.
Multimodal imaging describes the combination of different imaging techniques, as none by itself is a gold standard able to answer all questions. Since MSI works with tissue sections, it can be combined easily with various microscopy applications, providing an additional input to clinical histology. Although protocols for different kinds of tissue sections exist, the preference here is to work on cryosections rather than formalin-fixed paraffin-embedded sections; this helps to avoid washout of analytes from the tissue during the fixation and embedding steps.
Different MSI techniques can be used to focus on molecular or elemental imaging. In this article, the focus will be on matrix-assisted laser desorption/ionization mass spectrometry (MALDI-MS) for molecular imaging and laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS) for elemental imaging.
MALDI-MSI
MALDI-MS is the most frequently used imaging technique in molecular MS. The analysis requires coating of the tissue section with a matrix, typically a small organic compound, and performing soft ionization of desorbed molecules by a pulsed laser. Ionization efficiency is highly dependent on molecular structure, matrices and laser wavelength. By scanning over the sample, a full mass spectrum is generated for each pixel.
Using the tandem MS (MS/MS) mode, fragmentation studies can provide information on molecular structures. Matrix preparation is one of the critical aspects and a potential disadvantage of MALDI-MS, as it may influence limits of detection and spatial resolution due to analyte extraction of the sample by the matrix. Different instruments for matrix preparation are therefore commercially available to improve homogeneous distribution and reproducibility. Owing to matrix effects in molecular MS, quantification is challenging, but is possible to achieve for single analytes via internal standards or standard addition with matrix-matched standards.
Figure 1. Multimodal imaging of myocardial infarction. Microscopic images of the two parallel sections (a & b) with the area of myocardial infarct marked with a black line; quantified distribution of gadolinium determined with LA-ICP-MS (c) at 15 μm spot size; distribution of the ligand from Gadofluorine P (d) at 40 μm spot size; and the structure of the ligand with the theoretical spectrum (cyan bars) and the measured spectra (black line) obtained with MALDI-MS (e).
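As an aside, the standard-addition approach to single-analyte quantification mentioned above can be illustrated in a few lines of Python: known amounts of analyte are spiked into aliquots of the sample, a line is fitted to the signals, and the native concentration is read off the x-intercept. All numbers below are invented for illustration; real work would use matrix-matched standards as described.

```python
# Minimal sketch of quantification by standard addition. The spike
# levels and signal values are hypothetical; a linear response in the
# (matrix-matched) sample is assumed.
import numpy as np

added = np.array([0.0, 10.0, 20.0, 40.0])      # spiked analyte, ug/g (assumed)
signal = np.array([1.20, 1.95, 2.70, 4.20])    # measured intensity, a.u.

slope, intercept = np.polyfit(added, signal, 1)  # least-squares line
native = intercept / slope                       # |x-intercept| = native level
print(f"estimated native concentration: {native:.1f} ug/g")  # -> 16.0 ug/g
```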
Here, matrix was sublimated using the iMLayer (Shimadzu). MALDI-MS experiments were performed with the iMScope TRIO (Shimadzu), equipped with a fluorescence microscope, atmospheric pressure MALDI-source and an ion trap/time-of-flight (IT-TOF) mass analyser. IMAGEREVEAL MS (Shimadzu) was used for data analysis.
LA-ICP-MSI
In the field of elemental MSI, LA-ICP-MS provides major, minor and trace elemental information on surfaces and tissue sections. A laser is scanned over the sample and the ablated material is transported by a carrier gas into the ICP source, where the particles are atomized and ionized. To obtain spatially resolved images, transient signals of the respective analyte are required.
Quadrupoles are the most frequently used mass analysers. Although LA-ICP-MS is less matrix-dependent than MALDI-MS, a fundamental aspect of recent research is the development of reliable quantification strategies, mainly via matrix-matched standards. The major disadvantage of LA-ICP-MS is its destructive nature, with the accompanying loss of molecular information.
In this study, experiments were performed with the LSX-G2+ laser ablation system (Teledyne Cetac Technologies) coupled to the quadrupole-based ICPMS-2030 (Shimadzu).
Complementary bioimaging of Gadofluorine P in myocardial infarction in mice
Magnetic resonance imaging is a widely used imaging technique in daily clinical practice. To enhance contrast during this examination, several different contrast agents are available. While most gadolinium-based contrast agents (GBCAs) distribute systemically, some target-specific GBCAs are under investigation as well. Gadofluorine P is one of these target-specific contrast agents and shows high affinity towards the collagen-rich extracellular matrix that is secreted in the event of myocardial infarction (MI) [1].
In this application, mice were injected with Gadofluorine P solution as a contrast agent 6 weeks after an induced MI. Afterwards the mice were sacrificed and the hearts were removed for cryosection preparation. By multimodal imaging, LA-ICP-MS was used to generate quantified elemental images of gadolinium, while MALDI-MS validated the findings (Fig. 1) and could further provide information on phospholipid and heme b distribution (data not shown).
Figure 1 shows the microscopic images (a & b) of the two thin sections analysed. With LA-ICP-MS (c), a homogeneous distribution of gadolinium in the healthy myocardium, with an average concentration of about 50 μg/g, was detected. The infarct region contained roughly twice the gadolinium concentration, about 110 μg/g, with maximum values of up to 370 μg/g.
Higher gadolinium concentrations could also be found in the ventricle due to the intravenous administration of the contrast agent. These distributions could be verified with MALDI-MS imaging (d).
In this experiment, only the protonated ligand of Gadofluorine P rather than the intact complex could be detected (e). The main peak (m/z 1168.39) was used to create the image, which showed good correlation to the gadolinium distribution determined with LA-ICP-MS. The highest intensities of the molecular probe were found in MI and ventricle regions, whereas healthy myocardium showed low and homogenous intensities.
Multimodal imaging of photosensitizers in 3D tumour cell models
Photodynamic therapy offers an alternative cancer treatment. A photosensitive compound (photosensitizer; PS) is administered and the tumour is subsequently irradiated. The activation of the PS leads to the formation of a reactive oxygen species and subsequently to cell apoptosis. One main challenge in the development of PS is the hydrophobic character of the compounds, which hinders tissue penetration. Additionally, the orally administered compound needs to pass through the mucus layer in the gastrointestinal tract. Thus, the determination of the penetration depth of these compounds is of great interest.
The use of 3D tumour spheroids enables in vitro drug screening while simulating the tumour environment better than 2D cell cultures. The photosensitizer 5,10,15,20-tetrakis(3-hydroxyphenyl)-porphyrin (mTHPP) and its palladium-tagged analogue mTHPP-Pd were analysed in this study. Here, multimodal imaging is used to visualize the penetration depth of mTHPP and the lipid distribution in 3D tumour spheroids by MALDI-MS (5 μm spot size), as well as to quantify the drug by LA-ICP-MS (7 μm spot size) [2,3].
The MALDI-MS and LA-ICP-MS images of a tumour spheroid treated with mTHPP or mTHPP-Pd are shown in Figure 2. In the microscopic image, an almost spherical tissue section with a diameter of approximately 550 μm can be seen. The distribution map of mTHPP shows a ring-shaped distribution, which correlates precisely with the outer cell layer of the tumour spheroid. The PS is distributed homogeneously within the outer cell layer, rather than merely around the spheroid, but it does not penetrate deeper into the tissue.
Nevertheless, the MALDI-MS experiments revealed that the PS can be detected as an intact molecule, without substantial decomposition during sample preparation. The LA-ICP-MS results for a spheroid incubated with mTHPP-Pd show the same distribution as the mTHPP detected by MALDI-MS. Since the metal-tagged PS is needed for ICP-MS analysis, only spheroids treated with this compound could be investigated. Conversely, this modification of the molecule could no longer be detected using MALDI-MS: owing to the loading with palladium, the preferred protonation sites of the molecule are unavailable, impairing detection.
However, before the LA-ICP-MS experiments, MALDI-MS can be used to identify phospholipids, as shown in Figure 3. Palladium concentrations of up to 10 μg/g, with an average of 1.9 μg/g, were detected (Fig. 3b). This represents an enrichment of the PS by a factor of 3 (average) up to 18 (highest concentration) compared with the incubation concentration. The phospholipids PC(34:1), PC(34:0) and PC(30:0) could be detected and show different distributions, coherent with the different metabolic zones of a tumour spheroid.
Conclusion
In conclusion, the two applications shown provide examples of how MSI can be added to clinical research. Multimodal imaging was successfully performed to address drug penetration and enrichment in different kinds of tissue, based on the combination of elemental imaging by LA-ICP-MS and molecular imaging by MALDI-MS.
The author
Ann-Christin Niehoff PhD
European Innovation Center, Shimadzu Europa GmbH,
47259 Duisburg, Germany

E-mail: acn@shimadzu.eu

Technology update – Pushing the ‘norms’ of conventional high-complexity clinical cytometry

by Dr Carsten Lange
Flow cytometry is a powerful technique for the detailed analysis of complex populations which, over the last two decades, has evolved from a staple technique of the research laboratory into an essential part of the modern clinical laboratory.
Some of the current ‘norms’ for clinical flow cytometry include its critical use for phenotyping hematological malignancies, as well as playing a vital role – along with other testing methods – in diagnosing disease, informing treatment plans and monitoring patients. Only time can tell how this powerful analytical technology will contribute to the clinical lab of the future. We can, however, anticipate that it will only continue to increase in importance, based on technical innovations that have driven the evolution of flow cytometry over the last decade.
In addition to today’s applications in disease diagnostics, the power of this technology continues to be used in cell biology research and pharmaceutical discovery. This evolution has been made possible by a higher number of analytical parameters to measure cells in suspension. The first cytometers were systems capable of merely three or four parameters, using a single laser and four detectors, and were the size of a small car. Today, however, flow cytometers (including cell sorters) can analyse more than 30 parameters, and new technology in benchtop analysers can deliver exponentially better performance in a smaller footprint.
Shifting paradigms
This paradigm shift, toward higher performance in a small instrument, is driven by clinical laboratories that want to capture the power of flow cytometric analysis but don’t want to invest a significant amount of time in learning the instrumentation. The democratization of flow cytometry is enabled by key advances in technology. Prominent concepts from other scientific fields, such as the telecommunications industry, are being adopted to miniaturize the subsystems while at the same time providing even better performance. These compact high-performance systems not only deliver better performance than historically expensive systems, but are also easy to set up, operate and maintain, enabling a greater number of clinical laboratories to maximize the power of flow cytometry.
The power to see more
Performance of flow cytometers is typically measured by their capacity to resolve and their sensitivity to detect dim and/or rare populations. In this regard, efficient light management for optimal excitation and emission of fluorochrome-tagged cells is critical to performance.
With conventional flow cytometers, laser excitation sources are optimized by shaping and focusing light through a series of lenses and filters onto a flow cell where cells are hydrodynamically focused. However, newer flow cytometers use unique laser designs that are focused onto a flow cell with integrated optics. These systems can ensure increased excitation of the dyes not only on (and within) cells, but also increased collection of the emitted light for integration and measurement. When designing a compact clinical cytometer, the use of fibre optics to carry light is an efficient way of transmission, providing flexibility in laying out system components. These cables capture emitted light to deliver it onto a unique detector array, reducing crosstalk between channels, which improves performance.
Another recent development is a key concept borrowed from the telecommunications industry, the wavelength division multiplexer (WDM), which is used for light detection and measurement. Wavelength division multiplexing is a method used to deconstruct and measure multiple wavelengths of light as signals that relate to analytical parameters. The detectors used to measure each parameter are avalanche photodiodes (APDs), which are highly sensitive semiconductor devices. By contrast, conventional clinical cytometers have to date used (and continue to use) photomultiplier tubes (PMTs). The major advantages of using APDs over PMTs include, but are not limited to:

  1. enhanced linearity;
  2. 4–5 times the quantum efficiency;
  3. higher dynamic range, 10^6 versus 10^3;
  4. smaller size and about one-tenth the cost.

The DxFLEX (Beckman Coulter) is the first commercially available clinical cytometer to use compact APDs, which reduce the overall instrument footprint. Each of its WDMs contains optical and detector components to selectively measure specific wavelengths, improving light collection for higher sensitivity to detect dim populations.
The WDM’s innovative and simple design uses a single bandpass filter to select each colour of light. This contrasts with traditional clinical cytometers, which use a series of dichroic steering filters and bandpass filters that bounce the light along an array, leaving successively less light available at each stage, diminishing collection efficiency and ultimately compromising fluorescence sensitivity and resolution.
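The compounding loss along a dichroic filter chain is easy to see with a little arithmetic: if each filter passes a fixed fraction of the incoming light, the light remaining decays geometrically with the number of filters. The sketch below assumes a round 95% per-filter efficiency, an illustrative number rather than a specification.

```python
# Illustrative only: light remaining after traversing a chain of dichroic
# steering filters, each passing an assumed 95% of the incident light.
per_filter_efficiency = 0.95

for n_filters in (1, 3, 5, 8):
    remaining = per_filter_efficiency ** n_filters
    print(f"{n_filters} filters: {remaining:.0%} of the light remains")
```

After eight such stages only about two-thirds of the light survives, which is the diminishing collection efficiency described above; selecting each colour with a single bandpass filter avoids the compounding loss.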
Simplifying high complexity
Because detection systems that use APDs are linear, operation of the cytometer can be dramatically simplified owing to the predictability of the signals. The linear gain and the normalization performed during the daily quality control routine take care of the relative variations during instrument set-up commonly seen in instruments. Further, setting up a high-complexity assay is simplified by using a software gain-only adjustment. The linearity of gain adjustment also simplifies the typically arduous task of spectral compensation, which has been the barrier for many to push to a higher number of colours/parameters. To maximize the benefit of APD linearity, new software algorithms have been developed that facilitate set-up and analysis of high-complexity experiments by simplifying compensation.
It is now possible to create a compensation library that stores the APD gain settings and spectral spill-over coefficients for every parameter and multicolour combination. This allows users to make a virtual spectral compensation matrix selecting various single colours from the library. In addition, the library can intelligently adjust the compensation values when gains are adjusted owing to the predictive responses of linear APDs. The result is a dramatically simplified and intuitive method of setting up high-complexity applications.
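Conceptually, compensation treats each event’s detector readings as a spillover-weighted mixture of the true fluorochrome signals and inverts that mixing. The NumPy sketch below shows the underlying linear algebra for a hypothetical three-colour panel; the spillover coefficients and event values are invented, and a real instrument derives them from single-colour controls as described above.

```python
# Minimal sketch of spectral spillover compensation for one event.
# Row i of the (hypothetical) spillover matrix gives the fraction of
# fluorochrome i's signal that appears in each detector.
import numpy as np

spillover = np.array([
    [1.00, 0.12, 0.03],   # dye 1 into detectors 1/2/3 (assumed values)
    [0.08, 1.00, 0.10],   # dye 2
    [0.01, 0.05, 1.00],   # dye 3
])

observed = np.array([1259.0, 859.0, 406.0])  # raw detector signals, one event

# observed = spillover.T @ true, so undo the mixing by solving the system.
compensated = np.linalg.solve(spillover.T, observed)
print(compensated)  # -> approximately [1200., 700., 300.]
```

Because APD responses are linear, stored coefficients remain valid when gains change, up to a predictable rescaling, which is what makes the compensation-library approach workable.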
The size factor
For most cytometers, measuring the size of particles less than 300 nm is difficult because they deliver relative sizing information using forward-scattered light from the 488 nm blue laser. For these systems, particles of less than 1 μm (1000 nm) usually fall below the noise threshold of the laser and detector subsystems. In contrast, newer systems use principles of Mie scattering, which predict that at lower excitation wavelengths there will be an increased amount of scattered light and improved resolution.
Therefore, measuring scattered light from a shorter-wavelength 405 nm violet laser rather than a longer-wavelength 488 nm blue laser allows the system to resolve smaller particles. The use of the violet side-scatter parameter enables systems to detect particles of less than 0.2 μm (200 nm) in size, enabling excellent resolution of microparticles.
The future is now
Combining powerful performance with innovative design and technology, it is possible to deliver a compact, easy-to-use flow cytometer. Pushing the ‘norms’ of conventional flow cytometry, today’s – and tomorrow’s – cytometers simplify high-complexity applications in the clinical laboratory, as well as enabling a deeper understanding in frontier applications such as hematopoietic cancers. Flow cytometry remains a powerful tool for interrogating complex questions. Today’s clinical laboratories want to harness that power and are demanding smaller and more powerful flow cytometers that are more affordable and easier to use. Using innovation, engineers can deliver solutions to meet the challenge.
The author
Carsten Lange PhD
Beckman Coulter GmbH, 47807 Krefeld, Germany

E-mail: clange@beckman.com
DxFLEX flow cytometer: https://www.mybeckman.uk/flow-cytometry/instruments/dxflex

Erythrocyte sedimentation rate: getting the most out of this test

by Peter Murphy
It was first noticed in the 1700s that the rate of erythrocyte sedimentation changes with illness. The use of this attribute as a measure of inflammatory activity due to underlying disease was formalized into a test in the early 1900s, and what has become known as the Westergren test has recently been reconfirmed as the reference method for measuring the erythrocyte sedimentation rate, which is still a commonly used hematology test today. This article explains why the test is used, how its results are affected by physiological factors, and how to perform it to obtain useful and reliable results.
Using erythrocyte sedimentation rate measurement to indicate inflammation
Explaining erythrocyte sedimentation rate measurement
The erythrocyte sedimentation rate (ESR) is a general condition indicator and serves as a guide in the diagnosis and treatment follow-up of different autoimmune diseases, acute and chronic infections, and tumours. ESR is the speed at which erythrocytes settle in a tube and provides medical practitioners with valuable information for the diagnosis of their patients. Normal-sized erythrocytes are negatively charged and repel each other, which limits their sedimentation rate. Erythrocytes that form clumps settle faster than single cells, so factors that increase aggregation will increase sedimentation. Increased sedimentation indicates health problems, resulting in a need for additional tests.
Applications of ESR measurement
There’s a long list of conditions for which ESR can be used to assist in making a correct diagnosis or managing the care of a patient: autoimmune diseases such as rheumatoid arthritis, temporal arteritis and polymyalgia rheumatica are well known examples, as is multiple myeloma. When the presence of inflammation is suspected, ESR is a simple and cost-effective way of confirming this. Moreover, for patients with a known condition, the ESR test can provide useful information into the overall effectiveness of their treatment.
The Westergren method
The discovery of the ESR dates back to 1794, but it was in the 1920s that the pathologist Robin Fåhraeus and Alf Westergren developed ESR measurement as we know it. To this day, the so-called Westergren method is recognized as the gold standard, among others by the Clinical and Laboratory Standards Institute (CLSI). In 2017, the International Council for Standardization in Hematology (ICSH) reconfirmed the Westergren method as the reference method for ESR measurement. The Westergren method owes its popularity to the fact that it’s a simple and inexpensive first-line test, providing valuable information to GPs in the investigation of inflammation after only 60 (or even 30) minutes.
Critical factors of a reliable ESR test
Although the Westergren method may be the gold standard, many factors can meddle with its reliability. Therefore, always keep in mind the following requirements (encoded as a simple checklist in the sketch after this list):

  • non-hemolysed blood anti-coagulated with EDTA at collection;
  • blood sample is thoroughly mixed and diluted 4:1 using a sodium citrate solution;
  • the tube is held in a vertical position at a constant temperature (±1 °C) between 18 °C and 25 °C in an area free from vibrations, drafts and direct sunlight; and
  • results are interpreted after at least 30 minutes.
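As a simple illustration, these requirements can be encoded as a pre-analytical checklist, as in the Python sketch below. The structure and field names are our own invention; this is illustrative only, not a validated laboratory procedure.

```python
# Hypothetical pre-analytical checklist for a Westergren ESR reading,
# mirroring the requirements listed above. Illustrative only.
from dataclasses import dataclass

@dataclass
class EsrSetup:
    edta_anticoagulated: bool      # non-hemolysed EDTA blood at collection
    blood_to_citrate_ratio: float  # thoroughly mixed, diluted 4:1
    tube_vertical: bool            # held vertically, vibration-free
    temperature_c: float           # constant (+/-1 C), within 18-25 C
    minutes_elapsed: float         # read after at least 30 minutes

def ready_to_read(s: EsrSetup) -> bool:
    return (s.edta_anticoagulated
            and abs(s.blood_to_citrate_ratio - 4.0) < 1e-9
            and s.tube_vertical
            and 18.0 <= s.temperature_c <= 25.0
            and s.minutes_elapsed >= 30.0)

print(ready_to_read(EsrSetup(True, 4.0, True, 21.0, 30.0)))  # -> True
```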

Can we speed up ESR measurement?
In the original Westergren method, the ESR is read after 60 minutes. You can imagine this puts practical limitations on the workflow in clinical laboratories. A laboratory investigation, however, showed that 30-minute ESR readings correlate highly with the corresponding 60-minute ESR readings, which is why today most laboratories perform 30-minute ESR readings and then extrapolate them to derive the 60-minute ESR result. There are Westergren alternatives that claim to measure ESR after only 20 seconds, but as it takes at least 10 minutes before sedimentation starts at a constant rate, these tests risk leading to a number of false negatives.
Why speeding up ESR measurement is not a good idea
The Westergren method and faster alternatives
As mentioned above, the 30-minute version of the Westergren test has become the standard in most hospitals and laboratories. However, even though 30 minutes can be regarded as a short time frame, some companies have worked on Westergren alternatives that can be read after mere minutes or even seconds. A major step forward, or so it seems.
What’s the deal with fast ESR measurement methods?
There are several conditions that ESR methods should comply with in order for them to be reliable. For example, test tubes must be held in a vertical position, and the blood must be thoroughly mixed and diluted. Still, the most important condition of all doesn’t revolve around equipment; it revolves around time. It takes approximately 10 minutes before red blood cell sedimentation starts at a constant rate. This means that ESR readings after 20 seconds do not actually measure sedimentation but calculate a mathematically derived ESR. This, in turn, leads to ESR readings that don’t correlate with the Westergren standard, leading to a number of false negatives. So, in their attempt to speed up the diagnosis of patients, laboratories that use Westergren alternatives risk overlooking important signs of disease.
Speed or reliability?
Healthcare and in vitro diagnostics are being improved daily and theories are constantly evolving. This makes it hard to determine which ESR method is the right one to choose. The choice is even harder when you consider that ESR alternatives are comparable to the Westergren method as long as you test healthy people under normal circumstances. It’s when people are ill that the results start to deviate. This is why our advice is to always choose a method that adheres closely to the Westergren method [such as the automated ESR analysers Starrsed (RR Mechatronics), MixRate and Excyte (ELITech)]. Westergren has always been the method of choice in fundamental studies, meaning that ESR is essentially based on this procedure. Moreover, the Westergren method is recommended by the CLSI and reconfirmed as the gold standard by the ICSH, two organizations that inform healthcare professionals on state-of-the-art technologies for in vitro diagnostic testing.
Not everything can be rushed
Moving forward is part of human nature; it’s why we’re always so busy making things better, faster and more comfortable. But in the case of ESR measurement, we simply have to face the fact that not everything can be rushed. We may be able to speed up the way we live, work and travel; we cannot force red blood cells to settle faster than they do. What we can do, is make ESR measurement tests as reliable as possible and have them help us improve diagnostics and save lives.
Physiological and clinical factors that influence ESR values
In the investigation of inflammation, ESR measurement is often the first-line test of choice as it’s simple, inexpensive and – if based on the Westergren method – reliable, reproducible and sensitive. But as is the case with every test, there are physiological and clinical factors that may influence ESR results. In this section, we’ll tell you more about them. However, when reading about factors that influence ESR results, please keep in mind that much, if not all of this information, is based on studies undertaken with the Westergren gold standard ESR method only. This is mainly due to the fact that the Westergren ESR method has been almost universally used to investigate the clinical utility of the test in a range of disease states, with much of this work published in peer reviewed journals. As a result, there’s a deep body of knowledge that describes the impact of disease, the limitations and sources of interference with the Westergren ESR. As the Westergren method for ESR measures a physical process under a defined set of conditions, this expansive body of knowledge cannot simply be ‘transferred’ to estimations of ESR by methods that use centrifugation or optical rheology.
What’s normal in ESR?
Before discussing the factors that influence ESR results, first we should answer the question: what is normal? When patients suffer from a condition that causes inflammation, their erythrocytes form clumps which makes them settle faster than they would in the absence of an inflammatory response. However, ‘faster’ is a relative term, and what’s ‘normal’ changes based on sex and age category.
Physiological and clinical factors that increase ESR
The most obvious explanation for increased ESR is inflammation. During acute phase reactions, macromolecular plasma proteins, particularly fibrinogen, are produced that decrease the negative charges between erythrocytes and thereby encourage the formation of cell clumps. And as cell clumps settle faster, this increases ESR. Inflammation indicates a physical problem, meaning additional tests and follow-up are needed. However, there are other factors that increase ESR but don’t necessarily come with inflammation. For example, ESR values are higher for women than for men and increase progressively with age. Pregnancy also increases ESR, which means you’ll be dealing with ESR results above average. In anemia, the number of red blood cells is reduced, which increases so-called rouleaux formation so that the cells fall faster. This effect is strengthened by the reduced hematocrit, which affects the speed of the upward plasma current. Another factor that increases ESR revolves around high protein concentrations. And in macrocytosis, erythrocytes have a shape with a small surface-to-volume ratio, which leads to a higher sedimentation rate.
Physiological and clinical factors that decrease ESR
Apart from factors that increase ESR, medical practitioners and laboratory scientists should also consider the factors that decrease ESR. This is especially important as decreased ESR results may lead to missed diagnoses, whereas increased ESR results either lead to the right follow-up or false positives. Polycythemia, caused by increased numbers of red blood cells or by a decrease in plasma volume, artificially lowers ESR. Red blood cell abnormalities also affect aggregation, rouleaux formation and therefore sedimentation rate. Another cause of a low ESR is a decrease in plasma proteins, especially of fibrinogen and paraproteins.
The four factors that determine ESR reliability (dos and don’ts)
As with any test, the reliability of ESR measurement stands or falls with proper implementation. When not reliably performed, this nonspecific indicator of inflammation may point in the wrong direction and produce either a false positive or a false negative. This may lead to the initiation of unnecessary investigations or, worse, the overlooking of serious problems that actually needed follow-up. In this section, we discuss some dos and don’ts to help ensure the reliability of ESR measurement.
Factor 1: blood collection
Do: mix the sample well and dilute it 4:1 with a sodium citrate solution. Adhering to this practice standardizes the way you handle blood samples, and therefore their suitability for ESR testing.
Don’t: leave the sample for too long before testing. We can imagine you’re pretty busy and can’t do everything at once, but when it comes to blood collection for ESR tests, some speed is required. After four hours, the results become less accurate, which may compromise their reliability. We therefore recommend performing the test within four hours of collection. If you really can’t make it in time, 24 hours is the maximum, and only if the sample is stored at 4 °C.
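As a minimal sketch of these pre-analytical rules – the 4:1 blood-to-citrate dilution and the 4-hour/24-hour timing windows – the hypothetical helpers below simply encode the guidance above; the function names are our own.

```python
def citrate_volume_ml(blood_volume_ml: float) -> float:
    """Citrate volume for a 4:1 (blood:citrate) dilution.

    Example: 2.0 mL of blood requires 0.5 mL of sodium citrate solution.
    """
    return blood_volume_ml / 4.0


def sample_acceptable(hours_since_collection: float, stored_at_4c: bool) -> bool:
    """Timing rules from the text: test within 4 h of collection;
    up to 24 h is acceptable only if the sample was stored at 4 degrees C."""
    if hours_since_collection <= 4:
        return True
    return stored_at_4c and hours_since_collection <= 24


assert sample_acceptable(3, stored_at_4c=False)       # tested promptly
assert sample_acceptable(20, stored_at_4c=True)       # refrigerated overnight
assert not sample_acceptable(20, stored_at_4c=False)  # left on the bench too long
```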
Factor 2: tube handling
Do: hold the tube vertically. A tube that is not completely vertical can increase the sedimentation rate and is one of the technical factors that affect ESR readings. And as discussed in the previous paragraph, temperature is a factor too. Therefore, always place the tube in a stable, vertical position at a constant temperature.
Don’t: expose the sample to vibrations, drafts or direct sunlight, as all of these can strongly influence the final result.
Factor 3: result reading
Do: wait 30 minutes. This is a very important one. Before reading ESR results, you should always wait 30 minutes. Some ESR testing methods claim to deliver reliable results within only 20 seconds, but as it takes around 10 minutes before sedimentation proceeds at a constant rate, these tests do not actually measure sedimentation. Instead, they calculate a mathematically derived ESR, leading to a number of false negatives (a toy numerical illustration follows below).
Don’t: include the buffy coat (the layer of leukocytes and platelets above the erythrocyte column) when reading the result.
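To make the false-negative risk concrete, the toy model below (with invented parameters, not data from this article and not a model of any particular device) assumes a slow aggregation ‘lag’ phase followed by a constant-rate phase. Naively extrapolating from the first 20 seconds then grossly underestimates the 30-minute Westergren reading.

```python
def sedimentation_mm(t_min: float, lag_min: float = 10.0,
                     rate_mm_per_min: float = 0.9) -> float:
    """Toy two-phase model: near-zero settling during the lag (rouleaux
    formation) phase, then settling at a constant rate. Parameters invented."""
    if t_min <= lag_min:
        return 0.02 * t_min  # minimal settling while cells are still aggregating
    return 0.02 * lag_min + rate_mm_per_min * (t_min - lag_min)

measured_30min = sedimentation_mm(30)                # what a Westergren read-out sees
early_rate = sedimentation_mm(20 / 60) / (20 / 60)   # mm/min over the first 20 s
extrapolated = early_rate * 30                       # naive linear projection

print(f"Measured at 30 min:     {measured_30min:.1f} mm")  # ~18.2 mm
print(f"Extrapolated from 20 s: {extrapolated:.1f} mm")    # ~0.6 mm -> false negative
```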
Factor 4: test quality
Do: go with an automated ESR test. Automated tests provide more reliable results, not least because they can correct for hazy results. Moreover, they offer higher throughput than manual tests and minimize human contact with the tubes, which helps you reduce operating costs and occupational health and safety risks.
Don’t: choose an ESR test that deviates from the Westergren standard. This method has always been the method of choice in fundamental studies, meaning that the clinical interpretation of ESR is essentially based on this procedure. Tests that deviate from the Westergren method will logically produce different ESR values and can therefore lead you in the wrong direction. This is why the Westergren method is recommended by the Clinical and Laboratory Standards Institute (CLSI) and has been reconfirmed as the gold standard by the International Council for Standardization in Haematology (ICSH).
ESR test as a reliable tool
If you keep these dos and don’ts in mind, you’re well on your way to making the ESR test a reliable tool that will help you diagnose patients quickly and accurately.
The author
Peter Murphy MBA(TechMgt), MAACB, BSc, GradDipEd
ELITech Group, Braeside, Victoria 3195, Australia
E-mail: p.murphy@elitechgroup.com

Flow cytometry: a critical technique in combating leishmaniasis

by Professor Paul Kaye
Leishmaniasis is classified as a neglected tropical disease. It causes a huge health burden and is common in Asia, Africa, South and Central America, and even southern Europe. This article discusses how flow cytometry can aid diagnosis, monitor the effects of therapy and support the development of a vaccine.

Background

The leishmaniases are a family of devastating diseases, affecting a great many people across the globe and presenting a significant risk to both public health and socioeconomic development. The leishmaniases are vector-borne diseases, caused by infection with one of 20 species of the parasitic protozoan Leishmania (Fig. 1), transmitted through the bite of the infected female phlebotomine sand fly.
They can be broadly classified as tegumentary leishmaniases (TLs), affecting the skin and mucosa, and visceral leishmaniasis (VL), affecting internal organs. Whereas VL is responsible for over 20 000 deaths per year, TLs are non-life-threatening, chronic and potentially disfiguring, and account for around two-thirds of the global disease burden.
Within TL, there are three subtypes: self-healing lesions at the site of the sand fly bite (cutaneous leishmaniasis; CL), lesions that spread from the original skin lesion to the mucosae (mucosal leishmaniasis; ML), and lesions that spread uncontrolled across the body (disseminated or diffuse cutaneous leishmaniasis; DCL). VL, also known as kala-azar, involves major organs including the spleen, liver and bone marrow. In addition, patients recovering from VL after drug treatment often develop post kala-azar dermal leishmaniasis (PKDL), a chronic skin condition characterized by nodular or macular lesions beginning on the face and spreading to the trunk and arms. As it may develop in up to half of patients previously treated and apparently cured of VL, PKDL is thought to play a central role in community transmission of VL.
The World Health Organization designates leishmaniasis as a neglected tropical disease (NTD); together, NTDs affect more than one billion people across 149 countries worldwide, and true prevalence may be even higher. NTDs disproportionately affect the poorest, most malnourished individuals and contribute to a vicious circle of poverty and disease. The significant physical marks, including ulcers, often left in the wake of the TLs may have an impact on mental health and perpetuate the social stigma associated with the diseases [5]. There are over 1 million new cases of TL and 0.5 million new cases of VL each year, which together account for the loss of approximately 2.4 million disability-adjusted life years.

Treatment challenges

Leishmaniasis is difficult to treat: at-risk populations may lack access to healthcare, and the limited battery of available drugs has been increasingly compromised by resistance. Additionally, because the parasites are eukaryotic and therefore not dissimilar to human cells, medication is liable to be harmful – even fatal – to the host as well as to the pathogen.
Although the burden of VL in South Asia has been reduced with single-dose liposomal amphotericin B, the drug is less effective in other geographic regions, notably East Africa. Various drug combinations have been tested without success, and new chemical entities and immune modulators are in early stages of development and as yet untested in the field. Unfortunately, little has changed in the treatment of CL for the past 50 years.
No vaccines are currently approved for any form of human leishmaniasis, although vaccines for canine VL have reached the market. Barriers to vaccine development include the limited investment in leishmaniases R&D and the high costs involved in bringing new products to those who need them.

Current work

My work on leishmaniasis has taken a holistic view, rooted in the immunology of the host-parasite interaction but employing tools and approaches that span many disciplines: mathematics, ecology, vector biology and, most recently, neuroscience. Thirty years of discovery science has led to the development of a candidate therapeutic vaccine for PKDL, the mysterious sequela of VL [6]. ‘Therapeutic’ vaccines are given after an individual is infected with a pathogen and are designed to enhance the immune response and help eliminate the infection.
With colleagues from Sudan, we are in the midst of a phase IIb clinical trial funded by the Wellcome Trust, evaluating the efficacy of this therapeutic vaccine in Sudanese patients with persistent PKDL.
However, the research has been a long time in the making and has a long way to go. To continue to make progress, we have linked up with colleagues in Ethiopia, Kenya and Uganda and at the European Vaccine Initiative (http://www.euvaccine.eu/) in Germany to form a new research consortium that will evaluate the immune status of people suffering from leishmaniasis. For example, using flow cytometry for blood and multiplexed immunohistochemistry for tissue biopsies, we can enumerate the proportions of lymphocytes, monocytes and neutrophils based on surface marker expression (e.g. CD3, CD19, CD14, CD16), and characterize their function, for instance by expression of cytokines (e.g. interferon-gamma) or other cell surface proteins that define functional state. To support this endeavour, we recently received a grant from the European & Developing Countries Clinical Trials Partnership (EDCTP) that will allow us not only to extend our vaccine programme in Sudan [9] but also to address other important research challenges.
To develop vaccines, and indeed new drugs, we often need tools capable of performing in-depth comparisons of how the body’s immune system is coping with the infection when a patient is first admitted to hospital, and of how it changes as the patient undergoes treatment and is, we hope, cured. For example, recent evidence suggests that during infection T lymphocytes may become ‘exhausted’ and unable to fight infection; the exhausted state can be identified by expression of surface molecules such as programmed cell death protein 1 (PD-1) and lymphocyte activation gene 3 protein (LAG-3). It is important to know whether exhaustion can be reversed following treatment or whether we need to stimulate new populations of T lymphocytes. By understanding these nuanced changes in the immune cells in our blood, we can design ways to improve how vaccines and drugs work in concert with immune cells, and understand why some patients might relapse or develop PKDL. Flow cytometry is a central tool for immunologists and plays a critical role in uncovering mechanisms of immunity, in assessing how well vaccines work and in identifying biomarkers of drug response. It uses fluorescently labelled antibodies that recognize specific molecules or markers on the surface of, or inside, immune cells, such as those mentioned above, that help us predict their function. The fluorescent signal is detected by exposing each cell individually to laser light as the cells pass through a small aperture – the essence of flow cytometry.
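As a toy illustration of the marker-based gating described above – not the consortium’s actual analysis pipeline – the sketch below simulates fluorescence intensities for two hypothetical channels (CD3 for T cells, CD14 for monocytes) and applies simple rectangular threshold gates to enumerate marker-positive events. Real analyses use compensated multicolour data and dedicated software.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# Simulate log-scale intensities for 10 000 events on two hypothetical
# channels: a CD3-bright/CD14-dim cluster (T cells) and a CD14-bright/
# CD3-dim cluster (monocytes). All values are invented for illustration.
cd3 = np.concatenate([rng.normal(4.0, 0.4, 6_000),   # CD3-positive events
                      rng.normal(1.0, 0.4, 4_000)])  # CD3-negative events
cd14 = np.concatenate([rng.normal(1.0, 0.4, 6_000),
                       rng.normal(3.5, 0.4, 4_000)])

# Rectangular threshold gates: an event counts as 'positive' for a marker
# when its intensity exceeds the gate boundary.
CD3_GATE, CD14_GATE = 2.5, 2.5
t_cells = (cd3 > CD3_GATE) & (cd14 <= CD14_GATE)
monocytes = (cd14 > CD14_GATE) & (cd3 <= CD3_GATE)

print(f"T cells:   {t_cells.mean():.1%} of events")
print(f"Monocytes: {monocytes.mean():.1%} of events")
```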
For flow cytometry to be beneficial in this project, we needed to purchase five new flow cytometers that could meet exacting standards. They needed to be sufficiently sensitive to identify rare cell populations, often with low levels of surface marker expression. For multicentre research projects, reproducibility of data between sites is essential. Hence, we needed excellent inter-machine reproducibility, and the manufacturer had to be able to provide service support across the region. In our search for the right flow cytometer to support the consortium, we settled upon the CytoFLEX, Beckman Coulter Life Sciences’ research flow cytometer, which uses avalanche photodiode detection to achieve the required level of sensitivity. With assistance from Beckman Coulter, we devised and have run initial training courses with a group of recently appointed flow managers from each partner country, to share standard operating procedures, develop high-level data analysis strategies and provide instruction in routine instrument maintenance.
[Figure 2. Initial training course with recently appointed flow managers (Credit: Dr Karen Hogg, University of York)]
Beckman Coulter also provides another important aid to reducing errors in multisite flow cytometry projects such as this: freeze-dried antibody cocktails (DURAClone panels) [10], which allow highly multiplexed phenotyping of small volumes of blood added directly to a single tube. Particularly for investigators in remote locations, the use of dry, preformulated reagents rather than liquid (‘wet’) antibodies removes the need for a cold chain. Equally importantly, manually mixing 15 or 16 antibodies to stain cells can introduce data inconsistencies when performed by different individuals at different locations.
Together, these innovations have allowed us to establish a new network for flow cytometry in East Africa that will allow us to identify and functionally characterize the types of immune cells present in the blood during these devastating diseases. We will match these data with similar multiplexed techniques in pathology, comparing blood immune cell profiles with those of cells found in the skin, to give a more complete picture of the host response to infection before and after treatment or vaccination.

Future Directions

As mentioned, we are currently in the midst of an efficacy trial of our therapeutic vaccine, ChAd63-KH. The technology we are using is similar to that being used by researchers at the University of Oxford to develop a coronavirus vaccine. In short, we introduce two genes from Leishmania parasites (KMP-11 and HASPB1) into a well-studied chimpanzee adenovirus (the ChAd63 viral vector). After vaccination, host cells become infected with the virus and express the Leishmania proteins in a way that can be recognized efficiently by the immune system. We are particularly interested in how well this vaccine can generate T cells to fight the infection.
With the first of our clinical objectives now well underway – the ongoing therapeutic clinical trial in patients with PKDL will be completed in mid-2021 – we have two additional goals. The next, funded by EDCTP, is to start a new clinical trial to determine whether the vaccine can prevent progression from VL to PKDL. And finally, we hope to develop a human challenge model of leishmaniasis to test the vaccine for its ability to protect against infection by different forms of the parasite. This would open the way to the development of a cost-effective prophylactic vaccine to prevent these diseases from occurring in vulnerable populations across the world.
Our research also has larger long-term ambitions. Our East African partners are also linked through their work on leishmaniasis drug development, as members of the Leishmaniasis East Africa Platform, a group established to help coordinate drug development activities in the region by the Drugs for Neglected Diseases initiative (DNDi). Central questions about why the disease varies between countries are being addressed, and the increased capacity for flow cytometry will additionally support patient monitoring during drug trials conducted by DNDi or other groups. Indeed, through the capacity building this project provides, we hope its reach will extend beyond leishmaniasis, providing much-needed support for research on other neglected diseases of poverty that affect people in the region, including bacterial, fungal, other parasitic and viral diseases. By continuing to demonstrate the analytical power of flow cytometry and its role in helping design much-needed therapies, we hope to open up additional discovery research possibilities for colleagues in Africa and around the world.
The research described in this article is part of the EDCTP2 programme supported by the European Union (grant number RIA2016V-1640; PREV_PKDL; https://www.prevpkdl.eu).
The author
Paul Kaye PhD, FRCPath, FMedSci
Hull York Medical School, University of York, York, UK
E-mail: paul.kaye@york.ac.uk