Technological advances allow use of mid-infrared spectroscopy for therapeutic drug monitoring
Therapeutic drug monitoring is used for a number of reasons. The usual techniques include immunoassay and mass spectrometry, both of which have limitations. CLI chatted to Dr Pin Dong (University of York, York, UK) to discuss the potential for Fourier transform infrared spectroscopy to bridge the gap and provide high specificity measurements with a fast turnaround time.
Why is it necessary to measure drug levels in patients’ serum or plasma samples?
There are a number of reasons why therapeutic drug monitoring (TDM) is useful, including reducing toxicity, ensuring efficacy, monitoring immunosuppressants and managing altered pharmacokinetics.
Reducing toxicity
I would say the primary reason for TDM is that we want to reduce the toxicity of drugs to patients, because some drugs, such as the antiepileptic drugs phenytoin and carbamazepine, have a very narrow therapeutic window. For example, for phenytoin the therapeutic range is from 10 to 20 µg/mL, and any concentration over 20 µg/mL can lead to side effects such as ataxia or vomiting.
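The idea of a therapeutic window can be sketched in a few lines of code. This is a minimal illustration only; the function name is invented, and the thresholds are simply the 10–20 µg/mL phenytoin range mentioned above, not clinical decision logic.

```python
# Illustrative only: flag a serum phenytoin concentration against
# its therapeutic window (10-20 ug/mL, as discussed above).
# Thresholds and function name are for illustration, not clinical use.

def classify_phenytoin_level(conc_ug_per_ml: float) -> str:
    """Classify a serum phenytoin concentration given in ug/mL."""
    if conc_ug_per_ml < 10:
        return "subtherapeutic"
    elif conc_ug_per_ml <= 20:
        return "therapeutic"
    else:
        return "potentially toxic"  # e.g. risk of ataxia or vomiting

print(classify_phenytoin_level(15))  # therapeutic
print(classify_phenytoin_level(25))  # potentially toxic
```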
Ensuring efficacy
Another reason for TDM is to ensure efficacy of the medication. This is very important for antibiotics such as vancomycin – the efficacy is concentration dependent and it is important that the drug remains at effective concentrations to combat the infection while mitigating the risks of toxicity that arise with higher concentrations of vancomycin which include damage to the kidneys and hearing.
Monitoring immunosuppressants
Another common reason for needing TDM is in monitoring the use of immunosuppressants, which are often used for patients undergoing organ transplantation. In this situation, immunosuppressants are used long term, but such drugs also have toxicities and so again we have to strike a balance between ensuring efficacy to prevent organ rejection and avoiding toxic side effects.
Managing altered pharmacokinetics
TDM is essential for adjusting medication doses to achieve optimal therapeutic outcomes in patients with kidney or liver dysfunction, such as those in the ICU, where drug metabolism and clearance are significantly affected.
Future directions
In the future, TDM will be a key part of precision and personalized medicine, where drug regimens will be tailored according to an individual’s genetic, physiological and metabolic profile.
What methods are currently used to do TDM – why are they useful and what are the drawbacks?
First, I would say immunoassays are widely used because they have the advantage of automation and are user-friendly. Immunoassays, for example the enzyme-linked immunosorbent assay (ELISA), are based on antigen–antibody binding, but their drawback is that the specificity is not high enough. They often suffer from cross-reaction with endogenous substances and metabolites. Additionally, antibodies are not readily available for every drug. However, several commercial immunoassay kits are available for TDM, such as for tacrolimus, which are used to quantify drug plasma concentrations.
Another widely used technique is liquid chromatography coupled with mass spectrometry (LC-MS). For this, the triple quadrupole MS is often used because it has a very high sensitivity and is very specific. However, the drawbacks of LC-MS are first that it is a specialized technique requiring very highly trained technicians and second it is a lab-based technique so the hospitals always have to send samples to the lab and wait for the results. So the technique often doesn’t meet the clinical need for a short sample turnaround time. Also, there is a lack of standardization in LC-MS techniques, which is affected by factors such as the column, ionization source, sample preparation, etc.
So to summarize, the unmet clinical need is for a technique which has enough specificity but also can provide results quickly.
What other methods are available that might overcome some of the limitations of LC-MS?
Surface-enhanced Raman spectroscopy (SERS) and Fourier transform infrared (FTIR) spectroscopy, particularly attenuated total reflection (ATR)-FTIR, offer good specificity for small drug molecules and enable fast sample scanning. Also, their strong potential for miniaturization makes them promising candidates for on-site clinical use.
Both Raman and FTIR spectroscopy are vibrational spectroscopic techniques, which means that they study the interaction of the light with the chemical bonds of drug molecules.
However, there are also differences between Raman and FTIR spectroscopy. FTIR uses a broadband light source, such as a globar, which typically covers wavelengths from 2.5 to 25 µm. Upon irradiation, chemical bonds absorb the wavelengths that match their vibrational energy levels. FTIR measures this absorption and produces a mid-IR absorption spectrum over the range of 2.5–25 µm. The spectrum contains information about molecular structure, and the peak intensities can be used for quantification.
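Quantification from peak intensity typically relies on a linear (Beer–Lambert) calibration against standards of known concentration. The sketch below illustrates the idea with invented calibration data; in practice the absorbance values would come from measured standards at a chosen drug fingerprint band.

```python
# Hedged sketch: quantifying drug concentration from FTIR peak
# intensity via a linear (Beer-Lambert) calibration. The calibration
# points are invented for illustration; real work would use measured
# standards at known concentrations.
import numpy as np

conc_standards = np.array([5.0, 10.0, 20.0, 40.0])     # ug/mL
peak_absorbance = np.array([0.011, 0.021, 0.042, 0.083])

# Least-squares fit of the calibration line: A = slope * c + intercept
slope, intercept = np.polyfit(conc_standards, peak_absorbance, 1)

def concentration_from_absorbance(a: float) -> float:
    """Invert the calibration line to estimate concentration (ug/mL)."""
    return (a - intercept) / slope

# An unknown sample with peak absorbance 0.031 comes out near the
# middle of the calibrated range (roughly 15 ug/mL for this toy data).
print(concentration_from_absorbance(0.031))
```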
Raman spectroscopy, by contrast, measures the inelastic scattering of light. It uses monochromatic light (usually a laser) to excite drug molecules, and the energy difference between the incident and scattered photons – called the Raman shift – is measured.
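The Raman shift is conventionally reported in wavenumbers (cm⁻¹) as the difference between the reciprocal incident and scattered wavelengths. A small sketch, with illustrative wavelengths:

```python
# Sketch: computing the Raman shift (in cm^-1) from incident and
# scattered wavelengths, as described above. Wavelength values below
# are illustrative, not tied to any specific instrument.

def raman_shift_cm1(lambda_incident_nm: float,
                    lambda_scattered_nm: float) -> float:
    """Raman shift = 1/lambda_in - 1/lambda_out, in cm^-1.

    The factor 1e7 converts from nm^-1 to cm^-1.
    """
    return (1.0 / lambda_incident_nm - 1.0 / lambda_scattered_nm) * 1e7

# A 532 nm laser with Stokes scattering at 562 nm gives a shift of
# about 1000 cm^-1:
print(raman_shift_cm1(532.0, 562.0))
```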
The Raman inelastic scattering is inherently weak, so SERS has been developed to amplify the weak Raman signals. This technique typically employs metallic surfaces with gold or silver nanoparticles to generate intense localized electromagnetic fields near the nanostructure surface, often called ‘hot spots’. These fields significantly enhance the interaction between light and nearby molecules, amplifying Raman signals by 4 to 10 orders of magnitude and even enabling the detection of single molecules.
However, the application of SERS for absolute quantification in TDM presents several challenges.
• Data reproducibility
Signal enhancement in SERS depends on the hot spots created by nanoparticles interacting strongly with nearby molecules. Variations in the spatial distribution and aggregation of nanoparticles during sample preparation can lead to inconsistent signal enhancement, making reproducibility a significant issue.
• Equipment and expertise
SERS is primarily a lab-based technique, requiring sophisticated
instrumentation and highly trained personnel for operation
and analysis.
• Safety considerations
The use of high-power laser sources in SERS poses potential safety risks. Laboratories employing SERS must implement stringent laser safety protocols to ensure operator safety.
Infrared spectrometer (Adobestock.com)
What are the advantages and limitations of ATR-FTIR?
Advantages of ATR-FTIR
I’ll discuss the advantages of ATR-FTIR first. These are as follows.
1. Fast scanning
ATR-FTIR provides rapid analysis, making it suitable for time-sensitive applications.
2. Potential for miniaturization and integration
The compact design of ATR-FTIR systems facilitates the development of portable devices, such as handheld or box-sized units.
3. User-friendly operation
ATR-FTIR systems are easy to use, requiring minimal training compared to more complex techniques like LC-MS/MS.
These characteristics make ATR-FTIR a promising candidate for on-site TDM devices.
Limitations of ATR-FTIR
The main limitation of ATR-FTIR is that current FTIR systems lack sufficient signal-to-noise ratio for TDM applications, where drug concentrations range from micrograms to nanograms per millilitre.
Challenges of ATR-FTIR
Matrix effects
Endogenous substances such as water, lipids, and proteins exhibit strong mid-IR absorption that can overlap with the fingerprint regions of drugs or broaden their signals due to molecular interactions. For example, hydrogen bonding can decrease the amplitude of an IR absorption peak.
Weak absorption signals
The number of target drug molecules in TDM samples is small and cannot produce mid-IR signals strong enough for quantification. This limitation highlights the need for high-quality photonic mid-IR sensors to amplify signals effectively.
Atmospheric noise
Water vapor exhibits IR absorption in the range of 5.5 to 8 μm. In FTIR analysis, we typically subtract the atmospheric background. However, variations still occur because the scans of the atmospheric background and the sample are conducted at different times. Fluctuations in atmospheric conditions during this interval introduce variability. Water vapor interference is especially problematic when measuring low-concentration drug samples. In the lab environment, a nitrogen purge can be used to create a consistent environment, but this is impractical for on-site TDM applications, further complicating measurements.
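One common way to reduce this variability is to scale the stored water-vapour background to the sample scan before subtracting it, partly compensating for humidity drift between the two measurements. The sketch below illustrates that idea with invented arrays; the spectral region assumed to contain only water-vapour lines, and all values, are placeholders.

```python
# Hedged sketch of atmospheric compensation: the stored water-vapour
# background is scaled by a least-squares factor before subtraction,
# partly compensating for humidity drift between the background scan
# and the sample scan. All arrays are invented placeholders; real
# spectra would come from the instrument.
import numpy as np

sample = np.array([0.30, 0.45, 0.52, 0.48, 0.41, 0.33])
vapour_ref = np.array([0.05, 0.08, 0.10, 0.09, 0.07, 0.05])

# Indices of a region assumed to contain only water-vapour lines
mask = np.array([0, 1, 5])

# Scale factor k minimising ||sample[mask] - k * vapour_ref[mask]||
k = np.dot(sample[mask], vapour_ref[mask]) / np.dot(
    vapour_ref[mask], vapour_ref[mask])

corrected = sample - k * vapour_ref
print(corrected)  # residual spectrum after scaled subtraction
```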
Data analysis and specificity
One common critique of the mid-IR techniques is that their specificity is not as high as that of mass spectrometry. However, the specificity of FTIR comes from the drug fingerprint region of the spectrum. The challenge though is to be able to deeply explore the information from the drug fingerprint region. The protocol we published can greatly remove the interference of serum background and allows us to choose target drug peaks that are free from such interference. However, this is not always achievable for all analytes. In such cases advanced data processing techniques and AI will be crucial for in-depth analysis and enhanced specificity. This underscores the need to build a comprehensive mid-IR database of blood samples. Such a database would empower AI to optimize FTIR techniques for TDM by accounting for interferences and variability in clinical samples.
Interpretation of TDM data
Interpreting drug plasma concentration data for clinical decision-making remains a challenge across all TDM techniques. Patients exhibit variable responses to treatments, making it essential for physicians to consider both TDM values and the patient’s physiological condition to formulate effective treatment plans.
Future vision
In the future, we envision a compact, box-sized mid-IR sensing device integrated with AI. Medical staff would simply load a small volume of the patient’s blood sample and input key physiological parameters. The device would quickly analyse the sample and provide treatment recommendations in real-time. Such an approach would:
• enhance efficiency for healthcare professionals;
• standardize treatment plans, making them more criteria-based; and
• provide critical support for less experienced physicians, especially in complex cases.
Blood sample for vancomycin therapeutic drug level monitoring (Adobestock.com)
Can these limitations be overcome and, if so, how?
Increase signal-to-noise ratio
There are two ways in which we can increase the signal-to-noise ratio.
Firstly, we can reduce the matrix effect: sample preparation is necessary to reduce or eliminate the effect of endogenous substances, such as water, protein, lipids and water-soluble substances.
As we know, water and proteins constitute more than 90% of serum and cause a very strong mid-IR background effect. Additionally, we found that serum lipids have a very strong effect as their concentration is in the milligram per millilitre range, much higher than the microgram per millilitre level of drugs. The mid-IR signals of drugs are often obscured by the stronger signals of serum endogenous components. Therefore, when applying mid-IR techniques for TDM, it is essential to eliminate most endogenous serum substances while minimizing the loss of the target drug (Fig. 1). In our study, we developed a sample preparation protocol that sequentially removes serum lipids, small endogenous water-soluble substances, proteins, and water. This approach allows FTIR spectroscopy to accurately quantify drug concentrations in serum samples as low as 10 micrograms per millilitre.
Secondly, the use of photonic mid-IR sensors significantly improves sensitivity. We have been working on dielectric metasurface-enhanced mid-IR sensors, and preliminary results indicate that they can enhance signals by up to 2–3 orders of magnitude.
Reduce atmospheric noise
To address this, we used PM-IRRAS (polarization-modulated infrared reflection-absorption spectroscopy), which enables the simultaneous measurement of the atmosphere and the sample. By using the self-referencing capability of PM-IRRAS, we can effectively eliminate the variability caused by atmospheric background fluctuations. This highlights one of the key advantages of photonic sensing: self-referencing can be easily integrated into the design and fabrication of photonic sensors.
Figure 1. FTIR spectra related to serum sample preparation (P. Dong)
How do you imagine that your findings will lead to improvements in TDM?
Our published work addresses the scientific question of how endogenous substances in human serum affect the limit of detection (LOD) of FTIR for TDM samples. Importantly, we also propose solutions to mitigate these effects. This work highlights the importance of sample preparation, a factor that is equally relevant for Raman techniques, which face similar challenges due to the Raman signals of endogenous substances in human blood samples. Our findings lay the foundation for developing on-site, handheld mid-IR sensing platforms for clinical TDM applications.
I would like to emphasize that mid-IR sensing is a highly promising technique for on-site TDM, offering the potential to meet the critical clinical requirement of rapid sample turnaround time. However, it is important to note that there is no one-size-fits-all solution. For instance, not all clinical situations require rapid sample turnaround. On-site TDM would be particularly beneficial for patients in the ICU and in emergency toxicity cases.
Another aspect of our work is its relevance to urine sample preparation for drug abuse testing. Urine testing is commonly used for detecting drug abuse, and an on-site technique would be highly beneficial for law enforcement. However, integrating the entire mid-IR sensing platform into a compact, box-sized device presents several challenges. For sample preparation, we need to integrate all procedures into microfluidic systems, which is an area I am currently focusing on. For improving the sensitivity, my colleagues are working on metasurface mid-IR sensors, whose sensitivity is 2–3 orders of magnitude higher than ATR-FTIR. We aim to measure drug levels in the sub-microgram per millilitre range. We are also collaborating with the University of Southampton, the University of Sheffield, and the National Oceanography Centre, focusing on laser sources, detectors, and electronics, with the ultimate goal of achieving miniaturization and integration of the system.
The interviewee
Dr rer. nat. Pin Dong PhD, Research Associate Photonics Research Group, School of Physics, Engineering and Technology, University of York, Heslington, York YO10 5DD, UK
Email: pin.dong@york.ac.uk
For further information see:
Dong P, Li K, Rowe DJ, Krauss TF, Wang Y. Protocol for therapeutic drug monitoring within the clinical range using mid-infrared spectroscopy. Anal Chem 2024;96(48):19021–19028
(https://pubs.acs.org/doi/10.1021/acs.analchem.4c03864).