451
Zhang X, Vimalraj V, Patel M. Routine Analysis of N-Glycans Using Liquid Chromatography Coupled to Routine Mass Detection. Methods Mol Biol 2021; 2271:205-219. PMID: 33908010; DOI: 10.1007/978-1-0716-1241-5_15.
Abstract
Analysis of N-glycans is commonly conducted via enzymatic release, labeling, liquid chromatography (LC) separation, and fluorescence detection. Mass spectrometry (MS) has been increasingly used as an orthogonal detection method to provide additional structural information and increase the confidence of N-glycan analysis. In this chapter, we describe a method to perform routine analysis of N-glycans, including sample preparation with a signal-enhancement label, LC-MS data generation, and data analysis. Using this method, up to 24 N-glycan samples can be prepared at one time and analyzed by LC-MS. With the addition of an automation platform, up to 96 N-glycan samples can be prepared and analyzed in a high-throughput manner.
452
Faber BG, Ebsim R, Saunders FR, Frysz M, Davey Smith G, Cootes T, Tobias JH, Lindner C. Deriving alpha angle from anterior-posterior dual-energy x-ray absorptiometry scans: an automated and validated approach. Wellcome Open Res 2021; 6:60. PMID: 36072553; PMCID: PMC9426635; DOI: 10.12688/wellcomeopenres.16656.2.
Abstract
Introduction: Alpha angle (AA) is a widely used imaging measure of hip shape that is commonly used to define cam morphology, a bulging of the lateral aspect of the femoral head. Cam morphology has shown strong associations with hip osteoarthritis (OA), making the AA a clinically relevant measure. In both clinical practice and research studies, AA tends to be measured manually, which can be inconsistent and time-consuming. Objective: We aimed to (i) develop an automated method of deriving AA from anterior-posterior dual-energy x-ray absorptiometry (DXA) scans; and (ii) validate this method against manual measures of AA. Methods: 6,807 individuals with left hip DXAs were selected from UK Biobank. Outline points were manually placed around the femoral head on 1,930 images before training a Random Forest-based algorithm to place the points on a further 4,877 images. An automatic method for calculating AA was written in Python 3 utilising these outline points. An iterative approach was taken to developing and validating the method, testing the automated measures against independent batches of manually measured images in sequential experiments. Results: Over the course of six experimental stages the concordance correlation coefficient, when comparing the automatic AA to manual measures of AA, improved from 0.28 [95% confidence interval 0.13-0.43] for the initial version to 0.88 [0.84-0.92] for the final version. The inter-rater kappa statistic comparing automatic versus manual measures of cam morphology, defined as AA ≥60°, improved from 0.43 [80% agreement] for the initial version to 0.86 [94% agreement] for the final version. Conclusions: We have developed and validated an automated measure of AA from DXA scans, showing high agreement with manual measurement of AA. The proposed method is available to the wider research community from Zenodo.
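As an illustration of the agreement statistics quoted above, the following is a minimal Python sketch (not the authors' released code) of Lin's concordance correlation coefficient and Cohen's kappa for the cam threshold of AA ≥ 60°; the alpha-angle values are invented for demonstration only.

```python
import numpy as np

def concordance_ccc(x, y):
    """Lin's concordance correlation coefficient between paired measurements."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    mx, my = x.mean(), y.mean()
    cov = ((x - mx) * (y - my)).mean()
    return 2 * cov / (x.var() + y.var() + (mx - my) ** 2)

def cohen_kappa(a, b):
    """Cohen's kappa for two binary raters (e.g. cam morphology: AA >= 60 degrees)."""
    a, b = np.asarray(a, bool), np.asarray(b, bool)
    po = (a == b).mean()                                          # observed agreement
    pe = a.mean() * b.mean() + (1 - a.mean()) * (1 - b.mean())    # chance agreement
    return (po - pe) / (1 - pe)

# Invented paired alpha angles (degrees): manual vs automated.
manual    = np.array([48.0, 55.2, 63.1, 71.4, 58.9, 66.7])
automated = np.array([50.1, 54.0, 61.8, 73.0, 60.2, 65.5])
print("CCC:", round(concordance_ccc(manual, automated), 3))
print("kappa (AA >= 60):", round(cohen_kappa(manual >= 60, automated >= 60), 3))
```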
453
Chung Y, Shin S, Shim H, Sohn JY, Lee DE, Lee H, Eom HS, Kim KG, Kong SY. Development of an Automated Image Analyzer for Microvessel Density Measurement in Bone Marrow Biopsies. Ann Lab Med 2020; 40:312-316. PMID: 32067430; PMCID: PMC7054689; DOI: 10.3343/alm.2020.40.4.312.
Abstract
Angiogenesis is important for the proliferation and survival of multiple myeloma (MM) cells. Bone marrow (BM) microvessel density (MVD) is a useful marker of angiogenesis and an increase in MVD can be used as a marker of poor prognosis in MM patients. We developed an automated image analyzer to assess MVD from images of BM biopsies stained with anti-CD34 antibodies using two color models. MVD was calculated by merging images from the red and hue channels after eliminating non-microvessels. The analyzer results were compared with those obtained by two experienced hematopathologists in a blinded manner using the 84 BM samples of MM patients. Manual assessment of the MVD by two hematopathologists yielded mean±SD values of 19.4±11.8 and 20.0±11.8. The analyzer generated a mean±SD of 19.5±11.2. The intraclass correlation coefficient (ICC) and Bland-Altman plot of the MVD results demonstrated very good agreement between the automated image analyzer and both hematopathologists (ICC=0.893 [0.840–0.929] and ICC=0.906 [0.859–0.938]). This automated analyzer can provide time- and labor-saving benefits with more objective results in hematology laboratories.
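A small sketch of the Bland-Altman agreement summary used above, assuming paired per-sample MVD counts from a pathologist and the analyzer; the counts below are illustrative rather than study data, and a full ICC would normally be computed with a dedicated statistics package.

```python
import numpy as np

def bland_altman(a, b):
    """Bias and 95% limits of agreement between two raters or methods."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    diff = a - b
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Illustrative microvessel counts per hot spot (not the study data).
pathologist = [12, 25, 18, 30, 9, 22]
analyzer    = [13, 24, 20, 28, 10, 23]
bias, (lo, hi) = bland_altman(pathologist, analyzer)
print(f"bias = {bias:.2f}; 95% limits of agreement = ({lo:.2f}, {hi:.2f})")
```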
454
Panesar SS, Kliot M, Parrish R, Fernandez-Miranda J, Cagle Y, Britz GW. Promises and Perils of Artificial Intelligence in Neurosurgery. Neurosurgery 2020; 87:33-44. PMID: 31748800; DOI: 10.1093/neuros/nyz471.
Abstract
Artificial intelligence (AI)-facilitated clinical automation is expected to become increasingly prevalent in the near future. AI techniques may permit rapid and detailed analysis of the large quantities of clinical data generated in modern healthcare settings, at a level that is otherwise impossible for humans. Subsequently, AI may enhance clinical practice by pushing the limits of diagnostics, clinical decision making, and prognostication. Moreover, if combined with surgical robotics and other surgical adjuncts such as image guidance, AI may find its way into the operating room and permit more accurate interventions, with fewer errors. Despite the considerable hype surrounding the impending medical AI revolution, little has been written about potential downsides to increasing clinical automation. These may include both direct and indirect consequences. Directly, faulty, inadequately trained, or poorly understood algorithms may produce erroneous results, which may have wide-scale impact. Indirectly, increasing use of automation may exacerbate de-skilling of human physicians due to over-reliance, poor understanding, overconfidence, and lack of necessary vigilance of an automated clinical workflow. Many of these negative phenomena have already been witnessed in other industries that have undergone, or are undergoing, "automation revolutions," namely commercial aviation and the automotive industry. This narrative review explores the potential benefits and consequences of the anticipated medical AI revolution from a neurosurgical perspective.
455
Otero-Muras I, Carbonell P. Automated engineering of synthetic metabolic pathways for efficient biomanufacturing. Metab Eng 2020; 63:61-80. PMID: 33316374; DOI: 10.1016/j.ymben.2020.11.012.
Abstract
Metabolic engineering involves the engineering and optimization of processes from the single-cell scale to fermentation in order to increase the production of valuable chemicals for health, food, energy, materials and other applications. A systems approach to metabolic engineering has gained traction in recent years thanks to advances in strain engineering, leading to an accelerated scaling from rapid prototyping to industrial production. Metabolic engineering is nowadays on track to become a true manufacturing technology, with reduced times from conception to production enabled by automated protocols for DNA assembly of metabolic pathways in engineered producer strains. In this review, we discuss how the success of the metabolic engineering pipeline often relies on retrobiosynthetic protocols able to identify promising production routes and dynamic regulation strategies through automated biodesign algorithms, which are subsequently assembled as embedded integrated genetic circuits in the host strain. Those approaches are orchestrated by an experimental design strategy that provides optimal scheduling and planning of DNA assembly and rapid prototyping and, ultimately, brings forward an accelerated Design-Build-Test-Learn cycle and the overall optimization of the biomanufacturing process. Achieving such a vision will address the increasingly compelling demand in our society for delivering valuable biomolecules in an affordable, inclusive and sustainable bioeconomy.
456
Krasovsky T, Lubetzky AV, Archambault PS, Wright WG. Will virtual rehabilitation replace clinicians: a contemporary debate about technological versus human obsolescence. J Neuroeng Rehabil 2020; 17:163. PMID: 33298128; PMCID: PMC7724440; DOI: 10.1186/s12984-020-00769-0.
Abstract
This article is inspired by a pseudo Oxford-style debate held at Tel Aviv University, Israel, at the International Conference on Virtual Rehabilitation (ICVR) 2019, the official conference of the International Society for Virtual Rehabilitation. The debate, between two 2-person teams with a moderator, was organized by the ICVR Program committee to address the question "Will virtual rehabilitation replace clinicians?" It brought together five academics with technical, research, and/or clinical backgrounds (Gerry Fluet, Tal Krasovsky, Anat Lubetzky, Philippe Archambault, W. Geoffrey Wright) to debate the pros and cons of using virtual reality (VR) and related technologies to help assess, diagnose, treat, and track recovery, and more specifically to investigate the likelihood that advanced technology will ultimately replace human clinicians. Both teams were assigned a side to defend, whether it represented their own viewpoint or not, and to take whatever positions necessary to make a persuasive argument and win the debate. In this paper, we present a recapitulation of the arguments presented by both sides, and further include an in-depth consideration of the question. We attempt to judiciously lay out a number of arguments that fall along a spectrum from moderate to extreme; the most extreme and/or indefensible positions are presented for rhetorical and demonstrative purposes. Although there may not be a clear answer today, this paper raises questions which are related to the basic nature of the rehabilitation profession, and to the current and potential role of technology within it.
457
Fenrich KK, Hallworth BW, Vavrek R, Raposo PJF, Misiaszek JE, Bennett DJ, Fouad K, Torres-Espin A. Self-directed rehabilitation training intensity thresholds for efficient recovery of skilled forelimb function in rats with cervical spinal cord injury. Exp Neurol 2020; 339:113543. PMID: 33290776; DOI: 10.1016/j.expneurol.2020.113543.
Abstract
Task-specific rehabilitation training is commonly used to treat motor dysfunction after neurological injuries such as spinal cord injury (SCI), yet the use of task-specific training in preclinical animal studies of SCI is not common. This is due in part to the difficulty in training animals to perform specific motor tasks, but also due to the lack of knowledge about optimal rehabilitation training parameters to maximize recovery. The single pellet reaching, grasping and retrieval (SPRGR) task (a.k.a. single pellet reaching task or Whishaw task) is a skilled forelimb motor task used to provide rehabilitation training and test motor recovery in rodents with cervical SCI. However, the relationships between the amount, duration, intensity, and timing of training remain poorly understood. In this study, using automated robots that allow rats with cervical SCI ad libitum access to self-directed SPRGR rehabilitation training, we show clear relationships between the total amount of rehabilitation training, the intensity of training (i.e., number of attempts/h), and performance in the task. Specifically, we found that rats naturally segregate into High and Low performance groups based on training strategy and performance in the task. Analysis of the different training strategies showed that more training (i.e., increased number of attempts in the SPRGR task throughout rehabilitation training) at higher intensities (i.e., number of attempts per hour) increased performance in the task, and that improved performance in the SPRGR task was linked to differences in corticospinal tract axon collateral densities in the injured spinal cords. Importantly, however, our data also indicate that rehabilitation training becomes progressively less efficient (i.e., less recovery for each attempt) as both the amount and intensity of rehabilitation training increase. Finally, we found that Low performing animals could increase their training intensity and transition to High performing animals in chronic SCI. These results highlight the rehabilitation training strategies that are most effective to regain skilled forelimb motor function after SCI, which will facilitate pre-clinical rehabilitation studies using animal models and could be beneficial in the development of more efficient clinical rehabilitation training strategies.
458
Lisova K, Wang J, Chao PH, van Dam RM. A simple and efficient automated microvolume radiosynthesis of [18F]Florbetaben. EJNMMI Radiopharm Chem 2020; 5:30. PMID: 33275179; PMCID: PMC7718361; DOI: 10.1186/s41181-020-00113-w.
Abstract
BACKGROUND Current automated radiosynthesizers are generally optimized for producing large batches of PET tracers. Preclinical imaging studies, however, often require only a small portion of a regular batch, which cannot be economically produced on a conventional synthesizer. Alternative approaches are desired to produce small to moderate batches to reduce cost and the amount of reagents and radioisotope needed to produce PET tracers with high molar activity. In this work, we describe the first reported microvolume method for production of [18F]Florbetaben for use in imaging of Alzheimer's disease. PROCEDURES The microscale synthesis of [18F]Florbetaben was adapted from conventional-scale synthesis methods. Aqueous [18F]fluoride was azeotropically dried with K2CO3/K222 (275/383 nmol) complex prior to radiofluorination of the Boc-protected precursor (80 nmol) in 10 μL DMSO at 130 °C for 5 min. The resulting intermediate was deprotected with HCl at 90 °C for 3 min and recovered from the chip in aqueous acetonitrile solution. The crude product was purified via analytical scale HPLC and the collected fraction reformulated via solid-phase extraction using a miniature C18 cartridge. RESULTS Starting with 270 ± 100 MBq (n = 3) of [18F]fluoride, the method affords formulated product with 49 ± 3% (decay-corrected) yield, >98% radiochemical purity and a molar activity of 338 ± 55 GBq/μmol. The miniature C18 cartridge enables efficient elution with only 150 μL of ethanol, which is diluted to a final volume of 1.0 mL, thus providing a sufficient concentration for in vivo imaging. The whole procedure can be completed in 55 min. CONCLUSIONS This work describes an efficient and reliable procedure to produce [18F]Florbetaben in quantities sufficient for large-scale preclinical applications. This method provides very high yields and molar activities compared to reported literature methods. This method can be applied to higher starting activities with special consideration given to automation and radiolysis prevention.
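For readers unfamiliar with decay-corrected yields, the sketch below applies the standard fluorine-18 decay correction (physical half-life ≈ 109.77 min) to activities on the same scale as those reported; the numbers are illustrative, not the study's raw data.

```python
import math

F18_HALF_LIFE_MIN = 109.77  # physical half-life of fluorine-18

def decay_corrected_yield(start_mbq, product_mbq_at_eos, synthesis_min):
    """Radiochemical yield corrected back to the start of synthesis."""
    decay_factor = math.exp(-math.log(2) * synthesis_min / F18_HALF_LIFE_MIN)
    return (product_mbq_at_eos / decay_factor) / start_mbq

# Illustrative numbers on the scale of the abstract: 270 MBq start, 55 min synthesis,
# ~93 MBq of formulated product at end of synthesis (EOS).
print(f"{decay_corrected_yield(270.0, 93.0, 55.0):.1%} decay-corrected yield")
```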
459
Using dual energy X-ray absorptiometry to estimate commercial cut weights at abattoir chain-speed. Meat Sci 2020; 173:108400. PMID: 33316705; DOI: 10.1016/j.meatsci.2020.108400.
Abstract
This experiment assessed the ability of an on-line dual energy x-ray absorptiometer (DEXA) installed at a commercial abattoir to determine commercial cut weights in lamb carcases at abattoir chain-speed. In total, 200 lamb carcases were scanned using a DEXA that had been trained to predict computed tomography-determined proportions of fat, lean, and bone. Models were then trained using hot carcase weight and either DEXA fat% value or GR tissue depth to predict cut weight. Results from validation tests of DEXA models demonstrated excellent precision for predicting cut weight, in most cases describing more than 85% of the variation, and RMSE values that represented between 5 and 13% of the average weight of each cut. For most cuts, these weight predictions were superior to those informed by GR tissue depth. This precision was maintained upon validation. Additional analyses utilised pixel information from the fore, saddle, and hind sections of DEXA images. This further enhanced the predictive power of cut weight models.
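As a sketch of the kind of cut-weight model described above, the snippet below fits an ordinary least-squares model from hot carcase weight and DEXA fat% and reports R² and RMSE; all values are invented for illustration and do not reproduce the abattoir dataset.

```python
import numpy as np

# Invented example data: hot carcase weight (kg), DEXA fat %, and one cut weight (kg).
hcw = np.array([20.5, 22.1, 24.8, 26.3, 23.0, 25.5, 21.7, 27.9])
fat = np.array([18.0, 21.5, 25.2, 28.1, 22.4, 26.0, 19.8, 30.3])
cut = np.array([1.82, 1.95, 2.20, 2.31, 2.05, 2.24, 1.90, 2.44])

X = np.column_stack([np.ones_like(hcw), hcw, fat])   # intercept + two predictors
beta, *_ = np.linalg.lstsq(X, cut, rcond=None)       # ordinary least squares
pred = X @ beta

rmse = np.sqrt(np.mean((cut - pred) ** 2))
r2 = 1 - np.sum((cut - pred) ** 2) / np.sum((cut - cut.mean()) ** 2)
print(f"R^2 = {r2:.3f}; RMSE = {rmse:.3f} kg ({100 * rmse / cut.mean():.1f}% of mean cut weight)")
```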
460
The Confluence of Innovation in Therapeutics and Regulation: Recent CMC Considerations. J Pharm Sci 2020; 109:3524-3534. PMID: 32971125; PMCID: PMC7505112; DOI: 10.1016/j.xphs.2020.09.025.
Abstract
The field of human therapeutics has expanded tremendously from small molecules to complex biological modalities, and this trend has accelerated in the last two decades with a greater diversity in the types and applications of novel modalities, accompanied by increasing sophistication in drug delivery technology. These innovations have led to a corresponding increase in the number of therapies seeking regulatory approval, and as the industry continues to evolve, regulations will need to adapt to the ever-changing landscape. The growth in this field thus represents a challenge for regulatory authorities as well as for sponsors. This review provides a brief description of novel biologics, including innovative antibody therapeutics, genetic modification technologies, new developments in vaccines, and multifunctional modalities. It also describes a few pertinent drug delivery mechanisms such as nanoparticles, liposomes, coformulation, recombinant human hyaluronidase for subcutaneous delivery, pulmonary delivery, and 3D printing. In addition, it provides an overview of the current CMC regulatory challenges and discusses potential methods of accelerating regulatory mechanisms for more efficient approvals. Finally, we look at the future of biotherapeutics and emphasize the need to bring these modalities to the forefront of patient care from a global perspective as effectively as possible.
461
Batumalai V, Jameson MG, King O, Walker R, Slater C, Dundas K, Dinsdale G, Wallis A, Ochoa C, Gray R, Vial P, Vinod SK. Cautiously optimistic: A survey of radiation oncology professionals' perceptions of automation in radiotherapy planning. Tech Innov Patient Support Radiat Oncol 2020; 16:58-64. PMID: 33251344; PMCID: PMC7683263; DOI: 10.1016/j.tipsro.2020.10.003.
Abstract
INTRODUCTION While there is evidence to show the positive effects of automation, the impact on radiation oncology professionals has been poorly considered. This study examined radiation oncology professionals' perceptions of automation in radiotherapy planning. METHOD An online survey link was sent to the chief radiation therapists (RT) of all Australian radiotherapy centres to be forwarded to RTs, medical physicists (MP) and radiation oncologists (RO) within their institution. The survey was open from May-July 2019. RESULTS Participants were 204 RTs, 84 MPs and 37 ROs (response rates ∼10% of the overall radiation oncology workforce). Respondents felt automation resulted in improvement in consistency in planning (90%), productivity (88%), quality of planning (57%), and staff focus on patient care (49%). When asked about the perceived impact of automation, the responses were: will change the primary tasks of certain jobs (66%), will allow staff to do the remaining components of their job more effectively (51%), will eliminate jobs (20%), and will not have an impact on jobs (6%). 27% of respondents believed automation will reduce job satisfaction. 71% of respondents strongly agree/agree that automation will cause a loss of skills, while only 25% strongly agree/agree that the training and education tools in their department are sufficient. CONCLUSION Although the effect of automation is perceived positively, there are some concerns about the loss of skill sets and the lack of training to maintain them. These results highlight the need for continued education to ensure that skills and knowledge are not lost with automation.
462
Luque-Córdoba D, Priego-Capote F. Fully automated method for quantitative determination of steroids in serum: An approach to evaluate steroidogenesis. Talanta 2020; 224:121923. PMID: 33379124; DOI: 10.1016/j.talanta.2020.121923.
Abstract
Steroidogenesis is a set of metabolic reactions in which enzymes play a key role in controlling the physiological levels of steroids. A deficiency in steroidogenesis induces an accumulation and/or insufficiency of steroids in human blood and can lead to different pathologies. This issue, added to the low levels of steroids (pg/mL to ng/mL) in this biofluid, makes their determination an analytical challenge. In this research, we present a high-throughput and fully automated method based on solid-phase extraction on-line coupled to liquid chromatography with tandem mass spectrometry detection (SPE-LC-MS/MS) to quantify estrogens (estrone and estradiol), androgens (testosterone, androstenedione, dihydrotestosterone and dehydroepiandrosterone), progestogens (progesterone, pregnenolone, 17-hydroxyprogesterone and 17-hydroxypregnenolone), glucocorticoids (21-hydroxyprogesterone, 11-deoxycortisol, cortisone, corticosterone and cortisol) and one mineralocorticoid (aldosterone) in human serum. The performance of the SPE step and the multiple reaction monitoring (MRM) mode allowed high sensitivity and selectivity to be reached without any derivatization reaction. The fragmentation mechanisms of the steroids were additionally studied by LC-MS/MS in high-resolution mode to confirm the MRM transitions. The method was characterized with two SPE sorbents with similar physico-chemical properties. Limits of quantification were at pg/mL levels, the variability was below 25% (except for pregnenolone and cortisone), and the accuracy, expressed as bias, was always within ±25%. The proposed method was tested in human serum from ten volunteers, and the levels found for the sixteen target steroids were in satisfactory agreement with the physiological ranges reported in the literature.
463
Vala C, Mothes C, Chicheri G, Magadur P, Viot G, Deloye JB, Maia S, Bouvet Y, Dupont AC, Arlicot N, Guilloteau D, Emond P, Vercouillie J. Fully automated radiosynthesis of [18F]LBT999 on TRACERlab FX FN and AllinOne modules, a PET radiopharmaceutical for imaging the dopamine transporter in human brain. EJNMMI Radiopharm Chem 2020; 5:26. PMID: 33196944; PMCID: PMC7669936; DOI: 10.1186/s41181-020-00105-w.
Abstract
Background Fluorine-18-labelled 8-((E)-4-fluoro-but-2-enyl)-3β-p-tolyl-8-aza-bicyclo[3.2.1]octane-2β-carboxylic acid methyl ester ([18F]LBT999) is a selective radioligand for the in vivo neuroimaging and quantification of the dopamine transporter by Positron Emission Tomography (PET). [18F]LBT999 was produced on a TRACERlab FXFN for the Phase I study, but for Phase III and a potential transfer to industrial production, production was also implemented on an AllinOne (AIO) system requiring a single-use cassette. Both production methods are reported herein. Results Automation of [18F]LBT999 radiosynthesis on FXFN was carried out in 35% yield (decay-corrected) in 65 min (n = 16), with a radiochemical purity higher than 99% and a molar activity of 158 GBq/μmol at the end of synthesis. The transfer to the AIO platform followed by optimizations allowed the production of [18F]LBT999 in 32.7% yield (decay-corrected) within 48 min (n = 5), with a radiochemical purity better than 98% and a molar activity above 154 GBq/μmol on average at the end of synthesis. Quality controls of both methods met the specification for clinical application. Conclusion Both modules allow efficient and reproducible radiosynthesis of [18F]LBT999 with good radiochemical yields and a reasonable synthesis time. The developments made on AIO, such as its ability to meet pharmaceutical criteria and to more easily comply with GMP requirements, make it an optimal approach for the potential industrial production of [18F]LBT999 and future wider use.
464
Garabedian BM, Meadows CW, Mingardon F, Guenther JM, de Rond T, Abourjeily R, Lee TS. An automated workflow to screen alkene reductases using high-throughput thin layer chromatography. Biotechnol Biofuels 2020; 13:184. PMID: 33292503; PMCID: PMC7653764; DOI: 10.1186/s13068-020-01821-w.
Abstract
BACKGROUND Synthetic biology efforts often require high-throughput screening tools for enzyme engineering campaigns. While innovations in chromatographic and mass spectrometry-based techniques provide relevant structural information associated with enzyme activity, these approaches can require cost-intensive instrumentation and technical expertise not broadly available. Moreover, complex workflows and analysis time can significantly impact throughput. To this end, we develop an automated, 96-well screening platform based on thin layer chromatography (TLC) and use it to monitor in vitro activity of a geranylgeranyl reductase isolated from Sulfolobus acidocaldarius (SaGGR). RESULTS Unreduced SaGGR products are oxidized to their corresponding epoxide and applied to thin layer silica plates by acoustic printing. These derivatives are chromatographically separated based on the extent of epoxidation and are covalently ligated to a chromophore, allowing detection of enzyme variants with unique product distributions or enhanced reductase activity. Herein, we employ this workflow to examine farnesol reduction using a codon-saturation mutagenesis library at the Leu377 site of SaGGR. We show this TLC-based screen can distinguish between fourfold differences in enzyme activity for select mutants and validated those results by GC-MS. CONCLUSIONS With appropriate quantitation methods, this workflow can be used to screen polyprenyl reductase activity and can be readily adapted to analyze broader catalyst libraries whose products are amenable to TLC analysis.
465
Rodríguez-Maese R, Ferrer L, Leal LO. Automatic multicommuted flow systems applied in sample treatment for radionuclide determination in biological and environmental analysis. J Environ Radioact 2020; 223-224:106390. PMID: 32883535; DOI: 10.1016/j.jenvrad.2020.106390.
Abstract
The presence of artificial and natural radioactivity in the environment is currently a topic of great ecological relevance, and even a human health issue, due to the increase in various anthropogenic activities. The use of multicommuted flow analysis techniques (e.g. Multi-Syringe Flow Injection Analysis - MSFIA, Lab-On-Valve - LOV and Lab-In-Syringe - LIS) has allowed the automation of radiochemical procedures to separate and preconcentrate radionuclides in environmental and biological samples. In comparison with the manual approach commonly used in routine analysis for radioactivity monitoring, automation has enabled the development of highly reproducible methodologies with a high analysis frequency. Moreover, during the analytical procedure, the intervention of the analyst is drastically reduced, minimizing the radiological risk. Automation also offers significant advantages such as minimal consumption of time and reagents, reducing cost and waste generation and contributing to green chemistry. In this review, multicommuted flow analysis techniques (MSFIA, LOV and LIS) reported in the last decade for the development of automatic sample treatment methodologies used to separate, preconcentrate and quantify 90Sr, 99Tc, natural U and 226Ra in biological and environmental samples are described and critically compared.
466
Ghimire R, Skinner J, Carnathan M. Who perceived automation as a threat to their jobs in metro Atlanta: Results from the 2019 Metro Atlanta Speaks survey. Technol Soc 2020; 63:101368. PMID: 32904576; PMCID: PMC7456576; DOI: 10.1016/j.techsoc.2020.101368.
Abstract
While ethnic minorities, less-educated or less-skilled workers, and low-income workers are, in general, deemed more vulnerable to automation, the literature has not adequately investigated whether or not these sociodemographic groups perceive automation as a threat to their jobs. Using the 2019 Metro Atlanta Speaks survey, we find that high-income residents and residents with a graduate or a professional degree did not perceive automation as a threat to their jobs, but relatively older residents, blacks or African Americans, and low-income residents perceived automation as a threat to their jobs. Although Hispanics or Latinos and less-educated residents have been identified as more vulnerable to automation, they did not perceive automation as a threat to their jobs. Hence, Hispanics or Latinos and less-educated residents in metro Atlanta may be most at risk of automation-related unemployment, since they do not perceive automation as a threat to their jobs despite being deemed more vulnerable to it.
467
Brinson RG, Elliott KW, Arbogast LW, Sheen DA, Giddens JP, Marino JP, Delaglio F. Principal component analysis for automated classification of 2D spectra and interferograms of protein therapeutics: influence of noise, reconstruction details, and data preparation. J Biomol NMR 2020; 74:643-656. PMID: 32700053; DOI: 10.1007/s10858-020-00332-y.
Abstract
Protein therapeutics have numerous critical quality attributes (CQA) that must be evaluated to ensure safety and efficacy, including the requirement to adopt and retain the correct three-dimensional fold without forming unintended aggregates. Therefore, the ability to monitor protein higher order structure (HOS) can be valuable throughout the lifecycle of a protein therapeutic, from development to manufacture. 2D NMR has been introduced as a robust and precise tool to assess the HOS of a protein biotherapeutic. A common use case is to decide whether two groups of spectra are substantially different, as an indicator of difference in HOS. We demonstrate a quantitative use of principal component analysis (PCA) scores to perform this decision-making, and show the effect of acquisition and processing details on class separation using samples of NISTmAb monoclonal antibody Reference Material subjected to two different oxidative stress protocols. The work introduces an approach to computing similarity from PCA scores based upon the technique of histogram intersection, a method originally developed for retrieval of images from large databases. Results show that class separation can be robust with respect to random noise, reconstruction method, and analysis region selection. By contrast, details such as baseline distortion can have a pronounced effect, and so must be controlled carefully. Since the classification approach can be performed without the need to identify peaks, results suggest that it is possible to use even more efficient measurement strategies that do not produce spectra that can be analyzed visually, but nevertheless allow useful decision-making that is objective and automated.
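The following is a minimal, self-contained sketch of the two ideas named above, PCA scores and histogram intersection, applied to simulated "spectra"; it is not the NISTmAb analysis pipeline, and the data and perturbation are invented.

```python
import numpy as np

def pca_scores(spectra, n_components=2):
    """Project mean-centred spectra (rows = samples) onto the leading principal components."""
    X = spectra - spectra.mean(axis=0)
    _, _, vt = np.linalg.svd(X, full_matrices=False)
    return X @ vt[:n_components].T

def histogram_intersection(a, b, bins=20):
    """Similarity of two 1-D score distributions: 1 = identical, 0 = disjoint."""
    lo, hi = min(a.min(), b.min()), max(a.max(), b.max())
    ha, _ = np.histogram(a, bins=bins, range=(lo, hi))
    hb, _ = np.histogram(b, bins=bins, range=(lo, hi))
    return np.minimum(ha / ha.sum(), hb / hb.sum()).sum()

# Simulated spectra: a reference group and a "stressed" group with a shift in a few channels.
rng = np.random.default_rng(0)
reference = rng.normal(0.0, 1.0, size=(30, 200))
stressed = rng.normal(0.0, 1.0, size=(30, 200))
stressed[:, 50:60] += 1.5                          # simulated structural perturbation

scores = pca_scores(np.vstack([reference, stressed]))
similarity = histogram_intersection(scores[:30, 0], scores[30:, 0])
print("PC1 histogram intersection:", round(similarity, 3))
```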
468
Li LH, Kaptein SJF, Schmid MA, Zmurko J, Leyssen P, Neyts J, Dallmeier K. A dengue type 2 reporter virus assay amenable to high-throughput screening. Antiviral Res 2020; 183:104929. PMID: 32898584; DOI: 10.1016/j.antiviral.2020.104929.
Abstract
Dengue virus (DV) is an important mosquito-borne flavivirus threatening almost half of the world's population. Prophylaxis and potent anti-DV drugs are urgently needed. Here, we developed a high content imaging-based (HCI) assay with DV type 2 expressing the fluorescent protein mCherry (DV2/mCherry) to improve the efficiency and robustness of the drug discovery process. For the construction of the reporter virus, the mCherry gene followed by the ribosome-skipping 2A sequence of the Thosea asigna virus (T2A) was placed upstream of the full DV2 open reading frame. The biological characteristics, including mCherry expression, virus replication rate, and plaque phenotype, were examined and validated in BHK-21, Vero and C6/36 cells. A robust image-based antiviral assay combined with an automated robotic system was then developed, with a Z' factor of 0.73. To validate the image-based antiviral assay, a panel of reference compounds with different molecular mechanisms of anti-DV activity was assessed: (i) the glycosylation inhibitor, Celgosivir, (ii) two NS4b-targeting compounds: a 3-Acyl-indole derivative and NITD618, and (iii) two nucleoside viral polymerase inhibitors, 2'CMC and 7DMA. The inhibition profiles were quantified and obtained by means of HCI and RT-qPCR. Both methods resulted in very comparable inhibition profiles. In conclusion, a powerful and robust assay was developed with a fully automated data generation and processing pipeline. It makes the new reporter virus assay amenable to high-throughput screening of large libraries of small molecules.
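The Z' factor quoted above is a standard assay-quality statistic; a small sketch of its calculation from positive- and negative-control wells is shown below with invented readouts (the study's own controls and values are not reproduced here).

```python
import numpy as np

def z_prime(pos, neg):
    """Z' factor: 1 - 3*(sd_pos + sd_neg) / |mean_pos - mean_neg|."""
    pos, neg = np.asarray(pos, float), np.asarray(neg, float)
    return 1 - 3 * (pos.std(ddof=1) + neg.std(ddof=1)) / abs(pos.mean() - neg.mean())

# Invented mCherry readouts: untreated infected wells vs wells with a reference inhibitor.
infected  = [980, 1010, 995, 1005, 990, 1002]
inhibited = [110, 95, 105, 100, 98, 112]
print(f"Z' = {z_prime(infected, inhibited):.2f}")
```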
469
Ren CD, Qi W, Wyatt EA, Yeary J, Westland K, Berke M, Rathore N. Application of a High Throughput and Automated Workflow to Therapeutic Protein Formulation Development. J Pharm Sci 2020; 110:1130-1141. PMID: 33203511; DOI: 10.1016/j.xphs.2020.10.040.
Abstract
Rapid and efficient formulation development is critical to successfully bringing therapeutic protein drug products into a competitive market under increasingly aggressive timelines. Conventional application of high throughput techniques for formulation development has been limited to lower protein concentrations, which are not applicable to late-stage development of high-concentration therapeutics. In this work, we present a high throughput (HT) formulation workflow that enables screening at representative concentrations via integration of a micro-buffer exchange system with automated analytical instruments. The operational recommendations associated with the use of such HT systems as well as the efficiencies gained (reduction in hands-on time and run time by over 70% and 30%, respectively), which enable practical characterization of an expanded formulation design space, are discussed. To demonstrate that the workflow is fit for purpose, the formulation properties and stability profiles (SEC and CEX) from samples generated by the HT workflow were compared to those processed by ultrafiltration/diafiltration, and the results were shown to be in good agreement. This approach was further applied to two case studies, one focused on a formulation screen that studied the effects of pH and excipient on viscosity and stability, and the other focused on selection of an appropriate viscosity mimic solution for a protein product.
470
Hooks DA, Dubois R, Meillet V, Nicot J, Berte B, Yamashita S, Mahida S, Sellal JM, Frontera A, Denis A, Sacher F, Derval N, Crozier I, Melton I, Haissaguerre M, Jais P. Automated rhythm-based control of radiofrequency ablation close to the atrioventricular node: Preclinical, animal, and first-in-human testing. Heart Rhythm 2020; 18:734-742. PMID: 33091601; DOI: 10.1016/j.hrthm.2020.10.014.
Abstract
BACKGROUND The risk of heart block during radiofrequency ablation of atrioventricular (AV) nodal reentrant tachycardia and septal accessory pathways is minimized by rapidly ceasing ablation in response to markers of risk, such as atrioventricular dissociation, fast junctional rhythm, PR interval prolongation, or 2 consecutive atrial or ventricular depolarizations. Currently, this is done manually. OBJECTIVES The objectives of this study were to build and test a control system able to monitor cardiac rhythm and automatically terminate ablation energy when required. METHODS The device was built from off-the-shelf componentry. Preclinical testing involved real-time input of electrogram/electrocardiogram data from 209 ablation procedures (20 patients) over slow (n = 19) and fast (n = 1) AV nodal pathways. The device response speed was compared with the human response speed. The device's ability to prevent heart block was tested in 5 sheep. First-in-human testing was then performed in 12 patients undergoing AV nodal reentrant tachycardia ablation. RESULTS Risk conditions necessitating shutoff of ablation (200 total; 111 preclinical and 89 first-in-human) were detected by the device with 100% sensitivity and 94% specificity, automatically terminating ablation while still allowing successful ablation in all patients. Device shutoff of ablation was always faster than human response (median difference 1.24 seconds). In each of 5 sheep, 40 consecutive attempts to cause heart block by ablating over the His bundle were unsuccessful because of automatic shutoff in response to rhythm change. CONCLUSION Automated shutoff of ablation close to the AV node in response to markers of the risk of heart block is feasible, with high accuracy and a faster response than a human operator. The system may improve the safety of ablation near the AV node by preventing heart block.
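As a reminder of how the detection statistics above are computed, here is a trivial sketch; the true-negative and false-positive counts are invented purely so the arithmetic reproduces the reported 100% sensitivity and 94% specificity, since the abstract does not give those denominators.

```python
def sensitivity_specificity(tp, fn, fp, tn):
    """Detection performance of the automated shut-off against adjudicated rhythm events."""
    sensitivity = tp / (tp + fn)   # risk conditions that correctly triggered shut-off
    specificity = tn / (tn + fp)   # benign segments correctly left uninterrupted
    return sensitivity, specificity

# 200 adjudicated risk conditions (all detected); benign-segment counts are invented.
sens, spec = sensitivity_specificity(tp=200, fn=0, fp=30, tn=470)
print(f"sensitivity = {sens:.0%}, specificity = {spec:.0%}")
```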
471
An evaluation of DistillerSR's machine learning-based prioritization tool for title/abstract screening - impact on reviewer-relevant outcomes. BMC Med Res Methodol 2020; 20:256. PMID: 33059590; PMCID: PMC7559198; DOI: 10.1186/s12874-020-01129-1.
Abstract
BACKGROUND Systematic reviews often require substantial resources, partially due to the large number of records identified during searching. Although artificial intelligence may not be ready to fully replace human reviewers, it may accelerate screening and reduce the screening burden. Using DistillerSR (May 2020 release), we evaluated the performance of the prioritization simulation tool to determine the reduction in screening burden and time savings. METHODS Using a true recall @ 95%, response sets from 10 completed systematic reviews were used to evaluate: (i) the reduction of screening burden; (ii) the accuracy of the prioritization algorithm; and (iii) the hours saved when a modified screening approach was implemented. To account for variation in the simulations, and to introduce randomness (through shuffling the references), 10 simulations were run for each review. Means, standard deviations, medians and interquartile ranges (IQR) are presented. RESULTS Among the 10 systematic reviews, using true recall @ 95% there was a median reduction in screening burden of 47.1% (IQR: 37.5 to 58.0%). A median of 41.2% (IQR: 33.4 to 46.9%) of the excluded records needed to be screened to achieve true recall @ 95%. The median title/abstract screening hours saved using a modified screening approach at a true recall @ 95% was 29.8 h (IQR: 28.1 to 74.7 h). This was increased to a median of 36 h (IQR: 32.2 to 79.7 h) when considering the time saved not retrieving and screening full texts of the remaining 5% of records not yet identified as included at title/abstract. Among the 100 simulations (10 simulations per review), none of these 5% of records were a final included study in the systematic review. Achieving true recall @ 95% rather than @ 100% resulted in a median reduction in screening burden of 40.6% (IQR: 38.3 to 54.2%). CONCLUSIONS The prioritization tool in DistillerSR can reduce screening burden. A modified or stop-screening approach once a true recall @ 95% is achieved appears to be a valid method for rapid reviews, and perhaps systematic reviews. This needs to be further evaluated in prospective reviews using the estimated recall.
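A simplified sketch of the screening-burden calculation used in such simulations: given a ranked record list with include/exclude labels, compute how far down the ranking one must screen to reach a target recall. The ranking and labels below are simulated, not DistillerSR output.

```python
import numpy as np

def burden_at_recall(ranked_labels, target_recall=0.95):
    """Fraction of the ranked records screened before the target recall of includes is reached."""
    labels = np.asarray(ranked_labels, int)                 # 1 = included study, 0 = excluded
    needed = int(np.ceil(target_recall * labels.sum()))
    cutoff = np.flatnonzero(np.cumsum(labels) >= needed)[0] + 1
    return cutoff / labels.size

# Simulated prioritized ranking: 2,000 records, 60 includes concentrated near the top.
rng = np.random.default_rng(1)
labels = np.zeros(2000, int)
labels[rng.choice(600, size=60, replace=False)] = 1

screened = burden_at_recall(labels, target_recall=0.95)
print(f"screened {screened:.1%} of records; screening burden reduced by {1 - screened:.1%}")
```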
472
Chen R, Ho JC, Lin JMS. Extracting medication information from unstructured public health data: a demonstration on data from population-based and tertiary-based samples. BMC Med Res Methodol 2020; 20:258. PMID: 33059588; PMCID: PMC7559204; DOI: 10.1186/s12874-020-01131-7.
Abstract
Background Unstructured data from clinical epidemiological studies can be valuable and easy to obtain. However, it requires further extraction and processing for data analysis. Doing this manually is labor-intensive, slow and subject to error. In this study, we propose an automation framework for extracting and processing unstructured data. Methods The proposed automation framework consisted of two natural language processing (NLP)-based tools for unstructured text data on medications and reasons for medication use. We first checked spelling using a spell-check program trained on publicly available knowledge sources and then applied NLP techniques. We mapped medication names into generic names using vocabulary from publicly available knowledge sources. We used WHO’s Anatomical Therapeutic Chemical (ATC) classification system to map generic medication names to medication classes. We processed the reasons for medication use with the Lancaster stemmer method and then grouped and mapped them to disease classes based on organ systems. Finally, we demonstrated this automation framework on two data sources for Myalgic Encephalomyelitis/Chronic Fatigue Syndrome (ME/CFS): tertiary-based (n = 378) and population-based (n = 664) samples. Results A total of 8681 raw medication records were used for this demonstration. The 1266 distinct medication names (omitting supplements) were condensed to 89 ATC classification system categories. The 1432 distinct raw reasons for medication use were condensed to 65 categories via NLP. Compared to completion of the entire process manually, our automation process reduced the number of terms requiring manual labor for mapping by 84.4% for medications and 59.4% for reasons for medication use. Additionally, this process improved the precision of the mapped results. Conclusions Our automation framework demonstrates the usefulness of NLP strategies even when there is no established mapping database. For a less established database (e.g., reasons for medication use), the method is easily modifiable as new knowledge sources for mapping are introduced. The capability to condense large features into interpretable ones will be valuable for subsequent analytical studies involving techniques such as machine learning and data mining.
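The sketch below illustrates the general shape of such a mapping pipeline with tiny hand-written lookups standing in for the real vocabularies (brand-to-generic, ATC, and organ-system tables); NLTK's Lancaster stemmer is assumed to be installed, and nothing here reproduces the study's actual dictionaries.

```python
from nltk.stem import LancasterStemmer  # assumes nltk is installed

# Tiny illustrative lookups; a real pipeline would use full drug vocabularies and the WHO ATC index.
BRAND_TO_GENERIC = {"tylenol": "acetaminophen", "advil": "ibuprofen", "zoloft": "sertraline"}
GENERIC_TO_ATC = {"acetaminophen": "N02BE", "ibuprofen": "M01AE", "sertraline": "N06AB"}
REASON_TO_SYSTEM = {"headache": "nervous system", "pain": "musculoskeletal", "depression": "mental health"}

stemmer = LancasterStemmer()
STEMMED_REASONS = {stemmer.stem(k): v for k, v in REASON_TO_SYSTEM.items()}

def map_medication(raw_name):
    """Map a raw medication string to a generic name and an ATC class (or 'unmapped')."""
    name = raw_name.strip().lower()
    generic = BRAND_TO_GENERIC.get(name, name)
    return generic, GENERIC_TO_ATC.get(generic, "unmapped")

def map_reason(raw_reason):
    """Stem each word of a free-text reason and return the first organ-system match."""
    for word in raw_reason.lower().split():
        stem = stemmer.stem(word)
        if stem in STEMMED_REASONS:
            return STEMMED_REASONS[stem]
    return "unmapped"

print(map_medication("Tylenol"))        # -> ('acetaminophen', 'N02BE')
print(map_reason("chronic headaches"))  # -> 'nervous system'
```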
473
Yang C, Daigger GT, Belia E, Kerkez B. Extracting useful signals from flawed sensor data: Developing hybrid data-driven approaches with physical factors. Water Res 2020; 185:116282. PMID: 33086467; DOI: 10.1016/j.watres.2020.116282.
Abstract
The increased availability and affordability of sensors, especially water quality sensors, are poised to improve process control and modelling in water and wastewater systems. Sensor measurements are often flawed by unavoidable influent complexity and sensor instability, making extraction of useful signals problematic. Although a natural reaction is to put extra effort into sensor maintenance to achieve more reliable measurements, useful signals can also be extracted from these flawed measurements by appropriate use of available data-driven tools guided by physical factors (e.g. prior process knowledge, physical constraints, phenomenological observations). Such a methodology is herein defined as a hybrid approach. While the concept of coupling physical factors into data-driven tools is not new in downstream applications such as process modelling and control, little literature has explicitly applied it in the first and equally important step: signal processing. With flawed influent five-day biochemical oxygen demand (BOD5) sensor measurements as an example, this paper provides a comprehensive case study demonstrating how physical factors were incorporated throughout the procedure of processing a flawed signal to extract its maximum value. Results showed that useful signals were extracted and validated via an assembly of well-established machine learning tools, whose performance was improved with physical factors. An Improved Standard Signal Processing Architecture (ISSPA) is also proposed based on the results of this research.
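The snippet below is a toy illustration (not ISSPA itself) of the hybrid idea: a data-driven rolling-median filter constrained by two pieces of physical knowledge, non-negativity and a maximum plausible rate of change; the signal values and limits are invented.

```python
import numpy as np

def clean_signal(raw, max_step=5.0, window=5):
    """Rolling-median smoothing constrained by simple physical knowledge of the process."""
    x = np.clip(np.asarray(raw, float), 0.0, None)   # physical constraint: concentration >= 0
    cleaned = np.empty_like(x)
    prev = x[0]
    for i in range(len(x)):
        med = np.median(x[max(0, i - window + 1):i + 1])              # data-driven spike rejection
        cleaned[i] = prev + np.clip(med - prev, -max_step, max_step)  # bounded rate of change
        prev = cleaned[i]
    return cleaned

# Invented BOD5-like series (mg/L) with a dropout (0) and a spike (400).
raw = [150, 152, 149, 0, 151, 153, 400, 155, 154, 152]
print(np.round(clean_signal(raw), 1))
```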
474
Lupo SA, Romesberg RL, Lu X. Automated inline pigment removal for the analysis of pesticide residues in spinach by liquid chromatography tandem mass spectrometry. J Chromatogr A 2020; 1629:461477. PMID: 32823011; DOI: 10.1016/j.chroma.2020.461477.
Abstract
An automated inline sample preparation (ILSP) method has been developed for pesticide residue analysis in spinach by LC-MS/MS. Chlorophyll pigments and other matrix constituents were removed from the sample extract using a UHPLC system equipped with an auxiliary pump, 6-port high pressure switching valve, and dual-directional ILSP cartridge containing bonded silica. The new procedure was evaluated as an entirely separate workflow using a simple solid-liquid extraction and as part of a cleanup strategy in conjunction with QuEChERS. Accuracy and precision experiments were conducted in spinach at two concentration levels (n = 6). Of the 63 pesticides tested, 86% (0.005 mg/kg) and 100% (0.05 mg/kg) displayed average recoveries within 70-120% and RSD values ≤20% for the ILSP method. In addition, low to moderate matrix effects (<50%) were calculated for 95% of the analytes. Overall performance of the proposed method was found to be better or comparable to a traditional QuEChERS procedure utilizing AOAC formulated salts and dSPE sorbents, while significantly reducing the amount of pigments reaching the MS source. The ILSP workflow is a simpler procedure with fewer steps that require less time than traditional extraction and cleanup techniques.
475
De-risking excipient particle size distribution variability with automated robust mixing: Integrating quality by design and process analytical technology. Eur J Pharm Biopharm 2020; 157:9-24. PMID: 33022392; DOI: 10.1016/j.ejpb.2020.09.014.
Abstract
BACKGROUND Particle size distribution (PSD) variability in excipients affects mixing. In response, manufacturers rely on raw material control and rigidly defined process parameters to achieve quality. However, this status quo is costly and diverges from regulatory expectations for process robustness. Although robustness improves cost and material usage efficiency, it remains under-adopted. METHOD To address this gap, a robust batch mixing operation that mitigated the impact of PSD variability was evaluated, with blends comprising chlorpheniramine, microcrystalline cellulose and lactose. The PSD of lactose was varied to simulate commercially-relevant variability. Due to PSD-induced rheological variations, the blends had different optimal mixing speeds. For the automation study, near infrared (NIR) spectroscopy; process optimization and endpoint detection algorithms; and control hardware were integrated within a cluster of software environments. NIR spectroscopy was employed for in-line PSD characterization and blend monitoring, to modulate mixing speed and detect endpoint (feedforward and feedback control). RESULTS NIR spectroscopy rapidly detected PSD variations by the 6th-9th rotations, activating feedforward control, which mitigated the effect of PSD variability and reduced the mixing time by 13-34%. Endpoints were correctly detected. PSD variations and blend homogeneity were accurately predicted (relative standard error of prediction ≤ 2%). CONCLUSION The automated robust mixing operation was successful. Pertinently, the NIR spectrometer can be adopted for multimodal sensing. Its applicability for production-driven characterization of raw materials in batch and continuous pharmaceutical processing should be further explored. Lastly, this study laid the groundwork for end-to-end implementation of process analytical technology in robust batch processing.
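A conceptual sketch of the feedforward/feedback logic described above is given below; the speed rule, RSD endpoint limit, and all readings are invented assumptions, not the study's calibration models or control parameters.

```python
import numpy as np

def select_speed(predicted_d50_um, nominal_rpm=20):
    """Feedforward: adjust paddle speed from the NIR-predicted lactose particle size (toy rule)."""
    if predicted_d50_um > 120:
        return nominal_rpm + 10
    if predicted_d50_um < 60:
        return nominal_rpm - 5
    return nominal_rpm

def endpoint_reached(api_predictions, window=6, rsd_limit=2.0):
    """Feedback: stop mixing once the moving RSD of NIR-predicted API content falls below a limit."""
    recent = np.asarray(api_predictions[-window:], float)
    if len(recent) < window:
        return False
    return 100 * recent.std(ddof=1) / recent.mean() < rsd_limit

speed = select_speed(predicted_d50_um=140)   # a coarse lactose lot raises the speed
history = []
for reading in [92.0, 104.0, 97.5, 101.0, 99.0, 100.5, 99.8, 100.2, 100.0, 99.9]:
    history.append(reading)                  # NIR-predicted API content, % of label claim
    if endpoint_reached(history):
        print(f"endpoint at rotation {len(history)} (speed {speed} rpm)")
        break
```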