76
Abstract
One challenge in biology is to make sense of the complexity of biological networks. A good system to approach this is signaling pathways, whose well-characterized molecular details allow us to relate the internal processes of each pathway to their input-output behavior. In this study, we analyzed mathematical models of three metazoan signaling pathways: the canonical Wnt, MAPK/ERK, and Tgfβ pathways. We find an unexpected convergence: the three pathways behave in some physiological contexts as linear signal transmitters. Testing the results experimentally, we present direct measurements of linear input-output behavior in the Wnt and ERK pathways. Analytics from each model further reveal that linearity arises through different means in each pathway, which we tested experimentally in the Wnt and ERK pathways. Linearity is a desired property in engineering where it facilitates fidelity and superposition in signal transmission. Our findings illustrate how cells tune different complex networks to converge on the same behavior.
77
|
Malkow T, Papakonstantinou G, Pilenga A, Grahl-Madsen L, Tsotridis G. Immittance Data Validation using Fast Fourier Transformation (FFT) Computation - Synthetic and Experimental Examples. ChemElectroChem 2018; 4:2771-2776. [PMID: 29577006 PMCID: PMC5861679 DOI: 10.1002/celc.201700629] [Received: 06/26/2017]
Abstract
Exact data from an electric circuit (EC) model of RLC elements (resistor, inductor, capacitor) representing the rational immittance of LTI (linear, time-invariant) systems are numerically Fourier transformed to demonstrate, within error bounds, the applicability of the Hilbert transform (HT) and Kramers-Kronig (KK) integral transform (KKT) method. Immittance spectroscopy (IS) data are validated for HT (KKT) compliance using non-equispaced fast Fourier transform (NFFT) computations. A failure of the HT (KKT) test may stem not only from non-compliance with causality, stability, or linearity, which are readily distinguished using anti-HT (anti-KKT) relations; it can also indicate a violation of uniform boundedness, which can be overcome either by using the singly or multiply subtracted KK transform (SSKK or MSKK) or by seeking the KKT of the same data set at a complementary immittance level. Experimental IS data of a fuel cell (FC) are also numerically HT (KKT) validated by NFFT to assess whether LTI principles are met. Figures of merit are suggested to measure success in the numerical validation of IS data.
78
|
Naveen P, Lingaraju HB, Deepak M, Medhini B, Prasad KS. Method Development and Validation for the Determination of Caffeine: An Alkaloid from Coffea arabica by High-performance Liquid Chromatography Method. Pharmacognosy Res 2018; 10:88-91. [PMID: 29568193 PMCID: PMC5855379 DOI: 10.4103/pr.pr_79_17]
Abstract
Objective: The present study was undertaken to develop and validate a reversed-phase high-performance liquid chromatography (RP-HPLC) method for the determination of caffeine from bean material of Coffea arabica. Materials and Methods: The separation was achieved on a reversed-phase C18 column using a mobile phase of water:methanol (50:50) at a flow rate of 1.0 ml min-1. Detection was carried out with a UV detector at 272 nm. The developed method was validated according to the International Conference on Harmonisation (ICH) guidelines, covering specificity, linearity, precision, accuracy, limit of detection, and limit of quantitation. Results: The method showed good linearity with an excellent correlation coefficient (R2 > 0.999). In repeatability and intermediate precision, the percentage relative standard deviation (% RSD) of peak area was less than 1%, indicating high precision. The recovery rate for caffeine was within 98.78%-101.28%, indicating high accuracy. The low limit of detection and limit of quantitation enable the detection and quantitation of caffeine from C. arabica at low concentrations. Conclusion: The developed HPLC method is simple, rapid, precise, and accurate, and is recommended for efficient assays in routine work. SUMMARY A simple, accurate, and sensitive HPLC method for caffeine from Coffea arabica has been developed and validated for linearity, specificity, precision, recovery, limit of detection, and limit of quantification according to the ICH guidelines. The results revealed that the proposed method is highly reliable and could be applied in routine quality-control work. Abbreviations used: C. arabica: Coffea arabica; ICH: International Conference on Harmonisation; % RSD: percentage relative standard deviation; R2: correlation coefficient; ppm: parts per million; LOD: limit of detection; LOQ: limit of quantification; SD: standard deviation; S: slope; RP-HPLC: reversed-phase high-performance liquid chromatography; v/v: volume per volume.
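The linearity and sensitivity figures this kind of ICH validation reports can be sketched from a calibration curve: R2 from a least-squares fit, and LOD = 3.3 σ/S, LOQ = 10 σ/S, where σ is the residual standard deviation and S the slope. The concentrations and peak areas below are made-up illustration data, not the study's measurements.

```python
import numpy as np

# Hypothetical calibration data: concentration (ppm) vs. HPLC peak area
conc = np.array([10.0, 20.0, 40.0, 60.0, 80.0, 100.0])
area = np.array([152.0, 301.0, 604.0, 902.0, 1203.0, 1499.0])

# Least-squares calibration line: area = slope * conc + intercept
slope, intercept = np.polyfit(conc, area, 1)
pred = slope * conc + intercept
r2 = 1 - np.sum((area - pred) ** 2) / np.sum((area - area.mean()) ** 2)

# ICH Q2-style sensitivity estimates from residual SD (sigma) and slope (S)
sigma = np.sqrt(np.sum((area - pred) ** 2) / (len(conc) - 2))
lod = 3.3 * sigma / slope   # limit of detection
loq = 10.0 * sigma / slope  # limit of quantitation

print(f"R^2 = {r2:.4f}, LOD = {lod:.2f} ppm, LOQ = {loq:.2f} ppm")
```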
79
|
Abstract
We have discussed many statistical tests and tools in this series of commentaries, and while we have mentioned the underlying assumptions of the tests, we have not explored them in detail. We stop to look at some of the assumptions of the t-test and linear regression, justify and explain them, mention what can go wrong when the assumptions are not met, and suggest some solutions in this case.
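The residual diagnostics this commentary discusses can be sketched with a few informal numeric checks on a fitted regression: residuals should be centred on zero, roughly symmetric (a normality proxy), and their spread should be unrelated to the fitted values (a homoscedasticity proxy). The simulated data below is constructed to satisfy the assumptions; thresholds are illustrative, not formal tests.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data meeting the assumptions: linear mean, homoscedastic normal errors
x = np.linspace(0, 10, 200)
y = 2.0 + 0.5 * x + rng.normal(0.0, 1.0, x.size)

slope, intercept = np.polyfit(x, y, 1)
fitted = slope * x + intercept
resid = y - fitted

# Informal assumption checks:
# 1) residuals centred on zero (holds by construction for OLS),
# 2) roughly symmetric residual distribution (sample skewness near 0),
# 3) residual spread unrelated to fitted values (correlation near 0)
skew = np.mean(resid**3) / np.std(resid) ** 3
het_corr = np.corrcoef(fitted, np.abs(resid))[0, 1]

print(f"mean resid = {resid.mean():+.3f}, skewness = {skew:+.3f}, "
      f"corr(|resid|, fitted) = {het_corr:+.3f}")
```

When these diagnostics fail, the remedies the commentary series points toward include transformations, robust alternatives, or nonparametric tests.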
80
|
Nikbakht N, Tafreshiha A, Zoccolan D, Diamond ME. Supralinear and Supramodal Integration of Visual and Tactile Signals in Rats: Psychophysics and Neuronal Mechanisms. Neuron 2018; 97:626-639.e8. [PMID: 29395913 PMCID: PMC5814688 DOI: 10.1016/j.neuron.2018.01.003] [Received: 08/08/2017] [Revised: 11/24/2017] [Accepted: 12/31/2017]
Abstract
To better understand how object recognition can be triggered independently of the sensory channel through which information is acquired, we devised a task in which rats judged the orientation of a raised, black and white grating. They learned to recognize two categories of orientation: 0° ± 45° (“horizontal”) and 90° ± 45° (“vertical”). Each trial required a visual (V), a tactile (T), or a visual-tactile (VT) discrimination; VT performance was better than that predicted by optimal linear combination of V and T signals, indicating synergy between sensory channels. We examined posterior parietal cortex (PPC) and uncovered key neuronal correlates of the behavioral findings: PPC carried both graded information about object orientation and categorical information about the rat’s upcoming choice; single neurons exhibited identical responses under the three modality conditions. Finally, a linear classifier of neuronal population firing replicated the behavioral findings. Taken together, these findings suggest that PPC is involved in the supramodal processing of shape. Highlights: rats combine vision and touch to distinguish two grating orientation categories; performance with vision and touch together reveals synergy between the two channels; posterior parietal cortex (PPC) neuronal responses are invariant to modality; PPC neurons carry information about object orientation and the rat’s categorization.
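The "optimal linear combination" benchmark against which the VT performance is judged can be sketched with signal detection theory: unimodal accuracies are converted to sensitivities d', combined as d'_VT = sqrt(d'_V^2 + d'_T^2), and converted back to a predicted accuracy. The unimodal accuracies below are assumed for illustration, not the paper's measurements.

```python
from statistics import NormalDist

# Predicted visuo-tactile (VT) accuracy under optimal linear cue combination:
# d'_VT = sqrt(d'_V^2 + d'_T^2). Unimodal accuracies are illustrative.
phi = NormalDist()

def dprime(pc):
    """Sensitivity index for a two-category task, d' = 2 * z(proportion correct)."""
    return 2 * phi.inv_cdf(pc)

pc_v, pc_t = 0.75, 0.72                    # unimodal accuracies (assumed)
d_vt_pred = (dprime(pc_v) ** 2 + dprime(pc_t) ** 2) ** 0.5
pc_vt_pred = phi.cdf(d_vt_pred / 2)

print(f"predicted optimal-linear VT accuracy = {pc_vt_pred:.3f}")
```

Measured VT accuracy exceeding this prediction is what the paper calls supralinear integration.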
81
|
A Pneumatic Tactile Sensor for Co-Operative Robots. SENSORS 2017; 17:s17112592. [PMID: 29125565 PMCID: PMC5712939 DOI: 10.3390/s17112592] [Received: 09/20/2017] [Revised: 11/03/2017] [Accepted: 11/06/2017]
Abstract
Tactile sensors with comprehensive functions are urgently needed for advanced robots that co-exist and co-operate with human beings. Pneumatic tactile sensors based on an air bladder possess noticeable advantages for human-robot interaction. In this paper, we construct a pneumatic tactile sensor and apply it to the fingertip of a robot hand to sense force, vibration, and slippage via changes in the pressure of the air bladder, and we use the sensor to perceive object features such as softness and roughness. The pneumatic tactile sensor has good linearity, repeatability, and low hysteresis, and both its size and sensing range can be customized by using different materials and thicknesses for the air bladder. It is also simple and cheap to fabricate. The sensor is therefore suitable for co-operative robots and can be widely used to improve the performance of service robots: applied to a robotic fingertip, it endows the hand with the ability to co-operate with humans and handle fragile objects thanks to the inherent compliance of the air bladder.
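The linearity and hysteresis figures such a sensor characterization reports are typically computed as worst-case deviations expressed as a percentage of full scale. A sketch on synthetic loading/unloading curves (the numbers are invented to mimic a low-hysteresis, near-linear sensor, not measured data):

```python
import numpy as np

# Illustrative loading/unloading curves for a pressure-based tactile sensor
force = np.linspace(0, 10, 11)                              # applied force, N
p_load = 1.00 * force + 0.02 * force * (10 - force) / 10    # slight bow
p_unload = p_load + 0.05 * np.sin(np.pi * force / 10)       # small hysteresis loop

full_scale = p_load[-1] - p_load[0]

# Nonlinearity: worst deviation of the loading curve from its best-fit line, % FS
k, b = np.polyfit(force, p_load, 1)
nonlin = np.max(np.abs(p_load - (k * force + b))) / full_scale * 100

# Hysteresis: worst loading/unloading gap at the same force, % FS
hyst = np.max(np.abs(p_unload - p_load)) / full_scale * 100

print(f"nonlinearity = {nonlin:.2f}% FS, hysteresis = {hyst:.2f}% FS")
```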
82
|
Calabrese EJ. The Mistaken Birth and Adoption of LNT: An Abridged Version. Dose Response 2017; 15:1559325817735478. [PMID: 29051718 PMCID: PMC5637971 DOI: 10.1177/1559325817735478] [Received: 08/18/2017] [Accepted: 08/22/2017]
Abstract
The historical foundations of cancer risk assessment were based on the discovery of X-ray-induced gene mutations by Hermann J. Muller, its transformation into the linear nonthreshold (LNT) single-hit theory, the recommendation of the model by the US National Academy of Sciences, Biological Effects of Atomic Radiation I, Genetics Panel in 1956, and subsequent widespread adoption by regulatory agencies worldwide. This article summarizes substantial recent historical revelations that profoundly challenge the standard and widely accepted history of cancer risk assessment, showing multiple significant scientific errors and incorrect interpretations, mixed with deliberate misrepresentation of the scientific record by leading ideologically motivated radiation geneticists. These novel historical findings demonstrate that the scientific foundations of the LNT single-hit model were seriously flawed and should not have been adopted for cancer risk assessment.
83
|
Ma XL, Song FF, Zhang H, Huan X, Li SY. Compositional Monosaccharide Analysis of Morus nigra Linn by HPLC and HPCE Quantitative Determination and Comparison of Polysaccharide from Morus nigra Linn by HPCE and HPLC. CURR PHARM ANAL 2017; 13:433-437. [PMID: 29213223 PMCID: PMC5684802 DOI: 10.2174/1573412913666170330150807] [Received: 05/12/2016] [Revised: 03/10/2017] [Accepted: 03/10/2017]
Abstract
BACKGROUND Morus nigra Linn has been treated as both a health food and a medicine in Chinese history. OBJECTIVE Here, we extracted the heteroglycan from the dry fruits and resolved it into its monosaccharides. METHOD After the heteroglycan was hydrolyzed, the monosaccharides were derivatized with 1-phenyl-3-methyl-5-pyrazolone (PMP) and then separated by HPCE and HPLC. HPCE used an uncoated capillary (d = 75 μm) with a borate buffer and PDA detection at 245 nm; the voltage was set at 15 kV and the capillary temperature at 25°C. HPLC used a Yilite column (Hypersil BDS C18, 5 μm, 4.6 mm × 250 mm) with UV detection at 245 nm at 25°C. The high-performance capillary electrophoresis and high-performance liquid chromatography methods were compared for monosaccharide composition and quantitative determination of polysaccharide from the black mulberry. RESULTS Across several indicators, including the standard curve, precision, reproducibility, stability, and recovery rate, the results of the two methods are basically consistent, and they have complementary advantages. CONCLUSION Both methods are suitable for the determination of the black mulberry polysaccharide, but each has its own merits.
84
|
Crump KS. Bogen's Critique of Linear-No-Threshold Default Assumptions. RISK ANALYSIS : AN OFFICIAL PUBLICATION OF THE SOCIETY FOR RISK ANALYSIS 2017; 37:1802-1807. [PMID: 27959476 DOI: 10.1111/risa.12748] [Received: 06/17/2016] [Accepted: 11/10/2016]
Abstract
In an article recently published in this journal, Bogen(1) concluded that an NRC committee's recommendations that default linear, nonthreshold (LNT) assumptions be applied to dose-response assessment for noncarcinogens and nonlinear mode of action carcinogens are not justified. Bogen criticized two arguments used by the committee for LNT: when any new dose adds to a background dose that explains background levels of risk (additivity to background or AB), or when there is substantial interindividual heterogeneity in susceptibility (SIH) in the exposed human population. Bogen showed by examples that the SIH claim can be false. Herein is outlined a general proof that confirms Bogen's claim. However, it is also noted that SIH leads to a nonthreshold population distribution even if individual distributions all have thresholds, and that small changes to SIH assumptions can result in LNT. Bogen criticizes AB because it only applies when there is additivity to background, but offers no help in deciding when or how often AB holds. Bogen does not contradict the fact that AB can lead to LNT but notes that, even if low-dose linearity results, the response at higher doses may not be useful in predicting the amount of low-dose linearity. Although this is theoretically true, it seems reasonable to assume that generally there is some quantitative relationship between the low-dose slope and the slope suggested at higher doses. Several incorrect or misleading statements by Bogen are noted.
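The additivity-to-background (AB) argument in this exchange has a simple mathematical core: if individual risk follows F and a background dose B already produces background risk, the population response to an added dose d is P(d) = F(B + d), which is locally linear in d with slope F'(B) even when F itself is steeply sigmoidal. A numeric sketch (the sigmoid and background dose are illustrative):

```python
import math

# Additivity-to-background demo: a quasi-threshold sigmoid dose-response F
# becomes locally linear in added dose d when d adds to background dose B.
def F(x, k=2.0, x0=5.0):
    return 1 / (1 + math.exp(-k * (x - x0)))   # sigmoid "threshold-like" DR

B = 3.0                                         # background dose (illustrative)
slope = (F(B + 1e-6) - F(B)) / 1e-6             # local slope F'(B)

for d in [0.01, 0.05, 0.1]:
    excess = F(B + d) - F(B)
    print(f"d={d:.2f}: excess risk={excess:.5f}, linear prediction={slope * d:.5f}")
```

The excess risk tracks the linear prediction closely at small d and drifts away as d grows, which is exactly why the dispute centres on when AB holds, not on whether it implies low-dose linearity when it does.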
85
|
Naveen P, Lingaraju HB, Anitha, Prasad KS. Simultaneous determination of rutin, isoquercetin, and quercetin flavonoids in Nelumbo nucifera by high-performance liquid chromatography method. Int J Pharm Investig 2017; 7:94-100. [PMID: 28929052 PMCID: PMC5553270 DOI: 10.4103/jphi.jphi_33_17]
Abstract
Objective: The present study was undertaken to provide documentary evidence for the determination of rutin, isoquercetin, and quercetin flavonoids from the flora of Nelumbo nucifera by reversed-phase high-performance liquid chromatography (RP-HPLC). Materials and Methods: RP-HPLC analysis was performed by low-pressure gradient elution using 0.5% acetic acid:acetonitrile as the mobile phase at a flow rate of 1.0 ml/min. The separation was done at 26°C on a Kinetex XB-C18 column as the stationary phase, with detection at 356 nm. The proposed method was validated per International Conference on Harmonisation guidelines with respect to specificity, linearity, precision, accuracy, limit of detection (LOD), and limit of quantification (LOQ). Results: The validated results were within the acceptable limits. For specificity, the retention times of the rutin, isoquercetin, and quercetin peaks in the sample matched the reference standard peaks with good resolution. Excellent linearity was obtained, with correlation coefficients (r) higher than 0.999. For precision, the repeatability and intermediate precision showed <1.0% relative standard deviation of peak area percentage, indicating high precision and accuracy. The recovery rates for rutin, isoquercetin, and quercetin were 99.85%-101.37%, 101.90%-103.24%, and 101.74%-106.73%, respectively. The low LOD and LOQ of rutin, isoquercetin, and quercetin enable the detection and quantitation of these flavonoids in N. nucifera at low concentrations. Conclusion: The developed analytical method is convenient for the determination of flavonoid content in herbal drugs.
86
|
Obuchowski NA, Bullen J. Quantitative imaging biomarkers: Effect of sample size and bias on confidence interval coverage. Stat Methods Med Res 2017; 27:3139-3150. [PMID: 29298603 DOI: 10.1177/0962280217693662]
Abstract
Introduction Quantitative imaging biomarkers (QIBs) are being increasingly used in medical practice and clinical trials. An essential first step in the adoption of a quantitative imaging biomarker is the characterization of its technical performance, i.e. precision and bias, through one or more performance studies. Then, given the technical performance, a confidence interval for a new patient's true biomarker value can be constructed. Estimating bias and precision can be problematic because rarely are both estimated in the same study, precision studies are usually quite small, and bias cannot be measured when there is no reference standard. Methods A Monte Carlo simulation study was conducted to assess factors affecting nominal coverage of confidence intervals for a new patient's quantitative imaging biomarker measurement and for change in the quantitative imaging biomarker over time. Factors considered include sample size for estimating bias and precision, effect of fixed and non-proportional bias, clustered data, and absence of a reference standard. Results Technical performance studies of a quantitative imaging biomarker should include at least 35 test-retest subjects to estimate precision and 65 cases to estimate bias. Confidence intervals for a new patient's quantitative imaging biomarker measurement constructed under the no-bias assumption provide nominal coverage as long as the fixed bias is <12%. For confidence intervals of the true change over time, linearity must hold and the slope of the regression of the measurements vs. true values should be between 0.95 and 1.05. The regression slope can be assessed adequately as long as fixed multiples of the measurand can be generated. Even small non-proportional bias greatly reduces confidence interval coverage. Multiple lesions in the same subject can be treated as independent when estimating precision. 
Conclusion Technical performance studies of quantitative imaging biomarkers require moderate sample sizes in order to provide robust estimates of bias and precision for constructing confidence intervals for new patients. Assumptions of linearity and non-proportional bias should be assessed thoroughly.
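The Monte Carlo approach the study describes can be sketched compactly: simulate repeated measurements of a known true value with fixed bias and measurement noise, build the confidence interval under a no-bias assumption, and count how often it covers the truth. All numbers below are illustrative; how much fixed bias is tolerable depends on its size relative to the within-subject precision.

```python
import numpy as np

rng = np.random.default_rng(1)

# Monte Carlo sketch of CI coverage for a new patient's QIB measurement
# Y = true value + fixed bias + noise, with the 95% CI built assuming no bias.
def coverage(bias_pct, true_val=100.0, wsd=5.0, n_trials=20000):
    bias = true_val * bias_pct / 100
    y = true_val + bias + rng.normal(0, wsd, n_trials)
    lo, hi = y - 1.96 * wsd, y + 1.96 * wsd       # no-bias 95% CI
    return np.mean((lo <= true_val) & (true_val <= hi))

for b in [0, 10, 25]:
    print(f"fixed bias {b:>2}% -> empirical coverage {coverage(b):.3f}")
```

With no bias the empirical coverage sits at the nominal 95%; as fixed bias grows relative to the within-subject SD, coverage collapses, which is the behaviour the study quantifies across its design factors.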
87
|
Wikenros C, Balogh G, Sand H, Nicholson KL, Månsson J. Mobility of moose-comparing the effects of wolf predation risk, reproductive status, and seasonality. Ecol Evol 2016; 6:8870-8880. [PMID: 28035275 PMCID: PMC5192942 DOI: 10.1002/ece3.2598] [Received: 03/09/2016] [Revised: 10/12/2016] [Accepted: 10/25/2016]
Abstract
In a predator–prey system, prey species may adapt to the presence of predators with behavioral changes such as increased vigilance, shifting habitats, or changes in their mobility. In North America, moose (Alces alces) have shown behavioral adaptations to the presence of predators, but such antipredator behavioral responses have not yet been found in Scandinavian moose in response to the recolonization of wolves (Canis lupus). We studied travel speed and direction of movement of GPS‐collared female moose (n = 26) in relation to spatiotemporal differences in wolf predation risk, reproductive status, and time of year. Travel speed was highest during the calving (May–July) and postcalving (August–October) seasons and was lower for females with calves than females without calves. Similarly, time of year and reproductive status affected the direction of movement, as more concentrated movement was observed for females with calves at heel during the calving season. We did not find support for wolf predation risk being an important factor affecting moose travel speed or direction of movement. Likely causes of the weak effect of wolf predation risk on mobility of moose include the high moose‐to‐wolf ratio and intensive hunter harvest of the moose population during the past century.
88
|
Yang C, Kim H. Linearized Programming of Memristors for Artificial Neuro-Sensor Signal Processing. SENSORS 2016; 16:s16081320. [PMID: 27548186 PMCID: PMC5017485 DOI: 10.3390/s16081320] [Received: 07/18/2016] [Revised: 08/11/2016] [Accepted: 08/11/2016]
Abstract
A linearized programming method for memristor-based neural weights is proposed. The memristor is known as an ideal element for implementing a neural synapse due to its embedded functions of analog memory and analog multiplication. Its resistance variation under a voltage input is generally a nonlinear function of time, and linearizing the memristance variation with time is very important for the ease of memristor programming. In this paper, a method utilizing an anti-serial architecture for linear programming is proposed. The anti-serial architecture is composed of two memristors with opposite polarities; it linearizes the variation of memristance through the complementary actions of the two memristors. To program a memristor, an additional memristor of opposite polarity is employed. The linearization effect on weight programming in an anti-serial architecture is investigated, and the memristor bridge synapse, built from two sets of anti-serial memristor architecture, is taken as an application example of the proposed method. Simulations are performed with memristors of both the linear drift model and a nonlinear model.
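The linearization mechanism can be illustrated with the linear drift memristor model: in an anti-serial pair the two devices change memristance in opposite directions, so the total series resistance (and hence the programming current) stays constant, making each memristance trajectory linear in time. A simulation sketch with illustrative parameters (not the paper's device values):

```python
import numpy as np

# Linear-drift memristor model: M(w) = Ron*w + Roff*(1-w), w in [0,1],
# dw/dt = k * i(t). In an anti-serial pair the devices have opposite polarity,
# so the series resistance stays constant and M1(t) becomes linear in time.
Ron, Roff, k, V, dt, steps = 100.0, 16e3, 1e4, 1.0, 1e-4, 4000

def simulate(anti_serial):
    w1, w2, hist = 0.5, 0.5, []
    for _ in range(steps):
        M1 = Ron * w1 + Roff * (1 - w1)
        M2 = Ron * w2 + Roff * (1 - w2)
        R = M1 + M2 if anti_serial else M1
        i = V / R
        w1 = min(1.0, max(0.0, w1 + k * i * dt))
        if anti_serial:                      # opposite polarity: w2 drifts down
            w2 = min(1.0, max(0.0, w2 - k * i * dt))
        hist.append(M1)
    return np.array(hist)

def nonlinearity(M):
    """Max deviation of M(t) from its best-fit line, as a fraction of its span."""
    t = np.arange(M.size)
    fit = np.polyval(np.polyfit(t, M, 1), t)
    return np.max(np.abs(M - fit)) / (M.max() - M.min())

print(f"single memristor nonlinearity: {nonlinearity(simulate(False)):.4f}")
print(f"anti-serial pair nonlinearity: {nonlinearity(simulate(True)):.4f}")
```

The single device accelerates as its resistance falls (current grows), while the anti-serial trajectory stays essentially perfectly linear, which is the property the programming method exploits.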
89
|
Cao Y, Brady GJ, Gui H, Rutherglen C, Arnold MS, Zhou C. Radio Frequency Transistors Using Aligned Semiconducting Carbon Nanotubes with Current-Gain Cutoff Frequency and Maximum Oscillation Frequency Simultaneously Greater than 70 GHz. ACS NANO 2016; 10:6782-6790. [PMID: 27327074 DOI: 10.1021/acsnano.6b02395]
Abstract
In this paper, we report record radio frequency (RF) performance of carbon nanotube transistors based on the combined use of a self-aligned T-shape gate structure and well-aligned, high-semiconducting-purity, high-density polyfluorene-sorted semiconducting carbon nanotubes, which were deposited using a dose-controlled, floating evaporative self-assembly method. These transistors show outstanding direct current (DC) performance with an on-current density of 350 μA/μm, transconductance as high as 310 μS/μm, and superior current saturation with normalized output resistance greater than 100 kΩ·μm. These transistors set a record for carbon nanotube RF transistors, demonstrating both a current-gain cutoff frequency (ft) and a maximum oscillation frequency (fmax) greater than 70 GHz. Furthermore, these transistors exhibit good linearity performance with a 1 dB gain compression point (P1dB) of 14 dBm and an input third-order intercept point (IIP3) of 22 dBm. Our study advances the state of the art of carbon nanotube RF electronics, which have the potential to be made flexible and may find broad applications in signal amplification, wireless communication, and wearable/flexible electronics.
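The two linearity figures reported here, P1dB and IIP3, are linked for a simple memoryless third-order amplifier model by the classic "IIP3 is about 9.6 dB above P1dB" rule of thumb (the paper's measured gap of 8 dB is in that neighbourhood). A sketch with arbitrary illustrative coefficients g1 and g3, not values fitted to the device:

```python
import math

# Memoryless third-order model of a compressive amplifier: y(x) = g1*x - g3*x^3
g1, g3 = 10.0, 1.0   # illustrative gain coefficients

# Input amplitude where the small-signal gain has dropped by 1 dB:
# gain(A)/g1 = 1 - (3*g3/(4*g1))*A^2 = 10**(-1/20)
a_1db = math.sqrt((4 * g1) / (3 * g3) * (1 - 10 ** (-1 / 20)))

# Third-order intercept amplitude from two-tone intermodulation analysis
a_iip3 = math.sqrt(4 * g1 / (3 * g3))

delta_db = 20 * math.log10(a_iip3 / a_1db)
print(f"IIP3 sits {delta_db:.2f} dB above P1dB in this model")
```

Notably, the gap is independent of g1 and g3 in this model; real devices deviate from it when higher-order or memory effects matter.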
90
|
Wyrobek AJ, Britten RA. Individual variations in dose response for spatial memory learning among outbred Wistar rats exposed from 5 to 20 cGy of (56)Fe particles. ENVIRONMENTAL AND MOLECULAR MUTAGENESIS 2016; 57:331-340. [PMID: 27237589 DOI: 10.1002/em.22018] [Received: 01/21/2016] [Accepted: 04/13/2016]
Abstract
Exposures of brain tissue to ionizing radiation can lead to persistent deficits in cognitive functions and behaviors. However, little is known about the quantitative relationships between exposure dose and neurological risks, especially for lower doses and among genetically diverse individuals. We investigated the dose relationship for spatial memory learning among genetically outbred male Wistar rats exposed to graded doses of (56)Fe particles (sham, 5, 10, 15, and 20 cGy; 1 GeV/n). Spatial memory learning was assessed on a Barnes maze using REL3 ratios measured at three months after exposure. Irradiated animals showed dose-dependent declines in spatial memory learning that were fit by a linear regression (P for slope <0.0002). The irradiated animals showed significantly impaired learning at 10 cGy exposures, no detectable learning between 10 and 15 cGy, and worsened performances between 15 and 20 cGy. The proportions of poor learners and the magnitude of their impairment were fit by linear regressions with doubling doses of ∼10 cGy. In contrast, there were no detectable deficits in learning among the good learners in this dose range. Our findings suggest that genetically diverse individuals can vary substantially in their spatial memory learning, and that exposures at low doses appear to preferentially impact poor learners. This hypothesis invites future investigations of the genetic and physiological mechanisms of inter-individual variations in brain function related to spatial memory learning after low-dose HZE radiation exposures and to determine whether it also applies to physical trauma to brain tissue and exposures to chemical neurotoxicants. Environ. Mol. Mutagen. 57:331-340, 2016. © 2016 Wiley Periodicals, Inc.
91
|
Chen J, Zhang LS, Feng ZH. High-fidelity AFM scanning stage based on multilayer ceramic capacitors. SCANNING 2016; 38:184-190. [PMID: 26367125 DOI: 10.1002/sca.21253] [Received: 05/22/2015] [Revised: 07/02/2015] [Accepted: 07/13/2015]
Abstract
A kind of multilayer ceramic capacitor (MLCC) has been verified to have good micro-actuating properties, making these capacitors good candidates for nano-positioning. In this paper, we successfully employed MLCCs as lateral scanners for a tripod scanning stage. The MLCC-based lateral scanners display hysteresis under 1.5% and nonlinearity less than 2%, even with the simplest open-loop voltage drive. The developed scanning stage was integrated into a commercial AFM to evaluate its imaging performance. Experimental results showed that sample images with high fidelity were obtained. SCANNING 38:184-190, 2016. © 2015 Wiley Periodicals, Inc.
92
|
Nithya L, Raj NAN, Rathinamuthu S. Analyzing the characteristics of 6 MV photon beam at low monitor unit settings. J Med Phys 2016; 41:34-7. [PMID: 27051168 PMCID: PMC4795415 DOI: 10.4103/0971-6203.177285]
Abstract
Analyzing the characteristics of low monitor unit (MU) settings is essential, particularly for intensity-modulated techniques. Intensity modulation can be achieved through intensity-modulated radiotherapy (IMRT) or volumetric-modulated arc therapy (VMAT), and segments of IMRT and VMAT plans may carry low MUs. The minimum MU/segment must be set by the physicist in the treatment planning system at the time of commissioning. In this study, characteristics such as dose linearity, stability, flatness, and symmetry of the 6 MV photon beam of a Synergy linear accelerator at low MU settings were investigated for different dose rates. The measurements were performed using a slab phantom with an FC65-G chamber and Profiler 2. MU linearity was studied for 1–100 MU using a field size of 10 cm × 10 cm. The linearity error for 1 MU was 4.2%, and beam flatness deteriorated at the 1 MU setting, whereas beam stability and symmetry were well within specification. From this study, we conclude that treatments delivered with <3 MU may result in uncertainty in dose delivery. To ensure correct dose delivery with less uncertainty, it is recommended to use ≥3 MU as the minimum MU per segment in IMRT and VMAT plans.
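The linearity error quoted for such a study is typically the percentage deviation of dose-per-MU at each setting from its value at a reference setting (100 MU here). The chamber readings below are invented for illustration, chosen only so that the 1 MU error matches the 4.2% figure the abstract reports.

```python
# Dose-per-MU linearity check: linearity error at a given MU setting is the
# percentage deviation of dose-per-MU from the reference (100 MU) value.
# Readings are illustrative, not measured data from the study.
readings = {1: 1.042, 2: 2.030, 5: 5.05, 10: 10.04, 50: 50.1, 100: 100.0}  # nC

ref = readings[100] / 100                 # reference dose-per-MU
for mu, reading in sorted(readings.items()):
    err = (reading / mu - ref) / ref * 100
    print(f"{mu:>3} MU: linearity error = {err:+.1f}%")
```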
93
|
Obuchowski NA, Buckler A, Kinahan P, Chen-Mayer H, Petrick N, Barboriak DP, Bullen J, Barnhart H, Sullivan DC. Statistical Issues in Testing Conformance with the Quantitative Imaging Biomarker Alliance (QIBA) Profile Claims. Acad Radiol 2016; 23:496-506. [PMID: 26898527 DOI: 10.1016/j.acra.2015.12.020] [Received: 09/13/2015] [Revised: 12/11/2015] [Accepted: 12/22/2015]
Abstract
A major initiative of the Quantitative Imaging Biomarker Alliance is to develop standards-based documents called "Profiles," which describe one or more technical performance claims for a given imaging modality. The term "actor" denotes any entity (device, software, or person) whose performance must meet certain specifications for the claim to be met. The objective of this paper is to present the statistical issues in testing actors' conformance with the specifications. In particular, we present the general rationale and interpretation of the claims, the minimum requirements for testing whether an actor achieves the performance requirements, the study designs used for testing conformity, and the statistical analysis plan. We use three examples to illustrate the process: apparent diffusion coefficient in solid tumors measured by MRI, change in Perc 15 as a biomarker for the progression of emphysema, and percent change in solid tumor volume by computed tomography as a biomarker for lung cancer progression.
94
|
Bogen KT. Linear-No-Threshold Default Assumptions for Noncancer and Nongenotoxic Cancer Risks: A Mathematical and Biological Critique. RISK ANALYSIS : AN OFFICIAL PUBLICATION OF THE SOCIETY FOR RISK ANALYSIS 2016; 36:589-604. [PMID: 26249816 DOI: 10.1111/risa.12460]
Abstract
To improve U.S. Environmental Protection Agency (EPA) dose-response (DR) assessments for noncarcinogens and for nonlinear mode-of-action (MOA) carcinogens, the 2009 NRC Science and Decisions Panel recommended that the adjustment-factor approach traditionally applied to these endpoints be replaced by a new default assumption that both endpoints have linear-no-threshold (LNT) population-wide DR relationships. The panel claimed this new approach is warranted because population DR is LNT when any new dose adds to a background dose that explains background levels of risk, and/or when there is substantial interindividual heterogeneity in susceptibility in the exposed human population. Mathematically, however, the first claim is either false or effectively meaningless, and the second claim is false. Any dose- and population-response relationship that is statistically consistent with an LNT relationship may instead be an additive mixture of just two quasi-threshold DR relationships, which jointly exhibit low-dose S-shaped, quasi-threshold nonlinearity just below the lower end of the observed "linear" dose range. In this case, LNT extrapolation would necessarily overestimate increased risk by increasingly large relative magnitudes at diminishing values of above-background dose. That chemically induced apoptotic cell death occurs by unambiguously nonlinear, quasi-threshold DR mechanisms is apparent from recent data concerning this quintessential toxicity endpoint. The 2009 NRC Science and Decisions Panel's claims and recommendations that default LNT assumptions be applied to DR assessment for noncarcinogens and nonlinear-MOA carcinogens are therefore not justified either mathematically or biologically.
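The mathematical point above — that LNT extrapolation from an observed dose range can badly overestimate low-dose risk under a quasi-threshold dose-response — is easy to illustrate numerically. A minimal sketch using an illustrative Hill-type curve (the parameters are invented, not taken from the paper):

```python
# Illustrative quasi-threshold (Hill-type) dose-response curve.
# k and n are invented parameters, not values from the paper.
def hill(dose, k=5.0, n=4.0):
    return dose**n / (dose**n + k**n)

anchor = 5.0                    # a dose inside the hypothetical observed range
slope = hill(anchor) / anchor   # LNT line through the origin and the anchor point

low_dose = 0.5
lnt_risk = slope * low_dose           # risk predicted by LNT extrapolation
true_risk = hill(low_dose)            # risk under the quasi-threshold curve
overestimate = lnt_risk / true_risk   # relative overestimation at low dose
```

With these invented numbers `overestimate` is roughly 500, matching the abstract's point that the relative overestimation grows as the above-background dose shrinks.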
|
95
|
Zhang Z. Model building strategy for logistic regression: purposeful selection. Ann Transl Med 2016; 4:111. [PMID: 27127764 PMCID: PMC4828741 DOI: 10.21037/atm.2016.02.15] [Citation(s) in RCA: 239] [Impact Index Per Article: 29.9] [Indexed: 02/06/2023]
Abstract
Logistic regression is one of the most commonly used models for adjusting for confounders in the medical literature. This article introduces how to carry out the purposeful-selection model-building strategy in R. I emphasize the use of the likelihood ratio test to assess whether deleting a variable has a significant impact on model fit. A deleted variable should also be checked for whether it is an important adjuster of the remaining covariates. Interactions should be examined to disentangle complex relationships between covariates and their synergistic effects on the response variable. Finally, the model should be checked for goodness of fit (GOF), that is, how well the fitted model reflects the real data. The Hosmer-Lemeshow GOF test is the most widely used for logistic regression models.
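The likelihood ratio test at the heart of purposeful selection compares the maximized log-likelihoods of nested models. The abstract works in R; below is a self-contained Python sketch on synthetic data (variable names and data are invented for illustration; a packaged fit such as statsmodels' `Logit` would give the same log-likelihoods):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 500
age = rng.normal(50, 10, n)          # covariate with a true effect
unrelated = rng.normal(0, 1, n)      # candidate covariate with no true effect
eta_true = -10 + 0.2 * age
y = rng.binomial(1, 1 / (1 + np.exp(-eta_true)))

def fit_logit(X, y):
    """Newton-Raphson maximum-likelihood logistic fit; returns (beta, log-likelihood)."""
    beta = np.zeros(X.shape[1])
    for _ in range(50):
        eta = X @ beta
        mu = 1 / (1 + np.exp(-eta))
        W = mu * (1 - mu)
        step = np.linalg.solve(X.T @ (X * W[:, None]), X.T @ (y - mu))
        beta += step
        if np.max(np.abs(step)) < 1e-10:
            break
    eta = X @ beta
    return beta, np.sum(y * eta - np.logaddexp(0, eta))  # Bernoulli log-likelihood

ones = np.ones(n)
_, ll_full = fit_logit(np.column_stack([ones, age, unrelated]), y)
_, ll_reduced = fit_logit(np.column_stack([ones, age]), y)

# Likelihood ratio test: does deleting `unrelated` significantly worsen fit?
lr_stat = 2 * (ll_full - ll_reduced)
p_value = stats.chi2.sf(lr_stat, df=1)
```

If `p_value` is large, deleting the candidate variable does not significantly worsen fit; before dropping it, purposeful selection also checks whether its removal materially shifts the remaining coefficients.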
|
96
|
Chen J, Yu Q, Zhu Z, Peng Y, Fang F. Spatial summation revealed in the earliest visual evoked component C1 and the effect of attention on its linearity. J Neurophysiol 2015; 115:500-9. [PMID: 26561595 DOI: 10.1152/jn.00044.2015] [Citation(s) in RCA: 8] [Impact Index Per Article: 0.9] [Received: 01/15/2015] [Accepted: 11/10/2015] [Indexed: 11/22/2022]
Abstract
In natural scenes, multiple objects are usually presented simultaneously. How do specific areas of the brain respond to multiple objects based on their responses to each individual object? Previous functional magnetic resonance imaging (fMRI) studies have shown that the activity induced by a multiobject stimulus in the primary visual cortex (V1) can be predicted by the linear or nonlinear sum of the activities induced by its component objects. However, there has been little evidence from electroencephalogram (EEG) studies so far. Here we explored how V1 responded to multiple objects by comparing the EEG signals evoked by a three-grating stimulus with those evoked by its two components (the central grating and two flanking gratings). We focused on the earliest visual component C1 (onset latency of ∼50 ms) because it has been shown to reflect the feedforward responses of neurons in V1. We found that when the stimulus was unattended, the amplitude of the C1 evoked by the three-grating stimulus roughly equaled the sum of the amplitudes of the C1s evoked by its two components, regardless of the distances between the gratings. When the stimulus was attended, this linear spatial summation existed only when the three gratings were far apart from each other. When the three gratings were close to each other, the spatial summation became compressed. These results suggest that the earliest visual responses in V1 follow a linear summation rule when attention is not involved and that attention can affect the earliest interactions between multiple objects.
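The linear-summation comparison described above reduces to simple arithmetic on component amplitudes. A toy sketch with invented microvolt values:

```python
# Invented C1 amplitudes (microvolts); C1 is negative-going at posterior electrodes
c1_center = -1.2      # central grating presented alone
c1_flankers = -1.9    # two flanking gratings presented alone
c1_compound = -3.0    # all three gratings presented together

predicted = c1_center + c1_flankers         # linear-summation prediction
summation_ratio = c1_compound / predicted   # ~1 => linear; <1 => compressed
```

A ratio near 1 corresponds to the unattended and far-apart conditions in the abstract; a ratio clearly below 1 corresponds to the compressed summation seen when attended gratings are close together.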
|
97
|
McNair E, Qureshi AM, Bally C. Performance Evaluation of the Plateletworks® in the Measurement of Blood Cell Counts as compared to the Beckman Coulter Unicel DXH 800. J Extra Corpor Technol 2015; 47:113-118. [PMID: 26405360 PMCID: PMC4557548] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Received: 12/16/2014] [Accepted: 07/01/2015] [Indexed: 06/05/2023]
Abstract
Prior to undergoing cardiac surgery, many patients may have impaired platelet function due to platelet inhibition. Point-of-care testing (POCT) that produces quick results for platelet counts and function allows earlier clinician interpretation, diagnosis, and treatment. Before being adopted for routine clinical use, a POCT device's performance must be evaluated against standard laboratory techniques to ensure high-quality results. The purpose of this study is to determine the performance of the Plateletworks® BC 3200 automated hematology analyzer by correlating its precision, accuracy, and linearity for the measurement of blood counts with our hospital central laboratory analyzer (Beckman Coulter Unicel DXH 800). The study utilizes well-described methods for within-run and day-to-day precision, comparison of methods (bias), and linearity. Control samples from the manufacturer were used for the precision studies, blood samples from 115 cardiac surgical subjects were used for comparison of methods and accuracy, and pre-diluted control samples from the manufacturer were used for the linearity studies. The precision of the Plateletworks® analyzer was acceptable: the overall coefficient of variation (CV) for the measured parameters at all levels of control ranged from 0.65% to 6.4% for within-run precision and from 1.45% to 6.7% for day-to-day precision. The correlation and accuracy between the two analyzers for the evaluated parameters (platelets, red blood cells, white blood cells, and hemoglobin) were acceptable. The linearity for the measured parameters was also acceptable, ranging between 98% and 100%. The performance of the Plateletworks® analyzer was acceptable for providing blood cell counts as compared with our central hospital laboratory analyzer.
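Within-run CVs like those reported above are computed from replicate control measurements as the sample standard deviation relative to the mean. A minimal sketch (the replicate values are invented):

```python
import numpy as np

def cv_percent(replicates):
    """Coefficient of variation: sample SD as a percentage of the mean."""
    x = np.asarray(replicates, dtype=float)
    return x.std(ddof=1) / x.mean() * 100

# Invented platelet counts (x10^9/L) for one control level, measured five times
platelet_replicates = [248, 252, 250, 247, 253]
within_run_cv = cv_percent(platelet_replicates)  # here ~1.0%
```

Day-to-day precision uses the same formula applied to one measurement of the control material per day across the study period.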
|
98
|
Raunig DL, McShane LM, Pennello G, Gatsonis C, Carson PL, Voyvodic JT, Wahl RL, Kurland BF, Schwarz AJ, Gönen M, Zahlmann G, Kondratovich MV, O'Donnell K, Petrick N, Cole PE, Garra B, Sullivan DC. Quantitative imaging biomarkers: a review of statistical methods for technical performance assessment. Stat Methods Med Res 2015; 24:27-67. [PMID: 24919831 PMCID: PMC5574197 DOI: 10.1177/0962280214537344] [Citation(s) in RCA: 241] [Impact Index Per Article: 26.8] [Indexed: 12/13/2022]
Abstract
Technological developments and greater rigor in the quantitative measurement of biological features in medical images have given rise to an increased interest in using quantitative imaging biomarkers to measure changes in these features. Critical to the performance of a quantitative imaging biomarker in preclinical or clinical settings are three primary metrology areas of interest: measurement linearity and bias, repeatability, and the ability to consistently reproduce equivalent results when conditions change, as would be expected in any clinical trial. Unfortunately, performance studies to date differ greatly in design, analysis methods, and the metrics used to assess a quantitative imaging biomarker for clinical use. It is therefore difficult or impossible to integrate results from different studies or to use reported results to design new studies. The Radiological Society of North America and the Quantitative Imaging Biomarker Alliance, with technical, radiological, and statistical experts, developed a set of technical performance analysis methods, metrics, and study designs that provide terminology and methods consistent with widely accepted metrological standards. This document provides a consistent framework for the conduct and evaluation of quantitative imaging biomarker performance studies so that results from multiple studies can be compared, contrasted, or combined.
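Two of the metrology quantities named above can be computed directly from test-retest and phantom data. A sketch assuming paired replicates and known truth values (all numbers are invented); RC = 2.77·wSD is a common convention for the 95% repeatability coefficient, where wSD is the within-subject standard deviation:

```python
import numpy as np

# Invented test-retest measurements (rows: cases, cols: two replicates each)
reps = np.array([[10.1, 10.4], [20.3, 19.8], [15.0, 15.5], [30.2, 29.6]])
d = reps[:, 0] - reps[:, 1]
wsd = np.sqrt(np.mean(d**2) / 2)   # within-subject SD from paired replicates
rc = 2.77 * wsd                    # repeatability coefficient (95% limit)

# Linearity and bias: regress measured values against known truth (phantom data)
truth = np.array([1.0, 2.0, 4.0, 8.0])
measured = np.array([1.1, 2.1, 3.9, 8.2])
slope, intercept = np.polyfit(truth, measured, 1)  # slope ~1, intercept ~0 => linear, unbiased
```

Two replicate measurements on the same case are expected to differ by more than RC only about 5% of the time; a slope near 1 with a near-zero intercept indicates a linear, unbiased measurement over the tested range.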
|
99
|
Linden MA, Sedgewick GJ, Ericson M. An innovative method for obtaining consistent images and quantification of histochemically stained specimens. J Histochem Cytochem 2015; 63:233-43. [PMID: 25575568 DOI: 10.1369/0022155415568996] [Citation(s) in RCA: 9] [Impact Index Per Article: 1.0] [Indexed: 11/22/2022]
Abstract
Obtaining digital images of color brightfield microscopy is an important aspect of biomedical research and the clinical practice of diagnostic pathology. Although the field of digital pathology has seen tremendous advances in whole-slide imaging systems, little effort has been directed toward standardizing color brightfield digital imaging to maintain image-to-image consistency and tonal linearity. Using a single camera and microscope to obtain digital images of three stains, we show that microscope and camera systems inherently produce image-to-image variation. Moreover, we demonstrate that post-processing with a widely used raster graphics editor does not completely correct for session-to-session inconsistency. We introduce a reliable method for creating consistent images using a hardware/software solution (ChromaCal™; Datacolor Inc., NJ) that standardizes color, preserves linear tonal levels, provides automated white balancing, and sets brightness automatically to consistent levels. The resulting image consistency also streamlines mean-density and morphometry measurements, as images are easily segmented and single thresholds can be used. We suggest that this is a superior method for color brightfield imaging, which can be used for quantification and can be readily incorporated into workflows.
|
100
|
Zhu-Shimoni J, Yu C, Nishihara J, Wong RM, Gunawan F, Lin M, Krawitz D, Liu P, Sandoval W, Vanderlaan M. Host cell protein testing by ELISAs and the use of orthogonal methods. Biotechnol Bioeng 2014; 111:2367-79. [PMID: 24995961 DOI: 10.1002/bit.25327] [Citation(s) in RCA: 113] [Impact Index Per Article: 11.3] [Received: 04/10/2014] [Revised: 06/06/2014] [Accepted: 07/02/2014] [Indexed: 02/05/2023]
Abstract
Host cell proteins (HCPs) are among the process-related impurities monitored during recombinant protein pharmaceutical process development. The challenges of HCP detection include (1) low levels of residual HCPs in the presence of a large excess of product protein, (2) the need to measure a large number of different protein analytes with a single assay, and (3) a population of HCP species that may change during process development. Suitable methods for measuring process-related impurities are needed to support process development, process validation, and control system testing. A multi-analyte enzyme-linked immunosorbent assay (ELISA) is the workhorse method for HCP testing due to its high throughput, sensitivity, and selectivity. However, because the anti-HCP antibodies, the critical reagents for the HCP ELISA, do not comprehensively recognize all HCP species, it is especially important to ensure that weakly immunoreactive and non-immunoreactive HCPs are not overlooked by the ELISA. In some cases, a limited amount of antibodies to particular HCP species, or antigen excess, causes dilution-dependent non-linearity in a multi-product HCP ELISA. In our experience, correct interpretation of assay data can in some cases lead to isolation and identification of HCPs that co-purify with the product. Moreover, even if antibodies for a particular HCP are present in the reagent, the corresponding HCP may not be readily detected in the ELISA due to antibody/antigen binding conditions and the availability of HCP epitopes. This report reviews the use of the HCP ELISA, discusses its limitations, and demonstrates the importance of orthogonal methods, including mass spectrometry, to complement the platform HCP ELISA for support of process development. In addition, risk and impact assessment for low-level HCPs is also outlined, with consideration of clinical information.
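The dilution-dependent non-linearity mentioned above is typically screened by checking that dilution-corrected HCP estimates agree across a dilution series. A minimal sketch (the concentrations and the ±20% acceptance band are illustrative, not from the paper):

```python
import numpy as np

# Invented HCP ELISA dilution series for one sample
dilutions = np.array([1, 2, 4, 8])                    # fold dilution
measured_ng_ml = np.array([100.0, 48.0, 20.0, 7.0])   # HCP readback at each dilution

# Dilution-corrected values should agree if the assay dilutes linearly
corrected = measured_ng_ml * dilutions
recovery = corrected / corrected[0] * 100   # % of the neat-sample estimate

# A common screen: flag dilutional non-linearity when recovery drifts beyond ±20%
nonlinear = np.any(np.abs(recovery - 100) > 20)
```

Drift in recovery across the series, as in this example, flags dilutional non-linearity and prompts investigation of antibody limitation or antigen excess.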
|