1. Heroin-induced suppression of saccharin intake in OPRM1 A118G mice. Brain Res Bull 2017; 138:73-79. [PMID: 28939474] [DOI: 10.1016/j.brainresbull.2017.09.008] [Received: 05/31/2017] [Revised: 09/12/2017] [Accepted: 09/15/2017]
Abstract
The single nucleotide polymorphism of the μ-opioid receptor, OPRM1 A118G, has been associated with greater drug and alcohol use, increased sensitivity to pain, and reduced sensitivity to the antinociceptive effects of opiates. In the present studies, we employed a 'humanized' mouse model containing the wild-type (118AA) or variant (118GG) allele to examine behavior in a model of heroin-induced devaluation of an otherwise palatable saccharin cue when repeated saccharin-heroin pairings occurred every 24 h (Experiment 1) or every 48 h (Experiment 2). The results showed that, while both the 118AA and 118GG mice demonstrated robust avoidance of the heroin-paired saccharin cue following daily taste-drug pairings, only the 118AA mice suppressed intake of the heroin-paired saccharin cue when 48 h elapsed between each taste-drug pairing. Humanized 118GG mice, then, defend their intake of the sweet cue despite saccharin-heroin pairings, an effect that is illuminated by the use of spaced, rather than massed, trials. Given that this pattern of strain difference is not evident with saccharin-cocaine pairings (Freet et al., 2015), the reduced avoidance of the heroin-paired saccharin cue by the 118GG mice may be due to an interaction between the opiate and the subjects' drive for the sweet or, alternatively, to differential downstream sensitivity to the aversive kappa-mediated properties of the drug. These alternative hypotheses are addressed.
2. Once is too much: Early development of the opponent process in taste reactivity behavior is associated with later escalation of cocaine self-administration in rats. Brain Res Bull 2017; 138:88-95. [PMID: 28899796] [DOI: 10.1016/j.brainresbull.2017.09.002] [Received: 06/05/2017] [Revised: 08/29/2017] [Accepted: 09/06/2017]
Abstract
Evidence suggests that the addiction process may begin immediately in some vulnerable subjects. Specifically, some rats have been shown to exhibit aversive taste reactivity (gapes) following the intraoral delivery of a cocaine-predictive taste cue after as few as 1-2 taste-drug pairings. After only 3-4 trials, the number of gapes becomes a reliable predictor of later cocaine self-administration. Given that escalation of drug-taking behavior over time is recognized as a key feature of substance use disorder (SUD) and addiction, the present study examined the relationship between early aversion to the cocaine-predictive flavor cue and later escalation of cocaine self-administration in an extended-access paradigm. The data show that rats that exhibit the greatest conditioned aversion early in training to the intraorally delivered cocaine-paired cue exhibit the greatest escalation of cocaine self-administration over 15 extended-access trials. This finding suggests that early onset of the conditioned opponent process (i.e., the near immediate shift from ingestion to rejection of the drug-paired cue) is a reliable predictor of future vulnerability and resilience to cocaine addiction-like behavior. Future studies must determine the underlying neural mechanisms associated with this early transition and, hence, with early vulnerability to the later development of SUD and addiction. In so doing, we shall be in a position to discover novel diagnostics and novel avenues of prevention and treatment.
3. Preweaning iron deficiency increases non-contingent responding during cocaine self-administration in rats. Physiol Behav 2016; 167:282-288. [PMID: 27640134] [DOI: 10.1016/j.physbeh.2016.09.007] [Received: 05/24/2016] [Revised: 08/12/2016] [Accepted: 09/12/2016]
Abstract
Iron deficiency (ID) is the most prevalent single-nutrient deficiency worldwide. There is evidence that ID early in development (preweaning in rat) causes irreversible neurologic, behavioral, and motor development deficits. Many of these effects have been attributed to damage to dopamine systems, including ID-induced changes in transporter and receptor numbers in the striatum and nucleus accumbens. These mesolimbic dopaminergic neurons are, in part, responsible for mediating reward and thus play a key role in addiction. However, there has been relatively little investigation into the behavioral effects of ID on drug addiction. In 2002, we found that rats made ID from weaning (postnatal day 21) and throughout the experiment acquired cocaine self-administration significantly more slowly than controls and failed to increase responding when the dose of the drug was decreased. In the present study, we assessed addiction-like behavior for self-administered cocaine in rats with a history of ID restricted to postnatal days 4 through 21, with an iron-replete diet thereafter. The results showed that while ID did not affect the number of cocaine infusions or the overall addiction-like behavior score, ID rats scored higher on a measure of continued responding for drug than did iron-replete controls. This increase in responding was less goal-directed, however, as ID rats also responded more quickly to the non-rewarded manipulandum than did control rats. Thus, while ID early in infancy did not significantly increase addiction-like behaviors for cocaine in this small study, the pattern of data suggests a possible underlying learning or performance impairment. Future studies will be needed to elucidate the exact neuro-behavioral deficits that lead to the increase in indiscriminate responding for drug in rats with a history of perinatal ID.
4. Drug-motivated behavior in rats with lesions of the thalamic orosensory area. Behav Neurosci 2015; 130:103-13. [PMID: 26653714] [DOI: 10.1037/bne0000114]
Abstract
Rats suppress intake of a palatable taste cue when it is paired with a rewarding or an aversive stimulus in appetitive or aversive conditioning, respectively. A similar phenomenon occurs with drugs of abuse, but the nature of this conditioning has been a subject of debate. While relatively little is known about the underlying neural circuitry, we recently reported that bilateral lesions of the thalamic trigeminal orosensory area isolate drug-induced suppression of intake of a taste cue. The lesion blocks avoidance of the taste cue when paired with experimenter-delivered drugs of abuse, yet has no effect on avoidance of the same cue when paired with an aversive agent or when it predicts access to a highly palatable sucrose solution. We hypothesize that the lesion may blunt the rewarding properties of the drug. To test this, we used a runway apparatus, as running speed has been shown to increase with increasing reward value. Our hypothesis was supported by the failure of lesioned rats to increase running speed for morphine. Interestingly, lesioned rats did avoid intake of the drug-paired cue when it was presented in the runway apparatus and displayed naloxone-precipitated withdrawal. Using a partial crossover design, the lesion prevented avoidance of a cocaine-paired cue when presented in the home cage. We conclude that the lesion disrupts avoidance of a taste cue in anticipation of the rewarding properties of a drug but, at least in the presence of contextual cues, allows for avoidance of a taste cue as it elicits the onset of an aversive conditioned state of withdrawal.
5. Transplantation of human retinal pigment epithelial cells in the nucleus accumbens of cocaine self-administering rats provides protection from seeking. Brain Res Bull 2015; 123:53-60. [PMID: 26562520] [DOI: 10.1016/j.brainresbull.2015.11.008] [Received: 08/18/2015] [Revised: 10/30/2015] [Accepted: 11/05/2015]
Abstract
Chronic exposure to drugs and alcohol leads to damage to dopaminergic neurons and their projections in the 'reward pathway' that originate in the ventral tegmental area (VTA) and terminate in the nucleus accumbens (NAc). This damage is thought to contribute to the signature symptom of addiction: chronic relapse. In this study we show that bilateral transplants of human retinal pigment epithelial cells (RPECs), a cell-mediated dopaminergic and trophic neuromodulator, into the medial shell of the NAc rescue rats with a history of high rates of cocaine self-administration from drug-seeking when returned, after 2 weeks of abstinence, to the drug-associated chamber under extinction conditions (i.e., with no drug available). Excellent survival of the RPEC transplants was noted in the shell and/or the core of the NAc bilaterally in all rats that showed behavioral recovery from cocaine seeking. Design-based unbiased stereology of tyrosine hydroxylase (TH) positive cell bodies in the VTA showed better preservation (p < 0.035) in transplanted animals compared to control animals. This experiment shows that the RPEC graft provides beneficial effects to prevent drug seeking in drug addiction via its effects directly on the NAc and its neural network with the VTA.
6.
Abstract
Drug overdose has surpassed car accidents as the leading cause of accidental death in the United States. Of those drug overdoses, a large percentage of the deaths are due to heroin and/or pharmaceutical overdose, specifically misuse of prescription opioid analgesics. It is imperative, then, that we understand the mechanisms that lead to opioid abuse and addiction. The rewarding actions of opioids are mediated largely by the mu-opioid receptor (MOR), and signaling by this receptor is modulated by various interacting proteins. The neurotransmitter dopamine also contributes to opioid reward, and opioid addiction has been linked to reduced expression of dopamine D2 receptors (D2R) in the brain. That said, it is not known if alterations in the expression of these proteins relate to drug exposure and/or to the "addiction-like" behavior exhibited for the drug. Here, we held total drug self-administration constant across acquisition and showed that reduced expression of the D2R and the MOR interacting protein, Wntless, in the medial prefrontal cortex was associated with greater addiction-like behavior for heroin in general and with a greater willingness to work for the drug in particular. In contrast, reduced expression of the D2R in the nucleus accumbens and hippocampus was correlated with greater seeking during signaled nonavailability of the drug. Taken together, these data link reduced expression of both the D2R and Wntless to the explicit motivation for the drug rather than to differences in total drug intake per se.
7. Early avoidance of a heroin-paired taste-cue and subsequent addiction-like behavior in rats. Brain Res Bull 2015; 123:61-70. [PMID: 26494018] [DOI: 10.1016/j.brainresbull.2015.10.008] [Received: 08/19/2015] [Revised: 10/13/2015] [Accepted: 10/14/2015]
Abstract
The ability to predict individual vulnerability to substance abuse would allow for a better understanding of the progression of the disease and the development of better methods for prevention and/or early intervention. Here we use drug-induced devaluation of a saccharin cue in an effort to predict later addiction-like behavior in a model akin to that used by Deroche-Gamonet et al. (2004) and seek to link such vulnerability to changes in expression of various mu opioid receptor and D2 receptor-interacting proteins in brain. The results show that the greatest heroin-induced suppression of intake of a saccharin cue is associated with the greatest vulnerability to later addiction-like behavior and with differences in the expression of WLS, β-catenin, and NCS-1 in brain compared to rats that exhibited the least suppression of intake of the heroin-paired cue and/or saline controls. Finally, because the self-administration model employed produced no significant differences in drug intake between groups overall, the resultant changes in protein expression can be more closely linked to individual differences in motivation for drug.
8. Cocaine-induced suppression of saccharin intake and morphine modulation of Ca²⁺ channel currents in sensory neurons of OPRM1 A118G mice. Physiol Behav 2014; 139:216-23. [PMID: 25449401] [DOI: 10.1016/j.physbeh.2014.11.040] [Received: 05/08/2014] [Revised: 11/11/2014] [Accepted: 11/12/2014]
Abstract
Several studies have shown that human carriers of the single nucleotide polymorphism of the μ-opioid receptor, OPRM1 A118G, exhibit greater drug and alcohol use, increased sensitivity to pain, and reduced sensitivity to the antinociceptive effects of opiates. In the present study, we employed a 'humanized' mouse model containing the wild-type (118AA) or variant (118GG) allele to examine behavior in our model of drug-induced suppression of a natural reward cue and to compare the morphine pharmacological profile in acutely isolated sensory neurons. Compared with 118AA mice, our results demonstrate that homozygous 118GG mice exhibit greater avoidance of the cocaine-paired saccharin cue, a behavior linked to an aversive withdrawal-like state. Electrophysiological recordings confirmed the reduced modulation of Ca²⁺ channels by morphine in trigeminal ganglion (TG) neurons from 118GG mice compared to the 118AA control cells. However, repeated cocaine exposure in 118GG mice led to a leftward shift of the morphine concentration-response relationship when compared with 118GG control mice, while a rightward shift was observed in 118AA mice. These results suggest that cocaine exposure of mice carrying the 118G allele leads to a heightened sensitivity of the reward system and a blunted modulation of Ca²⁺ channels by morphine in sensory neurons.
9. Bilateral lesions of the thalamic trigeminal orosensory area dissociate natural from drug reward in contrast paradigms. Behav Neurosci 2012; 126:538-50. [PMID: 22687147] [DOI: 10.1037/a0028842]
Abstract
Substance abuse and addiction are associated with an apparent devaluation of, and inattention to, natural rewards. This consequence of addiction can be modeled using a reward comparison paradigm in which rats avoid intake of a palatable taste cue that comes to predict access to a drug of abuse. Evidence suggests that rats avoid intake following such pairings, at least in part, because the taste cue pales in comparison to the highly rewarding drug expected in the near future. In accordance, lesions of the gustatory thalamus or cortex eliminate avoidance of a taste cue when paired with either a drug of abuse or a rewarding sucrose solution, but not when paired with the aversive agent, LiCl. The present study used bilateral ibotenic acid lesions to evaluate the role of a neighboring thalamic structure, the trigeminal orosensory area (TOA), in avoidance of a gustatory cue when paired with sucrose (Experiment 1), morphine (Experiment 2), cocaine (Experiment 3), or LiCl (Experiment 4). The results show that the TOA lesion disrupts, but does not eliminate, avoidance of a taste cue that predicts access to a preferred sucrose solution and leaves intact the development of a LiCl-induced conditioned taste aversion. The lesion does, however, eliminate the suppression of intake of a taste cue when paired with experimenter-administered morphine or cocaine using our standard parameters. As such, this is the first manipulation found to dissociate avoidance of a taste cue when mediated by a sweet or by a drug of abuse.
10. Imagery Interference Diminishes in Older Adults: Age-Related Differences in the Magnitude of the Perky Effect. ACTA ACUST UNITED AC 2010; 29:307-322. [DOI: 10.2190/ic.29.4.c]
Abstract
Studies have documented the negative effects of mental imagery on perception (also known as the Perky effect) in younger adults, but imagery-interference effects in older adults have never been assessed. Two experiments examined this issue directly. Experiment 1 demonstrated that visual mental images diminish visual acuity in younger adults (mean age = 19.0) but not older adults (mean age = 73.6). Experiment 2 obtained parallel results, showing that visual imagery interfered with performance on a visual detection task in younger (mean age = 18.7) but not older adults (mean age = 66.7). Processes underlying age-related differences in imagery-interference effects are discussed and implications of these results for changes in cognitive performance in older adults are considered.
11. Altered eyeblink reflex conditioning in restless legs syndrome patients. Sleep Med 2010; 11:314-9. [DOI: 10.1016/j.sleep.2009.06.010] [Received: 01/29/2009] [Revised: 04/24/2009] [Accepted: 06/16/2009]
12. Perinatal nutritional iron deficiency impairs noradrenergic-mediated synaptic efficacy in the CA1 area of rat hippocampus. J Nutr 2010; 140:642-7. [PMID: 20089786] [PMCID: PMC2821889] [DOI: 10.3945/jn.109.114702]
Abstract
Many studies have shown that perinatal nutritional iron deficiency (ID) produces learning impairments in children. Research has also shown that catecholamines like epinephrine and norepinephrine play a pivotal role in the consolidation of memories. In this study, we sought to determine if perinatal ID impairs the following: 1) noradrenergic synaptic function in the hippocampus; and 2) several forms of hippocampus-dependent fear learning. Electrophysiological brain slice methods were used to examine noradrenergic-mediated synaptic efficacy in the CA1-hippocampus of rats that were subjected to perinatal ID or control (CN) diets. Rats were fed ID (3 mg Fe/kg) or CN (45 mg Fe/kg) diets starting on gestational d 14. These diets were maintained until postnatal d (P) 12, after which all rats were switched to the CN diet. Hippocampal slices were prepared between P26 and P30. The noradrenergic agonist isoproterenol (ISO) (1, 2, or 4 micromol) was used to induce modulatory increases in synaptic efficacy in the hippocampal slices. CN slices showed a long-lasting increase in synaptic efficacy as the result of ISO perfusion in the slice bath, whereas ID slices did not show increases in synaptic efficacy as the result of ISO perfusion. ID and CN groups did not differ when ISO was perfused through slices from adult rats (P61). Both young and adult ID rats showed reduced levels of hippocampus-dependent fear learning compared with the young and adult CN rats. Together, these findings suggest that ID may impair early forms of noradrenergic-mediated synaptic plasticity, which may in turn play a role in adult learning deficits.
13. Perinatal nutritional iron deficiency impairs hippocampus-dependent trace eyeblink conditioning in rats. Dev Neurosci 2007; 30:243-54. [PMID: 17962715] [DOI: 10.1159/000110502] [Received: 10/24/2006] [Accepted: 03/17/2007]
Abstract
Studies show that iron deficient (ID) children are at risk for poor cognitive development. Research also shows that ID may impair the development of skeletal motor abilities. The present study sought to determine if perinatal ID in rats impairs a motor learning task called eyeblink conditioning. The task was administered in either a hippocampus-dependent trace version or a non-hippocampus-dependent delay version. Rats were placed on ID or control diets from gestational day (G) 12 to postnatal day (P) 12. Young rats (P32-29) subjected to perinatal ID showed severe impairments in trace eyeblink conditioning but only minor impairments in delay eyeblink conditioning. A young moderate ID group (ID from G12 to P2) was also impaired in trace eyeblink conditioning. ID rats tested as adults (P64-69) showed only minor impairments in trace eyeblink conditioning. Young ID rats showed no deficits in motoric ability on a separate rotarod learning test. This study suggests that perinatal ID impairs motoric learning by altering higher-order learning centers like the hippocampus more so than by altering the skeletal motor system.
14.
Abstract
Rehabilitation of the elderly patient with a neurologic disease consists primarily of the coordinated actions of an interdisciplinary team of physicians. Key aspects of this process are remediation to reduce neurologic impairments, prevention of secondary complications and comorbidities, compensation to offset and adapt to residual disabilities, and maintenance of function over the long term.
15.
Abstract
Ideomotor apraxia, disordered movement execution to command, commonly follows left-hemisphere damage, implying left-hemisphere dominance for certain kinds of movements. To delineate this dominance, we used different command modalities to elicit meaningful movements and tested imitation of nonsense movements. Twenty-seven patients with unilateral hemispheric stroke and 10 age-matched controls were evaluated. Patients with left-hemisphere damage performed both meaningful and nonsense movements more poorly than the other study groups; thus, the meaningfulness of the movements is irrelevant for the left-hemisphere motor dominance. The performance varied, however, with the command modality and movement type. Based on this and earlier studies we posit that the left-hemisphere motor dominance is determined by the artificiality of the test situation (it concerns movements performed to command and out of the natural context) and the increased spatial and temporal complexity of the demanded movements. No association between the lesion locus within the left hemisphere and the severity of the ideomotor apraxia was found.
17. Botulinum toxin type A in the treatment of upper extremity spasticity: a randomized, double-blind, placebo-controlled trial. Neurology 1996; 46:1306-10. [PMID: 8628472] [DOI: 10.1212/wnl.46.5.1306]
Abstract
Spasticity is a disorder of excess muscle tone associated with CNS disease. We hypothesized that botulinum toxin, a neuromuscular blocking agent, would reduce tone in spastic muscles after stroke. This randomized, double-blind, placebo-controlled, multicenter clinical trial evaluated the safety and efficacy of botulinum toxin type A (BTXA) in the treatment of chronic upper limb spasticity after stroke. Thirty-nine patients received IM injections of a total dose of either 75, 150, or 300 units of BTXA or placebo into the biceps, flexor carpi radialis, and flexor carpi ulnaris muscles. At baseline, patients demonstrated a mean wrist flexor tone of 2.9 and elbow flexor tone of 2.6 on the Ashworth Scale (0 to 4). Treatment with the 300-unit BTXA dose resulted in a statistically and clinically significant mean decrease in wrist flexor tone of 1.2 (p = 0.028), 1.1 (p = 0.044), and 1.2 (p = 0.026) points and elbow flexor tone of 1.2 (p = 0.024), 1.2 (p = 0.028), and 1.1 (p = 0.199) at weeks 2, 4, and 6 postinjection. In the placebo group, tone reduction at the wrist was 0.3, 0.2, and 0.0 and at the elbow was 0.3, 0.3, and 0.6 at weeks 2, 4, and 6 postinjection. BTXA groups reported significant improvement on the physician and patient Global Assessment of Response to Treatment at weeks 4 and 6 postinjection. There were no serious adverse effects. In this 3-month study, BTXA safely reduced upper extremity muscle tone in patients with chronic spasticity after stroke.
18.
Abstract
Recognition of non-verbal environmental sounds was investigated in 52 subjects with unilateral cerebrovascular accidents and 18 age-matched normal controls. Impaired performance was most consistently found following cortical damage of homologous areas in either the left or the right hemisphere. Lesions involved the superior temporal gyrus (including the planum temporale), the inferior parietal lobe, and the parietal operculum; this area appears to constitute the human auditory cortical processing area. We found different error patterns dependent upon the side of the lesion: patients with right hemisphere damage failed to discriminate between acoustically related sounds, while patients with left hemisphere lesions tended to confuse semantically related sound sources. The impairment following right hemisphere damage was specific for non-verbal environmental sounds, while left hemisphere damage was associated with disturbed semantic capabilities in multiple modalities.
19. Evaluation of an adenosine 5'-triphosphate assay as a screening method to detect significant bacteriuria. J Clin Microbiol 1976; 3:42-6. [PMID: 767357] [PMCID: PMC274223] [DOI: 10.1128/jcm.3.1.42-46.1976]
Abstract
The bioluminescent reaction of adenosine 5'-triphosphate (ATP) with luciferin and luciferase has been used in conjunction with a sensitive photometer (Lab-Line's ATP photometer) to detect significant bacteriuria in urine. This rapid method of screening urine specimens for bacteriuria was evaluated using 348 urine specimens submitted to the clinical microbiology laboratory at the University of Minnesota Hospitals for routine culture using the calibrated loop-streak plate method. There was 89.4% agreement between the culture method and the ATP assay, with 7.0% false-positive and 27.0% false-negative results from the ATP assay, using 10⁵ organisms/ml of urine or greater as positive for significant bacteriuria and less than 10⁵ organisms/ml as negative for significant bacteriuria.
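The agreement and error percentages reported in this abstract are confusion-matrix arithmetic on screen-versus-culture counts. A minimal sketch of how such figures can be computed, using one common convention for the denominators and purely illustrative specimen counts (not the study's raw data):

```python
def screening_stats(tp, fp, tn, fn):
    """Agreement and error rates for a rapid screen judged against culture.

    Culture (>= 10^5 organisms/ml) is the reference standard:
    tp = screen-positive, culture-positive specimens
    fp = screen-positive, culture-negative specimens
    tn = screen-negative, culture-negative specimens
    fn = screen-negative, culture-positive specimens
    """
    total = tp + fp + tn + fn
    agreement = (tp + tn) / total           # overall concordance with culture
    false_positive_rate = fp / (fp + tn)    # positive screens among culture-negatives
    false_negative_rate = fn / (fn + tp)    # missed culture-positives
    return agreement, false_positive_rate, false_negative_rate

# Illustrative counts only:
agreement, fpr, fnr = screening_stats(tp=80, fp=10, tn=100, fn=10)
```

Note that published reports sometimes use different denominators (e.g., all specimens rather than culture-negatives) for the false-positive percentage, so the convention should be checked before comparing rates across studies.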