1
Ingram RJ, Ludwig HD, Scherm H. Epidemiology of Exobasidium Leaf and Fruit Spot of Rabbiteye Blueberry: Pathogen Overwintering, Primary Infection, and Disease Progression on Leaves and Fruit. Plant Dis 2019; 103:1293-1301. [PMID: 30998451] [DOI: 10.1094/pdis-09-18-1534-re]
Abstract
Epidemiological field studies utilizing disease monitoring, spore trapping, and trap plants were conducted on rabbiteye blueberry (Vaccinium virgatum) between 2014 and 2017 to shed light on the epidemiology of Exobasidium leaf and fruit spot, an emerging disease in the southeastern United States caused by the fungus Exobasidium maculosum. Wash plating of field-collected blueberry tissue from the late dormant season through bud expansion showed that the pathogen overwintered epiphytically on blueberry plants in the field, most likely in its yeast-like conidial stage. Agrichemical applications during the dormant season altered epiphytic populations of the pathogen, which correlated directly with leaf spot incidence later in the spring. Disease monitoring of field plants and weekly exposure of potted trap plants revealed that young leaves at the mouse-ear stage were most susceptible to infection, that disease incidence on leaves progressed monocyclically, and that infection periods were associated with rainfall variables such as the number of days per week with ≥1.0 mm of rain or cumulative weekly rainfall. Weekly spore trapping with an Andersen sampler showed that airborne inoculum was detected only after sporulating leaf lesions producing basidiospores were present in the field, suggesting that the primary inoculum is not airborne. The first symptoms on young, green fruit were observed soon after petal fall (requiring removal of the waxy fruit layer to visualize lesions), and visible disease progress on fruit was delayed by 1 to 3 weeks relative to that on leaves. Fruit infection of field plants and trap plants occurred before airborne propagules were detected by spore trapping and before sporulating leaf lesions were present in the field. Hence, this study showed that fruit infections are initiated by the same initial inoculum as leaf infections, although a contribution of basidiospore inoculum from leaf lesions to disease progress on later-developing fruit could not be excluded conclusively. This is one of only a few studies addressing the epidemiology and disease cycle of an Exobasidium sp. in a pathosystem where artificial inoculation has not been possible to date.
Affiliation(s)
- R J Ingram
- Department of Plant Pathology, University of Georgia, Athens, GA
- H D Ludwig
- Department of Plant Pathology, University of Georgia, Athens, GA
- H Scherm
- Department of Plant Pathology, University of Georgia, Athens, GA
2
Abstract
Xylella fastidiosa is one of the most important threats to plant health worldwide. This bacterial pathogen has a long history, causing disease in the Americas on a range of agricultural crops and trees, with severe economic repercussions particularly on grapevine and citrus. In Europe, X. fastidiosa was detected for the first time in 2013 in association with a severe disease affecting olive trees in southern Italy. Subsequent mandatory surveys throughout Europe led to discoveries in France and Spain in various host species and environments. Detections of additional introductions of X. fastidiosa continue to be reported from Europe, for example from northern Italy in late 2018. These events are leading to a sea change in research, monitoring, and management efforts, as exemplified by the articles in this Focus Issue. X. fastidiosa is part of complex pathosystems together with hosts and vectors. Although certain X. fastidiosa subspecies and environments have been well studied, particularly those that pertain to established disease in North and South America, this represents only a fraction of the existing genetic, epidemiological, and ecological diversity. This Focus Issue highlights some of the key challenges that must be overcome to address this new global threat, recent advances in understanding the pathosystem, and steps toward improved disease control. It brings together the broad research themes needed to address the global threat of X. fastidiosa, encompassing topics from host susceptibility and resistance, genome sequencing, detection methods, transmission by vectors, epidemiological drivers, and chemical and biological control to public databases and social sciences. Open communication and collaboration among scientists, stakeholders, and the general public from different parts of the world will pave the way to novel ideas to understand and combat this pathogen.
Affiliation(s)
- R P P Almeida
- Department of Environmental Science, Policy & Management, University of California, Berkeley
- L De La Fuente
- Department of Entomology & Plant Pathology, Auburn University, Auburn, AL
- R Koebnik
- IRD, Cirad, Universite de Montpellier, IPME, Montpellier, France
- J R S Lopes
- Department of Entomology and Acarology, Luiz de Queiroz College of Agriculture (Esalq), University of São Paulo, Piracicaba, SP, Brazil
- S Parnell
- School of Environment & Life Sciences, University of Salford, Manchester, UK
- H Scherm
- Department of Plant Pathology, University of Georgia, Athens 30605
3
Williford LA, Savelle AT, Scherm H. Effects of Blueberry red ringspot virus on Yield and Fruit Maturation in Southern Highbush Blueberry. Plant Dis 2016; 100:171-174. [PMID: 30688573] [DOI: 10.1094/pdis-04-15-0381-re]
Abstract
Blueberry red ringspot virus (BRRV) has become prevalent in southern highbush blueberry in the southeastern United States, but information about the yield effects associated with the disease is limited and conflicting. A 3-year study was conducted on mature, container-grown plants of 'Star' and 'Jewel' blueberry that were either systemically infected or not infected with BRRV to determine the effect of the disease on flower bud numbers and fruit yield and on advances or delays in fruit ripening. On Star, flower bud counts were lower for BRRV-positive plants (P = 0.0137 in one year and P = 0.1085 in another), but no such effect was observed for Jewel. When fruit were harvested over time during the ripening period in the spring, no consistent yield or berry weight reductions were observed due to BRRV infection for either cultivar. On Star, fruit maturity tended to be slightly advanced in BRRV-positive plants in all years. Specifically, the weight of unripe fruit remaining after the last harvest was consistently lower for BRRV-positive plants than for BRRV-negative plants, suggesting that BRRV infection in Star may lead to a shorter fruit ripening period. No such effect on fruit ripening was observed for Jewel. It is concluded that, for the cultivars examined in this study, BRRV causes a relatively benign infection with no negative yield implications.
Affiliation(s)
- L A Williford
- Department of Plant Pathology, University of Georgia, Athens 30602
- A T Savelle
- Department of Plant Pathology, University of Georgia, Athens 30602
- H Scherm
- Department of Plant Pathology, University of Georgia, Athens 30602
4
Everhart SE, Scherm H. Fine-Scale Genetic Structure of Monilinia fructicola During Brown Rot Epidemics Within Individual Peach Tree Canopies. Phytopathology 2015; 105:542-549. [PMID: 25317843] [DOI: 10.1094/phyto-03-14-0088-r]
Abstract
The purpose of this study was to determine the fine-scale genetic structure of populations of the brown rot pathogen Monilinia fructicola within individual peach tree canopies to better understand within-tree plant pathogen diversity and to complement previous work on spatiotemporal development of brown rot disease at the canopy level. Across 3 years in a total of six trees, we monitored disease development, collected isolates from every M. fructicola symptom during the course of the season, and created high-resolution three-dimensional maps of all symptom and isolate locations within individual canopies using an electromagnetic digitizer. Each canopy population (65 to 173 isolates per tree) was characterized using a set of 13 microsatellite markers and analyzed for evidence of spatial genetic autocorrelation among isolates during the epidemic phase of the disease. Results showed high genetic diversity (average uh = 0.529) and high genotypic diversity (average D = 0.928) within canopies. The percentage of unique multilocus genotypes within trees was greater for blossom blight isolates (78.2%) than for fruit rot isolates (51.3%), indicating a greater contribution of clonal reproduction during the preharvest epidemic. For fruit rot isolates, between 54.2 and 81.7% of isolates were contained in one to four dominant clonal genotypes per tree having at least 10 members. All six fruit rot populations showed positive and significant spatial genetic autocorrelation for distance classes between 0.37 and 1.48 m. Despite high levels of within-tree pathogen diversity, the contribution of locally available inoculum combined with short-distance dispersal is likely the main factor generating clonal population foci and associated spatial genetic clustering within trees.
Affiliation(s)
- S E Everhart
- First and second authors: Department of Plant Pathology, University of Georgia, Athens, GA 30602
5
Holland RM, Christiano RSC, Gamliel-Atinsky E, Scherm H. Distribution of Xylella fastidiosa in Blueberry Stem and Root Sections in Relation to Disease Severity in the Field. Plant Dis 2014; 98:443-447. [PMID: 30708723] [DOI: 10.1094/pdis-06-13-0680-re]
Abstract
Xylella fastidiosa causes bacterial leaf scorch, a new disease of southern highbush blueberry in the southeastern United States. Infections occlude the xylem of affected plants, causing drought-like symptoms and, eventually, plant death. To assess the likelihood of mitigation of bacterial leaf scorch through cultural practices such as pruning or hedging of affected plants, we determined the localization and population density of X. fastidiosa in naturally infected blueberry plants with varying levels of bacterial leaf scorch severity. Stem segments were sampled from the current season's growth down to the base of the plant, as were root segments, on plants that were either asymptomatic or had light, moderate, or severe symptoms in three plantings affected by the disease. Stem sap was extracted from each segment, and population densities of X. fastidiosa were determined using real-time polymerase chain reaction with species-specific primers. Detection frequencies were lowest (but non-zero) in sap from asymptomatic plants and highest in plants with severe symptoms. In asymptomatic plants, detection was generally least frequent (0 to 20.0%) in top and root sections and highest (4.6 to 55.6%) in middle and base stem sections. As disease severity increased, detection frequencies in roots increased to >80% in two plantings and to 60% in the third planting. Overall, detection frequencies were highest (>80%) in middle and base stem sections of plants from the moderate and severe disease classes. The lowest bacterial titers (averaging 0 to 2.1 × 10¹ CFU per 50 μl of sap) were observed in top and root sections of asymptomatic plants, whereas the highest titers (generally between 10⁴ and 10⁵ CFU per 50 μl of sap) were obtained from middle, base, and root sections of plants from the moderate and severe classes. The presence of the bacterium in middle and base stem sections at low disease severity indicates rapid distribution of X. fastidiosa in affected plants. Because the pathogen accumulates in the roots at moderate and high disease severity levels, management strategies such as pruning and mowing are unlikely to be effective in curing affected plants of bacterial leaf scorch.
Affiliation(s)
- R M Holland
- Department of Plant Pathology, University of Georgia, Athens 30602
- R S C Christiano
- Department of Plant Pathology, University of Georgia, Athens 30602
- H Scherm
- Department of Plant Pathology, University of Georgia, Athens 30602
6
Scherm H, Thomas CS, Garrett KA, Olsen JM. Meta-analysis and other approaches for synthesizing structured and unstructured data in plant pathology. Annu Rev Phytopathol 2014; 52:453-76. [PMID: 25001455] [DOI: 10.1146/annurev-phyto-102313-050214]
Abstract
The term data deluge is used widely to describe the rapidly accelerating growth of information in the technical literature, in scientific databases, and in informal sources such as the Internet and social media. The massive volume and increased complexity of information challenge traditional methods of data analysis but at the same time provide unprecedented opportunities to test hypotheses or uncover new relationships via mining of existing databases and literature. In this review, we discuss analytical approaches that are beginning to be applied to help synthesize the vast amount of information generated by the data deluge and thus accelerate the pace of discovery in plant pathology. We begin with a review of meta-analysis as an established approach for summarizing standardized (structured) data across the literature. We then turn to examples of synthesizing more complex, unstructured data sets through a range of data-mining approaches, including the incorporation of 'omics data in epidemiological analyses. We conclude with a discussion of methodologies for leveraging information contained in novel, open-source data sets through web crawling, text mining, and social media analytics, primarily in the context of digital disease surveillance. Rapidly evolving computational resources provide platforms for integrating large and complex data sets, motivating research that will draw on new types and scales of information to address big questions.
Affiliation(s)
- H Scherm
- Department of Plant Pathology, University of Georgia, Athens, Georgia 30602
7
Mehra LK, MacLean DD, Savelle AT, Scherm H. Postharvest Disease Development on Southern Highbush Blueberry Fruit in Relation to Berry Flesh Type and Harvest Method. Plant Dis 2013; 97:213-221. [PMID: 30722315] [DOI: 10.1094/pdis-03-12-0307-re]
Abstract
Postharvest decay, incited by various fungal pathogens, is a major concern in most blueberry production areas of the United States. Because the risk of infection is increased by fruit bruising, which in turn is increased by machine-harvesting, it has been difficult to harvest fruit from the early-maturing but soft-textured southern highbush blueberries (SHB) mechanically for the fresh market. This could change fundamentally with the recent development of SHB genotypes with crisp-textured ("crispy") berries, i.e., fruit with qualitatively firmer flesh and/or more resistant skin. Four replicate row sections of two or three SHB genotypes having crispy fruit and three with conventional fruit were either hand- or machine-harvested at a commercial blueberry farm in northern Florida in April 2009 and May 2010. Harvested fruit were sorted, packed, and placed in cold storage (2°C) for up to 3 weeks. Average counts of aerobic bacteria, total yeasts and molds, coliforms, and Escherichia coli on fruit samples before the cold storage period were below commercial tolerance levels in most cases. In both years, natural disease incidence after cold storage was lowest for hand-harvested crispy fruit and highest for machine-harvested conventional fruit. Interestingly, machine-harvested crispy fruit had disease incidence equal to or lower than that of hand-harvested conventional fruit. Across all treatments, natural postharvest disease incidence was inversely related to fruit firmness, with firmness values >220 g/mm associated with low disease. In separate experiments, samples from the 0-day cold storage period were inoculated at the stem end with Alternaria alternata, Botrytis cinerea, or Colletotrichum acutatum, and disease incidence was assessed after 7 days in a cold room followed by 60 to 72 h at room temperature. In response to artificial inoculation, less disease developed on crispy berries. No significant effect of harvest method was observed, except for A. alternata inoculation in 2009, when hand-harvested fruit developed a lower level of disease than machine-harvested fruit. Taken together, this study suggests that mechanical harvesting of SHB cultivars with crisp-textured berries is feasible from a postharvest pathology perspective.
Affiliation(s)
- L K Mehra
- Department of Plant Pathology, University of Georgia, Athens 30602
- D D MacLean
- Department of Horticulture, University of Georgia, Tifton 31793
- A T Savelle
- Department of Plant Pathology, University of Georgia, Athens 30602
- H Scherm
- Department of Plant Pathology, University of Georgia, Athens 30602
8
Affiliation(s)
- R W Sutherst
- CSIRO Entomology, Australia, Long Pocket Laboratories, PMB No 3, Indooroopilly, Queensland 4068, Australia
9
Dutta B, Scherm H, Gitaitis RD, Walcott RR. Acidovorax citrulli Seed Inoculum Load Affects Seedling Transmission and Spread of Bacterial Fruit Blotch of Watermelon Under Greenhouse Conditions. Plant Dis 2012; 96:705-711. [PMID: 30727513] [DOI: 10.1094/pdis-04-11-0292]
Abstract
Infested seed are typically the primary source of inoculum for bacterial fruit blotch (BFB) of cucurbits. An inoculum threshold of 1 infested seed per 10,000 seeds is widely used in seed health testing for Acidovorax citrulli. However, the influence of seed inoculum load on BFB seedling transmission has not been elucidated. In this study, watermelon seedlots (128 seeds/lot) containing one seed inoculated with A. citrulli at levels ranging from 1 × 10¹ to 1 × 10⁷ CFU were used to investigate the effect of seed inoculum load on seedling transmission and spatiotemporal spread of BFB under greenhouse conditions. The relationship between A. citrulli seed inoculum load and frequency of BFB seedling transmission followed a sigmoidal pattern (R² = 0.986, P = 0.0047). In all, 100 and 96.6% of seedlots containing one seed with 1 × 10⁷ and 1 × 10⁵ CFU of A. citrulli, respectively, transmitted the pathogen to seedlings; in contrast, the proportion of seedlots that yielded BFB-infected seedlings was lower for lots with one seed infested with 1 × 10³ (46.6%) and 1 × 10¹ (16.7%) CFU of A. citrulli. The relationship between A. citrulli seed inoculum load and frequency of pathogen detection in seedlots using immunomagnetic separation combined with a real-time polymerase chain reaction assay also followed a sigmoidal pattern (R² = 0.997, P = 0.0034). Whereas 100% of samples from seedlots (10,000 seeds/lot) with one seed containing ≥1 × 10⁵ CFU tested positive for A. citrulli, 75% of samples from lots with one seed containing 1 × 10³ CFU tested positive for the pathogen, and only 16.7% of samples with one seed containing 10 CFU tested positive. Because disease transmission was observed for lots with just one seed containing 10 A. citrulli CFU, zero tolerance for seedborne A. citrulli is recommended for effective BFB management. The seedling transmission experiments also revealed that temporal spread of BFB in 128-cell seedling trays increased linearly with A. citrulli inoculum load (r² = 0.976, P = 0.0037). Additionally, the frequency of spatial spread of BFB from an inoculated seedling in the center of a planting tray to adjacent healthy seedlings over one-, two-, or three-cell distances was greater for lots with one seed infested with at least 1 × 10⁵ CFU than for lots with one seed infested at lower inoculum loads (1 × 10¹ and 1 × 10³ CFU/seed).
Affiliation(s)
- B Dutta
- Department of Plant Pathology, University of Georgia, Athens 30602
- H Scherm
- Department of Plant Pathology, University of Georgia, Athens 30602
- R D Gitaitis
- Department of Plant Pathology, Coastal Plain Experiment Station, University of Georgia, Tifton 31793
- R R Walcott
- Department of Plant Pathology, University of Georgia, Athens
10
Christiano RSC, Reilly CC, Miller WP, Scherm H. Oxytetracycline Dynamics on Peach Leaves in Relation to Temperature, Sunlight, and Simulated Rain. Plant Dis 2010; 94:1213-1218. [PMID: 30743611] [DOI: 10.1094/pdis-04-10-0282]
Abstract
Oxytetracycline (OTC), a member of the tetracycline antibiotics, is used as a foliar spray to control Xanthomonas arboricola pv. pruni on stone fruits and Erwinia amylovora on pome fruits. We studied the dynamics of OTC residues on attached peach (Prunus persica) leaves treated with 300 ppm active ingredient of an agricultural OTC formulation in relation to temperature, natural sunlight, and simulated rain. We further evaluated the potential of three ultraviolet (UV) protectants (lignin, titanium dioxide, and oxybenzone) and one sticker-extender (Nu Film-17) to prolong OTC longevity on the leaf surface. OTC residue was determined by high-pressure liquid chromatography (HPLC)-UV (C18 reversed-phase column). Under controlled conditions in darkness, constant temperatures up to 40°C did not affect OTC degradation on leaves. In contrast, OTC residue decreased rapidly in natural sunlight in the absence of rain, declining, on average, by 43.8, 77.8, and 92.1% within 1, 2, and 4 days after application, respectively; 7 days after application, OTC levels were near the detection limit. Use of shade fabric with 10 and 40% sunlight transmittance, simulating overcast sky, reduced OTC degradation significantly but did not extend OTC persistence beyond 7 days. Areas under the OTC residue curve, summarizing OTC dynamics during the 7-day exposure period, were negatively and significantly correlated with solar radiation and UV radiation variables, but not with temperature. UV protectants and Nu Film-17 were ineffective in improving OTC persistence in outdoor conditions. Simulated rain at 44 mm h⁻¹ lowered OTC residue drastically (by 67.2%) after 2 min, and levels were near the detection limit after 60 min of continuous rain, regardless of whether plants were exposed to rainfall 1 or 24 h after OTC application. In artificial inoculation experiments with X. arboricola pv. pruni on attached peach leaves, OTC concentrations ≥50 ppm active ingredient (corresponding to ≥0.06 μg OTC cm⁻² leaf surface) were sufficient to suppress bacterial spot development. By extrapolation from our outdoor exposure experiments, similar OTC residues following application of labeled OTC rates would be reached after less than 2 days under full sunlight, after 4 days under overcast sky, or after 2 min of a heavy rainstorm.
Affiliation(s)
- R S C Christiano
- Department of Plant Pathology, University of Georgia, Athens 30602
- C C Reilly
- USDA-ARS Southeastern Fruit and Tree Nut Research Laboratory, Byron, GA 31008
- W P Miller
- Department of Crop and Soil Sciences, University of Georgia, Athens 30602
- H Scherm
- Department of Plant Pathology, University of Georgia, Athens 30602
11
Abstract
Conventional sampling designs such as simple random sampling (SRS) tend to be inefficient when assessing rare and highly clustered populations because most of the time is spent evaluating empty quadrats, leading to high error variances and high cost. In previous studies with rare plant and animal populations, adaptive cluster sampling, where sampling occurs preferentially in the neighborhood of quadrats in which the species of interest is detected during the sampling bout, has been shown to estimate population parameters with greater precision at an effort comparable to SRS. Here, we use computer simulations to evaluate the efficiency of adaptive cluster sampling for estimating low levels of disease incidence (0.1, 0.5, 1.0, and 5.0%) at various levels of aggregation of infected plants having variance-to-mean ratios (V/M) of approximately 1, 3, 5, and 10. For each simulation, an initial sample size of 50, 100, and 150 quadrats was evaluated, and the condition to adapt neighborhood sampling (CA), i.e., the minimum number of infected plants per quadrat that triggers a switch from random sampling to sampling in neighboring quadrats, was varied from 1 to 4 (corresponding to 7.7 to 30.8% incidence of infected plants per quadrat). The simulations showed that cluster sampling was consistently more precise than SRS at a field-level disease incidence of 0.1 and 0.5%, especially when diseased plants were highly aggregated (V/M = 5 or 10) and when the most liberal condition to adapt (CA = 1) was used. One drawback of adaptive cluster sampling is that the final sample size is unknown at the beginning of the sampling bout because it depends on how often neighborhood sampling is triggered. In our simulations, the final sample size was close to the initial sample size for disease incidence up to 1.0%, especially when a more conservative condition to adapt (CA > 1) was used. For these conditions, the effect of disease aggregation was minor. In summary, both precision and the sample size required with adaptive cluster sampling responded similarly to disease incidence and aggregation, i.e., both were most favorable at the lowest disease incidence with the highest levels of clustering. However, whereas relative precision was optimized with the most liberal condition to adapt, the ratio of final to initial sample size was best for more conservative CA values, indicating a tradeoff. In our simulations, precision and final sample size were both simultaneously favorable for disease incidence of up to 1.0%, but only when infected plants were most aggregated (V/M = 10).
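The adaptive cluster sampling procedure summarized in this abstract can be illustrated with a minimal simulation sketch. This is not code from the study: the grid size, focus layout, rook-neighborhood rule, and parameter values below are illustrative assumptions, chosen only to show how a sample grows when the condition to adapt (CA) is triggered.

```python
import random

def adaptive_cluster_sample(grid, n_initial, ca, rng):
    """Adaptive cluster sampling on a square grid of quadrat counts.

    grid: dict mapping (row, col) -> number of infected plants in that quadrat
    n_initial: size of the initial simple random sample of quadrats
    ca: condition to adapt -- minimum count that triggers neighborhood sampling
    Returns the set of all quadrats evaluated (final sample).
    """
    side = max(r for r, _ in grid) + 1
    frontier = list(rng.sample(sorted(grid), n_initial))  # initial SRS
    sampled = set()
    while frontier:
        q = frontier.pop()
        if q in sampled:
            continue
        sampled.add(q)
        if grid[q] >= ca:  # condition met: also sample the 4 rook-neighbors
            r, c = q
            for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
                if 0 <= nr < side and 0 <= nc < side:
                    frontier.append((nr, nc))
    return sampled

# Toy example: a 20 x 20 field with one small, aggregated focus of infection.
rng = random.Random(1)
grid = {(r, c): 0 for r in range(20) for c in range(20)}
for q in [(5, 5), (5, 6), (6, 5), (6, 6)]:
    grid[q] = 3  # infected plants clustered in one focus
sampled = adaptive_cluster_sample(grid, n_initial=50, ca=1, rng=rng)
print(len(sampled))                    # final sample size (>= initial 50)
print(sum(grid[q] for q in sampled))   # infected plants captured by the sample
```

The sketch makes the tradeoff in the abstract visible: lowering `ca` expands the final sample around every detection (better precision near foci), while raising it keeps the final sample size close to the initial one.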
Affiliation(s)
- P S Ojiambo
- Department of Plant Pathology, North Carolina State University, Raleigh 27695, USA.
12
Dvorák M, Wolf A, Scherm H, Klír P, Vanerková H. [Determination and evaluation of the influence of addictive drugs on drivers of motor vehicles in Germany]. Soud Lek 2009; 54:56-58. [PMID: 20302041]
Affiliation(s)
- M Dvorák
- Institute of Forensic Medicine, Faculty of Medicine, Charles University and University Hospital, Plzeň
13
Tarnowski TLB, Savelle AT, Scherm H. Activity of Fungicides Against Monilinia vaccinii-corymbosi in Blueberry Flowers Treated at Different Phenological Stages. Plant Dis 2008; 92:961-965. [PMID: 30769729] [DOI: 10.1094/pdis-92-6-0961]
Abstract
The activity of fenbuconazole and azoxystrobin applied to blueberry flowers at different phenological stages against subsequent gynoecial infection by the mummy berry fungus Monilinia vaccinii-corymbosi was evaluated. In the greenhouse, potted blueberry plants having flower clusters at five distinct stages (from bud scale separation to anthesis) were treated with the two fungicides. One day after anthesis (between 1 and 15 days after fungicide treatment), individual flowers were detached and inoculated with conidia of M. vaccinii-corymbosi in the laboratory. Four days after inoculation, hyphal ingress into the style was determined microscopically as a measure of fungicide efficacy. Results revealed a significant flower stage effect (P < 0.0001), whereby only fungicide application at anthesis, but not at the four preanthesis stages, reduced subsequent fungal ingress into the style. There was no significant difference between the two fungicides (P > 0.50), nor was there a significant fungicide-flower stage interaction (P > 0.30). In the field during 2 years, mature blueberry plants were treated with the two fungicides and exposed to natural pathogen inoculum. At the time of application, flower clusters at anthesis and at three preanthesis stages were selected and tagged. Mummy berry incidence in fruit developing from the tagged clusters was assessed to determine treatment effects. Whereas fenbuconazole lowered disease incidence for all preanthesis stages, azoxystrobin was effective only at the latest preanthesis stage. The discrepancy between these results and those of the greenhouse study (where there was no preanthesis activity of either fungicide) indirectly suggests post-infection fungicidal activity in the ovary, the base of which was exposed to the fungicide spray at the time of treatment for all flower phenology stages. Thus, although there appears to be insufficient translocation of the two fungicides in flowers treated at preanthesis stages to prevent stylar ingress by the pathogen, fungicidal activity in the ovary may be sufficient to halt subsequent fungal colonization, especially for fenbuconazole. To prescribe the most effective management program for flower-infecting fungi, translocation and post-infection activity of fungicides in floral tissues must be better understood.
Affiliation(s)
- T L B Tarnowski
- Department of Plant Pathology, University of Georgia, Athens 30602
- A T Savelle
- Department of Plant Pathology, University of Georgia, Athens 30602
- H Scherm
- Department of Plant Pathology, University of Georgia, Athens 30602
14
Amiri A, Scherm H, Brannen PM, Schnabel G. Laboratory Evaluation of Three Rapid, Agar-Based Assays to Assess Fungicide Sensitivity in Monilinia fructicola. Plant Dis 2008; 92:415-420. [PMID: 30769692] [DOI: 10.1094/pdis-92-3-0415]
Abstract
Three rapid, agar-based assays were compared with a traditional petri dish method for assessing the sensitivity of Monilinia fructicola to propiconazole (0.3 and 2.0 μg/ml), thiophanate-methyl (1.0 and 50 μg/ml), and azoxystrobin (1.0 and 35 μg/ml) in the laboratory. The three assays were based on mycelial growth inhibition on agar disks sliced from lipbalm tubes filled with fungicide-amended potato dextrose agar (PDA), on PDA-coated cotton swabs, or in PDA-filled microcentrifuge tubes. Mycelial growth inhibition of eight previously characterized isolates (two resistant to propiconazole, two highly resistant to thiophanate-methyl, two with low levels of resistance to thiophanate-methyl, and two sensitive to all three fungicides) was determined visually 24, 48, and 72 h after inoculation. The 48-h time point was the earliest suitable time to collect data for all methods because insufficient growth was recorded in the petri dish and tube assays after 24 h. With the exception of the swab assay, all methods classified the isolates previously determined to be fungicide sensitive correctly (i.e., no fungal growth was observed for these isolates). For propiconazole-resistant isolates, the lipbalm assay resulted in levels of growth inhibition very similar to the petri dish method, whereas the swab assay and the tube assay overestimated and underestimated, respectively, the level of resistance. Both the lipbalm and the swab assays classified isolates correctly as being thiophanate-methyl resistant, and both were able to discriminate the isolates previously classified as having low versus high levels of resistance when treated with this fungicide at 50 μg/ml, as was the petri dish method. None of the eight isolates which previously were determined to be azoxystrobin sensitive grew on azoxystrobin-amended media, regardless of the assay type. 
Overall, the average percentage of correct isolate classifications (relative to their previously determined resistance status) on propiconazole- and thiophanate-methyl-amended media after 48 h ranged from 87.5 to 100, 85.3 to 100, 63.2 to 94.5, and 50.5 to 81.0% for the petri dish, lip balm, swab, and tube assays, respectively. The lip balm assay provided the most accurate assessments (85.3 to 100%) after only 24 h of incubation, supporting its use as a rapid and simple tool for monitoring resistance levels in M. fructicola field populations.
Affiliation(s)
- A Amiri
- Department of Entomology, Soils, and Plant Sciences, Clemson University, Clemson, SC 29634
- H Scherm
- Department of Plant Pathology, University of Georgia, Athens 30602
- P M Brannen
- Department of Plant Pathology, University of Georgia, Athens 30602
- G Schnabel
- Department of Entomology, Soils, and Plant Sciences, Clemson University
15
Holb IJ, Scherm H. Quantitative relationships between different injury factors and development of brown rot caused by Monilinia fructigena in integrated and organic apple orchards. Phytopathology 2008; 98:79-86. [PMID: 18943241 DOI: 10.1094/phyto-98-1-0079] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.1] [Reference Citation Analysis] [What about the content of this article? (0)] [Affiliation(s)] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 05/26/2023]
Abstract
In a 4-year study, the incidence of various types of injuries (caused by insects, birds, growth cracks, mechanical wounding, and other, unidentified factors) was assessed in relation to brown rot development (caused by Monilinia fructigena) on fruit of three apple cultivars (Prima, Jonathan, and Mutsu) in integrated and organic blocks of two apple orchards in Hungary. In addition, populations of male codling moths (Cydia pomonella) were monitored with pheromone traps season-long in both management systems. On average, injury incidence on fruit at harvest was 6.1 and 19.2% in the integrated and organic treatments, respectively. Insect injury, which was caused primarily by C. pomonella, had the highest incidence among the five injury types, accounting for 79.4% of the total injury by harvest in the organic blocks and 36.6% in the integrated blocks. Levels of all other injury types remained close to zero during most of the season, but the incidence of bird injury and growth cracks increased markedly in the final 3 to 5 weeks before harvest in both production systems. Brown rot developed more slowly and reached a lower incidence in the integrated (6.4% final incidence on average) than in the organic blocks (20.1% average incidence). In addition, the disease developed later but attained higher levels as cultivar maturity shifted from early-maturing Prima to late-maturing Mutsu. Overall, 94.3 to 98.7% of all injured fruit were also infected by M. fructigena, whereas the incidence of brown-rotted fruit without visible injury was very low (0.8 to 1.6%). Correlation coefficients (on a per-plot basis) and association indices (on a per-fruit basis) were calculated between brown rot and the various injury types for two selected assessment dates, 4 weeks preharvest and at harvest. At both dates, the strongest significant (P < 0.05) relationships were observed between brown rot and insect injury and between brown rot and the cumulative number of trapped C. pomonella. At the harvest assessment, two additional significant correlations were observed: between brown rot and bird injury and between brown rot and growth cracks. In every case, correlation coefficients were larger in organic than in integrated blocks. Although it is well established that brown rot in pome fruits is closely associated with fruit injuries, this is the first study to provide season-long progress data on different injury types and quantitative analyses of their relative importance at different times in the growing season and across two distinct management systems.
Affiliation(s)
- I J Holb
- Centre of Agricultural Sciences, University of Debrecen, P.O. Box 36, H-4015 Debrecen, Hungary.
16
Scherm H, Savelle AT, Boozer RT, Foshee WG. Seasonal Dynamics of Conidial Production Potential of Fusicladium carpophilum on Twig Lesions in Southeastern Peach Orchards. Plant Dis 2008; 92:47-50. [PMID: 30786365 DOI: 10.1094/pdis-92-1-0047] [Citation(s) in RCA: 9] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [What about the content of this article? (0)] [Affiliation(s)] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/09/2023]
Abstract
Conidia produced on overwintered lesions on 1-year-old twigs constitute the only source of primary inoculum for the peach scab fungus, Fusicladium carpophilum; however, there is little quantitative information about the dynamics of sporulation throughout the season. Starting in late winter and continuing until midsummer over a 4-year period, twig segments were sampled every 1 to 2 weeks from fungicide-untreated peach trees in a total of 18 trials (site-cultivar-year combinations) in Georgia, Alabama, and South Carolina. Twig samples were incubated in a moist chamber in the laboratory for 48 h and washed on a wrist-action shaker, and conidial production potential was determined by microscopic counts in aliquots of the wash water. When plotted against calendar date (day of the year), there was considerable variation among cultivars, sites, and years in the temporal pattern of conidial numbers of F. carpophilum. For example, conidia were first detected on samples collected between mid-February and late March, and the highest peak in conidial numbers was observed between late March and mid-May. In contrast, when conidial numbers were expressed as cumulative totals in relation to phenological time (either days after full bloom or days after calyx-split), temporal progress was very similar among trials. Conidial production summarized in this manner generally commenced before bloom and reached 25 and 90% of the seasonal total by calyx-split and 10 weeks after bloom, respectively. A two-parameter sigmoidal function described the relationship between cumulative conidial production and phenological time very well (R2 = 0.9727 and 0.9790 for days after full bloom and days after calyx-split, respectively; P < 0.0001, n = 260). Expression of time in degree-days did not improve the relationship between cumulative conidial numbers and phenological time.
Thus, knowledge of host tree phenology may be sufficient to derive strategic estimates of disease risk based on the predictable seasonal pattern of conidial production potential; this seasonal, inoculum-based risk estimate may be used to adjust daily infection risk estimates based on models that consider microclimatic conditions affecting pathogen growth and infection.
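The two-parameter sigmoidal relationship between cumulative conidial production and phenological time can be sketched numerically. This is a minimal illustration assuming a logistic form, Y = 100/(1 + exp(-(t - t50)/b)); both the functional form and the parameter values below are assumptions for demonstration, not the form or values fitted in the study.

```python
import math

def cum_conidia(t, t50, b):
    """Two-parameter sigmoid: cumulative % of the seasonal conidial
    total produced by t days after full bloom (illustrative form)."""
    return 100.0 / (1.0 + math.exp(-(t - t50) / b))

# Illustrative parameters (hypothetical, not the study's estimates):
t50, b = 30.0, 12.0   # inflection point at 30 days after full bloom

# Scan whole days to find when 25% and 90% of the seasonal total
# has been produced, mirroring the milestones quoted in the abstract.
day25 = next(t for t in range(0, 200) if cum_conidia(t, t50, b) >= 25.0)
day90 = next(t for t in range(0, 200) if cum_conidia(t, t50, b) >= 90.0)
```

With these made-up parameters the 25% and 90% milestones fall at roughly days 17 and 57 after full bloom; a curve fitted to the actual counts would yield its own milestones for strategic disease-risk estimates.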
Affiliation(s)
- H Scherm
- Department of Plant Pathology, University of Georgia, Athens 30602
- A T Savelle
- Department of Plant Pathology, University of Georgia, Athens 30602
- R T Boozer
- Alabama Cooperative Extension System, Chilton Research and Extension Center, Clanton 35045
- W G Foshee
- Department of Horticulture, Auburn University, Auburn, AL 36849
17
Christiano RSC, Scherm H. Quantitative aspects of the spread of asian soybean rust in the southeastern United States, 2005 to 2006. Phytopathology 2007; 97:1428-1433. [PMID: 18943512 DOI: 10.1094/phyto-97-11-1428] [Citation(s) in RCA: 11] [Impact Index Per Article: 0.6] [Reference Citation Analysis] [What about the content of this article? (0)] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 05/26/2023]
Abstract
The regional dynamics of soybean rust, caused by Phakopsora pachyrhizi, in six southeastern states (Florida, Georgia, Alabama, South Carolina, North Carolina, and Virginia) in 2005 and 2006 were analyzed based on disease records collected as part of the U.S. Department of Agriculture's soybean rust surveillance and monitoring program. The season-long rate of temporal disease progress averaged approximately 0.5 new cases day(-1) and was higher in nonsentinel soybean (Glycine max) plots than in sentinel soybean plots and kudzu (Pueraria lobata) plots. Despite the early detection of rust on kudzu in January and/or February each year (representing the final phase of the previous year's epidemic), the disease developed slowly during the spring and early summer on this host species and did not enter its exponential phase until late August, more than 1 month after it did so on soybean. On soybean, cases occurred very sporadically before the beginning of July, after which their number increased rapidly. Thus, while kudzu likely provides the initial inoculum for epidemics on soybean, the rapid increase in disease prevalence on kudzu toward the end of the season appears to be driven by inoculum produced on soybean. Of 112 soybean cases with growth stage data, only one occurred during vegetative crop development, whereas approximately 75% occurred at stage R6 (full seed) or later. The median nearest-neighbor distance of spread among cases was approximately 70 km in both years, with 10% of the distances below approximately 30 km and 10% above approximately 200 km. Considering only the epidemic on soybean, the disease expanded at an average rate of 8.8 and 10.4 km day(-1) in 2005 and 2006, respectively. These rates are at the lower range of those reported for the annual spread of tobacco blue mold from the Caribbean Basin through the southeastern United States.
Regional spread of soybean rust may be limited by the slow disease progress on kudzu during the first half of the year combined with the short period available for disease establishment on soybean during the vulnerable phase of host reproductive development, although low inoculum availability in 2005 and dry conditions in 2006 also may have reduced epidemic potential.
18
Holb IJ, Scherm H. Temporal dynamics of brown rot in different apple management systems and importance of dropped fruit for disease development. Phytopathology 2007; 97:1104-1111. [PMID: 18944175 DOI: 10.1094/phyto-97-9-1104] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [What about the content of this article? (0)] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 05/27/2023]
Abstract
Epidemic development of brown rot, caused by Monilinia fructigena, was monitored in integrated and organic apple orchards at two locations in eastern Hungary between 2002 and 2005 on three cultivars with early, midseason, and late ripening periods. Disease incidence and severity measures were affected significantly (P < 0.05) by management system (organic versus integrated) and cultivar, but there was no significant management system × cultivar interaction. Epidemics started 2 to 4 weeks earlier in organic orchards and on the early cv. Prima than in integrated orchards and on the late cv. Mutsu. Disease intensity increased markedly in the final 3 to 5 weeks before harvest and was considerably lower in integrated than in organic orchards. Final brown rot incidence on fruit in the tree was correlated with incidence on dropped fruit on the orchard floor (r > 0.75, P < 0.05), whereby the lag period from the appearance of the first symptomatic fruit on the ground to the occurrence of the first symptomatic fruit in the tree ranged from 2 weeks to 2 months, depending on the cultivar. The inflection point of the disease progress curve was attained first by fruit on the ground, followed successively by fruit in the lower, middle, and upper thirds of the tree canopy. This may indicate that dropped fruit that became infected early provided a source of inoculum for subsequent epidemics by serving as a bridge between sporulation from overwintered fruit mummies in the spring and the first fruit with sporulating lesions in the tree in midsummer. Removal of dropped fruit from the orchard floor resulted in significantly lower disease incidence on fruit in the tree on all cultivars; thus, drop removal may be useful as a brown rot management practice in apple orchards.
19
Chen H, Kaufmann C, Scherm H. Laboratory evaluation of flight performance of the plum curculio (Coleoptera: Curculionidae). J Econ Entomol 2006; 99:2065-71. [PMID: 17195674 DOI: 10.1603/0022-0493-99.6.2065] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [What about the content of this article? (0)] [Affiliation(s)] [Abstract] [MESH Headings] [Subscribe] [Scholar Register] [Indexed: 05/13/2023]
Abstract
Flight performance of laboratory-reared adults of the plum curculio, Conotrachelus nenuphar (Herbst) (Coleoptera: Curculionidae), was investigated under controlled conditions by using a flight mill system. Across all insects tested (n=198), median values of total distance traveled, total flight time, and maximum uninterrupted flight time were 122.7 m day(-1), 23.5 min day(-1), and 2.0 min, respectively. The latter result indicates that flight occurred primarily in short bursts. Although females had a significantly higher body mass than males, there were no significant differences in flight performance between the two sexes. Flight during the first 24-h test period (especially the first 6 h) was dominated by escape behavior, i.e., elevated levels of activity presumably associated with attempts by the insects to regain freedom of movement; during the second 24 h, flight activity was very limited throughout the late morning and afternoon, increased around sunset, and remained high during the night. All flight performance variables decreased linearly and significantly with insect age over the age range tested (2-16 d after emergence). Nutritional status also had a significant effect, whereby insects that had been provided with apples as a food source for 2 d after emergence showed considerably improved flight performance compared with those that had been given no food or only water during the same period. There was no significant effect of mating status on flight performance of male or female insects.
Affiliation(s)
- H Chen
- Department of Plant Pathology, University of Georgia, Athens, GA 30602, USA
20
Ojiambo PS, Scherm H. Biological and application-oriented factors influencing plant disease suppression by biological control: a meta-analytical review. Phytopathology 2006; 96:1168-74. [PMID: 18943952 DOI: 10.1094/phyto-96-1168] [Citation(s) in RCA: 13] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [What about the content of this article? (0)] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 05/09/2023]
Abstract
Studies evaluating the effectiveness of biological control in suppressing plant disease often report inconsistent results, highlighting the need to identify general factors that influence the success or failure of biological control in plant pathology. We conducted a quantitative synthesis of previously published research by applying meta-analysis to determine the overall effectiveness of biocontrol in relation to biological and application-oriented factors. For each of 149 entries (antagonist-disease combinations) from 53 reports published in Biological & Cultural Tests between 2000 and 2005, an effect size was calculated as the difference in disease intensity, expressed in standard deviation units, between the biocontrol treatment and its corresponding untreated control. Effect sizes ranged from -1.15 (i.e., disease strongly enhanced by application of the biocontrol agent) to 4.83 (strong disease suppression by the antagonist), with an overall weighted mean of 0.62, indicating moderate effectiveness on average. There were no significant (P > 0.05) differences in effect sizes between entries from studies carried out in the greenhouse versus the field, between those involving soilborne versus aerial diseases, or among those carried out under low, medium, or high disease pressure (expressed relative to the disease intensity in the untreated control). However, effect sizes were greater on annual than on perennial crops, regardless of whether the analysis was carried out for all entries (P = 0.0268) or for those involving only soilborne diseases (P = 0.0343). Effect sizes were not significantly different for entries utilizing fungal versus bacterial biocontrol agents or for those targeting fungal versus bacterial pathogens.
However, entries that used r-selected biological control agents (i.e., those having short generation times and producing large numbers of short-lived offspring) were more effective than those that applied antagonists that were not r-selected (P = 0.0312). Interestingly, effect sizes for entries that used Bacillus spp. as biological control agents were lower than for those that applied other antagonists (P = 0.0046 for all entries and P = 0.0114 for soilborne diseases). When only aerial diseases were considered, mean effect size was greater for entries that received one or two sprays than for those that received more than eight sprays of the biocontrol agent (P = 0.0002). This counterintuitive result may indicate that investigators often attempt unsuccessfully to compensate for anticipated poor performance in antagonist-disease combinations by making more applications.
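The effect-size computation described in this abstract can be sketched in a few lines. This is a hedged illustration: a standardized difference in disease intensity and a weighted mean across entries are standard meta-analytic quantities, but the exact estimator and weighting scheme used in the study are not specified here, and all numbers below are hypothetical.

```python
def effect_size(mean_control, mean_treated, pooled_sd):
    """Standardized difference in disease intensity; positive values
    indicate disease suppression by the biocontrol treatment."""
    return (mean_control - mean_treated) / pooled_sd

def weighted_mean(effects, weights):
    """Weighted mean effect size across entries."""
    return sum(e * w for e, w in zip(effects, weights)) / sum(weights)

# Hypothetical entries: (control mean, biocontrol mean, pooled SD,
# precision weight) for three antagonist-disease combinations.
entries = [(40.0, 25.0, 10.0, 8.0),   # strong suppression
           (30.0, 28.0, 8.0, 5.0),    # weak suppression
           (20.0, 26.0, 12.0, 3.0)]   # disease enhanced (negative d)

effects = [effect_size(c, t, sd) for c, t, sd, _ in entries]
weights = [w for *_, w in entries]
overall = weighted_mean(effects, weights)
```

With these made-up entries, the individual effect sizes are 1.5, 0.25, and -0.5, and the weighted mean lands near the moderate range, illustrating how a few strong entries can dominate the pooled estimate.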
21
Ojiambo PS, Scherm H. Optimum Sample Size for Determining Disease Severity and Defoliation Associated with Septoria Leaf Spot of Blueberry. Plant Dis 2006; 90:1209-1213. [PMID: 30781103 DOI: 10.1094/pd-90-1209] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [What about the content of this article? (0)] [Affiliation(s)] [Abstract] [Key Words] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/09/2023]
Abstract
In a 3-year field study, Premier rabbiteye blueberry plants were sampled at three hierarchical levels (leaf, shoot, and bush) to assess severity of Septoria leaf spot (caused by Septoria albopunctata) and incidence of defoliation. A positive linear relationship (R2 = 0.977, P < 0.0001, n = 2127) was observed between the number of spots per leaf and percent necrotic leaf area, both assessed on individual leaves in mid- to late October. For data summarized at the shoot level, percent defoliation increased nonlinearly (R2 = 0.729, P < 0.0001, n = 224) as disease severity increased, with a rapid rise to an upper limit showing little change in defoliation above 60 spots per leaf. Variance components were calculated for disease severity to partition total variation into variation among leaves per shoot, shoots per bush, and bushes within the field. In all cases, leaves per shoot and shoots per bush accounted for >90% of the total variation. Based on the variance components and linear cost functions (which considered the time required to assess each leaf and to select new shoots and bushes for assessment), the optimum sample size for assessing disease severity as number of spots per leaf (with an allowable variation of 20% around the mean) was 75 leaves, one each selected from three shoots per bush on 25 bushes (total time required for assessment: 36.1 min). For disease severity expressed as percent necrotic leaf area, the corresponding values were 144 leaves, two each sampled from three shoots per bush on 24 bushes (total time required: 21.7 min). Thus, given the strong correlation between the two disease variables demonstrated in this study, visual assessment of percent necrotic area was the more efficient method. With an allowable variation of 10% around the mean, a sample of 27 shoots from nine bushes was the optimum sample size for assessing defoliation across the 3 years.
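The allocation logic behind such sample-size optimization can be sketched with the classic two-stage formula, m_opt = sqrt((s²_within / s²_among) × (c_among / c_within)), which minimizes the variance of the mean for a fixed total cost. This is a generic textbook form; the variance components and per-unit assessment times below are hypothetical, not the study's estimates.

```python
import math

def optimum_units(var_within, var_among, cost_among, cost_within):
    """Optimum number of subunits per sampling unit in a nested
    design (two-stage allocation minimizing variance at fixed cost)."""
    return math.sqrt((var_within / var_among) * (cost_among / cost_within))

# Hypothetical variance components (spots per leaf) and assessment
# times in seconds; the study's actual estimates are not shown here.
var_leaf, var_shoot, var_bush = 60.0, 25.0, 5.0
t_leaf, t_shoot, t_bush = 5.0, 10.0, 30.0

# Leaves to assess per shoot, then shoots to select per bush;
# round fractional optima up to whole sampling units in practice.
leaves_per_shoot = optimum_units(var_leaf, var_shoot, t_shoot, t_leaf)
shoots_per_bush = optimum_units(var_shoot, var_bush, t_bush, t_shoot)
```

Because most variation sits at the leaf and shoot levels (as the abstract reports), the formula pushes sampling effort toward more leaves and shoots rather than more bushes.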
Affiliation(s)
- P S Ojiambo
- Department of Plant Pathology, University of Georgia, Athens 30602
- H Scherm
- Department of Plant Pathology, University of Georgia, Athens 30602
22
Fletcher J, Bender C, Budowle B, Cobb WT, Gold SE, Ishimaru CA, Luster D, Melcher U, Murch R, Scherm H, Seem RC, Sherwood JL, Sobral BW, Tolin SA. Plant pathogen forensics: capabilities, needs, and recommendations. Microbiol Mol Biol Rev 2006; 70:450-71. [PMID: 16760310 PMCID: PMC1489535 DOI: 10.1128/mmbr.00022-05] [Citation(s) in RCA: 121] [Impact Index Per Article: 6.7] [Reference Citation Analysis] [What about the content of this article? (0)] [Affiliation(s)] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/20/2022] Open
Abstract
A biological attack on U.S. crops, rangelands, or forests could reduce yield and quality, erode consumer confidence, affect economic health and the environment, and possibly impact human nutrition and international relations. Preparedness for a crop bioterror event requires a strong national security plan that includes steps for microbial forensics and criminal attribution. However, U.S. crop producers, consultants, and agricultural scientists have traditionally focused primarily on strategies for prevention and management of diseases introduced naturally or unintentionally rather than on responding appropriately to an intentional pathogen introduction. We assess currently available information, technologies, and resources that were developed originally to ensure plant health but also could be utilized for postintroduction plant pathogen forensics. Recommendations for prioritization of efforts and resource expenditures needed to enhance our plant pathogen forensics capabilities are presented.
Affiliation(s)
- J Fletcher
- Department of Entomology and Plant Pathology, Oklahoma State University, Stillwater, OK 74078, USA.
23
Cox KD, Scherm H, Riley MB. Characterization of Armillaria spp. from peach orchards in the southeastern United States using fatty acid methyl ester profiling. Mycol Res 2006; 110:414-22. [PMID: 16546364 DOI: 10.1016/j.mycres.2005.12.004] [Citation(s) in RCA: 8] [Impact Index Per Article: 0.4] [Reference Citation Analysis] [What about the content of this article? (0)] [Affiliation(s)] [Abstract] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/23/2005] [Revised: 12/12/2005] [Accepted: 12/28/2005] [Indexed: 10/24/2022]
Abstract
Limited information is available regarding the composition of cellular fatty acids in Armillaria and the extent to which fatty acid profiles can be used to characterize species in this genus. Fatty acid methyl ester (FAME) profiles generated from cultures of A. tabescens, A. mellea, and A. gallica consisted of 16-18 fatty acids ranging from 12-24 carbons in length, although some of these were present only in trace amounts. Across the three species, 9-cis,12-cis-octadecadienoic acid (9,12-C18:2), hexadecanoic acid (16:0), heneicosanoic acid (21:0), 9-cis-octadecenoic acid (9-C18:1), and 2-hydroxy-docosanoic acid (OH-22:0) were the most abundant fatty acids. FAME profiles from different thallus morphologies (mycelium, sclerotial crust, or rhizomorphs) displayed by cultures of A. gallica showed that thallus type had no significant effect on cellular fatty acid composition (P > 0.05), suggesting that FAME profiling is sufficiently robust for species differentiation despite potential differences in thallus morphology within and among species. The three Armillaria species included in this study could be distinguished from other lignicolous basidiomycete species commonly occurring on peach (Schizophyllum commune, Ganoderma lucidum, Stereum hirsutum, and Trametes versicolor) on the basis of FAME profiles using stepwise discriminant analysis (average squared canonical correlation = 0.953), whereby 9-C18:1, 9,12-C18:2, and 10-cis-hexadecenoic acid (10-C16:1) were the three strongest contributors. In a separate stepwise discriminant analysis, A. tabescens, A. mellea, and A. gallica were separated from one another based on their fatty acid profiles (average squared canonical correlation = 0.924), with 11-cis-octadecenoic acid (11-C18:1), 9-C18:1, and 2-hydroxy-hexadecanoic acid (OH-16:0) being most important for species separation. 
When fatty acids were extracted directly from mycelium dissected from naturally infected host tissue, the FAME-based discriminant functions developed in the preceding experiments classified all samples (n = 16) as A. tabescens; when applied to cultures derived from the same naturally infected samples, all unknowns were similarly classified as A. tabescens. Thus, FAME species classification of Armillaria unknowns directly from infected tissues may be feasible. Species designation of unknown Armillaria cultures by FAME analysis was identical to that indicated by IGS-RFLP classification with AluI.
Affiliation(s)
- K D Cox
- Department of Plant Pathology, University of Georgia, Athens, GA 30602, USA
24
Ojiambo PS, Scherm H, Brannen PM. Septoria Leaf Spot Reduces Flower Bud Set and Yield Potential of Rabbiteye and Southern Highbush Blueberries. Plant Dis 2006; 90:51-57. [PMID: 30786474 DOI: 10.1094/pd-90-0051] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.1] [Reference Citation Analysis] [What about the content of this article? (0)] [Affiliation(s)] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/09/2023]
Abstract
In field trials on Premier rabbiteye blueberry, individual shoots were selected and tagged in the fall of 2001, 2002, and 2003 to quantify the effects of Septoria leaf spot severity and disease-induced premature defoliation on flower bud set and return yield. Experiments were carried out similarly on Bluecrisp southern highbush blueberry using shoots tagged after fruit harvest in the summer of 2002 and 2003. Leaves on the distal 20-cm segments of these shoots were monitored for disease severity (number of spots per leaf) through the remainder of the growing season; at the same time, defoliation (expressed as the proportion of nodes with missing leaves) was recorded for each of the shoot segments. Flower bud set was assessed subsequently in winter or early spring, and berries were harvested as they matured the following summer to determine return yield. For both cultivars, higher flower bud numbers were more likely to occur on shoots with lower disease levels the previous fall (P ≤ 0.0462 based on a Kolmogorov-Smirnov test). The data further showed that flower bud set potential (i.e., the maximum number of buds on shoots within a given disease severity range) decreased linearly as disease severity increased (r2 ≥ 0.926, P ≤ 0.0005). Based on the slope of this relationship, flower bud set potential decreased by one bud per shoot as disease severity the previous fall increased by 18 and 12 spots per leaf for Premier and Bluecrisp, respectively. Relationships between yield and disease variables were similar to those of flower bud numbers and disease, except that the decrease in yield potential (i.e., the maximum fruit weight per shoot within a given disease severity range) was less gradual than for flower bud set potential. On Premier, yield potential dropped markedly and significantly as disease severity the previous fall exceeded about 50 to 60 spots per leaf on average (P < 0.0001 based on a Kruskal-Wallis test).
Evidence for such a threshold effect was weaker on Bluecrisp, presumably because of the lower number of data points for this cultivar combined with lower yields due to poor pollination.
Affiliation(s)
- P S Ojiambo
- Department of Plant Pathology, University of Georgia, Athens 30602
- H Scherm
- Department of Plant Pathology, University of Georgia, Athens 30602
- P M Brannen
- Department of Plant Pathology, University of Georgia, Athens 30602
25
Abstract
Septoria leaf spot, caused by Septoria albopunctata, is an important disease on blueberry in the southeastern United States, yet its epidemiology is largely unknown. Disease severity and dissemination of pycnidiospores were monitored from 2002 to 2004 in a planting of susceptible Premier rabbiteye blueberry to characterize the temporal progress of the disease and determine the effect of inoculum dynamics and selected leaf attributes on disease development. Disease onset was observed between late April and mid-June, followed by a rapid increase in disease severity until mid- to late September; thereafter, disease severity decreased until the end of the season due to abscission of severely infected leaves. A logistic model was fitted to disease severity data using nonlinear regression, and parameter estimates were used to compare the effects of leaf position on the shoot and shoot location in the canopy on disease progress. Based on this model, the highest absolute rate of disease increase and the highest upper asymptote of disease severity were predicted for leaves in intermediate positions on the shoot and for shoots in the lower canopy. Data collected with funnel spore samplers showed that splash-dispersed pycnidiospores of S. albopunctata were available throughout most of the period from April through late October. Final disease severity on individual leaves was more strongly correlated with cumulative spore numbers throughout the entire season (from leaf emergence to the end of the assessment period in November) than with cumulative spore numbers during shorter periods around the time of leaf emergence; this suggests that infection is not limited to young, expanding leaves, but rather that leaves at all developmental stages can become infected by S. albopunctata season-long.
Disease incidence on leaves of potted trap plants exposed to natural inoculum in the field during rain events in 2003 and 2004 was >70.0% irrespective of leaf developmental stage at the time of exposure. Taken together, the results of this study indicate that inoculum of S. albopunctata is present throughout most of the growing season and that infection can occur season-long on leaves of any age, giving rise to a polycyclic epidemic.
Affiliation(s)
- P S Ojiambo
- Department of Plant Pathology, University of Georgia, Athens 30602
- H Scherm
- Department of Plant Pathology, University of Georgia, Athens 30602
26
Ojiambo PS, Scherm H. Survival analysis of time to abscission of blueberry leaves affected by septoria leaf spot. Phytopathology 2005; 95:108-113. [PMID: 18943843 DOI: 10.1094/phyto-95-0108] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [What about the content of this article? (0)] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 05/27/2023]
Abstract
In the southeastern United States, Septoria leaf spot, caused by Septoria albopunctata, can result in premature defoliation of blueberry plants during summer and fall, thereby reducing yield potential for the following year. The effects of disease severity and leaf attributes (leaf age and leaf location in the canopy) on the dynamics (timing and extent) of defoliation were quantified in field plots of Premier rabbiteye blueberry (Vaccinium ashei) in 2002 and 2003. In each year, 50 shoots were selected for assessment in early spring, and all leaves on these shoots (n = 410 and 542 in 2002 and 2003, respectively) were monitored individually for disease progress and time of abscission at 3- to 10-day intervals throughout the season. In both years, disease progress was characterized by an exponential increase in disease severity up to late September, followed by a decline toward the end of the assessment period in late November. Defoliation was sporadic up to late August, followed by more rapid and sustained levels of leaf loss. Abscission of severely infected leaves could explain the decline in disease severity toward the end of the season. Final disease severity (i.e., disease severity on the last assessment date before leaf drop) was highest for leaves that abscised early and lowest for leaves that had not abscised by the end of the assessment period. Survival analysis revealed that older leaves (located on the lower halves of shoots) and leaves with high levels of disease (≥5 spots/leaf at the time of fruit harvest in mid-June) abscised significantly (P < 0.0001) earlier than younger leaves and leaves with lower disease severity. Relative to their respective reference groups, mean times to abscission were approximately 2 weeks shorter for the older leaf group and approximately 3 weeks shorter for the leaf group with high disease severity.
When an accelerated failure time model was fitted to the data, the resulting parameter estimates indicated that each additional leaf spot present at harvest accelerated time to leaf abscission (expressed using late August as a starting point) by 1.9 and 4.5% in 2002 and 2003, respectively. Leaf location in upper or lower portions of the canopy had no significant effect on time to abscission (P > 0.05).
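The percent-acceleration interpretation follows directly from the log-linear form of the accelerated failure time model. A minimal pure-Python sketch of that arithmetic, with coefficients back-calculated from the 1.9% and 4.5% figures above (the function name and example leaf are mine, not the paper's):

```python
import math

def aft_time_ratio(beta: float, n_spots: int) -> float:
    """Time ratio under a log-linear accelerated failure time (AFT) model:
    T(x) = T0 * exp(beta * x), so each additional unit of the covariate x
    multiplies the expected time-to-event by exp(beta)."""
    return math.exp(beta * n_spots)

# Back-calculated from the abstract: each additional leaf spot at harvest
# shortened time to abscission by 1.9% (2002) and 4.5% (2003), giving
# AFT coefficients beta = ln(1 - 0.019) and ln(1 - 0.045).
beta_2002 = math.log(1 - 0.019)
beta_2003 = math.log(1 - 0.045)

# A hypothetical leaf with 10 spots in 2003 would be expected to abscise
# in about 63% of the disease-free time from the late-August start point.
ratio = aft_time_ratio(beta_2003, 10)
```

Because the model is multiplicative, per-spot percentage reductions compound rather than add, which is why ten spots shorten time to abscission by roughly 37% rather than 45%.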
27
Abstract
Data on the occurrence and timing of discrete events such as spore germination, disease onset, or propagule death are recorded commonly in epidemiological studies. When analyzing such "time-to-event" data, survival analysis is superior to conventional statistical techniques because it can accommodate censored observations, i.e., cases in which the event has not occurred by the end of the study. Central to survival analysis are two mathematical functions, the survivor function, which describes the probability that an individual will "survive" (i.e., that the event will not occur) until a given point in time, and the hazard function, which gives the instantaneous risk that the event will occur at that time, given that it has not occurred previously. These functions can be compared among two or more groups using chi-square-based test statistics. The effects of discrete or continuous covariates on survival times can be quantified with two types of models, the accelerated failure time model and the proportional hazards model. When applied to longitudinal data on the timing of defoliation of individual blueberry leaves in the field, analysis with the accelerated failure time model revealed a significantly (P < 0.0001) increased defoliation risk due to Septoria leaf spot, caused by Septoria albopunctata. Defoliation occurred earlier for lower leaves than for upper leaves, but this effect was confounded in part with increased disease severity on lower leaves.
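The survivor function described above is typically estimated nonparametrically. A self-contained sketch of the Kaplan-Meier estimator, using hypothetical defoliation times rather than the paper's data:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier estimate of the survivor function S(t) = P(T > t).
    events[i] is True if the event (e.g. leaf abscission) was observed at
    times[i], and False if the observation was right-censored there."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    s, curve = 1.0, []
    i = 0
    while i < len(data):
        t = data[i][0]
        d = sum(1 for tt, e in data if tt == t and e)    # events at t
        removed = sum(1 for tt, _ in data if tt == t)    # leave the risk set
        if d > 0:
            s *= 1 - d / n_at_risk
            curve.append((t, s))
        n_at_risk -= removed
        i += removed
    return curve

# Hypothetical abscission times (weeks); False = leaf still attached
# (censored) when monitoring ended.
times = [3, 5, 5, 8, 10, 10]
events = [True, True, False, True, False, False]
km = kaplan_meier(times, events)
# km is [(3, 0.833...), (5, 0.666...), (8, 0.444...)]
```

Note how the censored leaf at week 5 drops out of the risk set without forcing the survival curve down, which is exactly the property that makes survival analysis preferable to discarding or misclassifying incomplete observations.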
28
Abstract
Leaf spots caused by fungal pathogens or abiotic factors can be prevalent on southern blueberries after harvest during the summer and fall, yet little is known about how they affect physiological processes that determine yield potential for the following year. In this study, we measured CO2 assimilation and leaf conductance on field-grown blueberry plants affected by Septoria leaf spot (caused by Septoria albopunctata) or by edema-like abiotic leaf blotching. Net assimilation rate (NAR) on healthy leaves varied between 6.9 and 12.4 μmol m-2 s-1 across cultivars and measurement dates. Infection by S. albopunctata had a significant negative effect on photosynthesis, with NAR decreasing exponentially as disease severity increased (R2 ≥0.726, P < 0.0001). NAR was reduced by approximately one-half at 20% disease severity, and values approached zero for leaves with >50% necrotic leaf area. There was a positive, linear correlation between NAR and leaf conductance (R2 ≥ 0.622, P < 0.0001), suggesting that the disease may have reduced photosynthesis via decreased CO2 diffusion into affected leaves. Estimates of virtual lesion size associated with infection by S. albopunctata ranged from 2.8 to 3.1, indicating that the leaf area in which photosynthesis was impaired was about three times as large as the area covered by necrosis. For leaves afflicted by edema-like damage, there also was a significant negative relationship between NAR and affected leaf area, but the scatter about the regression was more pronounced than in the NAR-disease severity relationships for S. albopunctata (R2 = 0.548, P < 0.0001). No significant correlation was observed between leaf conductance and affected area on these leaves (P = 0.145), and the virtual lesion size associated with abiotic damage was significantly smaller than that caused by S. albopunctata. 
Adequate carbohydrate supply during the fall is critical for optimal flower bud set in blueberry; therefore, these results document the potential for marked yield losses due to biotic and abiotic leaf spots.
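The virtual lesion estimates of 2.8 to 3.1 are consistent with the classic Bastiaans formulation Px/P0 = (1 - x)^beta, in which beta is the ratio of photosynthetically impaired to visibly necrotic leaf area. A sketch assuming that parameterization (the data points are illustrative, not the paper's measurements):

```python
def relative_nar(severity: float, beta: float) -> float:
    """Bastiaans-style virtual-lesion model: Px/P0 = (1 - x)**beta, where
    x is visible disease severity (0-1) and beta is the ratio of virtual
    (photosynthetically impaired) to visible lesion area."""
    return (1.0 - severity) ** beta

# With beta = 3 (within the 2.8-3.1 range reported for S. albopunctata),
# 20% visible severity leaves roughly half the photosynthesis intact,
# matching the halving of NAR at 20% severity described above.
half = relative_nar(0.20, 3.0)   # 0.8**3 = 0.512
```

The same parameterization also reproduces the near-zero NAR at >50% necrosis, since 0.5 cubed is only 0.125 of the healthy rate.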
Affiliation(s)
- M W van Iersel
- Department of Horticulture, University of Georgia, Athens 30602
29
Su H, van Bruggen AHC, Subbarao KV, Scherm H. Sporulation of Bremia lactucae Affected by Temperature, Relative Humidity, and Wind in Controlled Conditions. Phytopathology 2004; 94:396-401. [PMID: 18944116 DOI: 10.1094/phyto.2004.94.4.396] [Citation(s) in RCA: 7] [Impact Index Per Article: 0.4]
Abstract
The effects of temperature (5 to 25°C), relative humidity (81 to 100%), wind speed (0 to 1.0 m s-1), and their interactions on sporulation of Bremia lactucae on lettuce cotyledons were investigated in controlled conditions. Sporulation was affected significantly (P < 0.0001) by temperature, with an optimum at 15°C, and by relative humidity (RH), with sporulation increasing markedly at RH ≥90%. There was a significant effect of exposure time in relation to temperature (P = 0.0007) but not to RH. In separate experiments, both RH and wind speed significantly (P < 0.0001) affected the number of cotyledons with sporulation and the number of sporangia produced per cotyledon. No sporulation was observed at wind speeds of >0.5 m s-1, regardless of RH. In still air, the number of sporangiophores produced per cotyledon increased linearly with RH from 81 to 100% (P = 0.0001, r = 0.98). Histological observations indicated that sporulation may be affected by stomatal aperture in response to RH, as more closed stomata and correspondingly fewer sporangiophores were present at lower RH. These results are important for understanding the mechanism of RH effects on sporulation and for predicting conditions conducive to downy mildew development.
30
Lan Z, Scherm H. Moisture Sources in Relation to Conidial Dissemination and Infection by Cladosporium carpophilum Within Peach Canopies. Phytopathology 2003; 93:1581-1586. [PMID: 18943623 DOI: 10.1094/phyto.2003.93.12.1581] [Citation(s) in RCA: 7] [Impact Index Per Article: 0.3]
Abstract
Cladosporium carpophilum, the causal agent of peach scab, overwinters in lesions on 1-year-old twigs, from which conidia infect the developing fruit during spring and early summer. Twig lesions constitute the sole source of initial inoculum; therefore, the mode of dissemination of conidia from such lesions to the fruit is of considerable interest. In a 4-year study, we determined the relative importance of air- versus water-borne conidia and their interaction with different fruit wetness sources (splash, twig runoff, and dew) in a peach orchard with areas that had been treated or not treated with fungicide the previous year. The rarity of scab twig lesions in the previously sprayed trees implied that fruit infection in these trees would occur primarily by airborne conidia from unsprayed trees nearby (located within the same tree row or the adjacent row). In the unsprayed areas, additional infections could occur by short-distance waterborne dissemination of conidia from locally abundant twig lesions via splashing or runoff. Beginning at calyx fall, individual fruit were protected from splash by rain shields, protected from runoff by cotton wicks placed proximal to the peduncle, or left untreated. Rain shields were adjustable, allowing rain or dew to be excluded selectively. Various combinations of the shield and wick treatments were implemented in the previously sprayed and unsprayed areas, and statistical comparison of fruit scab severity between individual treatments by linear contrasts allowed us to untangle the relative contributions of the various sources of inoculum and fruit wetness. Results showed that aerial dissemination of conidia contributed little to fruit scab development, even in the presence of fruit surface wetness caused by splashing, runoff, or dew. In contrast, waterborne conidia contributed considerably and significantly (P < 0.0001) to disease development.
This was due primarily to the importance of splash in disseminating conidia from twig lesions to the fruit, given that exclusion of splashing via rain shields decreased disease severity by >90%. Runoff water from the twig to the fruit via the peduncle also contributed to scab development, as evidenced by the fact that exclusion of runoff by cotton wicks reduced disease severity by 31.6 to 44.9%; however, this effect was not always statistically significant. The exclusion of dew did not reduce scab severity (P > 0.4), suggesting that it played a limited role in infection in the presence of other fruit wetness sources.
31
Abstract
Risks to peach production from scab (caused by Cladosporium carpophilum) and plum curculio (Conotrachelus nenuphar), two key pests in the southeastern United States, are high until 2 months past petal fall and then decrease during midseason. This suggests that reduced-input pesticide strategies may effectively control both pests during the latter period. In this study, we evaluated midseason pesticide applications according to an alternate-row middle (ARM) spray program in which sprays were applied only to every other tree row while reducing tractor speed and keeping application intervals unchanged relative to conventional spraying of both sides of the trees. In a 2-year trial in a research orchard, conventional sprays of fungicide (primarily sulfur) and insecticide (primarily phosmet) were applied at 10- to 14-day intervals until first cover, followed by continued conventional sprays of fungicide and insecticide (standard), conventional sprays of one pesticide together with ARM sprays of the other pesticide, or combined ARM sprays of both pesticides. Schedules with midseason ARM sprays of both pesticides also were evaluated in two commercial orchards in 2 years. In all experiments, plots receiving combined ARM sprays were equivalent to the standard in fruit quality and control of scab and plum curculio. Combined ARM spraying resulted in lower environmental nontarget effects (as estimated by the Environmental Impact Quotient) and reduced application time by 25 to 33% for each midseason spray and 12.5 to 18.5% for the entire period from petal fall to the preharvest interval.
Affiliation(s)
- Z Lan
- Department of Plant Pathology, University of Georgia, Athens 30602
- H Scherm
- Department of Plant Pathology, University of Georgia, Athens 30602
- D L Horton
- Department of Entomology, University of Georgia, Athens 30602
32
Scherm H, Savelle AT. Epidemic Development of Hawthorn Leaf Blight (Monilinia johnsonii) on Mayhaw (Crataegus aestivalis and C. opaca) in Georgia. Plant Dis 2003; 87:539-543. [PMID: 30812955 DOI: 10.1094/pdis.2003.87.5.539] [Citation(s) in RCA: 0] [Impact Index Per Article: 0]
Abstract
Mayhaws are small trees and shrubs in the hawthorn genus, Crataegus. They are native to the southern United States, where their fruit is highly valued for use in jellies and preserves. Since 1997, symptoms of hawthorn leaf blight, caused by Monilinia johnsonii, have been observed in mayhaw orchards in southwestern Georgia. We studied epidemic development of the disease in a mixed planting of Crataegus aestivalis (eastern mayhaw) and C. opaca (western mayhaw) between 2000 and 2002. Apothecia of M. johnsonii were first observed in early to mid-February on overwintered, mummified fruit of C. aestivalis; no apothecia were detected in plots underneath C. opaca trees. Both mayhaw species exhibited moderate to severe leaf blighting beginning in early March, although some genotypes within each species apparently escaped primary infection via delayed leaf bud break or a slower rate of leaf expansion. On a per-tree basis, leaf blight incidence was positively correlated with mean leaf length during the period when apothecia were most numerous (r = 0.7225, P = 0.0003, n = 20). Fruit mummification, which results from secondary infection of open flowers by conidia, was widespread by late March to early April and was significantly (P < 0.05) more severe on C. aestivalis, most likely because trees of this species were at an earlier bloom stage when conidia-bearing blighted leaves were first observed. By contrast, C. opaca advanced through bloom earlier, thereby partly escaping secondary infection. On a per-tree basis, there was no relationship between incidence levels of leaf blight and fruit infection for either species; indeed, some trees with the lowest incidence of leaf blight had the greatest incidence of fruit mummification and vice versa. Thus, in a mixed planting of different mayhaw genotypes, conidia appear to be dispersed readily from heavily blighted trees, leading to high levels of fruit infection even in trees with negligible incidence of leaf blight. 
There are genotypes within both mayhaw species that almost completely avoid either primary infection or secondary infection; therefore, planting such genotypes in pure stands may aid in minimizing losses due to the disease.
Affiliation(s)
- H Scherm
- Department of Plant Pathology, University of Georgia, Athens 30602
- A T Savelle
- Department of Plant Pathology, University of Georgia, Athens 30602
33
Ngugi HK, Scherm H, Lehman JS. Relationships Between Blueberry Flower Age, Pollination, and Conidial Infection by Monilinia vaccinii-corymbosi. Phytopathology 2002; 92:1104-1109. [PMID: 18944221 DOI: 10.1094/phyto.2002.92.10.1104] [Citation(s) in RCA: 6] [Impact Index Per Article: 0.3]
Abstract
Monilinia vaccinii-corymbosi infects open blueberry flowers via the gynoecial pathway, leading to mummification of the developing fruit. To determine the effect of flower age on infection, stigmata were inoculated with conidia of M. vaccinii-corymbosi between 0 and 5 days after anthesis, fungal growth rates through the stylar canal were measured in detached flowers in the laboratory, and fruit disease incidence was determined in plants grown in the greenhouse. Hyphal growth rates were greatest in flowers inoculated on the day of anthesis, declined linearly with increasing flower age at inoculation (r = 0.921; P < 0.0001; n = 12), and were unaffected by the presence or absence of pollen applied at the time of inoculation. In greenhouse-grown plants, the percentage of infected fruit decreased exponentially with increasing flower age at inoculation (R = 0.878; P = 0.0057; n = 10), with disease incidence ranging from 76.4% for flowers inoculated on the day of anthesis to 15.5% for those inoculated 4 days later. Fruit disease incidence in the greenhouse was linearly correlated with hyphal growth rates in detached flowers (r = 0.985; P < 0.0001; n = 9), justifying the use of detached flowers when investigating gynoecial infection by M. vaccinii-corymbosi. In separate experiments, the effects of timing and sequence of pollination and inoculation on hyphal growth rates through the stylar canal and on disease incidence were investigated. Application of pollen to detached flowers 1 or 2 days before inoculation reduced hyphal growth rates by between 14.0 and 42.9% compared with flowers that received pollen and conidia simultaneously. Similarly, reductions in fruit disease incidence by between 9.5 and 18.3% were observed on greenhouse-grown plants for pollination-to-inoculation intervals ranging from 1 to 4 days. These results document that newly opened flowers are most susceptible to infection by M. vaccinii-corymbosi and that fruit disease incidence is reduced if pollination occurs at least 1 day before inoculation. Strategies that lead to early pollination of newly opened flowers may be useful for managing mummy berry disease in the field.
34
Ngugi HK, Scherm H, Nesmith DS. Distribution of Pseudosclerotia of Monilinia vaccinii-corymbosi and Risk of Apothecial Emergence Following Mechanical Cultivation. Phytopathology 2002; 92:877-883. [PMID: 18942967 DOI: 10.1094/phyto.2002.92.8.877] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.0]
Abstract
Pseudosclerotia (infected, mummified fruit) of Monilinia vaccinii-corymbosi overwinter on the orchard floor and germinate to produce apothecia in early spring, providing the only source of primary inoculum for mummy berry disease of blueberry. Three experiments were carried out to develop a model for the relative efficacy of mechanical cultivation in reducing the risk associated with primary inoculum. In the first experiment, apothecial emergence from pseudosclerotia buried 0, 1.5, 3, 6, and 10 cm below the soil surface was monitored to determine the critical depth necessary to inhibit emergence. No apothecia emerged from pseudosclerotia buried at depths of ≥3 cm, and the critical depth of burial was determined at 2.6 cm by regression analysis. In the second experiment, pseudosclerotia or plastic beads (used as surrogates for pseudosclerotia) were placed on the soil surface of experimental plots before cultivation with an in-row rotary cultivator, a disc harrow, or a rotary cultivator, with each implement operated in a single pass. Vertical distribution profiles of pseudosclerotia or beads in the topsoil were characterized after excavation with a custom-built sampling device. The proportion of pseudosclerotia placed below the critical depth of 2.6 cm was 20.9, 52.6, and 78.6% for the in-row rotary cultivator, the disc harrow, and the rotary cultivator, respectively. For all three implements, vertical distribution profiles of pseudosclerotia and plastic beads were very similar, allowing the latter to be used in subsequent experiments in commercial fields. In the third experiment, two blueberry plantings were surveyed to determine the horizontal distribution of pseudosclerotia on the orchard floor with distance from the crowns of the plants. The greatest frequency of pseudosclerotia occurred between 30 and 40 cm from the plants.
Based on measurements of the distance from plants within which different implements can operate, the proportion of pseudosclerotia accessible by cultivation ranged from 58.7% for the disc harrow to 87.2% for the in-row rotary cultivator. Taken together, results from the three experiments indicated that cultivation with a single implement can reduce risk of apothecial emergence by about 50%. More effective risk reductions may be obtained by combining implements that result in deep burial of pseudosclerotia with those that have access to pseudosclerotia near the plants. This was demonstrated by a commercial cultivation method that utilized three passes of different implements and resulted in extensive reshaping of plant beds, placing 88.2% of beads below the critical depth of 2.6 cm.
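One way to combine the accessibility and burial figures from the three experiments is as a product of two fractions. This framework, and the depth profile below, are illustrative only, not the paper's model or data:

```python
def risk_reduction(profile, critical_depth=2.6, accessible=1.0):
    """Fraction of primary-inoculum risk removed by one cultivation pass:
    the share of pseudosclerotia the implement can reach, times the share
    of those it buries at or below the critical depth. `profile` is a
    list of (depth_cm, proportion) pairs summing to 1."""
    buried = sum(p for depth, p in profile if depth >= critical_depth)
    return accessible * buried

# Hypothetical post-cultivation profile: an implement reaching 58.7% of
# pseudosclerotia and burying 52.6% of those below 2.6 cm removes
# roughly 31% of the overall emergence risk.
profile = [(0.0, 0.20), (1.0, 0.274), (3.0, 0.40), (6.0, 0.126)]
r = risk_reduction(profile, critical_depth=2.6, accessible=0.587)
```

The multiplicative structure makes the paper's closing point concrete: pairing an implement with high burial efficiency with one that reaches the pseudosclerotia near the crowns raises both factors at once.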
35
Wu BM, van Bruggen AHC, Subbarao KV, Scherm H. Incorporation of temperature and solar radiation thresholds to modify a lettuce downy mildew warning system. Phytopathology 2002; 92:631-636. [PMID: 18944260 DOI: 10.1094/phyto.2002.92.6.631] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.1]
Abstract
The effect of temperature on infection of lettuce by Bremia lactucae was investigated in controlled environment studies and in the field. In controlled conditions, lettuce seedlings inoculated with B. lactucae were incubated at 15, 20, 25, or 30°C during a 4-h wet period immediately after inoculation or at the same temperatures during an 8-h dry period after the 4-h postinoculation wet period at 15°C. High temperatures during wet and dry periods reduced subsequent disease incidence. Historical data from field studies in 1991 and 1992, in which days with or without infection had been identified, were analyzed by comparing average air temperatures during 0600 to 1000 and 1000 to 1400 Pacific standard time (PST) between the two groups of days. Days without infection had significantly higher temperatures (mean 21.4°C) than days with infection (20.3°C) during 1000 to 1400 PST (P < 0.01) but not during 0600 to 1000 PST. Therefore, temperature thresholds of 20 and 22°C for the 3-h wet period after sunrise and the subsequent 4-h postpenetration period, respectively, were added to a previously developed disease warning system that predicts infection when morning leaf wetness lasts ≥4 h from 0600 PST. No infection was assumed to occur if average temperature during these periods exceeded the thresholds. Based on nonlinear regression and receiver operating characteristic curve analysis, the leaf wetness threshold of the previous warning system was also modified to ≥3-h leaf wetness (≥0900 PST). Furthermore, by comparing solar radiation on days with infection and without infection, we determined that high solar radiation during 0500 to 0600 PST in conjunction with leaf wetness ending between 0900 and 1000 PST was associated with downy mildew infection.
Therefore, instead of starting at 0600 PST, the calculation of the 3-h morning leaf wetness period was modified to start after sunrise, defined as the hour when measured solar radiation exceeded 8 W m-2 (or 41 μmol m-2 s-1 for photon flux density). The modified warning system was compared with the previously developed system using historical weather and downy mildew data collected in coastal California. The modified system was more conservative when disease potential was high and recommended fewer fungicide applications when conditions were not conducive to downy mildew development.
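The core of the modified warning rule can be sketched as a single decision function. The thresholds are those stated in the abstract, but the interface is a simplification I have chosen for illustration (sunrise detection and wetness timing are collapsed into one precomputed input):

```python
def predict_infection(wet_hours_after_sunrise: float,
                      temp_wet_c: float,
                      temp_post_c: float) -> bool:
    """Sketch of the modified downy mildew warning rule: infection is
    predicted when morning leaf wetness lasts >= 3 h after sunrise,
    unless mean temperature exceeds 20 C during the wet period or
    22 C during the subsequent 4-h post-penetration period."""
    if wet_hours_after_sunrise < 3.0:
        return False            # leaf wetness requirement not met
    if temp_wet_c > 20.0 or temp_post_c > 22.0:
        return False            # a temperature threshold was exceeded
    return True
```

The temperature tests act purely as overrides, which is what makes the modified system recommend fewer sprays on warm days without changing its behavior when conditions are cool and wet.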
36
Cox KD, Scherm H. Gradients of Primary and Secondary Infection by Monilinia vaccinii-corymbosi from Point Sources of Ascospores and Conidia. Plant Dis 2001; 85:955-959. [PMID: 30823109 DOI: 10.1094/pdis.2001.85.9.955] [Citation(s) in RCA: 10] [Impact Index Per Article: 0.4]
Abstract
Spread of mummy berry disease of blueberry, caused by Monilinia vaccinii-corymbosi, occurs in two discrete monocycles; primary infection by ascospores results in shoot blight, while secondary infection of open flowers by conidia leads to fruit mummification. Gradients of primary and secondary infection from point sources of ascospores and conidia placed in separate plant rows were recorded in each of 2 years at two sites with no history of the disease. Primary infection gradients were longer downwind than upwind, with 95% of blighted shoots occurring within 30 m of the ascospore point source. This observation, along with a positive correlation (r = 0.852, P = 0.0072, n = 8) between the distance over which shoot blight occurred and wind speed parallel to the row, supports the role of wind as a key factor in ascospore dispersal. By contrast, secondary infection gradients were shorter downwind and longer upwind, with 95% of infected fruit occurring within 20 m of the conidial point source. The shorter downwind spread of secondary infection, along with a nonsignificant correlation (r = -0.649, P = 0.0812, n = 8) between the distance over which infected fruit occurred and wind speed, suggests that factors other than wind are important in the transfer of conidia to open flowers; this could include conidial dispersal by bee pollinators, which have been shown previously to forage primarily upwind. Exponential and Pareto cumulative distribution functions were fitted to cumulative counts of blighted shoots and infected fruit to model spread of primary and secondary infection. The Pareto model, which is characterized by a longer tail and predicts more infection farther from the inoculum source, better fits the observed disease gradients in most cases.
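The contrast between the two candidate gradient models comes down to tail behavior, which a short sketch can make concrete. The parameter values below are calibrated only to echo the abstract's 95%-within-30-m observation (with an assumed Pareto lower bound of 1 m); they are not the paper's fitted estimates:

```python
import math

def exponential_cdf(x: float, rate: float) -> float:
    """Exponential model of cumulative infection with distance."""
    return 1.0 - math.exp(-rate * x)

def pareto_cdf(x: float, xmin: float, shape: float) -> float:
    """Pareto model; its power-law tail decays more slowly."""
    return 0.0 if x < xmin else 1.0 - (xmin / x) ** shape

# Calibrate both so 95% of blighted shoots fall within 30 m of the
# point source (illustrative, with xmin = 1 m for the Pareto).
rate = -math.log(0.05) / 30.0
shape = math.log(0.05) / math.log(1.0 / 30.0)

# At 60 m the Pareto still predicts about 2.7% of infections, versus
# about 0.25% for the exponential: the longer tail described above.
tail_exp = 1.0 - exponential_cdf(60.0, rate)
tail_par = 1.0 - pareto_cdf(60.0, 1.0, shape)
```

Even when both models agree on where 95% of infections fall, the Pareto predicts roughly ten times as much infection at twice that distance, which is why tail fit, not overall fit, separates the two.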
Affiliation(s)
- K D Cox
- Department of Plant Pathology, University of Georgia, Athens 30602
- H Scherm
- Department of Plant Pathology, University of Georgia, Athens 30602
37
Cox KD, Scherm H. Oversummer Survival of Monilinia vaccinii-corymbosi in Relation to Pseudosclerotial Maturity and Soil Surface Environment. Plant Dis 2001; 85:723-730. [PMID: 30823196 DOI: 10.1094/pdis.2001.85.7.723] [Citation(s) in RCA: 7] [Impact Index Per Article: 0.3]
Abstract
Pseudosclerotia (infected, mummified fruit) on the orchard floor act as oversummering and overwintering structures and the sole source of primary inoculum of Monilinia vaccinii-corymbosi, the causal agent of mummy berry disease of blueberry. Survival of pseudosclerotia may be affected by their maturity (degree of stromatization), which can vary considerably at the time of fruit abscission in early summer, and by variations in the soil surface environment. From July through October in 2 years, survival of pseudosclerotia of varying initial maturity (expressed as the proportion of fruit containing mature, melanized entostromata; immature, nonmelanized entostromata; or undifferentiated mycelia) was investigated in the laboratory relative to soil surface temperature and soil moisture content and in the field in relation to shading (full sun versus 50% shade) and ground cover (bare soil versus grass). In the laboratory, oversummer survival, expressed as the percentage of intact pseudosclerotia at the end of the experiment, was higher for cool soil temperatures (approximately 15°C), soils drier than field capacity, and pseudosclerotia containing mature entostromata. In the field, survival was related solely to initial maturity of pseudosclerotia and was highest for pseudosclerotia containing mature entostromata. Shading or grass ground cover did not significantly (P > 0.05) affect oversummer survival, presumably because they did not greatly modify soil temperature or soil moisture. When individual, intact pseudosclerotia were tested for viability using fluorescein diacetate staining, a linear relationship (r = 0.982, P < 0.0001, n = 90) between viable and intact pseudosclerotia was observed, supporting the use of the percentage of intact pseudosclerotia as a measure of oversummer survival.
Affiliation(s)
- K D Cox
- Department of Plant Pathology, University of Georgia, Athens 30602
- H Scherm
- Department of Plant Pathology, University of Georgia, Athens 30602
38
Abstract
The effectiveness of reduced fungicide programs, using either extended application intervals or alternate-row middle (ARM) spraying of wettable sulfur or captan, on infection of peach fruit by Cladosporium carpophilum was investigated in a 2-year study in Georgia. Fungicide reductions focused on the midseason cover spray period when scab pressure is typically reduced and when growers would be most likely to adopt reduced spray programs because of the potential for fewer insecticide applications at the same time. In an experimental orchard, sulfur was applied at calyx split and calyx fall, followed by another application of sulfur or captan at first cover. Subsequent midseason applications consisted of sulfur at 12- to 14-day intervals (standard); sulfur at extended 24- to 28-day intervals; or either sulfur or captan applied via ARM spraying at 12- to 14-day intervals at reduced sprayer speed. Plots without midseason sprays after first cover also were included. Fruit scab severity was reduced by all fungicide programs compared with the untreated control. Disease severity with sulfur applied at extended intervals and with ARM spraying of sulfur or captan was not significantly different from that of the standard (P > 0.05) in both years, suggesting that application intensity during midseason can be reduced without compromising scab control. By contrast, plots that did not receive any mid-season sprays after first cover had significantly more disease. Reduced midseason applications of sulfur were further evaluated in two commercial orchards. In one orchard, fruit scab control achieved with extended-interval or ARM spraying during midseason was not significantly different from that of the grower standard. In the second orchard, higher disease severity resulted from midseason ARM applications compared with the standard, presumably because of the longer (14- to 18-day) spray interval used by the cooperating grower for ARM spraying. 
Reduced midseason fungicide programs did not lead to an increased carryover of inoculum as determined by conidial counts on overwintered twigs at petal fall in the following year.
Affiliation(s)
- H Scherm
- Department of Plant Pathology, University of Georgia, Athens 30602
- A T Savelle
- Department of Plant Pathology, University of Georgia, Athens 30602
39
Copes WE, Scherm H, Ware GO. Sequential Sampling to Assess the Incidence of Infection by Monilinia vaccinii-corymbosi in Mechanically Harvested Rabbiteye Blueberry Fruit. Phytopathology 2001; 91:348-353. [PMID: 18943846 DOI: 10.1094/phyto.2001.91.4.348] [Citation(s) in RCA: 0] [Impact Index Per Article: 0]
Abstract
Blueberry fruit infected by Monilinia vaccinii-corymbosi, the causal agent of mummy berry disease, are unsuitable for use in processed food products. Fruit shipments that exceed a disease incidence threshold of 0.5% are redirected to alternative markets with substantial reductions in economic return to the producer. Because of this low tolerance, a sampling procedure with defined statistical properties is needed to determine disease incidence in the packinghouse. In this study, a sequential sampling plan was developed based on counts and dispersion patterns of infected fruit in 23 loads of mechanically harvested rabbiteye blueberries. Each load was sampled 20 to 100 times, with each sample containing 550 cm³ of fruit. Various dispersion statistics (k of the negative binomial distribution, Lloyd's index of patchiness, and Iwao's b) were computed, all of which suggested aggregation of infected fruit. Because k was variable across loads, Iwao's regression procedure, which does not assume a single frequency distribution with fixed parameters describing the counts of infected fruit, was used to develop upper and lower stop lines for sequential sampling. For alpha = 0.05 and assuming a total of 250 fruit per 550-cm³ sample, the resulting sampling plan would require only one sample to conclude that a load exceeds the threshold if the number of infected fruit in that sample is greater than four. A minimum of six samples would be needed to conclude that disease incidence in a load is below the threshold if the cumulative total of infected fruit in these samples is zero. Resampling analysis showed that most fruit loads could be classified reliably with <10 samples per load; for loads with a disease incidence very close to the 0.5% threshold, <50 samples were needed on average. Stop lines for sequential sampling for different fruit size classes are presented.
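The stop-line procedure itself is a simple loop over cumulative counts. The decision loop below is generic; the linear stop lines plugged into it are illustrative stand-ins chosen only to reproduce the two decision facts stated in the abstract, not the fitted Iwao lines from the paper:

```python
def classify_load(samples, upper, lower):
    """Sequential-sampling decision loop: after each 550-cm3 sample,
    compare the cumulative count of infected fruit with upper and lower
    stop lines (functions of the sample number n)."""
    total = 0
    for n, count in enumerate(samples, start=1):
        total += count
        if total > upper(n):
            return "above"     # load exceeds the 0.5% incidence threshold
        if total < lower(n):
            return "below"     # load acceptably clean; stop sampling
    return "continue"          # between the lines: take another sample

# Illustrative stop lines: more than 4 infected fruit in a single sample
# classifies the load as above threshold, and six cumulative-zero
# samples classify it as below.
upper = lambda n: 4 + 1.25 * (n - 1)
lower = lambda n: -1 if n < 6 else 0.5
```

Keeping the lower line below zero for the first five samples is what enforces the minimum-sample requirement: no run of clean samples shorter than six can trigger an acceptance.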
40
Cox KD, Scherm H. Effect of Desiccants and Herbicides on Germination of Pseudosclerotia and Development of Apothecia of Monilinia vaccinii-corymbosi. Plant Dis 2001; 85:436-441. [PMID: 30831978] [DOI: 10.1094/pdis.2001.85.4.436]
Abstract
Pseudosclerotia (infected, mummified fruit) are the only source of primary inoculum of Monilinia vaccinii-corymbosi, the causal agent of mummy berry disease of blueberry. Laboratory applications of potential inhibitors of carpogenic germination were made to pseudosclerotia at three distinct developmental stages, i.e., ungerminated pseudosclerotia, pseudosclerotia with emerging stipes, and those with mature apothecia. Potential inhibitors evaluated included soybean oil and ammonium thiosulfate (two desiccants used experimentally as bloom thinners in fruit crops) and diuron and simazine (two commonly used herbicides), each applied in an aqueous suspension with 3% Latron B-1956 surfactant. Various aspects of carpogenic germination including the percentage of pseudosclerotia that produced stipes or apothecia, the number of stipes or apothecia per pseudosclerotium, the percentage of stipes that developed into apothecia, longevity of stipes and apothecia, and ascospore numbers were assessed. Compared with water, application of ammonium thiosulfate (2%) and diuron (2%) reduced stipe and apothecium production when sprayed on ungerminated pseudosclerotia, but these reductions were generally not significantly different from those achieved with Latron B applied alone (P > 0.05). The two compounds, however, completely inhibited the development of stipes into apothecia when applied to pseudosclerotia with stipes and caused a >3-fold reduction in apothecium longevity when applied to pseudosclerotia with mature apothecia. Application of simazine (2%) before germination or at stipe emergence resulted in the development of malformed apothecia from which no ascospores were recovered; stipe and apothecium longevity were also reduced. 
Soybean oil (15%) and Latron B applied alone had weak or inconsistent effects on most aspects of carpogenic germination of pseudosclerotia, although both compounds, when applied at stipe emergence, significantly reduced ascospore numbers in subsequently formed apothecia. The results suggest that diuron and simazine applied for weed control in commercial blueberry plantings may have beneficial side effects in reducing carpogenic germination of pseudosclerotia. The strong inhibitory effect of ammonium thiosulfate on all aspects of carpogenic germination, along with its value as a nitrogen fertilizer and ancillary herbicide, warrants further evaluation of this compound's performance and economics in the field.
Affiliation(s)
- K D Cox
- Department of Plant Pathology, University of Georgia, Athens 30602
- H Scherm
- Department of Plant Pathology, University of Georgia, Athens 30602
41
Scherm H, Savelle AT, Pusey PL. Interactions Between Chill-Hours and Degree-Days Affect Carpogenic Germination in Monilinia vaccinii-corymbosi. Phytopathology 2001; 91:77-83. [PMID: 18944281] [DOI: 10.1094/phyto.2001.91.1.77]
Abstract
The relationship of cumulative chill-hours (hours with a mean temperature <7.2°C) and heating degree-days (base 7.2°C) to carpogenic germination of pseudosclerotia of Monilinia vaccinii-corymbosi, which causes mummy berry disease of blueberry, was investigated. In two laboratory experiments, pseudosclerotia collected from rabbiteye blueberry in Georgia were conditioned at 5 to 6°C for 26 to 1,378 h prior to placement in conditions favorable for germination and apothecium development. The number of chill-hours accumulated during the conditioning period affected the subsequent proportion of pseudosclerotia that germinated and produced apothecia, with the greatest incidence of carpogenic germination occurring after intermediate levels of chilling (approximately 700 chill-hours). The minimum chilling requirement for germination and apothecium production was considerably lower than that reported previously for pseudosclerotia from highbush blueberry in northern production regions. The rate of carpogenic germination was strongly affected by interactions between the accumulation of chill-hours and degree-days during the conditioning and germination periods; pseudosclerotia exposed to prolonged chilling periods, once transferred to suitable conditions, germinated and produced apothecia more rapidly (after fewer degree-days had accumulated) than those exposed to shorter chilling periods. Thus, pseudosclerotia of M. vaccinii-corymbosi are adapted to germinate carpogenically following cold winters (high chill-hours, low degree-days) as well as warm winters (low chill-hours, high degree-days). Results were validated in a combined field-laboratory experiment in which pseudosclerotia that had received various levels of natural chilling were allowed to germinate in controlled conditions in the laboratory, and in two field experiments in which pseudosclerotia were exposed to natural chilling and germination conditions.
A simple model describing the timing of apothecium emergence in relation to cumulative chill-hours and degree-days was developed based on the experiments. The model should be useful for better timing of field scouting programs for apothecia to aid in management of primary infection by M. vaccinii-corymbosi.
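The two driving variables of the model can be accumulated directly from temperature records. This is a minimal sketch under stated assumptions: hourly mean temperatures feed the chill-hour count and daily means feed the degree-day sum, using only the 7.2°C base given in the abstract (the study's exact accumulation rules are not reproduced here).

```python
BASE_TEMP_C = 7.2  # base temperature from the study

def chill_hours(hourly_mean_temps_c, base=BASE_TEMP_C):
    """Count hours whose mean temperature falls below the base."""
    return sum(1 for t in hourly_mean_temps_c if t < base)

def degree_days(daily_mean_temps_c, base=BASE_TEMP_C):
    """Accumulate heating degree-days above the base temperature."""
    return sum(max(0.0, t - base) for t in daily_mean_temps_c)

# e.g. two cool hours out of four, and 3.0 degree-days over two days
chill_hours([5.0, 6.0, 8.0, 7.2])   # -> 2
degree_days([10.2, 5.0])            # -> 3.0
```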
42
Emery KM, Michailides TJ, Scherm H. Incidence of Latent Infection of Immature Peach Fruit by Monilinia fructicola and Relationship to Brown Rot in Georgia. Plant Dis 2000; 84:853-857. [PMID: 30832138] [DOI: 10.1094/pdis.2000.84.8.853]
Abstract
Peach fruit are most susceptible to infection by Monilinia fructicola during the preharvest ripening stage. Although various sources of inoculum for preharvest infection have been characterized, the role of latent infection of immature fruit in the carryover of M. fructicola from the spring (blossom blight phase) to the preharvest period (fruit rot phase) is unknown for the southeastern United States. From 1997 to 1999, immature peach fruit were collected at 14-day intervals from orchards in middle and northern Georgia. Fruit were surface disinfested and treated with paraquat (1997) or frozen overnight (1998 and 1999) to induce tissue senescence and activate latent infections. Across sites and years, the incidence of latent infection remained low until the final sampling date 7 to 12 days before harvest. The incidence of latent infection on the final sampling date ranged from 0 to 22.0% and correlated significantly with both the incidence of blossom blight earlier in the season (r = 0.9077, P = 0.0332) and the incidence of fruit rot at harvest (r = 0.9966, P = 0.0034). There also was a significant association between the incidence of latent infection at the onset of pit hardening (between 7 and 10 weeks before harvest) and subsequent fruit rot incidence (r = 0.9763, P = 0.0237). Weather variables (cumulative rainfall or rainfall frequency) alone did not correlate with fruit rot incidence (P > 0.05), whereas combined latent infection-rainfall variables did. The results suggest that latent infections can serve as a source of inoculum for subsequent fruit rot in peach orchards in Georgia. 
Despite its significant association with fruit rot incidence, the potential for using latent infection incidence as a biological indicator of disease risk at harvest may be limited; the assessment of latent infection during the fruit ripening stage (similar to the timing of the final sampling date in this study) would not provide sufficient lead time for preharvest disease management decisions, whereas an earlier assessment (e.g., at the onset of pit hardening) would require large sample sizes due to the low incidence of latent infection present during that period.
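The associations reported above are ordinary Pearson correlation coefficients between paired incidence values. A minimal sketch with made-up data (the study's observations are not reproduced here):

```python
def pearson_r(x, y):
    """Pearson correlation coefficient of two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) *
           sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

# hypothetical latent-infection vs. fruit-rot incidences (%)
latent = [0.0, 2.5, 8.0, 22.0]
rot = [1.0, 4.0, 10.0, 30.0]
r = pearson_r(latent, rot)  # close to 1 for strongly linear pairs
```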
Affiliation(s)
- K M Emery
- Department of Plant Pathology, University of Georgia, Athens 30602
- T J Michailides
- Department of Plant Pathology, University of California-Davis, Kearney Agricultural Center, Parlier 93648
- H Scherm
- Department of Plant Pathology, University of Georgia
43
Scherm H, Sutherst RW, Harrington R, Ingram JS. Global networking for assessment of impacts of global change on plant pests. Environ Pollut 2000; 108:333-341. [PMID: 15092928] [DOI: 10.1016/s0269-7491(99)00212-2]
Abstract
Global change encompasses changes in atmospheric composition, climate and climate variability, and land cover and land use. The occurrence of these changes and their interactive effects on biological systems are worldwide; thus, an effective global change research and impact assessment program must be based on international and interdisciplinary research and communication. With this in mind, several collaborative research networks with a focus on global change have been established in the biological sciences. They include the Global Change and Terrestrial Ecosystems (GCTE) Core Project of the International Geosphere-Biosphere Programme (IGBP) which aims to predict the effects of global change on terrestrial ecosystems, including agriculture and production forestry. Because of the importance of plant pests (arthropods, microbial pathogens, weeds) as yield-reducing factors in agriculture and as early indicators of global change, GCTE initiated a network Activity on "Global Change Impacts on Pests, Diseases and Weeds" with the overall goal of developing a predictive capability for impact assessment and adaptation. The network's specific objectives, contributing research projects, initial results and future challenges are discussed.
Affiliation(s)
- H Scherm
- Department of Plant Pathology, University of Georgia, Athens, GA 30602, USA.
44
Abstract
Inputs in climate-pest models are commonly expressed as point estimates ('crisp' numbers), which implies perfect knowledge of the system under study. In reality, however, all model inputs harbor some level of uncertainty. This is particularly true for climate change impact assessments where the inputs (i.e., climate projections) are highly uncertain. In this study, uncertainties in climate projections were expressed as 'fuzzy' numbers; these are uncertain numbers for which one knows that there is a range of possible values and that some values are 'more possible' than others. A generic pest risk model incorporating the combined effects of temperature, soil moisture, and cold stress was implemented in a fuzzy spreadsheet environment and run with three climate scenarios: (1) present climate (control run); (2) crisp climate change; and (3) fuzzy climate change. Under the crisp climate change scenario, winter and summer temperatures and precipitation were altered using best estimates (averaged predictions from the 1995 assessment report of the Intergovernmental Panel on Climate Change [IPCC]). Under the fuzzy scenario, climate changes were expressed as triangular fuzzy numbers, utilizing the extremes (lowest and highest predictions from the IPCC report) in addition to the best estimates. Under each scenario, environmental favorability was calculated for six locations in two geographical regions (Central North America and Southern Europe) with two hypothetical pest species having temperate or mediterranean climate requirements. Simulations with the crisp climate change scenario suggested only minor changes in overall environmental favorability compared with the control run. When simulations were conducted with the fuzzy climate change scenario, however, important changes in environmental favorability emerged, particularly in Southern Europe. In that region, the possibility of considerably increased winter precipitation led to increased values of environmental favorability.
However, the simulations also showed that this result harbored a very broad range of possible outcomes. The results support the notion that uncertainty in climate change projections must be reduced before reliable impact assessments can be achieved.
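A triangular fuzzy number as described above is just the triple (lowest, best, highest estimate); an alpha-cut then yields the interval of values whose possibility is at least alpha. A minimal sketch of that representation (the study's fuzzy-spreadsheet implementation is not reproduced, and the example values are hypothetical):

```python
def tri_alpha_cut(low, mode, high, alpha):
    """Interval of a triangular fuzzy number at membership level alpha.

    alpha = 0 returns the full range of possible values;
    alpha = 1 collapses to the best estimate (the mode).
    """
    if not 0.0 <= alpha <= 1.0:
        raise ValueError("alpha must lie in [0, 1]")
    return (low + alpha * (mode - low), high - alpha * (high - mode))

# e.g. a winter warming projection expressed as (1.0, 2.5, 4.5) °C
tri_alpha_cut(1.0, 2.5, 4.5, 0.0)  # -> (1.0, 4.5): full possible range
tri_alpha_cut(1.0, 2.5, 4.5, 1.0)  # -> (2.5, 2.5): best estimate only
```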
Affiliation(s)
- H Scherm
- Department of Plant Pathology, University of Georgia, Athens, GA 30602, USA.
45
Sanogo S, Yang XB, Scherm H. Effects of Herbicides on Fusarium solani f. sp. glycines and Development of Sudden Death Syndrome in Glyphosate-Tolerant Soybean. Phytopathology 2000; 90:57-66. [PMID: 18944572] [DOI: 10.1094/phyto.2000.90.1.57]
Abstract
Sudden death syndrome of soybean, caused by Fusarium solani f. sp. glycines, is a disease of increasing economic importance in the United States. Although the ecology of sudden death syndrome has been extensively studied in relation to crop management practices such as tillage, irrigation, and cultivar selection, there is no information on the effects of herbicides on this disease. Three herbicides (lactofen, glyphosate, and imazethapyr) commonly used in soybean were evaluated for their effects on the phenology of F. solani f. sp. glycines and the development of sudden death syndrome in four soybean cultivars varying in resistance to the disease and in tolerance to glyphosate. Conidial germination, mycelial growth, and sporulation in vitro were reduced by glyphosate and lactofen. In growth-chamber and greenhouse experiments, there was a significant increase in disease severity and frequency of isolation of F. solani f. sp. glycines from roots of all cultivars after application of imazethapyr or glyphosate compared with the control treatment (no herbicide applied). Conversely, disease severity and isolation frequency of F. solani f. sp. glycines decreased after application of lactofen. Across all herbicide treatments, severity of sudden death syndrome and isolation frequency were lower in disease-resistant than in susceptible cultivars. Results suggest that glyphosate-tolerant and -nontolerant cultivars respond similarly to infection by F. solani f. sp. glycines after herbicide application.
46
Scherm H, Copes WE. Evaluation of Methods to Detect Fruit Infected by Monilinia vaccinii-corymbosi in Mechanically Harvested Rabbiteye Blueberry. Plant Dis 1999; 83:799-805. [PMID: 30841034] [DOI: 10.1094/pdis.1999.83.9.799]
Abstract
Blueberry fruit infected by Monilinia vaccinii-corymbosi (causal agent of mummy berry disease) are unfit for processing because of the formation of hardened structures (pseudosclerotia) within them. In commercial packinghouses in Georgia, fruit loads exceeding the tolerance level for mummy berry are appraised at lower quality grades, resulting in severe economic penalties to producers. Two methods to detect and enumerate mummy berry in blueberry loads were evaluated in the laboratory using fruit samples with known numbers of infected fruit. The first method involved destructive processing of the samples in a blender. The resulting blueberry puree was passed through a series of screens and the number of pseudosclerotia of M. vaccinii-corymbosi retained on the screens assessed tactilely. The second method consisted of visual symptom assessment of intact fruit. Bias and coefficients of variation of the blender method in five experiments ranged from -63.0 to 152.4% and 6.9 to 44.1%, respectively, indicating that the method was inaccurate and imprecise. Several factors probably contributed to its poor performance, including the formation of multiple fragments from single pseudosclerotia during blending and subjectivity in the tactile assessment of pseudosclerotia. Bias and coefficients of variation of the visual assessment method in four experiments ranged from -3.41 to 1.97% and 1.16 to 5.17%, respectively. Thus, the visual method was considerably more accurate and more precise than the blender method. Visual assessment was further evaluated under commercial packinghouse conditions, with >70,000 fruit assessed individually for symptoms of mummy berry and other abnormalities. Bias ranged from -11.1 to 33.3%, indicating that visual assessment was less accurate under packinghouse conditions than under laboratory conditions. 
This was due to the low number of infected fruit encountered in most of the loads, which resulted in large relative errors if only a single fruit was misidentified. In a two-year packinghouse survey, a high incidence of partial infection, together with successional variations in discoloration of infected portions of the fruit as the harvest season progressed, resulted in a greater variation of mummy berry symptoms than previously described.
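The bias and coefficient-of-variation figures used above to compare the two detection methods can be computed from repeated counts on samples with a known number of infected fruit. A minimal sketch with hypothetical counts (not the study's data):

```python
def bias_pct(estimates, true_value):
    """Mean relative deviation from the known true value, in percent."""
    mean_est = sum(estimates) / len(estimates)
    return 100.0 * (mean_est - true_value) / true_value

def cv_pct(estimates):
    """Sample coefficient of variation (sd/mean), in percent."""
    n = len(estimates)
    mean = sum(estimates) / n
    sd = (sum((e - mean) ** 2 for e in estimates) / (n - 1)) ** 0.5
    return 100.0 * sd / mean

# e.g. five visual counts of a sample known to contain 20 infected fruit
counts = [19, 21, 20, 18, 22]
bias = bias_pct(counts, 20)  # 0 when the counts are unbiased on average
cv = cv_pct(counts)
```

As the abstract notes, when the true count per load is very small, a single misidentified fruit produces a large relative error, which is why packinghouse bias exceeded laboratory bias.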
Affiliation(s)
- H Scherm
- Department of Plant Pathology, University of Georgia, Athens 30602
- W E Copes
- Department of Plant Pathology, University of Georgia, Athens 30602
47
Abstract
Research on impacts of climate change on plant diseases has been limited, with most work concentrating on the effects of a single atmospheric constituent or meteorological variable on the host, pathogen, or the interaction of the two under controlled conditions. Results indicate that climate change could alter stages and rates of development of the pathogen, modify host resistance, and result in changes in the physiology of host-pathogen interactions. The most likely consequences are shifts in the geographical distribution of host and pathogen and altered crop losses, caused in part by changes in the efficacy of control strategies. Recent developments in experimental and modeling techniques offer considerable promise for developing an improved capability for climate change impact assessment and mitigation. Compared with major technological, environmental, and socioeconomic changes affecting agricultural production during the next century, climate change may be less important; it will, however, add another layer of complexity and uncertainty onto a system that is already exceedingly difficult to manage on a sustainable basis. Intensified research on climate change-related issues could result in improved understanding and management of plant diseases in the face of current and future climate extremes.
Affiliation(s)
- S M Coakley
- Department of Botany and Plant Pathology, Oregon State University, Corvallis, Oregon 97331
48
Abstract
Sudden death syndrome, caused by Fusarium solani f. sp. glycines, has increased in prevalence in soybean production regions in the North-Central United States. Little is known about soil factors and environmental conditions that influence disease severity in this pathosystem. We studied associations between biological, chemical, and physical soil variables and severity of foliar symptoms of sudden death syndrome in nine commercial soybean fields in Iowa during 1995 and 1996. Disease was patchy in all fields, and soil samples were collected in each field along a transect that ran from a symptomless area through a diseased area. There were 25 sampling stops along each transect, separated by distances of 1.5 to 2.5 m. At each stop, soil samples were collected and soil strength, soil moisture, and foliar disease severity (at plant growth stage R6) were measured. Soil samples were assayed for population densities of F. solani f. sp. glycines, cysts of the soybean cyst nematode (Heterodera glycines), and for chemical variables (soluble salts, pH, organic matter, cation exchange capacity, and concentrations of P, K, Ca, Mg, Mn, and Fe). Cross-correlation analyses were carried out to test for associations between soil variables and disease severity in individual fields, while discriminant analysis was used to assess the effects of soil variables across all fields. Disease severity showed consistent associations with F. solani f. sp. glycines populations (strong effect) and H. glycines cyst counts (minor effect). Available K was identified as a possible disease-enhancing factor, but the magnitude of its effect was dependent on the overall K-concentrations in the fields. For example, as the median K-concentration increased, the correlation between K and disease decreased. None of the other soil variables showed consistent associations with disease. The results suggest that localized presence or absence of F. solani f. sp. glycines is the chief reason for the patchiness of sudden death syndrome in affected fields. Thus, manipulation of soil nutrient status or fertility level appears to have limited potential for reducing disease in the high-yield soybean production environment of Iowa. Instead, producers should focus on preventing the establishment or reducing populations of F. solani f. sp. glycines and H. glycines in their fields.
Affiliation(s)
- H Scherm
- Department of Plant Pathology, University of Georgia, Athens 30602
- X B Yang
- Department of Plant Pathology, Iowa State University, Ames 50011
- P Lundeen
- Department of Plant Pathology, Iowa State University, Ames 50011
49
Abstract
The idea that climatic cycles such as the El Niño/Southern Oscillation significantly affect infectious disease, raised by Rita R. Colwell (Association Affairs, 20 Dec., p. 2025) in the context of cholera outbreaks, is supported by recent findings from veterinary, entomological, and botanical epidemiology (1, 2).
Botanical epidemiologists have found that plant systems have unique advantages for macro-scale, long-term epidemiological studies. Weather and climate are important driving forces affecting plant disease development. For example, the U.S. Department of Agriculture's annual cereal rust survey, a program started in 1917 to monitor rust outbreaks over North America, accumulates time series of disease intensity, yield loss, and races of rust fungi in cereal crops (3). With these data, consistent and significant coherence patterns between El Niño and wheat rust intensity have been found in both the Eastern and Western hemispheres (2).
Studies of El Niño-disease associations and their underlying mechanisms could lead to the development of early warning systems. Colwell advocates the use of satellite surveillance for predicting cholera outbreaks, while others propose using El Niño forecasts for malaria alerts (4). Outbreaks of wheat scab in eastern China can be predicted successfully 4 months in advance by measuring sea surface temperatures in the central Pacific (R2 = 0.86; P < 0.001) (5); the mechanism for this association is thought to be the El Niño-dependent advance of the summer monsoon through East Asia, whereby increased precipitation causes increased infection by the scab pathogen. El Niño-disease studies are important also in the context of climate change research: infectious diseases driven by multiyear climatic cycles are likely to respond to slow, decadal changes in climate as well.