1. Determining the Economically Optimum Metaphylactic Strategy for Cattle Cohorts of Varied Demographic Characteristics. Animals (Basel) 2024; 14:1423. PMID: 38791641. DOI: 10.3390/ani14101423.
Abstract
Metaphylactic antibiotic use in feeder cattle is a common practice to control respiratory disease, and antimicrobial stewardship is important to ensure continued efficacy and to protect animal welfare. The objective of this study was to identify characteristics of cattle cohorts that did not receive metaphylaxis but would have benefited economically from it. Cohorts (n = 12,785; 2,206,338 head) from 13 feedlots that did not receive metaphylaxis were run through an economic model to estimate net returns for three metaphylactic options. Logistic regression models with covariates for entry weight, sex, average daily weight gain, number of animals per cohort, and days on feed, with feedlot as a random effect, were used to determine the model-adjusted probability of a cohort benefiting economically from metaphylaxis. Most (72%) cohorts in this data set that had not received metaphylaxis at arrival would not have benefited economically from it. Sex, entry weight category, number of cattle in the cohort, and average daily weight gain were associated with the likelihood of benefiting economically from metaphylaxis. The results illustrate that cohort demographics influence both the probability that a cohort would benefit economically from metaphylaxis and which type of metaphylaxis is most profitable, and integrating this information has the potential to improve the metaphylaxis decision.
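The economic comparison the abstract describes can be illustrated with simple partial-budget arithmetic. The sketch below uses entirely invented numbers (sale value, drug cost, morbidity and mortality rates); it is not the paper's model, only a minimal example of why low-risk cohorts may not benefit from metaphylaxis.

```python
# Illustrative sketch with hypothetical numbers (NOT the paper's model):
# compare expected net return per head without and with metaphylaxis.

def expected_net_return(sale_value, feed_cost, health_cost,
                        morbidity_rate, cost_per_case, mortality_rate):
    """Expected net return per head under simple partial-budget logic."""
    expected_treatment_cost = morbidity_rate * cost_per_case
    expected_death_loss = mortality_rate * sale_value
    return (sale_value - feed_cost - health_cost
            - expected_treatment_cost - expected_death_loss)

# A low-risk cohort: the metaphylaxis drug cost can exceed its benefit.
no_meta = expected_net_return(1800.0, 1400.0, 0.0, 0.10, 40.0, 0.01)
with_meta = expected_net_return(1800.0, 1400.0, 25.0, 0.04, 40.0, 0.005)

benefit = with_meta - no_meta  # positive => metaphylaxis pays off
```

With these invented inputs the benefit is negative, mirroring the finding that most low-risk cohorts would not gain economically from metaphylaxis; higher assumed morbidity would flip the sign.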
2. National Beef Quality Audit-2022 Phase 1: face-to-face and digital interviews. Transl Anim Sci 2024; 8:txae034. PMID: 38562215. PMCID: PMC10983070. DOI: 10.1093/tas/txae034.
Abstract
The National Beef Quality Audit (NBQA) has been conducted regularly since 1991 to assess and benchmark quality in the U.S. beef industry, with the most recent iteration conducted in 2022. The goal of NBQA Phase I is to evaluate what must be managed to improve beef quality and demand. Interviews (n = 130) of industry personnel were conducted with the aid of routing software: packers (n = 24), retailers (n = 20), further processors (n = 26), foodservice (n = 18), and allied government agencies and trade organizations (n = 42). Interviews were routed in the software based on whether the interviewee was involved in the fed steer and heifer sector, the market cow and bull sector, or both. Interviews were structured to elicit responses in the following order: "must-have" criteria (quality factors required to make a purchase), best/worst ranking of quality factors by importance, interviewees' definitions of quality terms, a strengths, weaknesses, opportunities, and threats (SWOT) analysis, general beef industry questions, and sustainability goals (the latter four open-ended). Quality factors were 1) visual characteristics, 2) cattle genetics, 3) food safety, 4) eating satisfaction, 5) animal well-being, 6) weight and size, and 7) lean, fat, and bone. Best/worst analysis revealed that "food safety" was the most (P < 0.05) important factor in beef purchasing decisions for all market sectors and was frequently described as "everything" and "a way of business." The culture surrounding food safety has changed compared with previous NBQAs, with interviewees no longer considering food safety a purchasing criterion but rather a market expectation. The SWOT analysis indicated that "eating quality of U.S. beef" was the greatest strength, and respondents noted that educating both consumers and producers on beef production would benefit the industry.
Irrespective of whether their products were fed beef or market cow/bull beef, respondents said that they believed "environmental concerns" were among the major threats to the industry. The perceived image of the beef industry across market sectors has improved since NBQA-2016 for both fed cattle and market cow/bull beef.
3. Impacts of economic factors influencing net returns of beef feedlot heifers administered two implant programs and fed for differing days-on-feed from pooled randomized controlled trials. Transl Anim Sci 2024; 8:txae021. PMID: 38585170. PMCID: PMC10999156. DOI: 10.1093/tas/txae021.
Abstract
The objective of this research was to evaluate the effects of two implant programs and differing days-on-feed (DOF) on net returns of beef feedlot heifers using sensitivity analyses of key economic factors. Crossbred beef heifers [n = 10,583; initial weight 315 kg (± 20.1 SD)] were enrolled across three trials (one Kansas and two Texas feedlot trials). Heifers were blocked by arrival and randomly allocated to one of six pens, resulting in a total of 144 pens and 24 blocks. Pens were randomly assigned to treatment in a 2 × 3 factorial. Implant programs were IH + 200 (Revalor-IH at initial processing plus a terminal Revalor-200 implant after approximately 90 DOF) or XH (a single Revalor-XH implant at initial processing). The DOF treatments were heifers fed to a standard baseline endpoint (BASE) or heifers fed an additional +21 or +42 d beyond BASE. Pen-level partial budgets were used for economic sensitivity analyses, which varied the price point of a single pricing component with all other components fixed. Variable components were live fed cattle prices, base carcass (i.e., dressed) prices, the Choice-Select spread (CS-spread), and feed and yardage prices (FYP). For each, a Low, Mid-Low, Middle, Mid-High, and High price was chosen. Linear mixed models were fit for statistical analyses (α = 0.05). There were no significant two-way interactions (P-values ≥ 0.14). Regardless of the variable component evaluated, XH heifers had poorer net returns than IH + 200 at all prices (P ≤ 0.04). Selling live, the +21 and (or) +42 heifers had lower net returns than BASE at every fed cattle price point (P < 0.01). Selling dressed, the +21 and (or) +42 heifers had lower returns than BASE at Low, Mid-Low, and Middle fed cattle base prices (P < 0.01); there were no significant DOF differences at Mid-High or High prices (P ≥ 0.24). Net returns were lower for +42 than BASE at all CS-spreads (P ≤ 0.03), while BASE and +21 did not differ significantly.
Longer DOF had lower net returns than BASE when selling live at every FYP (P < 0.01) except at the Low price (P = 0.14). Selling dressed, there was no significant effect of DOF at Low or Mid-Low FYP (P ≥ 0.11); conversely, extended DOF had lower net returns than BASE at Middle, Mid-High, and High FYP (P < 0.01). Overall, there was minimal economic evidence to support extending feedlot heifer DOF beyond the BASE endpoint, and when feeding longer, larger reductions in return were observed when marketing live as opposed to dressed.
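The sensitivity-analysis design described above (vary one price component across Low..High while holding the rest fixed) can be sketched in a few lines. All prices, weights, and costs below are invented for illustration; they are not the trial's values.

```python
# Hypothetical sketch of a price-sensitivity grid (invented numbers, NOT
# the trial's data): compare net return at the BASE endpoint vs. +21 DOF
# across five fed cattle price points, all other components fixed.

def net_return(live_price_per_kg, final_wt_kg, feeder_cost, feed_yardage_cost):
    return live_price_per_kg * final_wt_kg - feeder_cost - feed_yardage_cost

price_points = {"Low": 2.80, "Mid-Low": 3.00, "Middle": 3.20,
                "Mid-High": 3.40, "High": 3.60}  # $/kg live, hypothetical

results = {}
for label, price in price_points.items():
    base = net_return(price, 560.0, 1200.0, 450.0)    # BASE endpoint
    plus21 = net_return(price, 580.0, 1200.0, 530.0)  # +21 d: more weight, more feed
    results[label] = plus21 - base  # negative => extended DOF loses money
```

With these assumed inputs, extending DOF loses money at every price point but loses less at higher prices, the same qualitative pattern the abstract reports for live marketing.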
4. Livestock health and disease economics: a scoping review of selected literature. Front Vet Sci 2023; 10:1168649. PMID: 37795016. PMCID: PMC10546065. DOI: 10.3389/fvets.2023.1168649.
Abstract
Animal diseases in production and subsistence environments can negatively affect consumers, producers, and economies as a whole. Growing global demand for animal-sourced food requires safe and efficient production systems. Understanding the burden of animal disease, and how that burden is distributed throughout a value chain, informs policy that promotes safe consumption and efficient markets and provides more effective pathways for investment. This paper surveys existing knowledge on the burden of animal disease across the economic categories of production, prevention and treatment, animal welfare, and trade and regulation. Our scoping review covers 192 papers from peer-reviewed journals and organizational reports. We find a gap in knowledge in evaluating the global burden of animal disease and how that burden is distributed along value chains. We also point to the need for an analytical framework, based on established methods, to guide future evaluations of animal disease burden and improve access to information on animal health impacts.
5. Editorial: Proceedings of the 5th ISESSAH conference 2021: economics and social sciences applied to livestock and aquaculture health. Front Vet Sci 2023; 10:1167589. PMID: 37228839. PMCID: PMC10204144. DOI: 10.3389/fvets.2023.1167589.
6. Stochastic, individual animal systems simulation model of beef cow-calf production: development and validation. Transl Anim Sci 2022; 7:txac155. PMID: 36816825. PMCID: PMC9930734. DOI: 10.1093/tas/txac155.
Abstract
A stochastic, individual-animal systems simulation model describing U.S. beef cow-calf production was developed and parameterized to match typical U.S. Angus genetics under cow-calf production conditions in the Kansas Flint Hills. Model simulation results were compared with actual multivariate U.S. cow-calf production data, reported according to beef cow-calf standardized performance analysis (SPA) methodology through North Dakota State University's CHAPS program, to assess model validity. Individual-animal nutrition, reproduction, growth, and health characteristics, as well as production state, are determined on a daily time step, and any number of days can be simulated. These capabilities allow decision analysis and assessment of long-run outcomes of various genetic, management, and economic scenarios on multiple metrics simultaneously. With the model parameterized to Kansas Flint Hills production conditions for the years 1995 through 2018, 32 genetic combinations of mature cow weight and peak lactation potential were simulated with 100 iterations each. Sire mature cow weight genetics ranged from 454 to 771 kg in 45 to 46 kg increments, and sire peak lactation genetics of 6.8, 9, 11.3, and 13.6 kg/d were considered for each of the eight mature cow weights. Using model results for the years 2000 to 2018, raw model output was assessed against actual historical cow-calf production data, and exploratory factor analysis was applied to interpret the underlying factor scores of model output relative to the actual data. Comparing modeled herd output with CHAPS herd data, the median average calf weaning age was 3.4 d greater, average cow age 0.2 yr greater, percent pregnant per cow exposed 1 percentage point less, and percent calf mortality per calf born 1.7 percentage points greater in the model output.
Subtracting the median CHAPS pre-weaning average daily gain from the median modeled pre-weaning average daily gain for each of the eight mature cow weight genetics categories, and then taking the median of the eight differences, gave a median difference of -0.21 kg/d. The same calculation for birth weight and adjusted 205-d weaning weight showed the modeled data to be 4.9 and 48.6 kg lighter than the CHAPS data, respectively. Management and genetic details underlying the CHAPS data were unknown.
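The core mechanic of such a model, daily time-step stochastic events evaluated per animal, can be sketched minimally. The hazard rates, breeding-season length, and herd size below are invented for illustration; the published model is far richer (nutrition, growth, economics).

```python
import random

# Minimal sketch of a stochastic, individual-animal, daily-time-step
# cow-calf model (all parameters invented; NOT the published model).
random.seed(42)

def simulate_cow_year(daily_conception_hazard=0.02, breeding_days=60,
                      calf_daily_mortality=0.0001, weaning_age=205):
    """Simulate one cow-year; return (pregnant, calf_alive_at_weaning)."""
    pregnant = False
    for _ in range(breeding_days):        # daily step through the breeding season
        if random.random() < daily_conception_hazard:
            pregnant = True
            break
    calf_alive = pregnant
    if pregnant:
        for _ in range(weaning_age):      # daily calf survival to weaning
            if random.random() < calf_daily_mortality:
                calf_alive = False
                break
    return pregnant, calf_alive

herd = [simulate_cow_year() for _ in range(1000)]
pregnancy_rate = sum(p for p, _ in herd) / len(herd)
```

Repeating such iterations under different genetic or management parameter sets is what lets a model of this kind compare long-run scenario outcomes on several metrics at once.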
7. Economic assessments from experimental research trials of feedlot cattle health and performance: a scoping review. Transl Anim Sci 2022; 6:txac077. PMID: 35854971. PMCID: PMC9280984. DOI: 10.1093/tas/txac077.
Abstract
Animal husbandry decisions for feedlot cattle may be based on economic or financial impacts reported from livestock research trials comparing interventions such as health practices or performance technologies. Despite the importance of economic assessments to production management decisions, there are no consensus guidelines for their methods or reporting. Thus, we hypothesized that methods and reporting of economic assessments in the scientific literature are inconsistent. This scoping review describes the types of economic assessments used to evaluate the costs and benefits of interventions in feedlot trials, how measured health and performance outcomes are utilized in economic evaluations, and the completeness of reporting. A structured search was used to retrieve peer-reviewed articles (published in English) on experimental trials performed in Australia, North America, or South Africa, which reported feedlot cattle health, performance, or carcass characteristics and included an economic outcome. A total of 7,086 articles were screened for eligibility; 91 articles (comprising 113 trials) met the inclusion criteria. Trial characteristics, methods, and reporting data were extracted. A primary outcome was stated in only 36% (41/113) of the trials. Of these 41 trials, an economic outcome was reported as a primary outcome in 18 (44%). Methodology for the economic assessment was reported for 54 trials (48%), the type of economic assessment was explicitly stated for 21 trials (19%), and both the type of economic assessment and methodology used were reported for 29 trials (26%); neither were reported for nine trials (8%). Eight types of economic assessments were explicitly reported: cost-effectiveness, cost–benefit analysis, enterprise analysis, partial budget, break-even analysis, profitability, decision analysis, and economic advantage. 
Among the trials that did not explicitly report an assessment type, three further types were identified: partial budget, enterprise analysis, and gross margin analysis. Overall, only 32 trials (28%) reported economics as an outcome of interest; the methodology used or the type of assessment; and the values, sources, and dates for at least some of the price data used in the analysis. Given the variability in methods and inconsistent reporting identified by this scoping review, a guideline to facilitate consistency in methods and reporting for feedlot trials is warranted.
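The reporting fractions quoted in the review can be reproduced directly from the counts in the abstract (percentages rounded to whole numbers):

```python
# Reproduce the rounded reporting percentages from the abstract's counts.
n_trials = 113
assert round(41 / n_trials * 100) == 36   # primary outcome stated
assert round(18 / 41 * 100) == 44         # economic outcome among those 41
assert round(54 / n_trials * 100) == 48   # assessment methodology reported
assert round(21 / n_trials * 100) == 19   # assessment type explicitly stated
assert round(29 / n_trials * 100) == 26   # both type and methodology reported
assert round(9 / n_trials * 100) == 8     # neither reported
assert round(32 / n_trials * 100) == 28   # economics + methods + price data
```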
8. Editorial: Proceedings of the 3rd ISESSAH Conference 2019. Front Vet Sci 2021; 8:807796. PMID: 34901257. PMCID: PMC8655875. DOI: 10.3389/fvets.2021.807796.
10. Economic Cost of Traceability in U.S. Beef Production. Front Anim Sci 2020. DOI: 10.3389/fanim.2020.552386.
Abstract
Livestock traceability has increasingly become a focus for the USDA, the National Cattlemen's Beef Association, high-volume beef-exporting states, and other beef industry stakeholders. The focus on traceability within the United States (U.S.) began after several international animal disease outbreaks and remains important as highly infectious diseases spread across the globe. Mitigating future disease outbreaks and food safety events, and maintaining export markets through a positive international perception of U.S. beef, has become a top priority. Implementing a national animal identification (ID) and traceability program would enable the industry to track animals and reduce the potential losses from an outbreak or event. However, such a system comes at a cost, borne mainly by cow-calf producers. This study utilizes a partial equilibrium model to determine the impacts of a beef cattle animal ID and traceability system in the United States, allowing a comparison of how the various beef sectors would need to respond to offset the costs of a national program. Assuming no changes in domestic and international demand for U.S. beef, producers at the wholesale, slaughter, and feeder levels lose $475 million, $1,143 million, and $1,291 million, respectively, in 10-year discounted cumulative producer surplus. Increases of 17.7% in international and 1.9% in domestic beef demand would be required to completely offset the producer costs of the CattleTrace program.
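A "10-year discounted cumulative producer surplus" is a net-present-value sum. The sketch below shows the arithmetic with an invented annual loss and discount rate; the paper's sector losses come from its partial equilibrium model, not from this formula alone.

```python
# Sketch of a 10-yr discounted cumulative surplus change (hypothetical
# annual loss and discount rate; NOT the paper's model output).

def discounted_cumulative(annual_change, rate, years=10):
    """Present value of a constant annual change over `years` periods."""
    return sum(annual_change / (1 + rate) ** t for t in range(1, years + 1))

# e.g., a sector losing $150M/yr, discounted at 5% over 10 years:
loss = discounted_cumulative(-150.0, 0.05)   # in $ millions
```

At a 5% rate the 10-year annuity factor is about 7.72, so a $150M/yr loss accumulates to roughly -$1,158M in present value.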
11. United States feedlot operator willingness to pay for disposal capacity to address foreign animal disease risk. Transbound Emerg Dis 2018; 65:1951-1958. PMID: 30094971. DOI: 10.1111/tbed.12976.
Abstract
Foreign animal diseases can cause severe and lasting economic impacts to producers, both directly and indirectly. Understanding producer investment cost structures can give industry and policy makers better tools to encourage biosecurity adoption. Consistent with the literature, many factors can contribute to an individual operator's decision to invest in biosecurity, including individual characteristics, perception of disease likelihood, and expected losses associated with a disease event. We used a producer survey and a one-and-one-half-bound econometric model to estimate feedlot operators' willingness to pay to invest in disposal capacity within the next 3 years. Results indicate an average willingness to pay of $14,310 for a one-time investment in on-farm disposal capacity to address carcass movement restrictions during a disease outbreak. Several factors contribute to and explain the heterogeneity between feedlots in their adoption decisions; primarily, feedlot size and death loss rate significantly affect adoption, both of which potentially speak to the financial liquidity and investment potential of a feedlot enterprise. While there is no failsafe in disease prevention, these results provide a better understanding of how to study and structure policies and cost incentives to encourage adoption of biosecurity.
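In bounded contingent-valuation designs such as the one-and-one-half-bound format, each respondent's yes/no answers to one or two bids bracket their willingness to pay into an interval. The crude interval-midpoint estimator below is only an illustration of that bracketing idea with invented intervals; the paper uses a maximum-likelihood econometric model, not this shortcut.

```python
# Illustrative only (NOT the paper's estimator): bid responses bracket each
# respondent's WTP into an interval; a naive estimate averages midpoints.

# (lower, upper) WTP bounds implied by yes/no bid answers, in $1,000s
intervals = [(10, 15), (0, 5), (15, 25), (10, 15), (5, 10), (15, 25)]

midpoints = [(lo + hi) / 2 for lo, hi in intervals]
mean_wtp = sum(midpoints) / len(midpoints)   # in $1,000s
```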
12. An estimation of US horse-owner/caregiver willingness-to-pay for daily use and infectious upper respiratory disease treatment options. Equine Vet J 2017; 50:498-503. PMID: 29171908. DOI: 10.1111/evj.12786.
Abstract
BACKGROUND Equine injury and disease impose two types of costs on those financially responsible for treating and caring for the affected horse(s): the direct cost of treating the horse, and the indirect cost of lost use of the horse for a period of time (daily horse use). Indirect costs are more difficult to estimate but pose significant financial implications for equine owners/caregivers. Additionally, there is a gap in existing research regarding the valuation of treatment options for infectious disease in horses. OBJECTIVE To estimate the value a US horse-owner/caregiver places on daily horse use and describe respondents' willingness-to-pay for various attributes of equine treatment options. STUDY DESIGN Online questionnaire survey. METHODS An online questionnaire was provided to equine owners and caretakers, requesting owner demographic, horse care, and horse use information. Through a choice experiment, respondents were also presented with hypothetical disease treatment options defined by the following attributes: daily dosage, number of days of rest required, route of administration, and out-of-pocket cost to the owner/caretaker. Data were analysed using rank-ordered logit analysis, and willingness-to-pay estimates for daily use and treatment options were calculated. RESULTS Results suggest that the average owner of an uninsured horse is willing to pay $12.07 (95% confidence interval: -$15.01, -$9.69) per day, and the average owner of an insured horse $17.95 (95% confidence interval: -$25.30, -$11.20) per day, to reduce the lost use days required (due to need for rest). Respondents preferred oral administration over treatments requiring i.m. injections. MAIN LIMITATIONS As this study employed an online survey, it was subject to self-selection bias, and a sample size calculation was not performed.
CONCLUSIONS Veterinarians and pharmaceutical companies may use these results when promoting various treatment options to horse-owners/caregivers and in product development. Additionally, promotion efforts may be targeted towards equine-owners with higher daily use values (owners with insured horses).
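In choice-experiment models like the rank-ordered logit above, willingness to pay for an attribute is typically derived as the ratio of that attribute's utility coefficient to the cost coefficient. The coefficients below are invented for illustration; they are not the study's estimates.

```python
# Sketch of WTP derivation from choice-model coefficients (invented
# values, NOT the study's estimates): WTP = beta_attribute / beta_cost.

beta_rest_day = -0.36   # disutility of each additional lost-use (rest) day
beta_cost = -0.03       # disutility of each dollar of out-of-pocket cost

wtp_per_day = beta_rest_day / beta_cost   # $ per lost-use day avoided
```

Both coefficients are negative (more rest days and higher cost each reduce the chance an option is chosen), so the ratio is a positive dollar value per day of lost use avoided.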
13. Market Impacts of Reducing the Prevalence of Bovine Respiratory Disease in United States Beef Cattle Feedlots. Front Vet Sci 2017; 4:189. PMID: 29170739. PMCID: PMC5684707. DOI: 10.3389/fvets.2017.00189.
Abstract
Bovine respiratory disease (BRD) is a common endemic disease among North American feedlot cattle and can lead to significant economic losses for individual beef cattle feedlot producers through mortality and morbidity. With promising new management and technology research that could reduce BRD prevalence, this study evaluates the potential impacts of a reduction of BRD in the US beef cattle feedlot sector. Using a multi-market, multi-commodity partial equilibrium economic model of the US agricultural industry, we evaluate the market impacts of reduced BRD on producers in various livestock, meat, and feedstuffs industries. We find that as morbidity and mortality are reduced, beef cattle producers experience losses due to increased supplies (lower beef cattle prices) and increased demand for feedstuffs (higher feedstuff prices). Beef cattle processors see gains as the price of beef cattle falls, whereas feedstuff producers gain from higher feedstuff prices. Producers in the allied industries (pork, lamb, poultry, and eggs) see a small reduction in returns as consumers substitute toward less expensive beef products. Consumers gain welfare as the increase in beef cattle supply lowers beef prices, and these lower beef prices more than offset the small increases in pork, lamb, poultry, and egg prices. Overall, the potential economic welfare change from management and technologies that reduce BRD is a net gain for US society as a whole.
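The direction of the price effects the abstract describes falls out of basic partial-equilibrium logic: reduced BRD losses shift supply rightward, lowering the equilibrium price. The toy linear market below (all coefficients invented) shows that mechanism in its simplest form; the paper's model covers many interlinked markets.

```python
# Toy one-market partial-equilibrium sketch (invented coefficients, NOT
# the paper's multi-market model). Demand Qd = a - b*P; supply
# Qs = c + d*P + shift, where `shift` represents reduced BRD losses.

def equilibrium(a, b, c, d, shift=0.0):
    """Solve a - b*P = c + d*P + shift for the market-clearing (P, Q)."""
    price = (a - c - shift) / (b + d)
    quantity = a - b * price
    return price, quantity

p0, q0 = equilibrium(a=100.0, b=2.0, c=10.0, d=3.0)            # with BRD losses
p1, q1 = equilibrium(a=100.0, b=2.0, c=10.0, d=3.0, shift=5.0)  # losses reduced
# p1 < p0 and q1 > q0: producers face lower prices, consumers buy more cheaply.
```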
14. National Beef Quality Audit-2016: Phase 1, Face-to-face interviews. Transl Anim Sci 2017; 1:320-332. PMID: 32704657. PMCID: PMC7205349. DOI: 10.2527/tas2017.0039.
Abstract
The National Beef Quality Audit (NBQA) is conducted every 5 yr and was most recently conducted in 2016. Face-to-face interviews gauged progress in quality associated with live cattle production using procedures first utilized in NBQA-2011. The 2016 NBQA was the first in which interviews concerning fed steers and heifers were combined with an audit of market cow and bull beef. Face-to-face interviews were designed to elicit definitions of beef quality, estimate willingness to pay (WTP) for quality attributes, establish relative importance rankings for important quality factors, and assess images, strengths, weaknesses, potential threats, and shifting trends in the beef industry since the 2011 audit. Individuals making purchasing decisions in 5 market sectors of the steer/heifer and cow/bull beef supply chain were interviewed, including packers (n = 36), retailers (including large and small supermarket companies and warehouse food sales companies; n = 35), food service operators (including quick-serve, full-service, and institutional establishments; n = 29), further processors (n = 64), and peripherally related government and trade organizations (GTO; n = 30). Face-to-face interviews were conducted between January and November of 2016 using a designed dynamic routing system. Definitions (as described by interviewees) for 7 predetermined quality factors, including (1) How and where the cattle were raised, (2) Lean, fat, and bone, (3) Weight and size, (4) Visual characteristics, (5) Food safety, (6) Eating satisfaction, and (7) Cattle genetics, were recorded verbatim and categorized into similar responses for analysis. Compared with NBQA-2011, a higher percentage of companies were willing to pay premiums for guaranteed quality attributes, but on average they were willing to pay lower premiums than the companies interviewed in 2011.
Food safety had the highest share of preference among all interviewees, generating a double-digit advantage over any other quality factor. The 2 beef industries have an overall positive image among interviewees, and despite lingering weaknesses, product quality continued to be at the forefront of the strengths category for both steer and heifer beef and market cow and bull beef.
15. Economic Assessment of FMDv Releases from the National Bio and Agro Defense Facility. PLoS One 2015; 10:e0129134. PMID: 26114546. PMCID: PMC4482686. DOI: 10.1371/journal.pone.0129134.
Abstract
This study evaluates the economic consequences of hypothetical foot-and-mouth disease releases from the future National Bio and Agro Defense Facility in Manhattan, Kansas. Using an economic framework that estimates the impacts on agricultural firms and consumers, quantifies costs to non-agricultural activities in the epidemiologically impacted region, and assesses response costs to the government, we find the economic impacts to be substantial. Agricultural firms and consumers bear most of the impacts, followed by the government and regional non-agricultural firms.
16. Economic Assessment of Zoonotic Diseases: An Illustrative Study of Rift Valley Fever in the United States. Transbound Emerg Dis 2014; 63:203-14. PMID: 25052324. PMCID: PMC7169821. DOI: 10.1111/tbed.12246.
Abstract
This study evaluates the economic consequences of an outbreak of Rift Valley fever, a virus that spreads from livestock to humans, often through mosquitoes. Developing a 'one health' economic framework, we estimate the economic impacts on agricultural producers and consumers, government costs of response, costs and disruptions to non-agricultural activities in the epidemiologically impacted region, and human health costs (morbidity and mortality). We find that agricultural firms bear most of the negative economic impacts, followed by regional non-agricultural firms, human health, and government. Further, consumers of agricultural products benefit from small outbreaks due to bans on agricultural exports.
17. Phase I of The National Beef Quality Audit-2011: quantifying willingness-to-pay, best-worst scaling, and current status of quality characteristics in different beef industry marketing sectors. J Anim Sci 2013; 91:1907-19. PMID: 23408805. DOI: 10.2527/jas.2012-5815.
Abstract
The National Beef Quality Audit (NBQA)-2011 benchmarked the current status of, and assessed progress being made toward, quality and consistency of U.S. cattle, carcasses, and beef products after the completion of the first NBQA in 1991. Unlike previous NBQAs, objectives of the 2011 Phase I study were to determine how each beef market sector defined 7 quality categories, estimate willingness-to-pay (WTP) for the same quality categories by market sector, and establish a best-worst (B/W) scaling for the quality categories. Structured face-to-face interviews were conducted and responses were recorded using dynamic routing software over an 11-mo period (February to December 2011) with decision makers in each of the following beef market sectors: Feeders (n = 59), Packers (n = 26), Food Service, Distribution, and Further Processors (n = 48), Retailers (n = 30), and Government and Allied Industries (n = 47). All respondents participated in a structured interview consisting of WTP and B/W questions tied to 7 quality categories and were then asked to "define" each of the 7 categories in terms of what the category meant to them, so that results were not biased by researcher-supplied definitions. The 7 quality categories were a) how and where the cattle were raised, b) lean, fat, and bone, c) weight and size, d) cattle genetics, e) visual characteristics, f) food safety, and g) eating satisfaction. Overall, "food safety" and "eating satisfaction" were the categories of greatest and second most importance, respectively, to all beef market sectors except for Feeders. Feeders ranked "how and where the cattle were raised" and "weight and size" as the most important and second most important, respectively.
Overall, "how and where the cattle were raised" had the greatest odds of being considered a nonnegotiable requirement before the raw material for each sector would be considered for purchase and was statistically more important (P < 0.05) as a requirement for purchase than all other categories except "food safety." When all market sectors were considered, "eating satisfaction" was shown to generate the greatest average WTP percentage premium (11.1%), but that WTP premium value only differed statistically (P < 0.05) from "weight and size" (8.8%). Most notably, when a sector said that "food safety" was a nonnegotiable requirement, no sector was willing to purchase the product at a discounted price if the "food safety" of the product could not be assured.
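A common way to compute best-worst (B/W) scaling scores is the count-based score: for each item, the number of times it was chosen "best" minus the number of times chosen "worst", divided by the number of respondents. The tallies below are invented for illustration; they are not the audit's data.

```python
# Sketch of a count-based best-worst score (invented tallies, NOT the
# audit's data): score = (best_count - worst_count) / n_respondents.

def bw_score(best_count, worst_count, n_respondents):
    return (best_count - worst_count) / n_respondents

tallies = {"food safety": (120, 5),
           "eating satisfaction": (60, 10),
           "weight and size": (15, 80)}  # (times best, times worst)

n_respondents = 210
scores = {k: bw_score(b, w, n_respondents) for k, (b, w) in tallies.items()}
ranking = sorted(scores, key=scores.get, reverse=True)
```

Items chosen "best" far more often than "worst" score near +1 and items chosen "worst" more often score negatively, which is how a factor like "food safety" ends up at the top of the ranking.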
18.
Abstract
Traceability programs can cover the whole of life, or parts of it, for individual animals or for groups/lots of animals. Of 13 country or community traceability programs for cattle/beef, 11 are mandatory (4 encompass, or are scheduled to encompass, birth to retail; 7 cover birth to slaughter) while 2 are voluntary and encompass birth to slaughter. Of 10 country or community traceability programs for swine/pork, 2 are mandatory (1 covers birth to retail; 1 covers birth to slaughter) while 8 are voluntary. Of 6 country or community traceability programs for sheep/sheep-meat, 3 are mandatory (1 encompasses birth to retail; 2 encompass birth to slaughter) while 3 are voluntary. Mandatory birth-to-retail programs that include "post-slaughter individual animal identification (IAID) traceability" have been implemented for cattle/beef, swine/pork, and sheep/sheep-meat by the European Union and for cattle/beef by Japan. Many of the voluntary as well as mandatory birth-to-slaughter traceability programs for all three species are presumed (though not specified) to include "post-slaughter group/lot identification (GLID) traceability" - e.g., those qualifying products for shipment to the European Union. "Post-slaughter IAID traceability" can be accomplished in very-small, small, medium, large, and very-large packing plants using single-carcass processing units, tagging and separation/segregation, and/or deoxyribonucleic acid (DNA) fingerprinting technology, but all of these approaches are time-consuming and costly; and, to date, in most countries, there has been no reason compelling enough to cause industry to adopt such protocols or technology.
19. Soil Carbon Sequestration Strategies with Alternative Tillage and Nitrogen Sources under Risk. 2007. DOI: 10.1111/j.1467-9353.2007.00341.x.
20. Economic feasibility of no-tillage and manure for soil carbon sequestration in corn production in northeastern Kansas. J Environ Qual 2006; 35:1364-73. PMID: 16825456. DOI: 10.2134/jeq2005.0149.
Abstract
This study examined the economic potential of no-tillage versus conventional tillage to sequester soil carbon, using two rates of commercial N fertilizer or beef cattle manure for continuous corn (Zea mays L.) production. Yields, input rates, field operations, and prices from an experiment were used to simulate a distribution of net returns for eight production systems. Carbon release values from direct, embodied, and feedstock energies were estimated for each system and were used with soil carbon sequestration rates from soil tests to determine the amount of net carbon sequestered by each system. Values of carbon credits that would give managers an incentive to adopt production systems that sequester carbon at greater rates were derived. No-till systems had greater annual soil carbon gains, net carbon gains, and net returns than conventional tillage systems. Systems that used beef cattle manure had greater soil carbon gains and net carbon gains, but lower net returns, than systems that used commercial N fertilizer. Carbon credits would be needed to encourage the use of manure-fertilized cropping systems.
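The break-even carbon-credit logic in the last sentence can be written out directly: the credit price that makes a higher-sequestering but less profitable system competitive equals its net-return disadvantage divided by its extra net carbon sequestered. The dollar and carbon figures below are invented for illustration, not the study's estimates.

```python
# Sketch of the carbon-credit break-even calculation (invented values,
# NOT the study's estimates).

def breakeven_credit(net_return_gap, extra_net_c_sequestered):
    """$ per Mg C needed to offset a net-return disadvantage ($/ha vs Mg C/ha/yr)."""
    return net_return_gap / extra_net_c_sequestered

# e.g., a manure system earning $45/ha less but sequestering 0.9 Mg C/ha/yr more:
credit = breakeven_credit(45.0, 0.9)   # $/Mg C
```

Any credit price above this break-even value would make the manure-fertilized system the more profitable choice, which is the sense in which carbon credits "would be needed" to encourage its adoption.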