
Risk Analysis Forum

The Office of Risk Assessment and Cost Benefit Analysis (ORACBA) sponsors a Science, Policy and Risk Forum series featuring prominent speakers on risk assessment and its intersection with economic analysis, with the goal of building capacity in risk analysis. Four or more seminars are scheduled each year. The seminars address emerging issues in risk assessment or economic analysis, new methodological techniques, or other special topics of interest. Past seminars have examined microbial and chemical risks to food safety, invasive species risks, animal disease risks, and resource risks. ORACBA occasionally hosts full- or half-day workshops and recently convened a workshop on benefit cost analysis.

Below are links to presentations made at the Forums from 2007 to the present.

To receive notifications about future forums and other news from ORACBA, join the ORACBA mailing list.

Dealing with Messy Data?

Jeff Bailey, Chief, Summary, Estimation and Disclosure Methodology Branch, National Agricultural Statistics Service, USDA
October 22, 2019

  • Quality data are critical for policy development, program administration, business decisions, and process improvement. Data sources vary greatly, and we must understand the quality of the data we are using; the data must be “fit for use.” When the data come from a survey, we should ask about sample sizes, data collection methods, response rates, estimation methodology, and measures of error. How nonresponse is handled is key: are missing units addressed by reweighting or by imputation? For item nonresponse, what methods are used? What is the quality of the data, and how have the data been analyzed? Drawing on my NASS experience, I discuss these questions and methodologies for dealing with messy data.
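
To make the reweighting idea above concrete, the following is a minimal Python sketch of a weighting-class adjustment for unit nonresponse, one common way to reweight for missing units. The column names and data are hypothetical, and the sketch is illustrative only, not NASS's production methodology.

    # Minimal sketch: weighting-class adjustment for unit nonresponse.
    # All column names and values are hypothetical.
    import pandas as pd

    frame = pd.DataFrame({
        "unit_id":   [1, 2, 3, 4, 5, 6],
        "wt_class":  ["A", "A", "A", "B", "B", "B"],   # e.g., region x farm size
        "base_wt":   [10.0, 10.0, 10.0, 20.0, 20.0, 20.0],
        "responded": [True, True, False, True, False, False],
    })

    # Within each weighting class, respondents absorb the weight of the
    # nonrespondents: multiply base weights by (class total / respondent total).
    class_total = frame.groupby("wt_class")["base_wt"].transform("sum")
    resp_total = (frame["base_wt"].where(frame["responded"], 0.0)
                  .groupby(frame["wt_class"]).transform("sum"))

    frame["adj_wt"] = 0.0
    adjusted = frame["base_wt"] * class_total / resp_total
    frame.loc[frame["responded"], "adj_wt"] = adjusted[frame["responded"]]

    # The adjusted respondent weights sum to the original class totals.
    print(frame[["unit_id", "wt_class", "adj_wt"]])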

U.S. Perspective on the Codex Committee on Contaminants in Food: Development of Draft Guidelines for Risk Analysis of Instances of Contaminants in Food Where There Is No Regulatory Level or Risk Management Framework

Lauren Posnick Robin, Sc.D., FDA/CFSAN. U.S. Delegate Codex Committee on Contaminants in Food
July 18, 2019

  • The Codex Committee on Contaminants in Foods (CCCF) is one of the general subject committees of the Codex Alimentarius, an international food safety standards-setting organization. The U.S. participates in the yearly meetings of CCCF through a delegation headed by the U.S. Food and Drug Administration but drawing on the expertise of other government agencies and public stakeholders, including the United States Department of Agriculture. In 2017, CCCF agreed to endorse new work, led by New Zealand, on the development of risk analysis guidelines to address chemicals inadvertently present in food at low levels. In 2019, CCCF forwarded the final work, Draft Guidelines for Risk Analysis of Instances of Contaminants in Food Where There Is No Regulatory Level or Risk Management Framework, to the Codex Alimentarius Commission for final approval. The guidelines document outlines a rapid risk analysis approach using a decision tree, the threshold of toxicological concern (TTC) construct, and rapid exposure assessment. Among the first steps in the decision tree is application of a cut-off value of 1 µg/kg. The rapid risk analysis approach is intended to allow for prioritization of only those instances of contamination where further in-depth investigations are warranted. This presentation will review the history of development of the guidelines document and possible practical applications, in the context of an explanation of the activities of the Codex Alimentarius and the CCCF, but it will not cover the science of the TTC concept.
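
As a rough illustration of the triage logic described above (a cut-off check followed by comparison of a rapid exposure estimate against a TTC value), here is a short Python sketch. The decision steps are deliberately simplified, and the default cut-off and the Cramer-class TTC values (30, 9, and 1.5 µg/kg body weight per day) are commonly cited figures used here as placeholder assumptions, not the text of the Codex guidelines.

    # Placeholder TTC values by Cramer class (ug per kg body weight per day).
    TTC_UG_PER_KG_BW_DAY = {"I": 30.0, "II": 9.0, "III": 1.5}

    def rapid_triage(conc_ug_per_kg_food, intake_kg_food_per_day,
                     body_weight_kg, cramer_class,
                     cutoff_ug_per_kg_food=1.0):
        """Coarse triage of a low-level contaminant finding (illustrative only)."""
        if conc_ug_per_kg_food < cutoff_ug_per_kg_food:
            return "below cut-off: no further action"
        exposure = conc_ug_per_kg_food * intake_kg_food_per_day / body_weight_kg
        if exposure <= TTC_UG_PER_KG_BW_DAY[cramer_class]:
            return "below TTC: low priority"
        return "above TTC: in-depth investigation warranted"

    # Example: 5 ug/kg in a food eaten at 0.1 kg/day by a 60 kg adult, Cramer class III.
    print(rapid_triage(5.0, 0.1, 60.0, "III"))   # -> below TTC: low priority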

Emerging Foods and Food Technologies: Challenges and Benefits

Dr. Richard Williams, Senior Affiliated Scholar, Mercatus Center at George Mason University and Affiliated Scholar at the Center for Growth and Opportunity at Utah State University
March 21, 2019

  • There are two competing movements in the food world today. The first is a consumer desire to simplify food choices, much like Thoreau’s desire to simplify life. Within this movement is an attitude against GMOs, synthetic chemicals, and artificial ingredients; in fact, against anything a consumer cannot easily pronounce. The second is the emergence of a category of novel foods, particularly foods produced by entirely new processes. In a sense, nature’s competition for survival will be replaced by market competition to supply better foods for humans. These new technologies will allow farmers, producers, retailers, and consumers to exercise more control over the creation of foods, which can enhance their safety and nutritional content. New technologies include 3D food printers, alternative proteins (including plant- and cell-based meat and fish), genetic engineering, and AI nutrition devices. Benefits from these technologies include food safety benefits (microbial, pandemics, antibiotics); nutrition benefits, including personalized nutrition; and environmental and ethical benefits. The primary challenges will come from the initially high costs associated with some of the new technologies (including unequal distribution), classical fears of new technologies, and regulatory challenges. However, it is likely that the technology improvements will ultimately outlast and outcompete the “simplify” movement.

Emerging Issues in Risk Assessment: Chemicals in Drinking Water and Food Packaging

Jonathan Gledhill and James Rollins, Policy Navigation Group
September 26, 2018  

  • In the first portion of the forum, Jonathan Gledhill, President of the Policy Navigation Group, led a discussion on emerging risk assessment issues associated with the use of perfluoroalkyl and polyfluoroalkyl substances (PFAS). This discussion looked at issues related to PFAS contamination in water as well as the use of these substances in food packaging. In the second portion of the forum, James Rollins, also of the Policy Navigation Group, provided an update on decisions regarding regulation of perchlorate as a contaminant in drinking water. The treatment of perchlorate has highlighted a number of issues with application to risk assessment well beyond the regulatory status of a single chemical. The regulatory status of perchlorate has been in a state of flux for over 10 years. Mr. Rollins provided an update on where things stand with respect to the resolution of both the risk issues and the regulatory status.

Application of an Updated Cramer et al. Decision Tree to Safety Assessment

Dr. Szabina Stice
U.S. Food and Drug Administration, Center for Food Safety and Applied Nutrition
March 28, 2018 

  • The concept of the Threshold of Toxicological Concern (TTC) uses the principles of chemical grouping and read-across to screen chemicals at low levels of exposure for prioritization of follow-up testing. The TTC approach relies on grouping chemicals using the Cramer et al. 1978 Decision Tree (DT), which was developed 40 years ago as a tool to prioritize orally ingested substances based on their chemical structure. Combining knowledge of structure, metabolism, and toxicity, a sequence of yes/no questions was devised that leads to the assignment of a substance to one of three classes of toxic concern. Given the scientific knowledge accumulated since 1978, the DT is long overdue for an update. This presentation will discuss current work to develop more refined DT questions, which expand the scheme to six classes of toxic concern, aimed at more accurate allocation of a broad range of structures to toxicity classes with appropriate thresholds.
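
To illustrate the mechanics of tree-based class assignment described above, here is a toy Python sketch in which a substance is routed through sequential yes/no questions to a class of toxic concern. The questions shown are placeholders, not the actual Cramer et al. (1978) questions or the revised six-class scheme.

    def assign_class(is_normal_body_constituent, has_structural_alert,
                     is_simple_structure):
        """Walk a toy yes/no decision tree and return a class of toxic concern.
        The three questions are placeholders for illustration only."""
        if is_normal_body_constituent:      # placeholder Q1
            return "Class I (low concern)"
        if has_structural_alert:            # placeholder Q2
            return "Class III (high concern)"
        if is_simple_structure:             # placeholder Q3
            return "Class I (low concern)"
        return "Class II (intermediate concern)"

    print(assign_class(False, True, False))   # -> Class III (high concern)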

The Changing Regulatory Environment in Europe

Dr. Ragnar Löfstedt, Director, King’s Centre for Risk Management, King’s College London
December 8, 2017

  • The presentation discussed evidence-based policy making post-Brexit and the changing nature of regulation in Europe, including the role of science, transparency, and risk communication.

WHO/IPCS Guidance on Probabilistic Dose-Response Assessment: Basic Principles and General Approach

Dr. Weihsueh Chiu, Texas A&M University, College Station, Texas
July 20, 2017

  • In 2014, WHO/IPCS published a guidance document on evaluating uncertainties in human health dose-response assessment. Rather than single values for the point of departure (POD) and for any adjustment/uncertainty factors, the WHO/IPCS approach uses uncertainty distributions that reflect the assumed or estimated uncertainties in each of those aspects.  Additionally, it quantitatively defines the protection goals in terms of incidence (I) and magnitude (M) of the critical effect in the human population.  By contrast, traditional approaches for developing dose-response toxicity values result in a single value (e.g., RfD, ADI) whose uncertainty is not known and for which the associated values for I and M are not quantified.  By quantifying the overall uncertainties in the target human dose at explicitly specified values of I and M, the probabilistic approach developed by the WHO/IPCS expert group allows risk managers to better weigh the benefits from reduced human health effects associated with different risk management options against other considerations, including economic costs.  Further, the probabilistic analyses can inform the value of information associated with different options for developing a higher tier assessment. Finally, WHO/IPCS reviewed the literature to derive parameter distributions based on analyses of historical data, enabling probabilistic dose-response assessments to be performed rapidly with minimal additional effort as compared to traditional deterministic analyses.   This presentation will provide an overview of the principles underlying the WHO/IPCS approach.
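
A minimal Monte Carlo sketch of the probabilistic idea described above follows: the point of departure and the adjustment factors are treated as uncertainty distributions rather than single values, and a lower percentile of the resulting distribution can be reported as a probabilistic reference value. The lognormal distributions and parameter values below are illustrative assumptions, not the WHO/IPCS default distributions.

    import numpy as np

    rng = np.random.default_rng(1)
    n = 100_000

    # Uncertainty in the point of departure (e.g., a benchmark dose), mg/kg-day.
    pod = rng.lognormal(mean=np.log(10.0), sigma=0.3, size=n)

    # Uncertainty in the interspecies and human-variability adjustments
    # (illustrative placeholder distributions).
    af_interspecies = rng.lognormal(mean=np.log(3.0), sigma=0.4, size=n)
    af_human_var = rng.lognormal(mean=np.log(10.0), sigma=0.5, size=n)

    # Distribution of the target human dose for the stated incidence (I) and
    # magnitude (M) protection goal.
    hd_mi = pod / (af_interspecies * af_human_var)

    print(f"median target human dose: {np.median(hd_mi):.3f} mg/kg-day")
    print(f"5th percentile (95% confidence): {np.percentile(hd_mi, 5):.3f} mg/kg-day")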

Considering the Paradox of Salmonella Control

Dr. Guy Loneragan, Texas Tech University
February 15, 2017

  • Salmonella presents a paradox for control. Some approaches, such as vaccination against Salmonella Enteritidis in laying hens, appear to have efficacy in specific situations. However, approaches to control in other situations appear to be ineffective. Part of the paradox of Salmonella control may be that its ecology in many situations is far more complex than previously believed. For example, non-mammalian (or non-reptilian) vectors ought to be considered in some situations. If so, animal-to-animal transmission and the observed epidemiology might be more fully explained. Further, the acquisition of virulence mechanisms and the – sometimes simultaneous – loss of gene function in Salmonella have resulted in a spectrum of agent-host interactions that is yet to be fully understood. The implications for animal health, and particularly human health, are profound, in that some efforts – based on a limited understanding – may be too simplistic to be effective. Further, with a more diverse set of foods implicated in human exposure, a broader and more complete systems awareness of Salmonella’s behavior in the wider environment might identify novel aspects of control and the reasons that some approaches to control appear to be ineffective.

Benefit Cost Analysis: Advancing Analysis Workshop, Selected Presentations from the Society for Benefit Cost Analysis 2017 Annual Meeting

    • A Consumer’s Guide to Regulatory Impact Analysis, Brian Mannix, George Washington University
    • Behavioral Responses to Health Information and Warnings, Rosemarie Lavaty and Carolyn Wolff, U.S. Food and Drug Administration
    • Fukushima: U.S. Response and the Short-Term Impact on U.S.-Japan Trade in Seafood, Aliya Sassi and Peter Vardon, U.S. Food and Drug Administration
    • Hurdle Rates, Declining Discount Rates, and Uncertain Opportunity Costs, Daniel Wilmoth, U.S. Small Business Administration
    • Applying VSL When Persons Choose Risk or Care about Risks to Others: the Case of Commercial Space Exploration, Timothy Brennan, Resources for the Future
    • Willingness to Pay for Mortality Risk Reduction in Chinese Cities, Sandra Hoffmann and Alan Krupnick, USDA Economic Research Service and Resources for the Future (respectively)
    • Revising Analysis and Methods in Response to New Data or Information (Retrospective BCA of Federal Rules), Randall Lutter, University of Virginia and Resources for the Future
    • Retrospective Benefit Cost Analysis of Cooperative Interstate Shipment Program, Flora Tsui, Food Safety and Inspection Service
    • Retrospective Review of DOE’s Energy Efficiency Standards, Arthur Fraas and Sofie Miller, Resources for the Future and George Washington University (respectively)

Salmonella Risks Pre-Harvest and Their Importance for Food Safety

Karin Hoelzer
Pew Charitable Trusts
March 14, 2016

  • Salmonella can infect livestock and is known to be present on many farms and feedlots in the U.S. This presence in ‘pre-harvest’ environments is a recognized food safety risk. Efforts to eliminate Salmonella from livestock production have been successful in some cases, including, for example, large parts of the Scandinavian pig and poultry sectors. However, minimizing Salmonella in pre-harvest settings poses formidable challenges. Salmonella can be introduced onto premises in numerous ways, and the relative importance of different pathways is largely unclear; the organism can persist in the environment for long periods of time, can be challenging to detect due to intermittent shedding, and can be particularly difficult to control because serotypes may differ considerably in their epidemiology. Various pre-harvest interventions have been designed that may directly target Salmonella, that may favor competition with non-pathogenic bacteria (e.g., probiotics), or that may reduce exposure. While some of these interventions have shown promising results, at least in experimental settings, technological, logistical, economic, and regulatory challenges complicate their implementation. Moreover, the value of pre-harvest interventions has remained the subject of intense scientific debate. This presentation will discuss the challenges and promise of pre-harvest interventions for Salmonella, and highlight the importance of risk assessment and risk-based interventions in this context.

When is Prevalence (based on presence-absence testing) Sufficient to Predict a Reduction in Illnesses in a Microbiological Food Safety Risk Assessment? 

Dr. Mike Williams
Food Safety and Inspection Service
December 10, 2015

  • Process models that include the myriad pathways that pathogen-contaminated food may traverse before consumption and the dose-response function to relate exposure to likelihood of illness may represent a “gold standard” for quantitative microbial risk assessment. Nevertheless, simplifications that rely on measuring the change in contamination occurrence of a raw food at the end of production may provide reasonable approximations of the effects measured by a process model. In this study, we parameterized three process models representing different product-pathogen pairs (i.e., chicken-Salmonella, chicken-Campylobacter, and beef-E. coli O157:H7) to compare with predictions based on qualitative testing of the raw product before consideration of mixing, partitioning, growth, attenuation, or dose-response processes. The results reveal that reductions in prevalence generated from qualitative testing of raw finished product usually underestimate the reduction in likelihood of illness for a population of consumers. Qualitative microbial testing results depend on the test’s limit of detection. The negative bias is greater for limits of detection that are closer to the center of the contamination distribution and becomes less as the limit of detection is moved further into the right tail of the distribution. Nevertheless, a positive bias can result when the limit of detection refers to very high contamination levels. Changes in these high levels translate to larger consumed doses for which the slope of the dose-response function is smaller compared with the larger slope associated with smaller doses. Consequently, in these cases, a proportional reduction in prevalence of contamination results in a less than proportional reduction in probability of illness. The magnitudes of the biases are generally less for nonscalar (versus scalar) adjustments to the distribution.
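
The sketch below illustrates, under simplified assumptions, the comparison described above: the reduction in test-positive prevalence at a fixed limit of detection versus the reduction in expected illnesses under a dose-response function. The lognormal contamination distribution, the exponential dose-response model, and all parameter values are illustrative stand-ins, not the process models used in the study.

    import numpy as np

    rng = np.random.default_rng(7)
    n = 200_000
    r = 1e-4        # illustrative exponential dose-response parameter
    lod = 10.0      # qualitative test limit of detection, CFU per serving

    def summarize(log_mean):
        """Prevalence above the LOD and mean illness probability for a lognormal
        contamination distribution with the given log-scale mean."""
        dose = rng.lognormal(mean=log_mean, sigma=2.0, size=n)   # CFU per serving
        prevalence = np.mean(dose > lod)
        p_ill = np.mean(1.0 - np.exp(-r * dose))
        return prevalence, p_ill

    prev0, ill0 = summarize(2.0)        # baseline contamination
    prev1, ill1 = summarize(2.0 - 0.7)  # after a scalar (multiplicative) reduction

    # With these assumptions the measured prevalence reduction understates the
    # illness reduction, consistent with the negative bias described above.
    print(f"prevalence reduction: {1 - prev1 / prev0:.1%}")
    print(f"illness reduction:    {1 - ill1 / ill0:.1%}")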

Trends in Reported Foodborne Illness in the United States, 1996-2013

Mark Powell, USDA/ORACBA
September 17, 2015  

  • Retrospective review is a key to designing effective food safety measures. The analysis examines trends in the reported incidence of U.S. foodborne illness using both a conventional generalized linear model and penalized B-spline regression. B-spline regression is a semi-parametric, locally-controlled method that makes no assumptions about the form of the trend. To address the sensitivity of B-spline regression to choices about the number and location of join-points called knots, penalized B-spline regression imposes a “roughness” penalty on differences among neighboring B-spline regression coefficients. The optimal degree of smoothing is determined based on statistical model selection criteria (e.g., generalized cross-validation). The result is a flexible, smooth curve that avoids over-fitting the data, while providing a statistical test for trend. The findings indicate a lack of evidence for continuous reduction in foodborne illnesses in the U.S. during 1996-2013.
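
A minimal sketch of a penalized B-spline (P-spline) trend fit follows, using simulated counts in place of the surveillance data; the roughness penalty is chosen by generalized cross-validation over a coarse grid. The knot layout, second-order difference penalty, and ordinary least-squares fit are a simplified stand-in for the generalized-linear-model analysis described above.

    import numpy as np
    from scipy.interpolate import BSpline

    rng = np.random.default_rng(0)
    years = np.arange(1996, 2014)
    y = rng.poisson(lam=40 + 5 * np.sin((years - 1996) / 3.0))   # toy illness counts

    # Cubic B-spline basis on equally spaced knots.
    k, nseg = 3, 8
    dx = (years.max() - years.min()) / nseg
    knots = np.linspace(years.min() - k * dx, years.max() + k * dx, nseg + 2 * k + 1)
    B = BSpline(knots, np.eye(len(knots) - k - 1), k)(years)     # design matrix

    # Second-order difference penalty on neighboring coefficients ("roughness").
    D = np.diff(np.eye(B.shape[1]), n=2, axis=0)

    def fit(lam):
        """Penalized least-squares fit; returns fitted trend and effective df."""
        A = B.T @ B + lam * D.T @ D
        beta = np.linalg.solve(A, B.T @ y)
        edf = np.trace(B @ np.linalg.solve(A, B.T))   # trace of the smoother matrix
        return B @ beta, edf

    def gcv(lam):
        fitted, edf = fit(lam)
        return len(y) * np.sum((y - fitted) ** 2) / (len(y) - edf) ** 2

    best = min(10.0 ** np.arange(-2, 5), key=gcv)
    trend, edf = fit(best)
    print(f"chosen penalty: {best:g}, effective degrees of freedom: {edf:.1f}")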

On Objective Risk

Dima Yazji Shamoun, co-author
September 15, 2015  

  • Objectivity in the science of risk plays a monumental role in the projection of the benefits from health and safety regulations, which constitute the majority of total reported benefits of all federal regulations. Claims concerning the accuracy of regulatory risk assessments have been un-testable so far in that they focus on whether a risk assessment over- or underestimates the risk of exposure to certain hazards; yet such claims rely on an implication that the true level of risk can be known. This presentation proposes moving the debate from the realm of the un-testable to the realm of the testable "process objectivity" of the science of risk. Consistent adherence to a process should yield objective results. There is a sizable body of guidelines and recommendations on sound risk assessment practices produced by the federal government and by various scientific bodies. The proposed process incorporates these guidelines and recommendations and is testable, objective, and—if adhered to consistently—has the potential to shed light on the accuracy of the benefits calculus of major federal health and safety regulations.

Retrospective Analysis of Significant Rules: APHIS Avocado Import Regulations

USDA Animal and Plant Health Inspection Service, Policy and Program Development
July 23, 2015

  • Agencies can benefit from reviewing past regulatory actions to better understand if anticipated outcomes differ from observed outcomes and if there were any unintended consequences of the regulation. APHIS is developing a framework to conduct retrospective analyses of significant rules. Using regulations of avocado imports as a case study, we assess the accuracy of our economic projections concerning domestic production and prices, consumption, and trade. We revisit issues raised in public comments to the proposed rule to assess how we would respond to the issues today, given the additional data that is now available. We anticipate using this framework to look back at additional significant rules to incorporate lessons learned from prior actions for future analyses.

FDA Risk Modeling Tools for Enhancing Fresh Produce Safety: Modeling the interface between the environment and fresh produce

David Oryang, Sherri Dennis and Yuhuan Chen, Center for Food Safety and Applied Nutrition, U.S. Food and Drug Administration
July 23, 2014

  • This presentation describes FDA’s multi-scale efforts to develop risk modeling tools for enhancing fresh produce safety. Two case studies that model the complex interface between the environment and fresh produce will be examined: an agent-based produce risk assessment model and a geospatial risk assessment model.

New Risk Assessment Tools on Foodrisk.org

Clare Narrod, Research Scientist and Risk Analysis Program Manager, and Kyle McKillop, IT Program Manager, Joint Institute for Food Safety and Applied Nutrition (JIFSAN)
July 15, 2014

  • Foodrisk.org is a comprehensive on-line resource for the food safety risk analysis community. The web site is home to a variety of risk assessment models, food safety risk tools and a library of completed risk assessments as well as some unique data sets. It is operated by the Joint Institute for Food Safety and Applied Nutrition in collaboration with FDA’s Center for Food Safety and Applied Nutrition and USDA’s Food Safety and Inspection Service. JIFSAN has been working with a number of colleagues over the last couple of years to develop tools to assist the risk analysis community. Dr. Narrod and Kyle McKillop will discuss new risk assessment tools recently added to foodrisk.org as well as several that are in progress.

Optimal Food Safety Sampling Under a Budget Constraint

Mark Powell, USDA/ORACBA
September 18, 2013

Risk Analysis: Advancing Analysis Workshop

June 18, 2013

Procedures for the Pesticide Evaluation-Assessment in the EU, Role of EFSA (PPTX, 3.9 MB)

Dr. Jordi Serratosa, European Food Safety Authority (EFSA)
April 4, 2013

Estimation of cancer risks and benefits associated with a potential increased consumption of fruits and vegetables (PPT, 706 KB)

Rick Reiss, Principal Scientist, Exponent
September 25, 2012

Risk versus Hazard – Lessons from Europe

Ragnar Löfstedt, King’s College London
April 5, 2012

12/03/2010

Application of a multicriteria decision making model based on probabilistic inversion to assess nanotechnology-enabled food products, Villie Flari, Food and Environment Research Agency, United Kingdom, and Rabin Nelson, University of Delft, the Netherlands

11/16/2010

Moving beyond nanogeneralities: Providing focus to nanopolicy progress, Richard Canady, International Life Sciences Institute Research Foundation; Steve Froggett, Expert Consultant, ICF International, Inc.; Guillaume Gruere, International Food Policy Research Institute

03/16/2010

The EPA’s Stochastic Human Exposure and Dose Simulation (SHEDS)-Dietary Model, Valerie Zartarian, Ph.D., and Jianping Xue, M.D., M.S., Environmental Protection Agency, Office of Research and Development, National Exposure Research Laboratory, Human Exposure and Atmospheric Sciences Division, Exposure Modeling Research Branch

10/07/2009

Risk Analysis for Nanotechnology: State of the Science and Implications, Jo Anne Shatkin, Ph.D., Managing Director, CLF Ventures, Inc., Boston, Massachusetts

11/12/2008

State-of-the-Science Workshop Report: Issues and Approaches in Low Dose-Response Extrapolation for Environmental Health Risk Assessment, Ronald H. White, Ila Cote, Lauren Zeise, Mary Fox, Francesca Dominici, Thomas A. Burke, Paul D. White, Dale B. Hattis and Jonathan M. Samet; Presenter: Mary Fox

05/13/2008

Agrifood Nanotechnology: Upstream Assessment of Risk and Oversight (PPT, 6.4 MB), Prof. Jennifer Kuzma, Center for Science, Technology, and Public Policy, Humphrey Institute, University of Minnesota

11/19/2007

Dietary Exposure Assessment at the Food and Drug Administration: A comparison of exposure assessment methods used in the Total Diet Study and analyses of individual food products, Dr. Michael DiNovi and Katie Egan, U.S. Food and Drug Administration, Center for Food Safety and Applied Nutrition

11/13/2007

Are interdisciplinary approaches of any use to economists and risk assessors?, Dr. Robert O’Connor, National Science Foundation, Social, Behavioral and Economic Sciences

10/16/2007

Spring Valley Public Health Scoping Study, Legacy of US WWI Chemical Weapons Research: Case-Study of Environmental Epidemiology, Risk and Community Health Assessment in Washington, DC, Dr. Mary Fox, Johns Hopkins Bloomberg School of Public Health