Application of Quantitative Risk Assessment Methods for Food Quality

INTRODUCTION

Risk analysis is a logical, structured, and consistent process that aims to provide information about the risk of introduction, expression, and dissemination of diseases, assessing their economic impact and the consequences for public and animal health (Chapman, Otten, Fazil, Ernst, & Smith, 2016). It is a tool that encompasses three elements with different objectives that are nevertheless strongly linked: risk assessment, risk management, and risk communication (Buchanan, 2004; Lammerding, 1997; OMS & FAO, 2014).

Quantitative microbiological risk assessment (QMRA) is the scientific framework that characterizes the microbiological risks involved in the food production chain (Chapman et al., 2016; WHO/FAO, 2008). Currently, several models, methodologies, and procedures assist in risk analysis for decision making (Aven & Zio, 2018). The concepts involved in information collection and risk management are based on the current state of knowledge and information available in the literature (Aven & Zio, 2018).

QMRA aims to scientifically assess the risk of occurrence of food spoilage and the likely severity, known or potential, resulting from human exposure to food hazards. It includes four steps: hazard identification, hazard characterization, exposure assessment, and risk characterization (Lammerding, 1997; OMS & FAO, 2014; Vose, 2008).

The QMRA model should not only be used to estimate the probability of exposure to a microbiological risk but should also provide information that helps to elaborate strategies, interventions, and controls for hazards that can be found along the production chain (Barron, Redmond, & Butler, 2002). These models incorporate techniques for examining the variability associated with raw materials, microorganisms, and production processes. Additionally, simulations and sensitivity analyses can help identify the critical points along the production chain. Furthermore, mitigation measures can be integrated into the model to quantify process effectiveness and the response of the production system (Barron et al., 2002).
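To make this concrete, the minimal sketch below (in Python) propagates hypothetical distributions for initial contamination, storage temperature, and storage time through a simple growth model, estimates the fraction of units exceeding an assumed spoilage level, ranks the inputs with a Spearman-based sensitivity analysis, and re-runs the model under a hypothetical cold-chain mitigation. All distributions, parameters, and thresholds are illustrative assumptions, not values from a published model.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n = 10_000  # Monte Carlo iterations

# Hypothetical input distributions (illustrative values only)
log_n0 = rng.normal(2.0, 0.5, n)      # initial count, log10 CFU/g
temp = rng.normal(7.0, 2.0, n)        # storage temperature, deg C
time_h = rng.uniform(24, 120, n)      # storage time, h

# Illustrative square-root-type secondary model for the growth rate
mu = (0.03 * np.maximum(temp - (-2.0), 0.0)) ** 2   # log10 CFU/g per h

log_nf = log_n0 + mu * time_h         # predicted count at the end of storage

# Fraction of units exceeding an assumed spoilage level of 7 log10 CFU/g
risk = np.mean(log_nf >= 7.0)
print(f"Estimated probability of spoilage: {risk:.3f}")

# Sensitivity analysis: rank inputs by Spearman correlation with the output
for name, x in [("initial count", log_n0), ("temperature", temp), ("time", time_h)]:
    rho, _ = stats.spearmanr(x, log_nf)
    print(f"{name:15s} rho = {rho:+.2f}")

# Hypothetical mitigation: colder, less variable storage temperature
temp_m = rng.normal(4.0, 1.0, n)
mu_m = (0.03 * np.maximum(temp_m - (-2.0), 0.0)) ** 2
risk_m = np.mean(log_n0 + mu_m * time_h >= 7.0)
print(f"Probability of spoilage with improved cold chain: {risk_m:.3f}")
```

In such a sketch, the input with the strongest rank correlation with the output points to the stage or factor where an intervention is likely to be most effective, and the mitigation scenario quantifies the expected benefit of that intervention.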

Risk analysis allows control points to be identified along the food processing chain, which are essential for making preventive and corrective decisions. Thus, it is possible to evaluate the benefits and costs of each action, thereby enhancing the efficiency of the risk management process (Membre & Boue, 2018).

Although there are several QMRA models developed for pathogenic microorganisms, only a few quantify food spoilage (Snyder & Worobo, 2018). The control of microorganisms associated with food spoilage directly impacts food quality. Thus, the development of QMRA models for spoilage microorganisms may be a valuable tool to support control and intervention measures that prevent food deterioration along the chain. This chapter explores the concepts used to develop spoilage QMRA models.

CONCEPTS OF RISK ASSESSMENT FOR MICROBIAL SPOILAGE

Spoilage can be defined as a process or a change that makes a product undesirable or unacceptable for the consumer. Spoilage is the result of the biochemical activity of microbial metabolism that produces alcohols, sulfur compounds, hydrocarbons, fluorescent pigments, organic acids, esters, carbonates, and diamines, which modify food taste, odor, and texture (Abdel-Aziz, Asker, Keera, & Mahmoud, 2016; Nychas & Panagou, 2011; Petruzzi, Corbo, Sinigaglia, & Bevilacqua, 2017). Spoilage can occur at any stage of the chain. Thus, careful monitoring from production to distribution, as well as product storage in retail and household refrigerators, is essential to ensure product safety and quality (Abdel-Aziz et al., 2016; Nychas & Panagou, 2011).

QMRA models developed to evaluate spoilage use the exposure assessment in a broader sense. They consider the contribution of each step during food processing or distribution, which is quantified with the ultimate goal of identifying and evaluating management options to control or reduce that risk (Gougouli & Koutsoumanis, 2017). Furthermore, the variability and uncertainty affecting the microbial response are preponderant factors in estimating the risk of spoilage (Koutsoumanis, 2009). These concepts will be discussed later in this chapter.
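One simple way to operationalize this broader exposure assessment is to treat each stage of the chain as a module that contributes a variable log increase to the microbial level, as in the illustrative Python sketch below. The stages, distributions, and parameter values are assumptions made for the example rather than data for a specific product.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 5_000  # number of simulated product units

# Hypothetical chain stages: (mean log10 increase, standard deviation) per stage
stages = {
    "processing":   (0.2, 0.1),
    "distribution": (0.8, 0.3),
    "retail":       (1.0, 0.4),
    "home storage": (1.5, 0.6),
}

log_level = rng.normal(2.0, 0.5, n)  # assumed initial contamination, log10 CFU/g
contribution = {}

for stage, (mean, sd) in stages.items():
    delta = np.maximum(rng.normal(mean, sd, n), 0.0)  # variable growth at this stage
    log_level = log_level + delta
    contribution[stage] = delta.mean()

print("Mean log10 increase contributed by each stage:")
for stage, c in contribution.items():
    print(f"  {stage:13s} {c:.2f} log10 CFU/g")

print(f"Mean level at consumption: {log_level.mean():.2f} log10 CFU/g")
```

Quantifying each module separately in this way makes it easy to see which step of the chain contributes most to the final level and, therefore, where management options would have the greatest leverage.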

The QMRA model structure considers as fundamental parameters the hazard definition, the product or food matrix, the exposure scenario, and the target population. Risk is defined as the likelihood of an adverse effect on an organism, system, or population caused, under specified circumstances, by exposure to an agent (Barlow et al., 2015; Vose, 2008). For food QMRA, the term "risk" has been used to describe the likelihood that food is unfit for consumption, whether for reasons of contamination or spoilage (WHO/FAO, 2008).

The first step in developing a food spoilage QMRA model is to determine the sources that may cause variation along the food chain. These sources are related to the production environment, storage, and microbiological and consumer variability (Koutsoumanis, 2009). The next step is hazard identification, which indicates the effects considered to be adverse regardless of the dose required or the specific mechanisms involved in the process.

Following this, hazard characterization provides a quantitative and qualitative estimate of adverse effects, so that the dose-response relationship and the mode of action for these effects can be established (Vose, 2008). Exposure assessment involves evaluating the modes, magnitudes, duration, and timing of actual or anticipated exposure, and the number and nature of those likely to be exposed (WHO/FAO, 2008). Risk characterization, the final stage of risk assessment, is the estimation of the likelihood of food spoilage as a consequence of exposure, taking into account the results of hazard identification, hazard characterization, and exposure assessment (Membre & Boue, 2018).
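Bringing these steps together, risk characterization can be expressed as the probability that the level reached at consumption (the output of the exposure assessment) exceeds the spoilage level at which consumers reject the product (derived from hazard characterization). The short sketch below illustrates this calculation with assumed distributions for both quantities.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10_000

# Exposure assessment output: level at consumption (assumed distribution, log10 CFU/g)
log_consumption = rng.normal(6.0, 1.2, n)

# Hazard characterization: spoilage level at which consumers reject the product,
# itself variable between consumers and products (assumed distribution)
log_rejection = rng.normal(7.5, 0.5, n)

# Risk characterization: proportion of units that are unacceptable when consumed
risk = np.mean(log_consumption >= log_rejection)
print(f"Estimated proportion of spoiled units at consumption: {risk:.1%}")
```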

The implementation of risk assessment may be hampered by many uncertainties resulting from information deficiencies or critical data gaps (Kouame-Sina et al., 2012). When this occurs, plausible assumptions are made according to the current state of scientific knowledge, taking these uncertainties into account, so that the assessment can be completed (Vose, 2008). Therefore, risk assessment is considered a complex mix of currently available data and assumptions based on prevailing scientific knowledge.

ASSESSMENT OF MICROBIOLOGICAL SPOILAGE OF FOOD AND BEVERAGES: CHEMICAL AND MICROBIOLOGICAL CHANGES

The microbiology of food spoilage has been well characterized, but the challenge is to determine the relationship between microbial composition and the presence of microbial metabolites, so that microbiological spoilage can be properly evaluated (in't Veld, 1996; Remenant, Jaffres, Dousset, Pilet, & Zagorec, 2015).

In order to manage food quality risks, it is essential to identify which foods, spoilage microorganisms, or situations contribute to chemical or microbiological changes and also to find out the magnitude of the impact caused by these changes. Such information is necessary to make rational decisions about the most effective interventions for reduction of food spoilage. With the development of microbial risk assessment concepts, it was recognized that the focal point for controlling spoilage microorganisms and determining food quality should consider the chemical and microbiological changes together, not only the concentration of a microorganism in the food. Additionally, Remenant et al. (2015) point out that spoilage may be a result of the presence of different microorganism species and not always due to the dominant one, and that there may be a synergistic effect, which can increase the magnitude of spoilage compared with the presence of only one spoilage microorganism.

The establishment of microbiological criteria defines the acceptability of a product, a batch, or a process based on the absence, presence, or a defined number of microorganisms and the amount of their toxins or metabolites per unit of mass, volume, area, or batch. Such criteria can be used as tools to assess the safety and quality of foods, and they are generally determined by quantitative risk assessment and feasibility (Barlow et al., 2015). As an example, Rukchon, Nopwinyuwong, Trevanich, Jinkarn, and Suppakul (2014) explain that the quantification of chemical changes in poultry meat can provide information about the level of spoilage. Indicators such as biogenic amines, volatile bases, nucleotide breakdown products, and volatile acidity can be used to assess meat quality and freshness during storage. However, the ideal microbial metabolite to be used as an indicator should follow some principles: it should be absent from (or at least present at a low concentration in) the fresh product, should increase over the storage period, should be produced by the predominant microorganisms, and should correlate well with the product's organoleptic assessment (Lianou, Panagou, & Nychas, 2016).
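As an illustration of how such a metabolite indicator could be evaluated, the sketch below fits a simple linear accumulation model to hypothetical storage-trial data, estimates the time needed to reach an assumed rejection threshold, and checks the indicator's correlation with sensory scores. All figures are invented for the example.

```python
import numpy as np
from scipy import stats

# Hypothetical storage trial: indicator concentration (mg/100 g) and sensory score (9 = fresh)
days = np.array([0, 2, 4, 6, 8, 10])
metabolite = np.array([1.0, 4.5, 9.2, 14.8, 19.5, 25.1])
sensory = np.array([9.0, 8.2, 7.1, 5.8, 4.3, 2.9])

# Fit a simple zero-order (linear) accumulation model: C(t) = C0 + k * t
k, c0, r, _, _ = stats.linregress(days, metabolite)
print(f"Accumulation rate k = {k:.2f} mg/100 g per day (R^2 = {r**2:.3f})")

# Time to reach an assumed rejection threshold of 20 mg/100 g
threshold = 20.0
shelf_life = (threshold - c0) / k
print(f"Estimated shelf life: {shelf_life:.1f} days")

# Correlation between the indicator and the organoleptic assessment
rho, p = stats.spearmanr(metabolite, sensory)
print(f"Spearman correlation with sensory score: {rho:.2f} (p = {p:.3f})")
```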

As important as detecting spoilage during storage by the food industry and retailers is the search for a cheap, simple, and accurate measurement device for detecting food spoilage at the consumer level. However, the existing spoilage detectors/indicators are sensitive to specific metabolites present in a specific food product or group of food products and are not validated for general application in every type of food. Also, spoilage indicators are still at a developmental stage at universities or innovative companies (de Jong et al., 2005). Some examples of published work on food spoilage indicators include (1) a colorimetric mixed pH dye-based indicator for real-time monitoring of intermediate-moisture dessert spoilage by measuring the response to CO2 (Nopwinyuwong, Trevanich, & Suppakul, 2010); (2) a colorimetric sensor array for fish spoilage monitoring, in which chemo-sensitive compounds were incorporated in an array for colorimetric detection of typical fish spoilage compounds (trimethylamine, dimethylamine, cadaverine, and putrescine) at room temperature and 4°C (Morsy et al., 2016); (3) an on-package dual-sensor label based on pH indicators (methyl red and bromocresol purple) for real-time monitoring of beef freshness at room and chiller temperatures (Kuswandi & Nurfawaidi, 2017); and (4) the use of an electronic nose, based on a chemical sensor array of six metal oxide semiconductors, to diagnose fungal contamination, detect high fumonisin content, and predict fumonisin concentration in maize cultures (Gobbi, Falasconi, Torelli, & Sberveglieri, 2011).

There are different reasons why no single, universal approach exists to quantify spoilage of specific food products: (i) available methodologies are too slow, as they provide retrospective information about spoilage and cannot be applied for online monitoring; and (ii) there are different technologies for food preservation (i.e., vacuum, modified atmospheres, etc.), which influence the choice of a specific methodology and can affect its application. Furthermore, some considerations that make it challenging to choose the ideal metabolite for spoilage assessment should be taken into account: specificity to a given microorganism - if this microorganism is not present or is inhibited by other microorganisms (naturally present or intentionally added to the product), incorrect results may be obtained; specificity to a given substrate - if this substrate is absent or present at a low concentration, the metabolite cannot be produced even if spoilage occurs; and influence of environmental conditions (i.e., pH, temperature, water activity, oxygen tension, etc.) - the microorganism will not produce the metabolite if environmental conditions do not favor its production. Besides, ideal metabolites should not demand complex procedures, excessive time, or specialized equipment for their measurement. If the detection of the metabolite is difficult, rapid analytical methods or tools for quantifying their indicators should be chosen. Finally, validation of such methodologies and inspection by authorities for control purposes are also necessary (Lianou et al., 2016).

Classical microbial determination in perishable foods is of limited value for prediction because foods are often sold or consumed before the results of microbiological assays become available (in't Veld, 1996). There is interest among the food industry, retailers, consumers' rights organizations, and food safety control bodies in the development of accurate, cost-effective, rapid, reliable, non-invasive, and non-destructive methods or devices to evaluate the real-time freshness of food products (Rukchon et al., 2014). New methodologies and instruments for early quantitative detection of spoilage microorganisms or their metabolites have been developed and include novel analytical approaches based on biosensors, sensor arrays, and spectroscopy techniques in tandem with chemometrics. Examples of such methodologies include metagenomics, enzymatic reactor systems, electronic noses (arrays of sensors), potentiometric measurements by electronic tongues, Fourier transform infrared spectroscopy, near-infrared spectroscopy, hyperspectral imaging techniques, and multispectral imaging technologies (Lianou et al., 2016).
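As a minimal illustration of the "spectroscopy in tandem with chemometrics" approach, the sketch below trains a partial least squares (PLS) regression to predict total viable counts from spectra. Here the spectra are synthetic stand-ins generated for the example; in a real application they would come from FT-IR, NIR, or multispectral measurements paired with reference plate counts.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)

# Synthetic data standing in for pre-processed spectra (rows = samples, columns = wavenumbers)
n_samples, n_wavenumbers = 120, 200
counts = rng.uniform(3.0, 9.0, n_samples)        # reference total viable counts, log10 CFU/g
loadings = rng.normal(0, 1, n_wavenumbers)       # hypothetical spectral response to spoilage
spectra = np.outer(counts, loadings) + rng.normal(0, 2.0, (n_samples, n_wavenumbers))

X_train, X_test, y_train, y_test = train_test_split(spectra, counts, random_state=0)

# Chemometric calibration: PLS regression from spectra to microbial counts
pls = PLSRegression(n_components=5)
pls.fit(X_train, y_train)
y_pred = pls.predict(X_test).ravel()

rmse = mean_squared_error(y_test, y_pred) ** 0.5
print(f"RMSE of predicted counts on held-out samples: {rmse:.2f} log10 CFU/g")
```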

 