- QUANTITATIVE EXPOSURE ASSESSMENT OF MYCOTOXINS BASED ON FOOD CHAIN DATA
- Case Study 1: Simulation of Consumer Exposure to Deoxynivalenol According to Wheat Crop Management and Grain Segregation (Le Bail et al. 2005)
- Case Study 2: Evaluation of Strategies for Reducing Patulin Contamination of Apple Juice Using a Farm-to-Fork Risk Assessment Model (Baert et al., 2012)
- Case Study 3: A Stochastic Simulation Model for the Quantitative Assessment of the Concentration of Mycotoxins in Milk and the Related Human Exposure (Signorini et al. 2012)
QUANTITATIVE EXPOSURE ASSESSMENT OF MYCOTOXINS BASED ON FOOD CHAIN DATA
Growing evidence has shown that different parameters may be good predictors of fungal growth and mycotoxin production in raw food materials, either in the field, postharvest or later in the production chain, and by extension, predictors of mycotoxin occurrence in final products and eventual exposure of consumers. Hence, risk management strategies can be applied at different stages along the food chain, from primary production, processing and manufacturing, transport and distribution, storage and retail to preparation and consumption of the food, which may result in a decreased exposure to mycotoxins. An integrative food chain risk assessment strategy (also known as a farm-to-fork strategy) is required to assess the impact of the risk management options (e.g., mycotoxin reduction strategies) on the final concentration in food and related exposure in the population. Methodologically, such risk assessment approaches may be either deterministic (based on mycotoxin point estimations for the different steps in the food chain) or probabilistic, if probability distributions for mycotoxin contamination are established (in this way, the probability distribution of exposure, in particular the percentiles, is more accurately estimated). In any case, most frameworks are developed on the basis of simulated scenarios where the parameters of interest (e.g., storage temperature) are modelled on the basis of realistic applications. To assess the efficacy and usefulness of management strategies focused on reducing mycotoxin burden in food, a complete exposure assessment might not be required, as the mycotoxin concentration in the finished product can be compared with the reference processing system to check whether a lower concentration results. 
There are many different types of strategies or control measures, instigated by regulation or chosen by the operators (e.g., good agricultural and animal production practices, good hygiene practices [GHPs] during manufacture and processing, and good consumer handling practices).
Control measures in the food industry regarding mycotoxins may comprise some of the following activities:
- Ensuring control of initial levels of hazards (e.g., avoiding nuts and spices from certain origins; avoiding raw materials from primary producers who do not adhere to good agricultural practices; establishing requirement specifications with suppliers and requiring verifiable documentation, e.g., letters of guarantee or certificates of analysis attesting the safe level of mycotoxins; using sampling and analyses as necessary; and using appropriate methods based on established criteria to reject unacceptable ingredients or products).
- Preventing an unacceptable increase of hazards:
- a) Preventing contamination, for example adopting GHPs that minimize mycotoxin contamination from transport, drying and storage facilities, establishments or equipment, and from aqueous solutions in fruit and nut processing due to excessive reuse. For the particular case of mycotoxins, GHPs are also important to prevent contamination by mycotoxigenic fungi, which may further develop and produce mycotoxin in subsequent process stages.
- b) Preventing fungal growth during transportation, storage and processing, for example cold storage of apples; adjusting aw in stored cereals, nuts, coffee or spices; adding preservatives to stored fruits and cereals; controlling temperature and moisture/aw in dehydrating fruits; adjusting storage times; using packaging techniques and materials to protect food from contamination; or implementing effective controls within the food processing environment (e.g., pest control).
- Reducing or eliminating mycotoxins:
- a) Selecting ingredients (e.g., applying electronic sorters to reject nuts that are likely to contain AFs; culling fruits for fruit juice production that are likely to contain patulin; rejecting rotten grape bunches that are likely to contain OTA; cleaning of cereals, resulting in the separation of mouldy grains, which account for most of the Fusarium toxins and AFs in a lot).
- b) Additionally, some measures that are not implemented to control mycotoxins, or that are intrinsic to the food process, may nonetheless reduce them to some extent, either by partial inactivation (e.g., heat treatments such as roasting, frying and baking; commercial sterilization; fermentation processes), even though mycotoxins are quite heat stable, or by physical segregation of the most contaminated fractions of raw materials (e.g., milling of cereals, must extraction from grapes or malt, pressure washing of apples, centrifugation, filtration) (Garcia-Cela et al. 2012).
The application of simulation-based models using data collected in the preliminary phases of the food-production chain appears to be a promising approach to evaluate risk management strategies focused on reducing the burden of mycotoxins in the food chain. These types of models, mainly borrowed from the microbiological risk assessment field, may efficiently integrate probabilistically a vast list of parameters gathered from the different steps that may affect the final consumer exposure. For QRA to become a useful decision support system, the impact of all these control measures on mycotoxin concentration should be modelled in such a way that simulations can be run easily, providing robust evaluations for risk assessment. Moreover, it is important to address variability and uncertainty. Variability refers to quantities that are distributed within a defined population, such as food consumption rates, raw material contamination and mycotoxin production. These are inherently variable and cannot be represented by a single value, so that only their moments (e.g., mean, variance, skewness) can be determined with precision. In contrast, true uncertainty or model-specification error (e.g., statistical estimation error) refers to a parameter that has a single value, which cannot be known with precision due to measurement or estimation error (WHO, 1995). Two-dimensional Monte-Carlo simulations allow an efficient, separate evaluation of both variability (Dimension 1) and uncertainty (Dimension 2), and may be easily performed with multiple statistical packages.
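The separation of the two dimensions can be sketched as a nested simulation: the outer loop re-draws the uncertain distribution parameters, while the inner loop draws variable individual exposures. All distributions and parameter values below are illustrative assumptions, not taken from any of the studies discussed.

```python
import numpy as np

rng = np.random.default_rng(1)

N_UNC = 100     # outer loop: uncertainty iterations (Dimension 2)
N_VAR = 10_000  # inner loop: variability iterations (Dimension 1)
BW = 60.0       # assumed adult body weight, kg

p95_exposure = np.empty(N_UNC)
for i in range(N_UNC):
    # Uncertainty: re-draw the *parameters* of the contamination
    # distribution from their (assumed) estimation error.
    mu = rng.normal(5.0, 0.2)       # log-mean of concentration (log µg/kg)
    sigma = abs(rng.normal(0.8, 0.05))  # log-sd

    # Variability: individual servings/consumers within the population.
    conc = rng.lognormal(mu, sigma, N_VAR)            # µg/kg in food
    intake = rng.lognormal(np.log(0.15), 0.3, N_VAR)  # kg food per day
    exposure = conc * intake / BW                     # µg/kg bw per day
    p95_exposure[i] = np.percentile(exposure, 95)

# Uncertainty band around the variability P95 of exposure:
lo, hi = np.percentile(p95_exposure, [2.5, 97.5])
```

Each outer iteration yields one plausible exposure distribution; collecting a percentile across iterations separates how uncertain we are about that percentile from how variable exposure is across consumers.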
Some examples of QRA have been taken from the literature, and the main points are summarized here:
Case Study 1: Simulation of Consumer Exposure to Deoxynivalenol According to Wheat Crop Management and Grain Segregation (Le Bail et al. 2005)
DON is mainly produced in wheat in the field and rarely during postharvest. According to Le Bail et al. (2005), assuming a consumption of 175 g of wheat flour per day (WHO, 2003) and a PMTDI of 60 µg/day for an adult weighing 60 kg, the maximum permissible concentration in wheat is 340 µg/kg. However, it is difficult to respect this limit every year at field scale. Thus, two risk management strategies were tested to reduce DON exposure through the consumption of wheat derivatives: the use of tillage methods as a cropping system was tested against no-tillage, and on the other hand, once the wheat was harvested, different crop segregation strategies were tested during postharvest. Point estimations were carried out for the different scenarios tested, while a distribution of consumption was applied for modelling the exposure.
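The arithmetic behind the quoted limit can be checked directly from the figures given in the text (the 60 µg/day corresponds to a PMTDI of 1 µg/kg bw per day for a 60 kg adult):

```python
pmtdi = 1.0          # µg/kg bw per day (PMTDI for DON)
bw = 60.0            # kg, reference adult body weight
consumption = 0.175  # kg wheat flour per day (175 g, WHO 2003)

max_daily_intake = pmtdi * bw              # 60 µg/day for a 60 kg adult
max_conc = max_daily_intake / consumption  # µg DON per kg flour
print(round(max_conc))  # ≈ 343 µg/kg, i.e. the ~340 µg/kg quoted
```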
It is known that no-tillage practices lead to an increase in DON in cropping wheat (Champeil, 2004). Thus, three scenarios were considered and applied to experimental plots:
Scenario 1: all the plots of the area were ploughed (0% no-tillage)
Scenario 2: half the area was ploughed (50% no-tillage)
Scenario 3: all the plots were direct-drilled (100% no-tillage)
Simulations were carried out assuming an initial DON concentration of 100-800 µg/kg for Scenario 1 and different Scenario 3/Scenario 1 ratios from 1 to 8, and their impact on exposure was assessed. It was shown that the PMTDI was not exceeded, even for the highest ratio, if the initial contamination level was 100 µg/kg for Scenario 1. In contrast, a ratio of 2 was sufficient for the mean exposure to reach the PMTDI at contamination values of 400 µg/kg and over.
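The structure of such a scenario comparison can be sketched as follows; the intake distribution and its spread are illustrative assumptions, so the sketch shows the mechanics rather than reproducing the study's numerical results (the grain-to-flour coefficient of 0.44 is taken from the text):

```python
import numpy as np

rng = np.random.default_rng(0)
PMTDI = 1.0          # µg/kg bw per day for DON
BW = 60.0            # kg, reference adult
FLOUR_FACTOR = 0.44  # grain-to-flour corrective coefficient (from the study)

def mean_exposure(grain_conc_ugkg, n=50_000):
    """Mean DON exposure (µg/kg bw/day) for a given grain contamination,
    with an assumed lognormal spread on daily flour intake."""
    flour_conc = grain_conc_ugkg * FLOUR_FACTOR
    intake = rng.lognormal(np.log(0.175), 0.35, n)  # kg flour per day
    return float(np.mean(flour_conc * intake / BW))

base = 100.0  # µg/kg DON under Scenario 1 (all ploughed)
for ratio in (1, 2, 4, 8):
    # Scenario 3 concentration = Scenario 1 concentration × ratio
    print(ratio, round(mean_exposure(base * ratio), 3))
```

Comparing each printed mean exposure with the PMTDI reproduces the logic of the published scenario analysis.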
Segregation of cereal crops from different field plots in different batches according to predicted mycotoxin contamination may be a suitable risk management alternative. The authors considered the following scenarios:
Scenario (a): all wheat crops in the area are blended as a single batch.
Scenario (b): the plots are divided into two classes according to the preceding crop: Batch M (maize) for the plots with maize as preceding crop and Batch O (others) for the other plots.
Scenario (c): the plots of the area are divided into two classes according to the real grain contamination value to segregate two batches: Batch L for the plots with contamination <1250 µg/kg and Batch H for the plots exceeding this value.
Two experimental datasets were used in this case, the first one (n = 17 plots) with a median of 15 µg/kg (range 15-2250 µg/kg) and the second one (n = 21 plots) with a median of 135 µg/kg (range 10-16,685 µg/kg). For the first dataset, exposure levels below the PMTDI were obtained for Scenario (a); thus, the two alternative scenarios, (b) and (c), gave no advantage over the simple blending of grain from all the fields, and the segregation of some batches even resulted in exposure levels above the PMTDI. For Dataset 2, Scenario (a) always resulted in exposure above the PMTDI. In this case, Scenarios (b) and (c) made it possible to create batches - Batch O for Scenario (b) and Batch L for Scenario (c) - for which the estimated exposure was below 1 µg/kg bw per day. Scenario (c) was the most useful, as it not only reduced exposure but also corresponded to 90% of the fields, so that only a small proportion had to be rejected; however, it is difficult to implement, as it requires analysis of grain for DON content in each field (which is too costly) or a prediction tool to be applied per field. Alternatively, Scenario (b) can be applied; however, in this case, wheat from 53% of the fields would have to be rejected for human consumption, which is also economically costly.
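The threshold-based segregation of Scenario (c) amounts to a simple partition of plots into batches; a minimal sketch (plot values below are illustrative, not the study's data):

```python
import numpy as np

def segregate(plot_conc, threshold=1250.0):
    """Scenario (c): split plots into a low batch (< threshold µg/kg DON)
    and a high batch; return each batch's mean concentration and the
    fraction of plots it represents."""
    plot_conc = np.asarray(plot_conc, dtype=float)
    low = plot_conc[plot_conc < threshold]
    high = plot_conc[plot_conc >= threshold]
    out = {}
    for name, batch in (("L", low), ("H", high)):
        if batch.size:
            out[name] = (float(batch.mean()), batch.size / plot_conc.size)
    return out

# Illustrative plot contamination values (µg/kg):
print(segregate([120, 300, 950, 1400, 16000]))
```

The "L" batch mean and its plot fraction are the quantities that decide whether segregation keeps most of the harvest while bringing exposure below the PMTDI.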
No scenarios were tested for the processing stage. DON content in flour was estimated by applying a corrective coefficient of 0.44 to grain content, to take into account the effects of processing. Regarding consumption data, the percentage of flour in the recipes of the different finished products was taken into account.
In recent years, a number of publications have provided useful data for a better modelling of the effect of processing wheat to wheat products on DON content, including operations such as sorting, cleaning, milling, fermentation, baking and extrusion cooking (Generotti et al. 2015; Schwake-Anduschus et al. 2015; Vidal et al. 2014, 2016).
Transport and distribution, storage, retail, and preparation and consumption of the finished wheat products are not expected to have an effect on the presence of DON.
Case Study 2: Evaluation of Strategies for Reducing Patulin Contamination of Apple Juice Using a Farm-to-Fork Risk Assessment Model (Baert et al., 2012)
Patulin is a mycotoxin usually found in apple derivatives, mainly apple juice, due to the processing of low-quality, Penicillium expansum-infected fruits. Almost 100% of P. expansum isolates are able to produce patulin (Morales et al. 2010). In 2003, the EC issued a recommendation including a Code of Practice for the prevention and reduction of patulin contamination in apple juice and apple juice ingredients in other beverages, which points out a number of strategies for patulin risk management.
Baert et al. (2012) developed a QRA model for patulin in apple juice as a function of apple storage and processing conditions. The model is developed on the basis of a comprehensive list of parameters identified all along the food processing chain, from the collection of apples until storage of the produced apple juice. The model was used to test the influence of different risk management measures: duration of the deck storage for fresh apples, either using or not using a controlled atmosphere (CA) during storage, duration of refrigerated storage before and after CA storage, duration of deck storage at the juice plant, segregation of damaged or mouldy apples prior to processing, etc. The model was validated against empirically generated data (n = 177) of apple storage and juice production using the different possible scenarios. The simulations (n = 10,000) were performed with the @RISK software using Latin Hypercube sampling, setting the random generator seed at 1 to guarantee the stability of the estimations. An exposure assessment was not carried out, and the suitability of the different scenarios was assessed in terms of final contamination in the apple juice.
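Latin Hypercube sampling stratifies each input dimension so that every run covers the whole range of each distribution, which is why fewer iterations stabilise the estimates than with plain Monte-Carlo. A minimal sketch of the sampler itself (this is a generic implementation, not the @RISK internals):

```python
import numpy as np

def latin_hypercube(n, d, seed=1):
    """Draw n points in [0, 1)^d with exactly one point per 1/n stratum
    in every dimension; strata are shuffled per dimension so they are
    decoupled across dimensions. A fixed seed makes successive runs
    reproducible, mirroring the fixed seed used in the study."""
    rng = np.random.default_rng(seed)
    # One uniform point inside each of the n equal-width strata:
    u = (rng.random((n, d)) + np.arange(n)[:, None]) / n
    for j in range(d):
        rng.shuffle(u[:, j])
    return u

u = latin_hypercube(10_000, 2)
```

Each column of `u` can then be mapped onto an input distribution via its inverse CDF, exactly as a plain uniform sample would be.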
Although present in the field, P. expansum usually develops postharvest, during deck or cool storage, and produces patulin. Consequently, preharvest strategies aiming to reduce P. expansum inoculum in the field could be applied, but they are not as effective as postharvest strategies.
Baert et al. (2012) built a probabilistic QRA model for apple postharvest, from picking to deck or cold storage prior to processing to juice, including all possible scenarios of duration, temperature and %O2 through their probability distributions. Probability of infection, lag phase duration and colony diameters were estimated for P. expansum, as well as the probability of patulin production and its concentration, and were included in the risk assessment model from previously published models for the different storage conditions. Apples stored under CA conditions contained more patulin than fresh or cold-stored ones; for example, reducing the proportion of CA-stored apples in a batch from 40% to 20% reduced the mean patulin concentration by almost 50%. Reduction of the wound frequency had a limited impact on final patulin concentration, as did short storage at 1 °C before or after CA storage. By contrast, the time between delivery at the juice plant and processing led to a large increase in patulin concentration, mostly when apples had been CA stored.
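The batch-level logic can be sketched with a two-stage model: each apple carries patulin with some probability, and contaminated apples receive a lognormal concentration. The parameter values are illustrative assumptions, not those fitted by Baert et al.:

```python
import numpy as np

rng = np.random.default_rng(2)

def batch_mean_patulin(n_apples, p_contam, mu_log, sd_log):
    """Mean patulin over a batch (µg/kg): each apple is contaminated with
    probability p_contam; contaminated apples carry a lognormal level."""
    contaminated = rng.random(n_apples) < p_contam
    conc = np.where(contaminated,
                    rng.lognormal(mu_log, sd_log, n_apples), 0.0)
    return float(conc.mean())

# Halving the contaminated (e.g., CA-stored, higher-risk) fraction of a
# batch roughly halves the batch mean, as in the 40% -> 20% example:
print(batch_mean_patulin(10_000, p_contam=0.4, mu_log=3.0, sd_log=1.0))
print(batch_mean_patulin(10_000, p_contam=0.2, mu_log=3.0, sd_log=1.0))
```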
The fate of patulin throughout the production of cloudy and clear apple juice was modelled in a deterministic way by taking point values from the literature for the percentage reductions during washing, milling, pasteurization, filtration, refrigeration, etc. Sorting out apples with lesion surfaces larger than 10 cm2 led to fewer than 0.5% of juice samples containing over 25 µg/kg.
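A deterministic fate model of this kind is simply a chain of multiplicative reduction factors, one per unit operation. The percentages below are illustrative placeholders, not the literature values used in the study:

```python
# Point reduction factors per unit operation (illustrative values):
REDUCTIONS = {
    "sorting/culling": 0.30,
    "washing": 0.20,
    "pressing/filtration": 0.25,
    "pasteurization": 0.10,
}

def final_patulin(initial_ugkg):
    """Apply each unit operation's point reduction in sequence."""
    conc = initial_ugkg
    for step, r in REDUCTIONS.items():
        conc *= (1.0 - r)
    return conc

print(final_patulin(100.0))  # ≈ 37.8 µg/kg after the full chain
```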
From all the strategies tested, a joint strategy was refined by including a maximum of 20% of CA-stored apples and removing apples with lesions bigger than 10 cm2; in this case, the model predicted an 88% reduction of the patulin content, with only 0.3% of apple juices over 25 µg/kg.
Transport and distribution, storage, retail, and preparation and consumption of the finished apple juices are not expected to have an effect on the presence of patulin.
Overall, the model showed realistic concentration estimates in apple juice; however, an overestimation at the higher concentrations was recognized. The authors pointed out the need for further raw data and models predicting the production of patulin in order to refine the predictions.
Case Study 3: A Stochastic Simulation Model for the Quantitative Assessment of the Concentration of Mycotoxins in Milk and the Related Human Exposure (Signorini et al. 2012)
Aflatoxin M1 (AFM1) is an increasing concern for the dairy industry in European countries. It is thought that the increase may be linked to the increase in AFB1 in maize and maize by-products used for feed production in certain areas due to climate change as well as to the inclusion of cottonseed and rapeseed in the feed.
Signorini et al. (2012) built a QRA model by combining probability distributions for the AFB1 concentration in the different raw materials included in cattle feed in Argentina and the amount of each ingredient in the diet, which was affected by the season and by the time from lactation initiation for each animal. In this way, four different simulations were created (2 seasons x 2 lactation periods), for which the model estimated AFB1 probability distributions for each diet. Monte-Carlo simulations (n = 5000 iterations) were also performed with the @RISK package, and the simulated statistics showed adequate convergence. Carry-over equations from the literature were applied to these distributions, and the distribution for AFM1 in milk was obtained as a single distribution per year. The concentration of AFB1 in feed was estimated as 4.7 µg/kg (95% CI 0.832-23.14), and the mean AFM1 in milk was 0.059 µg/kg (95% CI 0.032-0.323). The AFM1 level in milk was sensitive to AFB1 in concentrate feed, carry-over, AFB1 in corn silage, season and AFB1 in cotton seed, in this order, which indicates the priorities in risk management, e.g., the need for better segregation of maize used for concentrate feed production and improved management of silage production. The total daily intake estimated for AFM1 was 0.00122 ng/kg bw (95% CI 0.007-0.633).
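The carry-over step can be sketched as a mass-balance relation from AFB1 intake to AFM1 concentration in milk. The linear 2% carry-over fraction, feed intake and milk yield below are illustrative assumptions; the study applied published carry-over equations rather than this simple form:

```python
def afm1_in_milk(afb1_feed_ugkg, feed_intake_kg, milk_yield_kg,
                 carry_over=0.02):
    """AFM1 concentration in milk (µg/kg), assuming a linear carry-over
    fraction of the daily AFB1 intake excreted into the daily milk yield.
    The 0.02 (2%) fraction is an illustrative value, not Signorini et al.'s."""
    afb1_intake = afb1_feed_ugkg * feed_intake_kg  # µg AFB1 per day
    return afb1_intake * carry_over / milk_yield_kg

# 4.7 µg/kg AFB1 in feed (the study's mean), with assumed 20 kg feed
# intake and 25 kg milk per day:
print(round(afm1_in_milk(4.7, 20, 25), 4))
```

Plugging the feed AFB1 distribution through such a relation, iteration by iteration, is what turns the four diet distributions into a single yearly AFM1 distribution in milk.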
The last summarized study exemplifies that modelling may be applied to a whole food chain: in this case, preharvest studies could have been included, as well as the processing of milk into final products. In particular, the processing of milk may affect exposure, although the authors considered that effect negligible and estimated exposure from data for raw milk. A recent review by Campagnollo et al. (2016) summarized the effects of the main unit operations of dairy product processing on AFM1. An overview of the whole process and its impacts on exposure is presented in Figure 17.3; scattered data exist on most of the steps, but model building is still a challenge for some of them.
Figure 17.3 Overview of the farm-to-fork stages that determine the exposure of the population to AFM1. [AFM1]c: concentration of AFM1 in cereals; [AFM1]cbp: concentration of AFM1 in cereal by-products; [AFM1]i: concentration of AFM1 in imported raw materials; [AFM1]s: concentration of AFM1 in silage; [AFM1]fc: concentration of AFM1 in feed concentrate; [AFM1]u: concentration of AFM1 in unifeed; [AFM1]m: concentration of AFM1 in milk; [AFM1]hm: concentration of AFM1 in heated milk; [AFM1]ch: concentration of AFM1 in cheese; [AFM1]fm: concentration of AFM1 in fermented milk.