However, Southern California, a region that suffers from a similar degree of water shortage, currently uses less than ~3% of municipal wastewater in agriculture, while discharging ~1.5 million acre-feet of effluent per year into the Pacific Ocean. Secondary municipal wastewater effluent destined for ocean discharge is often sufficient to support both the nutrient and water needs of food production. Water reuse in agriculture can bring municipal water reclamation effluent to nearby farms within city limits, thus promoting local agriculture and reducing the rate of farmland loss to urban development. While the use of reclaimed water in agriculture offers a multitude of societal and agronomic benefits, broader adoption faces great challenges. An important challenge is ensuring the safety of food products in light of the many human pathogens that may be present in recycled wastewater. Past studies have identified risks associated with irrigating food crops with recycled wastewater through the retention of irrigation water on edible plant surfaces during overhead irrigation. With the emphasis on water conservation and the reduction of evapotranspiration, subsurface drip irrigation is gaining popularity. Because there is less contact between the water and the plant surface, the chance of pathogen contamination on plant surfaces is reduced. However, this practice presents a risk of uptake of microbial pathogens into plants. Such internalized pathogens are of greater concern because washing, even with disinfectants, may not affect pathogens sheltered in the vasculature. Although pathogen transport through root uptake and subsequent internalization into the plant has been a growing research area, results vary due to differences in experimental design, systems tested, and pathogens and crops examined. Among the array of foodborne pathogens that may be carried by treated wastewater, viruses are of the greatest concern but are the least studied.
According to the CDC, 60% of U.S. foodborne outbreaks associated with eating leafy greens were caused by noroviruses, while Salmonella and E. coli accounted for only 10% of outbreaks. Estimates of the global prevalence of foodborne illness associated with NoV surpass those of all other pathogens considered.
Viruses are also of concern because they persist in secondary wastewater effluents at high concentrations. They do not settle well in sedimentation basins and are more resistant to degradation than bacteria. Therefore, in the absence of a solid scientific understanding of the risks involved, the public is likely to be less receptive to adopting treated wastewater for agricultural irrigation. NoV internalization in hydroponic systems has been quantified by DiCaprio et al. Internalization in soil-grown crops is considered lower but nevertheless occurs. However, the only risk assessment that considered the possibility of NoV internalization in plants assumed a simple ratio of viruses in the feed water to viruses in produce at harvest to account for internalization. The time dependence of viral loads in lettuce was not explored, and such an approach did not permit insights into the key factors influencing viral uptake in plants. In this study, we introduce a viral transport model to predict the viral load in crisphead lettuce at harvest given the viral load in the feed water. It is parameterized for both hydroponic and soil systems. We demonstrate its utility by performing a quantitative microbial risk assessment. Strategies to reduce risk enabled by such a model are explored, and a sensitivity analysis highlights possible factors affecting risk.

The plant transpiration rate was adopted as the viral transport rate based on: 1) previous reports of passive bacterial transport in plants, 2) the significantly smaller size of viruses compared to bacteria, and 3) the lack of known specific interactions between human viruses and plant hosts. Accordingly, the viral transport rate in hydroponically grown lettuce was determined from a previously reported transpiration model, in which the transpiration rate is proportional to the lettuce growth rate and is influenced by cultivar-specific factors.
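Under these assumptions, the transpiration-driven transport rate can be sketched as below; the logistic growth curve and the cultivar factors a_t and b_t here are illustrative placeholders, not the values fitted in this study.

```python
import math

def shoot_mass(t, m_max=350.0, r=0.25, t_mid=20.0):
    """Illustrative logistic shoot growth curve (g fresh weight vs. days)."""
    return m_max / (1.0 + math.exp(-r * (t - t_mid)))

def transport_rate(t, a_t=0.05, b_t=0.002, dt=1e-3):
    """Viral transport rate taken as the transpiration rate, itself
    proportional to the growth rate via hypothetical cultivar factors
    a_t and b_t (placeholder values, for illustration only)."""
    growth_rate = (shoot_mass(t + dt) - shoot_mass(t - dt)) / (2.0 * dt)
    return a_t * growth_rate + b_t * shoot_mass(t)
```

Because transpiration tracks the growth rate, the transport rate in this sketch peaks near the steepest part of the growth curve rather than at harvest.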
These cultivar-specific factors used in our model were predicted from the hydroponic crisphead lettuce growth experiment carried out by DiCaprio et al., described in Section 2.3. Since the transpiration rate in soil-grown lettuce is significantly higher than that in the hydroponic system, the viral transport rate in soil-grown lettuce was obtained directly from the graphs published by Gallardo et al. using WebPlotDigitizer. The shoot growth rate for soil-grown lettuce was determined using Eq. 9. In the absence of a published root growth model for lettuce in soil, a fixed root volume of 100 cm³ was used. In the viral transport model, viral transfer efficiencies were used to account for the potential “barrier” between each compartment.
The existence of such a “barrier” is evident from field experiments in which some microbial pathogens were internalized in the root but not in the shoot of plants. In addition, the viral transfer efficiencies also account for differing observations of pathogen internalization due to the type of pathogen or lettuce. For example, DiCaprio et al. reported the internalization of NoV into lettuce, while Urbanucci et al. did not detect any NoV in another type of lettuce grown in feed water seeded with viruses. The values of η_gr and η_rs were determined by fitting the model to experimental data reported by DiCaprio et al., as detailed in Section 2.3. Viral removal in the growth medium includes both die-off and attachment/detachment (AD), while only natural die-off was considered in the lettuce root and shoot. The AD kinetic constants, as well as the growth-medium viral decay constant in the hydroponic case, were obtained by fitting the model to the data from DiCaprio et al. Viral AD in soil has been investigated in both lab-scale soil columns and field studies. In our model, viral AD constants in soil were obtained from the experiments of Schijven et al., who investigated MS2 phage kinetics in sandy soil in field experiments. As the MS2 phage was transported with the water in soil, the AD rates changed with distance from the source of viruses. To capture the range of AD rates, two scenarios of viral behavior in soil were investigated: scenario 1 used the AD rates estimated at the site closest to the viral source, while scenario 2 used data from the farthest site. In contrast to lab-scale soil column studies, field studies provide more realistic viral removal rates. Using the surrogate MS2 phage for NoV provides conservative risk estimates, since MS2 attaches to a lesser extent than NoV in several soil types. The viral decay rate in soil determined by Roberts et al.
was adopted because the experimental temperature and soil type are more relevant to lettuce growing conditions than those of the other decay study. Decay rates in the root and shoot were taken from the hydroponic system predictions.

The transport model was fitted to log10 viral concentration data from DiCaprio et al., extracted from the graphs therein using WebPlotDigitizer. In these experiments, NoV of known concentration was spiked into the feed water of hydroponic lettuce and was monitored in the feed water, root, and shoot over time.
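The compartmental structure described above (feed water to root to shoot, screened by transfer efficiencies, with first-order decay everywhere and AD between the water and the tank wall) can be sketched as a simple ODE system. Every rate constant and efficiency below is an illustrative placeholder, not a fitted value from the study.

```python
def simulate(days=14.0, dt=0.01,
             q=0.3,        # transpiration-driven transport rate constant, 1/day (placeholder)
             eta_gr=0.1,   # growth medium -> root transfer efficiency (placeholder)
             eta_rs=0.05,  # root -> shoot transfer efficiency (placeholder)
             k_dec_w=0.2, k_dec_r=0.1, k_dec_s=0.1,  # first-order decay, 1/day
             k_att=0.5, k_det=0.05):                 # wall attachment/detachment, 1/day
    """Forward-Euler sketch of viral transport: feed water -> root -> shoot,
    with first-order die-off in every compartment and attachment/detachment
    (AD) between the feed water and the tank wall."""
    w, wall, root, shoot = 1.0e6, 0.0, 0.0, 0.0   # viral loads (illustrative)
    t = 0.0
    while t < days:
        uptake = q * eta_gr * w          # water -> root, screened by the "barrier"
        translocate = q * eta_rs * root  # root -> shoot
        dw = -uptake - k_dec_w * w - k_att * w + k_det * wall
        dwall = k_att * w - k_det * wall
        droot = uptake - translocate - k_dec_r * root
        dshoot = translocate - k_dec_s * shoot
        w += dw * dt
        wall += dwall * dt
        root += droot * dt
        shoot += dshoot * dt
        t += dt
    return w, wall, root, shoot
```

In this toy form, raising k_att while holding k_det low sequesters viruses on the wall and depresses the shoot load, which mirrors the role the AD term plays in the fitted model.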
While fitting the model, an initial feed volume of 800 mL was adopted, and parameters producing final volumes below 200 mL were rejected. To fit the model while accounting for uncertainty in the data, a Bayesian approach was used to maximize the likelihood of the data given the parameters. A posterior distribution of the parameters was obtained by the differential evolution Markov chain algorithm, which can be parallelized and can handle multi-modality of the posterior distribution without fine-tuning the jumping distribution. Computation was carried out in MATLAB R2016a and its ParCompTool running on the High Performance Computing facility at UC Irvine.

Table 3 lists the parameters estimated by model fitting and their search bounds. Fitting the data from DiCaprio et al. without including viral AD to the tank walls was attempted, but the results were not used in the risk estimates due to the poor fit of the model to the data. The rationale behind the model fitting procedure and diagnostics are discussed in Supplementary section S1H.

A summary of the model fitting exercise for viral transport in hydroponically grown lettuce is presented in Fig. 2. Under the assumption of first-order viral decay, NoV loads in water at two time points did not fall within the credible region of model predictions, indicating that first-order decay alone was unable to capture the observed viral concentration data. The addition of the AD factor to the model addressed this inadequacy and, importantly, reproduced the curvature observed in the experimental data. This result indicates that the AD of viruses to the hydroponic tank wall is an important factor to include in predicting viral concentrations in all three compartments. The adequacy of the model fit was also revealed by the credible intervals of the predicted parameters for the model with AD.
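One generation of the differential evolution Markov chain can be sketched as follows, here run on a toy two-dimensional Gaussian log-posterior; the actual likelihood, search bounds, and parallelization used in the study are not reproduced.

```python
import math
import random

def demc_step(chains, log_post, gamma=None, eps=1e-4):
    """One generation of differential evolution Markov chain sampling:
    each chain proposes a jump along the difference of two other chains,
    so the jump scale and orientation adapt to the posterior without
    manual tuning of the jumping distribution."""
    d = len(chains[0])
    if gamma is None:
        gamma = 2.38 / math.sqrt(2 * d)   # standard DE-MC jump factor
    new = []
    for i, x in enumerate(chains):
        r1, r2 = random.sample([c for j, c in enumerate(chains) if j != i], 2)
        prop = [x[k] + gamma * (r1[k] - r2[k]) + random.gauss(0.0, eps)
                for k in range(d)]
        # Metropolis acceptance on the log-posterior
        if math.log(random.random()) < log_post(prop) - log_post(x):
            new.append(prop)
        else:
            new.append(x)
    return new

# Toy target: standard normal log-density in 2-D
log_post = lambda x: -0.5 * sum(v * v for v in x)
chains = [[random.uniform(-3.0, 3.0) for _ in range(2)] for _ in range(8)]
for _ in range(200):
    chains = demc_step(chains, log_post)
```

Because each proposal is built from the current population, the chains in this sketch can be updated independently within a generation, which is what makes the algorithm easy to parallelize.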
Four of the predicted parameters, a_t, b_t, k_dec,s, and k_p, were restricted to a smaller subset of the search bounds, indicating that they were identifiable. In contrast, the viral transfer efficiencies η and the kinetic parameters spanned the entirety of their search space and were poorly identifiable. However, this does not mean that each parameter can independently take any value in its range, because the joint distributions of the parameters indicate how fixing one parameter influences the likelihood of another. Hence, despite the large range of any individual parameter, the coordination between the parameters constrained the model predictions to produce reliable outcomes. Therefore, the performance of the model with AD was considered adequate for estimating the parameters used in risk prediction.

Risk estimates for lettuce grown in the hydroponic tank or soil are presented in Fig. 4. Across these systems, the FP model predicted the highest risk while the 1F1 model predicted the lowest. For a given risk model, higher risk was predicted in the hydroponic system than in soil, a consequence of the very low detachment rates in soil compared to the attachment rates. Comparison of results from Sc1 and Sc2 for soil-grown lettuce indicated lower risks and disease burdens under Sc1. Compared with safety guidelines, the lowest risk predicted in the hydroponic system is higher than the U.S. EPA acceptable annual drinking-water risk of 10⁻⁴ for each risk model. The annual burdens are also above the 10⁻⁶ benchmark recommended by the WHO. In the case of soil-grown lettuce, neither Sc1 nor Sc2 met the U.S. EPA safety benchmark. Two risk models predicted borderline disease burdens according to the WHO benchmark for soil-grown lettuce under Sc1, but under Sc2 the risk still did not meet the safety guideline.
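The benchmark comparison above amounts to converting a per-exposure infection probability into an annual risk. A generic sketch using an exponential dose-response is shown below; the parameter r and the dose are illustrative placeholders and are not the FP or 1F1 models fitted in the study.

```python
import math

def daily_risk_exponential(dose, r=1e-3):
    """Generic exponential dose-response: P(infection) from one exposure.
    r is an illustrative pathogen-specific parameter, not a fitted value."""
    return 1.0 - math.exp(-r * dose)

def annual_risk(p_daily, exposures_per_year=365):
    """Combine independent daily exposures into an annual infection risk."""
    return 1.0 - (1.0 - p_daily) ** exposures_per_year

# Compare against the U.S. EPA 1e-4 acceptable annual risk benchmark
p_d = daily_risk_exponential(dose=0.5)   # hypothetical viral dose per serving
exceeds_benchmark = annual_risk(p_d) > 1e-4
```

Even a small per-serving probability compounds quickly over a year of exposures, which is why daily risks that look negligible can still exceed the annual benchmark.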
Neither increasing the holding time of the lettuce to two days after harvest nor using bigger tanks significantly altered the predicted risk. In comparison, the risk estimates of Sales-Ortells et al. are higher than the range of soil-grown lettuce outcomes presented here for two of the three models.

The SCSA sensitivity indices are presented in Fig. 5. For hydroponically grown lettuce, the top three factors influencing daily risk are the amount of lettuce consumed, the time since last irrigation, and the term involving consumption and ρ_shoot. The risk estimates are also robust to the fitted parameters despite the low identifiability of some model parameters. For soil-grown lettuce, k_p appears to be the most influential parameter, followed by the input viral concentration in the irrigation water and the lettuce harvest time. S_corr is near zero, suggesting little influence of correlation among the input parameters.

In this study, we modeled the internalization and transport of NoV from irrigation water into lettuce using ordinary differential equations to capture the dynamic processes of viral transport. This first attempt is aimed at underscoring the importance of time in determining the final risk outcome. The modeling approach from this study may be customized for other scenarios, for the management of water reuse practices, and for developing new guidelines for food safety. Moreover, this study identifies critical gaps in the current knowledge of pathogen transport in plants and calls for further laboratory and field studies to better understand the risks of water reuse.