ABSTRACT
Identification of management practices associated with preharvest pathogen contamination of produce fields is crucial to the development of effective Good Agricultural Practices (GAPs). A cross-sectional study was conducted to (i) determine management practices associated with a Salmonella- or Listeria monocytogenes-positive field and (ii) quantify the frequency of these pathogens in irrigation and nonirrigation water sources. Over 5 weeks, 21 produce farms in New York State were visited. Field-level management practices were recorded for 263 fields, and 600 environmental samples (soil, drag swab, and water) were collected and analyzed for Salmonella and L. monocytogenes. Management practices were evaluated for their association with the presence of a pathogen-positive field. Salmonella and L. monocytogenes were detected in 6.1% and 17.5% of fields (n = 263) and 11% and 30% of water samples (n = 74), respectively. The majority of pathogen-positive water samples were from nonirrigation surface water sources. Multivariate analysis showed that manure application within a year increased the odds of a Salmonella-positive field (odds ratio [OR], 16.7), while the presence of a buffer zone had a protective effect (OR, 0.1). Irrigation (within 3 days of sample collection) (OR, 6.0), reported wildlife observation (within 3 days of sample collection) (OR, 6.1), and soil cultivation (within 7 days of sample collection) (OR, 2.9) all increased the likelihood of an L. monocytogenes-positive field. Our findings provide new data that will assist growers with science-based evaluation of their current GAPs and implementation of preventive controls that reduce the risk of preharvest contamination.
INTRODUCTION
Produce commodities have been estimated to account for 46%, 38%, and 23% of food-borne illnesses, hospitalizations, and deaths in the United States, respectively (1). The fact that produce commodities are often consumed raw or with minimal processing likely contributes to the risk of food-borne disease associated with produce. Salmonella and Listeria monocytogenes are two bacterial food-borne pathogens that represent a substantial burden to the produce industry. Produce-borne Salmonella outbreaks have been responsible for a considerable number of food-borne illness cases (2–6). For example, a Salmonella outbreak in 2005, associated with tomatoes, resulted in 459 illnesses across 21 U.S. states (7). In 2008, an outbreak of Salmonella, linked to jalapeño peppers, sickened approximately 1,500 individuals from 43 states, the District of Columbia, and Canada, making it the largest known outbreak of food-borne illness in the United States within the past decade (8). L. monocytogenes was responsible for a 2011 produce-borne outbreak in the United States, due to consumption of cantaloupe, that caused 147 illnesses, 33 deaths, and 1 miscarriage (9). In addition, a considerable number of produce recalls (e.g., of spinach and lettuce) have occurred in the past 3 years as a result of L. monocytogenes contamination (10). Both Salmonella and L. monocytogenes can be introduced, persist, and amplify at any point along the farm-to-fork continuum from production to consumption; therefore, minimizing the risk of contamination by these pathogens throughout the supply chain is essential to reducing food-borne illness risks (11–13).
The risk of produce contamination can be reduced by controlling conditions that favor pathogen introduction and growth in the preharvest environment. Preharvest produce safety is complicated by the fact that each farm has a distinct combination of environmental risk factors (e.g., topography, land-use interactions, and climate). Combinations of these environmental factors influence the frequency and transmission of food-borne pathogens and subsequently affect the risk of produce contamination (14). Mitigating contamination risks from environmental factors may be complex and challenging (e.g., farm landscapes are difficult to modify); modifying management practices, however, may be a more achievable approach. Eighty-nine percent of growers in the United States have reported already implementing at least one on-farm food safety measure due to pressure from auditors, inspectors, buyers, and other food safety professionals (15). Examples of implemented food safety measures include removing riparian areas, treating irrigation water, installing fences, and using poison bait to control rodents. While these practices were initially used to limit food safety risks in high-risk crops (e.g., leafy greens and tomatoes), a follow-up study determined that they were also being applied to low-risk crops (e.g., potatoes and squash), thus increasing the cost of production (16). In addition, some of these practices may have negative effects on landscape health (17). The average per-acre cost to growers of implementing food safety modifications to meet the Leafy Green Marketing Agreement (LGMA) was $13.60, based on a survey conducted in 2008 and 2009 (18).
Preharvest contamination with food-borne pathogens can arise from a variety of sources (e.g., irrigation and runoff water, soil amendments such as manure, and fecal deposition by intruding domesticated and wild animals). In addition, management practices (e.g., worker hygiene and buffer zones) and geospatial factors (e.g., soil characteristics) can significantly modulate the risk of contamination from different sources (2, 3, 12, 19–21). A number of studies have shown that water can act as both a source of pathogens and a vehicle of pathogen introduction into preharvest environments and onto produce (20, 22–25). For example, surface water has been reported to have a wide range of Salmonella (6% to 80%) and L. monocytogenes (6.4% to 62%) prevalence (24, 26–29). In particular, Salmonella prevalences of 6 to 9% have been reported for water samples obtained from produce-growing regions in California and New York State (14, 23). Manas et al. (30) determined that lettuce plants irrigated with nonpotable water had significantly higher rates of total coliform and Salmonella contamination than lettuce irrigated with drinking water. A number of studies have also linked sporadic or repeated contamination events in produce fields to wildlife fecal deposits (21), and a variety of bacterial food-borne pathogens, including Salmonella and L. monocytogenes, have regularly been isolated from fecal samples collected from wildlife and domesticated animals (3, 31–36). Salmonella can also survive in the soil for long periods of time (e.g., up to 230 days in one study [37]) when introduced with contaminated poultry or cow manure. A study of farm management practices in Minnesota and Wisconsin found that the use of manure significantly increased the risk of Escherichia coli contamination in organic (odds ratio [OR], 13.2) and semiorganic (OR, 12.9) produce (38). Another study demonstrated that worker hygiene measures (e.g., portable toilets and hand-washing stations) and training were important in reducing the likelihood of generic E. coli contamination at the preharvest level (39).
While a number of studies (3, 12, 23, 38–41) have suggested that specific farm management practices may impact pathogen contamination in the preharvest environment, we are not aware of any studies that used statistical methods to quantitatively assess the risk of pathogen contamination associated with specific field-level management practices. These types of data are essential to allow for identification of practices that can significantly increase or decrease the likelihood of field-level contamination in order to facilitate implementation of science-based preventive controls. Thus, the purposes of this study were to (i) evaluate the prevalence of Salmonella and L. monocytogenes isolated from environmental samples (soil, drag swab, and water) and (ii) identify field-level management practices associated with the presence of Salmonella or L. monocytogenes.
MATERIALS AND METHODS
Study design. Twenty-one produce farms in New York State were enrolled in a cross-sectional study. Enrollment was based on the willingness of the grower to participate in the study. Participation entailed giving permission to collect environmental samples from produce fields on the farm and agreeing to fill out a questionnaire regarding field-level management practices associated with each field that was sampled. Farms were located in three regions of New York State: 5 in western New York, 12 in central New York, and 4 in eastern New York. Farm visits were performed over a 5-week period in June and July 2012. At least 10 fields were selected per farm. A single composite soil sample (consisting of five subsamples of soil from five locations in the field) and an area drag swab sample were collected for each field (using a sampling area of approximately 0.2 ha). Additionally, samples were collected from water sources that were (i) used for field irrigation (n = 23) or (ii) not used for field irrigation but within 50 m of a sampled field (n = 51). Six hundred samples were collected for the study (263 composite soil samples, 263 area drag swab samples, and 74 water samples).
Questionnaire design. A questionnaire was developed to obtain data on field-level practices identified in the literature as possible factors (e.g., manure application and irrigation water) that influence the risk of preharvest contamination (see the supplemental material). The interview form included questions to obtain (i) general farm characteristics (15 questions) and (ii) information on sampled fields (11 questions per field). Seven of the 11 field-specific questions were time dependent. For instance, growers were asked the last time a sampled field was irrigated, with answer options of within 3 days, 4 to 7 days, 8 to 14 days, and over 14 days/never. One of the time-dependent questions (frequency of irrigation) had two follow-up questions: (i) the source of irrigation water (e.g., pond) and (ii) the type of irrigation system used (e.g., drip). The remaining four field-specific questions were not time dependent. For example, growers were asked whether the field had a buffer zone (defined as a strip at least 5 m wide where no produce was grown). Questionnaires were administered by a single interviewer (L.K.S.) and completed at the time of sample collection in a face-to-face interview, which lasted approximately 1 h. Data were coded from the questionnaires, entered into Excel (Microsoft, Redmond, WA), and imported into SAS 9.3 (SAS Institute Inc., Cary, NC), as sketched below.
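To illustrate how such time-dependent answers can be coded for analysis, a minimal sketch follows; the coding scheme, variable names, and use of Python are assumptions for illustration only (the study entered coded data into Excel and imported it into SAS 9.3):

```python
# Minimal sketch of coding a time-dependent questionnaire item into
# ordinal categories; the mapping and names here are hypothetical.
import pandas as pd

# Hypothetical raw answers to "When was this field last irrigated?"
answers = pd.Series(["within 3 days", "over 14 days/never", "4 to 7 days"])

codes = {"within 3 days": 1, "4 to 7 days": 2,
         "8 to 14 days": 3, "over 14 days/never": 4}
coded = answers.map(codes)  # ordinal codes, ready for statistical analysis
print(coded.tolist())       # [1, 4, 2]
```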
Sample collection. Samples were collected as previously detailed by Strawn et al. (14). Briefly, latex gloves and disposable plastic boot covers (Nasco, Fort Atkinson, WI) disinfected with 70% ethanol were worn and changed between fields. Five soil samples per field were taken with sterile scoops (Fisher Scientific, Hampton, NH) at least six inches (15.2 cm) below the surface (subsurface soil) and deposited in separate sterile Whirl-Pak bags (Nasco). A premoistened drag swab (30 ml of buffered peptone water [BPW] [Becton, Dickinson, Franklin Lakes, NJ] in a sterile Whirl-Pak bag), prepared as previously described by Uesugi et al. (42), was dragged through the field (side to side in 10-m increments and around the perimeter of the field) for 10 min. Water samples (n = 74) were collected directly into sterile 250-ml jars; a sampling pole (Nasco) was used where necessary (i.e., for creeks and ponds). Surface water samples were taken a minimum of 2 m from the water's edge and 0.3 m below the surface. All samples were transported on ice, stored at 4 ± 2°C, and processed within 24 h of collection.
Sample preparation. Samples were prepared for two enrichment schemes to allow separate isolation and detection of Salmonella and L. monocytogenes. Composite soil samples were prepared in duplicate by combining 5-g portions of each of the five subsamples of soil collected in a field; each of the two resulting 25-g composite soil samples was deposited into a sterile filter Whirl-Pak bag. Individual drag swab samples were combined with BPW, hand massaged for 2 min, and squeezed, and 10 ml of the liquid contents was aseptically transferred to each of two sterile filter Whirl-Pak bags. Water samples were tested according to Environmental Protection Agency (EPA) standard methods (43, 44). Briefly, each water sample collected (250 ml) was passed through a 0.45-μm filter unit (Nalgene, Rochester, NY). The filter was then aseptically removed and cut in half, and each portion was transferred to a separate sterile filter Whirl-Pak bag.
Salmonella and L. monocytogenes detection and isolation. Salmonella (45) and L. monocytogenes (46) detection and isolation were performed using modified versions of the procedures outlined in the U.S. Food and Drug Administration's Bacteriological analytical manual (FDA BAM). No quantification of Salmonella or L. monocytogenes was performed. Briefly, for Salmonella detection and isolation, samples were diluted 1:10 with tryptic soy broth (TSB) (Becton, Dickinson) and allowed to stand for 2 h at room temperature (23 ± 2°C). After incubation at 35 ± 2°C for an additional 24 h, two aliquots (1.0 and 0.1 ml) were transferred to 9 and 9.9 ml of tetrathionate (TT) (Oxoid, Cambridge, United Kingdom) and Rappaport-Vassiliadis (RV) (Oxoid) broths, respectively. Both selective enrichment broths were incubated at 42°C in a shaking water bath for 24 h. A 50-μl aliquot of each of the TT and RV broths was plated onto xylose lysine deoxycholate agar (XLD) (Neogen, Lansing, MI) and CHROMagar Salmonella (CHROMagar Company, Paris, France) and incubated at 35 and 37 ± 2°C for 24 and 48 h, respectively. Up to four presumptive Salmonella colonies per selective enrichment and plating medium combination (e.g., TT-XLD and RV-XLD) were substreaked onto brain heart infusion agar (BHI) (Becton, Dickinson) and incubated at 37 ± 2°C for 24 h. Presumptive Salmonella colonies were confirmed by a PCR assay targeting the invA gene (47). For L. monocytogenes, all samples were diluted 1:10 with buffered Listeria enrichment broth (BLEB) (Becton, Dickinson) and incubated at 30 ± 2°C for 24 h, with Listeria selective enrichment supplement (Oxoid) added to the enrichments at 4 h. At 24 and 48 h, 50 μl of enrichment was streaked onto modified Oxford agar (MOX) (Becton, Dickinson) and L. monocytogenes plating medium (LMPM) (Biosynth International, Itasca, IL). MOX and LMPM plates were incubated for 48 h at 30 and 35 ± 2°C, respectively. Up to four presumptive L. monocytogenes colonies per plating medium and time combination (e.g., MOX at 24 h or LMPM at 48 h) were substreaked onto BHI. BHI plates were incubated at 37 ± 2°C for 24 h. Presumptive L. monocytogenes colonies were confirmed by PCR amplification and sequencing of a partial sigB gene fragment (48–50). Controls were processed in parallel with each pathogen detection and isolation scheme: Salmonella strain ATCC 700408 (FSL F6-826) (51) and L. monocytogenes strain FSL R3-001 (52) served as positive controls, and sterile enrichment media served as negative controls.
Classification of isolates. There were four isolation schemes each for Salmonella (TT-XLD, RV-XLD, TT-Chrome, and RV-Chrome) and L. monocytogenes (LMPM 24 h, MOX 24 h, LMPM 48 h, and MOX 48 h); one isolate from each isolation scheme was used for subtyping (as detailed below), yielding up to four representative isolates per pathogen-positive sample. Representative isolates were streaked from frozen culture onto BHI and incubated at 37°C for 18 h, and a well-isolated colony was selected. Salmonella serotyping using the White-Kauffman-Le Minor scheme (53) was conducted by the Wadsworth Center, New York State Department of Health (Albany, NY). Nucleotide sequences of sigB from L. monocytogenes isolates were obtained by Sanger sequencing performed by the Cornell University Life Sciences Core Laboratories Center. Allelic types (ATs), defined by a unique combination of polymorphisms (54, 55), were assigned by comparison of sigB sequences to an internal reference database.
Statistical analysis. Separate statistical analyses for Salmonella and L. monocytogenes were performed in SAS 9.3. An initial descriptive analysis was performed to calculate Salmonella and L. monocytogenes prevalence for all samples (n = 600) and for each sample type collected: soil (n = 263), drag swab (n = 263), and water (n = 74). Univariate associations between pathogen-positive terrestrial samples and region and week sampled were determined using a chi-square test or Fisher's exact test (if the expected frequency in any cell was less than 5). Confidence intervals (95%) were calculated assuming a binomial distribution. Individual P values are reported for each test.
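As an illustration of these descriptive calculations, the sketch below computes a prevalence estimate with a binomial 95% CI and runs a Fisher's exact test on a 2-by-2 table. The counts in the CI example come from the Results; the 2-by-2 table is purely illustrative, and the Python/scipy implementation is an assumed stand-in for the SAS 9.3 procedures actually used:

```python
# Minimal sketch of the descriptive statistics (assumed scipy stand-in
# for the SAS 9.3 analysis described in the text).
from scipy import stats

# Field-level Salmonella prevalence: 16 positive fields of 263 (Results).
positives, n = 16, 263
prevalence = positives / n  # ~0.061 -> 6.1%

# 95% CI assuming a binomial distribution (exact Clopper-Pearson bounds).
ci = stats.binomtest(positives, n).proportion_ci(confidence_level=0.95)
print(f"prevalence = {prevalence:.3f}, 95% CI = ({ci.low:.3f}, {ci.high:.3f})")

# Fisher's exact test on a 2x2 table (positive/negative by region), used
# when any expected cell count is < 5; counts here are illustrative only.
table = [[3, 97], [5, 158]]
odds_ratio, p_value = stats.fisher_exact(table)
print(f"Fisher's exact: OR = {odds_ratio:.2f}, P = {p_value:.2f}")
```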
The field was used as the unit of analysis for model development to identify field-level risk factors associated with Salmonella and L. monocytogenes contamination of produce fields. A field was considered positive if either a soil or a drag swab sample collected from that field was confirmed culture positive for the respective pathogen. Chi-square or Fisher's exact tests were computed for each of the 11 field-specific questions (i.e., factors). Factors determined to be significant (P ≤ 0.05) were retained as candidates for subsequent multivariate analysis. The generalized linear mixed model (GLIMMIX) procedure was used to model the association between each candidate factor (univariate analysis) or set of factors (multivariate analysis) and the outcome (Salmonella- or L. monocytogenes-positive/negative field). Fields within a farm were not independent; therefore, farm was included in the model as a random effect. Effect estimates (β), standard errors (SEs), odds ratios (ORs), 95% confidence intervals (CIs), and P values were determined for each candidate factor. Potential collinearity among the candidate factors was evaluated by Spearman's rank correlation coefficient test. Multivariable models were built using a stepwise selection method and assessed by fit statistics, such as Akaike's information criterion and Schwarz's Bayesian criterion. The final model retained only variables that significantly improved the fit of the model (P ≤ 0.05). Interaction terms were also tested, but none were significant.
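A rough Python analogue of this model structure is sketched below. It uses statsmodels' Bayesian mixed GLM, which is a variational approximation and not the same estimator as SAS PROC GLIMMIX; the file and column names are hypothetical:

```python
# Minimal sketch of a field-level logistic model with farm as a random
# intercept. Assumes a hypothetical table 'fields.csv' with 0/1 columns
# 'positive', 'manure', and 'buffer' plus a 'farm' identifier; the study
# itself fit this model with SAS PROC GLIMMIX.
import numpy as np
import pandas as pd
from statsmodels.genmod.bayes_mixed_glm import BinomialBayesMixedGLM

df = pd.read_csv("fields.csv")

model = BinomialBayesMixedGLM.from_formula(
    "positive ~ manure + buffer",   # fixed effects: candidate factors
    {"farm": "0 + C(farm)"},        # random intercept for each farm
    df,
)
result = model.fit_vb()  # variational Bayes fit

# Odds ratios are the exponentiated fixed-effect coefficients.
for name, beta in zip(model.fep_names, result.fe_mean):
    print(f"{name}: beta = {beta:.2f}, OR = {np.exp(beta):.1f}")
```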
Isolate storage and data access. All isolates were preserved at −80°C in 15% glycerol. Isolate information and subtyping data from this study are archived and available through the Food Microbe Tracker database (http://www.foodmicrobetracker.com).
RESULTS
Salmonella and L. monocytogenes prevalence in terrestrial samples. The prevalence of Salmonella in terrestrial samples (n = 263 soil and n = 263 drag swab samples) was 3.4% (18/526). Salmonella prevalence was higher among soil samples (4.9%; 13/263) than among drag swab samples (1.9%; 5/263). Salmonella was detected in 6.1% of fields sampled (16/263). For two fields, both the soil and the drag swab samples were positive for Salmonella. No significant difference was observed in Salmonella prevalence in soil and drag swab samples by region (P = 0.4 and 0.9, respectively) or week sampled (P = 0.9 and 0.6, respectively). Furthermore, no significant difference was observed in the field-level prevalence of Salmonella by region (P = 0.8) or week sampled (P = 0.9).
The prevalence of L. monocytogenes in terrestrial samples (n = 263 soil and n = 263 drag swab samples) was 9.7% (51/526). L. monocytogenes prevalence in soil and drag swab samples was 11% (30/263) and 8% (21/263), respectively. L. monocytogenes was detected in 46 of the 263 fields sampled (17.5%). Five fields had both soil and drag swab samples positive for L. monocytogenes. No significant difference was found in L. monocytogenes prevalence in soil and drag swab samples by region (P = 0.2 and 0.3, respectively) or week sampled (P = 0.7 and 0.2, respectively). In addition, no significant difference was found in the field-level prevalence of L. monocytogenes by region (P = 0.9) or week sampled (P = 0.1).
Salmonella and L. monocytogenes prevalence in water samples. The prevalence of Salmonella and L. monocytogenes in water samples was 11% (8/74) and 30% (22/74), respectively. Samples were collected from irrigation (n = 23) and nonirrigation (within 50 m of a sampled field; n = 51) water sources (Table 1).
TABLE 1 Salmonella and L. monocytogenes prevalence in water samples collected from irrigation and nonirrigation water sources
The prevalence of Salmonella and L. monocytogenes in water samples used for irrigation was 4% (1/23) and 9% (2/23), respectively. Fourteen of the samples collected from irrigation sources were obtained from engineered water sources (e.g., wells or municipal supplies) of potable water quality; all of these samples were negative for Salmonella and L. monocytogenes. The remaining nine water samples were from surface water sources (one creek and eight ponds); three samples from ponds used for field irrigation tested positive, one for Salmonella and two for L. monocytogenes. All fields using these irrigation water sources were negative for Salmonella and L. monocytogenes (Table 1).
Salmonella and L. monocytogenes were detected in 14% (7/51) and 39% (20/51), respectively, of water samples obtained from nonirrigation sources within 50 m of a sampled field. Water samples were collected from three source types: ponds (n = 17), roadside or field buffer ditches (n = 13), and flowing surface water (e.g., rivers, creeks, or streams) (n = 21). The prevalence of Salmonella was higher in roadside or field buffer ditch samples (23%; 3/13) than in pond (12%; 2/17) and flowing surface water (10%; 2/21) samples. The prevalence of L. monocytogenes was highest in pond samples (59%; 10/17), compared to roadside or field buffer ditch (39%; 5/13) and flowing surface water (24%; 5/21) samples.
Characterization of Salmonella and L. monocytogenes isolated from terrestrial and water samples. Serotyping was performed on one representative Salmonella isolate per isolation scheme, which yielded 35 Salmonella isolates from the 26 positive samples. Three of the 26 samples yielded isolates with more than one serotype. Salmonella enterica serotypes Give and Typhimurium were isolated from a single water sample (isolation schemes TT-Chrome and TT-XLD, respectively), S. enterica serotypes Agona and Tennessee were isolated from a drag swab sample (isolation schemes RV-Chrome and RV-XLD, respectively), and S. enterica serotypes Senftenberg and Newport were isolated from a soil sample (isolation schemes RV-Chrome and RV-XLD, respectively). The remaining 23 Salmonella-positive samples each yielded a single serotype: S. enterica serotypes Newport (8 samples), Cerro (5 samples), Thompson (5 samples), Agona (2 samples), IV 40:z4,z32:− (2 samples), and Give (1 sample). For the two fields where Salmonella was isolated from both soil and drag swab samples, the same serotype (S. Cerro) was isolated from both sample types in one field, whereas different serotypes (S. Thompson and S. Cerro) were isolated from the soil and drag swab samples in the other field.
Two hundred sixteen L. monocytogenes isolates (one isolate per isolation scheme) were subtyped based on alignment of sigB nucleotide sequences. For every positive sample, all isolation schemes yielded the same subtype. The 73 representative L. monocytogenes isolates (one from each of the 73 L. monocytogenes-positive samples) yielded nine different allelic types, representing L. monocytogenes lineages I (29 isolates, 5 ATs), II (41 isolates, 3 ATs), and IIIa (3 isolates, 1 AT). L. monocytogenes was detected in both soil and drag swab samples for five fields. The same subtype was identified in the soil and drag swab samples in two fields (AT 57 and AT 59), whereas different subtypes (ATs 57 and 61, ATs 78 and 137, and ATs 57 and 58) were isolated in the soil and drag swab samples from the other three fields.
Risk factors associated with Salmonella contamination of produce fields. Three of the 11 field management practices evaluated (manure application, soil cultivation, and buffer zone) were significantly associated with a Salmonella-positive field by univariate analysis (Table 2). Fields where manure was applied within a year prior to sample collection had higher odds of Salmonella isolation (OR = 19.0; 95% CI = 4.9, 77.0) than fields where manure had not been applied. Fields where soil was cultivated within 7 days prior to sample collection were approximately 6 times more likely (OR = 6.3; 95% CI = 1.6, 23.0) to be Salmonella positive than fields where soil had not been cultivated for at least 30 days. The presence of a buffer zone had a protective effect, reducing the odds of a Salmonella-positive field approximately fivefold (OR = 0.2; 95% CI = 0.1, 0.5) (Table 2).
TABLE 2 Univariate analyses of management practices that influence the likelihood of Salmonella being detected in a produce field (based on testing of soil and drag swab samples)
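For readers interpreting Tables 2 and 3, the reported odds ratios relate to the logistic-model coefficients through the standard identities (general formulas, not study-specific results):

$$\log\!\left(\frac{p}{1-p}\right) = \beta_0 + \beta_1 x_1 + \cdots + \beta_k x_k, \qquad \mathrm{OR}_i = e^{\beta_i}$$

Thus, an OR of 0.2 for the buffer zone factor corresponds to β ≈ −1.6, meaning the odds of a Salmonella-positive field are multiplied by 0.2, i.e., reduced approximately fivefold (1/0.2 = 5).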
Examination of Spearman's rank correlation coefficients for the three candidate factors retained from the univariate analysis showed a correlation between manure application and soil cultivation of a field. Therefore, three multivariable models were evaluated: model 1, manure application, soil cultivation, and buffer zone; model 2, manure application and buffer zone; and model 3, soil cultivation and buffer zone. In the multivariate model with the best fit (i.e., model 2) (Table 3), application of manure to a field within a year prior to sample collection was associated with a higher likelihood of Salmonella being detected in the field (OR = 16.7; 95% CI = 3.0, 94.4) than no manure application. The presence of a buffer zone was associated with a lower likelihood of Salmonella being detected in a field (OR = 0.1; 95% CI = 0.03, 0.6) than the absence of a buffer zone (Table 3).
TABLE 3 Multivariate final model of risk factors that influence the likelihood of Salmonella being detected in a produce field (based on testing of soil and drag swab samples)
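The collinearity screen and model comparison can be illustrated with the short sketch below. For simplicity it uses ordinary logistic regression (omitting the farm random effect) to compare candidate models by AIC and BIC, whereas the study compared PROC GLIMMIX fits in SAS; the file and column names are hypothetical:

```python
# Minimal sketch of the collinearity check and information-criterion
# model comparison, assuming a hypothetical 'fields.csv' with 0/1
# columns 'positive', 'manure', 'cultivation', and 'buffer'.
import pandas as pd
from scipy import stats
import statsmodels.formula.api as smf

df = pd.read_csv("fields.csv")

# Spearman's rank correlation flags correlated candidate factors
# (manure application and soil cultivation were correlated in the study).
rho, p = stats.spearmanr(df["manure"], df["cultivation"])
print(f"Spearman rho = {rho:.2f} (P = {p:.3f})")

# Candidate multivariable models built from the retained factors;
# lower AIC/BIC indicates better fit after penalizing complexity.
candidates = {
    "model 1": "positive ~ manure + cultivation + buffer",
    "model 2": "positive ~ manure + buffer",
    "model 3": "positive ~ cultivation + buffer",
}
for name, formula in candidates.items():
    fit = smf.logit(formula, data=df).fit(disp=False)
    print(f"{name}: AIC = {fit.aic:.1f}, BIC = {fit.bic:.1f}")
```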
Risk factors associated with L. monocytogenes contamination in produce fields. Six of the 11 field management practices (manure application, reported observation of wildlife, worker activity, irrigation, soil cultivation, and presence of a buffer zone) were significantly associated with an L. monocytogenes-positive field by univariate analysis (Table 4); five of these six factors were time dependent. Fields where manure was applied within a year prior to sample collection had 7 times higher odds of L. monocytogenes isolation (OR = 7.0; 95% CI = 3.1, 15.4) than fields where manure had not been applied. Fields where growers reported observation of wildlife within 3 days prior to sample collection had higher odds of L. monocytogenes isolation (OR = 4.4; 95% CI = 1.2, 15.6) than fields where growers had not reported observation of wildlife for at least 7 days. Fields where soil was cultivated within 7 days prior to sample collection were approximately 8 times more likely (OR = 8.1; 95% CI = 3.3, 19.6) to be L. monocytogenes positive than fields where soil had not been cultivated for at least 30 days. Fields with recent worker activity (within 3 days prior to sample collection) had 10.5 times higher odds of L. monocytogenes isolation (OR = 10.5; 95% CI = 2.3, 47.5) than fields where workers had been absent for longer than 30 days. A number of other worker-related factors did not show significant associations with L. monocytogenes contamination, including delivery of food safety training (in workers' native language), presence of portable toilets and hand-washing stations (within a quarter mile of fields), frequency of toilet cleaning, and posting of signs advocating food safety and/or sanitation best practices in changing areas; for most of these factors, a high level of compliance with best practices was reported (e.g., all farms reported cleaning toilets at least once a week). Fields irrigated within 3 days prior to sample collection had more than 5 times higher odds of L. monocytogenes isolation (OR = 5.3; 95% CI = 2.4, 12.0) than fields irrigated at least 14 days before. Furthermore, no significant difference in L. monocytogenes-positive fields was observed by irrigation type (overhead versus drip). Lastly, the presence of a buffer zone had a protective effect, reducing the likelihood of an L. monocytogenes-positive field (OR = 0.5; 95% CI = 0.2, 0.9) (Table 4).
TABLE 4 Univariate analyses of management practices that influence the likelihood of L. monocytogenes being detected in a produce field (based on testing of soil and drag swab samples)
Correlations among the six factors retained by univariate analysis were evaluated using Spearman's rank correlation coefficients. Similar to the findings for Salmonella, a correlation was observed between manure application and soil cultivation of a field. The three multivariable models evaluated were the following: model 1, manure application, reported observation of wildlife, worker activity, irrigation, soil cultivation, and buffer zone; model 2, manure application, reported observation of wildlife, worker activity, irrigation, and buffer zone; and model 3, reported observation of wildlife, worker activity, irrigation, soil cultivation, and buffer zone. The multivariate model with the best fit was model 3 (Table 5). In this model, reported observation of wildlife in a field (OR = 6.1; 95% CI = 1.3, 28.4) and irrigation of a field (OR = 6.0; 95% CI = 2.0, 18.1) within 3 days prior to sample collection were associated with higher odds of L. monocytogenes isolation. Fields where soil was cultivated within 7 days prior to sample collection were nearly 3 times more likely to be L. monocytogenes positive than fields where soil had been cultivated at least 30 days before (OR = 2.9; 95% CI = 1.1, 8.6) (Table 5).
TABLE 5 Multivariate final model of risk factors that influence the likelihood of L. monocytogenes being detected in a produce field (based on testing of soil and drag swab samples)
DISCUSSION
The study reported here is one of the first to quantitatively identify management practices associated with an increased or decreased likelihood of Salmonella and L. monocytogenes isolation in produce fields. In univariate analyses, six factors (manure application, reported observation of wildlife, worker activity, irrigation, soil cultivation, and buffer zone presence) were identified as significant risk factors for Salmonella or L. monocytogenes contamination. Five of the six risk factors were time dependent, suggesting that adjusting the timing of current practices may reduce the potential for produce contamination at minimal cost to growers.
Some risk factors influence the likelihood of isolation of both Salmonella and L. monocytogenes in fields. Based on the separate univariate analyses of the Salmonella and L. monocytogenes data, we identified three risk factors that significantly affected the likelihood of detection of both pathogens. As adjustments of management practices related to these risk factors have the potential to reduce contamination with both of these key pathogens, these three factors are discussed below.
Our data specifically showed that recent cultivation of fields (i.e., within 7 days of sample collection) was significantly associated with an increased likelihood of both Salmonella and L. monocytogenes isolation from fields. Soil cultivation was also a significant risk factor in the final multivariate model for L. monocytogenes isolation. A likely explanation for these findings is that pathogens present in the subsurface soil are exposed at the surface when soil is cultivated, making them more likely to be detected and possibly also more likely to contaminate produce. Furthermore, the likelihood of pathogen isolation should decrease over time after cultivation, due to exposure to environmental conditions (e.g., UV light) that reduce pathogen loads. This explanation is supported by previous studies (14, 23, 42) that have shown the presence and persistence of Salmonella and L. monocytogenes in subsurface soil. For example, Salmonella was detected in 2.6% and 2.0% of soil samples collected from produce-growing regions in California and New York State, respectively, while L. monocytogenes prevalence in soil was 9% in New York State preharvest environments (14, 23). Interestingly, Park et al. (39) observed that spinach contamination with generic E. coli was less likely when a field was cultivated prior to the growing season; this may reflect the fact that cultivation at time points considerably before sampling (e.g., >7 days before) reduces overall pathogen loads by exposing pathogens present in the subsurface soil to UV light and other inactivating conditions (e.g., desiccation). This hypothesis is supported by the observation that Salmonella and Listeria numbers in inoculated livestock waste declined more rapidly when this material was spread on the soil surface than when it was incorporated into the soil, where it was protected from exposure to environmental conditions (e.g., UV light or harsh temperatures) (56).
Application of manure was also identified as a significant factor that increased the odds of both Salmonella and L. monocytogenes isolation in fields. Numerous studies (37, 39, 41, 57–64) have demonstrated that the application of manure to soils can introduce pathogens and may facilitate long-term persistence of pathogens in soil. One study observed Salmonella to persist in manure, manure-amended nonsterilized soil, and manure-amended sterilized soil for 184, 332, and 405 days, respectively (60). However, the association between pathogen contamination of fields and manure application had not been previously described for commercial produce farms. Some studies (38, 39) have investigated the association between generic E. coli contamination of preharvest produce samples and application of manure to fields. One study observed that generic E. coli contamination was lower in spinach samples collected over a 2-year period if the application of manure occurred greater than 200 days prior to sample collection (39), while another study observed that E. coli prevalence in produce samples collected preharvest was not affected by the application of manure between 90 and 120 days prior to sampling (38). Our results suggest that application of manure to fields can significantly influence the risk of both Salmonella and L. monocytogenes contamination; therefore, management of manure before application is essential. Manure management practices, such as aging, treating, and handling of manure before application, have been shown to affect the survival of food-borne pathogens in manure (38, 56, 65). For example, one study (65) showed that composting cow manure before application was effective at killing Salmonella, supporting the idea that management of manure before application to fields may limit or reduce the risks associated with manure use in produce preharvest environments.
In addition, the likelihood of Salmonella and L. monocytogenes isolation in fields was significantly decreased if growers reported the presence of a buffer zone, defined as a zone of at least 5 m separating the edge of produce fields from potential environmental pathogen reservoirs (e.g., forests, roads, waterways, and livestock operations). These data suggest that even buffer zones narrower than the 9 m (30 ft) recommended in the 2012 version of the LGMA (Table 6 in reference 66) are associated with reduced pathogen prevalence. Surprisingly, there is little science-based research supporting the hypothesis that the presence of a buffer zone is associated with decreased pathogen prevalence in preharvest environments; our study therefore formally tested this hypothesis for Salmonella and L. monocytogenes in produce fields. Some previous studies (67–69) suggest that vegetative buffer zones may be effective in reducing bacterial pathogen loads in sewage runoff and wastewater from animal facilities. Vegetative buffer zones and nonagricultural lands adjacent to produce fields (e.g., riparian areas, wetlands, or grasslands) also offer a variety of ecological benefits (16, 17, 69, 70). Combined, these data suggest that the effects of buffer zones and adjacent nonagricultural lands on pathogen prevalence may be driven by complex ecological interactions that will require further field studies, including mathematical modeling efforts. Such research will also need to define the effects of different types of buffer zones (i.e., bare strips or specific vegetation) and the quantitative relationship between buffer zone width and type and pathogen reduction.
Some risk factors specifically increase the likelihood of isolation of L. monocytogenes in fields. While some risk factors increased or reduced the likelihood of isolation of both Salmonella and L. monocytogenes, others (worker activity, reported wildlife observation, and irrigation) increased only the likelihood of L. monocytogenes detection in fields. Worker activity was significantly associated with an increased likelihood of L. monocytogenes isolation in fields by univariate analysis but was not significant in the multivariate analysis. However, reported observation of wildlife and irrigation of fields were significantly associated with higher odds of L. monocytogenes isolation by multivariate analysis and are discussed below.
Reported observation of wildlife was based on visual confirmation (i.e., sighting of wildlife in a field) by the grower or his/her staff (e.g., a field supervisor). We acknowledge that growers whose farms and food safety programs (e.g., GAPs) are audited frequently may be less inclined to report the presence of wildlife, because they are aware of the risks associated with wildlife in fields, while growers audited infrequently may be more forthcoming. Future studies may choose to measure wildlife intrusion and potential pathogen contamination by objective means (e.g., infrared cameras to detect wildlife in fields). Our study does provide quantitative data to support previous studies (2, 12, 19, 25, 32) that suggested that wildlife may be a source of pathogen contamination in fields. Furthermore, wildlife has also been suspected as the source of pathogen contamination in a number of produce-associated outbreaks (13, 70, 71). While reported observation of wildlife was shown to be a risk factor increasing the likelihood of L. monocytogenes isolation in fields, this finding may be specific to New York State or parts of New York State; Langholz and Jay-Russell have discussed that pathogen prevalence in wildlife may depend on geographic location and local landscape characteristics (70).
Recent irrigation was also shown to significantly increase the odds of L. monocytogenes isolation in fields. Water has been identified as a major reservoir for pathogens, and irrigation as a vehicle for transmission of pathogens to fields and produce (12, 30, 41, 72–75). L. monocytogenes is often found in various water sources, with reported prevalence ranging from <1% to 29% (14, 76, 77). We likewise observed a high prevalence of L. monocytogenes in water here, particularly in surface water sources (e.g., ponds). Steele and Odumeru (72) observed that surface water had the most variable microbial quality and, if contaminated, could lead to widespread contamination of crops. Our findings suggest that detection of L. monocytogenes in fields was more likely only if irrigation occurred within a few days prior to sample collection. Two other studies have also shown an association between pathogen detection and time of irrigation or water application. One study observed that Salmonella sprayed onto tomatoes could not be recovered from the tomatoes after 2 days (78). The second study observed that the risk of E. coli contamination in spinach samples decreased when irrigation of a field occurred >5 days prior to sample collection. In addition to L. monocytogenes introduction with irrigation water, the association of irrigation with an increased frequency of L. monocytogenes detection may also reflect the fact that moist soils can facilitate L. monocytogenes growth or detection, consistent with previous studies that reported a higher L. monocytogenes prevalence in moist soils (14, 79). Overall, our data suggest that avoiding irrigation for at least 3 days before harvest (if possible and feasible) may reduce potential L. monocytogenes contamination of produce from field soil, and possibly its transfer into packinghouses.
Conclusions. This study provides quantitative data on management practices that represent potential risk factors for produce field contamination. A majority of the research previously conducted to investigate these risk factors has relied on pathogen inoculation or has targeted indicator organisms (i.e., generic E. coli). Such studies are commonly employed because the prevalence of food-borne pathogens (e.g., Salmonella and Shiga toxin-producing E. coli) in produce production environments is low. Statistically robust analyses are difficult to conduct unless a sufficient number of pathogen-positive samples is obtained, and this generally requires an extremely large sample size. Large sample sizes in environmental field studies are often difficult to achieve due to considerable labor and financial costs and difficulties gaining access to commercial operations. We therefore focused on 11 key management practices previously discussed as risk factors for preharvest contamination, limited the number of levels within each factor, and accounted for farm (as a random effect) to prevent bias and misinterpretation of results (e.g., spurious relationships). This study was conducted in New York State, and thus the risk factors identified may not always be applicable to other produce-growing regions in the United States or elsewhere. Additionally, fields were sampled over a 5-week period in June and July, and as a result, the risk factors identified may not be applicable to other time periods (e.g., late in the growing season). Despite these limitations, this study is one of the first to use field-collected data to quantify management practices associated with detection of Salmonella and L. monocytogenes (two food-borne pathogens of concern to the produce industry). These findings will assist growers in (i) evaluating their current on-farm food safety plans (e.g., GAPs), (ii) implementing preventive controls that reduce the risk of preharvest contamination, and (iii) making more informed decisions related to field practices prior to harvest.
ACKNOWLEDGMENTS
This project was supported by the Agriculture and Food Research Initiative Competitive Grant no. 2012-67011-19875 from the USDA National Institute of Food and Agriculture.
We are grateful for the technical assistance of Maureen Gunderson and the editing assistance of Travis Chapin and Gina Ryan.
FOOTNOTES
- Received 21 August 2013.
- Accepted 24 September 2013.
- Accepted manuscript posted online 27 September 2013.
Supplemental material for this article may be found at http://dx.doi.org/10.1128/AEM.02831-13.
- Copyright © 2013, American Society for Microbiology. All Rights Reserved.