Study Shows How Reclaimed Irrigation Water Treatments Influence AMR Bacteria Transfer to Crops

A new controlled-environment study has demonstrated how the treatment and quality of wastewater reused for irrigation can greatly affect the transfer of antibiotic-resistant bacteria and antimicrobial resistance (AMR) genes to crops, highlighting both the risks of applying insufficiently treated wastewater to produce and the effectiveness of advanced water treatment technologies.
The study compared potable water (as the control), secondary-treated wastewater, and tertiary-treated reclaimed water to evaluate their potential to introduce Escherichia coli, extended-spectrum β-lactamase (ESBL)-producing E. coli, and key resistance genes into baby lettuce throughout an entire growth cycle.
Secondary-treated reclaimed wastewater was collected from a treatment plant in Murcia, Spain, which applied the following treatment steps:
- Aeration, separation of solids and suspended solids, grit removal, and degreasing
- A double-stage activated sludge process with coagulation/flocculation and lamella clarification
- Sand filtration and UV-C disinfection
Tertiary-treated water underwent further chlorine treatment.
Overall, the researchers found that tertiary treatment significantly reduces AMR-related food safety risks, while secondary-treated water remains a potential source of contamination with resistant bacteria.
Secondary-treated wastewater consistently exhibited detectable E. coli and ESBL-producing E. coli, as well as the highest absolute and relative abundances of four priority resistance genes. These findings reinforce earlier studies showing that biological (secondary) treatment alone cannot fully remove AMR bacteria or resistance genes and that such effluents may function as reservoirs of AMR factors, even after substantial bacterial reduction.
In contrast, tertiary-treated reclaimed water kept E. coli and ESBL-producing E. coli levels below detection limits, matching potable water performance. Low levels of resistance genes were still detectable in tertiary-treated water, but their concentrations were greatly reduced compared to secondary effluent. This finding aligns with growing evidence that advanced treatment technologies such as disinfection, filtration, and multi-stage processes significantly suppress resistome profiles in reclaimed water.
Importantly, lettuce irrigated with tertiary-treated water exhibited no significant differences in the prevalence of AMR genes when compared with lettuce irrigated with potable water.
Interestingly, despite substantial resistance gene loads in treated wastewater, concentrations in lettuce were only 4–6 percent of the levels in the corresponding irrigation water. This limited transfer supports several recent studies showing that plant surface characteristics, microbial competition, and UV exposure may constrain bacterial colonization, and that AMR gene entry into edible tissues is highly variable and often low, even when the irrigation water contains resistance determinants.
Still, even with limited transfer, secondary effluent can introduce resistance determinants into leafy greens. The study supports the use of reclaimed irrigation water that meets clear microbial criteria, as well as tertiary or multi-hurdle treatments, to mitigate AMR risks.
The study, published in Frontiers in Microbiology, was conducted by researchers with the University of Porto, Portugal, and the Spanish National Research Council.
