We investigated whether salt tolerance can be inferred from observable cues based on a woody species’ native habitat and leaf traits. Such inferences could improve species selection for urban landscapes constrained by soils irrigated with reclaimed water. We studied the C3 tree species Acer grandidentatum Nutt. (canyon maple; xeric, non-saline habitat), hypothesized to have some degree of salt tolerance based on its semiarid but non-saline native habitat. We compared it with A. macrophyllum Pursh (bigleaf maple), from mesic/riparian, non-saline habitats and with much larger leaves, and Eucalyptus camaldulensis Dehnh. (eucalyptus/red gum), from mesic, saline habitats and with sclerophyllous evergreen leaves. Five levels of increasing salt concentration (non-saline control to 12 dS·m−1) were applied over 5 weeks to container-grown seedling trees in two separate studies, one in summer and the other in fall. We monitored leaf damage, gas exchange, and hydric behavior as measures of tree performance for 3 weeks after target salinity levels were reached. Eucalyptus was the most salt tolerant of the three species. At all elevated salinity levels, eucalyptus excluded salt from its root zone, unlike either maple species, and it maintained intact, undamaged leaves with no effect on photosynthesis and only minor reductions in stomatal conductance (gS). Conversely, bigleaf maple suffered increasing leaf damage as salt concentration increased, nearly defoliating at the highest levels, with correspondingly decreasing gas exchange. Canyon maple leaves were undamaged and gas exchange was minimally affected at 3 dS·m−1, but damage increased at higher salt concentrations. Salt-tolerant eucalyptus and riparian bigleaf maple thus framed canyon maple’s moderate salt tolerance up to 3 dS·m−1, which appears related to seasonal soil drying in its semiarid native habitat.
These results highlight the potential to infer a degree of salt tolerance from either native habitat or known drought tolerance when selecting plant species for urban landscapes limited by soil salinity or brackish irrigation water. Observable cues such as xeromorphic leaf traits may also provide visual evidence of salt tolerance.
Nisa Leksungnoen, Roger K. Kjelgren, Richard C. Beeson Jr., Paul G. Johnson, Grant E. Cardon, and Austin Hawks
Shane R. Evans, Kelly Kopp, Paul G. Johnson, Bryan G. Hopkins, Xin Dai, and Candace Schaible
Recent advances in irrigation technologies have led many states to incentivize homeowners to purchase United States Environmental Protection Agency WaterSense-labeled smart irrigation controllers. However, previous research on smart controllers has shown that their use may still result in excess water application compared with controllers manually programmed to replace actual water loss. This study compared kentucky bluegrass (Poa pratensis) irrigation applications using three smart irrigation controllers, a conventional irrigation controller programmed according to Cooperative Extension recommendations, and the average irrigation rate of area homeowners in Utah during 2018 and 2019. Of all the controllers tested, the manually programmed controller applied water in amounts closest to actual evapotranspiration rates; however, the smart controllers applied 30% to 63% less water than area homeowners, depending on the controller and the year of the study. Kentucky bluegrass health and quality indicators—percent green cover and normalized difference vegetation indices—varied between years of the study and fell below acceptable levels on several occasions in 2019 for three of the four controllers tested. Compared with the results of similar studies, these findings suggest that the effects of smart irrigation controllers on turfgrass health and quality may vary by location and over time.