Abstract
Preharvest field temperatures can influence the composition and quality of vegetables at harvest as well as their postharvest responses. Temperatures that injure or weaken tissues before harvest reduce storage life and increase susceptibility to decay.
Freezing temperatures in the field can greatly reduce storage life. In addition, many vegetables suffer injury when exposed for certain periods of time to temperatures above freezing but below about 10°C. This injury, termed chilling injury, is most often associated with vegetables of tropical and subtropical origin; however, some temperate-zone vegetables can be injured by low, but non-freezing, temperatures.
High field temperatures can result in physiological disorders and increased deterioration. They can induce injury that is visible at harvest, such as sunburn or sunscald, but serious problems can also arise during storage and handling from injury that was not visible at harvest.
Abstract
Succinic acid-2,2-dimethylhydrazide (daminozide) applied at 1800 ppm to 10-year-old ‘Babygold-5’ and ‘Babygold-8’ peach trees (Prunus persica (L.) Batsch) at pit hardening advanced fruit maturity and increased tolerance of low-soluble-solids fruit by improving flesh color. Fruit softening, the disappearance of flesh chlorophyll, and reductions in puree viscosity were the primary benefits obtained from the application of daminozide. Daminozide advanced maturation generally rather than concentrating the fruit into a more uniform maturity class. Flesh color and acidity were highly correlated with loss of fruit firmness in both control and daminozide-treated fruit, but daminozide-treated fruit improved in flesh color more rapidly than control fruit as firmness decreased. Acidity remained consistently higher for treated fruit at all firmness levels. The mechanism by which daminozide advanced maturity did not appear to be active during postharvest holding of these clingstone cultivars. Further ripening during postharvest storage of both treated and control fruit occurred at about the same rate over a 2-, 4-, or 6-day period at 18°C.
Abstract
Anatomical studies of three physiological disorders of lettuce (Lactuca sativa L., cv. Climax), i.e., russet spotting (RS), brown stain (BNS), and rusty brown discoloration (RBD), showed that these disorders differ in their location and sequence of symptom development. RS produces well-defined, localized, spot-like lesions that may start either in the epidermis or in the mesophyll. In advanced stages of RS, the vascular tissue may also show discoloration and the mesophyll cells collapse, resulting in pit-like depressions. BNS lesions are restricted to the epidermis and may cover a large area. All the cells in a given lesion become discolored, with greater discoloration intensity at the periphery than at the center of the lesion. RBD symptoms start in the epidermis and in advanced stages expand to the mesophyll cells. RBD-affected areas exhibit a mosaic-like pattern of discoloration.
Abstract
Research was conducted to assess the response of tall fescue (Festuca arundinacea Schreb.) to water deficit conditions. Different leaching fractions (LF = drainage volume/irrigation volume) and irrigation frequencies (IF) were imposed over a 119-day summer period in Las Vegas, Nevada, followed by a 71-day recovery period. Plots of tall fescue contained draining lysimeters 120 cm deep × 51 cm in diameter. Irrigations were based on an evapotranspiration (ET) feedback system to establish LFs of +0.15, 0.00, −0.15, −0.25, and −0.40. Plots were irrigated on a daily or twice-per-week schedule. N was applied to subplots at a rate of 0, 12.2, or 24.4 kg·ha−1 per month. As LF decreased, relative soil water in storage declined in a linear fashion (r² = 0.97, P = 0.001). Storage depletions for the four lowest LFs at the end of 119 days of imposed water deficits were about 15%, 40%, 60%, and 70% compared to the +0.15 LF treatment. Canopy temperature, soil matric potential (Ψm), leaf xylem water potential (ΨLX), leaf stomatal conductance (gs), clipping yield, and color and cover ratings all separated statistically (P < 0.05) based on LF but not on IF. However, irrigation amount (I), ET, tissue moisture content, and total Kjeldahl N (TKN) separated based on both LF and IF, with a significant LF × IF interaction for I (P < 0.05) and TKN (P < 0.001). An irrigation savings of 60.4 cm was realized during the 119-day water deficit period at the −0.40 LF. However, at the lower LFs, plant stress increased (all parameters), with color ratings declining below an acceptable value of 8.0. An irrigation/potential ET (I/ETo) threshold of 0.80 was determined for both color and cover. After a 71-day recovery period, both color and cover returned to pre-experimental values at the two higher N rates. Results of this experiment indicate that implementing a twice-weekly irrigation strategy at a −0.15 LF on tall fescue during summer months in an arid environment would save 37.5 cm of water while still maintaining acceptable color and cover ratings.
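The scheduling arithmetic behind these treatments is compact. Given the definition LF = drainage volume/irrigation volume and the simplifying assumption that drainage equals irrigation minus ET, a target LF maps to I = ET/(1 − LF). The Python sketch below is illustrative only (not the study's code); the 5 mm per day ET demand is a hypothetical input, not a measured value:

    # Illustrative sketch, not the authors' code.
    # Assumption: drainage ~ irrigation - ET, so LF = (I - ET)/I  =>  I = ET/(1 - LF).

    def irrigation_for_lf(et_mm: float, lf: float) -> float:
        """Irrigation depth (mm) that targets leaching fraction `lf` for a given ET (mm)."""
        if lf >= 1.0:
            raise ValueError("leaching fraction must be < 1")
        return et_mm / (1.0 - lf)

    # Hypothetical ET demand of 5 mm/day over the 119-day deficit period.
    et_total = 5.0 * 119  # mm
    for lf in (0.15, 0.00, -0.15, -0.25, -0.40):
        i = irrigation_for_lf(et_total, lf)
        print(f"LF {lf:+.2f}: apply {i:.0f} mm, I/ET = {1 / (1 - lf):.2f}")

In this simplified sketch (which ignores the distinction between actual and potential ET), the negative LFs are deficit treatments: LF = −0.15 gives I/ET ≈ 0.87 and LF = −0.40 gives I/ET ≈ 0.71, bracketing the 0.80 threshold reported above for acceptable color and cover.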
Abstract
A 2-year study was conducted to quantify the actual evapotranspiration (ETa) of three woody ornamental trees grown under three different leaching fractions (LFs). Argentine mesquite (Prosopis alba Grisebach), desert willow [Chilopsis linearis (Cav.) Sweet var. linearis], and southern live oak (Quercus virginiana Mill.) (nursery seedling selection) were planted as 3.8-, 18.9-, or 56.8-liter container nursery stock outdoors in 190-liter plastic lysimeters in which weekly hydrologic balances were maintained. Weekly storage changes were measured with a portable hoist-load cell apparatus. Irrigations were applied to maintain LFs of +0.25, 0.00, or −0.25 (theoretical) based on the equation irrigation (I) = ETa/(1 − LF). Tree height, trunk diameter, canopy volume, leaf area index, total leaf area (oak only), and dry weight were monitored during the experiment or measured at final harvest. Average yearly ETa was significantly influenced by planting size (oak and willow, P ≤ 0.001) and by the leaching fraction imposed (P ≤ 0.001). Multiple regressions accounting for the variability in average yearly ETa comprised different growth and water management variables depending on the species. LF, trunk diameter, and canopy volume accounted for 92% (P ≤ 0.001) of the variability in the average yearly ETa of oak. Monthly ETa data were also evaluated, with multiple regressions based on data from non-water-deficit trees so that LF could be ignored. In the case of desert willow, monthly potential ET and trunk diameter accounted for 88% (P ≤ 0.001) of the variability in monthly ETa. Results suggest that irrigators could apply water to arid urban landscapes more efficiently if irrigations were scheduled based on such information.
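The weekly bookkeeping described here is a container water balance: ETa is backed out of applied water, drainage, and the weighed storage change, and the next irrigation follows from the stated equation I = ETa/(1 − LF). A minimal Python sketch, with hypothetical names and numbers (precipitation is included for completeness even though it is negligible in Las Vegas):

    from dataclasses import dataclass

    @dataclass
    class WeeklyBalance:
        irrigation_l: float      # water applied (liters)
        precipitation_l: float   # rain caught by the lysimeter (liters)
        drainage_l: float        # leachate collected (liters)
        storage_change_l: float  # from the hoist-load cell weighing (1 kg ~ 1 L)

        def eta(self) -> float:
            """Actual evapotranspiration for the week (liters)."""
            return (self.irrigation_l + self.precipitation_l
                    - self.drainage_l - self.storage_change_l)

    def next_irrigation(eta_l: float, lf: float) -> float:
        """Irrigation for the coming week targeting leaching fraction `lf`."""
        return eta_l / (1.0 - lf)

    week = WeeklyBalance(irrigation_l=40.0, precipitation_l=0.0,
                         drainage_l=8.0, storage_change_l=-2.0)
    eta = week.eta()                      # 40 + 0 - 8 - (-2) = 34 L
    print(next_irrigation(eta, lf=0.25))  # ~45.3 L for LF = +0.25

Note that a "theoretical" LF of −0.25 simply scales the coming week's irrigation to 0.8 × ETa; actual drainage can then be zero, which is presumably why the negative value is theoretical rather than measured.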
Abstract
When states are compared by the percentage of their population residing in major cities, Nevada ranks as the third most urban state in the nation. It also has the distinction of being the driest, with less than 4 inches of precipitation annually in the Las Vegas Valley. Nevada uses 280,000 acre-feet of water annually from its 300,000 acre-feet allotment from the Colorado River. Approximately 60% of this is used for urban landscaping. With average water use previously exceeding 300 gallons per person per day, Las Vegans have been criticized as “water-wasters.” Rising water prices and an active research and extension education program, begun in 1985 and supported by the local water utility, have contributed to changing water use patterns and a reduction in water use. Research, educational programs for commercial landscapers, and home horticulture programs conducted through Master Gardeners have helped to reduce water use in the Las Vegas Valley while providing information on sound horticultural practices.
Abstract
A study was conducted to evaluate the effect of banding or broadcasting fertilizer on yield and quality of turnip (Brassica rapa L. Rapifera group), sweetcorn (Zea mays var. rugosa Bonaf.), and cabbage (Brassica oleracea L. Capitata group). Preplant fertilizer was applied broadcast before bedding, broadcast after bedding, or banded after bedding. Sidedress applications were broadcast or banded on the beds. Differences in plant size and vigor were noticed early in the season in the spring turnip crop, with the growth in the broadcast-and-bed treatment appearing superior. The yield at first harvest and total yield were lower for turnip grown with the bed-and-broadcast treatment. No differences in yield of cabbage or sweetcorn resulted from the treatments. Few differences in turnip stem-to-leaf ratio were noted due to fertilizer treatment. Few differences in yield due to sidedress method were noted with any of the crops. Analysis of soil samples in a grid pattern across the beds showed that the location of the fertilizer after the broadcast-and-bed treatment was similar to the placement of the banded fertilizer. Since broadcasting can be done with a faster, wider applicator, growers could reduce costs by broadcasting fertilizer and obtain yields that are at least equivalent to the yields obtained by banding the fertilizer.
Abstract
Golf course superintendents in the southwestern United States (Tucson, Ariz.; Phoenix, Ariz.; Las Vegas, Nev.; Orange County, Calif.) were surveyed to assess attitudes toward using reuse water for irrigation. Eighty-nine golf course personnel returned the survey, with 28% indicating that they irrigate with municipal water, 36% with well water, and 27% with reuse water. The reason for switching to reuse water varied by state: 40% of respondents in Arizona switched because of mandates, 47% in Nevada because of cost incentives, and 47% in California because reuse water was considered a more reliable source. Less than 20% of the respondents rated the use of reuse water on golf courses and parks as having a negative impact on cost, the environment, and health. However, respondents indicated that using reuse water does have a negative impact on golf course operations, with pond maintenance and irrigation maintenance rated as having the greatest negative impact (∼80%). Multiple regression analysis revealed that among those who indicated that using reuse water would have a negative impact on golf course management, a higher percentage were individuals with more years of experience irrigating with reuse water (P = 0.01) and individuals who had taken classes on how to use reuse water (P = 0.05). Respondents who currently irrigate with reuse water indicated that they had changed a wide range of landscape and turfgrass management practices as a result of using reuse water. Based on the results of this survey, it was concluded that golf course personnel in the southwestern U.S. do not oppose the transition to reuse water for irrigation. However, it was also clear that they recognize such water negatively affects their golf courses' operations.
Abstract
Population growth and water limitations in the southwestern United States have led many communities to encourage or mandate that golf courses transition to reuse water for irrigation. A monitoring program was conducted on nine golf courses in the Las Vegas Valley, NV, for 4.5 years to assess the impact of reuse water on soil–turfgrass systems {bermudagrass [Cynodon dactylon (L.) Pers.], perennial ryegrass (Lolium perenne L.), bentgrass (Agrostis palustris Huds.)}. The nine courses selected included three long-term reuse courses, three fresh water courses, and three courses expected to transition to reuse water during the monitoring period. Near-surface soil salinity varied from 1.5 to 40.0 dS·m−1 during the study period, with the highest peaks occurring during summer months and on long-term reuse-irrigated fairways. Although soil salinity at several depths on fairways and greens increased after transition to reuse water, this did not lead to a systematic decline in leaf xylem water potential (ΨL) or color. When the data were grouped as fresh, transition, or reuse irrigated, soil salinity on reuse courses was statistically higher (P < 0.05) than on fresh and transitional courses, yet plant response on reuse courses was not statistically different (P > 0.05) from that observed on fresh courses. The facts that summertime plant parameter values often declined under lower salinity levels and that the electrical conductivity of the irrigation water was rejected as a significant variable in all backward regression analyses of plant response indicated that management differed significantly from course to course. We conclude that proper irrigation management, based on a multitiered feedback system (soil–plant–atmosphere monitoring), should be able to maintain favorable salt balances and plant response as long as irrigation volumes are not restricted to the point where deficit irrigation occurs.