Trials in nine commercial celery (Apium graveolens L.) fields were conducted from 1997 to 1999 to evaluate grower drip irrigation management practices and their effects on yield and quality. Surface drip irrigation tapes with flow rates higher and lower than the grower-installed tapes were spliced into the field system; as the cooperating growers irrigated and applied N fertigation according to their routine practices, these drip tapes delivered either more or less water and N than the field drip system. Total grower water application during the drip-irrigated portion of the season ranged from 85% to 414% of seasonal reference evapotranspiration (ETo). Water volume per irrigation varied among fields from 1.8 to 3.8 cm, with irrigation frequency varying from an average of every other day to once a week. Grower management of drip irrigation was not consistently successful in maintaining soil water tension (SWT) in a desirable range. SWT was often below -30 kPa, and in some cases below -70 kPa. These transient stresses were more often a result of inappropriate irrigation frequency than of applied water volume. In four of the fields, plots receiving less water than that delivered by the field system produced equivalent marketable yield and quality, indicating a significant potential for water savings. An economically important incidence of petiole pithiness (collapse of parenchyma tissue) was observed in four fields. Infrequent irrigation under high ETo summer conditions, rather than irrigation volume applied, appeared to be the major factor in pith development. N fertigation amount and crop N status appeared to be unrelated to pithiness severity. We conclude that celery drip irrigation management could be substantially improved by maintaining a closer proportionality between irrigation and crop evapotranspiration (ETc), increasing irrigation frequency, and reducing volume per irrigation.
S.J. Breschini and T.K. Hartz
Trials were conducted in 15 commercial fields in the central coast region of California in 1999 and 2000 to evaluate the use of presidedress soil nitrate testing (PSNT) to determine sidedress N requirements for production of iceberg and romaine lettuce (Lactuca sativa L.). In each field a large plot (0.2-1.2 ha) was established in which sidedress N application was based on presidedress soil NO3-N concentration. Prior to each sidedress N application scheduled by the cooperating growers, a composite soil sample (top 30 cm) was collected and analyzed for NO3-N. No fertilizer was applied in the PSNT plot at that sidedressing if NO3-N was >20 mg·kg-1; if NO3-N was lower than that threshold, only enough N was applied to increase soil available N to ≈20 mg·kg-1. The productivity and N status of PSNT plots were compared to adjacent plots receiving the growers' standard N fertilization. Cooperating growers applied a seasonal average of 257 kg·ha-1 N, including one to three sidedressings containing 194 kg·ha-1 N. Sidedressing based on PSNT decreased total seasonal and sidedress N application by an average of 43% and 57%, respectively. The majority of the N savings achieved with PSNT occurred at the first sidedressing. There was no significant difference between PSNT and grower N management across fields in lettuce yield or postharvest quality, and only small differences in crop N uptake. At harvest, PSNT plots had on average 8 mg·kg-1 lower residual NO3-N in the top 90 cm of soil than the grower fertilization rate plots, indicating a substantial reduction in subsequent NO3-N leaching hazard. We conclude that PSNT is a reliable management tool that can substantially reduce unnecessary N fertilization in lettuce production.