Scheduling irrigation from wetting front depth.
Date
2016-07-07
Author
Stirzaker, Richard
Maeko, Tshepo C.
Annandale, John G.
Steyn, Martin
Adhanom, Goitom T.
Mpuisang, Thembeka
Abstract
Irrigation scheduling is often based on the analogy of a ‘tipping bucket’ and the measurement or prediction of the amount of water stored within the bucket. We compare this conventional approach with one that stops irrigation when the bucket tips, i.e. when infiltrating water moves from an upper to a lower soil layer. Electronic wetting front detectors were used to close a solenoid valve at the moment infiltrating water reached a depth of 300 mm while irrigating a lucerne crop in a rain-out shelter. Four different ways of using information on the position of the wetting front were compared with scheduling irrigation from soil water measurements made by a neutron probe or calculated by a soil-crop model. Automatically closing a solenoid valve when the upper bucket tipped was a successful approach, but only when the correct irrigation interval was selected. If the irrigation interval was too short, water still draining from the soil layer above the detector resulted in deep drainage. Scheduling from wetting front detectors placed at 600 mm depth was unsuccessful because of the difficulty of detecting weak wetting fronts at this depth. The commonly accepted method of measuring a soil water deficit and refilling the bucket to field capacity was not without limitation: since the soil drained for many days after irrigation, well beyond the 48 h period typically selected to represent the upper drained limit, drainage and evapotranspiration occurred concurrently.
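The automated treatment described here amounts to a simple feedback rule: open the valve at the chosen irrigation interval and close it as soon as the detector at 300 mm signals that the wetting front has arrived. The sketch below illustrates that control loop in Python; the function names, polling scheme, and safety timeout are assumptions for illustration only, not the authors' implementation or hardware interface.

```python
# Minimal sketch of the "close the valve when the upper bucket tips" rule,
# assuming a detector that reports a boolean "front arrived" signal and a
# valve with open/close actions. All names here are hypothetical.

import time
from typing import Callable


def run_irrigation_event(open_valve: Callable[[], None],
                         close_valve: Callable[[], None],
                         front_detected: Callable[[], bool],
                         max_duration_s: float = 3600.0,
                         poll_s: float = 5.0) -> None:
    """Open the valve, then close it as soon as the wetting front detector
    at 300 mm trips, or when a safety time limit expires."""
    open_valve()
    start = time.monotonic()
    try:
        while time.monotonic() - start < max_duration_s:
            if front_detected():
                break  # the upper 'bucket' has tipped; stop irrigating
            time.sleep(poll_s)
    finally:
        close_valve()  # always shut the valve, even on timeout or error
```

As the abstract notes, this kind of automation only succeeds if the irrigation interval itself is chosen correctly: the loop can end an over-long irrigation event, but it cannot compensate for irrigating too frequently.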