The ability to track moisture content using soil moisture sensors in green stormwater infrastructure (GSI) systems allows us to understand a system's water management capacity and recovery. Soil moisture sensors have been used to quantify infiltration and evapotranspiration in GSI practices preceding, during, and following storm events. Although useful, soil moisture sensors often require soil-specific calibration, as small measurement variations can lead to misinterpretation of the water budget and the associated GSI performance. The purpose of this research is to quantify the uncertainties that cause discrepancies between default (factory general) and calibrated sensor soil moisture measurements within a subsurface layer of GSI systems. The study uses time domain reflectometry soil moisture sensors, which rely on the ambient soil's dielectric properties, under different soil setups in the laboratory and the field. The default 'loam' calibration was compared to soil-specific (loamy sand) calibrations developed from laboratory and GSI field data. The soil-specific calibration equations used a correlation between dielectric properties (real dielectric permittivity, εr, and apparent dielectric permittivity, Ka) and the volumetric water content obtained from gravimetric samples. A paired t-test was conducted to assess statistical significance between the datasets. Between laboratory and field calibrations, the field calibration was preferred, as it showed less variation between the factory general soil moisture readings and the gravimetric soil moisture tests. Real dielectric permittivity (εr) and apparent permittivity (Ka) were both explored as calibration options and produced very similar calibrations, with the largest differences at saturation: the εr calibration produced a 6% difference in soil moisture measurement at saturation, while the Ka calibration produced a 3% difference. Ka was chosen over εr because it provided an adequate representation of the soil and is more widely used in soil sensor technology. With the field calibration implemented, the average desaturation time of the GSI was faster by an hour, and the recovery time was quicker by a day. GSI recovery typically takes place within 1–4 days, so an extension of a day in recovery could lead to the conclusion that the system is underperforming, when the apparent delay is actually a limitation of the soil moisture sensors' default calibration.
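The two steps the abstract describes — converting a dielectric reading (Ka) to volumetric water content via a calibration equation, then comparing paired soil moisture estimates with a paired t-test — can be sketched as follows. The study's own soil-specific equations are not reproduced in the abstract, so the widely used Topp et al. (1980) polynomial stands in here as an illustrative Ka-based calibration, and the paired readings are hypothetical values, not data from the paper.

```python
import statistics

def topp_vwc(Ka):
    """Topp et al. (1980) polynomial: apparent permittivity Ka -> volumetric
    water content (m^3/m^3). Illustrative stand-in for a soil-specific fit."""
    return -5.3e-2 + 2.92e-2 * Ka - 5.5e-4 * Ka**2 + 4.3e-6 * Ka**3

# Hypothetical paired VWC estimates at the same times/locations:
# default (factory 'loam') calibration vs. soil-specific field calibration.
default_vwc    = [0.12, 0.18, 0.25, 0.31, 0.38]
calibrated_vwc = [0.10, 0.17, 0.23, 0.30, 0.35]

# Paired t-test: t statistic on the per-sample differences,
# to be compared against a t distribution with n - 1 degrees of freedom.
diffs = [d - c for d, c in zip(default_vwc, calibrated_vwc)]
n = len(diffs)
mean_d = statistics.mean(diffs)
sd_d = statistics.stdev(diffs)          # sample standard deviation
t_stat = mean_d / (sd_d / n**0.5)
```

In practice the soil-specific coefficients would be fitted by regressing gravimetric VWC against the sensors' Ka readings for the loamy sand, and `scipy.stats.ttest_rel` would also give the t statistic together with its p-value.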