Research Summaries

Owens, M. K. and G. W. Owens.
2007. Saltcedar water use: realistic and unrealistic expectations. Rangeland Ecology and Management 60:553-557.

Saltcedar (Tamarix spp.) is a widespread invasive plant found in riparian corridors and floodplains in 16 western states. In addition to being associated with such problems as increased soil salinity and decreased plant diversity, saltcedar has been reported to be a prolific water user. Popular press articles widely report that an individual saltcedar tree can use as much as 757 L (200 gallons) of water per day. Consequently, massive control and removal efforts are underway to reduce transpirational water loss and increase water salvage in arid and semiarid environments. Although the potential economic benefits of these control efforts are touted, it has not been demonstrated that such water savings are achievable at the stream level. The original citation for the 757-L estimate does not describe the experimental design or techniques used to arrive at this value. In the current study, three lines of evidence (peer-reviewed scientific literature, sap flux rates and sapwood area, and potential evaporation rates) indicate that it is highly improbable that saltcedar, or any other woody species, can use this much water per tree on a daily basis. A more realistic estimate of maximum tree-level daily water use, derived from sap flux measurements, is <122 L·d-1 (32.2 gallons). Water salvage estimates based on the popular value (757 L·d-1) would be grossly inflated, and therefore economic benefit estimates of saltcedar control based solely on water salvage are questionable.
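The gap between the two figures can be made concrete with a quick back-of-the-envelope calculation. This is an illustrative sketch only: the constants are the two per-tree values quoted above, and the resulting ratio is an inference for illustration, not a number reported in the paper.

```python
# Hypothetical comparison of the popular per-tree water-use figure with the
# sap-flux-based maximum reported in the study (values as quoted above).

POPULAR_L_PER_DAY = 757.0       # popular press figure ("200 gallons per day")
SAP_FLUX_MAX_L_PER_DAY = 122.0  # realistic maximum from sap flux measurements

# Ratio by which salvage projections built on the popular figure would
# overstate per-tree water use.
overestimate = POPULAR_L_PER_DAY / SAP_FLUX_MAX_L_PER_DAY
print(f"Popular figure exceeds the sap-flux maximum by ~{overestimate:.1f}x")
```

Running this shows the popular figure exceeds the sap-flux-based maximum by roughly sixfold, which is why salvage and economic projections built on the 757-L value are described as grossly overestimated.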


Simon, A., M. Doyle, M. Kondolf, F.D. Shields Jr., B. Rhoads, and M. McPhillips.
2007. Critical evaluation of how the Rosgen classification and associated “natural channel design” methods fail to integrate and quantify fluvial processes and channel response. Journal of the American Water Resources Association 43(5):1117-1131.

Over the past 10 years, the Rosgen classification system and its associated methods of “natural channel design” have become synonymous to some with the term “stream restoration” and with the science of fluvial geomorphology. Since the mid-1990s, this classification approach has been widely adopted by governmental agencies, particularly those funding restoration projects. The purposes of this article are to present a critical review, highlight inconsistencies, and identify technical problems with Rosgen’s “natural channel design” approach to stream restoration. Simon et al.’s primary thesis is that alluvial streams are open systems that adjust to altered inputs of energy and materials, and that a form-based system largely ignores this critical component. Problems with using the classification arise when evaluating bankfull dimensions, particularly in incising channels and in reaches where bed and bank sediments intergrade, leaving no evidence of bankfull flows. Its use for engineering design and restoration may be flawed because it ignores processes governed by force and resistance, and the imbalance between sediment supply and transporting power in unstable systems. An example shows how C5 channels composed of different bank sediments adjust differently, and to different equilibrium morphologies, in response to an identical disturbance. This contradicts the fundamental underpinning of “natural channel design” and the “reference-reach approach.” The Rosgen classification is probably best applied as a communication tool to describe channel form; in combination with “natural channel design” techniques, it is not diagnostic of how to mitigate channel instability or predict equilibrium morphologies.
For this, physically based, mechanistic approaches that quantify the driving and resisting forces controlling active processes and ultimate channel morphology are better suited, because the physics of erosion, transport, and deposition are the same regardless of hydro-physiographic province or stream type.


Wilcox, J. C., M. T. Healy, and J. B. Zedler.
2007. Restoring native vegetation to an urban wet meadow dominated by reed canarygrass (Phalaris arundinacea L.) in Wisconsin. Natural Areas Journal 27:354-365.

The perennial grass Phalaris arundinacea (reed canarygrass) is a widespread invader of North American wetlands. Wilcox et al. examined whether herbicide combined with burning, clipping, or seeding could reduce the cover of P. arundinacea and increase the cover of native species in a wet prairie that receives stormwater runoff. Two glyphosate treatments decreased P. arundinacea cover, and spring seeding of 33 native species doubled species richness and floristic quality compared to no seeding. Despite an initial decrease in abundance, P. arundinacea cover was no different from that in control plots two years after seeding, and overall native species richness and cover decreased from the previous year. Application of the grass-specific herbicide sethoxydim in the third year of the study reduced P. arundinacea cover and height while allowing native forbs and graminoids to persist. Most of the 75 species identified during the study were perennial native forbs, and most of the sown species that commonly established in the field had high germination rates in the laboratory. Continued management of P. arundinacea is needed to maintain desirable native wetland flora.


Bailey, D. W., H. C. Van Wagoner, R. Weinmeister, and D. Jensen.
2008. Evaluation of Low-Stress Herding and Supplement Placement for Managing Cattle Grazing in Riparian and Upland Areas. Rangeland Ecology & Management 61:26-37.

Management practices are often needed to ensure that riparian areas are not heavily grazed by livestock. A study was conducted in Montana during midsummer to evaluate the efficacy of low-stress herding and supplement placement to manage cattle grazing in riparian areas. Three treatments were evaluated in three pastures over a 3-yr period in a Latin-square design (n=9). Each year, naïve 2-yr-old cows with calves were randomly assigned to the three treatments: 1) free-roaming control, 2) herding from perennial streams to upland target areas, and 3) herding to upland sites with low-moisture block supplements. Stubble heights along the focal stream were higher (P=0.07) in pastures when cattle were herded (mean±SE, 23±2 cm) than in controls (15±3 cm). Global positioning system telemetry data showed that herding reduced the time cows spent near (<100 m) perennial streams (P=0.01) and increased the use of higher elevations (P=0.07) compared with controls. Evening visual observations provided some evidence that free-roaming cows (44%±19%) were in riparian areas more frequently (P=0.11) than herded cows (23%±6%). Fecal abundance along the focal stream was less (P=0.07) with herding (61.9±11.4 kg·ha-1) than in controls (113.2±11.4 kg·ha-1). Forage utilization within 600 m of supplement sites was greater (P=0.06) when cows were herded to low-moisture blocks (18%±6%) compared with controls and herding alone (8%±2%). Moving cattle to uplands at midday using low-stress herding is an effective tool to reduce use of riparian areas. Herding cattle to low-moisture blocks can increase grazing of nearby upland forage but may not provide additional reduction in cattle use of riparian areas compared with herding alone.