Journal of the NACAA
ISSN 2158-9429
Volume 7, Issue 1 - May, 2014

Effects of On-Farm Management Practices on Round Bale Silage Fermentation

Shockey, W.L., Extension Agent and Extension Associate Professor, Preston County, WVU Extension Service
Straight, A.M., Extension Agent and Extension Instructor, Ritchie County, WVU Extension Service
Rayburn, E.B., Forage Specialist and Professor, WVU Extension Service
Loyd, B.M., Extension Agent and Extension Associate Professor, Lewis County, WVU Extension Service


Abstract

Round bale silage (baleage) is often produced on West Virginia farms to overcome the difficulty of making high-quality dry hay under wet, humid conditions. Toxicity and death have been reported in cattle that consumed baleage spoiled because of improper bale management. On-farm management and fermentation data were collected to identify the risk factors that contribute most to poorly preserved baleage. Our data show that success in baleage production can be consistently achieved by following best management practices that emphasize optimum forage moisture (50 to 60%) and bale density (> 20 pounds per cubic foot).


Introduction

Climatic conditions in the Appalachian region of the US are generally conducive to forage production for use as ruminant livestock feed. Unfortunately, the wet, humid conditions that stimulate forage growth make it difficult to cut, dry, and bale high-quality hay crops. Many farmers have therefore adopted the practice of making hay-crop silage from plastic-wrapped large round hay bales, a product commonly called baleage.

A fermentation that results in a low pH and well-preserved forage requires 1) the presence of lactic acid-producing bacteria, 2) a sufficient supply of fermentable carbohydrate, and 3) exclusion of oxygen to create the anaerobic conditions necessary to produce the fermentation acids that preserve the forage and prevent secondary fermentations.

When wilted hay is baled at a moisture content of 50 to 60% and then wrapped with plastic to exclude air, an anaerobic environment forms within the bale that allows lactic acid-producing bacteria to ferment soluble forage carbohydrates. The amount of fermentable forage carbohydrate tends to decrease as plant maturity increases. The primary end products of the fermentation are lactic and acetic acids, which lower the pH of the forage mass to approximately 5.0 and prevent it from spoiling.

Best management practices (BMPs) that minimize the risk of spoilage include 1) cutting at early stages of maturity, 2) wilting to 50 to 60% moisture, 3) making tight, dense bales, 4) wrapping bales within 2 hours after baling, 5) storing bales in a clean, well-drained area to minimize tears and damage, and 6) inspecting bales regularly and repairing tears and holes with tape designed for that purpose (Tietz, 2007; Sullivan and McKinlay, 1998; McCormick, 2006).

Field observations reported by West Virginia University Extension Service (WVU-ES) personnel indicate that conditions often exist that cause the hay to ferment incompletely. When this happens, Clostridia may reproduce in sufficient numbers to produce butyric acid and ammonia, which prevent the pH from dropping below 5.5. The continued microbial activity that follows can cause a buildup of toxins (Kung, 2002; Sylvester, 2007; Bagley, 1998) that impair livestock health and can cause death.

Cattle death losses due to baleage that did not ferment properly have been reported every year. On beef farms, such incidents have killed five or more animals, with economic losses of $10,000 or more per incident.

Based on forage analysis of baleage associated with cattle deaths, WVU-ES faculty found that the baleage had not fermented adequately to protect the forage from bacterial or fungal contamination. Such conditions can cause livestock death from botulism, listeriosis, or mycotoxin poisoning (Bagley, 1998).

A research project was conducted to determine the effects of local management practices on fermentation characteristics of forage harvested as baleage in West Virginia.

Materials and Methods

County extension agents recruited one or more farmers to allow collection of management data and forage samples. The data collection tool is shown in Figure 1. Management data collected included date of cutting, width of forage swath after cutting (full swath or windrow), time between cutting and baling, whether the forage was raked or tedded, windrow moisture at baling, time between baling and wrapping, characteristics of the wrapper, characteristics of the wrap, layers of plastic, percent stretch of plastic, inoculant or chemical treatment if any, weight of bale, diameter of bale, and length of bale.
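Because bale density figures prominently in the analysis that follows, the recorded weight, diameter, and length can be converted to density (pounds per cubic foot) by treating the bale as a cylinder. The following is a minimal sketch of that calculation; the function and variable names are ours, not part of the study's data collection tool:

```python
import math

def bale_density(weight_lb: float, diameter_ft: float, length_ft: float) -> float:
    """Approximate bale density in lb/ft^3, modeling the bale as a cylinder."""
    volume_ft3 = math.pi * (diameter_ft / 2) ** 2 * length_ft
    return weight_lb / volume_ft3

# Example: a 1,000-lb bale, 4 ft in diameter and 4 ft long,
# has a density of about 19.9 lb/ft^3 -- just under the 20 lb/ft^3 target.
density = bale_density(1000, 4, 4)
```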

Six bales from each farm (or, when appropriate, from each field within a farm) were sampled using a Penn State core sampler after at least 30 days of fermentation (most after more than 90 days). For each farm or treatment, 4 cores of forage were removed from each of the 6 bales and all 24 core samples were thoroughly mixed. A sample of this mixed forage was packed and sealed into a quart-sized plastic freezer bag and kept in an ice-filled cooler for transport to a freezer. Samples were stored frozen (0 degrees Fahrenheit) for at least 24 hours, then the sealed, frozen samples were sent by overnight UPS service to the Dairy One Forage Analysis Laboratory for chemical analysis; keeping the samples frozen reduced the risk of secondary fermentations.

Moisture, crude protein, neutral detergent fiber, and acid detergent fiber were analyzed by NIRS. Industry standard wet chemistry analysis was used to measure pH, lactic acid, propionic acid, and butyric acid.

Results were summarized and analyzed using appropriate models for analysis of variance or least squares regression techniques (Hintze, 1998).

Results and Discussion

A total of 79 forage samples were collected in 2011 and 2012. The effects of management and weather factors on baleage fermentation quality, as estimated by forage pH, are shown in Table 1. Across these 79 fields, there was no significant effect on baleage pH of cutting forage into a wide swath versus a windrow.

When averaged over all samples, there was only a slight tendency for rain damage to raise final pH (P = .21); however, the pH of wet forage (50 to 70% moisture) that was rained on between cutting and baling was significantly (P = .08) higher than that of baleage made from forage not rained on. Forage that is rain-damaged during the wilting process loses fermentable carbohydrate through leaching. This implies that forage baled at optimum moisture levels, with its full complement of nonstructural carbohydrate, reaches a lower final baleage pH than rain-damaged forage.

Tedding is used to increase the drying rate of forage that has been cut for hay. When used on forage destined for baleage, however, tedding was detrimental to preservation as measured by pH when averaged over all samples (P = .17); the effect was highly significant (P = .01) for forage baled at optimal moisture levels (50 to 70% moisture). This effect has not been previously reported in the scientific literature, and the authors can only speculate that tedding increases the wilted forage's surface contact with air, inhibiting the establishment of lactic acid-producing bacteria on the forage surface. In addition, the orientation of hay stems becomes random after tedding, reducing packing and the exclusion of air from the bale.

The single biggest factor determining the fermentation quality of baleage, as estimated by final forage pH, was forage moisture content. The data (Figure 2) clearly show that the majority of bales containing more than 50% moisture reached a final pH of 5.5 or lower (lower right quadrant), while the opposite was true for bales containing less than 50% moisture (upper left quadrant). Bagley (1998) warns that bales with pH greater than 5.5 should be monitored closely for molds and evidence of Clostridia, and that bales with pH greater than 6.0 should be discarded.

Of the 11 bales in the lower left quadrant of Figure 2, 10 had a density greater than 20 pounds per cubic foot. This indicates that high bale density can sometimes overcome the negative effects of baling below 50% moisture. Conversely, the data points in the upper right quadrant show that even when forage moisture was greater than 50%, a final pH below 5.5 was not always attained, such as when rain damage occurred or bale density was not properly monitored. Figure 3 supports that observation, showing that a minimum bale density of 20 pounds per cubic foot can help ensure a final pH lower than 5.5.
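The pH thresholds discussed above amount to a simple screening rule for sampled bales. The following sketch encodes Bagley's guidance (monitor above pH 5.5, discard above pH 6.0); the function name and return strings are ours:

```python
def baleage_status(final_ph: float) -> str:
    """Classify a sampled bale by final pH, per the thresholds cited in the text."""
    if final_ph > 6.0:
        return "discard"            # fermentation failed; spoilage risk too high
    if final_ph > 5.5:
        return "monitor"            # watch for molds and evidence of Clostridia
    return "well preserved"         # fermentation reached a protective pH

print(baleage_status(5.2))  # well preserved
print(baleage_status(5.8))  # monitor
print(baleage_status(6.3))  # discard
```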

Figure 4 shows a positive association between forage moisture and bale density, indicating that bale density can be improved by baling at higher moisture levels. The combined relationship of bale density and forage moisture to final pH is illustrated in the pH contour graphic in Figure 5. The small legend defines the line color associated with pH contours from pH 5.0 to 7.0 in increments of 0.5; the large legend defines the color associated with the final pH of the baleage at each combination of bale density and forage moisture. The pH contours show that a bale density greater than 25 pounds per cubic foot results in well-preserved forage, as measured by final pH, even when forage moisture is below 50%. They also show that forage baled at optimum moisture levels (50 to 65%) always reaches a final pH low enough to properly preserve the bales, unless density is less than 18 pounds per cubic foot.
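The contour observations above can be restated as a rough two-variable screen. This sketch uses the approximate thresholds read from Figure 5 (density > 25 lb/ft³, the 50 to 65% optimum moisture range, and the 18 lb/ft³ floor); it is an illustration of the pattern in our data, not a validated decision tool:

```python
def fermentation_risk(moisture_pct: float, density_lb_ft3: float) -> str:
    """Rough screen for baleage fermentation risk based on the pH-contour
    pattern described in the text. Thresholds are approximate."""
    if density_lb_ft3 > 25:
        # Dense bales preserved well even below 50% moisture.
        return "low risk"
    if 50 <= moisture_pct <= 65 and density_lb_ft3 >= 18:
        # Optimum moisture with adequate density.
        return "low risk"
    return "elevated risk"

print(fermentation_risk(45, 27))  # low risk (density compensates for dry forage)
print(fermentation_risk(55, 20))  # low risk (optimum moisture, adequate density)
print(fermentation_risk(55, 15))  # elevated risk (bale too loose)
```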


Summary

Analysis of baleage samples taken under on-farm conditions confirmed that forage moisture content and bale density were the two most important parameters affecting bale quality as measured by final pH. Success in baleage production can be consistently achieved by following best management practices, with emphasis on baling at optimum forage moisture (50 to 60%) and making dense bales (> 20 pounds per cubic foot).


Acknowledgments

The following faculty and staff provided essential contributions to this study, which could not have been completed without their support: J.J. Barrett, Wood County; Brandy Brabham, Roane County; Debbie Friend, Braxton County; Greg Hammons, Pocahontas County; Ronnie Helmondollar, Randolph County; Sheryl Jarvis, Monongalia County; John David Johnson, Jackson County; Jennifer Poling, Tucker County; David Richmond, Raleigh and Summers County; H.R. Scott, Monongalia County; David Seymour, Pendleton County; Brian Sparks, Nicholas County; Les Vough, University of MD, Emeritus; Rodney Wallbrown, Mason County; Brian Wickline, Monroe County; and David Workman, Hardy County.

Literature Cited

Bagley, C.V. 1998. Beware! – Botulism. Utah State University, Dairy Veterinary Newsletter, Vol. 21, No. 5. Accessed March 12, 2014.

Hintze, J.L. 1998. NCSS 2000 Statistical System. Number Cruncher Statistical Systems, Kaysville, UT 84037.

Kung, Jr., L. 2002. Botulism in Cattle. Delaware Cooperative Extension Service on-line reference. Accessed March 12, 2014.

Sullivan, P., and J. McKinlay. 1998. Maintaining Quality in Large Bale Silage. Fact Sheet. Ontario Ministry of Agriculture and Food. Accessed March 12, 2014.

Sylvester, P. 2007. Botulism, Forages, and Livestock. Delaware Cooperative Extension Service on-line reference. Accessed March 12, 2014.

Tietz, N. 2007. Best and Worst Silage. Hay and Forage Grower. Accessed March 12, 2014.