A statistical analysis of fertilizer application rate differences between 1989 and 1990
Harold H. Taylor
Yearly variation in fertilizer application rates derived from a probability-based survey can be explained by a variety of factors. Sample size, design, and data estimates, as well as interviewer error are contributing factors. In addition, farmers may adjust the amount per application and the number of applications per crop year. Farmers make modifications when future market conditions are expected to change, when Government programs or program participation levels change, when excessive wet or dry weather conditions exist, when purchase costs of fertilizer change, or when nutrient carryover in the soil from the previous year varies significantly from expectations. The goal of this report is to analyze the difference in mean application rates in 1989 and 1990, if any, under the assumption of no statistical difference.
Fertilizer application rates for corn, cotton, soybeans, rice, and wheat were estimated from a survey of fields in the major producing States for 1989 and 1990. The quantity of nitrogen, phosphate, and potash applied per fertilizer application for each selected field, together with the number of applications per field, was used to estimate per-acre application rates of nitrogen, phosphate, and potash. The survey design determined how field-level data (average application rates for each field in the survey) would be weighted to derive per-acre average application rates representative of each State and of the total sample area.
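The weighting step described above can be sketched as a survey-weighted average. The field rates and expansion weights below are entirely fictitious stand-ins for the survey data, not figures from the report:

```python
import numpy as np

# Hypothetical field-level data for one State: average nitrogen rate
# (lb/acre) observed on each surveyed field, and the survey expansion
# weight (acres each field represents). All numbers are illustrative.
field_rate = np.array([120.0, 135.0, 110.0, 150.0])
weight = np.array([800.0, 1200.0, 500.0, 1500.0])

# State-level per-acre rate = weight-averaged field rate.
state_rate = np.sum(weight * field_rate) / np.sum(weight)
print(round(state_rate, 1))  # -> 134.5
```

The same formula, applied with area-level weights, yields the per-acre averages for the total sample area.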
A General Linear Model was used to evaluate State and area differences in nitrogen, phosphate, and potash application rates between 1989 and 1990 for corn, cotton, soybeans, rice, and wheat, and to determine whether these differences were statistically different from zero at a specified level of significance. The analysis used weighted least squares regression procedures and dummy variables in the model formulations (1,2,3,4,5). For each observation sampled, the computed application rate was the dependent variable, with year (a dummy variable) as the independent variable. When the regression equation was solved, the constant term contained the mean value for 1990, and the coefficient on the dummy variable (year) contained the difference between the means of the two years. The F statistic was then used to test the null hypothesis of no difference in average application rates between 1989 and 1990 at the 10-, 5-, and 1-percent levels of confidence.
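The dummy-variable regression described above can be sketched as follows. The rates and weights are simulated placeholders, not the survey data; the point is only to show that the intercept recovers the weighted 1990 mean, the dummy coefficient recovers the between-year difference, and an F statistic tests whether that difference is zero:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Fictitious per-field nitrogen rates (lb/acre) and survey weights.
rate_1990 = rng.normal(130, 20, size=60)
rate_1989 = rng.normal(127, 20, size=55)
y = np.concatenate([rate_1990, rate_1989])
d = np.concatenate([np.zeros(60), np.ones(55)])  # dummy: 1 if 1989
w = rng.uniform(0.5, 2.0, size=y.size)           # fictitious weights

# Weighted least squares: y = b0 + b1*d, minimizing sum(w * resid^2).
X = np.column_stack([np.ones_like(y), d])
W = np.diag(w)
beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
b0, b1 = beta  # b0 = weighted 1990 mean; b1 = 1989 mean minus 1990 mean

# F test of H0: b1 = 0 (no difference between years), comparing the
# full model against the intercept-only (restricted) model.
resid_full = y - X @ beta
sse_full = np.sum(w * resid_full**2)
mean_all = np.sum(w * y) / np.sum(w)
sse_restricted = np.sum(w * (y - mean_all)**2)
df = y.size - 2
F = (sse_restricted - sse_full) / (sse_full / df)
p = 1.0 - stats.f.cdf(F, 1, df)
print(b0, b1, F, p)
```

Comparing `p` against 0.10, 0.05, and 0.01 reproduces the three significance thresholds used in the tables.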
The level of confidence indicates the degree of reliability placed on the test of the hypothesis of no difference between means. For example, at the 10-percent level of confidence there is a chance that in 10 out of 100 times a wrong conclusion will be made (i.e., the difference in means is declared significantly different from zero when the true difference is zero). At the 1-percent level of confidence there is a chance that in 1 out of 100 times a wrong conclusion will be made.
Table B-1 contains the average application rates for 1989 and 1990 and their differences by crop, while tables B-2 to B-6 contain the same information by crop and State. Differences in average application rates that were found to be statistically different from zero at a specified level of significance are indicated. [Tabular Data Omitted]
Results of the Analysis
Differences in the average nitrogen application rates on soybeans, rice, and winter wheat were statistically different from zero at the 1-percent level of confidence (table B-1). Differences in mean nitrogen application rates for the other crops were not found to be significant. Winter wheat was the only crop that had a statistically significant difference in the mean phosphate application rate; this difference was significant at the 1-percent level. Only cotton, soybeans, and rice showed statistically significant differences in the mean potash application rates. Differences for cotton and soybeans were significant at the 1-percent level, and for rice at the 10-percent level. For all other crops and nutrients, the mean application rates were not statistically different between 1989 and 1990.
Greater variation in mean application rates for nitrogen, phosphate, and potash existed at the State level for each crop. For example, mean differences in application rates on the U.S. corn crop were not significant for any nutrient; however, some State-level differences for corn were significant (table B-2). Mean differences in nitrogen and phosphate application rates on corn were significant at the 10-percent level for South Dakota, as were differences in potash application rates for Iowa. Differences in phosphate application rates on corn were statistically different from zero at the 5-percent level in Illinois and Nebraska. [Tabular Data Omitted]
Mean differences in application rates on the U.S. cotton crop were not found to be statistically different from zero for nitrogen and phosphate, but the potash application rate difference was significant at the 1-percent level (table B-3). However, at the State level, significant differences in application rates for each of the nutrients were detected. Significant differences were obtained for mean nitrogen application rates for Arizona, California, and Texas at the 10-, 5-, and 1-percent levels. The difference in the mean phosphate application rate for California cotton was significant at the 10-percent level, while potash differences were significant for California, Mississippi, and Texas cotton at the 1-, 5-, and 5-percent levels. [Tabular Data Omitted]
State-level differences in soybean application rates for nitrogen, phosphate, and potash were also significant among some States (table B-4). Differences in nitrogen application rates were significantly different from zero for Arkansas soybeans at the 5-percent level, Iowa soybeans at the 10-percent level, and Kentucky soybeans at the 5-percent level. Differences in phosphate application rates for Arkansas and Iowa soybeans were significant at the 1- and 5-percent levels. The difference in the mean potash application rate for Arkansas soybeans was significant at the 5-percent level. No other States had a significant difference in soybean potash application rates, indicating that potash application rates likely were unchanged between 1989 and 1990. [Tabular Data Omitted]
The State of Arkansas had statistically significant differences in mean application rates for rice for all three nutrients (table B-5). Differences in nitrogen and potash application rates were significant at the 5-percent level, and phosphate at the 1-percent level. No other rice States had statistically significant differences in average application rates. [Tabular Data Omitted]
State-level spring wheat showed statistical differences in mean application rates for nitrogen and potash at the 10-percent level (table B-6). Minnesota and South Dakota showed differences for nitrogen, while differences were found in North and South Dakota for potash. No other spring wheat States had significant differences in average application rates. [Tabular Data Omitted]
Winter wheat State-level data showed significant differences for some States and nutrients (table B-6). Differences in mean nitrogen application rates for Kansas were significant at the 10-percent level, while for Montana, Nebraska, and Oklahoma, differences were significant at the 5-percent level. Kansas and Washington were the only winter wheat States that showed a significant difference in phosphate application rates. Differences in phosphate application rates in Kansas were significant at the 10-percent level, while in Washington differences were significant at the 5-percent level. The States of Illinois and Texas had significant differences in potash application rates at the 10- and 5-percent levels.
At the aggregate area level of analysis, the difference in mean application rates needed for significance at a specified level is smaller than at the State level, since State-level estimates are obtained from smaller samples and show greater variation than the area-level estimates. For example, 1989-90 area-level differences in nitrogen application rates for soybeans and winter wheat of over 6 pounds per acre are needed to be considered significantly different from zero at the 1-percent level of confidence.
No differences in nitrogen application rates were significant at the 1-percent level of confidence for any State-level data. However, nitrogen differences of 15.3 and 22.7 pounds per acre are necessary for Kentucky and Arkansas soybeans to be significant at the 5-percent level of confidence, while the winter wheat States of Montana, Nebraska, and Oklahoma require nitrogen differences of 12.7, 9.6, and 9.8 pounds per acre. Kansas requires a difference of only 4.3 pounds for significance at the 10-percent level.
Similar comparisons could be made for the other crops and States. What is clear is that sampling variation must be considered before concluding that fertilizer application rates have increased or decreased from one year to the next.

References

[1.] Freund, Rudolf J., and Ramon C. Littell. SAS for Linear Models. Raleigh, North Carolina: SAS Institute Inc., 1981.

[2.] Hallberg, M. C. Statistical Analysis of Single Equation Stochastic Models Using the Digital Computer. The Pennsylvania State University, A.E. and R.S. 78, February 1969.

[3.] Ferber, Robert, and P.J. Verdoorn. Research Methods in Economics and Business. New York: Macmillan, 1962, pp. 369-380.

[4.] Johnston, J. Econometric Methods. New York: McGraw-Hill, 1972, pp. 176-186.

[5.] Cleary, James P., and Hans Levenbach. The Professional Forecaster. Belmont, California: Wadsworth, Inc., 1986.
COPYRIGHT 1991 U.S. Department of Agriculture