Report

Waste Not, Want Not: Transactional Politics, Research and Development Funding, and the US Farm Bill

By Philip G. Pardey | Vincent H. Smith

American Enterprise Institute

December 11, 2017

Key Points

  • Most of the farm bill programs dedicated to agriculture simply slice up and, in some cases, reduce the size of the agricultural pie, redistributing dollars from taxpayers to farmers, insurance intermediaries, and various agribusinesses while also creating incentives for farmers to waste resources.
  • In stark contrast, farm bill dollars dedicated to food and agricultural research and development (R&D) expand the overall size of the agricultural pie to benefit not only innovative farmers and agribusinesses but also taxpaying consumers who foot the bill. Productivity growth induced by publicly funded R&D investments lowers costs of production and the price of food.
  • The economically sensible strategy is to cut back on wasteful farm bill spending and instead significantly increase funding for public investments in agricultural R&D. Shifting farm bill policy to an “investment” strategy is far more than mere political rhetoric.


Executive Summary 

In 1862 President Abraham Lincoln established the United States Department of Agriculture (USDA), which was conceived primarily as a federal government agency to promote innovation in US agriculture. As the 20th century dawned, more than half the department’s total expenditures were directed to research and development (R&D) activities. Now, nearly two decades into the 21st century, the department’s spending priorities are very different.

The share of USDA spending directed to food and agricultural R&D has fallen precipitously to just 1.6 percent of the agency’s total budget in fiscal year 2017. As a consequence of these shifts in USDA spending priorities, the US has lost significant global R&D ground to large agricultural economies such as China, India, and Brazil, which now collectively outspend the US by a large margin. Ostensibly temporary emergency measures to shore up farm prices and US agricultural incomes, introduced in the initial farm bills of the Great Depression and Dust Bowl era of the 1930s, have grown inexorably over the subsequent decades, while government spending on R&D has stalled and is now declining.

The hard-nosed economic evidence is compelling. Failing to realign farm bill spending priorities and revive spending on (publicly performed) food and agricultural R&D will continue to compromise the productivity performance of US agriculture and undermine the sector’s competitive position in growing but highly contested international markets. In contrast, realigning public funding for agricultural programs toward agricultural R&D, along with creative programs that increase incentives for private support of public-interest agricultural research, would benefit US agriculture, the US economy, and US consumers.

Introduction 

Doling out taxpayer dollars via the farm bill is transactional politics in its finest form, pitting the self-interests of agricultural lobbies against society’s communitywide well-being. Many of the farm programs we know today have their roots in Franklin D. Roosevelt’s 1930s New Deal legislation, which comprised emergency measures put in place to address the farm income implications of severely depressed farm prices during the Great Depression, as Dust Bowl droughts were ravaging parts of the United States. The Agricultural Adjustment Act of 1933 established the precedent for using federal resources to prop up farm prices and farm incomes, and the subsequent 1938 Farm Bill committed substantial federal funds for farm subsidy payments.1

The 1933 Farm Bill represented an explicit, radical expansion and shift in the US Department of Agriculture’s (USDA) spending priorities, arguably the most dramatic change since President Abraham Lincoln signed an act to establish the Department of Agriculture in 1862 as the Civil War unfolded. The charge for the fledgling federal department was “to acquire and to diffuse among the people of the United States useful information on subjects connected with agriculture in the most general and comprehensive sense of that word, and to procure, propagate, and distribute among the people new and valuable seeds and plants [Section 1].”2

The original research- and innovation-centric vision of the USDA has been heavily diluted over the past 150 years. Figure 1 shows the USDA annual budget (in 2009 dollars) over the period 1889–2015 and the share of that budget allocated to research and development (R&D). In the early 1890s, expenditures on R&D accounted for more than half of total USDA spending. By 1929, at the onset of the Great Depression, that share had declined to about 11 percent as the USDA’s extension, education, food safety, and other regulatory functions expanded.3 Following the passage of the 1933 and 1938 Agricultural Adjustment Acts, however, the share of the USDA budget allocated to research spending dropped sharply, averaging 4.6 percent in the 1930s and around 2 percent in the 1940s before recovering to a peak of about 4 percent in 1952. Then, with the 1948 Agricultural Act having introduced price supports at relatively high levels for some major commodities (e.g., wheat and corn), R&D’s share dropped back to as little as 2 percent.

The USDA’s budget rose sharply after the mid-1970s as the department’s mission further expanded. In 1977, the Food Stamp Program, which is now known as the Supplemental Nutrition Assistance Program (SNAP), underwent major reforms, and participation in the program jumped substantially. Spending on conservation programs then increased through the 1985 Food Security Act, which in effect reintroduced a soil bank (now called the Conservation Reserve Program), and through subsequent farm bills. The result was a further diminution of the share of USDA resources allocated to R&D, the department’s original raison d’être, to an average of less than 1.5 percent of its budget over the past decade.

The 19th and 20th centuries each had pivotal moments in terms of how federal funds would be spent on US agriculture. At some point, it will be time to revisit and realign USDA spending priorities to deal with 21st-century realities. With increasing concerns about trade deficits, the rate of growth of agricultural exports, the impact of US direct subsidies to farmers on US trade relations with other countries, access to overseas markets, and a decline in US agricultural productivity, is this the time to consider reallocating resources toward publicly funded R&D? Or will the fact that farm subsidies, at about $20 billion a year, amount to a near “rounding error” in an overall federal government budget of $3.65 trillion (about 0.5 percent) once again spare that part of farm bill spending from any serious scrutiny?


Notes

  1. Douglas E. Bowers, Wayne D. Rasmussen, and Gladys L. Baker, History of Agricultural Price Support and Adjustment Programs, 1933–84, US Department of Agriculture, Economic Research Service, 1984, https://www.ers.usda.gov/publications/pub-details/?pubid=41994; and Bruce L. Gardner, American Agriculture in the Twentieth Century: How It Flourished and What It Cost (Cambridge, MA: Harvard University Press, 2002).
  2. Wayne D. Rasmussen and Gladys L. Baker, The Department of Agriculture (New York: Praeger, 1972), 6.
  3. Ibid. For example, food safety activities increased with the 1906 passage of the Food and Drugs Act and the Meat Inspection Act—the latter attributable in part to the outcry stemming from Upton Sinclair’s 1906 novel, The Jungle—while extension activities were given a boost with the passage of the Smith-Lever Act in 1914.