3. Politics and Propaganda of Economic Inequality (See also 10) – Inequality Book Reviews (https://inequalitybookreviews.com) – Overview of many aspects of inequality

13. Dark Money (multibillion-dollar, 50-year propaganda and influence campaign led by Kochs and others)

Dark Money.  Jane Mayer.

Market societies unrestrained by government inevitably experience relentlessly increasing inequality. The wealthy beneficiaries of inequality then have the resources to perpetuate and enhance this unfortunate cycle by capturing government and public opinion. In Dark Money, Jane Mayer shows how this cycle is made worse by a system that allows unlimited funding of campaigns, lobbying, and propaganda by dark money that maintains secrecy and avoids taxes. Thus the superrich are able to maintain public postures of high-minded innocence while secretly providing massive financing for programs that further increase inequality by favoring their interests at the expense of everyone else.

The dark money that is a key component of this process is money funneled through nonprofit organizations that can receive unlimited donations from corporations and individuals and spend the funds to influence elections without being required to disclose their donors. The main vehicles are tax-exempt, nonprofit private 501(c) foundations, originally intended for charity, social welfare, and education but not for politics. Super PACs like Karl Rove’s American Crossroads and business organizations like the Chamber of Commerce are also major conduits for dark money.

The history of private foundations began in 1909 with John D. Rockefeller’s request to set up a tax-free foundation. The request was first denied by Congress because of its undemocratic nature but was later approved by the New York state legislature, with limitation to education, science, and religion. Over time, the number of private foundations and the issues they served multiplied rapidly, so that by 2013 there were over a hundred thousand foundations with assets over $800 billion. The supposedly nonpolitical nature of these foundations was progressively undermined, so that by the time of the 2010 Citizens United decision they could serve as the source of enormous, unlimited, tax-free, secret special-interest political funding.

The transition of nonprofit foundations from charitable organizations to political tools of the superrich accelerated in the 1970s. In 1971, Lewis Powell, a corporate lawyer, tobacco defender, and future Nixon-appointed Supreme Court justice, wrote a confidential memorandum for the U.S. Chamber of Commerce. He called for “guerrilla warfare” against what he saw as the anti-business threat posed by “perfectly respectable elements of society,” including “the college campus, the pulpit, the media, the intellectual and literary journals, the arts and sciences, and politicians.” Public opinion was to be captured by exerting influence over these institutions and the courts, by demanding balance in textbooks, television, and news, and by donors demanding a say in university hiring and curriculum.

This was followed in 1976 by Charles Koch of the extremist libertarian Koch brothers laying out a road map for a future takeover of American politics. His intent was to overturn the post-WWII view of government as a force for good (including regulation of business, progressive taxation, and workers’ rights) and instead to argue for limited government, drastically lower personal and corporate taxes, minimal social services for the needy, and much less oversight of industry. Campaign contributions and lobbying were to be supplemented by a secretive long-term plan to capture public opinion by 1) investing in intellectuals, 2) investing in think tanks, and 3) subsidizing “citizens” groups that gave the appearance of public support.

In 2003, the Koch brothers began their “donor summits,” secretive meetings of large numbers of superrich donors for archconservative causes. By 2014, the impressive list of some 300 secretive donors included 18 billionaires with combined assets of $222 billion as well as numerous sub-billionaires, top Republican politicians, conservative media stars, and even two Supreme Court justices (all listed in the book). Goals now included winning the presidency, capturing the House and Senate, cementing control of Congress by gerrymandering, capturing state legislatures, governorships, and supreme courts, and controlling the Republican Party. For the 2016 elections, the donor summit group alone pledged $889 million.

When Powell and the Kochs formulated their strategies, America’s greatest corporate fortunes were already poised to enlist their private foundations for the cause. Early participation by the Scaife, Olin, Coors, Koch, Bradley and other Family Foundations, which controlled hundreds of millions of dollars, and by scores of Fortune 500 corporations was only the tip of the iceberg. The powerful leaders of these families and corporations eventually funded hundreds of additional foundations in what was cynically called the “philanthropy plan” to change academia, the media, the courts, regulation, taxation, politics, government, and public opinion.

The effectiveness of this secret dark money was greatly facilitated by organization of these foundations into multiple additional layers. Donors at the top could contribute family, foundation, and corporate money to the next layer of foundations, some of which were mere conduits, or to organizations like the Chamber of Commerce and super PACs. These foundations and organizations could then disguise the self-serving nature of these donations by redirecting them to their actual targets without revealing the donor’s identities. The scope of this process became unlimited after the 2010 Supreme Court Citizens United decision that removed all restrictions on the size of the contributions to this system.

The enormous resources of this system are distributed across innumerable activities and organizations interspersed throughout American life to maximize influence on politics and public opinion. The book lists and characterizes many of the additional non-family foundations funded by the Kochs and others; the business associations and PACs that hide donors’ identities; and the ideological think tanks, such as the American Enterprise Institute, the Heritage Foundation, and the Cato Institute, funded by the Kochs, Scaife, and others. It also discusses the establishment of right-wing media outlets and organizations like the Tea Party and the sponsorship of media stars like Rush Limbaugh, Sean Hannity, Laura Ingraham, and Glenn Beck.

Numerous partisan university institutes and activities were funded. A network of 5,000 scholars was established in 400 colleges and universities. Koch foundations alone funded pro-corporate programs in 283 colleges and universities. Twenty-four right-wing academic centers were privately funded, such as George Mason University’s Koch-funded Mercatus Center and Institute for Humane Studies. The conservative Olin Foundation funded Harvard Law School’s influential Center for Law, Economics, and Business. The Olin Foundation, and later Scaife and the Kochs, funded the Federalist Society, which grew to 150 law school chapters and 42,000 right-leaning lawyers. The Olin Foundation also backed the Collegiate Network, which funded a string of right-wing newspapers on college campuses.

The book’s depiction of how extensively these many organizations and activities penetrate American life cannot be recounted in a brief review. A few examples may suffice to illustrate the depth of resources and breadth of scope involved: 1) Richard Scaife, the billionaire heir to Mellon Banking, Gulf Oil, and Alcoa Aluminum, estimated that he spent $1 billion on philanthropy, of which $670 million went to influence public opinion by bankrolling 133 of conservatism’s most important movements. 2) A carefully staged ten-year legal campaign using the “social welfare” corporations Citizens United and SpeechNow succeeded in removing campaign finance restrictions to increase the influence of the superrich. 3) From 2003 to 2010, 140 conservative foundations contributed $558 million in 5,299 grants to 91 nonprofit organizations to promote denial of climate change; three-fourths of these funds were untraceable due to the use of conduits. In addition, efforts were made to discredit, defund, and fire leading climate scientists.

4) Finally, even complete right-wing takeover of targeted state governments is not out of reach; it has actually happened for all three branches of government in Wisconsin. In 2010, Scott Walker was elected governor after promotion at the Kochs’ Americans for Prosperity Tea Party rallies and with support from Koch Industries (his second-largest campaign contributor) and the Republican Governors Association (also supported by the Kochs), which worked around state contribution limits. The out-of-state Kochs also contributed to sixteen legislative candidates, all of whom won, helping conservatives control both houses of the legislature.

The state Supreme Court majority was captured by funneling $10 million (which exceeded campaign contributions for all candidates combined) through the Wisconsin Club for Growth and Wisconsin Manufacturers & Commerce to elect three conservative justices in 2007, 2008, and 2011 and to replace the liberal chief justice with a conservative. This provided the final step to victory when the newly packed state Supreme Court upheld the right-wing program passed by Walker and the legislature by a partisan 4-to-3 vote. Subsequently, in the 2012 recall election the Walker campaign collected $36.1 million (more than half from out of state) versus $6.6 million for the opposition, and state redistricting resulted in Republicans taking 62% of the legislature after winning only 45% of the vote in 2018.
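The seats-votes gap in those redistricting numbers can be made concrete with a quick back-of-the-envelope calculation (an illustrative sketch using the review’s figures; the function name and the simple measure are mine, not the book’s):

```python
# Quantify the seats-votes gap cited for Wisconsin's 2018 legislative
# elections: Republicans won 62% of seats on 45% of votes. Seat share
# minus vote share, in percentage points, is one simple indicator of
# gerrymandered disproportionality.

def seats_votes_gap(seat_share: float, vote_share: float) -> float:
    """Seat-share advantage beyond vote share, in percentage points."""
    return round((seat_share - vote_share) * 100, 1)

gap = seats_votes_gap(seat_share=0.62, vote_share=0.45)
print(gap)  # 17.0 -- 17 points more of the seats than of the votes
```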

14. The Nobel Factor (Economic Nobel Prize created in 1969 by bankers to boost right wing economists)

The Nobel Factor: The Prize in Economics, Social Democracy, and the Market Turn. Avner Offer and Gabriel Soderberg. 2016.

The authors argue that in 1969 the Swedish business elite managed to acquire the Nobel name for a prize in economics for the purpose of exaggerating the scientific authority of market-liberalism in order to overturn social democracy. This prize was not part of the original group of Nobel prizes awarded every year since 1901, which is hardly surprising, since Alfred Nobel had written that he hated business and considered himself a social democrat. Nevertheless, the prize was created by the central bank of Sweden, which provided an endowment funded by taxpayers and persuaded the Nobel Foundation (dominated by businessmen), the Royal Swedish Academy of Sciences (which resisted), and the Nobel family to lend the prestige of the Nobel name. However, the family insisted on setting the prize apart by naming it the “Prize in Economic Science in Memory of Alfred Nobel.”

The authors contend that the Nobel selection committee, which did not include a single left-leaning economist until the 1990s, was biased toward the right from the outset compared to economists generally. This is documented by multiple surveys of economists over several decades, which consistently show that 2/3 favored social democratic norms while only 1/3 strongly opposed them, in both Europe and the U.S. (see fig. 1). A study of doctoral students in the top six American universities showed that 2/3 were left of center in 1985 and still left of center when followed up in the 2000s. A poll of senior American economists by The Economist in 2008 found 46% Democrats, 44% independents, and 10% Republicans, with 80% supporting Obama’s policies.

The politics of the selection process are examined through a review of the history of the committee and through extensive graphic analysis of the lifetime citation patterns of the candidates’ research. Prizes were split relatively evenly between left-leaning and right-leaning economists but never reflected the discipline’s 2/3-to-1/3 left-leaning split. The committee advanced its viewpoint by focusing on studies of markets and by presenting economics as more scientific than it is. Some highly regarded liberal economists appear to have been blackballed: permanently for J. K. Galbraith and Joan Robinson, temporarily for Stiglitz and Akerlof. On the other hand, the prize in 1974 rescued the status of the conservative Friedrich von Hayek (author of The Road to Serfdom), whose career had been at a dead end since the 1950s.

The authors set the actions of the Nobel committee in the context of the struggle between social democracy and the market-liberalism of business elites. Market-liberalism is characterized as considering market exchange superior to all alternatives, with no role for government and no concern for the advantages conferred by unequal endowments of wealth, connections, ability, education, and health, or for unequal rewards. Nordic social democracy is characterized as a vision of reciprocal solidarity in which immediate self-interest is subordinated to collective advantage: government programs correct market failure so that health, education, welfare, and housing are pulled out of the market, and predation of labor is prevented by central negotiation between employers and trade unions.

The authors note that, thanks to social democracy with mixed economies, no societies on earth are farther from serfdom than the Nordic welfare states, which are among the richest and most equitable in the world. In these states, earnings growth is shared by all rather than by just a few at the top, as in the U.S. (see fig. 2). In addition, public-sector social insurance is more than an order of magnitude cheaper to administer than market insurance for sickness, disability, and unemployment. Compared to the U.K., the U.S. health system costs twice as much, has inferior outcomes, and fails to cover everyone.

According to the authors, the market-liberalism favored by the Nobel committee is not entitled to the authority of science because it lacks natural science’s widely shared core principles and requirement of empirical validation. Its main feature is that its doctrines are highly convenient for great wealth, polluting industry, risky finance, and those who don’t want to pay taxes or help the needy. Moreover, the mostly theoretical neoclassical economics at its base has been largely discredited in the last several decades by the rise of empirical and behavioral economics, which have disproved many of the theoretical conclusions, assumptions, and models about perfect markets, rational choice, and so on through rigorous measurement and observation.

Models of market efficiency (the invisible hand) fail because the assumed extensive uniformity, perfect information, and perfect competition, not to mention the absence of bad faith, opportunism, and fraud, do not exist in the real world. Also, efficiency is worth having, but so are other values, such as truth, justice, freedom, loyalty, and obligation. Models of rational choice (informed self-interest) fail because of the same assumptions and because actual human choice has been shown to differ greatly from that of the models; the self-interest model also excludes other influences, such as friendship, love, loyalty, charity, and integrity. The Just World Theory, which states that everyone gets what he deserves regardless of prior endowments, justifies inequality and hardship as arising from individual desert; its purpose is thus to dismantle constraints on the wealthy and protections for everybody else. It is essentially a license to inflict pain. The Optimal Taxation Model suggested a low linear tax of 20-30%, with marginal rates declining as income increased until they reached zero for the top earner; the model is open to many criticisms, and other top economists have suggested a top rate of 78%, not zero.
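The zero-top-rate conclusion mentioned above rests on a simple revenue argument. A hedged sketch of the standard textbook intuition (this is the usual statement of the result, not the book’s own derivation):

```latex
% Let T(y) be the tax schedule and y_max the highest earner's income.
% A positive marginal rate at the very top, T'(y_max) > 0, discourages
% the top earner's last unit of effort yet collects no revenue, because
% no income exists above y_max to be taxed. Setting
\[
  T'(y_{\max}) = 0
\]
% raises the top earner's welfare without reducing anyone else's, so it
% holds at the optimum when the skill distribution is bounded. Critics
% note how convenient this conclusion is for the highest earners.
```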

Despite these and many more shortcomings, market-liberalism, its authority boosted by several Nobel Prizes, managed to gain political ascendancy in the past several decades and helped bring about the negative consequences of the “market turn” of the book’s title. The massive redistribution of wages and benefits away from workers and toward wealthy elites caused soaring inequality. Deregulation contributed to the financial crisis of 2007. The introduction of equity ownership incentives for managers led to systematic plundering of corporations. Privatization of welfare functions led to inferior programs with higher risk, lower employer contributions, and high fees (25-40% for pension investment). Imposition of austerity by the IMF and World Bank under the “Washington Consensus” led to slower economic growth, collapsed wages, lower standards of living, and increased corruption throughout Latin America, Southeast Asia, Russia, and Eastern Europe.

The authors ask, “What warrant does Nobel economics provide for the market turn? As science, not much….Economics, even Nobel economics does not hang together very well….The massive empirical turn in economics during the last two decades, the work of field experiments and historical ‘natural experiments’, is a silent repudiation of equilibrium economics. To recapture validity, economics has to come down to the ground of argument, evidence, and counterargument, supported by reason and an open mind.”

15. The Politics of Resentment (right wing Wisconsin takeover by outside money and propaganda)

The Politics of Resentment.  Katherine J. Cramer.

Ms. Cramer, a University of Wisconsin—Madison political science professor, explored a recent political paradox: “We live in a time of increasing economic inequality, and yet voters continue to elect politicians whose policies respond very disproportionately to the preferences of affluent people.” She examined the origins of this paradox in her home state of Wisconsin, where rural voters recently tipped the balance from a blue to a red state, seemingly against their own interests. To go beyond the opinions of these voters as reported by the usual technique of polling, she personally and repeatedly participated in informal discussions with thirty-nine groups scattered throughout Wisconsin over six years (2007-2012).

The study identified a strong rural identity with “us versus them” characteristics, leading to resentment of urban and political elites, public employees, and diverse urban populations. A “rural consciousness” was identified that included “three major components…a perception that rural areas do not receive their fair share of decision-making power, that they are distinct from urban (and suburban) areas in their culture and lifestyle (and these differences are not respected), and that rural areas do not receive their fair share of public resources.” In addition, rural residents believed they worked much harder for lower wages than less deserving urbanites, public employees, and recipients of public assistance, and that their culture and communities were dying as a result of these discrepancies.

The author reviews previous examinations of these perceived discrepancies using standard political-science statistical techniques. At a superficial level, those reports show that rural residents are right about receiving considerably lower wages but wrong about not getting their fair share of public funds. In 2011, per capita median income was in excess of $70,000 for the richest suburbs, about $55,000 for urban counties (without considering the urban poor), and about $40,000 for completely rural counties. Per capita combined state and federal tax revenues were greater than $10,000 for the richest suburbs, over $6,000 for urban counties, and about $4,000 for rural counties. The percentage returned from taxes paid was about 65% state and 150% federal for urban counties and about 100% state and over 400% federal for rural counties (both state and federal graphs skewed by outliers).
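The “percentage returned” statistic in these figures is simply per-capita spending received back divided by taxes paid. A minimal sketch (the dollar amounts are rough values implied by the review’s percentages, not data from the book):

```python
# "Percentage returned" = per-capita spending received back / taxes paid.
# Rural Wisconsin counties paid roughly $4,000 per capita; a >400% federal
# return implies on the order of $16,000 per capita received back.

def pct_returned(received: float, paid: float) -> int:
    """Percent of per-capita taxes paid that returns as public spending."""
    return round(received / paid * 100)

print(pct_returned(received=16_000, paid=4_000))  # 400  (rural, federal)
print(pct_returned(received=4_000, paid=4_000))   # 100  (rural, state)
```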

However, Ms. Cramer found that the answers from this political science approach didn’t really match the concerns of rural citizens on several important points. The revenues returned to rural regions were often in the form of programs imposed upon them by urban and political elites and staffed by public employees who lived among them. Rural citizens perceived the politicians to be tone-deaf to their real needs and the programs to be contrary to their real interests. They perceived the local public employees to be outsiders (them rather than us) with much easier work, better salaries, and enormously better benefits than their own. They perceived their hard-earned tax dollars to be wasted on these programs, on public employees, and on transfers to what they saw as undeserving urban minorities.

This perspective suggests that voters’ preference for limited government was not rooted in libertarian political principles or identification as Republicans but in a strong rural identity with the perception that services were not benefiting deserving, hard-working people like themselves. Politicians, such as Scott Walker, skillfully directed these rural resentments away from Republican policies that favor affluent people and redirected them toward government, the people who work for it, and urban areas that are home to liberals and people of color. This rural identity with these strong resentments was already firmly established as the result of long-standing difficult rural circumstances and generations of community members teaching these ideas to one another in the context of the national political debate. Scott Walker merely reaped the harvest of a field already prepared for him (how’s that for a rural metaphor?).

So what are the lessons from these findings? First, as on the national level, citizens tend to vote according to personal identities rather than specific policy preferences, with attitudes toward social groups doing the work of ideology. Ms. Cramer examined the rural identity and its resentments in her state; nationally, numerous additional divisive identities are at work, including those involving race, gender, Northerners versus Southerners, and so on. Second, in Wisconsin, it is necessary to reassess what is going on in rural places and reconsider the policy responses: 1) it is possible that the resources rural communities are receiving are not effectively addressing their needs; 2) it is likely that some of these resources are invisible to the people who live there, so they are unaware of the programs they use; 3) the manner in which policy is created and delivered matters, because if rural residents feel they have been listened to and respected, they may feel differently about the programs that result.

My comments about the book:

My main criticism of the book is that the “Where Does Rural Consciousness Come From?” section is inadequate. Radio was dismissed as a source with the comment that public radio transcripts were unavailable but that state and local newspapers were a reliable indicator of the local news environment. Has the author never heard of talk radio? Is she unaware of the enormous audience of Rush Limbaugh? As for local newspapers, her study of papers from 2007 to 2011 doesn’t begin to cover the period necessary for “generations of community members teaching these ideas to each other”. In my view, her approach likely missed a substantial contribution from several decades of the extensive Koch political network propaganda machine firmly embedding these ideas in rural and other identities.

16. Behind the Carbon Curtain (Wyoming academic and government takeover by corp. mining)

Behind the Carbon Curtain: The Energy Industry, Political Censorship, and Free Speech.

I liked this book enough to buy a second copy from Amazon so I could write an Amazon book review. I bought the first copy at the New Mexico School of Mining bookstore, of all places, after attending a graduation ceremony. I am afraid the book will never get the attention it deserves because the author is from a little-known university in the nation’s least populous state, Wyoming, and because much of the activity described takes place in that region, which must seem remote to many potential readers.

However, as the author points out, Wyoming can be viewed as a lens into America on this topic. It produces 1/5 of the nation’s energy and is second only to Texas in total BTUs produced. In Wyoming, fossil fuel taxes and royalties account for 60% of state revenues, or 2/3 to 3/4 when allied activities are considered. Hence, the fossil fuel industry, the individuals it has made rich, and the politicians it finances can exert enormous financial pressure on public institutions like the state university, museums, and schools and, of course, on government itself. The author examines how this industry and its captains handle this opportunity with respect to censorship.

Usually, censorship is regarded as a tool of government or religion to maintain its power or authority. However, in this case, censorship originates as a tool of industry and its captains to serve its own purposes either directly or through the politicians it controls. In Wyoming, many individuals with backgrounds or fortunes from the fossil fuel industries hold elected offices, serve as government officials, and serve on the boards of public institutions like the University of Wyoming, museums, and public schools. In addition, the industry finances many of the state’s politicians at all levels of government, as well as think tanks, academic institutions, and programs placed in the state’s schools and university.

The author’s approach to showing how this power is used begins by reporting many seemingly small narratives that are well below the national radar but that cumulatively become pervasive. He begins with the story of Carbon Sink, a piece of land art installed by the UW Art Museum that displeased the energy industry. The large outdoor artwork was destroyed as a result of pressure from industry and the legislators it supports. For good measure, follow-up included withdrawal of industry support for university museum fundraising, removal of $2 million from the museum’s budget, and passage of a law requiring submission of future similar artwork to the UW Energy Resources Council and the governor. Another sequence involving suppression of art is described for the photographic exhibit The New Gold Rush: Images of Coal Bed Methane at the Nicolaysen Art Museum in Casper, Wyoming.

The author continues with the stories of several scientists who were fired, censured, had research defunded, or were directed away from conducting research or presenting information to the general public due to objections from the fossil fuel industry and its politicians. Forbidden activities included studies or opinions that reached the public regarding fracking and water contamination, fracking and markedly increased earthquakes, dumping enormous volumes of waste water from fracking, finding that natural gas may be worse than coal for global warming if leakage exceeds 3%, and finding that flared excess gas and fracking effluents from new wells produced dangerous toxic smog in downwind communities.

Although these events were painfully important for individual careers and communities, they were merely indicators of the widespread intimidation that created a far more important climate of pervasive self-censorship. The author documents this by providing many examples of industry and its top politicians publicly endorsing this process and privately implementing it by phone calls and meetings or by actions on boards of directors for universities and museums. He then provides multiple examples of enforcement of this censorship by subordinate university administrators, university department heads, and museum administrators afraid of losing funding or losing their jobs.

Of course, industry-driven censorship does not stop here. It also engages in preemptive educational censorship to determine what can and cannot be taught to the state’s children and even in its university. In 2013, a bill was passed supporting a curriculum developed by energy companies to promote “energy literacy” in grade schools. Subsequently, politicians offended by the inclusion of anthropogenic climate change in the Next Generation Science Standards (NGSS) triggered a struggle over its implementation by passing a budget footnote denying it state funds. Other struggles over curriculum and teaching materials took place at the state and local levels.

For the University of Wyoming, the legislature imposed upon the trustees a decision to have the industry-dominated advisory Energy Resources Council approve the hiring for the School of Energy Resources. In 2013, Robert Sternberg was hired as the new university president with assurance to the legislature that he would be “building programs…in collaboration with the energy industry.” He replaced the provost with a candidate who had ties to the energy industry. A cascade of resignations and firings followed. He conducted a review of the UW College of Law by a task force stacked with energy industry supporters to assess its shortcomings “with regard to energy, natural resources, water, and environmental law.” The dean resigned in protest over the duplicitous politics that provided no voice for his faculty. Subsequently, an investigative journalist showed that this plan to reeducate law students was based on demands from a single trustee with connections to the energy industry.

In his final two chapters, the author discusses the fate of free speech in market society. He offers climate change as the most powerful example of how energy corporations have controlled public discourse. Well-funded think tanks and their scientists-for-hire have successfully manufactured the illusion of scientific controversy where none existed by recycling fatuous claims in a version of “proof by repetitive assertion.” This was merely repetition of techniques used by industry to delay action against the hazards of tobacco in the 1960s and acid rain in the 1980s.

In today’s era of market fundamentalism these techniques have been facilitated by the commodification of public and private life. This commodification may be the most potent tool of censorship. In the starkest sense, free speech becomes the property of those who purchase it. When speech can be bought and sold, only the rich can speak in ways that are heard, particularly after Citizens United. The top 0.1% have as much wealth as the bottom 90%. With the concentration of wealth comes the consolidation of speech. In this setting, the author finds two structural defects that foster censorship in Wyoming: the hegemony of the energy industry and the connection between political elections and corporate money.

17. Democracy for Realists (political choice from group identity, not from leaders’ policy positions) https://inequalitybookreviews.com/2021/08/21/17-democracy-for-realists-political-choice-from-group-identity-not-from-leaders-policy-positions/ Sat, 21 Aug 2021 22:09:59 +0000

Democracy for Realists: Why Elections Do Not Produce Responsive Government. Christopher H. Achen & Larry M. Bartels. Princeton University Press. 2016.

The authors challenge the cherished American notion that ordinary citizens obtain the government policies they want through democratic elections. They argue that election outcomes are essentially random and do not validate voters’ policy preferences. They begin by examining political science theories of how elections transmit the preferences of ordinary people to be enacted by government. They divide these into the three older theories of populist voting, leadership selection voting, and retrospective voting and the newer theory of group identity voting.

In the populist model (folk theory), voters know their policy preferences and have them implemented either by direct democracy or by representative democracy. In direct democracy, voters rule by choosing policies themselves via initiative and referendum procedures. In representative democracy, voters elect candidates whose policy preferences are most similar to theirs to represent them in assemblies that enact their preferences.

The leadership selection model dispenses with the notion that the voters themselves decide issues by electing candidates to carry out their will. Instead, democracy means only that voters have the opportunity of accepting or refusing the individuals who are to rule them. Thus voters don’t need to know policy so long as the leaders they elect make the best political decisions for them in order to compete for their votes.

The retrospective model regards voters as merely appraisers of past events, performance, and actions. Thus election outcomes hinge not on ideas, but on public approval or disapproval of actual past performance of incumbent political leaders. The authors compare this form of voting to driving by looking in the rear view mirror. They state that it works about as well in government as it would on the highway.

The authors present voluminous evidence to show that none of these models satisfactorily explains election results. For the populist and leadership selection models, they show that voters have insufficient understanding of their own political preferences and those of their parties and candidates to vote on this basis. In addition, voters are commonly mistaken in highly partisan directions about easily measurable facts, such as crime rates and changes in deficits. Consequently, numerous studies show very little correlation between voters’ preferences, those of the parties and candidates they elect, and the actual political outcomes that result.

For the retrospective model, the authors show that voters have insufficient understanding of whether times have been good or bad and whether government is responsible for perceived changes to vote on that basis. For instance, votes for incumbents have been shown to fall significantly after acts of nature, such as shark attacks and hundreds of droughts and floods, for which government clearly is not responsible. This is compounded by the incentive for political and ideological entrepreneurs to construct self-serving explanations and solutions for people’s hardships, which are then amplified by the mass media.

For retrospective voting, only the state of the economy is correlated with election results, but even that correlation is limited. When a president’s term is divided into sixteen quarters, only the two before the election matter. A president will be punished for an economy that does well for four years overall, but tumbles for the last two quarters, and he or she will be rewarded for the reverse. Politicians know this, so income growth usually increases during the last years of four year presidential terms, likely as the result of political manipulation of the economy. Thus, even for the economy, overall performance matters much less than very short term performance just before elections.

The authors argue that these older models of voting cannot explain even the great partisan realignments, such as those of the New Deal and the Civil Rights Acts, as the result of changing policy preferences. For the New Deal, incumbent Republicans were punished for economic collapse in 1932, and incumbent Democrats were rewarded for recovery in 1936. However, contemporaneous elections in state governments and many foreign countries showed that incumbents of opposing parties were similarly punished and rewarded whether they were liberal or conservative. Hence, these election results were not directed at specific policies, but rather at incumbents. In addition, the recession of 1938 led to Democratic losses in Congress and state governments. If the presidential election had been held that year, the great realignment might not have occurred.

This brings us to the group identity theory of voting. The authors conclude that the primary sources of voting behavior are partisan loyalties, social identities, group attachments, and myopic retrospection, not policy preferences, ideologies, or realistic assessment of circumstances. Party is the strongest identity, but others include race, ethnicity, religion, social class, and region. Identities are emotional attachments that transcend thinking and may trump facts and policy reasoning. Voters first choose, or commonly inherit the choice of, a party validating their political and social identities, and only then adapt their policy choices to fit those of their candidates and parties. Hence, in thinking about politics, it makes no sense to start from issue positions.

Consequently, the authors find that election outcomes are essentially random choices among the available parties—musical chairs. When the party balance is close, which it usually is in two party systems according to the “Law of the Pendulum,” outcomes turn on the voting choices of “pure independents” who do not even lean toward one party or the other. These “swing voters,” who are the least informed and the least engaged, are often swept along by the familiarity of an incumbent, the charisma of a fresh challenger, or a sense that it is “time for a change.” Hence, elections do not produce policy mandates, even when they are landslides.

Even well-informed and highly educated citizens are not exempt from these findings. They are likely to have more elaborate and internally consistent worldviews that just reflect better rehearsed rationalizations. Indeed, they are often more subject to partisan and confirmatory bias than less attentive voters. The authors emphasize that “this is not a book about the political misjudgements of people with modest educations. It is a book about the conceptual limitations of human beings—including the authors of this book and its readers.”

Given these findings about elections, what is good about democracy? First, elected governments are accepted as legitimate, which facilitates peaceful and orderly transfers of power. Second, in well-functioning democracies, parties that win office are inevitably defeated in later elections, sometimes due to random events, such as droughts, floods, or untimely economic slumps. This inevitable turnover is key to preventing any one group or coalition from becoming too entrenched in power and leading to the abuses of dictatorships or one party states. Third, electoral competition provides incentives for rulers to tolerate loyal opposition. Fourth, in well-functioning democracies, reelection-seeking politicians will strive to avoid being caught violating consensual ethical norms.

Given the limitations of voting, what are the concerns about democracy? After scrupulous efforts to present data in a nonpartisan manner throughout the book, the authors reveal what some would argue is partisan bias in only the last several pages. In their view, “more effective democracy would require a greater degree of economic and social equality.” Power imbalances are very large in favor of the wealthy, the educated, corporations, major media, ethnic majorities, and racial majorities. Organized, powerful, often minority policy demanders routinely get what they want at the expense of less powerful, unorganized majorities. Hence, the authors believe the folk theory of democracy should be abandoned in favor of the group identity theory to better understand the contributions and limitations of citizens, groups, and political parties in the search for political and social progress.

18. Technology, Growth, and Development (rebuttal to “government can’t do anything right”) https://inequalitybookreviews.com/2021/08/21/18-technology-growth-and-development-rebuttal-to-government-cant-do-anything-right/ Sat, 21 Aug 2021 22:06:33 +0000

Technology, Growth, and Development: An Induced Innovation Perspective. Vernon W. Ruttan. 2001. (Textbook)

Part One—Productivity and Economic Growth:

Throughout history and in today’s developing countries, sustained economic growth, which is achieved by increasing productivity, has been exceptional rather than typical. During the last two centuries, this changed in European and English-speaking countries with the markedly increased productivity of the Industrial Revolution.

This growth continued due to a series of advances in general purpose technologies that have had a pervasive impact well beyond the industries in which they originated.  This sequence included advances in manufacturing (factory assembly of mass produced interchangeable parts), agriculture, sources of energy, particularly electricity, the chemical industry, the aeronautical industry, the digital and computer industry, and the biotechnology industry. In 1957, Solow found that 80% of US worker productivity growth from 1909 to 1949 was from changes in technology.  Thus, government may increase the rate of economic growth by pursuing an active technology policy.

The high economic growth of the US and other developed countries slowed during the last quarter of the 20th Century. Six possible explanations include increased energy and raw material prices, changes in capital formation, decreased infrastructure growth, an increased share of labor in services, decreased R & D spending, and measurement and data problems due to the shift to harder-to-measure services. Possible explanations that were dismissed include the costs of pollution abatement and safety regulations, depletion of natural resources, and reduction of the work ethic.

Labor has been reallocated from sectors with increasing productivity to the service industry, which lacks gains in productivity.  In the US, employment has declined in agriculture from 50% to 2% from 1870 to 1990 and in manufacturing, mining, and construction from over 30% to less than 20% from 1950 to the 1990s, while it has increased in the services to over 75% by the late 1990s.  (Fig 1.2)

This “Service Sector Cost Disease” describes rising prices without increased output: the failure of productivity to grow in one sector dampens growth for the entire economy. This is shown by a two-sector simulation over thirty years with annual productivity growth of 3% in manufacturing (automobiles) but 0% in services (education). Thus, the productivity of services must increase for economic growth to continue. (Table 1.1)
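A two-sector dynamic of this kind can be sketched in a few lines of code. This is a minimal illustration, not the book’s actual model; the assumption that wages in both sectors track manufacturing productivity is mine.

```python
# A minimal sketch of a two-sector "cost disease" simulation:
# manufacturing productivity grows 3% per year, services 0%, and the
# economy-wide wage is assumed to track manufacturing productivity.
# The unit labor cost of services then rises steadily while
# manufacturing's stays flat.

def simulate(years=30, mfg_growth=0.03, svc_growth=0.00):
    mfg_productivity = 1.0   # output per worker, manufacturing (automobiles)
    svc_productivity = 1.0   # output per worker, services (education)
    wage = 1.0               # assumed to rise with manufacturing productivity
    history = []
    for _ in range(years):
        mfg_productivity *= 1 + mfg_growth
        svc_productivity *= 1 + svc_growth
        wage *= 1 + mfg_growth
        history.append({
            "mfg_unit_cost": wage / mfg_productivity,  # stays at 1.0
            "svc_unit_cost": wage / svc_productivity,  # rises ~3% per year
        })
    return history

result = simulate()
# After 30 years, the service unit cost has risen roughly 2.4x while the
# manufacturing unit cost is unchanged.
print(round(result[-1]["svc_unit_cost"], 2), round(result[-1]["mfg_unit_cost"], 2))
```

The rising service price with no additional output is exactly the drag on aggregate growth the text describes.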

Part Two—Sources of Technical Change

The emergence of new general purpose technologies is the major basis for increased productivity from technology. The fusion of science and technology for this transition began with the creation of industrial research laboratories (beginning with Edison in 1888) and publicly funded agricultural research stations and research universities. The Vannevar Bush report to FDR of 1945 endorsed public funding of basic research, which became the dominant pattern for several decades. Changes in the relative prices of factors of production also spur innovation, such as the substitution of capital for labor when labor expenses increase.

Institutional innovation to support growth is necessary during changing circumstances.  Political leaders may organize collective action for this socially desirable change, but only if they are rewarded by greater prestige and stronger political support.  This change may be resisted because of institutional drift to protect vested interests, industries, workers, or intellectuals fearful of technology.  In any event, long term resistance to technology has seldom been successful.  Also, continuing innovation may be discouraged by path-dependent “lock-in” for an established inferior technology, such as QWERTY or VHS.   

Outcomes of institutional change vary greatly according to the power structure of vested interest groups, ideologies, and cultural traditions.  English enclosure movements hurt farmers to benefit landowners.  Chinese agricultural decollectivization benefited farmers.  Argentina’s political dominance by the landed aristocracy impoverished smaller farmers.  And the 1862 US Homestead Act created opportunities for small farmers.   

Governmental and other nonprofit institutions have been established to advance basic scientific knowledge because benefits may not be profitable for individual firms, universities, or even nations.  This public support produced most of the technical advances that led to the computer revolution, major new pharmaceuticals, the biotechnology industry, and many improvements in agriculture.

Government action may be required to supply public goods because special interest “distributional coalitions” make political life divisive, limit the capacity to reallocate resources, and slow technical development and economic growth.  Models, such as the tragedy of the commons, the logic of collective action, prisoner’s dilemma game, and mechanism design, are profoundly pessimistic about the ability of individuals, acting alone or in cooperation, to achieve common action.    
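As a concrete illustration of why these models are pessimistic, the prisoner’s dilemma can be worked through in a few lines. The payoff values below are the standard illustrative ones, not taken from the text.

```python
# The prisoner's dilemma, one of the models named above, shows how
# individually rational choices defeat common action.
# payoffs[(my_move, other_move)] -> my payoff (higher is better).
payoffs = {
    ("cooperate", "cooperate"): 3,  # mutual cooperation is good for both
    ("cooperate", "defect"):    0,  # the lone cooperator is exploited
    ("defect",    "cooperate"): 5,  # the lone defector does best of all
    ("defect",    "defect"):    1,  # mutual defection is bad for both
}

def best_response(other_move):
    """A self-interested player's best move, given the other's move."""
    return max(["cooperate", "defect"], key=lambda m: payoffs[(m, other_move)])

# Defection dominates: it is the best response whatever the other player
# does, so both players end at (defect, defect) with payoff 1 each, even
# though (cooperate, cooperate) would give 3 each.
print(best_response("cooperate"), best_response("defect"))  # defect defect
```

The gap between the equilibrium (1, 1) and the cooperative outcome (3, 3) is the failure of common action that, on this view, only an outside institution such as government can close.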

Part Three—Technical Innovation and Industrial Change

For new groundbreaking technologies, the private sector was the major contributor for the mechanization of the Industrial Revolution and agriculture and for the early discoveries of electricity. However, public support was the major contributor for agricultural science, chemistry, computers, and biotechnology, particularly during the early unprofitable stages.

Agriculture: Incentives for private development were better suited for mechanization than for science-based techniques whose benefits spilled over. Hence, private sector mechanization increased productivity by increasing the area cultivated. Largely publicly funded science-based biological and chemical technology increased productivity by increasing output per unit of land area. Beginning in the 19th Century, sources of funding for agricultural science shifted from private to public in England, then in Germany for agricultural research stations, and then in the US for agricultural research stations and research universities. Improved farming practices and the international wheat-breeding system provide powerful evidence for the productivity of publicly supported research.

Light, Power, and Energy: The Industrial Revolution saw greatly increased productivity with the transition of power for manufacturing from water power to steam engines in the mid-1800s and then to electric motors by the 1920s. Sources of energy for this transition progressed from flowing water to coal to oil to natural gas. Long distance transmission became possible after the change from DC to AC and the invention of the steam turbine. Eventually, entire systems of manufacturing were redesigned, first by replacing steam engines in turning long line shafts, then by use of multiple smaller motors and shafts, and finally by replacing shafting altogether. Expected progress toward alternate energy sources in the late 20th Century did not occur due to progress in natural gas production and electricity generation. However, contributions from some of these sources, such as photovoltaic cells and pure hydrogen fuel, will likely increase greatly by the mid-21st Century because of technical progress, pollution from fossil fuels, and the possibility of global warming.

Chemistry: The chemical industry is one of the first modern industries in which technical change depended upon prior scientific research. Governments have played an important role, beginning in Germany, and then in the transfer of German technical knowledge to the US and UK after World Wars I & II. By the early 20th Century, German organic chemistry and high pressure chemical reactions dominated world markets for dyes, drugs, and other products. By the 1950s, US polymer-based synthetics and continuous process technology for large scale production positioned the US well ahead of Germany. The growth of the petrochemical industry has slowed substantially since the 1970s in association with technical maturity, environmental demands, and decreasing budgets for R&D. It is doubtful that it can ever again play a similar dynamic role in economic growth.

Computers: Almost all technical advances in computers were publicly funded. Since the 1930s, the need for improved electric switching led to a progression from vacuum tubes to transistors, semiconductors, integrated circuits, and microprocessors. Consequently, electronic computing evolved from huge vacuum tube computers to progressively much smaller, much faster, and much cheaper mainframe computers, minicomputers, and microcomputers. The transition to the microcomputer, made possible by Intel’s programmable chip of 1969, led to the development of the PC market by the Apple II and IBM PC in the 1970s and 1980s. This was accompanied by a rapidly expanding independent software industry led by Microsoft, which provided MS-DOS (Microsoft disk operating system) for the 16-bit Intel 8088 microprocessor of the new IBM PC in the 1980s.

The computer, semiconductor, and software industries have been uniquely influenced by public policy. The first computers and semiconductors were developed and supported by procurement from the US military. The DOD and NSF supported fundamental research and graduate education for the development of software. The Internet was developed by the Defense Advanced Research Projects Agency (DARPA). The rate of return on public investment in these industries has been high, in the 50-70% range. It is simply not credible to assume that the market could have developed anywhere near as rapidly in the absence of the large public support that began in the 1940s.

Biotechnology:  Prior to the 1970s, almost all research was conducted by universities and the federal government.  This produced four major advances in molecular biology (below the cellular level) that led to the development of biotechnology: 1. Identification of DNA as the physical carrier of genetic information in 1938.  2.  Discovery of the helical structure of DNA in 1953.  3.  Invention of gene splicing to insert genes from a foreign organism into a host genome in 1973.  4.  Invention of hybridoma (fusion) technology to form a hybrid cell with nuclei and cytoplasm from different cells in 1975.  

These breakthroughs led to new biotechnology advances:  1. Cell and tissue culture technology that regenerates entire organisms from single cells or tissues.  2. Recombinant DNA (rDNA) technology that joins pieces of DNA from different organisms.  3. Cell fusion technology that combines a myeloma cell and a lymphocyte to produce monoclonal antibodies.  4. Protein engineering to create new proteins with specific catalytic or therapeutic properties.  

In the late 1970s entrepreneur-scientists created new university-industry relationships and genetic engineering start-up companies, such as Genentech. By the mid-1990s, there were 1200 small- to medium-size research-intensive dedicated biotechnology firms in the US. These start-ups lacked the capacity for large volume manufacturing, regulatory clinical testing (which averaged 100 months), and the necessary extensive distribution networks. Consequently, a highly complementary, rather than competitive, three-way relationship has evolved among universities, biotechnology firms, and multinational pharmaceutical and agrochemical companies, such as Genentech’s contract with Eli Lilly to develop bacterially produced insulin. Marked growth is expected for the industry in the 21st Century, possibly similar to that of the computer industry in the late 20th Century.

Part Four—Technology Policy

For three decades after World War II, an implicit social contract between the federal government, the scientific community, and universities assured a steady stream of scientific advances that translated into new weapons, new medicine, new materials, new products, and new jobs. Beginning in the 1970s, this arrangement was challenged by the Environmental Movement, conservative anti-government ideology, and the end of the Cold War. Today’s policy choices are whether to continue widening income differences, environmental destruction, and decreasing world security or to acquire the vision for sustainable development and convergence of the rich and the poor. Three systems are compared.

The American system of technical innovation: “The American System” of assembly of complex products from mass-produced interchangeable individual parts began with government funded manufacture of firearms in US armories in the mid-1800s. It then spread to the private sector along with the rise of the machine tool industry that supplied the precision necessary for interchangeable parts. High volume “mass production” increased manufacturing’s share of commodity production from 10% to 50% from 1800 to 1900. This system became most highly developed with the Ford Model T assembly line, which decreased assembly man-hours from 12.5 to 1.5 and decreased the price by two-thirds. From 1903 until the last Model T in 1927, the US GNP grew at 7% annually. At the same time (1911), Principles of Scientific Management by Frederick Taylor introduced time and motion studies to increase efficiency.
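As a quick sense of scale for that growth figure, 7% compounded annually over the 1903-1927 period implies the economy roughly quintupled. A back-of-the-envelope check:

```python
# A back-of-the-envelope check (not from the book): what does 7% annual
# growth compound to over the 24 years from 1903 to 1927?

def compound(rate, years):
    """Total growth factor after compounding `rate` annually for `years` years."""
    return (1 + rate) ** years

factor = compound(0.07, 24)
print(round(factor, 1))  # roughly a fivefold increase
```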

The Ford era of “classical mass production” ended in 1926, when General Motors introduced the era of “flexible mass production” characterized by multiple models with annual changes, as well as a system for purchase on credit.  However, after World War II, the US automobile industry declined due to markedly decreased innovation and decreased productivity growth to 5% annually for 20 years, then to 2.5% after 1965.  Ascendancy of a new business school trained managerial elite, with no experience and little appreciation of manufacturing and process technology, was an important source of this loss of US competitive leadership.

In the early 20th Century, science-based technology emerged due to critical institutional innovations, including agricultural experiment stations, industrial research laboratories, research universities, and support for public universities to develop necessary human capital. The US post-World War II lead in the agricultural, electrical, chemical, aircraft, defense, computer, and biotechnology industries was associated with a rapidly growing, largely publicly funded R & D enterprise that included the NSF, NIH, DOD, AEC, NASA, and the Department of Agriculture. Additional factors included small start-up firms, venture capital firms, military procurement, intellectual property rules that encouraged cross-licensing, and substantial spillover from military and space technology until the 1970s.

After declining in the 1970s, confidence in American leadership was revived by the 1990s by US dominance of the information revolution.  At this time, structural changes in technical innovation included 1) greater reliance by US firms on research in collaboration with federal laboratories, universities, and other firms, 2) increased location of R & D facilities in other countries and by other countries in the US, 3) greater reliance by US universities on industrial funding, and 4) changes in patenting and licensing commercially oriented research.

The Japanese system: Japan was the first nonwestern country to successfully challenge the dominance of western technology. By the 1880s Japan’s economy achieved a “take-off” to sustained economic growth due to the latecomer’s advantage of “catch-up” by technology transfer. This began in the textile industry with low-cost Japanese labor, but progressed to the Japanese national technology system: 1) transfer of technology from abroad, 2) strong public support, 3) rapid adoption of imported technology, and 4) development of the capacity to innovate and manufacture. Other factors included the Ministry of International Trade and Industry (MITI), which targeted a succession of industries for technological catch-up, high rates of saving and investment, low consumption, the Deming “total quality control” system, and “lifetime employment guarantees” that encouraged investment in workers’ skills.

Japanese producers successfully challenged global leadership in a series of industries, including textiles between World Wars I & II, steel and ships in the 1960s, consumer electronics in the 1970s, and automobiles, machine tools, and several areas of computers and semiconductors in the 1980s. Between 1970 and 1980, the market share of auto imports to the US increased from 4.2% to 22%. The Japanese GDP achieved a “miracle” growth rate of 9% per year during the earlier catch-up phase, fell to a still healthy 4% per year during the transition to higher-value products in the 1970s and 1980s, but began a long recession in the 1990s. Several factors are believed to have contributed to this decline: 1) being a borrower rather than a creator of science and technology, 2) a financial sector that protected inefficient companies, 3) competition from other East and Southeast Asian economies that adopted the Japanese system, and 4) a strong reaction to its protectionist strategy by older industrialized countries.

The German system: In the early 19th Century, German institutional innovations included the modern research university, beginning with Humboldt University in Berlin in 1809, and publicly supported agricultural experiment stations, beginning in 1852. Advances in chemistry, physics, and biology as well as technology transfer from the US and Britain led to industrial dominance in synthetic dyes, heavy chemicals, pharmaceuticals, and electrical machinery by the early 20th Century. After World War II, Germany made a remarkable recovery, with growth only slightly lower than Japan’s, mostly from the reestablishment of the same institutions and industries that were dominant before World War I.

The automobile industry is the major exception to the science-based development of German industry. The Volkswagen project begun by Hitler in 1937 followed the Ford system with a single model through the 1950s, but with two important differences—an emphasis on technical improvements and a cooperative rather than an adversarial relationship with its trade unions. When Toyota exceeded Volkswagen in exports to the US in 1975, Germany responded with a new class of luxury cars from Daimler-Benz, BMW, and Audi that were smaller, more powerful, and more reliable than US luxury cars.

Germany does have some disadvantages, including the highest paid Western workforce, conservative financing that inhibits venture capital, and a small domestic market.  It has dealt with these by a focus on high value products, one of the strongest science and technology bases in the West, and developing the larger market of the European Union.

Discussion of the three systems:  In the 1980s, science and technology development was characterized as mission oriented in the US to support defense, space, and computers, diffusion oriented in Germany to spread advances throughout most industries, and both mission and diffusion oriented in Japan to first develop a few target industries and then spread advances across multiple industries. Trajectories of new technologies have been divided into three phases.  In the post-World War II era, the US has led in the emergence phase that requires sophisticated R & D and flexible financial institutions to develop new trajectories.  Japan has excelled in the consolidation phase that exploits these new trajectories and transfers technologies from one trajectory to another.  Germany has established a superior maturity phase that requires highly skilled labor and production engineering.  Both Germany and Japan lack venture capital systems comparable to that of the US.

In the half-century since World War II, US technological leadership severely eroded as the country became a world leader in inequality and national debt. This probably reflects some combination of an inherently transient lead until competitors recovered from World War II and new factors like globalization and suboptimal exploitation of technology. Proposed solutions for the US have included addressing low savings rates and high capital costs, formation of a strategic or managed trade policy, and more public and private support for commercial technology development.

In any event, science-based industries represent leading sectors that tend to drive and shape technical change and economic growth. The ability of a high-wage economy to compete in international markets is increasingly dependent on the science-based industries that require strength in scientific education and research. Consequently, government direction and assistance are warranted.

Writing at the end of the 20th Century, the author believes that each of the three systems discussed face major difficulties.  The US will be forced to confront its enormous income inequality and large deficiencies in education and health services.  Japan will be forced to modernize its largely traditional service economy, particularly its financial markets, and to change its economic policies beyond those of a developing state.  Germany (and Europe generally) will be faced with completing the economic and political integration of the European Union.

Technology, Resources, and the Environment:   Since World War II, intellectual and populist challenges to science and technology emerged regarding resource depletion, pollution, global warming, and other global environmental changes.

Resource depletion has become less urgent due to induced technical change in resource exploration, backstop technology, and raw material utilization.  From 1870 to 1960, the average private cost of extraction, in constant prices, fell for almost all extractive products.  Also, material use intensity decreased due to dematerialization (smaller automobiles), substitution, recycling, and waste mining.

Pollution remains problematic because open-access resources such as water, air, and natural environments continue to be undervalued in the price system.  Nonmarket solutions were introduced to manage this market failure.  Clean air and water laws greatly reduced air pollution from automobile and industrial emissions, acid rain from sulfur dioxide in power plant emissions, and water pollution from various waste products.  Nevertheless, these gains need to be protected and progress needs to be extended to other areas.  The growth of the environmental technology industry has been driven by regulation or the threat of regulation.

Global warming first became a concern in the 1960s when atmospheric carbon dioxide was found to be 25% higher than at the beginning of the Industrial Revolution.  By the end of the 20th Century, the balance of evidence supported a discernible human influence on global warming that could increase temperatures by 1.8-6.3 degrees F by 2100.  This is a significant threat to human health and agriculture.  While unsafe water, inadequate sanitation, and particulate pollution are found in poor populations, high carbon emissions come almost entirely from rich populations.  (Fig. 12.6)

Preventionists argue that approaching catastrophe requires immediate action, while adaptationists argue that change will be slow enough to rely on market forces.  Limited studies in the 1990s estimate a cost of 2% of GDP for a 50% reduction of US carbon emissions by 2050 and an interval of several decades until benefits exceed costs.  Models show that even a modest carbon tax would be a powerful inducement to bias technology in an energy- and carbon-saving direction that could decrease costs.

UN conferences in Rio de Janeiro in 1992 and Kyoto in 1997 produced nonbinding but fairly broad agreement on the requirements for progress:  1) Substantial reduction in global carbon emissions, with rich countries transferring resources to poorer countries.  2) Development of technologies, including increased nuclear power, solar power, fossil fuel conversion to hydrogen fuel, and storing carbon underground.  3) Creation of institutions for monitoring and enforcement (the most difficult task).  The costs of these changes may be mitigated by new technology and anticipated growth.

US Science and Technology Policy:  The US agricultural and industrial preeminence of the late 19th and early 20th Centuries was not a product of science-based technology, although several federal research bureaus were established at that time.  The Bush report to FDR of 1945 promoted government support for research, particularly basic research, for both national security and commercial applications.  Subsequently, US government support for science and technology markedly increased after World War II and during the Cold War.

By the 1980s, a complex, four-quadrant relationship between government, science, technology, and industry had developed.  These quadrants included government-supported curiosity-inspired basic research, government-sponsored applied science and technology, government and privately-supported use-inspired basic research, and privately funded applied technology.  (Fig. 13.1)  During World War II and the early Cold War, large flows of government resources into Rickover's Quadrant for weapons development, atomic energy, and exploration of space led to "Big Science."

The underinvestment rationale argued that public investment was necessary for a socially optimal result because private firms would underinvest in research that was unprofitable for them but produced beneficial spill-over social goods for other firms and consumers.  By the 1990s, multiple studies showed that social rates of return were significantly higher than private rates of return for investment in basic research and even for applied research.

Critics of the underinvestment rationale argue that necessary information is rarely available for assisting strategic trade industries or for preventing "lock-in" of less efficient technologies.  They argue that greater weight should be given to transfer of technology, since the US has lagged behind foreign competitors in commercial development.  Proponents argue that results of basic research cannot be predicted but increase the number of options for technical development and commercial success.

For allocation of government resources for research, only specialized scientists truly understand the prospects for experimental success for projects in their own fields.  Hence, they strongly advocate allocation by peer reviews according to two important internal criteria:  1) Is the field ready for exploitation?  2) Are the scientists in the field really competent?  However, input by outsiders according to external criteria is also important.  Scientists from neighboring fields help determine relative importance and prospects for cross-over benefits.  Ultimately, allocation is determined by the political process.  Unfortunately, cost-benefit analysis is of limited use for the political process because complexity and necessary assumptions produce fragile and unrealistic results.

Intellectual property policy to encourage development remains controversial.  Patents are thought to provide only weak encouragement to research and limited protection against imitation.  An evolving primary use for patents is in cross-licensing agreements that will likely become the dominant mode of settling intellectual property conflicts.  International cooperation is pursued through organizations and agreements such as the WTO, GATT, and TRIPS (trade-related intellectual property rights).

It is hard to overestimate the role of government policies for military procurement in technology development.  In the US, the defense establishment came to dominate R & D expenditures between World War II and the end of the Cold War.  Commonly cited spin-offs include jet engines and airframes, insecticides, microwave ovens, satellites (for telecommunications, navigation, or weather forecasting), robotics, medical diagnostic equipment, lasers, digital displays, Kevlar, fire resistant clothing, integrated circuits, and nuclear power.

However, military R & D also carries the opportunity cost of whatever civilian technology development may have been foregone.  Also, substantial military innovations in systems management were ineffective in public and private sector environments not conducive to command-and-control.  By the 1980s, the leading role of the military was passing to the civilian economy, with "spin-ons" transferring off-the-shelf technologies from civilian to military applications.

Politics of Science and Technology Policy:  The process of allocation of R & D funds in the US is quite decentralized.  The National Science Foundation (NSF), governed by scientists appointed by the president, was established in 1950 but with a role limited to support for basic research.  Prior to its establishment, multiple other federal programs were already allocating funds for research, including the Departments of Agriculture, Interior, Labor, and Commerce, as well as the National Institutes of Health, the Atomic Energy Commission, and the Office of Naval Research.  Within months of the Sputnik launch, NASA and ARPA (DoD Advanced Research Projects Agency) were added.  Consequently, the flow of federal resources to its own laboratories, to the private sector, and to universities is exceedingly complex.

During the first two postwar decades, federal R & D support expanded rapidly, initially primarily for the military and atomic energy, then with space exploration added in the 1960s.  During the Johnson and Nixon administrations of the 1960s and 1970s, federal R & D expenditures declined as resources were shifted to areas of social needs.  R & D expenditures for energy briefly rose then fell in the late 1970s and early 1980s.  During the Reagan administration of the 1980s, an increased share of federal R & D for the military from 48 to 64% resulted in an overall increase.  With the end of the Cold War, expenditures slowed during the Bush administration and declined during the Clinton administration.  (Fig. 13.4)

Institutional science advice to the president began with the effective collaboration of FDR and Vannevar Bush as head of the Office of Scientific Research and Development (OSRD).  Several reorganizations followed, including formation of the Office of Science and Technology (OST) by President Kennedy in 1962, which added civilian areas to its portfolio and took long term planning from the NSF.  In 1972, during a time of tension with academics, Nixon abolished both the OST and the President's Science Advisory Committee (PSAC) and demoted the advisory function to the NSF.  In 1976, the president coordinated R & D with the help of his science advisor and ad hoc panels of scientists and engineers.  In the 1980s, Reagan shifted this activity from civilian to military issues.

In response to erosion of US technical leadership, technology policy assumed an increasingly important role in the Bush and Clinton administrations.  The 1990 Bromley report by Bush’s OSTP director and 1992 Clinton-Gore report helped to defend the federal role in R & D and emphasized support for commercialization and transfer relative to basic science.  Hence, S & T policy continued the shift away from the Cold War focus on the defense, space, and nuclear fields toward areas that would enhance US competitiveness in world markets.  A 50-50 split was achieved between military and civilian technology before the end of the decade.

Congress has been more hesitant than the president in establishing institutions for scientific and technical advice.  Its only institution has been the Office of Technology Assessment (OTA), which was created in 1973 but assassinated by the fiscal revolutionaries of 1994.  Since the demise of the OTA, the National Research Council (NRC) has been the only objective source with the necessary range and depth for major science and technology issues. The NRC is the operating arm of the nonprofit National Academy Complex that includes the National Academy of Sciences (1863) and associated academies.  Other nonprofit organizations, such as the Carnegie Corporation, the Brookings Institution, and the Rand Corporation, have been helpful for selected issues.  The large population of foundations and think tanks often pursue ideological agendas that result in less useful hortatory and polemic advice.  

Issues of Science and Technology Policies:  Although a federal R & D budget can be "added up," it has never been allocated or managed as a coherent whole.  With increasing budget constraints in the 1990s, Congress sought better information for allocation of resources for research.  Although scientists happily accepted economic analysis showing higher social than private returns for R & D, they feared that cost-benefit analysis by the same methodology would lead to mindless application of incomplete results.  Nevertheless, the author maintains that when applied with skill and insight, rate of return analysis has been exceedingly useful.

Congress needed answers to two important questions when allocating resources for research:  1) What are the chances of advancing knowledge or technology?  2) What will be the value to society of these advances?  The first question can be answered only by scientists or technologists at the leading edge, usually by peer reviews.  The judgements of administrators (even former scientists and engineers), planners, and economists are rarely adequate.  The answer to the second question requires use of formal analytical methods employed by planners, economists, or other social scientists.

State government contributions to R & D have been much smaller than federal contributions.  Local critics have feared “spill over” of benefits to other political jurisdictions and use of funding to serve agendas of faculties other than their own.  Nevertheless, state technology development programs have had some successes, such as the North Carolina Research Triangle Park, Massachusetts Route 128 development related to Harvard and MIT, and Silicon Valley related to UC and Stanford.

Prior to World War II, most science was “little science,” sometimes with big engineering, such as for the TVA or Manhattan Project.  By the 1960s, “big science” had emerged, with monuments such as huge rockets, high-energy accelerators, and high-flux research reactors.  This led to three questions:  1) Is big science ruining science?  2) Is big science ruining us financially?  3) Should we devote a larger part of our scientific efforts to bear more directly on human well-being than big science projects do?  Weinberg answered that big science is here to stay but at the same time should not be allowed to trample little science.  He added that the US should settle on some figure less than 1% of GNP for federally supported nondefense science.

In the 1990s, the Department of Energy (DOE), which employs 30,000 scientists and engineers, is a source of concern about big science, particularly for fusion research.  In other areas, Congress has cut off funding for the Superconducting Supercollider project and reduced funding for the Global Climate Change program.  Increasingly, international cooperation and partnerships will be required for major projects like the human genome, fusion power, space exploration, particle physics, or global ecological problems.  The author projects that political support is unlikely for any new big science in the US.

The federal share of support for university research has been declining since the 1960s.  Public support has been eroded by publicity of ethical issues like environmental controversies, misuse of human and animal research subjects, and isolated scientific misconduct.  Critics have complained about inflated indirect cost recovery for items such as libraries and even a Stanford luxury yacht and about bypassing federal peer and merit review systems by lobbying for congressional earmarks.  Critics have called for downsizing research universities, which have increased from fewer than 50 after World War II to over 400 by the late 1990s.  Of these, 50 accounted for 51% and 100 accounted for 79% of federal research funding.

Three types of US government investment in technology programs are identified as relatively successful.  In procurement-related technology, the government has knowledge of its own needs and the ability to communicate them to suppliers.  Civilian spillover occurs but is not the primary source of legitimacy.  In generic technology, success has come from university-based research largely funded by the NIH.  In client-oriented technology, research has been most successful for specific agricultural missions, such as for higher yielding crops, animal feed conversion, management of soil and water, and economics of farm operation.

The fourth type of investment is the attempt to pick winners for development in commercial markets.  Many regard the results of this as unequivocally negative, such as in housing technology, supersonic transport, and synthetic fuel.  Others argue that government support is needed for projects that are unpromising today but may be promising tomorrow, although support is still more effective in generic research than in commercial markets.  Critics continue to argue that private property rights improve incentives, private participants make more efficient choices, business cooperation leads to more efficient research, university development of technology for private use should be direct rather than indirect, and large public projects make pork barrel problems difficult to avoid. 

A number of points about public funding of R & D have emerged:  1) Public sector support has played an important role in the emergence of every US industry that is competitive on a global scale.  2) The system of intellectual property rights is more efficient for diffusion than generation of technology.  3) As the Cold War ended, the presumption that massive defense-related investment was a pervasive source of spin-off commercial technology returned to the more traditional view of a spin-on relationship between commercial and military technology.  4) Skepticism has increased in both public and private sectors that investments in S & T lead directly to commercial development.

The author concludes that significant constraints are likely for big science in the future.  He regards the issues of whether the US is investing too much or too little in R & D or producing too many or too few scientists as unresolved.  Perhaps the issue of spill-over of US scientific knowledge to the rest of the world should be dealt with by appropriating more of the knowledge generated abroad.  A sharp distinction is made between support of targeted basic research and generic technology and a more narrow policy of "picking winners" in technology development.  In any event, a rate of growth of R & D expenditures that exceeds that of productivity and income will not be sustainable over the long run.

Summary of Findings by the end of the 20th Century:  A succession of general purpose technologies has served as important vehicles for technical change and economic growth throughout the economy.  In the 19th Century, the steam engine powered the industrial revolution.  In the early 20th Century, electricity enabled mass production, communication technology, and consumer electronics.  Throughout the 20th Century, chemistry led to agricultural and military advances, as well as new fibers, materials, and pharmaceuticals.  In the second half of the 20th Century, computers and semiconductors led to advances for manufacturing, services, and consumers.  In the late 20th Century, molecular biology led to the emerging biotechnology industry.  A consistent feature of these general purpose technologies has been a lengthy period between their emergence and their impact—a century for the steam engine, half a century for electric power and computers.

Government has played an important role in technology development in almost every US industry that has become competitive on a global scale.  Examples include research for agriculture, highway infrastructure for automobiles, military research and procurement for computers, and basic research for biotechnology.  Three types of public support have been successful:  1) Direct support for areas with strong government involvement, such as development of the internet by DoD's ARPA.  2) Support for generic technology, such as for molecular biology that led to the biotechnology industry.  3) Support for client-oriented technology, such as agricultural research that led to most increases in plant and animal productivity of the last century.  Also, the US decentralized research system gives it greater flexibility to adjust to global circumstances.

Even for relatively mature industries, advances in technology, particularly process technology, can be important sources of productivity growth and competitive advantage.  Continued growth of agricultural productivity after maturity resulted in decreased costs and decreased workforce share (to less than 2%) that maintained a dominant US position in world markets while allowing transfer of workers to other industries.  On the other hand, the US automobile industry was mature and dominant in the mid-20th Century but then lost both jobs and market share when it lagged behind Japan and Germany in productivity growth as management passed from engineers to business school elites.

Prices of factors of production, particularly for labor relative to capital, powerfully influence the rate and direction of technical change.  However, these factors are influenced by political as well as economic markets.  In particular, the values of environmental resources, formerly regarded as free goods, have begun to rise.  Consequently, technical change trajectories with high energy requirements and material consumption have come under attack as threats to human health and the environment.  These perceptions have already stimulated substantial innovation in environmental policy and law in developed countries.  Nevertheless, very substantial public sector investment in the generation of new knowledge and new technology will be required to achieve sustainability in the 21st Century.

Prospects for Transition to Sustainable Development:  Some resource economists maintain that sustainability merely requires the capacity to supply the expanding demand for substitution of resources and commodities on increasingly favorable terms.  Some ecologists argue that the present system is unsustainable for the natural balance and should be replaced.  A third group argues that sustainability should include social considerations since technical change assaults community values, rural people, and indigenous communities, as well as the environment.  In any event, a successful transition must include enhanced consumption for the vast majority of people now living as well as those to be added to the population in the future.

Wealthy societies that already resist redistribution to address present inequality can be expected to resist intergenerational resource transfer as well.  Economists have proposed that a sustainable path of development gives future generations equal treatment with the present generation.  The strong sustainability rule requires that the stock of natural capital be held constant or enhanced.  The weak sustainability rule holds that the form of replacement for natural capital doesn't matter as long as it is replaced by constructed capital that maintains the same aggregate capital stock.  Even sophisticated models rely on assumptions that may include biases of conventional opinion and a strong temptation to project catastrophe, as in the 1970s projections of ever-higher petroleum prices.
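The difference between the two sustainability rules amounts to a simple accounting check, which can be sketched as follows (the capital-stock numbers are illustrative assumptions, not figures from the book):

```python
def strong_sustainable(natural_before, natural_after):
    """Strong rule: the natural capital stock itself must not decline."""
    return natural_after >= natural_before

def weak_sustainable(natural_before, constructed_before,
                     natural_after, constructed_after):
    """Weak rule: constructed capital may substitute for natural capital,
    so long as the aggregate capital stock does not decline."""
    total_before = natural_before + constructed_before
    total_after = natural_after + constructed_after
    return total_after >= total_before

# Hypothetical case: 10 units of natural capital are depleted while
# 15 units of constructed capital are built.
print(strong_sustainable(100, 90))        # False: natural stock fell
print(weak_sustainable(100, 50, 90, 65))  # True: aggregate rose, 150 -> 155
```

The same depletion thus passes the weak rule and fails the strong rule, which is exactly where the two camps part ways.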

Three basic scenarios are presented for global change between the 1990s and 2050.  The Conventional Worlds scenario continues present trends that lead to a richer but dirtier world with increased conflict, but has the option for improvement by vigorous government action and institutional reform.  The Great Transformation scenario substitutes cultural consumption for material consumption to promise sustainability, with the possibility of even more radical decentralization, small scale, and decreased growth (Ecocommunalism).  The Barbarism scenario leads to institutional disintegration, economic collapse, and intensified conflict, with the possibility of a Fortress World option from authoritarian response in more developed countries.  (Fig. 14.1)

Successful transformation to sustainability will require major cultural and institutional changes from material- and energy-intensive to service- and cultural-intensive consumption.  The more optimistic scenarios posit continued technological change leading to decarbonization and dematerialization.  Productivity gains will be required in the growing service sector if growth of consumption is to continue.  The author's sense is that a substantial number of countries will fail to achieve a transition.  He doubts that the New Sustainability variant will be more than partly realized or that the Barbarization scenario will be completely eliminated.

Substantial progress has already been made in some transitions, such as from rural to urban human settlement, from low to high agricultural productivity, and from high to low rates of birth and death.  Several other transitions appear to be well underway, such as from high to low levels of energy and materials consumption, from low to high levels of literacy and numeracy, and from early to late death.  However, transitions remain problematical for the poorest countries, particularly from failure of institutional development.  Food demand is expected to double in the next half-century.  Less than 10% of health research funding is directed to more than 90% of the world's preventable deaths.  No coherent system is yet in place for substantial institutional changes for environmental concerns.

The issues of substitutability, obligations toward the future, and institutional design are central to sustainability transition.  The sustainability community regards substitutability between natural factors and constructed factors as severely constrained.  If investment and technology can continuously increase opportunities for substitution, constraints on resources could still leave future generations worse off.  If opportunities are even more narrowly bounded and cannot exceed some upper limit, catastrophe is unavoidable. 

For obligations toward the future, some economists propose discounting costs and benefits by some "real" rate of interest, but critics insist this is a dictatorship of the present over the future that will make molehills out of mountains in fifty years.  In any event, efforts must involve some combination of high contemporary rates of saving to defer consumption to the future, high investment in human capital, and more rapid technical change, particularly for resource productivity and substitutability.  In the short run, even the Conventional Worlds scenario seems sustainable.  Over the long run, almost no scenario involving continuing economic growth appears sustainable.
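The critics' "molehills out of mountains" point is just compound-interest arithmetic run in reverse, as this sketch shows (the 5% real rate and the dollar amount are illustrative assumptions, not figures from the book):

```python
def present_value(future_value, real_rate, years):
    """Discount a future amount back to the present at a constant real rate."""
    return future_value / (1 + real_rate) ** years

# An environmental loss of $1,000,000 suffered 50 years from now,
# discounted at an illustrative 5% real rate of interest:
pv = present_value(1_000_000, 0.05, 50)
print(round(pv))  # roughly $87,000 -- less than a tenth of the nominal loss
```

At any positive discount rate the distant future is weighted lightly, which is why the choice of rate, not the damage estimate itself, often dominates the result.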

For institutional designs, the costs of actions that generate the negative externalities for the environment must be internalized for households, private firms, and public organizations.  Otherwise, technological development will be biased along inefficient pathways.  Unfortunately, these pathways are often selected more for their political acceptability or their consistency with ideological commitments than on the basis of objective knowledge.  If humankind fails to navigate this transition, it will be due to failure of institutional design rather than constraints of natural resources or technical innovation.  This is not an optimistic conclusion.


