America, part 2

(Photo: Stars and stripes, courtesy Theodore Lee, Flickr Creative Commons)

As I said in an earlier post, when I’m doing historical interpretation and people ask me “Why do you do this?” I reply by telling the truth: I’m a history buff, and I understand the world around me primarily through the prism of history. So when I became a citizen, I realized that the only way I could make sense of this country (not to mention my own decision to become an American) was through that same prism. However, this brings to the fore some real issues. I’m going to take a swing here at identifying at least a portion of the intellectual challenge. The eventual goal will be to marry the ideological stuff with America’s peculiar relationship with its history and its memorialization of that history. This is only a start.

A big part of appreciating this country’s public history—America’s core sense of itself—is to try to figure out its intellectual foundations. At first sight this seems like it should be a fairly straightforward exercise. After all, we’ve got a solid Constitution to go on. But of course, it’s not that simple. We’ve got the Declaration of Independence to consider as well. Most Americans tend to think of these founding charters as emerging from the same wellspring of Revolutionary fervor. But they don’t. More on that below.

So where to start? Well, we’ve got to go back to the European philosophers. First up, John Milton. Writing way back in the 1600s, Milton was an originator of a philosophical foundation that has supported and been supported by lawyers and journalists since before the founding of the American Republic: the search for truth and virtue. According to Milton, in any free exchange the truth would always emerge triumphant under what became known as “the self-righting principle.” If there is one single philosophical justification anywhere for free and open expression, this is it. In the late 18th century, Milton’s ideas became widely accepted and were then endorsed and promoted by many of the Revolution’s leading lights, including Benjamin Franklin and Thomas Jefferson. Justice Oliver Wendell Holmes is perhaps best known for giving form to modern American notions of the marketplace of ideas; in his dissent in Abrams v. United States (1919), he wrote that the “best test of truth is the power of the thought to get itself accepted in the competition of the market.”

In spite of Milton’s undoubted influence, John Locke can also lay claim to the title of “father of American democracy.” He is considered the father of empiricism, the doctrine that all knowledge (with the possible exception of logic and mathematics) is derived from experience rather than from innate ideas or abstract theory—a core belief of the American creed.

Locke put forward his most influential political arguments and theories in his Two Treatises of Government, first published in 1690. His prescription for good government consisted of three essential elements: the concept of natural law, the social contract between the governors and the governed, and the right of the people to rebel if that contract is broken. Each leads naturally to the next, since the ruling government (or monarch) is a party to the contract and can justifiably be resisted and even overthrown should it fail to deliver its side of the bargain. Locke’s beliefs in natural law and inalienable rights, including the right of popular rebellion, were later seized on by Jefferson, who wrote them into the preamble to the Declaration of Independence: “We hold these truths to be self-evident, that all men are created equal, that they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty and the pursuit of Happiness.” It’s for this reason that one scholar characterizes the Declaration of Independence as a document that, “with its ringing phrases that have been written into the bloodstream of American journalists, was the handiwork of John Locke as much as of Thomas Jefferson” (Altschull, 1990, p. 57).

If Locke originated the concept of empiricism, a great Scottish philosopher and fellow libertarian, David Hume, took the concept a significant step further. Hume, who lived in the period leading up to the American Revolution, helped implant in America the doctrines of empiricism and skepticism. Indeed, if Hume, among all the great European philosophers, holds the position of “chief skeptic,” he is also remembered as the supreme relativist. Hume rejected absolutes of any kind, just as he rejected metaphysics and religion. His value system was totally inductive, determined not by universal principles but by individual cases growing out of general custom and popular practice. In this way his epistemology mirrored the inductive nature of the English common law that still lies at the heart of American and Canadian jurisprudence, and which underpins the broader structure of classical market capitalism. And of course we can’t talk about classical market capitalism without mentioning Adam Smith, whose central contention about the “invisible hand” of the market helped shatter the old system of mercantilism and usher in a modern capitalist revolution.

In a broad sense, it is easy to see the link between Hume and modern American society. His “show me” skepticism, his absolute belief in empiricism over abstract ideas and God-given knowledge, and his theory of cause-and-effect are all traits that remain predominant in American society, law, government, academia, and journalism. They are also beliefs singled out for challenge or rejection by many subsequent European theorists, from Kant to Marx and beyond, whose works had less impact in the United States than in Europe. (Marx also believed in the determinative power of economics and history, something that still holds more sway in Europe than in the U.S.)

The young United States adopted wholesale much of this Enlightenment thinking in 1776 (helped immensely by Tom Paine’s Common Sense). After all, any new political entity forced into creation by its rejection of another system needed to fashion for itself a strong philosophical justification for its actions. Jefferson’s Declaration of Independence certainly did that. It also set the frame for a loose union of states, whose rulers and citizens professed an abiding fear of a powerful, centralized government. Yet just twelve years later the United States, on the verge of falling apart under the weak Articles of Confederation, adopted a Constitution that provided for just such a powerful, centralized government. Much of the work of justifying and interpreting that Constitution was done by Madison, Hamilton, and Jay in The Federalist Papers. This, together with the promise of a Bill of Rights, helped bring the new Constitution into being, albeit against stiff internal opposition. In the process, many of the founders, including Jefferson and later even Madison himself, took serious issue with its centralizing tendencies. This basic tension has remained right up to the present day.

As a result, the country developed, and has retained, a deep schism between two conflicting creeds: on the one hand, a Hamiltonian nationalist, centralizing, patriotic, communal sentiment and, on the other, a populist sentiment summed up by the sociologist Seymour Martin Lipset as “antistatism, individualism, populism, and egalitarianism” (Lipset, 1990, p. 19). I would suggest that the former creed, which tends to come to the fore during wars and other times of national emergency, provides firmer ground for an appreciation of history and the memorialization of that history. The latter belief system, with its preference for personal satisfaction and self-fulfillment over communal well-being, cares less about history and common memory. Lipset also notes: “The United States is unique among developed nations in defining its raison d’être ideologically” rather than, say, historically. It’s this ideological orientation that makes American conceptions of nationhood and nation-building so difficult to pin down, and that sets them apart from those of most other countries, certainly in Europe. It also affects how we do monuments and memorials in this country, something I’ll have to get back to.

So how do we link any of this to an understanding of the importance of monuments and memorials in America today? After all, this deep-rooted tendency toward empiricism, skepticism, and individualism might not seem like ideal soil in which to plant an appreciation of history and common memory. And the powerful communitarian forces that partially offset these tendencies in the 20th century—propelled by the Great Depression, two world wars, and the Cold War—seem to be on the wane. Meanwhile, America’s grip on its own history is shaky at best, and it’s getting shakier. I think these trends are linked. Now, to be clear, I’m not saying that libertarian individualism is bad, just that the pendulum has swung too far in that direction. It needs to come back a tad.

So do America’s ideological origins and its current zeitgeist make an appreciation for history more problematic? Yes. Is this an insurmountable problem? I don’t know. I hope not.

Sources:

J. Herbert Altschull (1990). From Milton to McLuhan: The Ideas Behind American Journalism. New York; London: Longman.

Seymour Martin Lipset (1990). Continental Divide: The Values and Institutions of the United States and Canada. New York: Routledge.

Bertrand Russell (1972). A History of Western Philosophy. New York: Simon & Schuster.
