History of citizenship in the United States

An essay exploring the relationship of citizenship from the 1700s to the present


Abstract

This is a brief history not of a people or nation or place but of a relationship — the relation of citizenship — between a person and the state, and of how it changed from colonial times to the present. In the 1700s, citizenship meant active political participation in local government: showing up at meetings, debating, volunteering for local offices. Over time, however, political participation declined to the point where Americans are essentially apolitical beings divorced from politics. Citizenship has narrowed to a legal marker signifying membership in America. How did this change happen? And what does it mean? Students of history and politics as well as political activists may find this account particularly helpful, but any free person can benefit from it. This essay examines the transformation and argues that declining political participation has drawbacks, but that impressive advances for humanity in general have overshadowed the decline of citizenship.

Note: The first version of this text appeared in Wikipedia under the same title, but was whittled down by fellow contributors for various reasons, and will likely be deleted. Rather than battle on Wikipedia, I moved the article here. Since the first draft, I’ve learned more, which I’ve used to improve this article further. I feel this is the BEST explanation on the web or in print about what happened to U.S. citizenship. And it’s free for everybody. Further, I am striving to make it even better; constructive comments are highly welcome. If anybody can point me to a better explanation of what happened to citizenship, I’m interested. This article is public domain; feel free to copy it as you wish (pictures and quotations may have attribution requirements). Like it? Add weblinks pointing to it. — tom sulcer, February 2011

Colonial years

Town hall meetings and direct democracy

Early European settlers in the New World braved many dangers with little support from their home countries. They were on their own. There were no established governments or armies or bureaucrats from the mother countries. Circumstances forced them to cooperate. Their situation was similar to that of ancient Greece, where cooperation was required for the Greek phalanx fighting method to be effective, and the military

Jamestown, Virginia, was one of the first European settlements in the Americas. Settlers were largely on their own.

necessity favored the political growth of democracy.[1] A tradition began in New England of regular town meetings to coordinate activities.[2] If neighbors wanted a schoolhouse, for example, they had to build it themselves. Survival demanded participation from people thinking rationally about how to tackle problems. It was founded on an understanding of equality.[3] There was a military component too; neighbors had to band together for protection in the event of attacks by Native Americans. People had to act as citizens. The meetings were examples of direct democracy which gave people a chance to bring up any issue of their choosing, and any attendee with a reasonable request had a chance to have that request heard.[4][5] Citizens learned the skills necessary for self-government: debating, thinking, compromising, listening. The group exerted a natural peer pressure to encourage civility and politeness, and obnoxious or obstreperous persons met with disapproval. Many volunteered to serve in elected one-year offices such as treasurer, postmaster, clerk, and justice of the peace.[6] By dividing the work, nobody became too powerful or overworked; by widening participation and shuffling positions,

Town crier in Provincetown, MA, one of many volunteer posts.

office holders learned political skills hands-on, up close, every day, and gained valuable experience.[7] People developed respect for their neighbors. They knew each other. People could see who the most able thinkers and legislators were and select them for higher office or more responsibility. Later thinkers such as Alexis de Tocqueville saw these meetings as important incubators of liberty.[8]

Senator William E. Borah of Idaho, writing in 1922, argued that a granite monument should stand in whichever town could prove it had held the first New England town meeting.[9] Borah described these meetings:

“Once each year every man residing in the limits of the township came, gave full expression to his views and had his vote counted. All affairs of government were here discussed and passed upon, policies were outlined, accepted or rejected — publicity in all public affairs was a reality and not a pretense. They chose their Selectmen, town officers and finally came to vote for their State and Federal officers — and were not haunted or harassed by the doubts and fears of the modern statesman whose erudition leads him to question the judgment and stability of the masses.”[9]

There was real interaction between governors and governed.[10] And decision-makers were close to their decisions — close in terms of distance as well as close in terms of time — so people could see quickly in a hands-on way whether a particular law was effective or not, and modify it accordingly. Participation fostered civic responsibility.[9] Extensive citizen participation meant that government functioned properly with little expense and government could change course rapidly when decisions were imperfect.[9]

“Local self-government in all the term implies, active, vigorous, vigilant, jealously guarding and governing all matters of local or domestic concern, drawing the citizen for a season away from private affairs and enlisting his energies in public matters, identifying him with the actual needs and doings of the State and Government, are indispensable to a healthy, durable Federal system. Our fathers understood this well, and were wise and cautious in jealously guarding it when they came to frame the Federal system. If they were wise to preserve it, their children will be wise to continue to preserve it. — Senator William Borah from Idaho, 1922.”[9]

Writing at the end of the twentieth century, Jean Elshtain commented on Tocqueville’s sense of citizenship:

“Alexis de Tocqueville, in his classic work Democracy in America, argued that one reason the American democracy he surveyed was so sturdy was that citizens took an active part in public affairs. This is important because participating in public affairs means one must move from exclusive and narrowly private interests and occasionally take a look at matters that concern others. In Tocqueville’s words, As soon as common affairs are treated in common, each man notices that he is not as independent of his fellows as he used to suppose and that to get their help he must often offer his aid to them. In this way civic engagement helped to underscore what Tocqueville called self-interest properly understood, an interest that was never narrowly focused on the self.” — Jean Elshtain (circa 2000), quoting Tocqueville (1840s).[11]

Educational aspects of town meetings

Town meetings had an educational aspect. Thomas Jefferson called the New England town meeting “the best school of political liberty the world ever saw.”[12] It taught people how to use reasoned arguments, how to

The Declaration of Independence is presented to Congress.

compromise, how to think and speak and persuade, and citizens skilled in this give-and-take later rose to become important statesmen in the original Thirteen Colonies. By example, older citizens could teach younger ones how to participate. Borah wrote that "local rule is the great university in which is reared and trained and equipped the kind of statesmen who take care that no harm comes to the Republic."[9] These meetings helped teach people “how to use democracy” and “enjoy it,” according to Tocqueville.[5]

It was by no means perfect. Professor Benjamin R. Barber admitted that democratic politics could be “fairly raucous but with certain limits” and sometimes manifested a “rhetorical incivility within the boundaries of bipartisan politics” but overall he concluded it was “a healthy manifestation of political conflict and

Slavery ended after the Civil War.

disagreement.”[13] Some Framers distrusted these meetings; James Madison wrote that at town meetings, “passion never fails to wrest the scepter from reason.”[5] Citizenship was limited to adult white men and further restricted by property qualifications which narrowed the pool of eligible voters, and the process of attending meetings tended to favor persons from the upper classes. The glaring exception to citizenship was, of course, slavery: slaves were neither citizens nor holders of basic rights, and this issue caused serious national mayhem throughout the nation’s history and continues to reverberate in the present. It was a rudimentary system. One writer (2003) described this early democracy this way: “For the founding fathers, the ideal citizen was a white, property-owning male whose vote was a ratification of a fellow prominent citizen’s trustworthiness to lead … Contemporary democratic staples like freedom of the press, party politics, open deliberation, political campaigns, and even widespread public education were not considered vital elements for citizenship in the colonial period … Our first version of democratic citizenship was, in Schudson’s analysis, a politics of assent.”[14] Nevertheless, Newsweek reporter Jonathan Alter wrote that “the New England town-hall meeting was the earliest form of American democracy and it remains the best place to watch, listen, ask questions and then go home and think.”[15]

Writers such as Solzhenitsyn believed “democracy works well in small units where the voters know the candidates personally and exercise self-restraint” such as in Switzerland or in New England town meetings.[16]

The public sphere

Piazza della Signoria in Florence, Italy. Town squares in Renaissance Europe embodied the concept of the “public sphere” architecturally; citizens met and discussed public matters there. Photo by Samuli Lintula (2006), CC 2.5.

As American towns grew, a phenomenon that had evolved in Renaissance Europe took hold in America as well. The public sphere was a space between private individuals and government authorities in which people could meet and have rational-critical debates about public matters. Discussions served as a counterweight to political authority and happened physically, in face-to-face meetings in coffee houses, cafes, and public squares, as well as in the media of letters, books, drama, and art.[17] According to democratic theorists such as Jürgen Habermas, it was brought on partly by merchants’ need for accurate information about distant markets as well as by the growth of democracy, individual liberty, and popular sovereignty. Habermas saw a vibrant public sphere as a positive force keeping authorities within bounds lest their rulings be ridiculed. In effect, public opinion became a check on government power.

The American Revolution

The United States Constitution

By 1776, many citizens had such skill and experience in self-government that their generation produced an outstanding cadre of first-class thinkers educated not only in the political wisdom of antiquity, but in the nuts

Common Sense, 1776.

and bolts of governing, and their collective skill surpassed that of the British Parliament and monarchy. Despite their relative lack of financial resources, and despite their general lack of education compared with their counterparts in Great Britain, Americans could see that they were not considered true citizens in the British world, since they were denied basic representation in Parliament. To be fair, it took a dissatisfied Briton named Thomas Paine, writing the persuasive pamphlet Common Sense, to help Americans see their discontent in the context of history and world politics, and to crystallize their anger into action. To their credit, Americans read Common Sense avidly and quickly grasped its logic. They paid attention. They rightly protested taxation without representation. It led to a successful revolution.

Adam Smith.

During this time, huge changes were underway worldwide. In Europe, feudalism was breaking down, but monarchs were struggling for power with well-entrenched nobles who had a power base in owning serf-worked land. The industrial revolution was in its infancy, with key discoveries by Newcomen on steam-powered pumps which freed industries from having to locate near fast-flowing water sources. Humans were learning how to exploit fossil fuels such as coal and oil instead of exploiting other humans. An intellectual enlightenment was beginning, pushed by advances in printing, paper, and book technology and the free circulation of scientific pamphlets. Adam Smith published The Wealth of Nations in 1776, which suggested, in part, that nations could compete with each other by advancing their own economies. As professor David Christian pointed out in his Big History lecture course, most so-called agrarian civilizations were built on the exploitation of a peasant class, such that nine of every ten humans were peasants with little incentive to produce more or think or invent; peasants didn’t benefit by working harder. Ancient civilizations in Egypt, Mesopotamia, India, China, and Rome generally discouraged merchant activity and economic competition. But by 1700, this anti-mercantile mindset was breaking down, and humanity was poised to leap to new levels of power and dominance.

In Europe, nation-states competed for wealth and power and dominance, but ambitious rulers hungry for land and seeking funds for ever-expensive armies, and seeking brains to invent new weapons, ran into the obstacle

Conquering Europe is difficult because of its geography, although Napoleon held this large swath of territory for a short while.

of the peculiarly divisive geography of Europe. Thinkers such as Jared Diamond, in his excellent Guns, Germs and Steel, pointed out how tall mountains separated France and Spain and France and Italy, protected Switzerland and much of Austria, and how large ocean channels and straits separated island nations such as Britain and Ireland as well as much of Scandinavia. As a result, it was difficult for one imperial land power to impose its will on all of Europe. During those times when a ruler such as Napoleon became too powerful, Britain would form a coalition with the weaker continental powers to balance against him, leading to a stalemate which bolstered the status quo of a Europe divided into different kingdoms or states. Whenever a monarch tried to squelch merchant activity, perhaps with prohibitive taxes or extensive regulations, aggrieved merchants could flee to neighboring nations, set up shop, and win protection from a different sovereign. Commerce could be squeezed, stifled occasionally, but not killed; commerce was like wet clay oozing freely from a tightening fist. Since there was competition to entice the finest thinkers to come to one’s country to invent new weapons and processes, rulers were pressed to attract them with promises of free speech, rights, and free travel as well as protection from crime and onerous taxes. There were numerous wars, often over religion, which taught Europeans the futility of trying to impose religious ideas on others; thousands of people died in useless religious wars. Despite the petty warring, the essential national boundary lines, usually based on language, stayed fairly firm.

The Alps are one of many geographic barriers in Europe that make it hard to conquer the entire continent.

Europe was, in a sense, a geographic lesson in capitalism, a continuing experiment demonstrating that governing by exploiting people does not work. Rather, the key lesson for people and governments alike was that success lay in empowering merchants, in empowering people, in fostering invention, in protecting individual rights, in building environments in which individual humans could flourish. It was a style marked by protecting rather than impoverishing people, by fostering free trade rather than squelching commerce. The lesson was: empower individuals. Free them. Protect them. Set them loose. Respect them. Let them learn. Tolerate dissent. Nations which did these things prospered; nations which didn’t, didn’t. The United States was born at a lucky time, when the evolutionary thinking of several thousand years of human civilization, together with the peculiar decentralizing geography of Europe, culminated in a brilliant new approach to governance.

A patent cover.

Many principles underlying the American legal code, such as jury trials, adversarial argument, the separation of judicial functions, and stare decisis (respect for precedent), had survived from ancient Greece and particularly Rome and were inherited through Britain’s common law approach. A working legal code was needed to protect commercial transactions and to uphold contracts. The Constitution’s Framers were serious Enlightenment intellectuals who benefited from the collective learning of generations, and for the most part they made wise choices which bolstered commerce and encouraged inventiveness through legal devices such as patents.

They built representative government into an intelligent federal system, established checks and balances to thwart ambitious politicians, and devised a government which could practically run on autopilot even with substandard and lackluster leaders. America’s federal system mimicked Europe’s decentralized nature since particular states, such as New York and New Jersey and Pennsylvania, were like separate national governments charged with regulating commerce within their borders, but which competed with each other to govern effectively. A business, taxed badly by Massachusetts, could move to Vermont, for example; this kept state governments scrambling to regulate wisely to keep from losing citizens and businesses. Dividing power among many state governments was one way to help people stay free, since people could vote with their feet when a particular state government became tyrannical or tax-happy or clumsy. Since state governments were forbidden to raise national armies or conduct foreign diplomacy (a temporary exception happened during the Civil War), this was a highly sensible arrangement.
 

It’s the year 1700. Tell someone that they could use a hand-held device to talk with anybody, anywhere in the world, who has a similar device. They’d think you were crazy.

The Framers could not have foreseen the terrific upcoming revolutions in power, science, learning, and population growth. For example, as professor David Christian points out in Big History, if a person in the 1700s had predicted that humans, within three hundred years, would fly in planes, explore the ocean in submarines, fly rockets to the moon, play music from a credit-card-sized gadget storing hundreds of pre-recorded songs, and take pictures with cameras and send these images in milliseconds around the world as image files — and not just kings but most people — then others would have thought such predictions were preposterous, ridiculous, crazy. If somebody had told a person of that era, “you’ll be able to talk with a person instantaneously on the other side of the world,” the listener would have thought the speaker was nuts. People thought more along the lines of Thomas Malthus, the gloom-and-doom demographer who predicted that population explosions would trigger famines, wars, and huge disasters. Malthus was wrong. Human population was set to boom incredibly, and boom it did, growing from perhaps five hundred million people worldwide in the 1700s to SIX BILLION people today (NINE billion forecast by 2050). Further, humans figured out incredibly clever ways to feed all of these people while reducing hunger and starvation. While the Framers could not have foreseen these incredible changes on the horizon, they crafted a shrewd document with much flexibility built in. They expected population to grow, but not to explode exponentially.

The Constitution was great, but not perfect. It solved problems such as population growth with a nifty representational scheme; Aristotle had predicted in The Politics that democracy would fail as population expanded beyond the “small compass of Greece’s mass or town meeting,” but having representatives chosen by election to represent others proved a wise choice.[18]

There were also some unintended consequences of the constitutional structure which the Framers might not have foreseen or understood. For example, the Constitution’s winner-takes-all approach to elections strongly favored the growth of two and only two political parties — left and right (now Democratic and Republican) — marginalized what we call third-party candidates, and led to a two-party system. Third-party or independent candidates rarely win elections; this is true for almost all congressional, state, and federal elections (the strongest recent third-party showing in a presidential race was Ross Perot, with roughly 19% of the vote in 1992). The lesson for voters is repeated subtly again and again: voting for a third-party candidate is wasting one’s vote. Some analysts have suggested that having only two main parties brings a kind of stability helpful for business growth, since it encourages voters and candidates to think along mainstream party lines. Other analysts have criticized the two-party system as muffling the voices of dissenters, independent thinkers, and other outsiders, since they have no practical chance of winning a seat in any legislature. Divergent opinions are excluded; people at the political fringes (who may sometimes be right) are effectively blocked from the forum of public opinion, and mainstream voices do not get the benefit of having their thinking challenged. In contrast to the two-party system, many democracies around the world have a parliamentary system, often with party-proportional representation — if a party wins 23% of the popular vote, it wins roughly 23% of the seats in the legislature. This enables an upstart party to quickly get a voice in national politics, and pushes the major parties to heed its views and adjust their own programs. My personal sense is that debate is healthier and more robust within legislatures under a parliamentary system, since debate includes fringe and extreme views as well as the mainstream.

Signing of the U.S. Constitution. The word “citizenship” did not appear in the original 1787 document. Painting by Howard C. Christy.

But a key flaw, in my view, was that the Constitution failed to define citizenship. Who was a citizen? What was citizenship? Was it a duty, a privilege, a right? What exactly did citizenship mean? These issues were not addressed. Some historians see this as an oversight, while others suggest it was a planned omission guided by Federalists who distrusted the public. One writer suggested the American system was built on preventing a “badly educated populace” from making poor choices, and accordingly devised a system which dispersed power and “filtered the whims of the masses through an elected body and dispersed power by dividing the government into three branches.”[19] Anti-Federalists such as Patrick Henry argued for a Bill of Rights, and these amendments were included, with the support of James Madison, who came to see their wisdom. But failing to define citizenship, and to clarify in writing exactly what it meant, was, in my view, a grievous error which has had repercussions to the present day.

The new American government faced conflicting pressures. The Americas were relatively uninhabited compared with other continents, and the need for skilled labor created pressure to encourage immigration to help build the country. In the southern regions, the institution of slavery solved this problem for the time being in an old-school way, by exploiting people rather than resources and technology. Slaves couldn’t vote, but immigrants could be granted suffrage rights, and what kind of effect would these new immigrants have if they voted? When John Adams was president, Federalists in Congress assumed that many immigrants would vote for the Democratic-Republican party and not for Federalists, so Congress passed the Alien and Sedition Acts in 1798, which extended the period of residence required for naturalization. Thus, immigrants had to reside in the United States for fourteen years, not merely five, before they were eligible to apply for citizenship.[20] A recurring pattern throughout American political life is that the party in power favors immigration if it believes that immigrants will vote its way, and makes immigration rules difficult if it foresees future opponents. As a result, rules regarding immigration, built up over centuries, have become complex.

Alexis de Tocqueville

Alexis de Tocqueville.

In the 1830s, visiting America, Alexis de Tocqueville thought that a powerful influence guiding the destiny of American democracy was the principle of equality.[21] Unlike in Europe, in America nobody saluted clergy or professors, for example. People treated each other equally (with the painful exception of slavery). And, as Tocqueville saw it, the natural human yearning for distinction and respect could not be satisfied through feudal inherited structures, but rather through one’s commerce and industry, and he saw a feverish hunt for wealth everywhere in America. A national focus on economic betterment brought many advantages, but one casualty was declining civic participation. Helping out in town affairs didn’t pay much, generally; as frontier dangers receded and the population expanded, many citizens stayed away from meetings and instead pursued jobs and careers and money, or simply stayed home. Participation in government, after all, wasn’t required; it was not a duty specified by the Constitution, and people showing up for a community meeting couldn’t force the no-shows to attend. Sometimes declining attendance was welcomed by attendees, since it helped some decisions happen faster and with less conflict, and it gave attendees relatively more power to decide matters. The idea of freedom as freedom to be left alone was given philosophical credence in the mid-nineteenth century by the philosopher John Stuart Mill, who wrote On Liberty. Mill explained: “The only part of the conduct of anyone for which he is amenable to society is that which concerns others. In the part that merely concerns himself, his independence is, of right, absolute. Over himself, over his own body and mind, the individual is sovereign.”[22] But it wasn’t clear whether people-as-citizens had a duty to others to participate in government, or show up at meetings, or follow politics. And social pressure to cooperate politically began to erode. Fixing one’s house or raising one’s salary or expanding one’s business brings a direct benefit, while debating in a town council about where to build a new firehouse, for example, brings only an indirect benefit, and direct benefits usually trump indirect ones; this is another phrasing of the famous problem of the commons.

Declining citizen participation in town governments was balanced, to some extent, by participation in associations. Tocqueville observed that “Americans of all ages, all stations of life, and all types of disposition are

The Knoxville garden club planted shrubberies. Photo: B. Stansberry.

forever forming associations.[23] There are not only commercial and industrial associations in which all take part, but others of a thousand different types — religious, moral, serious, futile, very general and very limited, immensely large, and very minute.”[23] The associations formed bonds between people and helped people solve local problems locally. A volunteer garden club, for example, could plant flowers in public parks, helping beautify towns without costing the town money; it was a form of civic participation, in a sense, since it was related to the task of governing in a tangential way, according to Professor Cook in his lectures on Tocqueville. Tocqueville thought town meetings were a “marvel of municipal freedom” and he was impressed by how people could settle their affairs “with no distinction of rank.”[24]

Declining civic engagement

Still, with fewer people showing up at local government councils to volunteer as officers or workers or knowledgeable citizens in town meetings, when municipal problems required action, such as road repairs or leaks in school roofs, there weren’t enough local volunteers to tackle them. If a new schoolhouse was needed, there weren’t enough volunteers to build it.

One-room schoolhouse on Prudence Island, Rhode Island.

Accordingly, town governments faced a choice of (1) raising taxes to get funds to hire a local contractor to do the work of the no-shows, or (2) asking a higher level of government, such as a county or state government, to solve the problem. As a result, taxes inched up, and control over decisions moved to higher and higher levels of government. As decades passed, decision-making power left the town governments as county and state governments were asked to cope with local problems; eventually even the federal government became a primary problem-solver, though this mostly did not happen until the twentieth century.

On a bigger level, one could compare the growing United States to a growing organism moving towards increased complexity, differentiation, and centralization. For example, cells

Stars formed like capital cities; large ones expanded from smaller ones, becoming more complex. Pictured: the Orion Nebula.
in a fetus are undifferentiated initially, and can develop into many different cells — they’re like jack-of-all-trades cells — but as the fetus develops, these cells multiply and begin differentiating into specific types, such as skin cells or bone cells, or into complex organs such as the heart or liver or brain. The cells of the United States, individual persons such as local people meeting in New England town halls, were undergoing a similar process towards greater expansion, differentiation, and complexity. Some became farmers; others became doctors or lawyers or newspaper reporters; and some became politicians. Seen this way, it was a natural process that decision-making left small towns and moved towards state governments and Washington. Professor Christian explained how growing civilizations have a tendency to pull educated people into large central cities where most important decisions get made, and compared this process to star formation in the galaxy, in that large stars have greater gravity and, as a result, attract even more hydrogen and helium atoms towards them, making the star even larger. A capital city, like a large star, has greater gravity than a provincial town, and pulls people to it like a magnet. Today, a civilization such as the United States is a highly complex system with several hundred million people and billions of different highly integrated parts, technologies, and capabilities, with people often playing very specific roles focused on very narrow specialties, so much of this specialization is understandable, including political specialization. But, while undergoing this specialization, did people have to abandon their role as political participants?

In nineteenth-century America, a slew of cause-and-effect relationships pushed people away from political participation.

  • Feedback loop moving power away from local governments. Once a type of decision was hiked up the chain of government, authority didn’t revert back, since officials at higher levels had more money and power to exercise, and clung to these powers as best they could. Town agendas shrank to mundane tasks like zoning decisions or garbage pickups, and since there were fewer matters to decide locally, people had one more reason not to attend town meetings — why attend local meetings if there was little to decide? How important was it to argue over garbage pickups, for example? So there was a kind of feedback loop working against local government, eroding local citizens’ participation, which meant, in turn, that fewer people learned the vital skills of self-governance: they didn’t get a chance to debate, think, and use reason to solve problems, and didn’t have to learn how to forge compromises by listening to differing viewpoints with respectful patience. People began to lose touch with their neighbors, and a vital training ground for politicians to learn hands-on democracy was dying at the source. On the other hand, people had more time for their jobs and businesses and inventions and artistic creations.

  • Declining political participation compensated by paying taxes. If people participated in politics less, they could compensate for their non-participation by paying taxes. They had to. Local governments needed money to provide services such as schoolhouses and bridges and lacked volunteers to do these tasks. As the economy expanded, people had more money to pay taxes. It was a tacit compromise. If citizens didn’t show up at a town meeting to argue against a proposed tax increase, then how could they complain when taxes were, indeed, raised? Increasingly, as the nation expanded, citizenship was marked by the payment of taxes as a quasi-substitute for participation, although people still had the power to not pay taxes, or threaten to not pay taxes, as a check on growing government power.

  • Self-selecting candidates and career politicians. Fewer citizens participating in government meant that the ones showing up had relatively more power, particularly at higher levels of government. Politics became a full-time game for professionals tempted to rig rules to engineer re-election, and with fewer eyes watching, and more money changing hands to solve problems, corrupt officials could hide mischief or jigger election rules with gerrymandering to make it easier for incumbents to win re-election. Statesmen were replaced by career politicians. When townsfolk all participated in local councils and paid attention to discussions, people could see who were the best and brightest, choose these few to be their representatives, and coax them into representing their town in higher posts in county or state government or even in national assemblies such as the Continental Congress. But as time went by, and fewer townsfolk paid attention to local meetings, townsfolk were less able to discern who were the best people in local government and identify which potential leaders were smart, articulate, honest. In this vacuum, ambitious people could push themselves forward as candidates instead of being chosen by others, and since there were fewer eyes minding the store, it was harder to choose the best candidates. There was a shrinking pool of people from which to choose a candidate as well. So, gradually, the practice of knowledgeable citizens selecting leaders was replaced by candidates selecting themselves. And these self-selecting candidates were less likely to bring a benevolent, impartial fairness to politics, but often were motivated by their own purposes, including needs for power, respect, and money. Political parties became organized into vast public relations machines to get candidates elected. There was concern that professional party organizers and behind-the-scenes power-brokers had too much influence.[18] But politicians, in office, could find ways to steer money to their constituents and themselves. It was a vast money trough. A ruling buried inside a complex legislative decision could divert dollars to hidden pockets. Politics became progressively more complex, corrupt, dirty, and less well understood, and this further turned citizens away from politics.

  • Government employees. During the nineteenth century, government grew slowly as an employer, but this process accelerated during the twentieth century, particularly during the New Deal. But when government employs a citizen, there’s a conflict of interest, since the citizen’s boss is, in essence, the government. The government employee has trouble being an impartial player or referee regarding government decisions, since in many situations he or she has a vested interest in gaining more power and pay. For example, suppose a town with three schoolhouses needs a fourth, but it can’t build one with volunteer citizen labor because there aren’t enough people showing up at town meetings to volunteer to hammer the structure together. So the town hires an official to build schools. This official has a vested interest in building even more schools, not merely the fourth schoolhouse but perhaps a fifth or sixth as well, even though the town may not need these extra buildings, because the building activity helps justify the job. When whole classes of people became government employees, such as postal workers or road repair workers, the conflict-of-interest issue became more vexing. On a macroeconomic level, it was perhaps unavoidable that centralized government would grow in size, in the same way that a fetus gets a bigger and bigger brain.

  • Growing distance between decision-makers and decisions. In colonial days, when townsfolk made a law, they could see up close and quickly whether the law was working, and amend it accordingly if it wasn’t. But as more distant governments were making decisions about local matters, the rules were less likely to be effective and took longer to amend. To be fair, there were some pluses when state governments took over some types of rule-making: rules were broader, more uniform, and beneficial in many instances. For example, statewide rules about street signs were more likely to be uniform in ways that aided travelers, and this was a positive development. But there were instances in which the broad-brush approach to local problems didn’t work well.

  • Receding military threat. Dangers posed by hostile Native Americans receded quickly, and there was little danger of invasion by British forces from Canada, apart from the War of 1812. As a result, it was generally not necessary for citizens to band together for mutual protection and defense as in the days before and during the American Revolution. Citizens didn’t need to meet and agree about defense. Accordingly, citizenship came to be seen increasingly as a right or entitlement rather than as a necessity for survival against attack, since external authorities such as the federal government could make such decisions for them.

Positive factors redefining citizenship

While these developments constricted the political aspects of citizenship, some highly positive forces expanded citizenship, but in a direction away from self-governance. Since people were freed from local civic commitments, Americans could focus full-time on money and jobs and self-betterment and careers and businesses. They could invent new labor-saving gadgets. New opportunities sprouted forth for shoppers and consumers to select from a wide assortment of ever-expanding products promising to make life comfortable and fun and entertaining. An American with a steady income from a job or business didn’t need local government or other citizens. Goods and services could be bought with cash. A booming economy meant freedom. Opportunities for education increased as well, and scientists, by sharing knowledge, could advance new cures for diseases and invent new medicines and therapies. In addition, a rule-bound legal system based on precedent and a commitment to stare decisis (from Latin, roughly meaning “let the decision stand”), enforced through a hierarchy of courts ranging from municipal courts to the Supreme Court, meant that there were protections for businesses and consumers and a legal establishment which could enforce contracts. In short, this was an effective foundation for economic growth. The expanding legal code protected people and built on the concept of rights. Further, newspapers and the media acted as a check on government corruption, often exposing serious flaws and leading to efforts at political reform. With the advent of a two-party system, the nation could alternate periodically between a left-leaning pro-labor, pro-farmer orientation and a right-leaning pro-business, pro-merchant, pro-trade orientation, and this regular peaceful change between conflicting ideological orientations every few decades or so helped prevent corruption, pleased different groups within society, and kept a balance of power.

A legal system protected business.

Economic activity needed a strong legal system to enforce contracts as well as to manage financial instruments such as mortgages and deeds and to handle disparate aspects of business law. The introduction of complex financial vehicles such as stocks and bonds, as well as increasingly complex risk-management products such as insurance, further developed the world of business law. Consumer protection became important. A strong legal culture, with respect for rights and property, made it possible to emphasize the legal aspects of the relation of citizenship.

It is difficult to overstate the importance of technological advances. Humans harnessed power using amazing new machines. The cotton gin revolutionized agriculture and the spinning jenny mechanized textile production. Machine tools, gas lighting, combustion engines, and electric

Newcomen’s steam engine.

power empowered people and businesses and societies unlike anything in the past. Steamboats and rail travel meant people could travel across the country in days, efficiently, safely, with a minimum of fuss. The power of an individual human to travel, to make things, to create new metals or chemicals, was increased a hundred-fold. The industrial revolution empowered all people, particularly the middle classes. Americans migrated westwards, and the new mobility had an unsettling effect on a town’s political rituals. New communities sprouted overnight. The inflow and outflow of people made it difficult to maintain community traditions such as regular town meetings. Factory jobs on a rigorous round-the-clock timetable sometimes kept workers from attending town meetings.

People’s lives were being transformed in positive ways, such that citizenship duties, particularly political participation, became relatively less important. Civic duties receded into the background. Was it important to attend a town meeting when a factory whistle beckoned? Was reading the newspaper’s local political pages important when one might be moving west next year?

Stowe’s “Uncle Tom’s Cabin” helped people think.

Empowered middle-class people, mobile, educated, and literate, had an increasing awareness of the people around them. Their lives were good, but what about the lives of others? Increasing mobility meant people could meet others from different backgrounds and classes and compare notes. Were their lives as good? Were they prospering? Was life fair? Centuries-old traditions of male-female gender roles came into question; was it right for women to be stuck at home while men got to have exciting careers? And, when women were needed to work, why couldn’t they vote? And why was there a whole caste of black people held in slavery?

Gradually, over time, citizenship became less defined by civic participation in local government and more defined as a legal matter. Citizenship wasn’t defined by how one contributed in government; rather, it became more of a legal status, a kind of membership in America, a right to vote, a right to work, a right to make money. Adult men were thought of as citizens of the United States as well as citizens of their respective state, such as New York or Connecticut or New Jersey, but they probably didn’t think about it much. The growing nation didn’t face serious military threats and, as a result, it wasn’t necessary for all citizens to cooperate for the successful prosecution of wars; rather, a volunteer or paid army was generally sufficient to meet early threats, at least until the American Civil War.

The Civil War

Expansion of the citizenship franchise

The aftermath of the Civil War brought a slew of changes which affected citizenship dramatically. African-Americans were freed from slavery and defined by a new amendment in 1868 to be “citizens”; in fact, this was the first time the Constitution defined citizenship, and the Fourteenth Amendment declared that all citizens

Naval battle on the Mississippi River, April 24, 1862. Colored lithograph published by Currier & Ives, 1862.

had equal rights under the law. The amendment read: “All persons born or naturalized in the United States, and subject to the jurisdiction thereof, are citizens of the United States and of the State wherein they reside.” It defined birthright citizenship: if a person was born in America, he or she was an American. While free and technically citizens, African-Americans were excluded from important parts of the burgeoning economy and were, in effect, second-class citizens because of treatment such as segregation, which the Supreme Court upheld under the doctrine of “separate but equal” in Plessy v. Ferguson (1896). Since the meaning of citizenship was shifting from political participation to legal status, it was possible to admit a whole new group of persons — African-American men — as citizens without shaking up the political order. Citizenship was a token, a legal marker, a label, and while it conferred certain legal rights as well as the power to vote, these could be suppressed and distorted. What was important was the ability to prosper in America’s burgeoning market economy, and this depended on being included as full members in this economic whirlwind, and it was easy enough for whites to simply ignore African-Americans. Being largely excluded from economic participation, most African-Americans couldn’t enjoy their newfound freedom.

States’ rights

Since the South tried to revolt under the banner of states’ rights, and lost, the concept of states’ rights was discredited to an extent, according to a subsequent analysis by Senator Borah writing in 1922.[9] Borah thought

Diagram of the federal structure, 1862, showing how the national government was becoming preponderant, even during the Civil War.

that the Constitution divided powers so that all matters of domestic concern and local interest were given to state governments, while matters for the general government were given to Washington, and wrote that “upon the integrity of the States after all rests the integrity and permanency of the Union — that upon the principle of local self-government rests the perpetuity of republican institutions.”[9] Thinkers such as Tocqueville, as well as Abraham Lincoln and Supreme Court Justice Harlan, believed a federal system would work best in which individual states had autonomy to govern themselves, since it allowed people dissatisfied with a particular state government to vote with their feet by changing states.[25] Given choices between states, citizens, by being free to move and change states, had more freedom. Supreme Court Chief Justice John Marshall wrote, “No political dreamer was ever wild enough to think of breaking down the lines which separate the States and of compounding the American people into one common mass.”[9]

But winning the Civil War allowed the federal government to impose its will on the former rebel states, and a precedent working against states’ autonomy had been set. The natural feedback loop pushing decision-making away from local town councils now could push control away from state governments to the national

Power flowed away from local governments.

government in Washington. In addition, state governments lost more control when, in 1913, the Seventeenth Amendment took away state legislatures’ power to choose United States senators, which removed an important voice they had had in national politics. The Supreme Court could use broadly worded parts of the Constitution, such as the Commerce Clause and the Fourteenth Amendment, to assert federal power over state government decision-making in cases such as Lochner v. New York (1905) or in later cases such as Brown v. Board of Education (1954) or Roe v. Wade (1973). It might be argued that the decrease in the power of state governments was one more manifestation of the centralization of the American republic — power was increasingly concentrated in Washington, D.C., in the same way that economic power was moving to large cities like New York. In any event, diminished state power affected citizenship indirectly, since a power base for citizens was eroding.

Twentieth century

Population increase and citizenship

Philosophers such as Aristotle and Machiavelli believed that it was difficult for democracy to exist in heavily-populated city-states or nations, and generally, they were being proved right. America’s rapid population

Town hall in Biddeford, Maine, in 1855.

increase meant that each individual’s relative power to exert political influence became less. Population growth made it harder for town meetings to regulate well. One writer described town meetings in Biddeford, Maine, as having become inadequate and dysfunctional by about 1855, with decisions being made by “anonymous” participants; in that year, residents voted to incorporate Biddeford as a city.[26]

Decisions became increasingly complex. In frontier times, a town might have to build a simple bridge or schoolhouse; but as new technologies emerged, it often required contractors and experts to build and manage interlinked systems involving sewers, traffic, lighting, communications, and fire protection services. How would intelligent

A city manager of Galveston, Texas.

zoning decisions be made with long time frames in mind? Bad decisions could be costly. Cities, in particular, were complex animals, often with two sets of people with differing needs: office workers during the day and residents during the night. How would the needs of the different groups be balanced? Would office workers have the same say as residents? It became harder for individual participants at meetings to understand the trade-offs, financial challenges, and budget choices. Cities incorporated. Managers were hired. Citizens weren’t needed at meetings.

In 1900, James T. Clark wrote: “It is simple enough to yet gather the opinion and will of a hundred men who can meet together, but multiplication of numbers and wide distribution, as in our present conditions, so change and complicated the case as to make the construction of a democracy almost a wholly different matter.”[18] There was concern that, as America’s population expanded, the political machinery had not been adjusted accordingly. Clark wrote in The New York Times:

“For one hundred years and more we have worked along with the simple machinery our founders set up, while the conditions of democracy have changed from the simple to the complex.

Meanwhile, we have made no conscious, premeditated adjustment … This was the professional politician’s or party organizer’s opportunity. He has volunteered and assumed to keep the machine running on the tacit condition that he should control the result. The power which he has gained by this service is only the measure of how essential a prop it was to the outgrown structure of our fathers, and, also, how perfectly designed the politicians’ own party system of delegation is to organize the great body of voters for his own ends. … No one questions the effectiveness, even the natural perfection, for its function, of the old town meeting plan of government, where the best opinion of the community could be gathered and the fittest men be selected to administer the public affairs, because all men were known of all men … Now representation or principle of delegation is the only means by which populations so large as not to be collectible in town meeting can be given a true voice in public affairs. This principle the politician has adopted in his party system, for his own ends; so that what now is put forward by him as the people’s voice in elections is a ventriloquism that comes from the politician’s own belly.”[18]

Party politics picked up the slack for declining citizenship. Parties could energize masses of people using advertising slogans, entertainment, and fanfare into a quasi-machine to get their slate of candidates elected into office. Citizenship became identified with membership in one of the two main rival parties which continually jousted for the political center. One writer suggested that “the ideal citizen was a party loyalist, aware of his party’s passions and convictions and active in the carnivalesque atmosphere of conventions and election days.”[14]

As the nation grew, town meetings were no longer capable of dealing with larger issues, particularly when they involved entire counties, regions, states, or all the states.[27] One writer in 1911 observed that “common interests could be attended to only by delegates or representatives” and identified the two principles that had helped keep democracy alive in the United States and helped it grow “so far and so fast” as (1) representation and (2) federation.[27]

Doubling of the citizenship franchise

The growing economy and the growing sense of citizenship as a legal status unlinked with civic participation meant that differing political parties, jostling for power at all levels of government, could consider admitting new persons or groups into citizenship as a way to swell the ranks of voters hopefully in their favor.

African-Americans voting in 1945.

Politicians could widen citizenship with little impact on their grip on political power, since the civic-duty aspects of citizenship were declining and since Americans were paying less and less attention to politics. It didn’t matter much who was or wasn’t a citizen, particularly since government didn’t have to support most people in the nineteenth century. And there was public pressure to widen the citizenship franchise. Welcoming new groups of citizens was not a serious challenge to the authority of either political party, but if handled properly, it could give one party a slight edge in upcoming elections, depending on how the newly admitted groups voted. Since, according to political scientist Benjamin Ginsberg, Americans were losing interest in politics and were less willing to embrace civic responsibility, the net of citizenship could be widened with little impact.[28][29] Philosopher Jürgen Habermas noticed the contradiction in his book The Structural Transformation of the Public Sphere and noted that while the public widened, the public sphere shrank, so that more people were counted technically as citizens while the actual task of citizenship-as-civic-action was shrinking.[17][30][31]

Voting remained an important check by the people on the political process, but it was increasingly focused on larger, national elections, and less on local or state elections. People often didn’t know the names of their congresspersons or state legislators; many walked into voting booths and voted for people with ethnic-sounding names similar to their own, regardless of the issues; and people voted for their party. Many party candidates were chosen not by voters but in smoky back-room deals characterized by horse-trading between party bosses. Ginsberg suggested in 1998, in a controversial analysis, that government could extend the rights of modern citizenship to diverse new groups such as minorities and women, and could encourage voting as an alternative to more dangerous, unwanted protests such as striking or rioting, as a way to tame a wary public.[32][33] He wrote: “To vote meant not to strike or riot,” and the state preferred citizens to vote rather than mount more serious challenges to its power such as lawsuits, protests, union organizing, parliamentary procedure, or lobbying.[32]

Women suffragists demonstrate in February 1913.

Accordingly, the citizenship franchise expanded. Women were admitted into full political citizenship after long advocacy by prominent activists such as Elizabeth Cady Stanton and Susan B. Anthony. The Nineteenth Amendment was ratified in 1920. Women could vote and run for office. The electorate doubled in size, and this was an important marker of the status and power of women. But the influx of women-as-citizens did not reverse the general trend towards declining civic participation in local government. While African-American men could technically vote, in many southern jurisdictions whites found ways to exclude or negate the minority vote, often by intimidation or deliberate obfuscation of voting laws.

Depression

The Great Depression had huge ramifications for citizenship. It brought Democrats to power under the leadership of Franklin D. Roosevelt, who took drastic steps at the federal level in 1933 to respond to massive unemployment and bank failures. The New Deal featured an array of federal programs, such as the Works Progress Administration, as well as Social Security, which redistributed money from some groups of citizens to other groups of citizens. As before, the question of how citizenship was affected by these large transfers of money was largely ignored, but aid recipients joined government employees as persons drawing substantial income from government largesse. Could persons receiving paychecks from the federal government behave as responsible and impartial citizens? Aid recipients were officially and legally “citizens” despite their inability to support themselves. And the federal government, being considerably removed in time and space from the aid recipients themselves, had a difficult job separating the truly needy from the borderline cases; the resulting fraud made aid programs more expensive to run, leading to higher taxes and waste, and possibly to corrupted aid-recipient citizens. Washington swelled. The federal budget expanded dramatically.

The handling of the Depression was one indication of a sea change in governance. Government was no longer the exploiter of the people; rather, government became their steward, their benefactor, their protector, their employer, and the bailer-out of sinking economic enterprises. While later economists would criticize government decision-making during the Depression, particularly the decision to raise tariffs on foreign imports, which exacerbated the downward economic spiral, Uncle Sam was now a patron. And, over time, government learned that it could resort to tactics such as deficit spending to ease the national economy through troubling times.

Expansion of the federal government

Note: Largest deficit was for World War II.
 1998-2002 had surpluses. For brevity,
annual numbers were combined into
 ten-year averages.
Source: US Government statistics.

Critics of the federal expansion had little effect. In 1922, Senator Borah warned: “Under no circumstances should the national Government undertake to deal with those things which are essentially local.”[9] He suggested that “when a people cease to be active in the affairs of government,” oligarchy follows shortly thereafter.[9] Tocqueville had made a similar warning back in 1835: when local authorities had the power to administer laws made by higher governments such as counties or states, there was a healthy measure of control; what he found alarming was when state or national government not only made the laws but administered them, a situation he described as potentially “dangerous.”[34] But political discourse was falling apart in the same way as town meetings.

Internment

Japanese-Americans, jailed for
three years during WW2, prepare
to leave Poston, AZ in 1945.
Photo by Hikaru Iwasaki.

World War II exploded away the Depression and saw massive numbers of Americans in uniform fighting on two fronts, including African-American men, who would use the experience in subsequent generations to demand an end to segregation and equal treatment under the law. One tragedy, however, was the detention of over 120,000 West Coast Japanese-Americans beginning soon after the attack on Pearl Harbor. Many were second-generation Japanese, born in America and therefore citizens by birthright; estimates of citizens among the detainees were 62%[35] or 58%.[36] But because military authorities worried that saboteurs and spies for Japan might be lurking within this group, and were unable to identify which persons might possibly be dangerous, authorities (acting under Executive Order 9066) detained them for the duration of the war, and a later Supreme Court decision upheld the detentions. German-Americans and Italian-Americans were not treated similarly, leading to accusations that the United States government acted out of racial bias, since Japanese-Americans were more easily identified by their appearance. In a real sense, the internment illustrated the fragility of citizenship. For those imprisoned for years, citizenship was a meaningless badge lacking the clout to keep innocent people out of jail.

Tax withholding

During World War II, the cost of war munitions required government to raise taxes further, but it was difficult to get Americans to set aside money for a single annual tax payment. The possibility that citizens might refuse to pay taxes was, in one respect, a check on the power of government.

But this citizenship power was lost during the war. In 1943, when the federal government raised taxes further, a major collection issue loomed.[37] While there was strong support for the war, resources were tight, and Americans hadn’t been saving to meet an increased annual tax bill. A former Macy’s executive named Beardsley Ruml conceived a plan to bypass citizens by having employers remit taxes directly to the government on workers’ behalf. Here’s how it worked, according to New York Times writer Amity Shlaes:

“The government would get business to do its work, collecting taxes for it. Employers would retain a percentage of taxes from workers every week–say, 20 percent–and forward it directly to Washington’s war chest. This would hide the size of the new taxes from the worker. No longer would the worker ever have to look his tax bill square in the eye. Workers need never even see the money they were forgoing. Withholding as we know it today was born … This was more than change, it was transformation. Government would put its hand into the taxpayer’s pocket and grab its share of tax–without asking … Ruml had several reasons for wagering that his project would work. One was that Americans, smarting from the Japanese assault, were now willing to sacrifice more than at any other point in memory. The second was that the federal government would be able to administer withholding–six successful years of Social Security showed that the government, for the first time ever, was able to handle such a mass program of revenue collection. The third was packaging. He called his program not collection at source or withholding, two technical terms for what he was doing. Instead he chose a zippier name: pay as you go.”[37]

Activist industrialist Vivien Kellems
protested the withholding taxes with
a lawsuit, but lost.
Photo date 1941.[38] [39] [37]

Ruml’s scheme of tax withholding, promoted as “pay as you go,” greatly increased tax compliance but undermined citizenship, since citizens lost the power to voice displeasure with government by threatening not to pay taxes.[37] Workers never even touched much of their own paycheck money before it was zipped away by their employer to the Treasury. The prominent conservative economist Milton Friedman, who had supported tax withholding, came to regret the choice later.[37] Friedman wrote:

“We concentrated single-mindedly on promoting the war effort. We gave next to no consideration to any longer-run consequences. It never occurred to me at the time that (by advocating tax withholding) I was helping to develop machinery that would make possible a government that I would come to criticize severely as too large, too intrusive, too destructive of freedom. Yet, that was precisely what I was doing … There is an important lesson here. It is far easier to introduce a government program than to get rid of it.”[37]

After the war, tax withholding persisted, since government officials now had resources to enact a variety of programs with little fear of popular protest via non-payment of taxes.

Prosperity

After the war, the nation resumed a path to prosperity. Cars. Supermarkets. Typewriters. Television. Movies. Silly Putty. The baby boom. Jet planes. America was a land of fountains, French fries, amusement parks, and rock music. The G.I. Bill sent returning veterans to college in massive numbers, and with their education, the U.S. had an impressive workforce of skilled specialists.

Political participation continued to decline. Some writers blamed increasing wealth for exacerbating the decline.[19] Kaplan wrote:

“Aristophanes and Euripides, the late-eighteenth-century Scottish philosopher Adam Ferguson, and Alexis de Tocqueville in the nineteenth century all warned that material prosperity would breed servility and withdrawal, turning people into, in Tocqueville’s words, industrious sheep.”[19]

There were instances in which technology made it less necessary to rely on neighbors; for example, in Mount Vernon, Maine, telephone calls in the 1960s had been routed by two elderly operators “who knew everyone in town”, but with new dialing technology, their assistance was no longer needed.[12] A trend empowering consumers pushed an alienation too: cars enabled freedom, but people moved to distant suburbs, away from neighbors, into segregated communities of like-thinking people in the same socio-economic bracket. Tinted car windows could hide occupants from passersby; sunglasses hid faces, sometimes obscuring identification. We became strangers to each other. iPods in ears let pedestrians tune out others and hear only the wavelength of their choice.

Old telephone switchboards
required an operator to
manually connect lines.

Tocqueville saw a natural tendency for democratic peoples to turn inwards, to tune out others.[11] Being in public doesn’t make us feel important, so we turn to families, friends, television, and entertainment; that is, we turn away from neighbors and public life. He hoped local organizations and civic groups and churches would counteract this trend and help people turn outward,[11] but church attendance was declining as well, particularly among mainstream Protestant sects.[40]

In a 1996 speech at Brigham Young University, Jean Elshtain examined the state of democracy that year and discussed Tocqueville’s analysis:

“In Tocqueville’s worst-case scenario, narrowly self-involved individualists, disarticulated from the saving constraints and nurture of overlapping associations of social life, would move to a bad and isolating egoism. Once that happened, they would require more controls from above in order to muffle the disintegrative effects of egoism. To this end, if you would forestall this moment of democratic despotism, civic spaces between citizens and the state would need to be secured and nourished. Only many small-scale civic bodies would enable citizens to cultivate the democratic virtues and to play an active role in their communities. These civic bodies would be in and of the community–not governmentally derived, not creatures of the state.”[11]

Declining attendance at town meetings

During the second half of the twentieth century, attendance at town meetings continued to decline. In 1970, in Mount Vernon, Maine, 120 of 596 inhabitants gathered for the annual town meeting.[41] In 1977, a Time Magazine reporter wrote that the “town meeting has been declining for decades—a casualty of increasing population and the complexity of issues.”[12] In one study of attendance at town hall meetings from 1970 to 1998, only 20% of townspeople showed up.[2] Another source suggested attendance at town meetings varied from 20% to 26%.[42] One independent writer observed that the substance of present-day town meetings borders on the absurd; Victoria Rose Perkins, for example, questioned the importance of a town debating ad infinitum about the spelling of the town’s name.[42] In the town of Huntington, Vermont, a meeting in March 1977 was attended by only 130 of 519 eligible citizens; that is, three of every four citizens stayed home.[24] The meeting lasted more than four hours, and citizens discussed issues such as local real estate taxes and whether to buy a new fire truck (they did). The meeting had a social effect in helping people get to know their neighbors; the reporter concluded that “By and large, Huntingtonians seemed to genuinely like and trust each other,” that is, the ones who showed up.[24]

Still, major groups of people were left out of the American mainstream, including women, African-Americans, homosexuals, and others, and they had simmering grievances. Many whites pretended that blacks weren’t around, as if there were no problem, and let the African-American community fester in its unhappiness. Where could minorities vent their dissatisfaction? In newspaper “letters to the editor”? Readership was declining. At town meetings? They were becoming irrelevant for most matters beyond local town business.

Civil rights march at the
Lincoln Memorial in 1963.

As a result, the 1960s were marked by street protests, demonstrations, rioting, civil unrest, and antiwar protests.[43] [44] African-American youth participated in sit-ins during the movement led by Dr. Martin Luther King Jr.[45] Creative NAACP activists managed a shrewd campaign in the courts to combat school segregation, with brilliant lawyers such as Thurgood Marshall engineering a series of lesser victories that culminated in the 1954 Brown v. Board of Education decision; Marshall became a Supreme Court justice himself in 1967. But non-violent protests and court challenges were the only ways for people to express discontent with the political system, since the practice of attending town meetings to voice complaints had been practically abandoned.

FairTax (nonpartisan tax reform)
 supporters meet in New Jersey.

So getting public attention was the first step in any effort to change policy, and this wasn’t easy. Advertising was expensive. Lacking funds, many activists felt pressure to pull bizarre stunts to get free press coverage, since an off-the-wall news story might captivate the public imagination for a short time. Accordingly, activists on the left such as Michael Moore made sarcastic documentary movies such as Roger & Me to attract attention,[46] while activists on the right such as radio talk show host Rush Limbaugh made outrageous statements, such as calling Supreme Court nominee Sotomayor a “reverse racist,” to maintain radio ratings.[47] In contrast, activists promoting the non-partisan FairTax tax simplification proposal, who adopted a more reasonable approach, often failed to win attention; because they were reluctant to pull media stunts, the American public remains mostly unaware of their proposal.[48] If activists succeeded at winning public attention without damaging their credibility, the next step was to persuade people to act, such as by writing a letter to a congressperson. Here, too, there were obstacles to overcome, including public inertia. People mostly concluded that trying to accomplish some political goal was a waste of time. The instances in which activism brought about successful political change in recent years involved an aggrieved group, such as African-Americans, feminists, or homosexuals, who felt the sting of bad policy over time, who had sufficient numbers, financial clout, and good leadership, and who conducted long-range campaigns of protest and media campaigns to change public opinion, including consumer protests, along with campaigns in the courts to change policy.

Erosion of trust

Suburban sprawl around Melbourne,
Australia. Photo by Evan C.

However, the overall pattern is that trust between citizens seems to be declining.[11] Poll data suggest that people are less and less likely to trust their neighbors: the share answering that most people can be trusted, in response to the question “Do you believe most people can be trusted, or can’t you be too careful?”, fell from 60% in 1960 to 38% in 1993.[11] Author Dick Meyer wrote that “Americans don’t trust our institutions or one another” and that “without trust, without a shared vocabulary, without community, we feel endangered.”[49] In Why We Hate Us, Meyer describes an America in which people don’t trust institutions or one another and experience a declining sense of community.[49] Like Putnam, Meyer saw a drastic shift in values beginning around the 1960s, and he blamed ideological shifts as well as extensive involvement with the mass media and suburban sprawl.[49]

There are questions about whether young Americans are learning enough to stay informed about public issues. Membership in local volunteer community groups like the Parent Teacher Association has been declining since the 1960s; in 1955 it had 9.5 million members, or nine percent of the adult population.[49] Writers such as Charles Murray described the decline in civic engagement and blamed government intervention for harming it.[50] Other writers have noted a similar trend towards civic disengagement.[51]

Decline of social capital

Note: data from Robert Putnam’s Bowling Alone
 (2000) comparing 18-29 year olds in the
1972-1975 period with a similar age
group during the Clinton years.[23]

By the late twentieth century, Harvard University professor of public policy Robert D. Putnam noticed a decline in civic engagement, including activities normally done by citizens such as voting or attending local meetings.[23] His seminal 1995 article “Bowling Alone” suggested that for the first two-thirds of the 20th century, Americans were deeply involved in their neighborhoods, towns, and cities, but that since the 1950s, baby boomers, Generation Xers, and younger generations have gradually withdrawn from civic life; for example, from 1980 to 1993, the total number of bowlers increased by over 10%, yet league bowling fell by more than 40%. “We are bowling alone rather than with our neighbors,” according to his analysis.[23] Declining social capital, which Putnam defines as the “sum of complex, dense networks of connections, values, norms, and reciprocal relationships in a community,” means people are less inclined to do citizenship-related activities.[23] Putnam blames the rise of electronic entertainment, especially television, video games, and the Internet, along with the pressures of time and money, the rise of two-income couples, increased commuting time, and urban sprawl.[23]

Rutgers political scientist Benjamin Barber sees a growing incivility in political discussions today and characterizes them as “divisive,” with “almost no listening,” “no visible modification of opinion,” and a “vilification of opponents.”[13] Barber elaborated: “Divisive rhetoric has become not only disagreement between parties but a rejection of the legitimacy of the other side, validating a position that your opponents are immoral, un-American and possibly worthy of being subjected to violence,” and added “Opponents become enemies of the Republic and the political process itself.”[13] There is evidence that citizens have lost the ability to listen to each other; in Norman Rockwell’s 1943 painting of a town meeting, neighbors listened to a man argue for an unpopular opinion, but today there are few settings in which people listen to alternative points of view.[2]

Citizenship today

Conservative writer William J. Bennett, despite noting a decline in civic participation, found resilience in the American character in the response after 9/11.[52] But others have been critical, arguing that government in many instances over-reacted to the threat of terrorism by curtailing civil liberties: expansive invasions of privacy through warrantless wiretapping, illegal searches and seizures, and detentions of persons suspected of involvement with terrorists.

Mass media

Mass media effects on citizenship are mixed.


  • USS Arizona after the attack on Pearl Harbor 1941.

    Positive effects. Newspaper readership encourages the valuable citizenship skill of literacy. The mass media help people share a common media experience, such as a vivid awareness of where they were during the attack on Pearl Harbor or 9/11, and common experiences help unite citizens with a common culture.[53] Further, radio and television and the Internet allow instantaneous transmission of news, which helps citizens stay informed of new developments.[14] A video image of police abuse can have a powerful deterrent effect on future police abuse, and can keep overzealous officers within bounds if they’re aware that possible abuse might be broadcast worldwide. While most citizens lack the time and capacity to stay informed about political developments, particularly at the national level, reporters and editors can expose wrongdoing by public officials and publicize scandals; in some respects, the Fourth Estate checks government power.[14] Media can change stereotypes in a positive way. For example, the television miniseries drama Roots by African-American writer Alex Haley chronicled a family’s history from tribal Africa to the post-Civil War South and was viewed by 130 million people in 1977,[54] and it helped white Americans grasp the plight of African-Americans in America.[55] According to Dr. Juliet Walker of the University of Texas, Roots helped people “see the reality of slavery in a way in which historians have really not been able to do.”[54] Writer Tim Arango in the New York Times suggested that The Cosby Show “succeeded in changing racial attitudes enough to make an Obama candidacy possible.”[56]


  • Jolie and Pitt at Cannes Film Festival 2007. Photo by Georges Biard.

    Negative effects. But the drug of entertainment injected by the needle of mass media is a potent concoction keeping people away from neighbors and civic participation. Americans overly entertained are prone to cutting themselves off from flesh-and-blood neighbors and prefer the solitude of screen satisfaction. They fail to form ties of affection and respect and trust and become isolated, estranged, alienated, vulnerable, uninformed, and lose the skill of conversational give-and-take. Manners and politeness suffer. People have always had heroes such as Buffalo Bill[57] and have always loved good stories such as the Odyssey or the Iliad by the Ancient Greek bard Homer or the Aeneid by the Roman poet Virgil, or a Mark Twain novel, but what’s different today is the scale and power of the images as well as access to them. One commentator wrote that the revolution in powerful mass media graphics “has produced a shift in the values of many Americans away from ‘work, family, and citizenship’ and towards self-gratification.”[58] People identify their dreams with the pseudo-immortality of celebrity. A reporter wrote: “Fascination with fame permeates the media and occupies the daydreams of millions.” People know the images are carefully crafted fictional representations to appeal to a wide audience, but the images are powerful nevertheless in terms of influencing behavior and thought. Culture is reduced to gawking over celebrity antics and misdeeds: “Twentieth-century mass media coupled with the entertainment industry pushed the fame machinery into hyperdrive … Welcome Brad and Angelina, the reigning Apollo and Daphne.”[57] Faced with a choice of meeting a possibly cantankerous neighbor, or passively watching beautiful images on a moving screen, people increasingly choose screens, although some secondary schools try to teach students to be more objective with courses in media literacy and foster citizenship with lessons about how a culture of celebrity worship is wasteful.[59] [60] But civics as an academic discipline struggles to compete with dollar-oriented courses focused on making students employable,[61] and academic discussion about citizenship is an abstract exercise far removed from the actual exercise of democracy.

Free speech

Student Andrew Meyer was arrested
while trying to ask a question at a
speech on Constitution Day in 2007.

An incident at the University of Florida in 2007 may serve to illustrate citizenship today. Student Andrew Meyer rose to ask a question during a question-and-answer session following a speech by Democrat John Kerry to celebrate Constitution Day. Meyer wanted to make a statement and took a fairly long time to ask his “question”; before he was finished, he became involved in a scuffle with police officers and was tasered and arrested.[62] There were protest rallies against the police treatment later.[62] The University of Florida Taser incident was filmed and replayed widely on YouTube. In one sense, the incident morphed into a form of entertainment, but since it was widely reported in the news media around the world,[62] it helped bring attention to the precariousness of free speech.

Town meetings today

Town meetings continue today, but with greatly reduced attendance, and decision-making is limited to a narrow range of topics unlikely to excite the attention of most residents.[24] For example, in 2009 in the New England town of Smithfield, Rhode Island, the town agenda had issues such as housing, conservation, schools, the library, sewers, zoning, soil erosion, traffic safety, and so forth, with separate committees for each issue.[63] The town’s authority in many instances is circumscribed by decisions made at the county, state, or federal level. One of the top stories on the town website of Casco, Maine was dog licenses, which were set to expire on December 31, 2009; it’s difficult to imagine neighbors getting charged up to attend town meetings to discuss dog licensing.[64] Casco has a year-round population of 3,500, but swells to 15,000 during the summer. Volunteering persists: the town has a “Town Meeting form of government with an elected 5 member board of selectmen and a Town Manager,” and community volunteers are the “backbone of the Town of Casco’s Rescue Unit and Fire Department.”[64]

Barack Obama at a
“town hall meeting”
in Green Bay, WI,
2009. Photo credit
Chuck Kennedy

The term town meeting has been somewhat distorted by the media; some television broadcasts describe shows as “town meetings,” but they’re more accurately described as “forums with supporters.”[5] A political candidate running for office will surround himself or herself with supporters, make a speech with a nice backdrop and camera-pleasing angles, and have the spectacle presented as a “town meeting” with active discussion supposedly happening; but such events are really public relations exercises analogous to political commercials. Some firms which specialize in the deliberative democracy business use trained facilitators, full-time staff, media and community outreach, and “a lot of technology.”[2] The phrase town hall meeting is often used today to “signify a televised campaign event,” referring not to a real but to a “counterfeit” meeting, since its primary purpose is to sell a political candidate or push a political agenda such as health care reform.[2]

Political corruption and disenchantment

The trend towards career politicians has continued to the extent that there is public disenchantment with the political process.[65] [66] Reelection rates for members of the United States Congress hover around 90%, suggesting that incumbents have a huge advantage over challengers because of access to money, gerrymandering, and franking privileges, and leading to accusations that the election process has been rigged to favor incumbents.[67] [68] [69] There are reports that letters to congresspersons have been routinely thrown in the trash.

Charles Murray argued that too much government involvement strips away responsibility from communities and, as a result, harms the “elaborate web of social norms, expectations, rewards and punishment (which) evolves over time that supports families and communities in performing their functions.”[50] He criticized the welfare state as causing “growing legions of children raised in unimaginably awful circumstances, not because of material poverty but because of dysfunctional families, and the collapse of functioning neighborhoods into Hobbesian all-against-all free-fire zones.”[50]

There are further examples of disconnectedness between citizens. In Livingston, New Jersey, in 2004, an elderly person died and the body lay undiscovered in a house for months.[70] In Montclair, New Jersey, people don’t volunteer for the ambulance squad as much as before; twenty years ago it relied on a hundred volunteers plus three paid emergency medical technicians, but now it has 23 professionals and only seven volunteers.[70] However, one writer noticed that there were still some block parties as well as informal networks.[70] Nevertheless, when a USA Today story in 2007 reported that each American household had liabilities of $516,348 for promises made by federal, state and local governments regarding future payouts for Medicare, Social Security, military benefits, state and local debt, federal civil service benefits, state and local retiree benefits, and other federal obligations, there were neither public protests nor serious discussion in town meetings, although there were comments from dissatisfied readers posted anonymously.[71]

Illegal aliens

In America, millions of people identified as illegal aliens work in the nation, typically in low-wage menial jobs, but lack the official legal designation of “citizen”. For these people, the designation of citizenship has an extremely important meaning: unless they’re recognized as citizens, they have great difficulty finding legitimate employment, and they can’t vote. Almost all would love to become citizens, not only to open up their career horizons but also to be eligible for government assistance.

In 2006, there were mass protests numbering hundreds of thousands throughout the US demanding U.S. citizenship for illegal immigrants.[72] Many carried banners reading “We Have A Dream Too.”[72] One estimate is that there were 12 million illegal immigrants in the USA in 2006.[72] They live in constant fear of deportation; if they become ensnared in the criminal justice system, they are often shepherded anonymously by contract jailers to unknown destinations. In a sense, the separate designation of citizens and non-citizens creates a caste system, similar in some respects to those of ancient civilizations. The illegal immigrants who fill the bulk of low-paying nanny and home-repair jobs are exploited financially. Kaplan wrote: “Ancient Sparta, like Athens, was a two-tiered system, with an oligarchic element that debated and decided issues and a mass — helots (“serfs”) in Sparta, and slaves and immigrants in Athens—that had few or no rights.”[19]

Protesting for immigrants’ rights
in Ohio in 2006. ccsa2.5 Rrenner

Dissatisfaction in Vermont

Citizens complain about a burdensome federal government. And political citizenship, while subdued in most parts of the country, remains alive in places like Vermont’s 237 towns, where a newspaper account recently reported that “every citizen is a legislator who helps fashion the rules that govern the locality.” People attending meetings don’t waste time with “superfluous static” but manage with “quiet efficiency,” since “the townspeople have a deep respect for parliamentary procedure and law.”[73] There have been informal reports that 62% of people attending Vermont’s town meetings are actively rethinking Vermont’s membership in the United States, citing issues such as dissatisfaction with the growth, power, and corruption of the federal government, which some believe has eroded fundamental freedoms.[73]

Jury duty and citizenship

Some writers see the institution of the New England town meeting embodied in the jury. “The jury is a direct democracy. It’s the New England town meeting writ large. It’s the people themselves governing.”[74] Others see jury duty as a useless chore to be avoided; comedian Norm Crosby once joked, “When you go into court, you’re putting your fate into the hands of 12 people who weren’t smart enough to get out of jury duty.”[74] In New York, many categories of people were automatically exempt from jury duty, including doctors, lawyers, firefighters, police officers, and others, until a decision changed that, and there is some evidence of a nationwide trend to undo such “automatic exemptions” for many professions. While many Americans think the idea of being a juror is important, most agree that actually serving on a jury is “inconvenient”. One study found the response to jury summonses to be “extremely low,” with sometimes only 15 people showing up out of a list of 100 names.[74] Many people never get summonses, since juror lists are often outdated or incomplete. Some people showing up for jury duty find the assembly room full and end up returning home feeling their time was wasted. Only 20% of people summoned for jury duty actually get placed on a trial. And payment is low, sometimes barely enough to cover parking fees.[74]

Awaiting jury duty in 2007. Photo by Steve Bott. cca2.0

Scholarship relating to the history of U.S. citizenship

  • Jürgen Habermas. Explanations by philosophers such as Jürgen Habermas in his book The Structural Transformation of the Public Sphere are confirmed by events in the media. Today, in contrast to colonial times, there is scant public debate, few public forums, and political discussion has degenerated from a fact-based rational-critical examination of public matters into a consumer commodity. There is the illusion of a public sphere, according to Habermas, who argued that citizens have become consumers, investors, and workers. Real news (information which helps free people stay free) is being elbowed out by advice, soft porn, catchy garbage, and celebrity antics, and has become infotainment, that is, a commodity competing in a mass entertainment market. It matters less whether news is right or wrong, and more whether it’s gripping. Habermas’ sociological and philosophical work tries to explain how this transformation happened by examining a wide range of disciplines, including political theory, cultural criticism, ethics, gender studies, philosophy, and sociology. According to Habermas, a variety of factors led to the eventual decay of the public sphere, including the growth of a commercial mass media, which turned the critical public into a passive consumer public, and the welfare state, which merged the state with society so thoroughly that the public sphere was squeezed out. It also turned the “public sphere” into a site of self-interested political brawling for state resources rather than a space for public-minded rational consensus. And it turned real citizens into consumers.[75][76]

    Jürgen Habermas
    photo by
    Wolfram Huke ccsa3.0

  • Benjamin Ginsberg. Johns Hopkins political scientist Ginsberg suggested in The American Lie that the state was strong and the individual citizen was weak, and that government encouraged passivity by giving the illusion of control through regular elections. He argues voting is like a tranquilizer which pacifies people and brings the illusion of popular control. It’s like a safety valve for government, since it directs popular energies away from more serious, unwanted threats to government power in the form of public protests, demonstrations, or legal challenges such as lawsuits. Ginsberg argued that when an aggrieved group such as women or African-Americans became angry to the point of possibly protesting, government gave them voting rights quickly as a way to thwart the more dangerous challenges to its authority. This explanation is somewhat confirmed by the fact that the youth anti-war demonstrations of the late 1960s were soon followed by persons aged 18-21 gaining the right to vote; the demonstrations died down in the early 1970s, and people resumed a focus on prosperity and jobs and living. Did the youth vote? Not much; the proportion of eligible young persons who voted was even lower than for the electorate as a whole. Ginsberg also argued that the move to a volunteer military force undermined citizenship. He believes citizen participation in soldiering is good since it strengthens patriotism and instills a willingness to sacrifice for one’s country. In the Vietnam War, the draft system, which chose young men to fight by lottery, caused much resentment and fueled popular protests; a volunteer military, in contrast, lets people choose to be soldiers as if it were a paid job with benefits like any other, albeit more dangerous. Ginsberg argued that a voluntary military eliminates “a powerful patriotic framework” since “instead of a disgruntled army of citizen soldiers, the military seems to consist of professional soldiers and private contractors.” Ginsberg wrote that “government learned the lessons of Vietnam and has found ways to insulate the use of military force” from society. Ginsberg criticized American leaders for trying to wage war on terrorism without any sacrifice from citizens: “U.S. leaders have pleaded for what can best be described as defiant normalcy — living, spending and consuming to show that terrorists won’t change the American way of life,” according to a reporter commenting on Ginsberg’s views.[28] [32] [77] [78] [79]

I agree with Ginsberg’s views, generally, and my reading of history suggests that it’s important for free people such as ourselves to do our own fighting, and not depend on paid professionals to do it for us. Being a warrior is not a task we can pawn off on others. It’s too important. People with guns have power. Machiavelli, in particular, suggested mercenaries should not be counted on to protect free peoples; armed, they were likely to turn their guns on their “masters”, and there are numerous instances in which hiring mercenaries proved unwise.


  • Dana D. Nelson 2008

    Dana D. Nelson, a Vanderbilt professor, argued that all that people do, politically, is vote for president every four years, thinking that by voting they have finished their civic duty.[80] [81] [82] A progressive advocate for citizenship, she argued in Bad for Democracy that Americans tend to neglect basic citizenship duties while hoping the president will solve most problems, a tendency she termed presidentialism. She saw an American tendency to “look to the sitting president as simultaneously a unifier of the citizenry and a protector from political threats.”[83] [84] Nelson urged citizen action:

“Our habit of putting the president at the center of democracy and asking him to be its superhero works to deskill us for the work of democracy … The presidency itself has actually come to work against democracy … We stop waiting for someone else to do it for us. We organize together, using public spaces and the internet. We form blogs, we write letters to the editor, we show up at Congress, we protest, we call, we lobby, we boycott, we buycott, we email our representatives, we find supporters, we get them moving, we grow the movement. We ignore the idea that the right president will do it for us and find every way we can to do it ourselves”. — Dana D. Nelson [85]

 
  • Robert D. Kaplan. In The Atlantic, he agreed that the domain of politics in America has been shrinking. He described how many city spaces are designed not to meet citizens’ needs but to serve corporate ends. He linked the decline of political participation with mass culture, consistent with the analysis by Habermas. Kaplan wrote:

“We have become voyeurs and escapists … it is because people find so little in themselves that they fill their world with celebrities … The masses avoid important national and international news because much of it is tragic, even as they show an unlimited appetite for the details of Princess Diana’s death. This willingness to give up self and responsibility is the sine qua non for tyranny.”[19]

While political participation in terms of voting has been declining steadily, Kaplan argued, in contrast to Ginsberg, that there are substantial benefits in some respects to non-participation. He wrote: “the very indifference of most people allows for a calm and healthy political climate … Apathy, after all, often means that the political situation is healthy enough to be ignored. The last thing America needs is more voters — particularly badly educated and alienated ones — with a passion for politics.”[19] He argued that civic participation, in itself, is not always a sufficient condition for good outcomes; he argued against bringing democracy to poor countries torn by ethnic violence and marred by illiteracy, since the freedom to debate and vote often results in more fractiousness. He pointed to Singapore as an authoritarian model which prospered because it emphasized “relative safety from corruption, from breach of contract, from property expropriation, and from bureaucratic inefficiency.” Kaplan asked, “Doesn’t liberation from filth and privation count as a human right?”[19] And in twenty-first-century America, with an integrated, robust, and growing worldwide economy, there are numerous opportunities to make money and, as a result, the freedom to buy a huge assortment of consumer goods without being dependent on citizens or neighbors. If citizens have become consumers, there are positive aspects of this, although the risk remains that when people no longer participate in government, there are increased chances for oligarchy or tyranny, such as befell ancient Athens or the ancient Roman republic.

Kaplan argues that the Singapore model suggests political participation
is less important than wealth. Singapore Photo by Mohd Kamal ccsa2.0

  • Naomi Wolf. Writer and activist for democracy Naomi Wolf warned of serious danger to America’s democracy from the federal government in the aftermath of the 9/11 attacks. Her book The End of America described parallels between the United States and pre-World War II Germany, including surveillance of citizens, secret prisons, paramilitary forces, and torture, and wondered whether America faced the prospect of fascism.[86] She said the National Counterterrorism Center had the names of roughly 775,000 “terror suspects” in February 2008.[87] Wolf wondered how young persons could accurately be called citizens and complained that young people don’t understand capital-D “Democracy”.[88] She wrote in 2007:

    Naomi Wolf (2008) Photo:
    D. Shankbone.ccsa3.0

“This lack of understanding about how democracy works is disturbing enough. But at a time when our system of government is under assault from an administration that ignores traditional checks and balances, engages in illegal wiretapping and writes secret laws on torture, it means that we’re facing an unprecedented crisis. As the Founders knew, if citizens are ignorant of or complacent about the proper workings of a republic “of laws not of men,” then any leader of any party — or any tyrannical Congress or even a tyrannical majority — can abuse the power they hold. But at this moment of threat to the system the Framers set in place, a third of young Americans don’t really understand what they were up to.” — Naomi Wolf in 2007 [89]

Conclusion

My thinking at this point, which may change over time as I keep learning, is that citizenship as active political participation has diminished substantially over the past three hundred years, but that this is not all bad. Overall, the bigger picture is highly important. If human history were a newspaper today, the story about declining political participation would be buried in a back section, and the front-page headline would be the empowerment of the individual.

People are much, much better off today than in 1700. We have benefited from an incredible revolution in power and knowledge and technology which has made humans the dominant creature on the planet. A middle-class person such as myself living in a Western nation such as the U.S. enjoys a much better life than could ever have been imagined by the richest and most powerful kings of ages past. We have better health care, live longer, and enjoy incredible music and books and movies, and delicious fresh healthy foods. Egyptian pharaohs never enjoyed the sensation of speed by riding in the back seat of a convertible on a highway; the top speeds of their chariots may have been, perhaps, 30 mph or 48 kph, and the rides were probably dusty, bumpy, and dangerous. Autos today travel twice as fast, smoothly, stereos playing, with windshields and air conditioning. King Charlemagne could never have flown to Australia; almost certainly, he never even knew Australia or America existed. If Genghis Khan had had a serious toothache, he would have had to endure excruciating pain, but we can see a dentist for a pain-free fix. Battlefield infections usually meant death; we have antibiotics. Nobody before 1968 had ever seen a picture of the earth as a small round ball in the darkness of space, but we can see the photo right here; it was taken from an Apollo spacecraft by an astronaut circling the moon. Any ancient Greek with poor eyesight had to squint through blurriness, but we can have glasses or contact lenses or laser eye surgery. We see better than our ancestors. When did Alexander the Great die? At age 33, after an illness. The philosopher Spinoza? At age 44, perhaps from a lung infection from inhaling glass dust. But most babies today can expect to live into their sixties or seventies or later. Up until recently, sizeable percentages of women died in childbirth; this is rare today. We live better, much better, than humans of yesteryear.

Earth is like a small blue spaceship, as seen by the crew of Apollo 8
in 1968 while orbiting the moon. Photo: NASA.

Clearly, living on a planet with six billion people, we have less power to influence things politically than our ancestors did, simply because there are many more people. In essence, free people have become parts of much larger, more powerful national entities which are themselves part of a complex worldwide human civilization. As these entities become larger, more complex, and more powerful, individuals have become smaller, more specialized, minute discrete parts of vast complex systems and networks, less powerful in a relative way. We’re specialists. For example, some people have the highly specific job of MRI technologist; they study for years to develop the specialized skill of making sense of images produced by incredibly sophisticated machines which photograph the inside of a human body. Is it particularly important for an MRI technologist to attend local town hall meetings perhaps once every few months? Or is it more important that this technologist keep up to date with a rapidly changing technology, to stay employed and to help diagnose any illnesses we might have? There are no easy answers here. There are huge benefits from being part of an elaborate complex entity, and drawbacks too.

Magnetic resonance imaging
machine. Photo: US Navy.

An MRI picture of a living head.
No king from ages past has ever
even seen such a photo. You have.

Professor David Christian, in Big History, suggested that as civilizations emerge, they become not only more complex but more fragile. The Mayan civilization disappeared around 800 A.D., and scholars continue to grapple with why it disintegrated so quickly. Like it or not, our fate as humans is bound up with these larger structures. And our civilization, too, is highly fragile.

My personal sense is that citizenship is important, and that restoring active political participation would help improve national decision-making and make our civilization less fragile. There are tough, nagging problems ahead for humanity which, in my view, can only be solved if people regain some type of control over political decision-making. Issues such as global warming affect everybody on the planet. In the United States, Social Security underfunding, foreign policy challenges, water issues, business regulation, and government corruption should be addressed.

As an American, I find the most pressing national issue to be the specter of nuclear terrorism, and in my view, this is a problem which can only be solved if citizens elect to rethink the structure of public life. That is, I don’t see terrorism as a government problem, but as a citizens’ problem. At present, people can move about society more or less anonymously and have become used to this arrangement, but this anonymity enables terrorists to cause serious damage. I believe preventing terrorism requires that all movement in public by people and things be identified, tracked, and monitored by authorities who are themselves tracked and monitored, and further, that strong privacy fences be wrapped around this sensitive information so that only authorities use it to prevent terrorism and don’t abuse it to hurt us or violate our privacy. The idea is both enhanced security and privacy. Abuses or leaks of publicly held private information would be punished. In my view, it would be a highly beneficial transformation, since authorities would be able to prevent almost all crimes by serial killers, nuclear terrorists, bank robbers, and kidnappers. We’d be safer. I outline a way to think about the whole problem of terrorism in my book Common Sense II, which people can read without paying. But I realize such a controversial transformation is a highly radical step with its own risks. Should such a system be adopted? It’s a question for citizens. People would have to agree to the changes I propose. But since citizens are, collectively, out to lunch, clueless, not paying attention, powerless, overly entertained, and misinformed, there is virtually no chance that such reforms could ever happen. And I don’t think government can be trusted to solve the problem for us, since many of its solutions may result in even less freedom and power for individual citizens. As a result, civilized nations may well experience the unpleasant discomfort of having capital cities exploded by smuggled nuclear bombs. It could happen. Let’s hope it doesn’t. In the meantime, it’s critical that we enjoy this amazing world we live in, do the best we can, and collectively bask in humanity’s moment in the sun. :)

We live in the nuclear age. Test explosion 1954.
Photo: U.S. Dept. of Energy.

Sunrise in southeast Alaska, 1990. Photo by John Bortniak, NOAA.

