31 January 2007
Gliberalism
The year was 1917. At the beginning of the spring semester, the Harvard Crimson reported that 1,000 undergraduates were ready to enlist in the Reserve Officers' Training Corps (ROTC), including students from the law school, from other graduate schools, and even members of the faculty. The recent crisis in international affairs had created a need for qualified military leaders, and the editorial hailed the school's vigorous response to it: "That Harvard is the first University to adopt an intensive system of training officers should not be a matter of pride, but rather a basis for the hope that other colleges will establish the same system, and that the foundations of a great citizen army will be laid among our young men."
Ninety years later, Harvard leads in the opposite direction. John Kerry may have apologized for saying that those who make the most of their education "can do well," while the rest "get stuck in Iraq," but the same cynical message has long since been issuing from elite centers of learning. There are currently ROTC programs at hundreds of American colleges, but the faculties of Harvard, Columbia, Brown, Yale, Dartmouth and Stanford continue their ban on campus military training, a deficiency all the more striking in schools that offer a superabundance of every other type of activity.
Military service is a form of protection that the young must offer the rest of us. The age of undergraduates, 17 to 23, coincides with the universal age for military conscription. When the United States ended its draft in 1973, it turned the protection of the country and its vital interests over to a force of volunteers. At that point, the word ought to have issued from the academic community that democracy will henceforth depend on the readiness of the best and the brightest to volunteer for duty. Instead, faculties shaped by the antiwar movement drove ROTC and its recruiters from the campuses. Adding hypocrisy to injury, they later blamed the military's "Don't ask, don't tell" policy toward gay enlistment for a ban that was already in effect!
Most Americans reacted to the attacks of 9/11. The president took the war to the enemy. Congress launched a commission of inquiry and began putting its recommendations into effect. Tens of thousands of families sent children and loved ones into battle. Democracies are notoriously -- and commendably -- reluctant to resort to military action, which means that almost every sector of society joined the debate over how best to respond to the aggression against us.
The elite universities alone kept silent. They did not undertake an inquiry into the reliability and adequacy of programs in Islamic and Middle East Studies, much less encourage those who "do well" academically to volunteer in the national defense. The only anxieties I have heard expressed at faculty meetings since 9/11 have been over the anthrax scare as it might affect the campus, and the Patriot Act's potential encroachments on privacy. Not a word about new responsibilities the university might assume for a democratic way of life under attack.
This is not for lack of nerve among students. Back in 1999, following months of open debate, Harvard's Undergraduate Council voted to support bringing ROTC back to campus. A more recent Dartmouth student poll found students in favor of greater administrative support for its ROTC cadets. The University of California at Berkeley, which has Navy, Army and Air Force ROTC units, reports an increase in all three services. Even students who are themselves reluctant to join the military resent that classmates ready to make the effort should encounter roadblocks instead of encouragement.
Individual teachers have also spoken out for the return of ROTC to their schools. Harvard professor of economics Gregory Mankiw writes, "No one benefits more from the freedoms that the military defends than academics, who use the freedoms of expression more liberally than the average American. It seems particularly reprehensible for us to free ride as completely as we do." But not even Lawrence Summers, who spoke out forcefully in support of ROTC during his tenure as president of Harvard, was able to take on the faculty on this issue.
University administrations live in fear -- but not of al Qaeda or the destructive capabilities of Mahmoud Ahmadinejad and Kim Jong Il. They fear the tactics of disruption and violent uprising perfected by radicals of the 1960s and available to their heirs. The more prestigious the university, the more traumatized it seems to be by memories of riots it was once powerless to quell. Preying on those fears, dissident groups have learned to use the politics of intimidation to impose their agenda, as was recently demonstrated by a consortium of student groups at Columbia University that organized to prevent the speech of Minuteman founder Jim Gilchrist. So far, Columbia's President Lee Bollinger has left these hooligans unpunished, making it all the more unlikely that he would risk inviting them or their peers to participate in the national defense.
Recent surveys confirm that university faculties have been tilting steadily leftward, but it is wrong to conclude, as is commonly assumed, that they have been tilting toward "liberalism." Liberalism worthy of the name emphasizes freedom of the individual, democracy and the rule of law. Liberalism is prepared to fight for those freedoms through constitutional participatory government, and to protect them, in battle if necessary. What we see on the American campus is not liberalism but a gutted and gutless "gliberalism" that leaves to others the responsibility for governance and arrogates to itself the right to criticize. It accepts money from the public purse without assuming reciprocal duties for the public good. Instead of debating public policy in the public arena, the faculty says, "I quit," but then continues to draw benefits from the system it will not protect.
The national and international crisis may eventually pull the elite universities into action, but by then, gliberalism will have done its damage.
Ms. Wisse is the Martin Peretz Professor of Yiddish literature and professor of comparative literature at Harvard.
29 January 2007
The Other Russia
By Melanie Kirkpatrick
NEW YORK -- As the longtime world chess champion, Garry Kasparov was a famously aggressive player. His latest game is politics, and his style is equally aggressive. "Our goal is to dismantle the regime," he says, speaking of the political coalition he leads to bring down Vladimir Putin.
Mr. Kasparov's Putin antipathy is well known to readers of this newspaper, of which he is a contributing editor. "I Was Wrong About Putin" was the headline on his Jan. 9, 2001, op-ed article for this page. One year into Mr. Putin's presidency, Mr. Kasparov sounded an early warning about a man whose "KGB roots have informed a style of governance that is neither reformist nor particularly democratic." Since then, Mr. Kasparov has scarcely let up, retiring from chess in March 2005 in part to devote himself to politics.
Mr. Kasparov's new occupation is not without its perils -- a thought that occurred to me as we arranged to meet earlier this month at his newly refurbished apartment in an art deco building on a smart street in Midtown Manhattan. It's a neighborhood replete with sushi bars -- of the sort that bring to mind, ghoulishly, the late Alexander Litvinenko, who was poisoned with polonium 210.
The doorman announces me, and Mr. Kasparov greets me at the door. We are old interlocutors -- I was present at his first meeting with the editorial board in March 1990 and was his editor at the Journal for years. So we kiss -- twice, once on each cheek, not three times, as is the custom in Moscow. After his wife serves tea -- in bone-china English cups, not à la Russe, in glasses -- I ask Mr. Kasparov about the risks: "Look," he says, "there are certain moments in your life when you should forget calculations and do what you believe is your moral duty. I knew that the choice would be dangerous. That's why our baby was born here. I'm prepared to take all the risk, but if I can avoid some, I do." The Kasparovs have a three-month-old daughter.
"The Other Russia" is the name of the unlikely left-right coalition conceived by Mr. Kasparov in 2005 and founded last year. It is composed of groups that would normally be at political odds -- democrats like Mr. Kasparov, nationalists, socialists, even Bolsheviks. Mr. Kasparov predicts that the Communist Party will join up before the end of the year. "There's still a lot of distrust," he says, with more than a modicum of understatement. "It's a problem, but I don't think it's insurmountable. The big advantage of the Other Russia, and I think it's our biggest accomplishment, is that we've established the principle of compromise, which was not yet seen in Russian politics. It was always confrontation. It was a mentality of a civil war. We eliminated it."
A declaration at the time of the Other Russia's organizing conference last summer reads, "We are gathering together because we are united in our disagreement with the current political course of the Kremlin and united in our alarm for the present and future of our country." The group's sole objective is to find a candidate to run -- and win -- in the March 2008 presidential elections. Or as Mr. Kasparov puts it with characteristic bluntness: "When a liberal democracy is re-established, everybody goes his or her way."
The Russian Constitution forbids Mr. Putin from running for a third term -- though that doesn't quell widespread speculation that the president will ignore the rule of law and do so anyway. He "has the administrative resources" to do so, Mr. Kasparov agrees, but it would be at the price of his legitimacy -- both in the West and at home. "I don't think Putin wants to take such a chance."
Mr. Kasparov believes Mr. Putin's "mentality is just to run away -- with all the Russian billionaires. This is the richest ruling elite in the world. They are way ahead of the Saudi princes. They are mega-rich. When you're so rich, you have to make sure that your funds are safe." But "if Putin goes, then who will be in charge? That's a big problem. Then it's instability. An authoritarian regime cannot have a successor while the big name [Mr. Putin] is still alive, much less well, young and strong."
As the new year unfolds, Mr. Kasparov predicts "a political crisis" in Mr. Putin's government, along with "less stability, more uncertainty." That's the opening for the Other Russia. "We should keep our group together, close to the wall, to get into the hall when it's broken. But not too close to be buried under the debris." And then? "If the Other Russia wins, who cares? The victory of the Other Russia candidate destroys the legacy of any institution built under Putin. You have to start from scratch. You have to call new [parliamentary] elections. You have to introduce new laws. You have to undergo judicial reform. You have to destroy censorship." In short, you have to start over, back to where Russia was before Mr. Putin took over, building democracy, block by block.
The next step for the Other Russia, Mr. Kasparov says, is to come up with a platform and work out the rules for selecting a presidential candidate, tasks that are on the agenda for a conference planned for April. The candidate will likely be chosen in another conference in September or October, Mr. Kasparov explains. At the moment Mikhail Kasyanov, a former prime minister, "looks most prominent."
And what about Garry Kasparov? Is he a candidate? It's the only time in 15 years of conversations with Mr. Kasparov that I've known him to be less than confident in a reply. "So far . . .," he says -- note the "so far" -- "so far, I don't think my personal participation helps the coalition because so far" -- another one! -- "I keep the position of moderator. . . . I keep balance of different forces. If I step into the game, that might jeopardize the whole coalition."
In the course of our discussion, Mr. Kasparov refers often to the lack of a free press in Russia. So how, then, will the Other Russia get its message across? "The role of Internet is growing," he says. "Mobile telephones are not unique anymore, not even in rural villages." But -- and the master chess player may have too much confidence in the analytic abilities of ordinary Russians here -- "more important is growing malcontent. People are getting really unhappy. And if they're unhappy, they'll listen."
Mr. Kasparov is far more worried about money, which is short; but "I think in 2007 we will see a major influx of our financial support from within Russia because people can see that the ground is shaky." The Other Russia won't touch "politically exposed money," he says -- and emphatically denies that exiled oligarch Boris Berezovsky is a donor. But in the end, he says, "You know, you can't buy political support. Either you are the right man at the right place at the right time or no money helps you." More political naïveté?
Our hour nearly at an end, conversation drifts back to the early '90s and the discussions we used to have about Russia and its future. Is there something the U.S. might have done differently back then, I ask, that would have helped keep Russia on the path to democracy?
Mr. Kasparov gives a wry smile. "I think the best thing [the U.S.] could have done was to get Saddam [Hussein] 15 years earlier," he says. "By going after Saddam in 1991, I think we could have saved Yugoslavia from a civil war and could have sent a message, a very powerful message, to many dictators. . . . In 1991, the United States was much stronger and everybody else was much weaker."
The decision to let Saddam stay in power happened under the watch of President George H.W. Bush, whom Mr. Kasparov isn't shy about criticizing. But he's far more scathing about President Bill Clinton. "During the Clinton years, the United States did virtually nothing in the international arena. . . . There were a lot of activities, but when you look at the core events, I think the influence was irrelevant. . . . Leadership. There was no leadership. . . . There was a big window of opportunity to show leadership, in 1992-93. In those years the whole world was in an ambiguous state after the Cold War. It was a new world, and it required leadership. The way Winston Churchill and [Harry] Truman showed it in World War II. . . . Missing this chance and playing sporadically -- you know, boom, boom, you play one move here, one move there. The United States was asleep."
What advice does he have for George W. Bush about helping Russian democracy today? "Stay neutral," comes the swift reply. The "worst thing" that happened to the democracy movement, he says, was the inclusion of Russia in the Group of 7 democracies, now the G-8, a designation he can't bring himself to utter. Now, Washington should take the position that "there must be an election under the Russian constitution. Putin must go, and elections should be held. Period. That's enough. There's no double standard. Obey the Constitution. That's it."
In addition to his work with the Other Russia, Mr. Kasparov continues to write books about chess -- he's up to Volume Six in a series about his great predecessors -- and he has a mass-market book coming out this year called "How Life Imitates Chess," about the decision-making process in chess, business, politics and history. But at least for now, politics has taken the place of chess as the big game in his life: "I just don't see any other choice for me," he says. "As I used to say for 25 years, I am defending the colors of my country. I'm still doing the same, just not at the chessboard. At a much larger board."
Ms. Kirkpatrick is a deputy editorial page editor at The Wall Street Journal.
25 January 2007
Mitt Romney for President - Part IV
Friday, January 5
Dear Damon,
I appreciate your moderate and respectful reply to my objections. It is often hard for non-Mormons to understand how Mormons believe all we do. You at least see how Mormon beliefs and our way of life could be satisfying to educated, reasonable people, among whom you presumably would include Mitt Romney.
What troubles you is the implication of belief in prophetic revelation: Would Mormons perform any dire deed for their prophet no matter how contrary to conscience? And what about the belief that the United States and the Church might combine to dominate the world some day? Would Mitt Romney serve as the tool of Church leaders in facilitating a plan for world domination? His belief in revelation seems to require that he should.
These seem like perfectly legitimate questions, but they have a point only if you assume potentially dark motives on the part of Church leaders. You object that you do not use the word “fanatic” in your article, but the questions evoke the very image of fanaticism I was talking about: evil-minded religious leaders employing their spiritual authority over blindly loyal followers to magnify their own power. That is exactly the picture painted by the nineteenth-century polemicists who labeled Mormons fanatics. And they reached their conclusion in the same way as you do–by “teasing out” implications. The protestations of innocence by Mormons themselves mean nothing. Nor do their actions calm the fears. All that matters is that the reasoning from premise to conclusion–revelation to vicious action–is impregnable. Doubtless without meaning to, you are following the reasoning of the anti-fanatics to its fearful conclusion.
In evaluating the political implications of Mormon beliefs, you should use real facts about real events, not theoretical possibilities. Have Mormon leaders actually used their influence to manipulate politicians in the interest of world domination? What reason is there to think they have this on their minds? The reason Mormons are likely to find your analysis a phantasm is that we rarely, if ever, speculate about what the world will be like when the millennium comes. This is simply not on the agenda of active Mormon concerns, and it is certainly not a “core” belief. If anything, Mormons draw on the tradition that holds that many religions will flourish after the coming of Christ–a kind of American-style tolerance of all faiths. Mormons conscientiously carry the gospel to the world, but I have never heard a Mormon forecast political domination, much less collaboration with the United States government. Are you aware of Church leaders discussing such plans? No.
From your reply, I would judge that you are most concerned about loyalty to prophetic authority. Would Mitt Romney as president give way to immoral and illegal directives from Salt Lake? You make the subtle and interesting point that Mormons have no natural law tradition to constrain a Mormon president–either a president of the Church or the country. Since revelation trumps everything, where are the limits?
Your concern might be alleviated by considering how revelation actually works–in Mormonism and in biblical history. The scriptures themselves place heavy restraints on prophets. It makes a big difference that the moral law is enunciated endlessly in Mormon scriptures. The Ten Commandments were rehearsed in an early revelation, reinstalling them as fundamentals of the Church. Later, the Saints were told “no power or influence can or ought to be maintained by virtue of the priesthood, only by persuasion, by long-suffering, by gentleness and meekness, and by love unfeigned.” Could all this be overthrown by a new revelation? You think that revelation wipes the slate clean, negating everything that went before. But that is not the way prophetic revelation works, now or ever.
The proper analogy is to the courts and the Constitution. The law is what the courts say it is, we assert hyperbolically. Theoretically nine justices can overturn any previous interpretation of the Constitution on a whim. But, in fact, they don’t–and we know they can’t. Their authority depends on reasoning outward from the Constitution and all previous decisions.
The same is true for prophets. They work outward from the words of previous prophets, reinterpreting past prophecy for the present. That was certainly true for Joseph Smith, whose most extreme revelation–plural marriage–was based on plural marriage in the Bible. Prophets do not write on a blank slate. They carry forward everything that went before, adapting it to present circumstances. Like Supreme Court justices, they would put their own authority in jeopardy if they disregarded the past. The moral law, embedded in this revelatory tradition, exercises far greater influence on Mormon thought than the abstractions of natural law could possibly exert.
I am asking you not to focus so narrowly on what you take to be the logical implications of revelation. That is what critics of fanaticism have been doing for centuries. Look at the historical record of the past century as Mormons have entered national politics. Is there evidence of manipulation?
Consider the Church’s own renunciation of control over the consciences of Mormon politicians–a stand Catholics have not taken. Are you saying this is a false front? Keeping in mind the injunction in Mormon scripture to submit to lawful government, is there any real basis for concern?
Best,
Richard
24 January 2007
Mitt Romney for President - Part III
Thursday, January 4
Dear Richard,
I was delighted when I learned that you would be responding to my article on Mitt Romney. I admire your work on Joseph Smith and the beginnings of Mormonism, so I hoped for a critical engagement with the substance of my essay.
I must admit, however, to being disappointed with your response. Instead of answering the questions I pose, you dismiss them as a product of my overheated and paranoid liberal imagination. Unwilling to concede the validity of anything I argued in my piece, you claim that what I wrote “makes no sense” to Mormons–all the while failing to point to a single factual inaccuracy in my article. Rather than engaging with the theological concerns I raise, you say that they all flow from my belief that Mormons are religious “fanatics.” Indeed, you consider this last point so decisive that you use variations on the word “fanatic” 14 times in your 1,000-word response–despite the fact that I never used it or any similarly harsh or dismissive adjective to describe Mormon beliefs in my article.
For the record, I don't consider Mormons to be fanatics. I consider them to be very seriously religious, and I think that their faith deserves respect–certainly far more respect than it has typically been accorded in the press and by evangelical Protestants. I am deeply impressed by the audaciousness of Joseph Smith's revelations. In addition to bringing forth a new 500-page book of scripture and setting out to correct (“retranslate”) the canonical Old and New Testaments, Smith denied the creation of the universe ex nihilo, proposed that God has a body, and suggested that human beings can evolve into Gods themselves. More remarkable still, he persuaded large numbers of people to accept these heterodox beliefs and to risk (and, in many cases, to lose) their lives defending their right to affirm them.
However odd Mormon beliefs may sound to orthodox Christians and doctrinaire secularists, these critics need to recognize that the LDS Church proclaims a vision of the world and God that speaks to something noble in the souls of millions of Mormons and the thousands of people who convert to the Church every year. (This is, in part, what Harold Bloom meant in The American Religion when he accurately described Joseph Smith as one of history’s great religious geniuses.)
It is precisely my respect for Mormonism–my desire to take it and its religious claims seriously–that leads to my disappointment at your response to my article. You say that arguments like mine “baffle” Mormons. But why? I made three interrelated assertions in my essay–that Mormons believe Jesus Christ will return sooner rather than later; that, when he returns, he is likely to rule the world from the territory of the United States; and that the president of the Church is considered to be a prophet of God. Then I teased out various possible political implications of these theological commitments. In your response, you do not take issue with my three assertions, presumably because they are accurate statements of core LDS beliefs. Where my article becomes baffling is thus apparently in its discussion of implications. Mormons, you imply, would never follow a morally questionable or politically perilous pronouncement by the prophet in Salt Lake City.
I do not doubt that you and many other Mormons believe this. But can you tell me (and other non-Mormons) why–on what basis–you believe it? A devout Roman Catholic, for example, would have plenty of theological resources to grapple with an analogous question about following a papal edict. She might begin by pointing out that the Pope is not considered a prophet and is only rarely presumed to speak infallibly. She might then appeal to natural law, which an authentic papal pronouncement could never contradict. Then there is the closed canon of scripture. And a series of binding councils stretching back to the early days of the church. And a nearly 2,000-year tradition of relatively settled dogma and doctrine on faith and morals.
As I explained in my article, Mormonism has none of these moderating safeguards. It considers its leader to be the “mouthpiece of God on Earth.” Mormon cosmology is arguably incompatible with natural law theory. It rejects the authority of every church council accepted by historic Christianity. And its scriptural and doctrinal traditions are fluid and radically open to revision in light of new prophetic revelations. On the other side of the ledger, I also suggested that the hierarchical structure of the LDS Church has tended to have a moderating influence on its leadership and that it might very well continue to do so in the coming years. To this you have added individual conscience, which you believe would keep Mormons from following a questionable prophetic commandment unthinkingly. This is a promising start, but it is only a start. Conscience, after all, is a notoriously unreliable guide to right action–one that is most effective when it supplements firmer sources of morality and belief.
Does Mormonism contain such sources? If so, what are they? I taught at Brigham Young University for two years and count several Mormons among my closest friends, and yet the answer to these questions remains a mystery to me. And LDS culture today is shot through with so many unsettling contradictions that I find it hard to see how this mystery could be dispelled anytime soon. The Church is profoundly conservative, but its theological and historical foundations are incredibly radical (involving not only multiple acts of prophecy and revelation but also the establishment of a polygamous theocracy in the Intermountain West). I know many intellectually curious and skeptical Mormons, but their curiosity and skepticism nearly always remain cordoned off from their religious beliefs.
At the level of the ward (or parish), LDS church life is highly egalitarian, but individual Mormons tend to be extraordinarily deferential to ecclesiastical and political authority. I could go on.
As Mitt Romney prepares to become the most serious Mormon candidate for president in American history, members of the LDS Church (and especially its leading scholars and intellectuals) owe it to themselves and to their country to think deeply and publicly about these issues. The alternative–striking a purely defensive stance and hoping the questions and concerns will go away–is simply not a serious response.
Best,
Damon
23 January 2007
Mitt Romney for President - Part II
by Richard Lyman Bushman
(hat tip: Matt Lybbert, thanks: Connor Boyack)
Dear Damon,
Your anxiety about a Mormon politician knuckling under to a Mormon Church president replays the debate in 1904 over the seating of Apostle Reed Smoot in the United States Senate. Senators kept questioning church president Joseph F. Smith about his control of Mormon politics. Over and over, he assured the committee that he had no intention of dictating Smoot’s votes in the Senate, but the questioning went on.
Now, a century later, we can judge the actual dangers of the Mormon Church to national politics from the historical record. Have any of the church presidents tried to manage Smoot, Ezra Taft Benson, Harry Reid, or Gordon Smith? The record is innocuous to say the least. There is no evidence that the church has used its influence in Washington to set up a millennial kingdom where Mormons will govern the world or even to exercise much sway on lesser matters. It’s a long way from actual history to the conclusion that “under a President Romney, the Church of Jesus Christ of Latter-Day Saints would truly be in charge of the country–with its leadership having final say on matters of right and wrong.”
Mitt Romney’s insistence that he will follow his own conscience rather than church dictates is not only a personal view; it is church policy. The church website makes this explicit: "Elected officials who are Latter-Day Saints make their own decisions and may not necessarily be in agreement with one another or even with a publicly stated church position. While the church may communicate its views to them, as it may to any other elected official, it recognizes that these officials still must make their own choices based on their best judgment and with consideration of the constituencies whom they were elected to represent."
You are going against all the evidence of history and stated church policy in contriving the purely theoretical possibility of Mormon domination. Is that not the stuff from which all paranoid projections on world history have been manufactured?
Liberals must be particularly cautious in speculating about the political intentions of religious groups because of their fascination with fanaticism. Fanaticism is one of the most firmly entrenched stereotypes in the liberal mind. The fanatic is the polar opposite of all that the liberal stands for and thus constitutes a particularly delicious enemy.
Joseph Smith ran up against the fear of fanaticism almost from the beginning. It was the chief underlying cause of the recurrent expulsions the Mormons suffered. When non-Mormons could find no specific infractions to warrant prosecution in the courts, they resorted to vigilante action to drive the Mormons out. The Mormon presence was unbearable because they were so obviously fanatics. Quite typically, the fear of fanaticism led democrats into undemocratic extremes. Mormons were deprived of their property and the right to live and vote in a supposedly open society. In 1846, after a decade and a half of recurring attacks in Missouri and Illinois, a body of armed citizens forced out the pitiful remains of the Mormon population in Nauvoo by training six cannons on the town.
The stereotype of fanaticism is essentially a logical construction. The seemingly airtight logic is that anyone who claims to speak for God must believe he possesses absolute truth with an implied commission to impose that truth on everyone else.
Mohammed, to whom Joseph Smith was frequently compared, used violence. Joseph Smith, lacking the means, tyrannized his own followers and refused to acknowledge the truth of any other doctrines but his own. You assume that Mormon leaders, by the same token, will want to commandeer the United States government to advance their cause.
Nothing Mormons can do will ever alleviate these fears. It did not help that the right of individual conscience in religious matters was made an article of faith, or that the Nauvoo city council passed a toleration act for every conceivable religious group including Catholics, Jews, and “Muhammadans.”
Whatever they said, their neighbors could not believe that the Mormons’ ultimate goal was not to compel everyone to believe as they did.
Your essay chooses not to look at the historical record, because specific facts are irrelevant in explicating fanaticism. It is the logic of revelation that counts. The Mormons have to be interested in world domination because their doctrine requires it of them. Furthermore, they are all dupes of the chief fanatic and will willingly do anything he requires. You cite as proof of this extravagant claim “more than one” undergraduate who said he would kill if commanded. No mention was made of students who said they would have refused. That method is in keeping with the management of the fanatic stereotype. There is no effort to give a balanced picture. Certain key facts or incidents are made archetypal. In unguarded moments or exceptional instances the true nature of the fanatic mind reveals itself.
The unquestioned belief in the potency of fanaticism makes facts unnecessary. Readers know in advance what to expect just as they foresee the ending of a romantic movie far in advance. The art of writing in this mode is to mobilize all of the foreknown elements and arrange them to reach an expected conclusion.
Damon, I thought you moved along judiciously through most of the essay, but you blew your cover in the paragraph of questions to Mitt Romney. There, you try to nail him on his beliefs about the church president being a prophet. It follows necessarily, you think, that, if Romney believes in current prophecy, the church will run the country under his presidency. That leap from assumption to conclusion in one bound is only possible if you are steeped in the logic of fanaticism. For Mormons themselves, it makes no sense.
You are caught in the dilemma that ensnares everyone preoccupied with fanaticism. You describe Mormonism in a way that makes perfect sense to non-Mormons and no sense to Mormons themselves. This means, to me, that you are describing the inside of your own mind as much as the reality of Mormonism. Mormons will hear a lot of this so long as Romney is in the race, and it will baffle them every time.
Best,
Richard Lyman Bushman
22 January 2007
Mitt Romney for President - Part I
Stay tuned for a response from Dr. Richard Bushman (Professor of History, Columbia University), a reply from Linker, and a final rejoinder from Bushman.
The Big Test
By Damon Linker
Within days of stepping down as governor of Massachusetts on January 4, Mitt Romney is expected to announce his candidacy for president. Shortly after that, Romney will almost certainly need to deliver a major speech about his Mormon faith–a speech in the mold of John F. Kennedy’s 1960 address to the Baptist ministers of Houston, Texas, in which the candidate attempted to reassure voters that they had no reason to fear his Catholicism. Yet Romney’s task will be much more complicated. Whereas Kennedy set voters’ minds at ease by declaring in unambiguous terms that he considered the separation of church and state to be “absolute,” Romney intends to run for president as the candidate of the religious right, which believes in blurring the distinction between politics and religion. Romney thus needs to convince voters that they have nothing to fear from his Mormonism while simultaneously placing that faith at the core of his identity and his quest for the White House.
This is a task that may very well prove impossible. Romney’s strategy relies on the assumption that public suspicion of his Mormonism–a recent poll showed that 43 percent of Americans would never vote for a Mormon–is rooted in ignorance and that this suspicion will therefore diminish as voters learn more about his faith. It is far more likely, however, that as citizens educate themselves about the political implications of Mormon theology, concerns about the possibility of a Mormon president will actually increase. And these apprehensions will be extremely difficult to dispel–because they will be thoroughly justified.
The religious right has been enormously successful at convincing journalists not to raise questions about the political implications of a candidate’s religious beliefs. Analyzing the dangers of generic “religion” to the nation’s political life is considered perfectly acceptable–indeed, it has become a cottage industry in recent years–but exploring the complicated interactions between politics and the theological outlooks of specific religious traditions supposedly smacks of bigotry. The focus on Kennedy’s Catholicism in 1960, for example, is today widely derided as a shameful expression of anti-Catholic prejudice that ought never to be repeated.
This is unfortunate. However useful and necessary it may be to engage in theoretical reflection on politics and “religion,” the fact is that there is no such thing as religion in the abstract. There are, rather, particular religious traditions, each of which has its own distinctive history of political engagement (or disengagement, as the case may be). And, certainly, the political history of pre-Vatican II Catholicism–with its overt hostility to modernity, democracy, liberalism, and religious “error,” as well as its emphasis on the absolute authority of the Pope in matters of faith and morals–raised perfectly legitimate questions and concerns about what it would mean for the United States to elect a Catholic to the nation’s highest office.
A very different, though arguably more troubling, set of questions and concerns is posed by the prospect of the nation electing a president who is an active member of the Church of Jesus Christ of Latter-Day Saints (LDS). In some ways, Catholicism and Mormonism present diametrically opposed political challenges to liberal democracy. With Kennedy's faith, the concern was over the extent of his deference to a foreign ecclesiastical authority. The genuine and profound loyalty of Mormons to the United States and its political system is, by contrast, undeniable. Indeed, LDS patriotism flows directly from Mormon theology. And that is precisely the problem.
With few exceptions, America’s Christian, Jewish, and Islamic communities have roots in Europe and the Middle East. However Americanized these communities may be in doctrine and spiritual outlook, their theologies ultimately derive from older and richer traditions that predate the United States. This is true even of the many branches of Protestantism that began and flourished in the New World, nearly all of which have built on Calvinist theological motifs.
Not so for Mormonism. Radicalizing traditional Protestant worries about corruption in the historic church, the religion founded in 1830 by Joseph Smith in upstate New York has understood itself from the beginning to be a “great restoration” of authentic Christianity after an 1,800-year “apostasy” that began with the death of the original apostles. That this restoration took place in the United States was no accident, according to Mormon theology. Smith produced a 500-page document, The Book of Mormon, containing the record of an ancient civilization, descended from the biblical Israelites, that supposedly lived, flourished, and collapsed in the Americas 1,000 years before the arrival of Christopher Columbus. Jesus Christ visited these people after his resurrection in Jerusalem, spreading his gospel in the New World and planting the seeds of its rebirth many centuries later by Smith himself.
In later revelations, Smith went even further in placing the United States–both geographically and politically–at the focal point of sacred history. The Garden of Eden, he claimed, was located in Jackson County, Missouri. The American Founders were “raised up” by God in order to establish a free government that would allow the restoration to occur and the LDS Church to spread the restored gospel throughout the nation and the world. (Accordingly, all 30,000 undergraduates at LDS-owned Brigham Young University (BYU) are required to take “American Heritage”–a course that teaches the “American system of government and institutions in the context of the Restored Gospel.”)
The centrality of the United States to Mormon theology extends beyond the past and present to encompass the end times as well. Like many of the religious groups to emerge from the Second Great Awakening of the early nineteenth century, Mormons are millennialists who believe themselves to be living in the years just prior to the second coming of Christ; hence the words “latter day” in the church's official title. Where the LDS Church differs from other communities gripped by eschatology, however, is in the vital role it envisions the United States playing in the end times. The Mormon “Articles of Faith” teach that, when Christ returns, he will reign “personally upon the earth” for 1,000 years, and LDS interpretations of a passage in Isaiah have led some to conclude that this rule will be directed from two locations–one in Jerusalem and the other in “Zion” (the United States). This belief has caused Mormons to view U.S. politics as a stage on which the ultimate divine drama is likely to play itself out, with a Mormon in the leading role. Joseph Smith certainly thought so, which at least partially explains why he spent the final months of his life–he was gunned down by a mob in Carthage, Illinois, on June 27, 1844–running for president of the United States.
Mormons differ from mainstream Christians in another respect as well: their emphasis on the centrality of prophecy. Christianity in both the Catholic and Protestant traditions holds that direct revelation ended many centuries ago, before the scriptural canon was closed in the late fourth century. Numerous heterodox movements have made contrary claims, of course, but Mormonism is unique in the emphasis it places on prophetic utterances. Not only was the religion founded by a self-proclaimed prophet who brought forth new works of scripture (The Book of Mormon, Doctrine and Covenants, and The Pearl of Great Price) and even rewrote (“retranslated”) passages of the canonical Old and New Testaments in light of his personal revelations; but the man who holds the office of the president of the LDS Church is also considered to be a prophet–“the mouthpiece of God on Earth,” in the words of Mormon theologian and Apostle Bruce McConkie–whose statements override both scripture and tradition.
The truly radical implications of this view were brought home to me during two years (1998-2000) I spent as a (non-Mormon) visiting professor in the political science department at BYU. Like good teachers everywhere, another non-Mormon colleague and I posed moral and ethical dilemmas in our classes in order to encourage our students to reflect on the character of the beliefs they brought to the classroom. What would they do, we wondered, if the prophet in Salt Lake City commanded them to commit murder in the name of their faith, much as the God of the Old Testament supposedly instructed the ancient Israelites to wipe out the Canaanites? More than one pious young Mormon invariably responded by declaring that he would execute the prophet’s commands, no matter what.
The point is not that Americans need to beware a covert genocidal plot by Mormons. On the contrary, LDS prophetic declarations since the late nineteenth century have tended to moderate church teaching, moving the community into greater conformity with mainstream American values–abolishing polygamy in 1890, for instance, and opening the Mormon priesthood to black members of the church in 1978. Yet the response of the BYU students nevertheless points to a potentially dangerous problem in LDS theology–namely that, by elevating prophecy above other sources of revealed truth and by insisting that the words of a prophet supersede mainstream Christian as well as established LDS scripture and tradition, Mormonism opens the door to prophetically inspired acts and innovations, the content of which cannot be predetermined in any way.
Thoughtful Mormons are well-aware of this problem, but the peculiarities of the church and its founding make devising a solution extremely difficult. One option would be for the LDS Church to follow the lead of the Catholic Church in developing a tradition of philosophical reflection on natural law or some other moral ideal to which God and his prophets are assumed to be bound or co-equal. This rationalist tradition could then be used to check the veracity of prophetic pronouncements. The difficulty, however, is that Smith encouraged his followers to cultivate suspicion of philosophy. Mormons assume that the centuries-long “apostasy” that preceded Smith was caused in large part by the rationalizing of faith that took place in the early church. According to Smith, it was questions like the one Socrates posed to Euthyphro–does God love what is good because it is good, or is it good because God loves it?–that led the church fathers and early church councils into theological and doctrinal errors that corrupted Christianity for nearly 18 centuries. To this day, the Mormon church teaches genuine respect for reason only when it operates within the narrow limits set for it by LDS prophecy.
But the obstacles to Mormons developing a binding moral theory go beyond the church’s generalized suspicion of autonomous reason; their concept of God seems to deny the very possibility of such a theory. Unlike the God of Catholics and Protestants–who is usually portrayed as the transcendent, all-powerful, all-good, and all-wise creator of the temporal universe out of nothingness–Smith’s God is a finite being who evolved into his present state of divinity from a condition very much like our own and then merely “organized” preexisting matter in order to form the world. As a result of this highly unorthodox revelation, there is simply no room for a natural morality in Mormon theology, since Mormonism tacitly denies that the natural world possesses any intrinsic or God-given moral purpose. Everything we know–or could ever know–about right and wrong comes entirely from divine commands communicated to humanity by prophets. The idea of appealing to a higher principle against the word of a prophet–the idea, in other words, of using one’s own mind to cast moral or intellectual doubt on the veracity of a prophetic pronouncement–therefore makes no sense in the Mormon conceptual universe.
These limitations have led some leaders of the church to propose that Mormons should look to the currently accepted canon of scriptures revealed by Smith as the standard by which to assess all future revelations. In the words of Joseph Fielding Smith, the tenth president of the church, official LDS scriptural texts should be used as “the measuring yardsticks, or balances, by which we measure every man’s doctrine.” This moderate and moderating view remains a controversial position in the church, however, and for good reason. None other than Joseph Smith and his successor-prophet Brigham Young seemed to take a different stance toward the authority of revelation. Compared with “living oracles,” Young declared, canonical works of scripture “are nothing,” because they “do not convey the word of God direct to us now, as do the words of a Prophet or a man bearing the Holy Priesthood in our day and generation.” To which Smith replied, “Brother Brigham has told you the word of the Lord, and he has told you the truth.”
It is impossible to know how Mormons will resolve this significant tension over the coming years. The church’s current president, 96-year-old Gordon B. Hinckley, has certainly shown no sign of theological radicalism during his eleven-year tenure as prophet. As those who have caught one of his many jovial appearances on “Larry King Live” will have noted, Hinckley is an exceedingly unthreatening figure. And whoever succeeds him may very well prove to be equally anodyne. In practice, the rigidly hierarchical institutional structure of the LDS Church–with the prophet as well as the two counselors with whom he shares the “First Presidency” drawn from the “Quorum of Twelve Apostles”–is remarkably effective at enforcing theological conservatism. It is simply very difficult to rise to the top of the organization without being a consummate company man.
Yet the fact remains that, as it is currently constituted, Mormonism lacks the intellectual or spiritual resources to challenge a declaration of the prophet who runs the church, regardless of how theologically or morally outrageous that declaration might be. Members of the church may insist that non-Mormons have nothing to worry about, since God would never issue an immoral edict, but that is quite obviously a matter of faith–a faith that non-Mormons do not share. As long as the LDS Church continues to insist that its leader serves as a direct conduit from God–a God whose ways are, to a considerable extent, inscrutable to human reason–Mormonism will remain a theologically unstable, and thus politically perilous, religion.
Article VI of the U.S. Constitution famously stipulates that “no religious Test shall ever be required as a Qualification to any Office or public Trust under the United States.” Though the Framers meant to prohibit a test compelling office-seekers to affirm a particular set of religious views, it makes sense to treat the proscription as applying negatively as well–as prohibiting a test that would exclude members of certain religious sects from holding office. In our time of heightened sectarian tensions–when devout believers and secularists increasingly perceive themselves to be stationed on opposite sides of a cultural chasm–it is crucially important that Americans remain committed to allowing every qualified citizen to run for public office, regardless of his or her religious views.
But defending the constitutional right of every qualified citizen to run for office is not the same as saying that a candidate’s religious views should be a matter of indifference to voters. In the case of Mitt Romney, citizens have every reason to seek clarification about the character of his Mormonism. Does he believe, for example, that we are living through the “latter days” of human history, just prior to the second coming of Christ? And does he think that, when the Lord returns, he will rule over the world from the territory of the United States? Does Romney believe that the president of the Mormon Church is a genuine prophet of God? If so, how would he respond to a command from this prophet on matters of public policy? And, if his faith would require him to follow this hypothetical command, would it not be accurate to say that, under a President Romney, the Church of Jesus Christ of Latter-Day Saints would truly be in charge of the country–with its leadership having final say on matters of right and wrong?
One suspects that, if pressed in this way, Romney would seek to assure voters that he would never follow such a command if it conflicted in any way with his oath of office. How such a statement would square with his professed Mormon faith is far from clear, however. Under modern conditions, some religions–Protestantism, post-Vatican II Catholicism, Judaism–have spawned liberal traditions that treat faith primarily as a repository of moral wisdom instead of as a source of absolute truth. Other religions, by contrast, have tended to require believers to accept everything or nothing at all. Mormonism (like Islam, another faith founded in prophecy) is one of the latter, binary religions. When a Mormon stops accepting the binding truth of prophetic revelation, he effectively becomes a lapsed Mormon.
At the beginning of his political career, that description seemed to fit Romney pretty well. In his failed bid to unseat Senator Edward Kennedy in 1994, Romney responded to questions about his faith by stating that he was not running “to be a spokesman for my church.” In the same campaign, Romney also asserted that states should be free to decide whether to allow same-sex marriage, and he demonized Republican “extremists” for seeking to “force their beliefs on others.” These remarks would be unusual for any devout Mormon, but they are especially noteworthy because Romney made them at a time when the LDS Church was actively working to ensure that Hawaii would not become the first state in the nation to–in the words of a church statement issued in February 1994–“give legal authorization or other official approval or support to marriages between persons of the same gender.” Even on abortion–the issue that, more than any other, unites conservative Catholics, Protestants, and Mormons–Romney portrayed himself as a moderate as recently as 2002, claiming in his run for Massachusetts governor that he “would protect the current pro-choice status quo” in the state because “women should be free to choose based on their own beliefs, not the government's.”
But the Mitt Romney currently contemplating a run for the White House is a very different candidate. Seeking to serve as the standard-bearer for the religious right, he now staunchly opposes abortion and supports a constitutional amendment banning gay marriage. He claims, in short, to be a man of deep piety who wishes to increase the role of conservative religion in the nation’s public life. Far from soft-pedaling his faith, as he once did, he now embraces it as central to his political strategy.
A cynic would say that Romney has changed his positions in order to win the Republican nomination and that, in his heart, he’s most likely a lukewarm believer in the doctrines of his church. In that case, non-Mormons may have nothing to fear from a Romney candidacy (though religious conservatives may have grounds for concern about how well he will represent their cause). But there is another possibility: Romney may have undergone an authentic religious rebirth during the last few years–a rebirth that has led him to embrace the fundamental tenets of his church more fully than ever before in his political career. If so, voters need to know it. And they need to think long and hard about the possible consequences of making such a man the president of the United States.
20 January 2007
Pornography--The Real Perversion
On a recent trip to Istanbul I encountered a group of Muslim students who insisted that American culture was morally perverse. They called it “pornographic.” And they charged that this culture is now being imposed on the rest of the world. I protested that pornography is a universal vice. “Yes,” one of the students replied, “but nowhere else is pornography in the mainstream of the culture. Nowhere else is porn considered so cool and fashionable. Pornography in America represents an inversion of values.”
As I returned home to the United States, I wondered: are these students right? I don’t think American culture as a whole is guilty of the charge of moral depravity. But there is a segment of our culture that is perverse and pornographic, and perhaps this part of American culture is the one that foreigners see. Wrongly, they identify one face of America with the whole of America. When they protest what they see as the glamorization of pornography and vice, however, it’s hard to deny that they have a point.
Pornography has become big business in the United States. You no longer have to go places to find it; it now finds you. Once confined to “dirty old men” and seedy areas of town, pornography has now penetrated the hotel room and home. The Internet and cell phone have made pornography accessible everywhere, all the time.
The spread of porn is not surprising, and neither is its popularity. Its appeal is not the appeal of sex but of voyeurism. After all, the actors in porn films seek to gratify not themselves but the viewer. The spectator finds himself in the unnatural position of being witness to a sexual act conducted entirely for his benefit. It's hard to deny that there is something degrading in continuous exposure to increasingly hard-core pornography.
In a manner that the older generation of Americans finds scandalous, porn has become socially acceptable and lost its moral stigma. A good example of this cultural cachet is that today a porn star like Jenna Jameson appears on billboards and on the cover of magazines like Vanity Fair. In some liberal intellectual circles, the advocacy of porn is now viewed as a mark of sophistication. Recently the New Yorker reported on an event held at the Mary Boone art gallery in Manhattan where "artists, collectors, literati, and other art world regulars mingled seamlessly with adult-movie producers and directors and quite a few of the performers themselves." The purpose of the event was to celebrate the publication of the book "XXX: Porn Star Portraits." The pictures in the book are accompanied by appreciative essays by leading figures on the left like Gore Vidal, John Waters, and Salman Rushdie.
The liberal defense of obscenity and pornography began many decades ago as a defense of great works of literature and of free speech. It began as a defense of books like James Joyce’s Ulysses, Flaubert’s Madame Bovary, and D.H. Lawrence’s Lady Chatterley’s Lover. But now some liberal advocates insist that all forms of sexual explicitness are equally deserving of legal protection and that no restriction of obscenity or pornography should be allowed.
This is the position defended in former ACLU president Nadine Strossen’s book Defending Pornography. As liberal pundit Wendy Kaminer puts it, in her foreword to the book, “You don’t need to know anything about art—you don’t even need to know what you like—in order to defend speech deemed hateful, sick or pornographic.” Kaminer even takes the view that child pornography should be permitted because “fantasies about children having sex are repellent to most of us, but the First Amendment is designed to protect repellent imaginings.” Actually this is pure nonsense: the framers were concerned to protect political speech and not depictions of pedophilia. But Kaminer’s view is a good reflection of what some liberals would like the Constitution to say.
Groups like the ACLU have taken the approach that pornography rights, like the rights of accused criminals, are best protected at their outermost extreme. This means that the more foul the obscenity, the harder liberals must fight to allow it. By protecting expression at its farthest reach, these activists believe they are fully securing the free speech rights of the rest of us.
It is a long way, for instance, from James Joyce to a loathsome character like Larry Flynt, the publisher of Hustler magazine. There would seem to be an obvious distinction between fighting to include James Joyce in a high school library and insisting that the same library maintain its subscription to Hustler. For the ACLU, however, the two causes are part of the same free speech crusade. In a sense, the ACLU considers the campaign for Hustler a more worthy cause because if Hustler is permitted, anything is permitted, and therefore free speech has been more vigorously defended.
In recent years, leading liberals have gone from defending Flynt as a despicable man who nevertheless has First Amendment rights, to defending Flynt as a delightful man who is valiantly fighting against the forces of darkness and repression. “What I find refreshing about Larry Flynt is that he doesn’t pretend to be anything other than a scumbag,” Frank Rich writes in the New York Times. “At least Flynt’s honest about what he’s doing.”
These liberal virtues—honesty and openness about being a scumbag—are on full display in Milos Forman’s film The People vs. Larry Flynt. The movie sanitizes Flynt in order to make him a likeable, even heroic figure. In reality Flynt is short and ugly; in the movie he is tall and handsome, played by Woody Harrelson. In life Flynt was married five times. His daughter accused him of sexually abusing her, a charge that Flynt has denied. All of this is suppressed in the movie, where Flynt has one wife and is portrayed as an adoring and supportive husband.
Hustler features a good deal of gross and repellent material, such as its parody of Jerry Falwell having sex with his mother, or its picture of a woman being processed through a meat grinder. The movie, by contrast, features mostly tasteful erotica; if Flynt goes over the line, it is always presented as mischievous fun. If there is anyone who is despicable in the movie, it is Flynt’s critics, who are unfailingly shown as smug, hypocritical, vicious and stupid.
The pornographer generally knows that he is a sleazy operator. I have read interviews with men like Larry Flynt and Al Goldstein, the publisher of Screw magazine. Typically such men do not even try to defend the social value of what they do, other than to point out that there is a demand for it. It is only the ACLU and its supporters who celebrate the pornographer as a paragon of the First Amendment and a contemporary social hero. Social liberals like Frank Rich seem to have a much higher view of Flynt than Flynt has of himself. If we confine ourselves to liberal culture and its apologists, my Muslim interlocutors would seem to have a justified complaint. The liberal defense of pornography is even more perverted than the pornography itself.
Dinesh D'Souza's new book The Enemy at Home: The Cultural Left and Its Responsibility for 9/11 has just been published by Doubleday. D’Souza is the Rishwain Fellow at the Hoover Institution.
19 January 2007
Trade Deficits: Good or Bad?
Two recent articles ought to give pause to the political and journalistic ignorance, perhaps demagoguery, that currently surrounds our international trade deficit. In a December Wall Street Journal article titled "Embrace the Deficit," Bear Stearns' chief economist David Malpass lays additional waste to predictions of gloom and doom associated with our trade deficit.
Since 2001, our economy has created 9.3 million new jobs, compared with 360,000 in Japan and 1.1 million in the euro zone (European Union countries that have adopted the euro), excluding Spain. Japan and euro zone countries had trade surpluses, while we had large and increasing trade deficits. Mr. Malpass says that both Spain and the U.K., like the U.S., ran trade deficits, but they created 3.6 and 1.3 million new jobs, respectively. Moreover, wages rose in the U.S., Spain and the U.K.
Professor Don Boudreaux, chairman of George Mason University's Economics Department, wrote "If Trade Surpluses Are So Great, the 1930s Should Have Been a Booming Decade" (www.cafehayek.com). According to data he found at the National Bureau of Economic Research's "Macrohistory Database", it turns out that the U.S. ran a trade surplus in nine of the 10 years of the Great Depression, with 1936 being the lone exception.
During those 10 years, we had a significant trade surplus, with exports totaling $26.05 billion and imports totaling only $21.13 billion. So what do trade surpluses during a depression and trade deficits during an economic boom prove, considering we've had trade deficits for most of our history? Professor Boudreaux says they prove absolutely nothing. Economies are far too complex to draw simplistic causal connections between trade deficits and surpluses and economic welfare and growth.
Despite all the criticism from abroad and the doom-mongers at home, the world finds our economy attractive. Just as we've been chomping at the bit to buy foreign goods and services, foreigners have been chomping at the bit to invest trillions of dollars in the U.S. Mr. Malpass says our 10-year government bonds yield 4.6 percent per year compared with Japan's 1.6 percent; our government debt is 38 percent of GDP versus 86 percent in Japan; and while Europe's debt to GDP ratio is not as extreme as Japan's, it's not nearly as favorable as ours.
Here's a smell test. Pretend you're a man from Mars knowing absolutely nothing about Earth and you're looking for a nice place to land. You find out that there's one country, say, country A, where earthlings from other countries voluntarily invest and entrust trillions of dollars of their hard earnings. There are other countries where they're not nearly as willing to make the same investment. Which one of those countries would you deem the most prosperous and with the greatest growth prospects? You'd pick country A, which turns out to be the United States. As such, you'd be just like most of the world's population who, if free to do so, would invest and live in the U.S.
The late Professor Milton Friedman said, "Underlying most arguments against the free market is a lack of belief in freedom itself." Some people justify their calls for protectionism by claiming that they're for free trade but fair trade. That's nonsense. Think about it: When I purchased my Lexus from a Japanese producer, through an intermediary, I received what I wanted. The Japanese producer received what he wanted. In my book, that's a fair trade.
Of course, an American auto producer, from whom I didn't purchase my car, might whine that it was unfair. He would like Congress to impose import tariffs and quotas to make Japanese-produced cars less attractive and available in the hopes that I'd buy an American-produced car. In my book, that would be unfair.
18 January 2007
Bush's Historic Veto
By Charles Krauthammer
When President Bush announced in August 2001 his restrictive funding decision for federal embryonic-stem-cell research, he was widely attacked for an unwarranted intrusion of religion into scientific research. His solicitousness for a 200-cell organism — the early embryo that Bush declared should not be destroyed to produce a harvest of stem cells — was roundly denounced as reactionary and anti-scientific.
And cruel to boot. It was preventing cures for thousands of people with hopeless and terrible diseases, from diabetes to spinal-cord injury. As John Edwards put it most starkly and egregiously in 2004: If John Kerry becomes president, Christopher Reeve will walk again.
This kind of stem-cell advocacy did not just shamefully inflate its promise. It tended to misrepresent the basis for putting restrictions on embryonic research, insisting that it was nothing more than political enforcement of the religious fundamentalist belief that life begins at conception.
This has always been a tendentious characterization of the argument for restricting stem cell research that relies on the destruction of embryos. I have long supported legal abortion. And I don’t believe that life — meaning the attributes and protections of personhood — begins at conception. Yet many secularly inclined people like me have great trepidation about the inherent dangers of wanton and unrestricted manipulation — to the point of dismemberment — of human embryos.
You don’t need religion to tremble at the thought of unrestricted embryo research. You simply have to have a healthy respect for the human capacity for doing evil in pursuit of the good. Once we have taken the position of many stem-cell advocates that embryos are discardable tissue with no more intrinsic value than a hangnail or an appendix, then all barriers are down. What is to prevent us from producing not just tissues and organs, but human-like organisms for preservation as a source of future body parts on demand?
South Korea enthusiastically embraced unrestricted stem cell research. The subsequent greatly heralded breakthroughs — accompanied by lamentations that America was falling behind — were eventually exposed as a swamp of deception, fraud and coercion.
The slope is very slippery. Which is why, even though I disagreed with where the president drew the line — I would have permitted the use of fertility-clinic embryos that are discarded and going to die anyway — I applauded his insistence that some line must be drawn, that human embryos are not nothing, and that societal values, not just the scientific imperative, should determine how they are treated.
Congress will soon vote to erase Bush’s line. But future generations may nonetheless thank Bush for standing athwart history, if only for a few years. It gave technology enough time to catch up and rescue us from the moral dilemmas of embryonic destruction. It has just been demonstrated that stem cells with enormous potential can be harvested from amniotic fluid.
This is a revolutionary finding. Amniotic fluid surrounds the baby in the womb during pregnancy. It is routinely drawn out by needle in amniocentesis. The procedure carries little risk and is done for legitimate medical purposes that have nothing to do with stem cells. If it nonetheless yields a harvest of stem cells, we have just stumbled upon an endless supply.
And not just endless, but uncontroversial. No embryos are destroyed. The cells are just floating there, as if waiting for science to discover them.
Even better, amniotic fluid might prove to yield an ideal stem cell — not as primitive as embryonic stem cells and therefore less likely to grow uncontrollably into tumors, but also not as developed as adult stem cells and therefore more “pluripotential” in the kinds of tissues it can produce.
If it is proved that these are the Goldilocks of stem cells, history will record the amniotic breakthrough as the turning point in the evolution of stem cell research from a narrow, difficult, delicate and morally dubious enterprise into an uncontroversial one with raw material produced unproblematically every day.
It will have turned out that Bush’s unpopular policy held the line, however arbitrary and temporary, against the wanton trampling of the human embryo just long enough for a morally neutral alternative to emerge.
And it did force the country to at least ponder the moral cost of turning one potential human being into replacement parts for another. Who will be holding the line next time, when another Faustus promises medical nirvana if he is permitted to transgress just one moral boundary?
17 January 2007
Election/Playoff Fatigue
After Bush
Is conservatism finished?
By Wilfred M. McClay
Even before November's midterm elections and the Republican party's loss of its congressional majorities, there was widespread talk of the exhaustion, even death, of conservatism in America. Over the past year or so, indeed, every new day has seemed to bring another article or book on the subject. Gathering steam as the election approached, such inquests became as popular among conservatives themselves as among liberals. Each offered a distinctive thesis or complaint relating to a perceived malfeasance of the Bush administration, whether in foreign policy, social policy, homeland security, domestic spending, corruption, or any number of other areas.
One particularly notable gesture of disaffection appeared on the very eve of the election, when, in a symposium titled "Time for Us to Go," a group of seven self-identified conservative writers were moved to publish, in the liberal Washington Monthly, their reasons why the Republicans deserved to lose. While not exactly the "A" list of conservative minds, these writers, ranging from Christopher Buckley to Joe Scarborough (the former Florida Congressman turned talk-TV host), urged the defeat of their party for the sake, precisely, of the future health of conservatism itself. But their words contributed mightily to a growing general impression: that after a run of two decades or so, conservatism's day in the American political sun was drawing to a close.
For liberal Democrats, this was a termination devoutly to be wished. So intense, indeed, was the pent-up need of the Democratic party and its media allies for a victory dance in the end zone that the high-stepping began long before any touchdowns had actually been scored. The columnist Joe Klein's exultant observation in Time just prior to the elections--"2006 may be remembered as the year that the Reagan Revolution finally crested and began to recede"--was just one of hundreds of such gun-jumping predictions.
Yet it is now clear that the results of the vote, while a solid reversal of a kind not seen since the more epochal midterm Republican victories of 1994, hardly justified this extravagance. In comparison with similar historical circumstances, the GOP's losses were quite modest, leaving the Democrats with only relatively thin majorities in both houses of Congress. This was all the more impressive given the pervasive national mood of discouragement over the war in Iraq. Nor did anything about the GOP losses justify the claim that conservatism lost, or that the slow movement of the American electorate to the center-Right of the political spectrum has stopped or even diminished, let alone reversed.
Some Republican defeats, for example, including that of the liberal Republican Senator Lincoln Chafee of Rhode Island, effected no change in the ideological balance and can hardly be seen as a setback for conservatism. On the Democratic side, meanwhile, the remarkably easy triumph of the highly targeted, much-reviled Senator Joseph Lieberman over his more liberal anti-war challenger was a bellwether. So too were the Senate victories of such relatively conservative Democrats as James Webb in Virginia and Robert Casey, Jr. in Pennsylvania. There was also the surprisingly strong showing of Harold Ford, Jr., the Democratic Congressman who promised Tennesseans that if they elected him to the Senate, they would get "a gun-loving, Jesus-loving American who thinks that taxes ought to be lower and America ought to be stronger." In the event, most Tennesseans were not quite willing to buy that assertion, but there can be no doubt that they took Ford seriously in offering it.
In short, it is still unclear that the achievement of a majority of congressional members with the letter "D" after their names means a shift in the ideological balance of the nation. The internal Democratic fissures that opened up immediately after the election--as in the struggle between Nancy Pelosi and Steny Hoyer over leadership of the House, and the patent discomfiture within the party over certain likely appointments to key committee chairmanships--suggest that electoral victory has not automatically conferred a durable majority, let alone a governing vision.
As the liberal journalist Michael Tomasky observed in an acute analysis penned in April 2006: "What the Democrats still don't have is a philosophy, a big idea that unites their proposals and converts them from a hodgepodge of narrow and specific fixes into a vision for society." The Democratic party won its new majorities largely on the basis of general discontent. It will take a Democratic party that actually stands for something other than the obstruction and investigation of George W. Bush to achieve more than temporary electoral reversals.
It would be foolish to deny the importance, or the usefulness, of closely examining the performance of the Bush administration with regard to the entire range of government policy and actions. Precisely by having made such an act of reconsideration imperative, the 2006 election results may even turn out to be a blessing in disguise for Republicans in particular and conservatives in general. This is how democracies are supposed to operate. Moreover, the ability of conservatives to engage in self-criticism is surely a salutary thing--so long as the self-criticism is both honest and accurate.
Is that the case in this instance? Before turning to the substantive points raised by Bush's conservative critics, one is bound to note the startling weakness for hyperbole and the bitter invective in their writings, often the signs of unrealistic expectations and narrow or sectarian agendas. In addition, almost all of them judge Bush, and find him woefully wanting, by the standard of Ronald Reagan, thereby demonstrating a limited ability to recall what the now-sainted Reagan administration was actually like, let alone what sorts of criticisms it had to bear during its time--and where those criticisms came from. None seems to remember Reagan's famous embrace of an Eleventh Commandment--"Thou shalt not speak ill of a fellow Republican"--let alone the context in which it arose: namely, the bitter intra-party struggles of the early 1960s in which liberal Republicans sought to block the rising Goldwater movement in their midst.
Americans in general too easily forget such times of struggle and division, making them over into placid and uncomplicated memories. A bipartisan example of this creative amnesia occurred at the time of Reagan's death in June 2004 and spilled over into that year's presidential campaign. Television journalists and Democratic candidates alike repeatedly contrasted the idyllic spirit of unity at home and cooperation abroad that allegedly prevailed during the cold-war years under Reagan with the national disunity prevailing over the Iraq issue under Bush. Many Americans, even some old enough to know better, seem actually to have credited such ridiculous assertions.
We forget, too, that predictions like Joe Klein's have been made again and again since 1981. We forget that the current charges of "theocracy" were thoroughly rehearsed in the Reagan years, when Reagan's open support for the beliefs of evangelicals was passionately decried, and his affirmation of the veracity of the Bible was used against him (notably in the 1984 campaign) to suggest that he would recklessly seek to bring on Armageddon. And we forget that not only Reagan but every Republican President since Eisenhower has been solemnly adjudged a cretin by the national press during his time in office, only--even unto the supposedly irredeemable Richard Nixon--to be turned into a wise leader after his departure from power.
We also forget that the Reagan administration itself, far from being happily unified, was driven by internal battles between "pragmatists" and "ideologues," conflicts that prefigured many of the policy battles of the present. And we forget that, outside the administration, Reagan got plenty of grief from his own Right as well.
The querulous Richard Viguerie, for example, an influential but notably unhappy camper in those halcyon days, began hectoring the Reagan presidency almost from the beginning, complaining to the Associated Press in January 1981 that with his cabinet appointments Reagan had given conservatives "the back of his hand." A July 1981 op-ed by Viguerie in the Washington Post, entitled "For Reagan and the New Right, the Honeymoon Is Over," was thoughtfully timed less than four months after the President had nearly been killed by an assassin's bullet. By December 1987, Viguerie was declaring that Reagan had actually "changed sides" and was "now allied with his former adversaries, the liberals, the Democrats, and the Soviets." A year later, in the final months of his presidency, when it was clear to all that Reagan had fundamentally changed the terms of debate in American politics, Viguerie announced that, thanks to his tenure in office, "the conservative movement is directionless."
It is especially pertinent to recall such statements when one opens Viguerie's current book, a catalog of Bush-administration horrors whose pages are replete with inspirational Reagan quotations and the highest praise for Reagan and his appointees. For a movement that claims to rest upon long perspectives and deep cultural sources, American conservatism can be remarkably short-sighted, impatient, brittle, fractious, and downright petulant. Indeed, conservatism has been found by its adherents to have "cracked up" or "lost its soul" more times than are worth counting in the years since 1980 (at least as many times as America has "lost its innocence").
But these crack-ups have been mainly in the eyes of their beholders. The simple fact, to repeat, is that the American electorate has, by most measures, moved slowly but steadily in a conservative direction since 1968, in a pattern that the two moderate, Southern Democratic presidencies of Jimmy Carter and Bill Clinton did more to confirm than to interrupt. The adjustments brought about by the 2006 midterm elections have done little to alter this pattern.
Still, pointing to long-term political success does not necessarily answer the charge that the Bush administration represents an egregious departure from or even a betrayal of conservative ideals and principles. This is precisely the contention of Jeffrey Hart, a longtime senior editor of National Review. Near the conclusion of his "The Making of the American Conservative Mind," a relaxed, chatty, anecdotal account of the past half-century through the lens of National Review's reporting and editorials, Hart comes down hard on Bush, a "transformative" President who in Hart's judgment is emphatically not a conservative one.
Hart finds two principal and related failings in Bush. First, in the Iraq war, and in seeking to "cure" the problems of the Middle East by imposing a regime of "modernization and democratization," Bush has demonstrated that he is at heart a "hard Wilsonian"--that is, a utopian thinker who has wedded the use of "concerted military force" to Woodrow Wilson's very unconservative brand of "optimistic universalism." Second, Bush's determination to allow his own evangelical Christianity to influence his thinking and actions, particularly in his zeal for large-scale social and moral reform, departs from "the accepted convention in America . . . that religious beliefs are a private matter." It also runs counter to the strong conservative preference for "magisterial" and "traditional" forms of religion that, unlike evangelicalism, do not so much challenge a culture as stabilize it.
Hart's criticisms, stated relatively mildly in his book, took on much more strident expression in his contribution to the pre-electoral Washington Monthly symposium. There, Hart described Bush as "a man who has taken the positions of an unshakable ideologue" on issues ranging from the Terri Schiavo case to supply-side economics. If conservatism is a "politics of reality," this President, wrote Hart, lives sheltered in delusion. No longer is the adjective "Wilsonian" sufficient to describe his disconnectedness from reality. Indeed, Bush's naïve belief in the universality of the human preference for freedom over tyranny makes "Woodrow Wilson look like Machiavelli" by comparison.
In the end, Hart finds that "Bushism" has so subverted the principles of conservatism as to have "poisoned the very word." Others are of the same opinion. In particular, the vexed question of religion, and of religiosity, has given rise to its own avalanche of conservative anti-Bushians, including not only Hart but Kevin Phillips in his over-the-top "American Theocracy," Damon Linker in "The Theocons," and especially Andrew Sullivan in "The Conservative Soul."
Although all of these authors complain about the excessive influence of religious faith in present-day conservative politics, the specifics of their indictment vary. Where Hart focuses on the emotionalism of evangelical Protestants, Linker concentrates on the influence of a small cell of conservative Catholics around the journal First Things, while Sullivan, who paints with a broad brush, denounces "Christianists" (his analogue to "Islamists") of all persuasions.
"The defining characteristic of the conservative," Sullivan asserts, "is that he knows what he doesn't know." This stance of systematic modesty, or principled unprincipledness, undergirds the way Sullivan himself, an avowed if unorthodox Catholic, proposes to understand politics, culture, society, and religion itself. His own, properly "conservative" perspective, he writes, stands as a bulwark between two antithetically dangerous forces: "fundamentalism," which fraudulently claims to be in full possession of the truth, and "nihilism," which fraudulently denies that truth exists. But, for Sullivan, the former is a much greater enemy than the latter. Fundamentalism, he asserts, turns religion into a "mechanism for social order" or "a regulatory scheme to keep human beings in line." By thus denying the essentially individual and mystical character of religious experience, it amounts to nothing less than a "profound blasphemy." Judged according to this standard, it would appear, George W. Bush is the blasphemer-in-chief.
What, then, of the particular conservative principles that Bush is charged with betraying? One such principle, according to some conservatives, is the limitation of executive power. But Thomas Jefferson, who himself held a strict-constructionist view of executive authority, violated that view in order to undertake the Louisiana Purchase, which doubled the size of the nation and made it a continental power. Abraham Lincoln made extensive use of executive authority, including the suspension of basic civil liberties, in order to prosecute the Civil War and save the Union. During the Eisenhower administration, the exercise of federal authority to enforce basic civil rights for blacks in the Jim Crow South righted a historical wrong that seems unlikely to have been righted in any other way.
It is in fact a perfectly respectable conservative principle that leadership sometimes demands bold actions undertaken with the right ends in view. This, indeed, is the situation in which we find ourselves today, in what is likely to be a prolonged conflict with determined, well-organized, and well-funded transnational Islamic terrorists. It was one thing to assert, with John Quincy Adams in 1821, that the United States does not go abroad in search of monsters to destroy; at the time, in any case, there was hardly much choice about the matter. It is quite another thing to stand on such a dictum in 2006, in the name of limited government, while remaining oblivious to the nature of the challenges before us.
The proliferation of weapons of mass destruction, along with the incapacity or unwillingness of international and multilateral organizations to contain or control them, and our own growing vulnerability to their use by shadowy proxies or groups accountable to no one, leaves the United States no responsible choice but to act vigorously and even pre-emptively in ways that an older conservatism could never have envisioned and would not have approved. That fact does not make such action imprudent; on the contrary, a failure to act, because of prior ideological commitments to a particular understanding of conservatism, would represent a lapse of prudence, and a betrayal of the core conservative imperative to defend and protect what is one's own.
Nor is Bush's insistence on the universal appeal of free institutions out of line with a sensibility that since the American Revolution has envisioned the United States as a carrier of universal values and a beacon to the rest of the world. Hart decries this "Wilsonian" aspect of Bush's presidency as a form of Jacobinism, promising the forced conversion of the world to American values and practices. But what has Bush said that is not a restatement of what Ronald Reagan said so often and with such conviction? Consider Reagan's address to the British parliament on June 8, 1982, a self-conscious echo of Churchill's "Iron Curtain" speech 36 years earlier:
We must be staunch in our conviction that freedom is not the sole prerogative of a lucky few, but the inalienable and universal right of all human beings. . . . The objective I propose is quite simple to state: to foster the infrastructure of democracy, the system of a free press, unions, political parties, universities, which allows a people to choose their own way to develop their own culture, to reconcile their own differences through peaceful means. This is not cultural imperialism, it is providing the means for genuine self-determination and protection for diversity. Democracy already flourishes in countries with very different cultures and historical experiences. It would be cultural condescension, or worse, to say that any people prefer dictatorship to democracy.

If Bush has abandoned conservatism in saying such things and acting upon them, then what are we to make of Reagan?
As for the inherent tension between religion and conservatism in America, that is real enough, and we hardly need the latest tomes on the subject to tell us about it. As I have pointed out in Commentary, the evangelical Protestantism that gives American religion much of its distinctive form and energy, and that fuels George W. Bush's own commitments, is a faith of personal and social transformation, constantly seeking to challenge the status quo. Thus, although evangelicalism can be a force of moral conservatism, it can also be a force of moral radicalism, calling into question the justice and equity of the most basic structures of social life. As Hart recognizes, it does not share traditional conservatism's preference for stasis, prejudice, and custom.
This is no doubt why Bush's evangelicalism, itself as American as apple pie, makes him so unpopular among many conservatives. They see it, rightly enough, as the source of his involvement of the federal government in promoting educational reform, his faith-based initiative, his African AIDS initiative, and the like--in short, of that "compassionate conservatism" which to them reeks equally of do-goodism and unlimited government. And if that is the case with conservatives who, like Jeffrey Hart, prefer a gentlemanly privatization of faith, it is all the more the case with those who prefer their conservatism without any element of religious faith at all.
The intellectual historian Jerry Z. Muller has argued in the past that a distinction needs to be made between conservatism and religious orthodoxy of any kind, on the grounds that too much admixture of religion may interfere with the "epistemological modesty" and empiricism that Muller regards as essential to conservative thought. Such an idea seems to have taken up residence in the thinking of Andrew Sullivan, who places the principle of "doubt" in the center of his own understanding of conservatism and who regards deviations from it as blasphemous. By this standard, however, nearly all the religious activity of human history has been blasphemous.
A more honest account of that history, as well as of the history of American conservatism, would reflect instead the dictum of Russell Kirk that "conservatives generally believe that there exists a transcendent moral order, to which we ought to try to conform the ways of society." Religion, after all, is a social and not merely an individual institution--one presumably cannot be a Methodist or Baptist all by oneself--and in the formation of social institutions the past is our indispensable teacher. Conservatives revere tradition not merely because it is, in Sullivan's bland words, "where you start from," but because they regard what past generations have discovered and passed along to us as precious. No part of a culture's traditions is more central than its religious institutions, and none more inseparable from its political and social health.
Here, too, one turns with profit to Reagan. Consider, to pick just one among scores of examples, this portion of his 1983 speech to the National Association of Evangelicals, a speech best known for his naming the Soviet Union as "the focus of evil in the modern world":
While America's military strength is important, let me add here that I've always maintained that the struggle now going on for the world will never be decided by bombs or rockets, by armies or military might. The real crisis we face today is a spiritual one; at root, it is a test of moral will and faith.

Much has been made of the fact that there are Bible studies and prayer meetings going on in the Bush White House. One wonders why this fact is more sinister in the eyes of Bush's detractors than what Reagan said to the nation and the world in his First Inaugural Address:
Whittaker Chambers, the man whose own religious conversion made him a witness to one of the terrible traumas of our time, the Hiss-Chambers case, wrote that the crisis of the Western world exists to the degree in which the West is indifferent to God, the degree to which it collaborates in Communism's attempt to make man stand alone without God. And then he said, for Marxism-Leninism is actually the second-oldest faith, first proclaimed in the Garden of Eden with the words of temptation, "Ye shall be as gods."
I believe we shall rise to the challenge. I believe that Communism is another sad, bizarre chapter in human history whose last--last--pages even now are being written. I believe this because the source of our strength in the quest for human freedom is not material, but spiritual. And because it knows no limitation, it must terrify and ultimately triumph over those who would enslave their fellow man. For in the words of Isaiah: "He giveth power to the faint; and to them that have no might He increaseth strength. But they that wait upon the Lord shall renew their strength; they shall mount up with wings as eagles; they shall run, and not be weary."
I am told that tens of thousands of prayer meetings are being held on this day, and for that I am deeply grateful. We are a nation under God, and I believe God intended for us to be free. It would be fitting and good, I think, if on each Inauguration Day in future years it should be declared a day of prayer.

Evidently Reagan did not think this amounted to the proposing of a Christian theocracy, and neither need we.
Paradoxically, the profusion of books about the death-rattle of conservatism might be better understood as a sign of conservatism's continuing importance. It is no small matter when writers as diverse as Jeffrey Hart and Andrew Sullivan want to contend for the ownership of, or at least identification with, the label "conservative"--especially when the Democratic party's most promising younger candidates still run like the wind away from the label "liberal," or when those of us who teach in colleges and universities observe that our students are nearly always much more conservative than their professors.
But the survival of a political idea depends upon its adaptability, and conservatism has the advantage of a certain flexibility built into its nature. In his great novel "The Leopard" (1958), Giuseppe di Lampedusa places this insight in the mouth of his protagonist Don Fabrizio, a proud and large-gestured Sicilian aristocrat who was fated to live through the abrupt transition from traditional society to the modern democratic order in the late 19th century. "Things must change," Don Fabrizio reflects, "if they are to remain the same," and the novel relates his agonizing efforts to accommodate a new world without becoming a slave to it. A similar task lay behind the conservatism of Alexis de Tocqueville, another aristocrat (and would-be statesman) who was fated to live on the cusp of vast historical change, and who sought to preserve what was best about the old order while accepting the inevitability of the new.
American conservatism has these features, too, but also some of its own. For Americans, as for others, a conservative sense of the past is expressed partly through shared stories and sufferings and customs, the mystic chords of memory. But that is only part of the story. In the United States, national identity is expressed as well through loyalty to the country's founding principles and propositions, and to quasi-scriptural documents, like the Declaration of Independence and the Constitution, which seek to express them.
Many of these principles, including the "self-evident" assertion that "all men are created equal" and possess "inalienable rights," have always been put forward as statements of universal scope, and not merely particular or local values. Their universalistic implications have a tendency, indeed, to cut against the equally vital elements in the conservative tradition that argue for the primacy of the local, the settled, and the particular. The same is true of the culturally dominant Protestant emphasis on the primacy of the individual conscience, which also takes on a universalistic character, putting loyalty to principle above loyalty to settled traditions.
To revere America without honoring these principles would mean revering a different country from the one we actually inhabit. But it is true that the principles are not always themselves conservative, either in their applications or their effects. Hence the inherent tendency of American conservatism to show, as the political scientist Walter Berns has pointed out, a dual aspect, combining the customary and the propositional, the affective and the rational, the particular and the general. One should love one's country both for what it is and for what it stands for; both because it is one's own and because it embodies or aspires to the highest and finest ideals.
In the conservative view, love for America cannot mean merely a love for disembodied abstract ideals yet to be achieved, as Richard Rorty and other leftists would have it. But neither can it be merely a primal love for the "fatherland," a term that has never had much of a place in American discourse. It is in fact both, and the two are inseparable.
If conservatism as a philosophy, or an ideology, is a more various thing in the American context than Bush's conservative critics allow, conservatism in American politics is less an ideology than a coalition. It has many different flavors and strands, and there is no sense in pretending that they do not occasionally conflict with one another, or tug at the fabric of the whole. As in any coalition, not all of the pieces fit together coherently. Rather, they resemble in some respects the structure of a crossword puzzle, in which not all lines intersect, but all are nonetheless connected.
This is always frustrating to those who want their ideology neat and pure. But show me a political movement that has a clear, crisp, unambiguous, and systematic philosophy and I will show you a movement that will lose, and will deserve to lose. The most successful coalition in modern American political history, the New Deal coalition wrought by Franklin Roosevelt, lasted for nearly four decades. It was a political philosopher's nightmare but a political scientist's joy, a hodgepodge of conflicting political principles and starkly opposed regional coalitions, the most notable of which was the now nearly incomprehensible alliance of Northern liberals and Southern segregationists.
In any event, a coalition's strength is partly a function of the force and quality of its opposition. The fissures and conflicts within conservatism are getting so much attention now because conservatism is still, intellectually speaking, where the principal action remains. So long as the Democratic party continues down the road it has been following, led by its aging left-wing lions and lionesses, funded and directed by the most extreme and irresponsible elements in its ranks, and finding clarity only in discrediting George W. Bush and regaining office, conservatives will always have plenty to unify around. For their own part, so long as conservatives are able to remember Ronald Reagan as a leader who not only embodied the distinctive characteristics of American conservatism but who finessed its antinomies and persevered against the contempt and condescension of his own era--including among some of his allies--they can yet regain their bearings and prevail.
Mr. McClay, who holds the SunTrust Chair of Excellence in the Humanities at the University of Tennessee at Chattanooga, will be spending the spring 2007 semester as senior Fulbright lecturer in American history at the University of Rome. His most recent book is Figures in the Carpet: Finding the Human Person in the American Past (Eerdmans).