Is the United States a "Christian nation"? Some Americans think so. Religious Right activists and right-wing television preachers often claim that the United States was founded to be a Christian nation. Even some politicians agree. If the people who make this assertion are merely saying that most Americans are Christians, they might have a point. But those who argue that America is a Christian nation usually mean something more, insisting that the country should be officially Christian. The very character of our country is at stake in the outcome of this debate.