(Note: this post has been edited a few times since its original posting; sorry for any confusion, regular readers.)
Some thoughts from my American Church/Religion History class.
Did we ever have a Christian nation?
It depends on how you define Christian. The morals people lived by? Kind of Christian (if you ignore that whole slavery thing). The theology they professed? Also kind of Christian; again, it depends on whether you ask the average Joe in the shop or field, or the intellectuals at the Continental Congress. One thing is for sure: you can't say that everyone believed one thing, or was even Christian. There was a plethora of opinions and beliefs well before 1787, when the Constitution was written.
However, by and large the history of our morals is Judeo-Christian, something that didn't start to change significantly until recently. BUT we routinely get off track, especially in the area of social justice (which should include a pro-life stance), and particularly in race relations.
Here is the ironic part.
The people who tend to proclaim that we were a Christian nation (mostly evangelicals) also have a pretty narrow view of what constitutes a Christian. (I'm not saying this is bad…or good; that's another post.)
Clearly, many of the founding fathers would not fit an evangelical definition of Christian.
Now, considering that few evangelicals want to be known as ecumenical, it's pretty ironic that they gravitate toward our sometimes deist founding fathers, and toward others who were hardly evangelical.
Whatever we used to be, Christian or not, we aren't now; and we're not going to be by passing laws that require people to act in a manner they don't believe in.
(Though we do need to pass laws that protect people who may be infringed upon or otherwise harmed, which is why I support affirmative action, pro-life laws, and some environmental laws.)