WAS AMERICA FOUNDED AS A CHRISTIAN NATION?
I hear this so much today. The logic is this:
1. America was founded as a Christian nation.
2. Christians have lost control of the nation.
3. Christians need to (and have a right to) take it back.
I do not believe the premise that America was founded as a Christian nation. I think the evidence for that premise is exaggerated and misstated.
Consider this. In 1797, the United States entered into a treaty with Tripoli, an Islamic nation whose pirates had caused American shipping a great deal of trouble. Tripoli was reluctant to enter into the treaty, for its leaders saw the difficulty of a Christian nation and an Islamic nation trying to get along.
George Washington's administration negotiated the treaty. It was ratified by the Senate. It was signed by President John Adams.
Look what it says: "As the government of the United States of America is not in any sense founded on the Christian religion, as it has in itself no character of enmity against the laws, religion or tranquility of Muslims, . . . it is declared by the parties that no pretext arising from religious opinions shall ever produce an interruption of the harmony existing between the two countries" (Hunter Miller, Treaties and Other International Acts of the United States [Government Printing Office, 1930], Vol. II, p. 365).
So, when you criticize the current president for saying America is not a Christian nation, remember that the beloved first president's own treaty said the same thing.