We’re at the end of white Christian America. What will that mean?

[Ed. – The fact that the author presents this as an indisputable truth tells you all you need to know.]

America is a Christian nation: this much has always been a political axiom, especially for conservatives. Even someone as godless and immoral as the 45th president feels the need to pay lip service to the idea. On the Christian Broadcasting Network last year, he summarized his own theological position with the phrase: “God is the ultimate.”

And in the conservative mind, American Christianity has long been hitched to whiteness. The right learned, over the second half of the 20th century, to talk about this connection using abstractions like “Judeo-Christian values”, alongside coded racial talk, to let voters know which side they were on.

But change is afoot, and US demographics are morphing with potentially far-reaching consequences. Last week, in a report entitled America’s Changing Religious Identity, the nonpartisan research organization Public Religion Research Institute (PRRI) concluded that white Christians were now a minority in the US population.

Continue reading →