We’re at the end of white Christian America. What will that mean?

[Ed. – The fact that the author presents this as an indisputable truth tells you all you need to know.]

America is a Christian nation: this much has always been a political axiom, especially for conservatives. Even someone as godless and immoral as the 45th president feels the need to pay lip service to the idea. On the Christian Broadcasting Network last year, he summarized his own theological position with the phrase: “God is the ultimate.”

And in the conservative mind, American Christianity has long been hitched to whiteness. The right learned, over the second half of the 20th century, to talk about this connection using abstractions like “Judeo-Christian values”, alongside coded racial talk, to let voters know which side it was on.

But change is afoot, and US demographics are morphing with potentially far-reaching consequences. Last week, in a report entitled “America’s Changing Religious Identity”, the nonpartisan research organization Public Religion Research Institute (PRRI) concluded that white Christians were now a minority in the US population.
