Signs of Revival: Is the West Returning to Christianity?

We’ve heard the prevailing narrative that Americans are drifting away from faith in general and Christianity in particular. For a long time, we’ve seen interviews with celebrities who grew up in Christian homes but now consider themselves “spiritual but not religious.” Surveys make a big deal out of how many Americans consider themselves “nones.”

Europe and the rest of the West have seen religious faith decline for even longer than we have. Much of what we used to call “Christendom” is now irreligious, apart from the growing presence of Islam.

Read more at: pjmedia.com
