Signs of Revival: Is the West Returning to Christianity?

We’ve heard a prevailing narrative that Americans are drifting away from faith in general and Christianity in particular. For a long time, we’ve seen interviews with celebrities who grew up in Christian homes but now consider themselves “spiritual but not religious.” Surveys make a big deal out of how many Americans identify as “nones.”

Europe and the rest of the West have seen a decline in religious faith for even longer than we have. Much of what we used to call “Christendom” is now irreligious, except for the infiltration of Muslims.

Read more at: pjmedia.com