But the question remains: is Christianity dying in America? Are we about to capitulate to secularism? While secularism is on the rise (whatever secularism means), I'm of the opinion that E.J. Dionne is correct: Christianity is pretty resilient.
I do think that younger people are walking away from coercive and exclusivist forms of religion (yes, conservative churches are growing among younger people, but the ones with the most success are adapting their message). I do think younger people are fed up with politicized religion (which is a message to Progressives as well as Conservatives) -- and for the past quarter century Christianity has been identified in the minds of many with conservative politics and politicians. They are looking for a faith that makes sense of their lives and encourages love of neighbor.
Religion is always corrupted when it gets too close to political power. It's possible to win a precinct caucus and lose your soul, to mistake political victory for salvation itself.
It is this approach to Christianity that is decidedly in decline, thank God, in part because conservative Christians themselves are rediscovering the church's mission to the poor, the sick, the strangers and the outcasts. This augurs new life, not decay.
So, should we mourn the decline of American Christianity as we've known it recently, or should we rejoice that Christianity may see itself resurrected into something new and closer to the founding vision? Remember when Christianity was a proscribed and persecuted religion? It didn't die; instead, it thrived as Christians served and loved their neighbors. So, no, Christians probably won't be able to control the agenda on controversial social issues simply by saying that something goes against the Bible. Is that a bad thing?