By David Barr
Christians often credit or blame changes in our religious commitments for the changing fortunes of America. Some of these theories are nutty: I once heard a guy claim that Hurricane Katrina was punishment for a slackening of U.S. support for Israel at the time. Others aren't nutty, but they're still pretty strident: folks like Bill O’Reilly and Mike Huckabee have pointed to the end of prayer in schools as the cause of things like school shootings.*
More subtly, many Christians point to evangelism as a way to address problems like crime and poverty, or credit the Christian religion with successes in our nation’s past. People envision the connection between religion and the fate of the nation in different ways. Some, like the man with the Israel-Katrina theory, see God as directly intervening to bless or smite us in response to our policy decisions. Others see a more natural cause and effect: Christianity makes us better people, and a nation thrives naturally when its people are more honest and hardworking, less violent, and so on. However they see it working, it is very common for American Christians to think our current problems stem from our having abandoned Christianity.
This is a really bad idea, for a few reasons.