I came across an op-ed piece titled "Evangelicals a Liberal Can Love." The author writes pretty positively about Evangelicals, recognizing that Christians actually do work to help the poor, the sick and yes, even the environment. He even calls out liberals for mocking Christians. But as refreshing as it might be to hear these things, I wonder how Christians will react to this "praise."

Certainly there are some who would refuse to hold hands with scary liberals at any cost. Maybe others would agree that Evangelicals are starting to recognize areas of Christian charity that had been de-emphasized during the Religious Right era, and that those looking at Christianity from the outside have been able to point out the shortcomings of its American followers. Still others would say that Christianity hasn't changed, but that it's nice for others to notice that Christians do care about the needs of the world.

Is Christianity becoming more culturally acceptable? Or is culture becoming more Christianity-accepting? Or is it neither? And if Christianity were to rise in the popular opinion polls, is that a pedestal on which we really want to be standing? In the meantime, though, I don't really mind the label "bleeding heart Christian."
Wednesday, February 6