Are Christians ever too judgmental and quick to condemn others for not believing the same things that they do? An anonymous writer asks this question openly, and we thought this might be something that our readers would want to comment on:
My background: I grew up in the church. I am currently heavily involved in working with our church, and my wife is a Nazarene pastor.
Up until recently, I held the belief that Christians should get involved politically so we could try to help the world from the top down. I believed all the standard "Christian ideas" — that homosexuality is wrong, etc. — and that if Christians could stop gay people from getting married, that was great, because it was wrong.
However… I have recently been turned on to a completely different mindset, thanks to r/atheism. Christ never got involved in politics, and "Christians" (or people claiming to be) today are destroying our reputation by doing so. It is time Christians listened to their own advice and read the Bible.
Christ never expected non-believers to believe in him, or to act or behave a certain way. He hung out with them and loved them. As a Christian, I think it's about time we stop destroying America from the top down (along with our reputation) and try rebuilding it from the bottom up. Most of the world is NOT ever going to share our beliefs. Let's stop pretending they do, and stop destroying our reputation by forcing them to act like us.
What do you think? Does this writer have a point?