I love the fact that Christians were considered the "atheists" of their time in the first century, and that Christianity was a movement against the establishment of the Roman Empire; it was basically an underground movement. The idea that America is God's nation or a "Christian" nation, and that Christians think that's a good thing, has always baffled me. When we read the gospels, it's obvious that Jesus not only didn't want that, but spoke against it, even rebuking the disciples at one point because they still didn't get that he hadn't come for that. He never came to claim any kind of land whatsoever. So how did we get to the point where it's considered a good thing that Christianity is the norm? Perhaps the whole problem of complacency in our congregation's faith comes down to the fact that true "persecution" from the world stopped. Perhaps we need to relearn that Christ came to rock the boat. Perhaps we need to save Christianity from "Christianity".
1 comment:
Amen.