When did so much of American Christianity get mean and stupid?
It wasn't the 30's; churches fought to keep people from starving.
It wasn't the 40's; they were behind nearly all of the aid during WWII.
And in the 50's everyone admired science, even if they didn't understand it.
Was it the 60's?
Did Russia cause this during the Cold War?