How did we get to the point in our nation where we, as Americans, consider the act of taking care of ourselves to be beneath us? When did we cross the line into thinking welfare exists on a higher plane than working at a mundane job? When did it become OK to take money from someone who makes more than we do?
The fact that so many Americans seem to be in agreement with our new President is evidence, to me at least, that many are willing to give socialism, and its seemingly free ride, a chance. Where did we go wrong? Have we lost our national pride?
I believe we have lost our pride in ourselves and in our ability to succeed. We simply do not believe we can achieve the American Dream anymore. This is a problem.
I do not know what to do about the problem. Does anyone have any ideas?