Do you think we live in a society where people feel entitled to things they are not willing to work for?
Good grades, money, job promotions, and nice houses all come to mind. We've lived in prosperity for over 25 years here in America. We've all experienced the benefits of living in a country where we have a lot of stuff.
Then this "economic crisis" happens and people begin to realize they are not entitled to everything and anything...they have to work for it. Work for money, food, good grades, etc.!
Have we lost sight of what hard work is because so much has been handed to us?
I watch the news and see people panicking over this crisis...and I think, "Good!" We all need to live within our means, stay out of debt, and work for the things we want in life. God created work, and it is a good thing!