Socialism is a dirty word to Americans. Most wealthy European countries have strong, successful socialist programs and are proud of them. The U.S. has socialist programs too, but we refuse to label them as such: Social Security, Welfare, Medicare, etc. What's the big deal? I personally would love to see a strengthening of social programs in the U.S. I would love to see a good Universal Health Care program for all Americans. And I make NO apologies for that!