Anybody else who grew up in the '60s and appreciated the "hippie" mindset of that era feel like this country has lost a lot of the soul it had at that time?
Now everything seems to be dominated and influenced by corporate America. Everything is a "brand" now, even people. You hear famous people, celebrities, politicians, etc. being referred to as the so-and-so "brand." It just seems really sterile and impersonal... unnatural, even. Like we've stopped being human beings. Anybody else feel this way?