It seems the media heavily emphasise the dark side of everything. Yes, there are really bad problems in the world and they need to be addressed, but we also need to keep focused on the positive. For example, whilst I dislike the power of the multimedia (advertising) industry, I also appreciate the benefits of multimedia and its technology. So whilst I personally feel science and technology have cut off human feelings, I also appreciate that medical science has been a great benefit and that my lifespan has been extended because of it.
Now, as for advertising, personally I feel it has created artificial expectations of men and women in order to sell its products, but provided it doesn't attach false slogans to the advertisements, I can live with it. Anyway, I guess our entertainment industry has always given us beautiful images of men and women, and it does so because that's what we want to see. So why focus on the dark side and say such images of human beings dehumanise the average person and make us feel inferior?
So do you also feel the news people and politically correct groups focus too much on the dark side of everything? Remember, it's just our opinion posted under "friends", and other viewpoints are equally respected.