I've been looking at the "health care" thread here and there, and although I did not reply (it's one of the issues where most Americans and Europeans seem to be really, really far apart), I felt like asking a distinct yet related question that probably deserves its own thread.
So, what is a "right"?
Are "human rights" written somewhere, in heaven or in nature, other than in human laws and customs, which are the provisional, ever-changing result of power struggle and negotiation within an equally ever-changing economic and social conjuncture?
And then what is the point of discussing "rights" if they are grounded nowhere other than in the socio-political arena itself? Whose interests does our "opinion" serve?
I personally tend to think that the Marxist concept of "class consciousness" has been swept under the carpet too quickly. When I hear lower-middle-class people ranting against social rights, I can't help imagining sheep speaking the wolves' language, to their own detriment.