The Lie: Healthcare is a Human Right
by Kevin Jackson on December 22, 2014 at 1:02 PM
We are told in America by the Left that healthcare is a human right. Then the Left trots out some failing European system to bolster its point.
As with most things from the Left, the claim is disingenuous. Healthcare is a privilege; it always has been, and it always will be. Ever wonder what primitive man did for healthcare?
Here’s a hint: He died! And it’s that way around much of the world. The fact is that healthcare is “earned” as a society becomes more civilized.
The idea that somebody can distribute healthcare is ridiculous.
Here is an infographic on healthcare by the team at Work the World. It's eye-opening, to say the least!