Folks, just when I thought I had seen it all, the New York Times published an article yesterday that quoted a man as saying his health insurance provider "recently forced him to buy antibiotics at pet stores because it was cheaper than the pharmacy."
WTF?!
The link to the article is HERE!
What is this country coming to? Where is the outrage?
I'm an American expat who lives in Germany, a country that has had mandatory health insurance for over 50 years. If something that absurd ever happened in Germany, BILD, the leading tabloid newspaper, would have been all over the story like stink on a skunk.
Now, I'm not wild about every aspect of the health care system in Germany. It is bureaucratic and opaque for those enrolled in the government-sponsored system (you can opt for a private insurer instead, which means lower premiums if you're young but steep increases if you have a history of illness), but at least I am covered, regardless of where I work.
(Just for the record, most employees in Germany get 30 days of paid vacation and paid sick leave with no fixed limit, and health insurance is mandatory, with the premium split 50/50 between employer and employee. Freelancers bear the full cost of their insurance themselves. I'm close to 50 and pay a little over 300 euros a month for mine.)
Whenever I describe the health care situation and working conditions most Americans face, people over here are shocked at how nasty and uncaring the U.S. system truly is toward the very people who form the backbone of the American economy. In response, I often get an earful from typically blunt Germans about American hypocrisy in encouraging countries in Europe to "follow the American example."
In my humble opinion, this "Pet Store Antibiotic" story should be the cornerstone of a whole series of Obama/Biden campaign ads focusing on the disgrace that is the American health care system.
What do you folks think?