are industries. They suck the money from our wallets and give us as little as possible in return. Yes, they do dictate life and death, and yes, they are an industry. Quit trying to make this something it isn't.
Now, if you don't live in America and don't have to fight insurance companies for every bit of health care they promised, then I suggest you pay closer attention before trying to make me the bad guy here.
Why not do some research on the way corporations run America, and then look up the salaries of the CEOs of the major insurers?
I have been in health care since the '70s, and insurance companies are an industry that gives less and less while taking more and more.
If I am misunderstanding you, then please clarify. But if you are arguing with me about the state of health care and the companies that dictate how and when we can have health care in this country, you are barking up the wrong tree.
America has industrialized its health care, and corporations are dictating what is and isn't going to be covered. It's a nightmare for some, and a sin of the gravest kind that so many can't even get the crumbs they do offer.
http://www.thefreelibrary.com/Health+insurance+industry-s11101