I'm not even sure what cancer is anymore. And it seems crystal clear now that hospitals see giant dollar signs whenever someone walks in with a critical diagnosis. The only thing I expect doctors to do is prescribe the most expensive drugs that will keep a patient barely alive, so the patient keeps taking expensive drugs.
I'm not even sure what cancer is anymore.
The concept is pretty simple, even for laypeople like us: cancer, in a nutshell, is simply out-of-control cell reproduction. It happens when a cell's DNA gets mutated in just the right way (and there are probably lots of "right ways" for that to happen). If nothing else kills you first, cancer will eventually get you.
Doctors don't get paid by the drug.
But hospitals do. And hospitals employ the doctors and oversee what gets prescribed.