The United States is facing a health care crisis, driven by a model of healthcare (really, disease-care) that rewards doctors primarily when people are sick. Because of this structure, far less effort goes into preventing disease than one would expect, given the direct consequences that lifestyle and environment have on health.
As the baby boomer generation becomes the sandwich generation, many of us have grown distraught at how ineffective the current medical establishment is at dealing with chronic illnesses that are, to a large degree, triggered by lifestyle and environment.
Would Americans be better served by a healthcare model in which the health insurance sector is merged with the medical sector, so that doctors earn bonuses when their patients stay healthy or improve their health, rather than drawing salaries that grow as the population gets sicker?
I will refer to this as the Health Margin Model of healthcare, since the idea is that doctors should be rewarded for how much they improve the health of their clients before those clients ever become patients.
What do you think?