Employers Need to Be Part of the Solution


eAlert

<p>Given the central role employers play in Americans' health insurance coverage, it is critical that they become partners in any new reforms to expand and improve coverage, say the authors of <a href="/cnlib/pub/enews_clickthrough.htm?enews_item_id=29911&return_url=http%3A%2F%2Fwww%2Ecommonwealthfund%2Eorg%2Fpublications%2Fpublications%5Fshow%2Ehtm%3Fdoc%5Fid%3D522916%26%23doc522916">Whither Employer-Based Health Insurance? The Current and Future Role of U.S. Companies in the Provision and Financing of Health Insurance.</a><br /><br />Despite growing rhetoric that employer-based coverage in the U.S. is rapidly disintegrating, nearly all large firms continue to offer health benefits to their workers. Moreover, survey data reveal that more than three-quarters of Americans believe employers should either provide health insurance to their employees or contribute to a fund that would help cover uninsured workers. A majority of employers concur with this public sentiment.<br /><br />In this new issue brief, Fund researchers Sara R. Collins, Chapin White, and Jennifer L. Kriss examine the central importance of employer coverage in the U.S. health system and show why it is imperative that employers join individuals, government, and other stakeholders in designing and contributing to "a more equitable, rational, and high performing health care system."</p>