Unfortunately, there is currently no federal law that requires employers to provide health insurance for their employees. Some states are exploring such a requirement for all in-state employers, and by the time you read this book, some of those laws may have passed.
The country is currently in a health care crisis driven by rising costs. Employers that have always picked up the full premium for all employees may now be struggling to pay the increased costs. As a result, employees are being required to pay a larger share, or even all, of their health insurance premiums.
Many employers are also trying to reduce their health care costs by offering incentives for healthy lifestyle changes and imposing penalties for known unhealthy behaviors. Finally, some employers are offering health insurance only to certain groups of employees, or not offering it at all.