I sometimes have people tell me they started a new job and then found out there was no health insurance. They express shock: "But they have to give me insurance, don't they?"
No, they don't. No law requires an employer to provide any particular benefits to employees. There are tax incentives for employers to offer benefits like health insurance and 401(k) plans, which is why so many do (that, and the executives want them). Also, under the Affordable Care Act, large employers (generally those with 50 or more full-time-equivalent employees) have to pay an assessment if they don't offer health insurance to their full-time employees.