I work at a corporation with about 400 employees, and I don't currently have health insurance. I believe the company offered health insurance to some employees. Someone told me I should confront HR because, according to them, I should have been asked whether I wanted health insurance and required to sign something if I declined it. They said this is a law or some kind of requirement. Does anybody know what law this could be? I want to be prepared before I talk to HR.