I am an international student. I had health insurance while I was studying because my school required it. I graduated from college in May 2011 and am working under OPT this year, so I am no longer required to buy the school's insurance. I understand it would be better to have a health insurance plan, but it is simply too expensive for me right now without the student discount. My friends say that everyone in the US needs to have health insurance.
Is it legal to live in the US without any health insurance at all?