Do you have any links?
I was not aware that any employer was ever required to provide health insurance for employees; I thought it was a negotiated benefit. I know that employers cannot discriminate and must offer the same benefits to all employees, but I didn't realize anyone was required by law to offer anything.
It was in the paper. We're talking about large companies - yes, they have to provide it as an option, but nowhere does it say it needs to be good.
There must be certain "minimum" requirements that employers have to meet, or forcing them to provide insurance would be meaningless. Also, does this vary by state?
I was under the impression that ObamaCare was going to force MORE employers, not fewer, to provide insurance, and that this would be a burden on small businesses.
We're self-employed, so I have to say that I haven't been keeping up with laws that pertain to large companies. When I was employed, I had a "Cadillac" plan...what the insurance company didn't pick up, my employer did. Those were the days...and I was too young to truly appreciate them.