Healthcare Consumerization in America

The Affordable Care Act, commonly referred to as “Obamacare,” has changed the way Americans think about and purchase health insurance. It has also fostered an environment of healthcare responsibility, making consumers more aware of where their health dollars are going, from monthly premium costs and deductibles to medical care and the best options for treatment. Healthcare, as we once knew it, has largely been transferred into the hands of the consumer. Here’s a brief summary.

About Colleen McGuire

Colleen McGuire is an independent consultant who has spent most of her career writing about healthcare and the health insurance industry. For fun, she blogs, travels, and takes a lot of pictures along the way.
