In a letter to the editor this morning, a writer claimed many people stay in jobs that are unhealthy for them, physically and/or emotionally, just to keep healthcare benefits for their family. What a sad tradeoff.
I believe it happens more often than you think. My brother, for example, stayed in a job he hated for 20 years because his wife had diabetes and couldn’t work, and he felt trapped into providing healthcare benefits for her. Changing jobs was too risky: many employers won’t add a spouse or child with a pre-existing condition to a new employee’s policy. So he stayed in a miserable work environment until he developed diabetes himself.
How many thousands of people are making that unhealthy choice?
I’m currently facing a similar situation. A full-time writer/editor position just opened at the newspaper. It’s the same position I applied for (with mixed feelings), then didn’t get (to my relief) four months ago. After four stressful months, the woman who got the job was fired. Now they’re asking me if I want it.
The truth is, I don’t want the job; I just want the healthcare benefits. But if I take the position and end up stressed, unhappy, and not focused on my novels, how can it be worth it? Since I was laid off early last year and ended up with two flexible part-time jobs, I’ve been happier—and healthier—than ever before. So I’m starting to think that being happy is the best health tonic of all.
Wouldn’t it be good for our entire culture if healthcare were easy to access and not linked to employment—if no one had to make a bad job choice out of fear of losing a loved one or going bankrupt from medical bills?
What’s your experience? Have you taken a job just for the benefits? Do you stay in your job for the health insurance?