My fiancée told me that in order to get her new insurance through her work, the insurance company wanted to test her blood! I've never had that happen to me at ANY job. Something seems fishy to me (gut feeling), and I was just curious if any of you have ever heard of such a thing.
The reason I'm suspicious is because recently I've had concerns that she may have an STD. She's been itching down there and says it's probably just bacterial. Then soon after that, she comes home with a needle mark in her arm and says her new job's insurance requires her to get some blood drawn… really?
It just seems strange, and I'm curious if that's actually a thing.
Thanks in advance! I’d love to stop being so paranoid!