Does going to the doctor actually make you better? I mean this in the most innocent, unsarcastic way.
See, the thing is, I grew up with homeopathy and have, as an adult, veered towards it for the most part. I like the route of least intervention. I like the idea of treating the whole person, rather than just a specific set of symptoms. I like the idea that your body can develop its own strong defenses to most ailments. And in general, I've experienced that it can...
And I've also had some rather negative encounters with traditional doctors, who mostly seem to be prescription writers for antibiotics... (again, this is just my experience.) I've discovered from these encounters that a) I get righteous and horrific yeast infections the minute I take any kind of antibiotic, and that b) the doctor's solution is then to prescribe Diflucan, which I am apparently violently allergic to (last time my lips burst out in blisters within hours of taking it, and my chin and cheeks went numb.)
Not to mention, when I fell pregnant I thought I had a virus I'd picked up in Spain, so I went to my GP, who told me she thought I had IBS. (Ick. Look that one up.) I sort of rolled my eyes and cocked my head and said, "Any chance I could be pregnant?" to which she declared with much bravado, "Absolutely not."
So you see why I'm a skeptic?
But now, well, I'm in a bit of a predicament.
I feel like my natural immune defense is no longer winning against the tide of germs coming my way from school--twenty-odd kids with germy hands and strep and pneumonia and everything else they've been passing around...plus whatever Bean has been bringing back from preschool (which has resulted in his first-ever double ear infection.) I've kind of reached my limit, in fact. I've been sick to varying degrees since September. And before that I had morning sickness... so basically I've been affected by some form of malaise for the past six months, and it's kind of affecting my will to do anything other than bury my head under several pillows and sob.
So I want to know: what do doctors actually do these days--other than prescribe antibiotics? Is there anything they can do for me that will help as much as my mom stopping by to rub my feet and feed me chicken soup?
Do you 'believe' in your doctor?