May 10th, 2007, 08:37 AM
Pure Innocence
Posts: n/a
I love doctors, but I don't like it when they demean the women they are treating. I felt that happen once or twice before I switched doctors. I don't distrust the medical profession, but I think gaining a patient's trust is important. I won't go to a doctor who acts as if he or she is God.