From what I have noticed in the workplace, and in other parts of life, women seem to dislike other women. Do you believe this to be true? I have had female co-workers tell me that they prefer working with men over women. Is this due to some sort of jealousy? Is it competition? If it is true, have you experienced hostility from other women?