I am a Manager in the petroleum field. I am one of only three women at my company, and I manage a successful business in this industry. There are times when I feel I am not taken seriously, almost as if, because I am a woman, I do not know what I am talking about. At times I feel I am being "humored". I have been in this business for a long while. I was very successful with the companies I worked for before this one, and in those companies I was taken seriously. My opinion counted and mattered; my advice was considered and taken.
When I applied for the job, I was the only woman interviewed. I obviously got the job. But did I get it based on my experience and good interview skills, or because they felt they needed another woman for the job?
Before I took over, the business I now manage was suffering: customer numbers were low, product was not being purchased, and things were falling apart. Since I have been there, customers have flocked in, purchases have increased, and the place is starting to thrive. Employees who wanted to leave before I came no longer want to, and they come to work ready to do their jobs. I have done well for the company.
What I want to know is this: how does a woman in a man's world get taken more seriously? There are not a lot of women in this field, at least not at this company. What advice would you give me to get the male members on board to accept me as a serious Manager?
I am tired of always having to "prove" myself. It seems that if I have anything to say, I need proof to back it up, but if a man were to say the same thing, it would be taken seriously. Now do not get me wrong: I am not a whining woman stomping her foot for attention. I know that because I chose this field I will have to work harder, and that is fine. But what do I need to do to get the respect I know I deserve?
Any help would be great.