We live in an age and a country of great enlightenment, in which we are all very aware of the need to recognize the rights of men and women alike. We have laws that specifically forbid discrimination against anyone based on race or gender. We know that it is right (though unfortunately not always practiced) to give a man and a woman equal pay for equal work. Yet for all this enlightenment, when we look at the Bible, it can seem to undermine these principles. So, does the Bible teach that women are equal to and as important as men?