Nowhere on earth has the role of women in both politics and religion been more profound or more widely recognized than in the American West. But that may change.