I think by law doctors are required to give you a booklet that specifies some of the various treatment options that are available to women, because I think before, it was just a question of, well, we need to cut those breasts off, you don't need them anyway. You know, well, yes, we do need them. And I think you need to know about all of your treatment options. It's too bad that it took a law to say that. Do women not matter in this society, as to what our feelings are and how we view ourselves?