Tay was designed to learn from unsolicited anonymous input. If they only train the new bot on sanitized data, and don't let it "learn" from its interactions in actual use (unless that data is explicitly permitted for further training and sanitized by a professional), then it should be fine.
I'm skeptical of the idea of having a machine "comfort" a woman who is a victim of sexual assault or rape, though. If all it does is gather information to help her file a report, that's fine, but if it crosses the line and tries to console her, that's just creepy. What's next, a robot shoulder for her to cry on? Some things require human interaction, at least until we can build androids that pass the Voight-Kampff test (not in the lifetime of anyone reading this, I'm sure).