I live in the south of Thailand, and I'm always meeting travelers who are backpacking around; some have even chosen to live here. But back home in Canada, so many women seem to think that Southeast Asia is a scary place where you'll get kidnapped and sold on the black market. I suspect this impression comes from the media: TV shows, movies, the news, and so on. So I'm wondering what the women on this forum think about Southeast Asia: what is your general impression of the place, and do you have any desire to go there? If not, why not?