I've always believed that people choose to follow a religion 'cause it creates some form of "hope" for them, if that makes any sense
For example, Christians are taught that Jesus/God or whoever is always by their side, so they don't have to worry. Would you rather believe that, or accept that you have nobody to rely on but yourself? Same goes for heaven and hell: believe in heaven and that there's a better place in the afterlife, or side with science and know that you're just gonna rot in the ground?
I am not a Christian myself, but most Christians I meet are very welcoming and loving people. I just don't like the persistent sons of bitches who come knocking on my door at 8 a.m., shove their "Armageddon" crap down my throat, and beg me to accept Jesus
Yeah, I also believe that religion can reshape a person's perspective and change them into a better person, but it can also totally fuck with somebody's head. Hasn't religion caused a shit-ton of wars?

Hopefully I haven't offended anybody, 'cause I honestly didn't mean to.
And as for homosexuality:

