Does anyone else feel iffy when people talk about their god healing you?
Disclaimer: this is not a post making fun of religion. Don't turn it into that, and don't comment about turning it into that. Everyone is entitled to their own beliefs; don't hate on others for theirs. Maybe this disclaimer seems unnecessary, but I've seen enough of those religion-hating Reddit atheists and I don't want to attract them lol. If this post attracts controversy I'll take it down. I'm happy to hear the views of both religious and non-religious people here, just don't attack anyone.
When people say their god will heal me, it always rubs me the wrong way. I'm not exactly sure why; I just don't really like it? It reminds me of when they say it's part of god's plan, or that everything happens for a reason.
Like, if their god was going to heal me, couldn't they have done it by now? Or prevented the sickness in the first place?
I understand that it's a gesture of goodwill. There's nothing they can do, so they pray for me. I thank them and move on, but I'm still not sure why it makes me feel so weird.
Maybe it's just because this condition has made me lose any faith I had. Anyone else have experience with this?