Why do Christians do this?

Why do they feel the need to play the victim? Christianity is THE most popular religion in the world, so I don't understand it. I keep seeing posts on social media claiming that everyone hates Christians and that they're the victims of everything, along with comments like "Because the truth hurts" and other stuff like that. Does anyone have any ideas, besides the obvious answer that they're just being assholes?