This is a ramble, brought to you by an assortment of random communities which sparked my overtired brain and left me feeling the need to try and say something profound at 1 am.

I think I'm growing tired of Christianity being described as 'oppressive', 'patriarchal', 'the white man's religion', 'a male dominated religion', and a dozen other hackneyed phrases which all try and convey this impression of Christianity as the US Republican Party with added altars and crucifixes.

Christianity, as I understand it, was originally one of a number of mystery religions which became popular in the late Roman Empire, especially amongst the poor and vulnerable. Most religions at the time essentially offered one deal - you worship [insert name of deity here], and in return you'd get good stuff. If you didn't get the good stuff, your chosen deity probably didn't like you, and it would mostly suck to be you. Christianity (amongst some others) offered a more intangible reward, that of life and reward after death. It offered the poor, the desperate and the weak something that they could continue to hope for, and a belief that something cared, even if the physical world in which they lived seemed to demonstrate that there was no divine presence watching out for them.

Christianity wasn't a 'religion of oppression'. It was a religion of the oppressed, of the poor, of the downtrodden.

Has that changed? Insomuch as it was adopted by the elite and has therefore become entwined with that elite, it has. Yet the entire Bible, with its endless harping on about the need to protect the weak and the vulnerable, about how the poor shall be rewarded - that's not changed.

Is it a religion which oppresses women? Well, if you believe that everyone prior to Christ was worshipping a loving and naked Earth Mother, then maybe. If, on the other hand, you believe that many women were living in a classical society in which a husband or father had the power of life and death over the women of his family, in which women had no political rights, no right to wealth or control of their own lives, and were sold into marriage at relatively young ages, then maybe you'll see Christianity as no better and no worse than many of the moral codes of its day. It did, at least, emphasise that women too had a soul of equal significance to a man's, and that they too were entitled to certain things in the eyes of G-d.

Admittedly, its track record on feminism since then hasn't been too great.

Is Christianity intrinsically a 'white man's religion'? For chrissakes! The goddamn faith came into being in the Middle East. It is still practised in Egypt, and across the Middle East, by people who can trace the lineage of their faith back to the first Christians. In terms of numbers today, there are more Christians in Africa and in South America than there are in Europe or North America. It has never been a 'white man's religion'. It is a religion which was adopted with a great deal of enthusiasm by the European elites of the last 1000 years, but to call it a 'white man's religion' is to entirely disregard its history, where it originated, and what it is now.

Bah. And humbug.

This incoherent rant is brought to you by an assortment of pagan, anti-racist, and identity politics websites, and Sally's ongoing fondness for any faith that invested so heavily in illuminated manuscripts and stained glass windows.