There have been many acts of hate in this country over the last year. More than ever, minorities and citizens without privilege have been personally attacked because of their cultural or religious backgrounds. So I want to take this moment to talk about what it means to be American, and about “American” culture. I define American culture as a continuing patriarchal society with deep foundations that must constantly be challenged. We were founded on worthy principles, and those principles are now being twisted and pitted against what we say we stand for.
America is a country of immigrants. The true Americans, the original inhabitants of this land, had it stripped from them and were forced onto reservations. As a country that began with immigrants, I believe we should be accepting and inclusive of those who come here, whatever their reasons. The original “Americans” were Europeans who wanted freedom from monarchy, yet we find ourselves in a potentially similar period now. These resistances are splitting the country in half, but there is no new land to flee to and steal in order to found another “free” country, so the citizens of this “America” are going to have to work out their problems on this soil.
What is American culture? What do we bring that is unique to the world? It is not wrong, and should in fact be encouraged, to explore these various walks of life and learn how different societies grew under their own brands of leadership. Phobias against those who are different should be questioned: those cultures are at least distinctive, whereas when I think of American culture I think sterile and clean. We should welcome diversity to help color our lives and to become a greater unified force, one that could change the world through tolerance and acceptance.