College "Education" Is BS

Ugh, I have to say, as much as I love college, half of me almost hates it at the same time!

Everybody always assumes that colleges educate and teach students to think for themselves. People assume that college is a place where independent thought, original ideas, and the free exchange of views are encouraged. THAT'S BULLSHIT. From my experience this year at college in International Studies, I can say that a lot of my so-called "education" is little more than indoctrination. I took World History and INST last semester, and I'm taking Human Geography, Comparative World Politics and Macroeconomics this semester. And A LOT OF IT IS BS.


In class, a lot of what we are taught is that Capitalism = "free market," individual decision making and consumer sovereignty. Then they teach that Socialism = collective decision making, "centralized planning" and state control of the market. When I brought up the fact that many forms of Capitalism (including what we have) are actually more like State Capitalism, and that not all forms of Socialism are statist (I myself am a Socialist and an anti-Statist), I just got uncomfortable stares and a hesitant, tortured-logic backpedal/subject change from the professors. We are also taught that democracy is synonymous with capitalism (that's a load of shit), and we are taught to embrace Globalism as unquestionably good. When we DO learn the negative sides and consequences of such things, they are generally glossed over. We never discuss how Globalism screws over Asian, African and Latin American countries, and we don't discuss how the so-called "free market" under Globalism is really CONTROLLED by Western nations.



And Neocons complain that colleges are "breeding grounds" for "radical liberalism." Yeah, this is "radical liberalism" only if you are a Neocon. Most of what we are taught is Neoliberal rhetoric (from my experience). I have many friends (some of them alumni) who agree. Colleges, contrary to the myth that they educate, largely serve to indoctrinate. They indoctrinate students to become future Neoliberal traders and corporatists. I'd say the only thing really "radically liberal" about college education is that it tends to be socially liberal (not economically or politically liberal). That's about it.


Has anyone else had this experience with college?