I, like so many others, have been practically bombarded since birth with the message that one must go to college to ever succeed in the "Real World." We go through our grade school years knowing that we are moving on to high school, where we must lay an essential foundation for our college years, and pretty much the rest of our working lives. I have never understood how people can place so much emphasis on the importance of college while knowing full well that unless you...
A) went to Harvard
B) grew up in a rich family where mommy and daddy were b/fs with a CEO of some company and got you a job
C) screwed the right person/people to get you the job
D) got extremely fucking lucky
E) somehow got 10 years' worth of extremely specific work experience despite the fact that 10 years ago you were just turning 16 and couldn't even legally drive...
..it is impossible to gain a job with any kind of worth after college.
It is my belief that my peers (particularly those between the ages of 22 and 30) have become the greatest casualty of this piece-of-shit reality. Far too many people I know have student loans of $20K or more and are stuck working shitty jobs for which, realistically, you barely need a high school diploma to be considered qualified. I recently discovered that a girl with whom I went to high school works as a manager of a Rita's. Another person I know, a college graduate, works as a supervisor at Starbucks. And I myself work as a jewelry salesperson in a failing jewelry store, with a Master's Degree in Industrial/Organizational Psychology. How's THAT for a waste of $35K? Needless to say, this trend NEEDS to change.
Blame it on whatever you want: the sagging economy, high unemployment, the fact that I, and so many others, have JUST earned our degrees. The fact of the matter is that the bachelor's degree has somehow become as commonplace as a high school diploma. Employers see it less as a symbol of maturity and dedication to education and more as a minimum requirement for their $8/hr stock boy job. Is it wrong to brainwash innocent children into thinking that a college education will secure them for life with the career of their dreams? Absolutely. Let's get honest, people...the "Real World" may not be pretty, but it should at least be realistic.
-A