Colleges these days are just a scam. Out of all my friends who have gone to college, only the med students actually get a job in their field. Everyone else ends up flipping burgers anyway, when if they hadn't gone to college, they could have been working long enough to move up to a better job, and they wouldn't be in tons of debt.
We need to make college more serious. That means making students get their own loans, so they actually make a plan instead of going to college just because it's the standard thing to do.
One main issue is parents these days: they feel their kids MUST go to college to succeed, even when the kid doesn't know what they want to do yet.
I didn't go to college, and I make more money than any of my friends, and I'm only 21. It's all about networking and knowing what you are good at. College leaves out public speaking and social networking skills, which are what a job actually requires.
If one kid comes in with a master's degree in computer science, and another kid comes in who has worked at a local IT shop for a few years and made a good name for himself, who is the employer going to pick?
edit on 24-10-2011 by doom27 because: (no reason given)