posted on Nov, 12 2014 @ 03:55 PM
originally posted by: StarGazer77
As a graduate student, I can honestly say that MOST college courses are things you could teach yourself in 2 weeks at a library or online for free.
The world has drastically changed, and information is abundant at your fingertips at no charge.
It was the same back in the 1980s. Around 1982, my family bought an Atari 800. With that I learned all about interrupts, assembly language, pointers, player-missile graphics, display lists, display list interrupts, data streaming, and serial communications over RS-232. Our high-school computer studies course just taught ten-line BASIC programs. Then when I started college in 1986, our courses covered Pascal on desktop PCs, C on UNIX, and assembly language on AIM-65 boards. Later in the decade, when you could custom-build a multimedia PC from AdLib or Sound Blaster sound cards and Hercules coprocessor boards, and download programming guides from USENET, the college was still using PCs with regular 8-bit VGA boards.
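For anyone who never touched the hardware: a display list interrupt is a tiny routine that ANTIC fires at a chosen scan line, so you can change colours or graphics modes partway down the screen. Here is a rough sketch of the idea in 6502 assembly; the hardware addresses, the zero-page pointer location and the display-list offset are assumptions from memory rather than a verified listing, and the assembler syntax is generic.

; change the text background colour partway down a GRAPHICS 0 screen
dlptr   = $cb            ; assumed free zero-page pair, used as a pointer
vdslst  = $0200          ; OS display list interrupt vector
sdlstl  = $0230          ; OS shadow of the display list address
colpf2  = $d018          ; GTIA playfield 2 colour (text background in GRAPHICS 0)
wsync   = $d40a          ; ANTIC wait-for-horizontal-sync
nmien   = $d40e          ; ANTIC non-maskable interrupt enable

        *= $4000
start   lda sdlstl       ; find the display list the OS built
        sta dlptr
        lda sdlstl+1
        sta dlptr+1
        ldy #17          ; assumed offset of a mode-2 line partway down the list
        lda (dlptr),y
        ora #$80         ; bit 7 asks ANTIC for a DLI on that line
        sta (dlptr),y
        lda #<dli        ; point the DLI vector at our handler
        sta vdslst
        lda #>dli
        sta vdslst+1
        lda #$c0         ; enable DLIs, keep the vertical blank interrupt
        sta nmien
wait    jmp wait         ; sit still; the interrupt does the work

dli     pha              ; handler runs on the marked scan line
        lda #$46         ; a new colour value
        sta wsync        ; wait for horizontal sync so the split lands cleanly
        sta colpf2       ; text background changes from this line downward
        pla
        rti

The WSYNC write is the part no ten-line BASIC lesson ever mentioned: skip it and the colour change lands mid-scan-line and tears.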
Generally, academics wait until there is industry demand for a particular technology and then create a course to cover that market. By the time that course is up and running, there is mass competition among undergraduates for the jobs. Do a postgraduate degree and you get to play with the current custom technology, but you have to wait ten years or more for it to go mainstream, and then you are back to mass competition. Much like the game industry now: ten years ago most game companies built their own technology, whereas now everyone works with Unity and Unreal.