Half of college is usually spent taking basic classes when more time could be spent on a major.
We spend years of our lives learning English, history and math before college. So, why spend even more time going over the same subjects?
For most majors, those basic requirements leave only two years to learn what you will actually be doing in the professional world.
The logic just isn’t there.
College is structured this way simply because that is how it has always been done.
Curriculum and requirements in college need to evolve alongside industries.
Colleges should be asking employers what knowledge and skills they are looking for in potential hires.
Is it more important for someone majoring in web design to take four Spanish classes or to learn the programming languages they will need in the industry?
This disconnect between colleges and industries is making it harder for employers to find employees who can do the jobs they need done.
“More employers than ever are struggling to fill open jobs — 45 percent globally say they can’t find the skills they need,” according to the Manpower Group, which publishes a talent shortage survey every year.
This is not surprising, given the class “requirements” attached to some majors.
For example, someone majoring in visual arts is required to take two government and two history classes. That is equal to a whole semester of non-art classes.
Writing essays, doing algebra and taking tests might get you through college, but once you get a job, your test-taking skills probably aren’t going to help much.
College classes need to be designed to reflect the professional world starting on day one.
Four years is the minimum someone should spend learning their craft, whatever it may be.