Is a college education worth it? I’ve read many articles recently about how various companies are not hiring today’s college graduates. These companies feel colleges are not preparing students to enter the real world and get jobs. Some think it’s because current graduates have no work ethic: they are on their phones all day, they are late finishing assignments, and they are too entitled. One article even talked about how today’s college graduates are more political than ever.
When You’re in University or College
I started thinking that when you are in college, you make your own decisions and do your own thing. No one tells you what to do. You select your class schedule, you can go to class or skip it, you can even sleep through it, and the professor still gets paid. If you want to study, you study; if you want to go out, you go out. After graduation, you get a job where you no longer make all the decisions. You have a work schedule and a list of things to do with deadlines, and you can’t decide not to do them. It’s a significant change that happens very quickly. I remember I couldn’t wait to graduate, but once I did, I missed college. Making money was great, but I missed being in charge.
Why Aren’t Companies Hiring New Graduates?
Why aren’t companies hiring new graduates? Firstly, curricula have changed, and one can attain many more degrees than in the past. Is a Cannabis Biology degree required to grow weed? Is a Puppet Art degree needed to learn how to make puppets and bring them to life? I am not putting these degrees down, but couldn’t you learn these skills on the job? Colleges and universities have also added many irrelevant classes. I am unsure how taking a Taylor Swift, Lady Gaga, Harry Potter, or tree-climbing class will get you a job. Yes, these classes were or are offered at some colleges and universities. Are schools trying to make lessons more fun, or to entice more students to enroll? Is this a class you add to your resume hoping the interviewer gives you the job because they, too, like Taylor Swift or Harry Potter?
Secondly, not only have curricula changed, but so has the political landscape. Over the years, universities have become more liberal and have started teaching students what they should believe and feel. They teach that expressing your feelings is OK, even if it hurts someone, and that if you don’t get what you want, damaging or taking someone else’s property is OK. After all, you are entitled to have everything you want.
You should be able to express your feelings without consequences. Companies should give you a job because it is your right. Universities allow students to set up tents on campus, take over libraries, and yell at students with different beliefs, while punishing those who disagree with the protesters. Could it be that companies are not interested in these students because they are not running a daycare? They are in business to make money. If they cannot, they will go bankrupt, and the jobs will go with them.
Thirdly, Generation Z is the first generation with easy access to the internet and video games. This access has led to a lack of social skills for communicating and interacting with people. It’s amazing to see a group of teenagers sitting together, all looking at their phones instead of talking. If they put down their phones and started talking to each other, they would learn how to communicate and express their feelings properly. The videos and games will still be there, so what are you missing?
Finally, companies do not have time to worry about new hires’ feelings or hold their hands throughout the day. Most companies have a probationary period, which helps them decide whether the employee is a good fit. It also gives the employee time to learn the job and contribute to the organization. A company’s job is to make money, and as much as they want you to succeed, they need to succeed too.
Universities and Colleges Need to Change
If colleges do not change what they teach, I am unsure how these institutions will survive. Many parents of future students went to college themselves, are not making a lot of money, and still have big loans to repay. They don’t want their kids to make the same mistake they did. Additionally, an increasing number of companies no longer require a college degree. I am not saying you don’t need one; some fields do require it. You can’t become a doctor or lawyer without a degree. However, if you can acquire the necessary skills on the job, why incur debt for a degree? Do your research before spending that money. An associate’s degree or a certification may give you the same outcome at a lower price. Plus, there is currently a shortage of workers in the trades, and many trade jobs pay more than jobs that require a college degree.
In conclusion, decide what you want to do with your life and look into how to get there. Do you need a degree, or can you get on-the-job training? It is up to you to do the research and figure it out. Good luck!
