Oh geez. I used to think education was super important. I did well in school and had planned on going to college. Life got in the way of my plans. I went to work right out of high school, sometimes working two jobs to get by. Most of my friends had gone off to college and I felt like I was missing out. When everyone started graduating and getting jobs, I realized something: I worked retail and they had a 4-year degree, but we were making the same amount of money.
I stopped to think about it. Did I actually want to try and further my education?
Nowadays, people seem to be finding their own paths in life and making money with no college education. It's all over the news as a big debate: Is college 'worth it'? Some argue that college graduates end up making more money than those who only finished high school. Others argue that the job market right now is so bad that people are graduating, not getting jobs in their fields, and ending up in serious debt, barely making payments on their student loans. By that logic, college is not worth the money.
A growing trend (that grates on my nerves) is people getting schooled by life and majoring in Stupid. Have you ever watched a reality show? People are actually PAID to be complete idiots! A scientist could possibly bank a respectable 6-figure salary, but an idiot with a sex tape who is famous for being famous banks millions. I'm pretty sure she didn't go to college. Of course not; there are no cameras there.
It's plain and simple.
STUPID MAKES MONEY!
While I do think it's very important to finish at least high school, I feel that college, for some, is becoming more of an accessory than a necessity. But, as always, to each his own.
How do you feel about education?