(By M.A. Hargett, Originally posted at Indiana Economics.)
Having read George’s well-reasoned article on Why Education Subsidies Should Cease, I would like to approach the matter from a slightly different perspective. This fetish of “universal education” is founded on a big myth: that attending college is necessary in order to acquire job-related skills.
Historically, this idea is relatively new. For example, prior to WWII, those who went to universities were either of middle-to-upper-class standing or people with very specific ambitions who would pay their own way through by working and studying simultaneously. In both cases, these educations were paid for in cash. A statistic of note: approximately 15% of college-aged individuals attended a university in 1940.
A dirty little secret you won’t be told in class is that the Great Depression actually continued into the 1950s. The war did not help the economy rebound; all it did was restructure the unemployment numbers, as most unemployed males went into the military and females joined the ranks of workers in what amounted to the biggest make-work program of the New Deal.
When the war started to wind down, troops returning home faced unemployment again. War manufacturing had gone into decline, obviously, even in spite of Truman’s later efforts in Korea. In anticipation of this, FDR endorsed a solution for sustaining the lie that the Great Depression had ended: the G.I. Bill, which, among other things, would pay veterans’ college tuition. By keeping these would-be workers otherwise occupied for four to eight years, it kept unemployment rates artificially low until the real economic recovery occurred.
With the federal funding came federal guidelines, or accreditation, whereby schools that wished to receive the infusion of government money would have to toe the line. Initially, this meant meeting certain basic standards. It didn’t take long, however, before government pressure was applied to push programs and schools into strictly upholding commonly held beliefs about U.S. superiority and, most importantly, the role of the U.S. (particularly the government itself) as a force for “good” in world history.
As a result of the mass influx of students following the Second World War, classroom space grew scarce, and this scarcity forced non-veterans to compete harder than before for the remaining seats. Economic truths dictate that in a case of scarcity, the first move is to raise prices in order to maximize both profits and the ability to make capital purchases (more classrooms and instructors, in this case). So the cost of tuition rose, and fewer non-veterans from lower income brackets could afford to work their way through college.
When the veterans entered the workforce after graduation, the glut of applicants led employers to a new method of filtering potential hires: those having degrees versus those without. Employers would prefer hiring college graduates at the same salaries they had been paying the non-graduates who were doing the same work perfectly well only a few years prior. This forced non-graduates into lower income brackets.
Thanks to the perpetual war-making of the U.S. through the fifties and sixties, the drafts and subsequent G.I. Bill payouts kept many college students from having to pay for their educations out of pocket. Sputnik-phobia also gave rise to the National Defense Education Act of 1958, which created federal funding for non-veterans to attend college in order to help America “keep up” with Cold War Russia. As a result, the percentage of those attending college rose to, and remained at, roughly 40% during this period.
As the unpopular Vietnam War wound down, the grounds for a draft went away and it became apparent that the G.I. Bill would not be enough to entice young people to join the military, even with provisions added for “peace time” service. This led to secondary concerns that the education bubble (that is, the artificially hyper-inflated price of college education) might go bust (as in, adjust back to the proper price). The government quickly put together a GSE (government-sponsored enterprise) called Sallie Mae, whereby a group of government-sponsored banks would supply low-interest loans to students who could not qualify for grants or scholarships, in order that they might attend college and thus keep tuition prices “stable.” To receive an education loan through Sallie Mae, one essentially had only to fog a mirror.
Oddly simultaneous with Sallie Mae’s genesis was Nixon’s abandonment of any pretense of the gold standard, whereby the U.S. Federal Reserve (another government-backed banking cabal) became the sole and final arbiter of interest rates and money valuation. Both the Federal Reserve and Sallie Mae have come to benefit extensively from widespread (even universal) debt, and one can easily see that both have contributed to the runaway inflation of prices and the spiraling debt of average people in the United States.
For proof, one need only look at the value of the dollar, both in general and in relation to the cost of attending college. The purchasing power of the dollar in 2010 is less than 20% of what it was in 1970. Thus, a dollar 40 years ago had over five times the purchasing power of a dollar today. Meanwhile, average college tuition in 1970 was $700 per year as opposed to $6,000 per year in 2010. So the purchasing power of a dollar paying for a college education is about 12% of what it was in 1970 ($700 ÷ $6,000 ≈ 0.12). Basically, a dollar going toward a college education 40 years ago was worth more than eight times what it is today.
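The arithmetic above can be checked in a few lines. This is only a sanity check of the article’s own round figures (the ~20% purchasing-power estimate and the $700 vs. $6,000 tuition averages), not official BLS or Department of Education data:

```python
# Sanity-checking the article's figures. All numbers below are the
# article's round estimates, not official statistics.

purchasing_power_2010 = 0.20   # a 2010 dollar buys ~20% of what a 1970 dollar did
tuition_1970 = 700             # average annual tuition, 1970 (article's figure)
tuition_2010 = 6000            # average annual tuition, 2010 (article's figure)

# How far a tuition dollar goes today compared with 1970:
tuition_dollar_ratio = tuition_1970 / tuition_2010
print(f"A tuition dollar today buys {tuition_dollar_ratio:.0%} of what it did in 1970.")

# Real (inflation-adjusted) tuition growth: the nominal increase,
# discounted by the dollar's loss of general purchasing power.
real_increase = (tuition_2010 / tuition_1970) * purchasing_power_2010
print(f"In constant dollars, tuition is roughly {real_increase:.1f}x its 1970 level.")
```

Note the second figure: even after stripping out general inflation, tuition by these numbers still rose about 70% in real terms, which is the article’s point about college costs outpacing the dollar’s decline.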
Thanks to these subsidies (a pleasant way of saying “government tampering”), the value employers place on a degree is inflated relative to what it was 70 years ago. At the same time, the banking structure and the state-sponsored increase in demand have caused the sticker price of a college “education” to skyrocket. Let us also not forget that by delaying the age at which individuals enter the workforce, colleges help cook the books on unemployment numbers. Most dangerous of all, however, is the war of ideas waged by these institutions, promoting an unhealthy view of the state as a benevolent force doing what nobody else will.
With all of that history, the question raised by my rather inflammatory opening statement remains: Is a college education necessary for acquiring the skills needed to get a good job? In almost all cases, excepting highly specialized fields such as law, accounting, medicine and the hard sciences, the answer is no. For the majority of positions within a company, non-college-educated and college-educated employees require the same amount of training and attention; the only difference is that the college-educated employee typically has more at stake due to the crippling debt of his student loans. Skills acquired through experience, plus any company-sponsored specialized training, are what really provide value to an employer on a day-to-day basis.
Unfortunately, as has been the case for years, it isn’t the skills acquired in college that make an applicant attractive; it’s the prejudice of employers against those without degrees. George cited this article, which stated:
in addition to positively influencing core task performance, education level is also positively related to creativity and citizenship behaviors and negatively related to on-the-job substance use and absenteeism.
The fact remains that correlation does not equal causation. Core task performance among college graduates is often higher because those who excel at the sorts of tasks most commonly undertaken in white-collar working environments are those to whom college is most recommended and by whom it is most sought. The same is true of creativity, although one might have to look into the definition of creativity at play in this article. “Citizenship” is a dubious term, as it could mean general capacity for interacting in a civil manner with one’s peers, or it could mean unquestioning obedience to authority, something repeatedly reinforced in modern pedagogy. As far as “on-the-job substance use” and absenteeism go, the jobs held by non-graduates are often of a blue- or gray-collar variety, and on-the-job substance use will naturally be higher among blue- and gray-collar workers than among white-collar workers. Absenteeism, likewise, would be higher among a poorer population, as the poor are more prone to illness.
In light of all this, I stand by my assertion that if one’s aim in attending college is the acquisition of skills, it is a pure waste of time in almost every case. But if one is doing it as an acknowledgment of a social “hoop” he must traverse in order to become sufficiently “gentrified” in the eyes of potential employers, then it may be worth it.
So, the next time a young person asks, “Why should I go to college?” it would be wise to re-phrase the question in one’s mind as, “Why should I go into debt to the tune of one full year’s starting salary, allow myself to be brainwashed and (further) molded into a conformist whose view of the world will be shaped by the court historians, and spend roughly half of my time over the next four years being forced to study topics that have no real-world application?”