In the long history of the American 20th century, few decisions seem smarter in retrospect than the mass production of bachelor’s degrees. Starting with the G.I. Bill after World War II and continuing with the enormous expansion of state university systems and community colleges in the 1950s and 60s, huge efforts were expended to expand college access. When globalization pushed blue-collar jobs overseas and technological advances ushered in the Information Age, America was well-positioned with the most educated workforce in the world.
Even that wasn’t enough to meet employer demand. As the supply of degree holders increased significantly from 1970 to 2000, the price of a BA in the labor market went steadily up, not down. Meanwhile, our foreign competitors have been straining to make up ground, investing billions in their national higher education systems while simultaneously sending tens of thousands of their students to American universities, all in pursuit of the perfect piece of 21st century intellectual property: a portable, non-expiring, universally recognized credential of higher learning.
All of which is to say: There is a very high evidentiary bar for asserting, as Charles Murray does, that “the BA is the work of the devil.” It is a bar he does not come close to clearing. But in trying, he raises some important points about the flaws and failures of our higher education system. We don’t need fewer bachelor’s degrees. But we do need better bachelor’s degrees, and in this respect Murray’s arguments have value.
Murray believes that fewer than ten percent of people have the cognitive abilities necessary to earn a legitimate four-year degree. Space does not permit an exploration of the underlying arguments related to IQ and academic standards. I will simply note (and Murray acknowledges this) that over 30 percent of working-age adults currently have four-year degrees (at least) and assert that the evidence he presents suggesting that over two-thirds of those degrees are illegitimate is exceedingly thin and unpersuasive.
Murray would replace much of our current higher education system with a massive regime of workplace certification via standardized testing. Instead of wasting time in the futile pursuit of degrees that lie beyond their cognitive means, the vast majority of students would focus immediately on vocational training after completing high school. In support of this idea, he offers the example of accountants, who must pass the CPA exam. “The merits of the CPA exam,” Murray writes, “apply to any college major for which the BA is now used as a job qualification. To name just some of them: journalism, criminal justice, social work, public administration… business, computer science, engineering technology, and education.” Businesses, he says, could band together to create common professional standards and assessments to match. In Murray’s preferred future, students could bypass a traditional college education and go right to the test.
But that raises an obvious question: If businesses would be better off under such a system, why haven’t they implemented it already? The accounting industry, for example, is dominated by just four companies—the so-called “Big Four”—with combined annual revenues of roughly $100 billion. If KPMG, Ernst & Young, et al. wanted people to be able to sit for the CPA exam without first earning 120 (or often 150) college credits, I imagine they would lobby state accounting boards to make the change, and apply significant resources to the effort. (I am told that large businesses have some influence over the government entities that regulate them, as well as state lawmakers.) Similarly, the rapidly consolidating newspaper industry could, if it wished, create a national CJ (Certified Journalist) exam. Tests, after all, are cheap compared to the cost of recruiting and retaining talented employees. The Washington Post already owns the largest test prep company in the nation.
Of course, none of these things are actually happening, and for good reason: Employers value the bachelor’s degree, and most professions aren’t as easily definable—and thus testable—as accounting. To believe the world should already work the way Murray wishes is to assume a catastrophic ongoing failure of intelligence and rational self-interest on the part of the business community. (Admittedly, recent events on Wall Street make this more plausible.)
Murray’s proposal would also radically restrict access to the traditional liberal arts curriculum. The number of institutions offering such an education would contract to the familiar set of elite colleges on the coasts, while the rest would have to re-tool for vocational training. It is undoubtedly true that some people aren’t smart enough to master the complexities of philosophy, literature, and social science. But it’s also very difficult to decide ahead of time who those people are. America’s open, flexible labor markets constitute one of our great competitive advantages in the global economy, and that includes broad access to higher education. By ensuring that everyone can get accepted to college somewhere, and by using public funds to keep prices low, we maximize the chance that our future innovators, CEOs, and political leaders will find the higher education they need, regardless of their socioeconomic background.
Imagine, for example, a young man from a working-class background growing up in an apartment above a bank in a small town in Illinois. His high school interests run from sports to acting, and he aspires to be a broadcast journalist. In Charles Murray’s world, such a student would likely be counseled into a short-term training program to prepare for the national journalism test. Fortunately, Ronald Reagan had the chance to attend Eureka College, where he took to economics and sociology before going on to other things.
Indeed, Reagan’s life of unlikely career transitions points to the main benefit of a good college education: It teaches students not how to do but how to think in ways that are applicable across varied careers. And such skills are much more important to many more people now than they were eighty years ago. The future economy will hold vast numbers of jobs that have yet to be invented. The bachelor’s degree will qualify students to pursue all of them and graduate education besides, while a narrowly defined certificate, by definition, will not.
College is not, moreover, only about preparing for work. Higher education exposes students to our intellectual and cultural inheritance, to hard-won wisdom and works of surpassing beauty. Sometimes these things merely enrich students’ lives; sometimes they spur creation of new contributions to the human project. Locking students out of academia will worsen cultural divisions and leave more people to the depredations of a coarsening popular culture.
Murray points to the large number of students who drop out of college as evidence that we are cruelly forcing students to waste time and money pursuing a goal they are intellectually incapable of achieving. But this ignores virtually everything that has been learned about why students leave college. Most students who drop out don’t fail out. Rather, they leave for a complex set of reasons, including increasingly high costs, competing demands of work and family, and colleges that fail to provide an engaging, high-quality education. Students are more likely to stay in school if they’re learning, and they’re more likely to learn when they’re well taught. Yet teaching is an afterthought in many colleges and universities, subordinated to the demands of research, athletics, fundraising, and the rest. The problem often is not that academic standards are too high but that they are too low, resulting in boring, unfulfilling courses that students conclude they can do without.
And it is in this last area that Murray’s essay—along with Real Education, the book on which it is based—touches on some very legitimate areas of concern. It’s true that our higher education system is not serving the interests of many students. Not because it encourages them to earn a bachelor’s degree, but because it does a poor job of helping them succeed. Colleges are not judged by how well they teach students, and this includes the most elite institutions. Charles Eliot, the great 19th and early-20th century Harvard president, was supposedly once asked how his university was able to amass its store of knowledge. “Because our students arrive with so much,” he is said to have replied, “and leave with so little.” This remains more or less true today.
Academic quality should be a higher priority for college presidents, and far more attention should be given to student success. Murray is correct that not all programs have to take four years. (European universities are rapidly coalescing around a three-year standard.) Degree programs would certainly benefit from more empiricism and connection to the real-world concerns of the workplace. There are many students out there majoring in business, education, social work, etc. who aren’t learning very much about those things, and the same is true for the classic liberal arts.
But the solution isn’t to divert those students into a huge testing and certification apparatus that would cripple a higher education system that remains, for all its flaws, a bulwark of the economy and the envy of the world. Instead, we should ensure that students learn more in college by keeping higher education affordable and holding colleges and universities accountable for how much students learn and whether they eventually succeed in the workforce and life. The bachelor’s degree represents the best of American opportunity, a vehicle for social and economic advancement that has produced fantastic dividends for our economy, citizenry, and society. We need to make it better, not tear it down.
—
Kevin Carey is the research and policy manager for Education Sector, an independent education policy think tank.