Editor’s Note: Amy Laitinen is deputy director for higher education at New America. She previously served as a policy adviser to the undersecretary and assistant secretary for vocational and adult education at the U.S. Department of Education and the White House. This is the latest in a series, “Big Ideas for a New America,” in which the think tank New America spotlights experts’ solutions to the nation’s greatest challenges. The opinions expressed in this commentary are solely those of the author.

Story highlights

Is American higher education worth the price? Are students and their families getting what they're paying for?

Amy Laitinen: The big problem is that we measure education in terms of time, rather than learning

CNN  — 

The cost of college has rapidly increased over the past 30 years. Students today face annual tuition and living costs that can easily exceed $10,000 at a community college, $18,000 at a public four-year college (in-state), and $40,000 at a private four-year school. It’s unsurprising that today’s students often graduate with large debt loads: more than two-thirds graduate with debt, and the average amount owed is about $30,000.

Given the cost of college, students and families need to know that they’re making a good investment. That’s why we need to move to a system where we measure learning outcomes, not just time spent in a classroom accumulating credits.

A college degree is the only sure path to middle-class security, and because young people and their parents know that, the cost of college and the availability of loans and other aid have become powerful political issues. But for all the attention paid to the price of college, we haven’t given enough thought to whether students and their families are getting their money’s worth. Is American higher education worth the price? Are students and their families getting what they’re paying for?

There’s plenty of evidence that for many of them, the answer is no. In 2006, a government study found that nearly 70% of college graduates could not perform basic tasks like comparing opposing editorials. In a 2011 book, “Academically Adrift,” researchers studied 2,000 students at two dozen universities over four years and found that 45% of them showed no significant gains on a test of critical thinking, complex reasoning, and communication skills after two years of college. Even at the end of four years, 36% of the students hadn’t gained those skills.

Given the evidence, maybe it’s not a surprise that employers aren’t impressed by recent college graduates.

Employers want the skills that higher education says it provides to students: the ability to think critically, communicate, work in a team, write effectively, and adapt. Yet only about one-quarter of employers say that colleges and universities are doing a good job of preparing students for the challenges of today’s global economy. A recent Gallup poll found that only 11% of business leaders strongly agreed that college graduates have the skills necessary to succeed on the job. In addition to money, these graduates have spent hours and hours in classrooms and taking tests, but that time doesn’t seem to have translated into learning.

Why is this? Perhaps it’s as simple as this: We measure education in terms of time, rather than learning. A four-year degree attests that you have accumulated 120 credit hours, not that you have learned anything in particular. That’s an accidental result of the credit hour system, created by philanthropist Andrew Carnegie more than 100 years ago for the purpose of providing struggling professors with pensions.

At the turn of the 20th century, Carnegie created a $10 million free pension fund to help professors retire. The Carnegie Foundation for the Advancement of Teaching, which was set up to administer the fund, determined that only “full-time” faculty would qualify for pensions, which it defined as teaching 12 “credit units,” with each unit equal to one hour of faculty-student contact time per week over a 15-week semester. While originally a narrow measure of faculty workload, the credit hour quickly morphed into much more. The Carnegie Foundation warned against using the credit hour as a proxy for student learning, but the temptation of an easy-to-understand and seemingly standardized measure was too great to resist. It just made organizing the whole higher education enterprise much easier.

If credit hours truly reflected a standardized unit of learning, they would be fully transferable across institutions. An hour in Arizona is an hour in New York. But colleges routinely reject credits earned at other colleges, suggesting that even though they use credit hours themselves, they know they are not a reliable measure of how much students have learned. Many students, however, believe the fiction that the credit hour is a standardized currency and assume that credits will transfer from one school to the next. This is an unfortunate and costly assumption, as community college students in Louisiana will tell you.

Until recently, Louisiana students with an associate degree typically lost between 21 and 24 credits when transferring to a four-year state school. That’s a year of time and money lost. Given that nearly 60% of students in the United States attend two or more colleges, the nontransfer of credits has huge costs, not only to individuals, but also to the federal government and states that are financing this duplicative classroom time. If higher education doesn’t trust its own credits, why should anyone else? And Louisiana students aren’t alone; transfer students across the country lose credits, which lengthens their time to get a degree.

So we have two problems: Students who have earned credits – at great expense in time and money – can’t use or transfer them. Others who have accumulated costly credits haven’t learned much.

And then there’s a third dimension: Millions of people who have learned a great deal have no “credit” because they learned it at the wrong place — that is to say, not at a “college.” Someone who has spent the last 10 years working as a nurse’s aide in a hospital and decides to pursue a nursing degree has to start from scratch, taking introductory courses he could probably teach, because colleges treat those without credits as blank slates. Employees at a biotech company with a high-quality on-the-job training program might learn more than someone in a two-year college science program, but unless that training is attached to an accredited institution of higher learning, the learning won’t “count.” For the millions of adult workers looking to retrain and reskill, whose time is already scarce between family and work, the focus on time rather than learning is a daunting proposition.

State and federal governments add to the problem: they spend hundreds of billions of dollars on higher education each year, but most of it pays for time served, in the form of credit hours, rather than for learning achieved.

We need to stop counting time and start counting learning.

What could that look like? We don’t have to wonder; some schools have been experimenting with measuring learning rather than time, some for decades. One relatively new program is Southern New Hampshire University’s College for America, or CfA, an online “competency-based” Associate of Arts degree aimed at working adults. The program has no courses, no credit hours and no grades. Instead, the school has broken down what students with a CfA degree should know and be able to do into what it calls competencies. CfA worked closely with employers to identify the competencies they were looking for, like communication, critical thinking and teamwork. Then faculty designed real-world tasks and projects to determine whether students had mastered each competency.

Unlike in credit-hour courses, CfA has no seat-time requirements. Students can move through the program as quickly as they can demonstrate mastery of the competencies. Someone who worked at a PR firm might whiz through the communications competencies and spend more time on the math competencies. And the faster students can progress, the less they will ultimately pay. Students pay $1,250 for all-they-can-learn in six months.

This means they can spend their precious time and money learning what they don’t already know, rather than wasting it on what they already do. Students at CfA can be confident that their time and money are well spent and that, at the end, they will have a very clear picture of what they know and can do. CfA is not the only school to offer this to students, nor the only model. Hundreds of schools, from Antioch to the University of Michigan to Purdue University, are looking to offer competency-based certificate, associate, and baccalaureate degree programs.

How are these universities staying afloat financially with such low tuition? In many cases, the answer is surprisingly simple – and, sadly, not commonplace in higher education: a focus on what students need in order to, wait for it, learn. Fancy amenities, great football teams and sprawling college campuses may bring attention, but they have little to nothing to do with student learning. Some competency-based programs don’t focus on research – faculty are hired for specific expertise, like curriculum design, English literature or advising. Other programs use technology and data analytics to help students and faculty understand where students are doing well and where they are struggling. This allows for more targeted, personalized support from faculty.

There is, however, a downside for students: Self-paced competency-based programs do not fit neatly into the historically time-based credit hour system, making it difficult for students in these programs to receive state and federal support. Without access to these dollars, the programs will remain one-offs, unavailable to the majority of Americans who could use them.

Only recently has the federal government recognized the role it could play in encouraging the move from seat time to learning by redirecting some of its nearly $150 billion financial aid budget. The U.S. Department of Education is encouraging innovation by colleges looking to experiment with alternatives to the credit hour, and there is strong bipartisan interest in both the House and Senate in exploring innovative ways of paying for learning, rather than time.

As higher education becomes increasingly necessary and expensive, measuring time rather than learning is a luxury that students, taxpayers and the nation can no longer afford. Paying for what students learn and can do, rather than how or where they spent their time, would go a long way toward providing students and the nation with desperately needed, high-quality degrees and credentials.
