
Cannon Green and Chapel Drive, Princeton University
Denise Applewhite—Princeton University, Office of Communications

College is now the second-largest financial expenditure for many families, exceeded only by the purchase of a home. So it isn’t surprising that parents and students are taking a hard look at the costs and payoffs of any college they consider.

To help families do that, Money has drawn on the research and advice of dozens of the nation’s top experts on education quality, financing, and value to develop a new, uniquely practical analysis of more than 700 of the nation’s best-performing colleges.

Mark Schneider, vice president at the American Institutes for Research (AIR) and president of College Measures, consulted with Money on this project. Data collection and analysis were performed by researcher Cameron Smither and research assistant Bao Lin Xu. Principal researcher Matthew Soldner supported the project team. Money’s editorial staff, however, is solely responsible for the final ranking decisions.

Here is the methodology behind our new 2017 rankings, first in a short version, followed by a more comprehensive one.

Money's Methodology, in Brief

To make our initial cut, a college had to:

  • Have at least 500 students.
  • Have sufficient, reliable data to be analyzed.
  • Not be in financial distress.
  • Have a graduation rate that was at or above the median for its institutional category (public or private), or have a high “value-added” graduation rate (in other words: score in the top 25% of our test of graduation rates after accounting for the average test scores and percentage of low-income students among its enrollees).

A total of 711 schools met these requirements. We ranked them on 27 factors in three categories:

Quality of education (1/3 weighting), which was calculated using:

  • Six-year graduation rate (30%). This measure is widely recognized as one of the most important indicators of a college's quality.
  • Value-added graduation rate (30%). This is the difference between a school's actual graduation rate and its expected rate, based on the economic and academic profile of the student body (measured by the percentage of attendees receiving Pell grants, which are given to low-income students, and the average standardized test scores of incoming freshmen).
  • Peer quality (10%). This is measured by the standardized test scores of entering freshmen (5%) and the percentage of accepted students who enroll in that college, known as the "yield" rate (5%).
  • Instructor quality (10%). This is measured by the student-to-faculty ratio.
  • Financial troubles (20%). This is a new factor added in 2017, as financial difficulties can affect the quality of education, and a growing number of schools are facing funding challenges.

Affordability (1/3 weighting), which was calculated using:

  • Net price of a degree (30%). This is the estimated amount a typical freshman starting in 2017 will pay to earn a degree, taking into account the college's sticker price; how much the school awards in grants and scholarships; and the average time it takes students to graduate from the school, all as reported to the U.S. Department of Education. (The data on merit and need-based grants you'll see reported on our website are numbers reported by the colleges to Peterson's, and are for your information but are not used in the rankings.)
  • Debt (20%). This takes into account both the estimated average student debt upon graduation (15%) and average amount borrowed through the parent federal PLUS loan programs (5%).
  • Student loan repayment and default risk (15%). This combines the percentage of borrowers who have paid down at least $1 in principal within five years of payments, as reported on the federal College Scorecard, and the Student Loan Default Risk Index, which expresses the number of borrowers who default on their federal student loans as a percentage of the school’s enrollment.
  • Value-added student loan repayment measures (15%). These are the school’s performance on the student loan repayment and default measures after adjusting for the economic and academic profile of the student body.
  • Affordability for low-income students (20%). This is based on federally collected data on the net price that students from families earning $0 to $30,000 pay.

Outcomes (1/3 weighting), which was calculated using:

  • Graduates' earnings (12.5%), as reported by alumni to PayScale.com; early career earnings within five years of graduation (7.5%), and mid-career earnings, which are for those whose education stopped at a Bachelor’s degree and graduated, typically, about 15 years ago. (5%).
  • Earnings adjusted by majors (15%). To see whether students at a particular school earn more or less than would be expected given the subjects students choose to study, we adjusted PayScale.com's data for the mix of majors at each school; for early career earnings (10%) and mid-career earnings (5%).
  • College Scorecard 10-year earnings (10%). The earnings of federal financial aid recipients at each college as reported to the IRS 10 years after the student started at the college.
  • Estimated market value of alumni’s average job skills (10%). Based on a Brookings Institution methodology, we matched data provided by LinkedIn on the top 25 skills reported by each school’s alumni with Burning Glass Technologies data on the market value of each listed skill.
  • Value-added earnings (12.5%). To see if a school is helping launch students to better-paying jobs than competitors that take in students with similar academic and economic backgrounds, we adjusted PayScale.com's earnings data for the student body's average test scores and the percentage of low-income students at each school; for early career earnings (7.5%) and mid-career earnings (5%).
  • Job meaning (5%). We used the average score of each school’s alumni on PayScale.com’s survey question of “Does your work make the world a better place?”
  • Socio-economic mobility index (20%). For the first time, we included new data provided by the Equality of Opportunity Project that reveals the percentage of students each school moves from low-income backgrounds into upper-middle-class jobs by the time the student is 34 years old.

Finally, we used statistical techniques to turn all the data points into a single score and ranked the schools based on those scores.

For a more detailed description of the methodology, read on.

Money's Methodology, in Detail

Money’s Best Colleges for Your Money rankings are the first to combine the most accurate pricing estimates available with all reliable indicators of alumni financial success, along with a unique analysis of how much “value” a college adds when compared to other schools that take in similar students.

We estimate a college’s “value-add” by calculating its performance on important measures such as graduation rates, student loan repayment and default rates, and post-graduation earnings, after adjusting for the types of students it admits. We believe this analysis gives students and parents a much better indication of which colleges will provide real value for their tuition dollars.

In building our rankings, Money focused on the three basic factors that surveys show are the most important to parents and students:

  • Quality of education
  • Affordability
  • Outcomes

Because these three factors are so interrelated and crucial to families, we gave them equal weights in our ranking.

In each of these three major categories, we consulted with our advisers to identify the most reliable and useful data to assess a school’s performance. We also balanced the basic data in each category with at least one “value-added” measure.

To gauge how well a college is performing relative to its peers, we gathered federal and college data on each school’s admissions selectivity (such as students’ average test scores and grade point averages), the percentage of each graduating class receiving degrees in each major, and the percentage of students with incomes low enough to qualify for Pell Grants (about 90% of which go to families with incomes below $50,000). We then used the statistical technique of regression analysis to determine the impact of students’ test scores, economic backgrounds, and college majors on key outcomes, such as graduation rates and future earnings. That enables us to see how much better or worse a particular college performed than is typical for schools with similar students.

We used this estimate of comparative performance by the college as an important part of the ranking, as you’ll see below.
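The regression approach described above can be sketched in a few lines. This is a minimal illustration, not Money's actual model: all the figures are hypothetical, and the variable names (`avg_test_score`, `pell_share`, `grad_rate`) are our own labels for the kinds of inputs the article describes. The value-added score for each school is simply the residual of an ordinary least squares fit.

```python
import numpy as np

# Hypothetical inputs, one row per college:
# avg_test_score: mean standardized test score of incoming freshmen
# pell_share: fraction of students receiving Pell grants
# grad_rate: actual six-year graduation rate
avg_test_score = np.array([1050, 1210, 1390, 980, 1300], dtype=float)
pell_share = np.array([0.45, 0.30, 0.12, 0.55, 0.20])
grad_rate = np.array([0.52, 0.68, 0.93, 0.41, 0.80])

# Ordinary least squares: predict each school's graduation rate
# from the profile of the students it admits.
X = np.column_stack([np.ones_like(pell_share), avg_test_score, pell_share])
beta, *_ = np.linalg.lstsq(X, grad_rate, rcond=None)

predicted = X @ beta
value_added = grad_rate - predicted  # positive = better than expected
```

A school whose actual rate beats its predicted rate gets a positive residual, which is what "value-added" means in this context; the same residual technique applies to the earnings and loan-repayment measures later in the article.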

Money assigned each indicator a weight based on our analysis of the importance of the factor to families, the reliability of the data, and our view of the value of the information the data provided.

To avoid overloading readers with too many choices or too much data, and to ensure that we fairly compared apples with apples, we decided to analyze only those colleges that, in our view, passed these minimal quality and data tests. We decided that to be included in our rankings, a college must meet these criteria:

  • Be a public or private non-profit college in the United States that enrolls freshmen.
  • Have at least 500 in-person undergraduate students and at least 150 full-time first-time (FTFT) undergraduate students.
  • Charge tuition in dollars. We screened out military academies that require a commitment of service in exchange for free tuition because we don’t know how to value the cost of the time commitment and risk.
  • Have sufficient reliable data to be analyzed.
  • Not show signs of financial distress, defined as having at least two of the following indicators: low ratings from bond-rating agencies, warnings from accreditors, layoffs in the last year, or placement on “heightened cash management” by the U.S. Department of Education.
  • Have a graduation rate that was at or above the median for its institutional category (public, private, or HBCU). Or have a high “value-added” graduation rate (in other words: score in the top 25% of our test of graduation rates after accounting for the average standardized test scores, average high school grade point average, share of entering class in the graduation rate cohort, and percentage of students receiving a Pell grant).
  • Have a Cohort Default Rate (CDR) lower than 25%.

This eliminated some colleges that may be good values, but might be facing temporary financial difficulties or be too small for us to evaluate. But it left us with a robust universe of more than 700 colleges. In our view, even the lowest-ranked of the schools on our list demonstrate that they provide at least some value for your tuition dollars.

We then used the following data and methodologies in our three basic categories to create our rankings:

Quality of Education: 33.3% weighting

For this factor, we combined the following indicators which, research indicates, provide meaningful information about the quality of a school’s instruction, weighted as shown:

Graduation rates: 30%. Education experts and college officials generally agree that one of the most important reflections of a college’s quality is its graduation rate. (The U.S. Department of Education calls them "helpful indicators of institutional quality.") Many rankings use this commonly cited federal statistic on the percentage of freshmen that graduate within six years. Because of its importance and wide acceptance, we assigned this measure a comparatively heavy weight.

Value-added graduation rate: 30%. Many education experts and college officials point out that the basic graduation rate number, while useful, is an insufficient indicator of a college’s value because research shows that wealthier students and students who got good grades in high school are more likely to graduate on time whatever college they attend. So elite, expensive schools such as, say, Harvard, would be expected to have high graduation rates. For that reason, we also calculated each school’s relative performance after accounting for the economic background and academic preparation of its students. The higher a school’s graduation rate was above the rate that would be predicted for a school with that particular mix of students, the more value that particular college is assumed to have added. This “value-added” graduation rate analysis is widely accepted. (A 2013 OECD paper found that such “value-added measurement provides a ‘fairer’ estimate of the contribution of educational institutions.”) Because of its reliability and acceptance, we weighted this factor heavily.

Peer quality: 10%. Decades of research have shown that undergraduates have a major impact on their peers. Students who room with better students get better grades, for example. By contrast, students surrounded by less studious peers—for example, heavy drinkers, video game players, etc.—study less and get worse grades. And students who room or socialize with more successful students tend to get better jobs upon graduation.

Our peer quality measure consists of these two indicators:

  • Academic preparation of students (5%). We used composite test scores of incoming freshmen to estimate the academic qualifications of the student body. While there is much debate over the usefulness and validity of standardized tests, the SAT and ACT tests currently provide the only nationally comparable data on student abilities. In addition, many studies have found a high correlation between test scores and academic success.
  • Yield (5%). The federally reported “yield” is the percentage of accepted students who enroll in a given college. The higher the yield, the more likely it is that the school was the student’s first choice, or best option, and that applicants perceive the college’s quality as high.

Faculty: 10%. Research shows that students who get more individual attention from faculty tend to achieve more, both in college and after graduation. (See, for example, the recent Gallup-Purdue study.) So we include each school’s student-to-faculty ratio.

Financial troubles: 20%. This is a new factor added in 2017 because financial difficulties affect the quality of education through layoffs of staff, closure of programs, and reductions of services. And a growing number of schools are facing significant funding problems. A school was deemed to be facing financial trouble if it:

  • Is on the “Heightened Cash Monitoring 2” list published by the U.S. Department of Education reflecting concerns about its financial stability.
  • Has bonds rated below investment grade by Moody’s or Standard & Poor’s.
  • Has received warnings from its accreditor.
  • Has had staff layoffs in the most recent academic year.

Changes from our 2016 rankings: For 2017, we slightly reduced the weightings of our standard quality measures to make room for the new financial trouble measure.

Affordability: 33.3% weighting

For this factor we used eight indicators, weighted as shown:

Net price of a degree: 30%. Money has developed a unique and — according to many experts — more accurate estimate of college prices. We started with the “sticker” price provided by the college to the federal government. The full cost of attendance includes tuition, fees, room, board, books, travel, and miscellaneous costs. For public colleges, we used the in-state tuition and fees.

We then subtracted the average amount of institutional aid provided per student by the college. That gave us an estimate of the net price the college charged an average student for the most recent academic year these data were available. (The data on merit and need-based grants you'll see reported on our website are numbers reported by the colleges to Peterson's, and are for your information but are not used in the rankings.)

Next, we used federal data on the percentage of students who graduate from that college in four, five, and six years respectively to calculate an average time to degree, which for the vast majority of schools is now more than 4 years.

Judith Scott-Clayton of Columbia University, for example, has found that the average college graduate pays for 4.5 years of college, not just four. However, because Money’s 2017 rankings include only those schools with the best graduation rates, the average time to degree for all of the schools on our list is just 4.3 years.

We generated an estimated net price for each of six years, inflating each successive year’s price slightly, since tuition typically rises annually. We then created a weighted average total net price based on those yearly prices and the proportions of students who complete their degrees within four years, in their fifth year, and in their sixth year. A handful of exceptions are made for institutions with longstanding co-op programs.
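The degree-price estimate described above can be sketched as follows. All the numbers here are hypothetical stand-ins (a $24,000 first-year net price, 3% annual tuition growth, and assumed completion shares), chosen only to show the mechanics of the weighted average.

```python
# Sketch of the weighted-average degree price. Inputs are assumptions,
# not actual figures from any school.
net_price_year1 = 24_000              # sticker price minus average institutional aid
inflation = 0.03                      # assumed annual tuition growth
shares = {4: 0.60, 5: 0.25, 6: 0.15}  # hypothetical shares finishing in 4, 5, 6 years

# Project the net price forward for six years.
yearly = [net_price_year1 * (1 + inflation) ** t for t in range(6)]

# Cumulative cost of a 4-, 5-, or 6-year degree.
cost_by_years = {n: sum(yearly[:n]) for n in shares}

# Weight each total by the share of students who take that long,
# yielding the average total net price and average time to degree.
avg_total = sum(share * cost_by_years[n] for n, share in shares.items())
avg_time = sum(n * share for n, share in shares.items())
```

With these assumed shares, the average time to degree works out to 4.55 years, which is why a single-year price understates what a typical student actually pays for a degree.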

College counselors and financial aid experts have told us that the Money calculation is more realistic and helpful to parents and students than other popular price estimates, most of which use the cost of a single year, not the full degree. Sandy Baum, a senior fellow at the Urban Institute, says our estimates of the net prices of individual institutions “provide good benchmarks.” She added an important caveat, however: “Students at each institution face a wide range of net prices, so no individual student should assume that the schools on this list with highest net prices would end up being most expensive for them.”

The federal net price estimate for a year’s costs is typically lower than Money’s estimate because the Department of Education subtracts almost all grants—federal, state and institutional—while we only subtract aid provided by the school, for reasons described below.

Money gives the net price sub-factor a very heavy weighting because surveys show that the cost of college is now one of families’ biggest worries. A 2014 Harvard Institute of Politics survey reported, for example, that 70% of young adults say finances were a major factor in their college decision.

Educational debt: 20%. Surveys show debt to be another of families’ biggest worries.

Our educational debt assessment is based on these two indicators:

  • Student borrowing (15%). Our measure of student borrowing starts with the average dollar value of federal student loans taken out by entering undergraduates, multiplies that by the share of entering undergraduates who take out a federal student loan, and then multiplies that again by the institution’s average time to degree. This generates the average total federal student loan amount across all entering undergraduate students. These data are reported by the federal government or calculated using federally reported data. A handful of exceptions are made for institutions with longstanding co-op programs.
  • Parent borrowing (5%). The federal government reports the total amount of parent PLUS loans awarded to parents at each college each year. We divided this number by the school’s total undergraduate enrollment to calculate an average parent PLUS debt per undergraduate student. While other organizations generally don’t include parental debt in their rankings, Money believes parent educational borrowing is a financial burden, and should be an important consideration.
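The two borrowing measures above are simple multiplications, sketched here with hypothetical figures (loan amounts, borrowing rates, and enrollment are all invented for illustration):

```python
# Student borrowing: average loan per borrower x share who borrow
# x average time to degree = average total federal debt per entrant.
avg_loan_per_borrower = 6_500     # assumed dollars borrowed per year
share_borrowing = 0.55            # assumed fraction of entrants with a federal loan
avg_time_to_degree = 4.4          # assumed years, from completion-rate data

avg_student_debt = avg_loan_per_borrower * share_borrowing * avg_time_to_degree

# Parent borrowing: total PLUS dollars awarded / undergraduate enrollment.
total_plus_loans = 3_200_000      # assumed annual PLUS dollars at the school
undergrad_enrollment = 4_000
plus_per_student = total_plus_loans / undergrad_enrollment
```

Note that spreading total PLUS dollars across all undergraduates, as described in the bullet above, gives a per-student average even though only a minority of families actually take out PLUS loans.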

Ability to repay: 30%. The ability to repay loans taken out to finance a college education is another indication of a school’s affordability for its students.

We evaluated ability to repay using these four indicators:

  • Student loan default risk index (SDRI) (7.5%). Each year, the federal government publishes the number of former students who left college three years ago and have since defaulted on their federal student loans. Using a methodology proposed by The Institute for College Access and Success (TICAS), Money adjusts these numbers for the share of undergraduate students at the college who take out federal student loans. TICAS says this is a fairer and more accurate indicator of the odds that any particular student at the college will end up defaulting on a student loan.
  • SDRI value-add (7.5%). We measured how far the school’s SDRI fell above or below the level typical of schools with similar student bodies.
  • Federal student loan repayment (7.5%). We used the new data released on the federal College Scorecard on the percentage of student borrowers who pay down at least $1 on their principal in their first five years of payments. We believe this provides different and important information about alumni financial stability since some schools have started to game default rates by helping students to take advantage of hardship programs that delay defaults until after the three-year measuring window is closed.
  • Student loan repayment value-add (7.5%). We measured how far the school’s repayment rate fell above or below the level typical of schools with similar student bodies.
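The TICAS-style adjustment behind the SDRI can be shown in two lines. The figures are hypothetical; the point is that a cohort default rate counts defaulters only among borrowers, so weighting it by the share of students who borrow approximates the risk faced by an arbitrary student at the school.

```python
# Sketch of the default-risk adjustment (assumed figures).
cohort_default_rate = 0.08   # defaulters / borrowers, three-year window
share_who_borrow = 0.50      # borrowers / all undergraduates

# Approximate chance that any given student at the school
# ends up defaulting on a federal student loan.
student_default_risk = cohort_default_rate * share_who_borrow
```

Under this adjustment, a school where few students borrow scores better than one with the same default rate but near-universal borrowing, which is the fairness argument TICAS makes.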

Affordability for low-income families: 20%. The federal government reports the average price paid at each school by federal aid recipients who are in the lowest income group ($0-$30,000). Money uses this as an indicator of how well the college targets need-based financial aid. At some schools, those from the lowest-income group are expected to come up with more than $20,000 per year—which may be more than the entire annual income of the family.

Changes from our 2016 rankings: The only change to this section this year was that we made a small shift (5%) in the weights from the average net price paid by all students to put slightly more emphasis on the net price charged the lowest income students, because of what we perceive to be as growing concerns about the targeting of aid to those who truly need it.

Outcomes: 33.3% weighting

For our third category, we used a total of 12 indicators, weighted as shown below.

College Scorecard earnings: 10%. Median 2012-13 earnings, as reported to the IRS, of people who started as college freshmen and received federal aid in 2001 or 2002 and who were working and not enrolled, published on the U.S. Department of Education’s College Scorecard.

Payscale.com earnings: 40%. Although the earnings data reported on PayScale.com are based on what visitors to the site voluntarily enter, our analysis shows that they are reliable. PayScale provided Money with aggregated data on more than 1.4 million people who in the last three years have filled out an online form that asks what their job is, what their salary is, where they went to college, what they majored in, and when they graduated. This supplements the new federal data because it covers only students who earned a bachelor’s degree from the school, not dropouts. It also screens out those who have earned graduate degrees, and who thus may be earning more because of their graduate, rather than undergraduate, education. While the two data sets cover different populations, the correlation between the PayScale.com and federal College Scorecard earnings data for this group of 711 colleges is 0.8, which is quite high.

  • Early career earnings (7.5%). Earnings as reported for alumni with less than five years’ work experience.
  • Mid-career earnings (5%). We used the average earnings reported by those who do not have graduate degrees and have at least 10 years’ work experience. The typical worker in this group has 15 years of work experience. We weighted early career earnings more heavily than mid-career earnings because Trey Miller, a RAND Corporation economist who has studied the relationship between college choice and earnings, noted that college choice has a much stronger impact on the type and pay of a first job after graduation than it does on job type and pay for a person who has been in the workforce for, say, 10 years. By then, experience and skills acquired since graduation also play an important role.
  • Value-added early career earnings (7.5%). We measured how far the school’s early career earnings fell above or below the level typical of schools with similar student bodies.
  • Value-added mid-career earnings (5%). We measured how far the school’s mid-career earnings fell above or below the level typical of schools with similar student bodies.
  • Major-adjusted early career earnings (10%). We analyzed the impact of the popularity of each major on schools’ average earnings, and subtracted out the impact, say, of having many engineering or education majors on each school’s average earnings. We then calculated how much above or below the school’s adjusted earnings were compared with what would be typical for schools with similar major mixes. This allows us to compare, for example, schools that specialize in producing teachers with schools that produce engineers, and to determine which art schools produce the highest-earning artists.
  • Major-adjusted mid-career earnings (5%).

Estimated market value of alumni skills: 10%. In April 2015, the Brookings Institution published an analysis of new data shedding light on the value added by each college. One of Brookings’ indicators was an estimate of the national average job market value of the 25 most commonly cited skills listed by alumni of each college in their LinkedIn profiles. Using the methodology developed by Jonathan Rothwell, the author of the Brookings study, Money gathered 2017 data from LinkedIn on the 25 most commonly listed job skills for each school’s alumni. We then obtained data from Burning Glass Technologies on the market value of each skill. This allowed us to calculate the average job market skill value for students at each college. This data is for all graduates, including those who have earned additional degrees, so it captures the earnings of some schools’ higher earners.

Job meaning: 5%. Our 2016 Money/Barnes & Noble College survey found that parents and students alike put a higher emphasis on finding a “fulfilling” career than on finding a high-paying job. So we included in this year’s rankings alumni’s responses to a PayScale.com survey question about whether their job “makes the world a better place.”

Share of alumni who are unemployed six years after starting school: 7.5%. We have added to the rankings new data from the federal College Scorecard on the percentage of people who started as freshmen in 2005-06, and were neither enrolled in school nor employed in 2012-13.

Share of alumni earning at least $25,000 within six years of starting college: 7.5%. The federal government now publishes this data to show which colleges produce alumni who, within six years of starting college, were earning at least as much as the typical high school graduate in 2012-13.

Socio-economic mobility: 20%. In January of 2017, a team of economists led by Raj Chetty of Stanford published a paper showing data for each college on how many low-income students (i.e., with family incomes in the lowest quintile, or below $25,000 a year) were earning incomes that put them in the top quintile for their age group (at least $58,000) by 2014. We used the Equality of Opportunity Project’s “mobility rate” in the rankings this year because that is the indicator that shows “which colleges are contributing to the American Dream,” according to a co-author of the study, Brown University economist John Friedman.

Changes from our 2016 rankings: To make room for the new socio-economic mobility indicator, and the College Scorecard’s employment and $25,000 salary data, we eliminated two previously used factors – the staffing level of the career services offices, and the presence (or absence) of a program linking job-seeking undergraduates with working alumni. To make additional room, we slightly reduced the weight given to the PayScale early career earnings factors and PayScale’s “job meaning” survey.

How We Calculated These Rankings

For each of our data points, for each college, we calculated a “Z-score” — a statistical technique that turns lots of differently-scaled data into a standard set of numbers that are easily comparable. Each Z-score tells you how far any particular number — such as, say, a graduation rate of 75% — is from the average for the entire group under consideration. To reward colleges that demonstrate generalized excellence (and dampen the effect of a school’s outperformance on any one particular measure) we limited the range of the Z-scores to three standard deviations. We then rescaled each of the three main factors so that they had the same ranges. Finally, we ranked the schools according to their total score across our three main factors.
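The Z-score step described above is standard statistics; here is a minimal sketch with invented graduation rates. Standardizing puts differently scaled metrics (percentages, dollars, ratios) on a common footing, and capping at three standard deviations keeps one extreme result from dominating a school's total.

```python
import numpy as np

def capped_z(values, cap=3.0):
    """Standardize a metric across colleges, capping at +/- `cap`
    standard deviations to dampen outsized performance on any one measure."""
    v = np.asarray(values, dtype=float)
    z = (v - v.mean()) / v.std()
    return np.clip(z, -cap, cap)

# Hypothetical six-year graduation rates for five schools:
grad_rates = [0.55, 0.62, 0.75, 0.81, 0.93]
z = capped_z(grad_rates)
```

Each school's scores would then be summed across indicators (with the weights given above), the three category totals rescaled to a common range, and the schools ranked on the grand total.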

Some Caveats

While we believe our rankings are better than any others out there, we know they are not perfect. Here are some caveats that we are aware of and hope to address in future rankings.

Student learning. What students actually learn is a big unanswered question in higher education. The various assessments of college student learning are controversial; few colleges use them, and very few of those that do release the results publicly. In addition, we were not able to find good data on basic indicators of academic rigor, such as the number of pages of writing or reading required per assignment. We will continue to explore the data in hopes of finding useful indicators of student learning.

Geographical cost of living adjustments. Some colleges in low-wage, low-living-cost areas such as parts of the South and Midwest may get lower rankings because we have not adjusted the earnings data for cost of living. But several factors added to the Money rankings recently lessen the impact of any geographic wage differential. The Brookings Institution skills value measure, for example, is based on national averages, so it treats all regions equally, as do the College Scorecard employment data and PayScale’s “Job Meaning” survey. The student loan default and repayment data also should treat all regions equally, since they show which alumni have sufficient disposable income to repay their loans, which in theory may benefit students in low-cost areas.

Alumni satisfaction. The information that’s currently available on alumni satisfaction—based on surveys and donation rates—is incomplete for many of the colleges on our list, so we were unable to include it as a measure. We are looking for ways to improve the alumni data and make it part of future rankings.

Out-of-state-public college tuition. Many students are interested in attending public colleges out of state. But public colleges charge higher tuition to out-of-state students. We will consider developing a cost and value comparison for out-of-state students.

Net prices. Money’s estimated net price is likely to be higher than the average price actually paid by most families. It is crucial to understand that while the Money net price estimate is based on the average price charged by the college, you and your family will pay less than that if your student receives any federal, state, or private scholarships. As an analogy, if you’re buying a can of soup, you have to pay what the grocery store charges, unless you have a coupon. Just as coupons can be used at competing supermarkets, most federal, state, and private scholarships can be used at many competing colleges. So we help you identify which college has the lowest net price, to which you can then apply any additional scholarships.

In addition, our net price is based on the average student’s time-to-degree. Your student may finish in four years. And while many students take more than four years to finish a degree, they aren’t necessarily paying full tuition for the five or six years before they graduate, since they may, for example, take a year off to work. Money attempted to account for this by adjusting the estimated time to degree for all schools with large and established co-op work programs, such as Northeastern University.

Finally, Money does not add to the cost of a degree any amount for “opportunity cost,” the earnings a student loses by not finishing a degree on time and delaying entry into the higher-paying job market of college graduates. So, while we may, in some cases, be overestimating the price of a degree, we are also underestimating the total economic expense to a student of schools that don’t speed them to graduation.
