How Money Ranked the 2020 Best Colleges

Money's annual Best Colleges for Your Money ranking offers a practical analysis of more than 700 four-year colleges. We spent months evaluating data on quality, affordability, and student outcomes. Read the full step-by-step breakdown below.

Six out of 10 parents of college-bound students surveyed by Sallie Mae this spring said they worried about how the pandemic will affect college affordability.

That’s little surprise, given how far college costs have already climbed in recent decades. Today, the average price paid at a four-year public college is roughly double what it was 30 years ago, even as household income has barely budged. Now, an economic recession will do little to ease affordability anxieties.

To help families navigate these challenges, Money’s seventh annual Best Colleges for Your Money offers a practical analysis of more than 700 four-year colleges.

Data collection and analysis for the rankings were led by American Institutes for Research (AIR) researchers Dr. Robert Nathenson and Dr. Audrey Peek, with the help of researcher Melissa Arellanes and research associate Marina Castro. Quality control was led by Dr. Eric Larsen of AIR. Money’s editorial staff, however, is solely responsible for the final ranking decisions.

Here is the methodology behind our new 2020 rankings, first in a short version, followed by a more comprehensive one.

Money’s Methodology, in Brief

To make our initial cut, a college had to:

  • Have at least 500 students.
  • Have sufficient, reliable data to be analyzed.
  • Not be in financial distress.
  • Have a graduation rate that was at or above the median for its institutional category (public, private or historically black college or university), or have a high “value-added” graduation rate (in other words: score in the top 25% of graduation rates after accounting for the average test scores and percentage of low-income students among its enrollees).

A total of 739 schools met these requirements. We ranked them on 27 factors in three categories:

Quality of education (30% of weighting), which was calculated using:

  • Six-year graduation rate (30%). This measure is widely recognized as one of the most important indicators of a college’s quality. We adjust our six-year graduation rate to capture students who transferred into a college as well as first-time students.
  • Value-added graduation rate (30%). This is the difference between a school’s actual graduation rate and its expected rate, based on the economic and academic profile of the student body (measured by the percentage of attendees receiving Pell grants, the majority of which are given to low-income students; the average standardized test scores and high school GPAs of incoming freshmen; and the share of traditional, full-time students on campus).
  • Peer quality (10%). This is measured by the standardized test scores of entering freshmen (5%), and the percentage of accepted students who enroll in that college, known as the “yield” rate (5%).
  • Instructor quality (10%). This is measured by the student-to-faculty ratio.
  • Financial troubles (10%). Financial difficulties can affect the quality of education, and a growing number of schools are facing funding challenges. Financial troubles are signaled by a college having low bond ratings, being labeled by the U.S. Department of Education as having financial issues, or having accreditation warnings.
  • Pell Grant recipient outcomes (10%). This measures how many Pell Grant recipients a school graduates, as a way to analyze how well schools support their low-income students.

Affordability (40% of weighting), which was calculated using:

  • Net price of a degree (30%). This is the estimated amount a typical freshman starting in 2019 will pay to earn a degree, taking into account the college’s sticker price; how much the school awards in grants and scholarships; and the average time it takes students to graduate from the school, all as reported to the U.S. Department of Education. (The data on merit and need-based grants you’ll see reported on our website are numbers reported by the colleges to Peterson’s. We publish these for your information, but they are not used in the rankings.)
  • Net price paid by students in different income brackets (20%). This is based on federally collected data on the net price for one year paid by students from families earning $0 to $30,000 (10%); students from families earning $30,001 to $48,000 (5%); and students from families earning $48,001 to $75,000 (5%). Data for the second two income groups are new to the ranking this year.
  • Debt (20%). This takes into account, for all undergraduates, both the estimated average student debt upon graduation (15%) and the average amount borrowed through the federal parent PLUS loan program (5%).
  • Ability to repay debt (15%). This measure includes the percentage of students who have paid down at least $1 in principal in their first five years of payments, as reported on the federal College Scorecard (10%). The remainder is the Student Loan Default Risk Index (5%), a calculation that measures the number of borrowers who default on their federal student loans as a percentage of the school’s enrollment.
  • Value-added student loan repayment measures (15%). These are the school’s performance on the student loan repayment and default measures after adjusting for the economic and academic profile of the student body.

Outcomes (30% of weighting), which was calculated using:

  • Graduates’ earnings (15%), as reported by alumni to PayScale.com. This includes earnings for alumni with up to five years of work experience (10%) and at least 10 years of work experience (5%).
  • Graduates’ earnings adjusted by majors (20%). To take into account the subjects students choose to study, this measure adjusts PayScale.com’s data for the mix of majors at each school for early career earnings (15%) and mid-career earnings (5%).
  • College Scorecard employment outcomes (30%). This includes two measures for federal financial aid recipients: the share of alumni who are neither working nor enrolled six years after starting college (15%) and the share of alumni who are earning more than $28,000, roughly equivalent to the earnings of a high school graduate, six years after starting (15%).
  • Earnings 10 years after college entry (10%). This captures the earnings of federal financial aid recipients at each college 10 years after the student started at the college, as reported to the IRS and presented in the College Scorecard.
  • Value-added earnings (15%). To capture whether a school is helping launch students to better-paying jobs than competitors that take in students with similar academic and economic backgrounds, this measure adjusts PayScale.com’s earnings data for the student body’s average test scores and high school GPA, and for the percentage of low-income students at each school.
  • Socio-economic mobility index (10%). We included data provided by Opportunity Insights that shows the percentage of students at each school who move from low-income backgrounds to upper-middle-class jobs by the time students reach their mid-30s.

For a more detailed description of the methodology, read on.

Money’s Methodology, in Detail

The Best Colleges for Your Money rankings combine the most accurate pricing estimates available with indicators of alumni financial success, along with a unique analysis of how much value a college adds when compared to other schools that take in similar students.

We estimate a college’s “value add” by calculating its performance on important measures such as graduation rates, student loan repayment and default rates, and post-graduation earnings, after adjusting for the types of students it admits. We believe this analysis gives students and parents a much better indication of which colleges will provide real value for their tuition dollars, rather than simply rewarding colleges that admit the highest-performing students.

In building our rankings, Money focused on the three basic factors that surveys show are the most important to parents and students:

  • Quality of education
  • Affordability
  • Outcomes

In the past, we’ve given these three crucial factors equal weights in our rankings. This year, given the economic outlook, we’ve increased the emphasis on affordability. Now, affordability accounts for 40% of the ranking, while the other two account for 30% each.

To gauge how well a college is performing relative to its peers, we gathered federal and college data on the admissions selectivity of each school, such as average test scores and average freshman GPA; the percentage of each graduating class receiving degrees in each major; and the percentage of students with incomes low enough to qualify for Pell Grants (the majority of which go to families with incomes below $50,000). We then used the statistical technique of regression analysis to model the impact of a student’s test scores, economic background, and college major on key outcomes, such as graduation rates and future earnings. That enables us to see how much better or worse a particular college performed than would be typical for schools with similar students.

Value-added measures are a way to capture aspects of a college’s quality that more commonly used raw numbers miss. They give some indication of how a college affects graduates’ outcomes, regardless of where those graduates started.
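
To make the adjustment concrete, here is a minimal sketch of the kind of regression-based value-added calculation described above. The column names and figures are hypothetical, and the real model uses more predictors (including high school GPA and the mix of majors), but the logic is the same: predict an outcome from the student body’s profile, then score each school on how far it lands above or below that prediction.

```python
# A minimal sketch of the "value-added" idea, assuming hypothetical column
# names and toy numbers; Money's actual regression model is more detailed.
import pandas as pd
import statsmodels.api as sm

colleges = pd.DataFrame({
    "grad_rate":      [0.62, 0.81, 0.55, 0.74, 0.68, 0.90],  # actual 6-year rates
    "avg_test_score": [1080, 1350, 1010, 1220, 1150, 1430],  # avg SAT-equivalent
    "pct_pell":       [0.45, 0.15, 0.55, 0.30, 0.38, 0.12],  # share with Pell Grants
    "pct_ftft":       [0.70, 0.95, 0.60, 0.85, 0.75, 0.97],  # share first-time, full-time
})

# Regress the outcome on student-body characteristics...
X = sm.add_constant(colleges[["avg_test_score", "pct_pell", "pct_ftft"]])
model = sm.OLS(colleges["grad_rate"], X).fit()

# ...then treat the gap between actual and predicted performance as "value added."
colleges["predicted"] = model.predict(X)
colleges["value_added"] = colleges["grad_rate"] - colleges["predicted"]
print(colleges[["grad_rate", "predicted", "value_added"]].round(3))
```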

Money assigned each indicator a weight based on our analysis of the importance of the factor to families, the reliability of the data, and our view of the value of the information the data provided.

To ensure a fair, apples-to-apples comparison, we analyzed only those colleges that, in our view, passed minimal quality and data tests. To be included in our rankings, a college had to meet these criteria (a simple screening sketch follows the list):

  • Be a public or private nonprofit college in the United States that enrolls freshmen.
  • Have at least 500 in-person undergraduate students and at least 150 full-time first-time (FTFT) undergraduate students.
  • Charge tuition in dollars. We screened out military academies that require a commitment of service in exchange for free tuition because we don’t know how to value the cost of the time commitment and risk.
  • Have sufficient data to be analyzed. Any college missing graduation rates and PayScale earnings data was removed from our universe, and any college missing five or more metrics was removed from the final ranking.
  • Not show signs of financial distress, as defined by having at least two of the following indicators: low ratings from any of the three major bond-rating agencies, warnings from accreditors, or placement on “Heightened Cash Monitoring 2” by the U.S. Department of Education.
  • Have a graduation rate that was at or above the median for its institutional category (public, private, or HBCU), or have a high “value-added” graduation rate (in other words: score in the top 25% of graduation rates after accounting for the average standardized test scores, average high school grade point average, share of the entering class that are first-time, full-time students, and percentage of students receiving a Pell Grant). For these colleges, a minimum graduation rate of 40% was used as a cutoff.
  • Have a Cohort Default Rate (CDR) lower than 25%.
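
The sketch below shows how several of these checks could be expressed in code. The column names are invented for illustration, and it covers only the numeric criteria (it skips, for example, the nonprofit and tuition-in-dollars screens); the thresholds themselves come from the list above.

```python
# Hypothetical screening pass; column names are invented for illustration
# and only the numeric criteria from the list above are included.
import pandas as pd

def passes_initial_screen(row: pd.Series, grad_rate_median: float) -> bool:
    """Return True if a college clears the basic eligibility checks."""
    meets_grad_bar = (
        row["grad_rate"] >= grad_rate_median
        or (row["value_added_grad_pctile"] >= 75 and row["grad_rate"] >= 0.40)
    )
    return (
        row["undergrad_enrollment"] >= 500
        and row["ftft_enrollment"] >= 150
        and row["missing_metrics"] < 5
        and row["financial_distress_flags"] < 2
        and row["cohort_default_rate"] < 0.25
        and meets_grad_bar
    )

# Example usage on a DataFrame of colleges (hypothetical median shown):
# eligible = colleges[colleges.apply(passes_initial_screen, axis=1,
#                                    grad_rate_median=0.521)]
```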

This eliminated some colleges that may be good values, but might be facing temporary financial difficulties or may be too small for us to evaluate. But it left a robust universe of nearly 750. In our view, even the lowest-ranked of the schools on our list demonstrate that they provide at least some value for your tuition dollars.

We then used the following data and methodologies in our three basic categories to create our rankings:

Quality of Education: 30% of weighting

Changes from our 2019 rankings: We added bond ratings from Moody’s and Fitch Ratings and shifted some weight away from the financial difficulties category to increase the weight on the Pell graduation index.

For this factor, we combined the following seven indicators:

Graduation rates: 30%. Education experts and college officials generally agree that one of the most important reflections of a college’s quality is its graduation rate. (The U.S. Department of Education calls them “helpful indicators of institutional quality.”) Many rankings use the most commonly cited graduation rate statistic: the percentage of freshmen who graduate within six years. Yet that rate is based only on first-time, full-time students, meaning it misses a large (and growing) population of students. To help address that, we also use recently available federal data to expand that graduation rate by specifically including the share of students who transferred into a college and earned a degree within six years. The graduation rate we use is calculated for bachelor’s degree-seeking students only.

Value-added graduation rate: 30%. Many education experts and college officials point out that the basic graduation rate number, while useful, is an insufficient indicator of a college’s value because research shows that wealthier students and students who got good grades in high school are more likely to graduate on time no matter what college they attend — so elite, expensive schools would be expected to have high graduation rates. For that reason, we also calculated each school’s relative performance after accounting for the economic background and academic preparation of its students. The higher a school’s graduation rate was above the rate predicted for a school with that particular mix of students, the more value that particular college is assumed to have added. This “value-added” graduation rate analysis is widely accepted. (A 2013 OECD review of research on value-added analysis says it can provide “a more ‘accurate’ estimate of the contribution educational institutions make to students’ academic progress” by isolating student attainment from other contributing factors such as family characteristics and socioeconomic background.)

Peer quality: 10%. Decades of research have shown that undergraduates have a major impact on their peers. Students who room with better students get better grades, for example. By contrast, students surrounded by less studious peers — for example, heavy drinkers, video game players, etc. — study less and get worse grades. And students who room or socialize with more successful students tend to get better jobs upon graduation.

Our peer quality measure consists of these two indicators:

  • Academic preparation of students (5%). We used composite test scores of incoming freshmen to estimate the academic qualifications of the student body. While there is much debate over the usefulness and validity of standardized tests, the SAT and ACT tests currently provide the only nationally comparable data on student abilities.
  • Yield (5%). The federally reported “yield” is the percentage of accepted students who enroll in a given college. The higher the yield, the more likely it is that the school was the student’s first choice, or best option, and that applicants perceive the college’s quality as high.

Faculty: 10%. Research shows that students who get more individual attention from faculty tend to achieve more, both in college and after graduation. (See, for example, the recent Gallup-Strada study.) We include each school’s student-to-faculty ratio.

Financial troubles: 10%. We added this factor in 2017 because a growing number of schools are facing significant funding problems. Financial difficulties affect the quality of education through layoffs of staff, closure of programs, and reduction of services. A school was deemed to be facing financial trouble if it:

  • Is on the “Heightened Cash Monitoring 2” list published by the U.S. Department of Education, reflecting concerns about the school’s financial stability.
  • Has bonds that are rated below-investment grade by Standard & Poor’s.
  • Has received warnings from its accreditor.

This year, we did not consider whether a college had staff layoffs in the most recent academic year because layoffs or furloughs were so widespread in academia after the pandemic shuttered campuses. Counting colleges with layoffs was also a more subjective measure, since layoffs could be an indication of significant financial troubles or an indication that a college is right-sizing its budget for long-term sustainability. We also decreased the weight for this measure from 15% to 10%, in part because after removing any college with two warning flags, this measure affects just 11 colleges in our universe.

Pell Grant recipient outcomes: 10%. We added this factor in 2018 as a way to analyze how well schools help low-income students succeed. We used the number of Pell recipients in the bachelor’s degree-seeking cohort who actually earned a degree six years after enrolling. With this measure, schools get credit for each additional Pell Grant recipient they graduate, so schools that serve a large number of low-income students and serve them well rise to the top. We increased the weight of this measure from 5% to 10%.

Affordability: 40% of weighting

Changes from the 2019 ranking: We added two new net price figures to capture affordability for students from middle-income backgrounds alongside our existing measure of net price for low-income students.

For this factor we used 10 indicators, weighted as shown:

Net price of a degree: 30%. Money uses a unique and — according to many experts — more accurate estimate of college prices. We started with the “sticker” price provided by the college to the federal government. The full cost of attendance includes tuition, fees, room, board, books, travel, and miscellaneous costs. (In calculating tuition and fees for public colleges, in-state amounts were used.)

We then subtracted the average amount of institutional aid provided per student by the college. That gave us an estimate of the net price the college charged an average student for the most recent academic year these data were available. (The data on merit and need-based grants you’ll see reported on our website are numbers reported by the colleges to Peterson’s. They are not used in the rankings.)

Next, we used federal data on the percentage of students who graduate from that college in four, five, and six years respectively to calculate an average time to degree, which for the vast majority of schools is now more than four years. The National Student Clearinghouse Research Center reported in 2016 that students at private, four-year colleges are enrolled an average 4.8 years before graduating. At public colleges, it’s 5.2 years.

We generated an estimated net price for each of six possible years of enrollment, inflating each year’s price slightly, since tuition prices typically rise each year. We then created a weighted average total net price, based on those prices and the proportion of students who complete their degrees in four, five, and six years. A handful of exceptions are made for institutions with longstanding co-op programs, as those programs intentionally extend the length of a degree and no tuition is charged during co-op periods.
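
As a rough illustration of the arithmetic, the sketch below walks through a weighted-average degree price for a hypothetical college. The sticker price, aid amount, completion shares, and 3% inflation figure are all made up for the example; only the structure of the calculation follows the description above.

```python
# Illustrative sketch of the degree-price estimate described above.
# All figures, including the 3% tuition inflation assumption, are hypothetical.

sticker_price = 52_000          # full annual cost of attendance
avg_institutional_aid = 22_000  # average grant/scholarship aid from the school
annual_net_price = sticker_price - avg_institutional_aid

tuition_inflation = 0.03
# Net price for each of six possible years of enrollment, inflated year by year.
yearly_prices = [annual_net_price * (1 + tuition_inflation) ** yr for yr in range(6)]

# Share of students finishing in four, five, and six years (toy figures),
# normalized so the weights sum to 1 among completers.
completion_shares = {4: 0.55, 5: 0.20, 6: 0.10}
total_share = sum(completion_shares.values())
weights = {yrs: share / total_share for yrs, share in completion_shares.items()}

# Weighted average total net price of a degree.
net_price_of_degree = sum(w * sum(yearly_prices[:yrs]) for yrs, w in weights.items())
print(f"Estimated net price of a degree: ${net_price_of_degree:,.0f}")
```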

The federal net price estimate for a year’s costs is typically lower than Money’s estimate because the Department of Education subtracts almost all grants — federal, state and institutional — while we only subtract aid provided by the school, for reasons described in the ‘caveats’ section below.

Money gives the net price sub-factor a very heavy weighting because surveys show that the cost of college is now one of families’ biggest worries. About half of freshmen in 2017 said cost was a “very important” factor in where they ended up attending, according to an annual survey from the Higher Education Research Institute at UCLA.

Net price by income bracket: 20%. The federal government reports the annual average price paid at each school by federal aid recipients broken down into five income groups. In the past, Money looked only at the price paid by low-income families. But given the way college costs have outpaced income growth in the last few decades, many middle-income families also report that paying for college is out of reach. As a result, we added two additional income brackets to capture affordability for middle-income families as well.

We continued to award the most weight to the lowest income group as an indicator of how well the college targets need-based financial aid:

  • Net price for students from families earning $0-$30,000 (10%)
  • Net price for students from families earning $30,001-$48,000 (5%)
  • Net price for students from families earning $48,001-$75,000 (5%)

Educational debt: 20%. Surveys show debt to be another of most families’ biggest worries, and research shows student debt can have long-lasting effects on young adults’ finances.

Our educational debt assessment is based on these two indicators:

  • Student borrowing (15%). Our measure of student borrowing starts with the average dollar value of federal student loans taken out by entering undergraduates, multiplies that by the share of entering undergraduates who take out a federal student loan, then multiplies that again by the institution’s average time to degree. This generates the average total federal student loan amount across all entering undergraduate students (see the sketch after this list). These data are reported by the federal government or calculated using federally reported data. A handful of exceptions are made for institutions with longstanding co-op programs. We capped the dollar value of student loans at $9,500, the federal maximum for independent, first-year students.
  • Parent borrowing (5%). The federal government reports the total amount of parent PLUS loans awarded to parents at each college each year. We divided this number by the school’s total undergraduate enrollment to calculate an average parent PLUS debt per undergraduate student. While other organizations generally don’t include parental debt in their rankings, Money believes parent educational borrowing is a financial burden, and should be an important consideration.
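
For clarity, here is the back-of-the-envelope version of those two calculations for a single hypothetical school. All inputs are made up; only the $9,500 cap and the structure of the arithmetic come from the description above.

```python
# Sketch of the student and parent borrowing measures described above.
# All inputs are hypothetical except the $9,500 cap noted in the text.

# Student borrowing: average federal loan among entering undergraduates
# (capped at $9,500), times the share who borrow, times average time to degree.
avg_federal_loan = min(7_800, 9_500)
share_who_borrow = 0.58
avg_time_to_degree = 4.6        # years, from the school's completion mix

est_student_debt = avg_federal_loan * share_who_borrow * avg_time_to_degree

# Parent borrowing: total parent PLUS dollars divided by undergraduate enrollment.
total_parent_plus = 6_400_000
undergrad_enrollment = 5_200
avg_parent_plus = total_parent_plus / undergrad_enrollment

print(f"Estimated student debt at graduation: ${est_student_debt:,.0f}")
print(f"Average parent PLUS debt per undergraduate: ${avg_parent_plus:,.0f}")
```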

Ability to repay: 30%. The ability to repay loans taken out to finance a college education is another indication of a school’s affordability for its students.

We evaluated ability to repay using these four indicators:

  • Student loan default risk index (SDRI) (5%). Each year, the federal government publishes the number of former students who left college three years ago and have since defaulted on their federal student loans. Using a methodology proposed by The Institute for College Access and Success (TICAS), Money adjusts these numbers for the share of undergraduate students at the college who take out federal student loans (see the sketch after this list). TICAS says this is a fairer and more accurate indicator of the odds that any particular student at the college will end up defaulting on a student loan.
  • Value-added SDRI (5%) We measured whether a college’s SDRI was above or below the rates typical of schools with similar student bodies.
  • Federal student loan repayment (10%). We used the data released on the federal College Scorecard on the percentage of student borrowers who pay down at least $1 on their principal in their first five years of payments. We believe this provides different and important information about alumni financial stability since some schools have started to game default rates by helping students to take advantage of hardship programs that delay defaults until after the three-year measuring window is closed.
  • Value-added student loan repayment (10%). We measured whether a college’s student loan repayment rate was above or below the rates typical of schools with similar student bodies.
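
A simple way to picture the SDRI adjustment, under the assumption that it scales the default rate by the share of students who borrow (consistent with the brief description earlier in this article, though TICAS’s exact formula may differ):

```python
# Hypothetical figures illustrating the default-risk adjustment: instead of
# counting defaults only among borrowers, scale by the share of students who
# borrow at all, so the index reflects the risk facing a typical student.

cohort_default_rate = 0.08   # defaulters as a share of borrowers in the cohort
share_who_borrow = 0.45      # share of undergraduates with federal student loans

student_default_risk_index = cohort_default_rate * share_who_borrow
print(f"SDRI: {student_default_risk_index:.1%} of all students")  # 3.6%
```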

Outcomes: 30% of weighting

Changes from the 2019 rankings: We eliminated the job impact measure, which was based on the results of a PayScale.com question, “Does your work make the world a better place?” While we continue to believe job satisfaction and impact are important measures of alumni success, we don’t believe this question directly relates to whether the respondent’s college had any influence over their answer. We also reduced the amount of weight on the socioeconomic mobility score as the data from Opportunity Insights have not been updated since they were initially released in 2016. In redistributing those weights, we slightly increased the weight on early career earnings (both raw and value-added), major-adjusted earnings, and two workforce measures from the College Scorecard.

For our third category, we used a total of 10 indicators, weighted as shown below.

Earnings 10 years after college entry: 10%. Median earnings for 2014-15, as reported to the IRS, of people who started as college freshmen and received federal aid in 2003 or 2004 and are currently working and not enrolled, published on the U.S. Department of Education’s College Scorecard.

PayScale.com earnings: 40%. Although the earnings data reported on PayScale.com are based on what visitors to the site voluntarily enter, our previous analysis shows they are reliable. PayScale provided Money with aggregated data from an online form that asks website visitors what their job is, what their salary is, where they went to college, what they majored in, and when they graduated. These data add important information beyond the federal earnings data because they cover only students who earned a bachelor’s degree from the school, not dropouts. The PayScale data also screen out those who have earned graduate degrees, and thus may be earning more because of their graduate, rather than undergraduate, education.

  • Early career earnings (10%). Median earnings as reported by graduates with between 0 and 5 years’ work experience.
  • Experienced career earnings (5%). Median earnings reported by those who do not have graduate degrees and have at least 10 years’ work experience.
  • Value-added early career earnings (10%). We analyzed how the school’s early-career earnings compared with schools with similar student bodies.
  • Value-added experienced career earnings (5%). We analyzed how the school’s mid-career earnings compared with schools with similar student bodies.
  • Major-adjusted early career earnings (15%). We adjusted for the impact of majors on each school’s average earnings. We then calculated how the school’s adjusted earnings compared with what would be typical for schools with similar mixes of majors. This allows us to compare, for example, schools that specialize in producing teachers with schools that produce engineers, by seeing how well each performs against other schools that primarily produce teachers or engineers (see the sketch after this list). We’ve increased the weight on this measure in recent years, including this year, as additional research and new data continue to show that what a student studies drives their future earnings potential.
  • Major-adjusted experienced career earnings (5%).
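
One simple way to picture a major-mix adjustment is to build an expected pay figure from a school’s degree mix and compare it with the school’s reported pay. The fields, salaries, and benchmark approach below are illustrative assumptions, not Money’s or PayScale’s actual model.

```python
# Illustrative major-mix benchmark (hypothetical fields and salaries):
# expected pay = sum over fields of (share of degrees x typical early-career pay),
# then compare the school's actual median pay with that expectation.

typical_pay_by_field = {
    "engineering": 72_000,
    "business":    55_000,
    "education":   41_000,
    "humanities":  45_000,
}

school_major_mix = {          # share of the school's degrees in each field
    "engineering": 0.10,
    "business":    0.30,
    "education":   0.20,
    "humanities":  0.40,
}

expected_pay = sum(share * typical_pay_by_field[field]
                   for field, share in school_major_mix.items())

actual_early_pay = 54_000     # the school's reported median early-career pay
print(f"Expected for this mix of majors: ${expected_pay:,.0f}")
print(f"Major-adjusted gap: ${actual_early_pay - expected_pay:+,.0f}")
```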

Share of alumni who are not employed six years after starting school: 15%. These data from the College Scorecard show the percentage of federal financial aid recipients who started as freshmen in 2007-08 or 2008-09 and were neither enrolled in school nor employed in 2014-15.

Share of alumni earning at least $28,000 within six years of starting college: 15%. The federal government publishes these data to show which colleges produce alumni who are earning at least as much as the typical high school graduate. It is based on incoming freshmen in 2007-08 or 2008-09 who received federal financial aid.

Socio-economic mobility: 10%. Opportunity Insights, formerly the Equality of Opportunity Project, published “mobility rate” data for each college on how many low-income students (i.e., with family incomes in the lowest quintile, or below $25,000 a year) were enrolled and how many went on to earn incomes that put them in the top quintile for their age group (earning at least $58,000) by 2014. We used this “mobility rate” data in the rankings as an indicator of which colleges are helping promote upward mobility. It is the best available measure of upward mobility at thousands of colleges, but because the data track students who were born in the 1980s and attended college in the early 2000s, and the numbers haven’t been updated since originally published, the rate does not capture how colleges are promoting mobility more recently. We decreased the weight given to this measure from 20% to 10%.

How We Calculated These Rankings

For each of our data points, for each college, we calculated a “Z-score” — a statistical technique that turns lots of differently scaled data into a standard set of numbers that are easily comparable. Each Z-score tells you how far any particular number — such as, say, a graduation rate of 75% — is from the average for the entire group under consideration. Finally, we ranked the schools according to their total score across our three main factors.
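
As a concrete illustration of the standardization and weighting steps, here is a toy version with three metrics and made-up weights; Money’s actual model combines all 27 factors within the three weighted categories.

```python
# Toy version of the Z-score step: standardize each metric against the group,
# flip the sign of "lower is better" metrics, and combine with weights.
# The metrics, values, and weights here are hypothetical.
import pandas as pd

df = pd.DataFrame({
    "grad_rate": [0.62, 0.81, 0.55, 0.74, 0.68],
    "net_price": [98_000, 145_000, 80_000, 120_000, 105_000],
    "early_pay": [48_000, 62_000, 45_000, 58_000, 51_000],
}, index=["College A", "College B", "College C", "College D", "College E"])

z = (df - df.mean()) / df.std()   # each value's distance from the mean, in SDs
z["net_price"] *= -1              # lower price is better, so flip its sign

weights = {"grad_rate": 0.30, "net_price": 0.40, "early_pay": 0.30}
scores = sum(z[col] * w for col, w in weights.items())
print(scores.sort_values(ascending=False))
```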

Why Isn’t My College Ranked?

If a college does not appear on our list that means it did not meet one of the initial screening requirements outlined above. Those requirements include enrolling at least 500 students and having sufficient data to analyze. But the most common reason colleges don’t make the cut is that their six-year graduation rate is too low.

A college needs a graduation rate that is at or above the median for its institutional category. This year, those medians were 52.1% for public colleges and 62.5% for private, not-for-profit colleges.

You may notice that some colleges with graduation rates below those cutoffs are still included. In those cases, the colleges were originally disqualified but were brought back into the universe because they had exceptionally high value-added graduation rates. That means our analysis shows their graduation rate is much higher than expected given the students who enroll. This year, about 100 colleges were added to the universe this way.

Some Caveats

While we believe our rankings are better than any others out there, we know they are not perfect. Here are some caveats that we are aware of.

Student learning. What students actually learn is a big unanswered question in higher education. The various assessments of college student learning are controversial; few colleges use them, and very few of those that do release the results publicly. In addition, we have not been able to find good data on basic indicators of academic rigor, such as the number of pages of writing or reading required per assignment.

Standardized Tests. This spring, colleges made a near-universal shift to “test-optional” admissions, giving applicants the choice of whether to submit SAT or ACT scores. In most cases, the changes were announced as temporary responses to the pandemic, and it’s unclear how many colleges will stick with the policy. But even before this spring, an increasing number of colleges were going test-optional. If the trend grows to the point where a large minority of students at colleges in Money’s universe do not submit scores, we’ll have to evaluate the effect that has on our rankings, particularly our value-added assessments, which rely heavily on scores to predict academic performance.

Geographical cost of living adjustments. Some colleges in low-wage, low-living-cost areas, such as parts of the South and Midwest, may get lower rankings because we have not adjusted the earnings data for cost of living. While there are some factors in our rankings that lessen the impact of any geographic wage differential (including the College Scorecard employment data, student loan default rate, and debt repayment rate), we hope to address this issue more thoroughly in future rankings.

Measuring economic mobility. The data we use from Opportunity Insights are now a few years old, but they remain the best available in a national dataset. We recognize other experts have recently suggested different ways of measuring how well a college serves low-income students and promotes socioeconomic mobility, and we will continue to investigate whether we should replicate their models to update our measure in the future.

Out-of-state public college tuition. Many students are interested in attending public colleges out of state. Colleges charge higher prices to out-of-state students, and we haven’t developed a strong cost and value comparison for those cases. Out-of-state students can still use our rankings to assess public colleges on their educational quality and alumni outcomes, but the affordability metrics — specifically net price and average borrowing levels — likely will not apply.

Net prices. Money’s estimated net price of a degree is likely to be higher than the average price actually paid by most families. It is crucial to understand that while the Money net price estimate is based on the average price charged by the college, you and your family will pay less than that if your student receives any federal, state, or private scholarships. In addition, our net price is based on the average student’s time-to-degree. Your student may finish in four years. And while many students take more than four years to finish a degree, they aren’t necessarily paying full tuition for the five or six years before they graduate, since they may, for example, take a year off to work. Money attempted to account for this by adjusting the estimated time to degree for all schools with large and established co-op work programs, such as Northeastern University. In addition, Money is not adding to the cost of a degree any amount for “opportunity cost,” which is the amount in earnings a student loses by not finishing a degree on time and delaying entry into the higher-paying job market of college graduates. So, while we may, in some cases, be overestimating the price of a degree, we are also underestimating the total economic expense to a student of schools that don’t speed them to graduation.

Acknowledgements

Former Money senior writer Kim Clark was instrumental in developing our methodology and rankings. Mark Schneider, now the director of the Institute of Education Sciences, and Matthew Soldner, now the commissioner of the National Center for Education Evaluation and Regional Assistance at the Institute of Education Sciences, consulted with Money on the creation of these rankings in 2014, when they were at the American Institutes for Research. Cameron Smither, now at the American Association of State Colleges & Universities, also influenced Money’s rankings while he worked at AIR. Money is thankful to several higher education experts whose insights in recent years helped us make decisions on what data to use and how to weight it.

The Money editorial staff is solely responsible for the final ranking decisions.

All the information for Money’s Best Colleges of 2020, which includes tuition prices with and without aid, percentages, average graduation debt, and early career earnings, was researched by Money. However, Money does not guarantee that this information will apply to each person, nor does it guarantee any early career earnings.
