TL;DR
- In raw, unconditional comparisons, Ivy League graduates earn a lot more than everyone else. The gap is real on paper, but it is not the whole story.
- Once economists compare students who are equally capable and ambitious, the earnings gap mostly disappears. The Ivy didn't make them richer. They were going to do well anyway.
- This has been confirmed across multiple studies over 20+ years, including a 2023 study using near-random waitlist data.
- In STEM specifically, school prestige matters even less; employers judge you on skills you can actually demonstrate.
- The elite school does boost your odds of reaching the top 1%, elite firms, or elite grad schools. It raises the ceiling, not the floor.
- The biggest real exception: first-generation and low-income students. For them, the network effect is genuine and measurable.
- Bottom line for most STEM students: choose the mentor, the lab, and the problems, not the logo.
Picture two high school seniors. Let's call them Maya and Alex.
Both have near-perfect SATs. Both are science fair winners. Both spent their summers in research labs. Both applied to the same eight schools. Both got into Harvard.
Maya packs her bags for Cambridge.
Alex thinks about it for a week, then turns Harvard down and heads to Penn State instead.
Fast forward twenty years.
Who earns more?
The answer might be the most counterintuitive finding in modern economics. And it has been tested, re-tested, poked, prodded, and argued about for more than two decades.
Ready for it?
They earn about the same.
The Headlines Tell a Different Story
Open any magazine ranking colleges and you'll see numbers that make the Ivy League look unbeatable.
By their early thirties, students who attended Ivy-Plus colleges (the eight Ivies plus MIT, Stanford, Chicago, and Duke) earn about $244,000 a year on average. Similar students who went to state flagships earn about $143,000. That gap, roughly $101,000 a year at age 33, is large and specific: it comes from Chetty, Deming, and Friedman (2023) using IRS tax records for more than two million students.
Case closed. Get the Ivy.
Right?
Not so fast. Because there's a problem with that comparison, a problem so sneaky that it fooled researchers for decades.
The Sleight of Hand
Harvard doesn't accept random people off the street.
Harvard accepts the kids who already have a head start. Perfect grades. Top test scores. Research experience. Leadership. A certain kind of drive you can't fake.
So when you compare "Harvard graduates" to "everyone else," what are you really measuring?
Are you measuring what Harvard did to those students?
Or are you measuring what those students were already like before they showed up?
This is called selection bias, and it's one of the most important ideas in all of science.
Here's a classic way to see it. Imagine you compare Olympic sprinters to people at the mall. The sprinters run way faster. Obvious. But would you conclude that the stadium made them fast? That if you moved a shopper to the track, they'd suddenly break a world record?
Of course not. The stadium didn't make the sprinters fast. The stadium selected fast people.
Colleges might be doing the same trick. And nobody knew for sure, because you can't run the real experiment. You can't take one student, split them into two copies, send one to Harvard and one to Penn State, and compare them thirty years later.
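The selection story is easy to make concrete with a toy simulation (every number here is invented): earnings depend only on a student's underlying ability, the "elite" flag merely selects high-ability students, and a large raw earnings gap appears anyway.

```python
import random

random.seed(0)

# Toy model with made-up numbers: earnings depend on underlying
# ability, not on which school a student attends.
students = [{"ability": random.gauss(100, 15)} for _ in range(100_000)]
for s in students:
    s["elite"] = s["ability"] > 125          # the "stadium" selects fast people
    s["earnings"] = 2_000 * s["ability"] + random.gauss(0, 20_000)

def mean(xs):
    return sum(xs) / len(xs)

elite = [s["earnings"] for s in students if s["elite"]]
rest = [s["earnings"] for s in students if not s["elite"]]
print(f"raw gap: ${mean(elite) - mean(rest):,.0f}")  # large, yet the school did nothing
```

The school in this sketch has zero causal effect, yet the unconditional comparison still shows a five-figure gap. That is selection bias in two dozen lines.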
Or can you?
The Detective Work
In the late 1990s, two economists, Stacy Dale at Mathematica and Alan Krueger at Princeton, had an idea so clever it changed the whole field.
They couldn't clone students. But they could find something almost as good.
They tracked down pairs of students with eerily similar profiles. Same test scores. Same high school GPAs. Same ambitions, measured by the list of colleges they applied to. Accepted by the same colleges.
But different choices about where to go.
Think of it this way. If two students both apply to Harvard, Yale, MIT, and State U, and both get into all four, they probably have similar aspirations and similar abilities, whatever test scores can't capture. One of them goes to Harvard. The other goes to State U. Now you have something close to a natural experiment.
Dale and Krueger did this matching for thousands of students and followed them for decades. Their 2002 paper used self-reported earnings from the College and Beyond longitudinal survey. Their 2014 paper used administrative earnings data from Social Security records, real paycheck data instead of self-reports.
Students who went to the fancy schools earned roughly the same as their twins who went somewhere less fancy.
The prestige premium, the huge earnings gap everyone was pointing to, mostly vanished once you compared apples to apples.
A kid who said no to Yale earned about the same as a kid who said yes to Yale, as long as they started out equally capable.
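The matching idea behind Dale and Krueger's design can be sketched in a few lines (the names, colleges, and salaries below are hypothetical): group students by the set of colleges that admitted them, then compare earnings only within groups whose members made different enrollment choices.

```python
from collections import defaultdict

# Hypothetical records: the set of colleges that admitted a student
# stands in for ability and ambition, Dale-Krueger style.
students = [
    {"name": "Maya", "admitted": ("Harvard", "Yale", "State U"), "attended": "Harvard", "earnings": 155_000},
    {"name": "Alex", "admitted": ("Harvard", "Yale", "State U"), "attended": "State U", "earnings": 151_000},
    {"name": "Ben",  "admitted": ("State U",),                   "attended": "State U", "earnings": 95_000},
]

# Group students who were admitted to the exact same set of colleges.
cells = defaultdict(list)
for s in students:
    cells[tuple(sorted(s["admitted"]))].append(s)

# Within each cell, compare earnings across different attendance choices.
for admitted, group in cells.items():
    if len({s["attended"] for s in group}) > 1:
        for s in group:
            print(admitted, s["attended"], s["earnings"])
```

Ben never enters the comparison: nobody with his admission set chose differently, so he has no "twin." Only Maya and Alex, matched on the same admission portfolio, identify the effect of the choice itself.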
But Wait, Maybe the Data Was Old?
Scientists love to stress-test each other's findings. So other researchers dug in.
In 2023, a team at Harvard's own Opportunity Insights tried something even cleaner. They looked at students who had been waitlisted at Ivy League schools. Some got pulled off the waitlist and admitted. Some didn't. Whether you got in was almost random; it depended on how many admitted students said yes first, yield numbers, last-minute decisions by strangers.
It was as close to a coin flip as real college admissions gets.
The researchers followed both groups using IRS tax records.
Here's where it gets interesting. For the typical outcome that parents and magazines usually track, admission to an Ivy-Plus college had a "small and statistically insignificant" impact on mean earnings and on the probability of reaching the upper middle class, compared to attending a highly-selective flagship public. That result is consistent with what Dale and Krueger had found decades earlier.
So for ordinary middle-class career outcomes, the Ivy didn't make people richer on average.
The Part That Does Change Things
Now here's where it gets genuinely interesting. Because the Ivy isn't a scam, exactly. It just does something different from what people think.
The same 2023 study found that while typical earnings barely budged, something else changed dramatically: the odds of ending up at the very top.
Students admitted to an Ivy-Plus college were about 50% more likely to reach the top 1% of the earnings distribution, nearly twice as likely to attend a highly-ranked graduate school, and about 2.5 times as likely to work at a firm in the top decile of the firm-pay distribution.
The pattern at the top of society is striking. Fewer than half of one percent of Americans attend an Ivy-Plus college. Yet those twelve schools produce more than 10% of Fortune 500 CEOs, a quarter of U.S. senators, and three-fourths of Supreme Court justices appointed in the last half-century.
So on the evidence we have, the Ivy does not appear to raise the floor. It appears to raise the ceiling. The waitlist design gets us closer to causal claims than most studies in this space, but it is still a natural experiment on one slice of the admissions process, not a randomized trial of the college itself.
It's like the difference between a reliable daily driver and a lottery ticket. The daily driver gets you to work every day. The lottery ticket mostly changes nothing, but it hands you a chance at a jackpot you wouldn't have otherwise.
For most students, the Ivy is a lottery ticket. Usually it doesn't hit.
For some students, that ticket pays off spectacularly.
Now Let's Talk About Science Specifically
This is where the story gets juicy for anyone reading this magazine.
There is a reason to expect STEM fields to behave differently from everything else when it comes to school prestige, and it sits right in the Dale-Krueger and Chetty-Deming-Friedman findings: in STEM, skills are demonstrable in a way that makes the school name a poor proxy.
In STEM, employers can judge you on what you can actually do. Can you debug the code? Solve the equation? Design the experiment? Read and extend the paper?
The skills are concrete. The curriculum is standardized. A gradient descent algorithm works the same whether you learned it at MIT or at Michigan State. A protein folds the same way in a Princeton lab as in a Rutgers lab.
Employers can test for those skills directly. They don't need to use the school name as a shortcut.
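That point is easy to demonstrate: a textbook algorithm behaves identically wherever you learned it. A minimal gradient descent on f(x) = (x - 3)^2, the same whether taught at MIT or Michigan State:

```python
# Gradient descent on f(x) = (x - 3)^2, whose gradient is 2 * (x - 3).
def minimize(f_grad, x, lr=0.1, steps=100):
    """Repeatedly step downhill along the gradient."""
    for _ in range(steps):
        x -= lr * f_grad(x)
    return x

x_min = minimize(lambda x: 2 * (x - 3), x=0.0)
print(round(x_min, 4))  # converges to the minimum at x = 3
```

An interviewer can hand you this problem and watch you solve it. No diploma required to check the answer.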
This is the opposite of, say, consulting or investment banking, where prestige IS the signal. In those industries, the diploma is half the resume. In science, the diploma is one line on a resume full of projects, papers, and code.
What About Grad School?
This is where the pattern gets a little more nuanced.
For an MBA (a business master's), prestige matters enormously. A study that applied the Dale-Krueger method specifically to MBA programs found big earnings returns to going to a selective program, even after correcting for selection bias. The top-tier MBA really does pay off more than the bottom-tier MBA, for the same student.
But for a master's in science or engineering? The evidence points the other way.
Having the degree at all matters. Georgetown's "College Payoff" report found that, across engineering and computing occupations, a master's adds hundreds of thousands of dollars in lifetime earnings over a bachelor's in the same field. The field is the thing.
The school? Much less.
An MS in computer science from Georgia Tech opens about the same doors as an MS from Cornell. Stanford and MIT have slight recruiting advantages at certain tech companies, but the effect shrinks fast with experience. After a few years on the job, your GitHub and your last project matter far more than your diploma.
And PhDs?
Here the story is even more interesting, and a little sobering.
A VATT Institute study from Finland used a clever trick. Finland's most prestigious engineering school uses a point-based admissions system with a hard cutoff. Students who score just above the cutoff get in. Students who score just below, don't.
This creates two groups of students who are basically identical; the difference between them is a rounding error on an admissions test. It's as close to a real experiment as social science gets.
The researchers followed both groups into the labor market.
In this natural experiment, admission to the elite school came with a much higher chance of graduating from that elite school (obviously), a stronger peer group, and better access to the elite campus.
But on average early-career earnings?
No significant effect.
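The Finnish cutoff design can be mimicked with a toy regression-discontinuity check (scores and salaries below are invented): if crossing the admissions threshold itself adds nothing, earnings just above and just below the cutoff should be nearly identical.

```python
import random

random.seed(1)

CUTOFF = 60  # hypothetical admissions-point threshold

# Toy data: earnings rise with the applicant's score, but crossing
# the cutoff adds nothing; this mirrors the null the Finnish study found.
applicants = []
for _ in range(50_000):
    score = random.uniform(0, 100)
    earnings = 40_000 + 500 * score + random.gauss(0, 5_000)
    applicants.append((score, earnings))

def mean_in(lo, hi):
    vals = [e for s, e in applicants if lo <= s < hi]
    return sum(vals) / len(vals)

# Compare narrow bands just below vs just above the cutoff.
jump = mean_in(CUTOFF, CUTOFF + 2) - mean_in(CUTOFF - 2, CUTOFF)
print(f"estimated jump at cutoff: ${jump:,.0f}")
# Small relative to ~$70k earnings; the residual reflects the slope
# within the bands, not any effect of admission itself.
```

Applicants a point above and a point below the line are interchangeable, so any real jump at the cutoff would be the causal effect of admission. Here, by construction, there is none to find.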
The same story keeps repeating. At every level, bachelor's, master's, PhD, in STEM, the school name does much less than people think.
There's one more wrinkle PhD students should know. A study of the Survey of Doctorate Recipients found that each additional year spent doing a postdoc actually reduces lifetime average earnings by about $3,730 per year. Postdocs are often necessary for academic careers, but they come at a real economic cost, regardless of how fancy your PhD institution was.
The One Big Exception
There's one group for whom the elite school genuinely does change the trajectory. Not by a little. By a lot.
Students from low-income families. Students whose parents never went to college. First-generation students.
Dale and Krueger found it in 2002. The Chetty team found it in 2023. The Finland study found it in 2026. Every study, same pattern.
Why?
Think about what a kid from a well-connected family already has before they even enroll. A parent who can explain how professional jobs work. Family friends who are lawyers, doctors, engineers, professors. Dinner table conversations about office politics, salary negotiation, graduate school applications. Internship leads from an uncle. A network.
A first-generation student walks in with none of that.
When they arrive at Harvard, they suddenly have access to it all. Classmates whose parents run companies. Roommates who already know what an MBA is for. Career offices specifically designed to compensate for a missing background. Alumni who pick up the phone.
For kids who already have the network, the Ivy adds more of the same. For kids who don't, enrolling suddenly puts within reach a network that was out of reach before. The Finland study, the Dale-Krueger matching work, and the Chetty waitlist design all converge on this pattern across very different populations and methods; that convergence, read together rather than any single study in isolation, is about as much causal confidence as observational social science gets.
That access is a real gift. A measurable one.
So What Should You Actually Do?
If you made it this far, you're probably a student, or a parent of one, trying to figure out how to think about all this.
Here's what the evidence actually tells you.
If you love science and you're good at it, the rankings are a much smaller deal than you've been told. A student who's hungry, curious, and hardworking will end up doing well whether they study at Harvard or at their flagship state university. The data is clear on this.
The questions that matter more than rankings are the ones no ranking can answer:
Who's the professor you'd most want to work with? Which lab studies the thing that actually keeps you up at night? Where will you find peers who push you? What's the research culture like?
A brilliant mentor at a mid-ranked school beats a famous school with nobody who has time for you. A lab working on the problem you love beats a lab with a fancier logo but boring work.
If you're a first-generation student or from a low-income family, the picture changes. The network effect is real for you in a way it isn't for peers with a head start. If you get in, the research says go.
If you're aiming for the top 1% (you want to run a hedge fund, found a unicorn, clerk at the Supreme Court), then yes: on the evidence from the waitlist natural experiment, admission is associated with better odds of getting there. It doesn't guarantee anything. It appears to stack the odds.
For everyone else in STEM, the most freeing thing the research says is this: your school name won't make you. And it won't unmake you either.
The Thing Nobody Tells You
Here's the honest truth that most college admissions coverage misses completely.
The student Maya became in our opening scene? She wasn't made by Harvard. She was the kind of person who could get into Harvard before she ever got there. Her curiosity, her grit, her late nights hunched over a textbook because she genuinely wanted to understand, those were the engines driving her career.
Same for Alex at Penn State.
The school is a setting. The person is the story.
You're the person. Choose the setting that helps you become more of who you already are. Choose the labs and mentors and problems that light you up. Run your own experiment.
And trust the evidence: no matter what the magazine rankings say, the outcome was mostly decided before you even opened the acceptance letter.
Sources
Dale, S. B., & Krueger, A. B. (2002). Estimating the payoff to attending a more selective college. Quarterly Journal of Economics, 117(4), 1491–1527. DOI. Preprint: NBER WP 7322.
Dale, S., & Krueger, A. B. (2014). Estimating the effects of college characteristics over the career using administrative earnings data. Journal of Human Resources, 49(2), 323–358. DOI. Preprint: NBER WP 17159.
Chetty, R., Deming, D. J., & Friedman, J. N. (2023). Diversifying society's leaders? The causal effects of admission to highly selective private colleges. NBER Working Paper 31492. Non-technical summary on Opportunity Insights.
Kuuppelomäki, T., Kortelainen, M., Suhonen, T., & Virtanen, H. (2019). Does admission to elite engineering school make a difference? VATT Working Paper 127. Published 2026 in Journal of Human Capital as "Labor market returns to elite STEM education."
Cheng, S. D. (2023). What's another year? The lengthening training and career paths of scientists. PLOS ONE, 18(6), e0285550. DOI.
Arcidiacono, P., Cooley, J., & Hussey, A. (2004 WP; 2008 published). Returns to schooling: Results when the counterfactual is observed. Duke Economics working paper; published 2008 as "The economic returns to an MBA," International Economic Review, 49(3), 873–899.
Carnevale, A. P., Cheah, B., & Hanson, A. R. (2015). The economic value of college majors. Georgetown University Center on Education and the Workforce.
Carnevale, A. P., Rose, S. J., & Cheah, B. (2011). The college payoff: Education, occupations, lifetime earnings. Georgetown University Center on Education and the Workforce.
U.S. Department of Education (ongoing). College Scorecard. Public-domain dataset of post-enrollment earnings by institution.

