CSStipendRankings/CSStipendRankings

Suggested ranking measures

Closed this issue · 24 comments

Thanks for building this resource.

  1. It would help to disaggregate the 9-month (promised), 12-month (promised), and 12-month (extrapolated) stipends.

  2. It would help to provide a rank based on gross income (before tax, after fees), without the living-wage adjustment.

  3. The MIT Living Wage Calculator does seem to be slightly off, based on experience!
    For instance, Berkeley (Alameda County) at $46,488 vs. UC Santa Cruz at $54K sounds rather unreal, but perhaps rent in Santa Cruz is indeed that high.

Some universities promise only 8 months of funding. For example, Boston University does 8 months. Such an aggregate could be more complicated to implement in practice due to this difference. The alternative would be to include only promised funding vs. funding extrapolated to 12 months. But this also has problems: the University of Michigan guarantees only 9 months of funding, but the practice in CSE is to give 12 months of funding regardless.
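As a sketch, the 12-month extrapolation discussed here could be computed as follows (the function name and dollar figures are illustrative, not the site's actual code):

```python
def extrapolate_to_12_months(promised_total: float, promised_months: int) -> float:
    """Extend a promised stipend to 12 months at the same monthly rate."""
    monthly = promised_total / promised_months
    return monthly * 12

# Hypothetical 8-month and 9-month offers at the same $3K/month rate
# extrapolate to the same annual figure:
print(extrapolate_to_12_months(24_000, 8))  # 36000.0
print(extrapolate_to_12_months(27_000, 9))  # 36000.0
```

This also illustrates why promised and extrapolated figures need separate columns: two offers with very different guarantees can extrapolate to identical annual numbers.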

This sort of transparency is good for the field.
Agreed. However, it would be good to include promised (in the offer letter) vs. extrapolated as a column.

In some universities, while the 12-month stipend assumes a 100% RA appointment in summer for extrapolation, it is often left to PIs to pay RAs at 50% or 100%. This makes blanket 12-month numbers less meaningful.

It would also help to include redacted offer-letter PDFs that include the numbers.
Otherwise, the numbers can be marked as self-reported vs. verified (redacted offer letter).

We will consider adding a feature that allows users to remove unpromised money from the stipend. (And you are both welcome to contribute!)

Shall we add the minimum wage of the state the university is located in? Maybe multiplied by the total work hours per year.

@zhezhouzz I'm strongly against adding the minimum wage of the state for two reasons:
(1) There is a lot of contention about how many hours PhD students "work". This is mostly rooted in the debate over what is "work" and what is "education". Trying to compare the stipend to some hourly rate can get really into the weeds of this debate.
(2) Often the state minimum wage is not reflective of the actual minimum wage of the town the university resides in. For example, the University of Michigan has instituted a minimum wage of $15/hr for student employees, and the city of Ann Arbor has a minimum wage of $14.05/hr for those working on city projects, but the state minimum wage in Michigan is $10.10/hr.

This sort of transparency is good for the field. Agreed. However, it would be good to include promised (in the offer letter) vs. extrapolated as a column.

In some universities, while the 12-month stipend assumes a 100% RA appointment in summer for extrapolation, it is often left to PIs to pay RAs at 50% or 100%. This makes blanket 12-month numbers less meaningful.

It would also help to include redacted offer-letter PDFs that include the numbers. Otherwise, the numbers can be marked as self-reported vs. verified (redacted offer letter).

@animesh-garg Agree with everything written here. Some redacted offer letters (or department webpages) would be immensely helpful for verifying the data. One thing to note here is that until recently, the UMich CSE department did not list the promised summer funding in the official offer letter (after the first year), even though that funding was guaranteed for 5 years; it was well known unofficially that this was the case. Now they do make this clear in the offer letter in an official capacity, due to substantial efforts from a group of faculty members. I do think that offer letters should be more transparent, so this effort could raise awareness that the benefits outlined in an offer letter are not always fully obvious.

@noah-curran I agree with what you said; however, I suggest adding the minimum wage to reflect the "minimum pay to live in a state" instead of the desired pay of a PhD student. The former is objective and based on law; the latter is subjective. I think this measure could be a good complement to the "living cost".

BTW, even if we don't add it, people will easily think of the minimum wage when looking at our stipend ranking.

@zhezhouzz the minimum wage does not measure the minimum pay to live in a state, and it has not measured that for several decades.

ahoho commented

There are some issues with the MIT calculator that I've tried to address in a related project I've done for the Big 10 and CS programs using Jeff Huang's numbers from last year.

It's a great benchmark, but subtracting stipend from their living wage isn't exactly accurate for a couple reasons

  • Graduate assistants don't pay FICA taxes
  • GAs also usually have subsidized health insurance. The healthcare costs in the MIT number comprise both premiums and estimated out-of-pocket expenses, so I'm not sure how best to decouple them, but they're definitely an overestimate

The best fix I came up with is to take the MIT figure less tax & healthcare as a cost-of-living number, which I compare to the stipend less premiums and calculated taxes (this is in the Big 10 sheet; didn't get around to the healthcare numbers for CS).

I do think the rank order is largely unchanged by these factors, but did want to note it. Not sure if I should make a separate issue?
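A minimal sketch of the adjustment described above, with all dollar figures hypothetical:

```python
def adjusted_surplus(stipend, est_taxes, premiums,
                     mit_figure, mit_taxes, mit_healthcare):
    """Compare take-home pay against a bare cost-of-living number.

    The MIT figure less its tax and healthcare components approximates
    cost of living; the stipend less premiums and estimated taxes
    approximates take-home pay.
    """
    cost_of_living = mit_figure - mit_taxes - mit_healthcare
    take_home = stipend - est_taxes - premiums
    return take_home - cost_of_living

# Hypothetical example: a $30K stipend (with $2K taxes, $500 premiums)
# against a $42K MIT figure that includes $5K taxes and $3K healthcare.
print(adjusted_surplus(30_000, 2_000, 500, 42_000, 5_000, 3_000))  # -6500
```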

[Also, sending my support to everyone on strike at UM!]

Some universities promise only 8 months of funding. For example, Boston University does 8 months. Such an aggregate could be more complicated to implement in practice due to this difference. The alternative would be to include only promised funding vs. funding extrapolated to 12 months. But this also has problems: the University of Michigan guarantees only 9 months of funding, but the practice in CSE is to give 12 months of funding regardless.

Generally, most universities are on a 9-month model.
While a small set of institutions may not have the 9-3 split, summer is always treated specially,
since the summer term differs across universities, especially outside the US.

Instead of calling it 9-month, you may call it with and without summer.
This could be made into a toggle.

Instead of calling it 9-month, you may call it with and without summer.

I believe this is what is currently being worked on with the SUMMER tag. But, I think there are a lot of factors at odds here, and we risk adding too many toggles when we deal with all of these conditions for funding. There seem to be at least a few things to consider (and probably more that I'm not aware of):

  • Summer? Yes/No
  • Non-summer 8-month or 9-month?
  • If summer, 0.5 FTE or 1.0 FTE?

It seems the best solution is your solution "to include promised (in offer letter) vs extrapolated as a column". That way we don't have to deal with a ton of toggles. For more information, there could be an info box on hover that gives the breakdown of the final number for each institution. The various data could be required as part of the csv entries and this info box will be automatically populated and generated based on the institutional funding info.

An example for the University of Michigan:
12-months of funding. Summers are 3-months at 0.5 FTE.
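The auto-populated info box could be generated from structured CSV-style entries along these lines (field names here are hypothetical, not the repo's actual schema):

```python
def funding_breakdown(entry: dict) -> str:
    """Render a human-readable breakdown for the hover info box."""
    parts = [f"{entry['ay_months']} AY months at "
             f"${entry['ay_monthly']:,}/month ({entry['ay_status']})"]
    if entry.get("summer_months"):
        parts.append(f"{entry['summer_months']} summer months at "
                     f"{entry['summer_fte']} FTE ({entry['summer_status']})")
    return "; ".join(parts)

# The University of Michigan example above, as a structured entry
# (monthly rate invented for illustration):
umich = {"ay_months": 9, "ay_monthly": 3000, "ay_status": "promised",
         "summer_months": 3, "summer_fte": 0.5, "summer_status": "promised"}
print(funding_breakdown(umich))
# 9 AY months at $3,000/month (promised); 3 summer months at 0.5 FTE (promised)
```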

Thoughts on this?

I think the current summer tag is likely insufficient.

For instance, consider offer letters from the following universities:
Uni A: 9 months promised at 27K (3K/month).
Uni B: 12 months promised at 36K (3K/month).
Uni C: 12 months promised at 45K (3K/month + 6K/month in summer @ 1.0 FTE).
Uni D: 8 months promised at 24K (3K/month).

All four pay equally per month during the academic year, yet they would end up in different places in the ranking.
While summer is not guaranteed at Uni A/D, students are not expected to work for free either!
Similarly, a promise of funding at Uni B/C doesn't always mean an RA position.
They could be required to TA. So students are either going to be TAs, or RAs, or free to find internships.
It is worth noting that even within the same university, a differential may exist between 0.5 FTE and 1.0 FTE during the summer.
What if some students are getting 1.0 FTE at Uni A, while others are getting 0.5 FTE? That would make the Uni A offer better than Uni B's.

Hence, AY salary normalized to monthly is a fair standard to set as the default.
The AY may in a select few places be shorter than 9 months, but this would still yield a more apples-to-apples measure. We do not need to aggregate this with the summer stipend.
This would read:

In a typical month during an active term, when the student is registered full-time, what do they expect to get?

This would also simplify the future inclusion of non-US universities, which may follow slightly different funding schemes.
For instance, at the University of Toronto, PhD student funding, even if guaranteed, is a combination of RA, TA, and fellowships.

Since the summer is rather nuanced, it is better to treat it separately.
We may include the summer stipend as an additional column. It may be included in the ranking via a checkbox toggle.
The summer stipend column would include all additional information:

  • promised vs. extrapolated, distinguished by color scheme.
  • 0.5 FTE vs. 1.0 FTE in summer, distinguished by an asterisk.

So, in this case, the data might look like

| Uni Name | AY (normalized monthly) | AY | Summer |
| --- | --- | --- | --- |
| Uni A | 3K | 27K | NA or (9K) |
| Uni B | 3K | 27K | 9K |
| Uni C | 3K | 27K | 18K* |
| Uni D | 3K | 24K | NA or (9K) |

Note:

  1. NA should not necessarily count negatively, since this is information that is genuinely not available.
    A monthly extrapolation could be used at the 0.5 FTE rate during the summer.
  2. The AY and Summer columns do not need to list how many months, since this may vary across institutions.
    AY covers whatever length is considered a normal academic term at that university.
  3. All four universities rank the same by default.
    However, if annualized for 12 months using either promised or extrapolated sums, the ranking changes to C > B > A > D, with ties broken in favor of more promised money.
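The two ranking modes in note 3 can be sketched as follows, using the hypothetical Uni A-D figures (in $K):

```python
# name: (AY monthly, promised total, annualized total incl. extrapolation)
unis = {
    "Uni A": (3, 27, 36),  # 9 promised months + extrapolated summer (9K)
    "Uni B": (3, 36, 36),  # 12 promised months
    "Uni C": (3, 45, 45),  # 12 promised months incl. 1.0 FTE summer
    "Uni D": (3, 24, 33),  # 8 promised months + extrapolated summer (9K)
}

# Default: AY salary normalized to monthly -- all four tie at 3K/month.
assert len({v[0] for v in unis.values()}) == 1

# Annualized, with ties broken in favor of more promised money.
ranking = sorted(unis, key=lambda u: (-unis[u][2], -unis[u][1]))
print(ranking)  # ['Uni C', 'Uni B', 'Uni A', 'Uni D']
```

Uni A and Uni B tie on the annualized total, so the promised total decides their relative order.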

The best fix I came up with is to take the MIT figure less tax & healthcare as a cost-of-living number, which I compare to the stipend less premiums and calculated taxes (this is in the Big 10 sheet; didn't get around to the healthcare numbers for CS).

We are a bit hesitant to dive into the itemized costs; this may make it too complex for people to maintain and contribute to this repository. The actual living cost for each individual student will be very different based on individual choices and background (e.g., different living arrangements, whether there are childcare needs, etc.), and it is not possible for us to consider all these individual factors.

Also, not all institutions provide free insurance coverage as a benefit, and students may have different co-pays when visiting doctors, or may still have to purchase their own insurance to cover gaps (e.g., many of us at UM purchase additional dental and vision coverage that is not offered by the university, which would be reflected in the MIT estimate). We would be going down the rabbit hole of itemizing costs again from this perspective.

With that being said, we do agree that the living cost estimate, regardless of where it is from, can be subjective and inaccurate, and we will consider adding alternative sources of the living cost (e.g., those calculated by the institutions) so that users can choose which one to use for the ranking purpose.

ahoho commented

@jiong-zhu For reference, I'd strongly recommend against using the university-provided cost of attendance. My data show that universities in high-CoL areas tend to underestimate the true costs, misleading prospective PhDs. The MIT figure uses public benchmarks, usually from government sources, whereas university methods have no transparency.

For example, the university-provided cost of attendance at UMD (which is just outside DC) estimates rent to be equal to that of Indiana University Bloomington. Meanwhile, the Department of Housing and Urban Development estimates a studio in our area to be twice as expensive as one in Bloomington.

I hear what you're saying about healthcare costs, and definitely agree that there is a complexity with healthcare that renders maintenance difficult. That said, it does distort the rankings because I know MIT overestimates healthcare costs for the Big 10 schools by several thousand dollars a year (this is because most put graduates on a heavily subsidized state employee plan; it's one of the few places where public schools can positively differentiate themselves in terms of benefits). I don't think there's any one right way to do it, but my inclination is to remove the healthcare cost from the MIT baseline and count premiums as a fee, then caveat that costs outside premiums are not included. This definitely induces some overhead, but if we maintain information about sources, then maybe it's not too hard?

ahoho commented

@animesh-garg I guess my question is: what is the purpose of this ranking?

Monthly pay rates do not seem to be as informative to prospective graduates, who still need to live and eat over the summer. And they don't strike me as useful when negotiating with administration, who know that prospective graduates will be comparing schools based on the number in the offer. You're completely right that there are a lot of factors that influence pay, but to my mind, the true apples-to-apples ranking is whatever is guaranteed in the offer letter. (I do think that having it as an option is a good idea, though).

These rankings are already provoking discussion among faculty at my institution (presently the last-ranked school). The default rank matters, because it has the potential to influence administrative decisions.

I do agree with you that the guaranteed offers don't give a complete picture. Perhaps it's worth including information on reasonable upper bounds, whatever they may be. For example, I will have had two summers funded as a full-time RA (40 hours per week / 1.0 FTE), and I know that's relatively common at my university. But I know there are others at 20 / 0.5, 30 / 0.75, and who are not funded at all. It sounds like UMich only does 0.5? It looks like the current form asks about the maximum, which seems like a reasonable approach.

ahoho commented

Also just to note—great work on this! I had been trying to get traction at our university with the spreadsheet I linked above, but using the familiar csrankings format on a public website is inspired

@animesh-garg I guess my question is: what is the purpose of this ranking?

I believe it should be to create transparency in stipends and costs while pursuing CS PhDs.
While I don't really agree with the term "ranking" here, this is an invaluable resource for national equity and transparency. It will hopefully be used by prospective students to make informed decisions.

Monthly pay rates do not seem to be as informative to prospective graduates, who still need to live and eat over the summer. And they don't strike me as useful when negotiating with administration, who know that prospective graduates will be comparing schools based on the number in the offer. You're completely right that there are a lot of factors that influence pay, but to my mind, the true apples-to-apples ranking is whatever is guaranteed in the offer letter. (I do think that having it as an option is a good idea, though).

I still argue that the normalized monthly 9-month salary + summer is a better metric.
It retains transparency while providing a way to handle various institutional rules on how long summer is and how much is provided in summer stipends. As I stated in the example, if there is no guaranteed money, just mark it as zero/NA.
If a student has an offer from that university, this then becomes a question for current students and the PI.
We could also collect self-reported data on actual salary and expenses via Drafty.

Indeed, guaranteed numbers are the best. Worth noting that in the monthly-normalization scheme we don't remove them; we just don't base the ranking measure solely on them. We should break ties in favor of more guaranteed money.

However, the issue is with guaranteed vs. actual summer salary.
If we make only the offer-letter number the ranking measure, then students will be misguided, since other places, while not guaranteeing summer funding, may still fund students through the summer. The numbers in offers are often the result of arcane accounting practices!

I appreciate the efforts by the maintainers of this repo. However, it might inadvertently create a detrimental effect if reductive measures become the go-to metric for students.

These rankings are already provoking discussion among faculty at my institution (presently the last-ranked school). The default rank matters, because it has the potential to influence administrative decisions.

Yes, indeed! But this should be treated as a resource.
I would also add two additional features:

  • verified offer letters
  • self-reported salaries and living expenses. Instead of providing only gross aggregates, we could provide median ranges. This could be done with open-source, table-based data using Drafty.

I do agree with you that the guaranteed offers don't give a complete picture. Perhaps it's worth including information on reasonable upper bounds, whatever they may be.
For example, I will have had two summers funded as a full-time RA (40 hours per week / 1.0 FTE), and I know that's relatively common at my university. But I know there are others at 20 / 0.5, 30 / 0.75, and who are not funded at all.
It sounds like UMich only does 0.5? It looks like the current form asks about the maximum, which seems like a reasonable approach.

If offers and salaries are not standard across groups within a university, which is often the case, then median or minimum numbers might be more informative than the maximum.

ahoho commented

@animesh-garg "Ranking" is in the name of this project, so I think we should adhere to that purpose. From the comments of the maintainers here and the website introduction, a larger goal is to encourage "institution[s] to pay a living wage."

To your second point, it seems you're arguing that, in practice, many universities typically offer summer funding even if it is not guaranteed. Indeed, that is often---although far from universally---the case at my institution. Unfortunately, de facto is not de jure. I can tell you first-hand that the lack of guarantees around summer funding causes an immense amount of financial and emotional stress among graduate assistants, even for those who have soft assurances. Internship offers get rescinded; grants get rejected; summer classes are cancelled; professors leave; etc.

There is already effective tie-breaking in place, because a lack of guaranteed summer funding ranks all such universities lower. If anything, I would hope the ranking encourages institutions to provide guarantees. I'm not sure arcane accounting practices ought to be the target of blame, as much as the systems that support those practices. The current ranking, as it stands, is basically measuring the worst-off student.

That said, I think your suggestion to collect data to estimate the expected rate of (summer) funding is a useful additional metric (as I mentioned in the other thread). Knowing the median (& min, max) stipend, if we can avoid staleness issues, would be very useful.

On the other hand, self-reported living costs lead to a very biased metric, because GAs are constrained by what they are paid. To say students at UIUC and UCSD both spend $900/month on housing tells us very little about living conditions. Having a first-principles "basket-of-goods" index (per the MIT or other CoL estimates) seems more appropriate.

self-reported living costs lead to a very biased metric

They mostly provide additional information on actuals, because grad students are much more like each other than like the general populace in spending patterns. It is not an either/or.

@animesh-garg Firstly, thank you for this discussion. I appreciate what you're saying and hope we can find the best way to present this data. I love your ideas of verified (and redacted for personal information) offer letters and reported salaries (and metrics thereof). I agree that transparency is important and this information would help achieve that. I think having the guaranteed minimum drive the ranking, with auxiliary data showing what "could" happen if you land some luck, would improve transparency. It would also alleviate the concern that some low-ranking universities are devoid of well-paying opportunities; indeed, some do have better offers for a few students (I share an anecdote of one of my own experiences of such a case here: #43 (comment)).

self-reported living costs lead to a very biased metric

They mostly provide additional information on actuals, because grad students are much more like each other than like the general populace in spending patterns. It is not an either/or.

I'd agree it is interesting information about the spending patterns of grad workers, but I don't think it yields useful data for the purpose of this ranking. For instance, something I know to happen with many grad workers is that they will "get by" financially through skipping meals. So, their spending pattern would reflect this behavior. Other examples I've heard from grad workers are avoiding medical procedures/consultations, living further from the university and wasting much of their day commuting, or skipping loan payments. I'm sure there are other examples I'm unaware of.

One could interpret the data to mean that grad students are getting by just fine; if they spend around what the stipend amount is, then they are able to make the stipend fit the CoL. (If they are spending more, then we should ask whether they are taking out additional loans or relying on family support. Neither of these should happen. This is why I personally would find it to be interesting data.) But the worrying hidden trait is that they could be exhibiting destructive behavior in order to spend around the stipend amount, especially in areas where the CoL is higher than the stipend. The data you propose omits what they aren't spending their money on but should be, in order to live a happy and healthy life.

ahoho commented

Wanted to raise this again. What is the current policy for the default ranking?

Right now, it appears to include any non-guaranteed funding (our graduate director just updated our numbers to include a 40-hour / week summer appointment, but there isn't data to support that being the norm). It seemed per this discussion that the default rank should be by guaranteed funding?

Wanted to raise this again. What is the current policy for the default ranking?

I just raised this issue to a few others in my own department as needing urgent inspection. Before we do much more, getting a public methodology for what numbers should/shouldn't be included is the top priority. My personal concern was that something like this would happen, and I'm disappointed to hear it already is happening. Thanks for bumping this thread.

BTW - I'm not seeing the commit from your graduate director. Could you point me toward the one you're talking about?

Edit: Sorry, I see it now. I didn't refresh my page apparently.

Now we support only displaying guaranteed funding. I'm closing this issue.