wilkelab/ggridges

geom_density_ridges_gradient produces stripey gradient


Using the example from the vignette:

library(ggplot2)
library(ggridges)

ggplot(lincoln_weather, aes(x = `Mean Temperature [F]`, y = Month, fill = stat(x))) +
    geom_density_ridges_gradient(scale = 3, rel_min_height = 0.01) +
    scale_fill_viridis_c(name = "Temp. [F]", option = "C") +
    labs(title = 'Temperatures in Lincoln NE in 2016')

The resulting gradients are stripey, i.e. there are vertical striations in what should be a smooth color gradient. Here's an example. Note that I've added alpha = 0.5 to make the stripes more visible, although they are also visible without it:

[screenshot: stripe_example — vertical striations visible in the gradient fill]

Details:

  • Computer: MacBook Pro (M1) running macOS 11.2.3
  • R version: 4.0.3
  • RStudio version: 1.3.1093
  • ggridges version: 0.5.3
  • ggplot2 version: 3.3.3
  • viridis version: 0.6.0
  • viridisLite version: 0.4.0

I tried to test with the development version of ggridges, but it kept throwing the error:

Error in get0(oNam, envir = ns) : 
  lazy-load database '/Library/Frameworks/R.framework/Versions/4.0/Resources/library/ggridges/R/ggridges.rdb' is corrupt

I can confirm that the issue appears outside of RStudio as well.

Setting gradient_lwd to a value > 1 works. I had already found this workaround, but none of the online examples I looked at use it, and values of 0.5-0.7 only improved things minimally. The docs say 0 is ideal, so I didn't try anything larger… until just a moment ago. I was curious what gradient_lwd actually does, so I set it to 5 and saw that the artifacting clears up completely if the value is high enough. Somewhere around 1.5 seems to be the sweet spot in my case (see the sketch below). So I guess this issue can be closed. Maybe someone else will benefit from my battle 💪.
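For anyone landing here later, this is the vignette example with the workaround applied. It's just a sketch of what worked on my machine; the 1.5 is not a universal constant and will depend on your graphics device and output size:

library(ggplot2)
library(ggridges)

# gradient_lwd draws a line of the given width around each gradient stripe,
# so neighbouring stripes overlap slightly and the seams disappear.
# ~1.5 was enough on my setup; adjust per device.
ggplot(lincoln_weather, aes(x = `Mean Temperature [F]`, y = Month, fill = stat(x))) +
    geom_density_ridges_gradient(scale = 3, rel_min_height = 0.01, gradient_lwd = 1.5) +
    scale_fill_viridis_c(name = "Temp. [F]", option = "C") +
    labs(title = 'Temperatures in Lincoln NE in 2016')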

The gradients are drawn as many thin stripes placed next to each other, and depending on the precision of the graphics device and rounding errors, you end up seeing the stripes. gradient_lwd draws a line of the given width around each stripe, which causes adjacent stripes to partially overlap. This can make things better or worse, depending on many different factors. Mathematically, gradient_lwd should be zero, but that almost never works in practice. I would set it to the smallest value that works for the given graphics device and the desired output size of the image.
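One rough way to find that smallest value is to render the plot at the intended output size with a few candidate values and compare by eye. The helper below (test_lwd and the file naming are made up for illustration; the ggsave settings should match your real output device and size) is one possible sketch:

library(ggplot2)
library(ggridges)

# Hypothetical helper: save the same plot with a candidate gradient_lwd at the
# intended output size, so the smallest artifact-free value can be picked by eye.
test_lwd <- function(lwd) {
  p <- ggplot(lincoln_weather,
              aes(x = `Mean Temperature [F]`, y = Month, fill = stat(x))) +
    geom_density_ridges_gradient(scale = 3, rel_min_height = 0.01,
                                 gradient_lwd = lwd) +
    scale_fill_viridis_c(name = "Temp. [F]", option = "C")
  ggsave(sprintf("ridges_lwd_%.1f.png", lwd), p, width = 7, height = 5, dpi = 300)
}

for (lwd in c(0, 0.5, 1, 1.5, 2)) test_lwd(lwd)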

Thanks for this explanation. I suspected something like this was going on, but I wasn't sure. I vaguely recall there being some artifacting on an older Intel Mac, but nothing as pronounced as now. Does this imply some kind of issue with newer versions of Quartz? Is there a way to modify Quartz parameters directly? I've taken a look at the grDevices functions, but I haven't found anything useful yet.
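In case it helps anyone, these are the grDevices knobs I've been poking at so far. quartz.options() sets defaults for newly opened Quartz devices; antialias is the parameter that seemed most likely to be relevant, though I haven't confirmed it changes anything about the striping:

# Inspect the current Quartz device defaults (macOS only)
grDevices::quartz.options()

# Change a default for subsequently opened Quartz devices
# (whether antialiasing is actually related to the striping is just a guess)
grDevices::quartz.options(antialias = FALSE)

# Or open a device explicitly with specific settings
grDevices::quartz(width = 7, height = 5, antialias = FALSE)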