gliderlabs/herokuish

Error during installation: manifest not found for v0.5.40

ngrichyj4 opened this issue · 16 comments

It seems a new version, 0.5.40, has been released, but it hasn't been tagged yet on Docker Hub.

Importing herokuish into docker (around 5 minutes)
Error response from daemon: manifest for gliderlabs/herokuish:v0.5.40-18 not found: manifest unknown: manifest unknown
Error response from daemon: manifest for gliderlabs/herokuish:v0.5.40-20 not found: manifest unknown: manifest unknown
Error response from daemon: manifest for gliderlabs/herokuish:v0.5.40-22 not found: manifest unknown: manifest unknown
Error response from daemon: No such image: gliderlabs/herokuish:v0.5.40-18
Error response from daemon: No such image: gliderlabs/herokuish:v0.5.40-20
Error response from daemon: No such image: gliderlabs/herokuish:v0.5.40-22

Currently unable to update Dokku instances because of this issue. Looks like the last Docker Hub update was 14 days ago:

https://hub.docker.com/r/gliderlabs/herokuish/tags
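
For what it's worth, you can check which tags actually exist straight from the Docker Hub API instead of the web UI. This assumes the current v2 response shape (a results array with a name field on each entry), so treat the jq filter as a best-effort sketch:

# list the most recent tags published for gliderlabs/herokuish
curl -fsSL "https://hub.docker.com/v2/repositories/gliderlabs/herokuish/tags/?page_size=25" | jq -r '.results[].name'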

Same here.

For anyone whose dokku/etc. install broke because of this, this will get you back to a working state (on Debian/Ubuntu):

sudo apt install herokuish=0.5.39
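
If apt later tries to jump back onto the broken 0.5.40, holding the package keeps you pinned until a fixed release lands. This is plain apt behaviour, nothing herokuish-specific:

# keep apt from upgrading herokuish past the pinned version
sudo apt-mark hold herokuish
# once a fixed release is published, release the hold again
sudo apt-mark unhold herokuish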

What an embarrassing oversight

Quite embarrassing. I triggered a release and it failed, but then I got sick (still recovering, mind you, pretty sure it's Covid) and haven't had time to look into the problem.

It's comments like that which leave me tempted to leave this broken.

What an embarrassing oversight

The only embarrassing thing about your comment is that you thought it was worth making.

Thanks for all your hard work @josegonzalez. Synchronising releases to multiple targets is tricky, unsurprising that we get occasional mismatches. 👍

Quite embarrassing. I triggered a release and it failed, but then I got sick (still recovering, mind you, pretty sure it's Covid) and haven't had time to look into the problem.

It's comments like that which leave me tempted to leave this broken.

Hell, everyone gets sick once in a while. But how a failing release ended up in the repos... That's beyond me.

But how a failing release ended up in the repos... That's beyond me.

Clearly.

What an embarrassing oversight

The only embarrassing thing about your comment is that you thought it was worth making.

Thanks for all your hard work @josegonzalez. Synchronising releases to multiple targets is tricky, unsurprising that we get occasional mismatches. 👍

I'm sorry I so deeply offended you with my comment. I do not doubt there's a lot of effort going into this project, but a failing install/update might not convey that to other users.

Yeah, please uninstall dokku and any usage of it from your systems; I'd rather not support someone who just complains instead of actually trying to look into the issue.

For anyone who has more brain cells than I do at this point (literally feel like dying here, though at least much better than the past few days), please take a look at the GitHub actions on the release branch. I feel like I'm forgetting something there, like asset building, or maybe it's trying to run all tests against non-amd64 builds (and that will definitely fail, as the official Heroku buildpacks almost certainly don't work on non-amd64).

I should be fine by Monday to take a look at this. I think @michaelshobbs can also pull the package from packagecloud to unblock new dokku installs (the credentials should be in 1password).

Thanks for all your hard work, @josegonzalez. I'm very sorry you're sick, and hope you feel better soon. I'll take a look at the actions on the release branch, though I may not be able to shed any light -- we've had a pretty hard time getting multi-arch builds to work.

For anyone else who wants to take a look, I believe the build in question is here and the failure looks like

#41 [linux/amd64 builder 5/5] RUN go build -a -ldflags "-X main.Version=v0.5.40" -o herokuish .
#41 35.60 # herokuish
#41 35.60 ./herokuish.go:71:16: undefined: Asset
#41 35.60 ./herokuish.go:96:38: undefined: Asset
#41 35.60 ./herokuish.go:98:46: undefined: Asset
#41 ERROR: process "/bin/sh -c go build -a -ldflags \"-X main.Version=$VERSION\" -o herokuish ." did not complete successfully: exit code: 2

Here's herokuish.go and .github/workflows/release.yml.
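
For anyone else digging in: Asset isn't defined in herokuish.go itself, so my guess is that it comes from a generated bindata file that embeds the buildpack scripts, and the release workflow is skipping that generation step before go build runs. Roughly, the build order would need to look something like this (the go-bindata fork, output name, and paths are guesses for illustration, not copied from the Makefile):

# regenerate the embedded assets before compiling (assumed go-bindata-style setup)
go install github.com/kevinburke/go-bindata/go-bindata@latest
go-bindata -o bindata.go -pkg main include/...
# then the build from the failing step should be able to find Asset
go build -a -ldflags "-X main.Version=v0.5.40" -o herokuish .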

I think this should be fixed but am rebuilding everything locally to get things pushed up.

Looks like the latest version now installs successfully. I'll have a bunch of follow-ups here - including to the release process, which is a bit funky due to past me trying to be clever and lazy - but things should be working for folks.

Aside, I've also blocked @codesalatdev from commenting or participating in the Gliderlabs and Dokku orgs on GitHub. I don't really make a ton of money from donations to the Dokku project, and certainly not enough for someone to call my work in my spare time an embarrassment. I wouldn't accept a job where that happened either, so there isn't a number you could pay me to feel denigrated. I feel everyone working on OSS is entitled to a bit of respect, whether or not their work is deemed "good" in the eyes of others, and especially when that work is being given away for free.

To tack on to that last thought, no one is owed anything for any given OSS project. Heck, this one is even MIT licensed, which quite literally gives me no liability and provides no warranties for users. If for whatever reason I stopped working on this tomorrow - because I have work commitments, my computer broke, my apartment caught on fire, my life made it impossible to contribute, I died, or I just didn't want to - then that still doesn't give anyone any entitlements surrounding this or any other project I contribute to. If I left the project in a broken state on purpose, I still wouldn't owe anyone anything (though I might feel bad about it). If someone doesn't like that, then I advise you to pay for a service where you can complain to customer support (Heroku provides a very nice hosting platform for $7 an app!).

Whether or not I got a belated apology (and one that doesn't even seem sincere) is moot - do better in the future.

Thanks for your hard work on this project, @josegonzalez; I hope you've fully recovered from the sickness! Otherwise, don't worry too much about the occasional bad release; most of us know how to install a previously working version instead of blaming you here ;)