FinalsClub/karmaworld

buildpack no longer finishes, seems to error with libffi

btbonval opened this issue · 12 comments

Since the buildpack repo (https://github.com/FinalsClub/heroku-buildpack-karmanotes) has no issue tracker, I'll put this issue here.

Having problems with libffi. Last successful build was 8 days ago. No one else running forks of this has updated their repos, so either they don't have the problem or haven't pushed a fix for it yet.

-----> Noticed cffi. Bootstrapping libffi.
curl: /app/.heroku/vendor/lib/libsasl2.so.2: no version information available (required by /usr/lib/libldap_r-2.4.so.2)

gzip: stdin: not in gzip format
tar: Child returned status 1
tar: Exiting with failure status due to previous errors

 !     Push rejected, failed to compile Python app

To git@heroku.com:karmanotes-beta.git
 ! [remote rejected] note-editing-merge-more -> master (pre-receive hook declined)
error: failed to push some refs to 'git@heroku.com:karmanotes-beta.git'

I have no idea what this is, but if I had to guess, the downloaded file is an HTTP error page or something rather than the expected gzipped tar file curl'd here:
https://github.com/FinalsClub/heroku-buildpack-karmanotes/blob/master/bin/steps/libffi#L20
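For what it's worth, this failure mode could have been caught at download time. A hedged sketch of a hardened fetch step (the function name and URL handling are illustrative, not the buildpack's actual code): `curl -f` exits non-zero on HTTP 4xx/5xx instead of saving S3's XML error body, and a two-byte magic check rejects anything that isn't gzip before tar ever sees it.

```shell
#!/bin/sh
# Hypothetical hardened download step; the function name is illustrative.
fetch_gzipped() {
  url=$1; out=$2
  # -f: fail on HTTP errors instead of writing the error page to $out
  curl -fsSL -o "$out" "$url" || return 1
  # gzip streams start with the magic bytes 1f 8b; reject anything else
  [ "$(head -c 2 "$out" | od -An -t x1 | tr -d ' ')" = "1f8b" ]
}
```

With that guard in place, the buildpack would have failed loudly on the missing bucket instead of handing tar an XML document.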

Myth confirmed.

$ curl -s -L -o /tmp/tmp-libffi.tar.gz https://s3.amazonaws.com/rt-uploads/libffi-3.0.tgz
$ file /tmp/tmp-libffi.tar.gz 
/tmp/tmp-libffi.tar.gz: XML document text

Upstream file no longer in that bucket.

<Error><Code>NoSuchBucket</Code><Message>The specified bucket does not exist</Message><BucketName>rt-uploads</BucketName>

Current libffi package traces back to a binary file ostensibly uploaded by @kennethjiang
FinalsClub/heroku-buildpack-karmanotes@3bb5fab

Checking some peers in the network, it looks like forks have not changed that static file location.

I suppose I need to figure out what that file is supposed to contain, generate it, and host it on our CloudFront CDN. Let's see if the process is documented somewhere... Nope. It's just a mysterious file.

Don't know what changes were involved, but the older URL still works.

$ curl -s -L -o /tmp/tmp-libffi2.tar.gz https://s3-us-west-2.amazonaws.com/mfenniak-graphviz/libffi-3.0.tgz
$ file /tmp/tmp-libffi2.tar.gz 
/tmp/tmp-libffi2.tar.gz: gzip compressed data, from Unix, last modified: Sun Jun 16 12:23:32 2013

I might be able to reverse engineer the file's contents from one of our running Heroku apps.
https://github.com/FinalsClub/heroku-buildpack-karmanotes/blob/master/bin/steps/libffi#L20

$ heroku run bash
/ $ cd /app/vendor/libffi-3.0/
~/vendor/libffi-3.0 $ du --summarize -h *
136K    lib
56K     share

Since the buildpack just untar-gzipped everything in there, if I can get those contents out, I can tar/gzip them back up and host the result. Now, what dirty tricks can I use to get this file off of there? Perhaps heroku run tar ... redirected into a local file?

Not quite. heroku tools won't transfer binary files.

$ heroku run 'tar -c /app/vendor/libffi-3.0 2>/dev/null' > libffi-3.0.tar
 !    Heroku client internal error.
 !    Search for help at: https://help.heroku.com
 !    Or report a bug at: https://github.com/heroku/heroku/issues/new

    Error:       invalid byte sequence in UTF-8 (ArgumentError)
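In hindsight, base64 (part of coreutils, so more likely than xxd to exist on the dyno, though I haven't verified that) would keep the stream pure ASCII and sidestep the UTF-8 error. The heroku invocation is an untested sketch shown only as a comment; the encode/decode roundtrip itself is demonstrated on a throwaway local directory.

```shell
#!/bin/sh
# Dyno side (untested sketch):
#   heroku run 'tar -c /app/vendor/libffi-3.0 2>/dev/null | base64' > dump.b64
# then strip heroku's "Running ..." banner line and decode locally.
# The roundtrip, demonstrated on a throwaway directory:
mkdir -p /tmp/demo/libffi-3.0 /tmp/demo-out
printf 'fake shared object' > /tmp/demo/libffi-3.0/libffi.so
tar -C /tmp/demo -c libffi-3.0 | base64 > /tmp/demo.b64   # ASCII-safe stream
base64 -d /tmp/demo.b64 | tar -C /tmp/demo-out -x         # back to files
```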

Looks like xxd will convert binary to/from ASCII hex.
http://stackoverflow.com/questions/13160309/conversion-hex-string-into-ascii-in-bash-command-line

... but xxd is not installed.

$ heroku run 'tar -c /app/vendor/libffi-3.0 2>/dev/null | xxd' > libffi-3.0.tar.hex
$ cat libffi-3.0.tar.hex 
Running `tar -c /app/vendor/libffi-3.0 2>/dev/null | xxd` attached to terminal... up, run.6542
bash: xxd: command not found

Alright, this seems to work: od was available, and in local experiments I was able to convert od -Ax -t x1 output back to binary using xxd -r.
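A local sanity check of the hex roundtrip. This variant uses od -An (bare hex, no offset column), which I'm certain xxd -r -p reads back; whether plain xxd -r also tolerates the offset column from -Ax is worth verifying before trusting a large transfer to it.

```shell
#!/bin/sh
# Roundtrip a few awkward bytes (NUL, high bit set) through an ASCII hex dump.
printf '\037\213\000\377' > /tmp/sample.bin
od -An -t x1 /tmp/sample.bin > /tmp/sample.hex   # bare hex, no offsets
xxd -r -p /tmp/sample.hex > /tmp/sample.out      # plain-hex reverse mode
cmp /tmp/sample.bin /tmp/sample.out              # byte-identical?
```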

The problem is that the file size is not consistent.

$ heroku run 'cd /app/vendor/libffi-3.0; tar -cz . | wc -c'
Running `cd /app/vendor/libffi-3.0; tar -cz . | wc -c` attached to terminal... up, run.8856
51302
$ heroku run 'cd /app/vendor/libffi-3.0; tar -cz . | wc -c'
Running `cd /app/vendor/libffi-3.0; tar -cz . | wc -c` attached to terminal... up, run.7479
51637
$ heroku run 'cd /app/vendor/libffi-3.0; tar -cz . | wc -c'
Running `cd /app/vendor/libffi-3.0; tar -cz . | wc -c` attached to terminal... up, run.1327
50547

The size is consistent without compression, but the contents seem to change. No idea why. Maybe tar is walking the files in a different order each time?

$ heroku run 'cd /app/vendor/libffi-3.0; tar -c . | wc -c'
Running `cd /app/vendor/libffi-3.0; tar -c . | wc -c` attached to terminal... up, run.2219
153600
$ heroku run 'cd /app/vendor/libffi-3.0; tar -c . | wc -c'
Running `cd /app/vendor/libffi-3.0; tar -c . | wc -c` attached to terminal... up, run.9355
153600
$ heroku run 'cd /app/vendor/libffi-3.0; tar -c . | md5sum'
Running `cd /app/vendor/libffi-3.0; tar -c . | md5sum` attached to terminal... up, run.8747
84b89cd1b9eb857f0ad87750c0f5e5fd  -
$ heroku run 'cd /app/vendor/libffi-3.0; tar -c . | md5sum'
Running `cd /app/vendor/libffi-3.0; tar -c . | md5sum` attached to terminal... up, run.7152
41a278ce6950894839f59b56208bf4ab  -
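One plausible (unverified) explanation: every heroku run boots a fresh dyno, and tar archives files in whatever order readdir returns them, so the same bytes land in a different order. That would keep the uncompressed size constant while changing the hash, and would also let the gzip ratio, hence the compressed size, drift between runs. GNU tar grew flags for exactly this problem; a sketch, assuming a tar new enough to support --sort (the dyno's may well not be):

```shell
#!/bin/sh
# Build the same directory into a byte-identical .tgz twice.
mkdir -p /tmp/repro/d
printf 'a' > /tmp/repro/d/one
printf 'b' > /tmp/repro/d/two
mktar() {
  tar --sort=name --owner=0 --group=0 --numeric-owner \
      --mtime='2013-06-16 00:00:00 UTC' \
      -cf - -C /tmp/repro d | gzip -n   # -n: no timestamp in the gzip header
}
mktar > /tmp/a.tgz
mktar > /tmp/b.tgz
cmp /tmp/a.tgz /tmp/b.tgz   # identical on every run
```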

I tried to download an uncompressed version, but it was clearly truncated before heroku finished (the last byte had its first nibble but no second). Converted back to binary, the file was only 47573 bytes (compare to 153600).

I just don't see how I can get these files off of the Heroku slug.

$ heroku run bash
Running `bash` attached to terminal... up, run.4029
$ cd /app/vendor/libffi-3.0/
~/vendor/libffi-3.0 $ tar -cz . > /tmp/libffi-3.0.tgz
~/vendor/libffi-3.0 $ cd /tmp/
/tmp $ ssh-keygen -b 768 -t rsa
Generating public/private rsa key pair.
...
# temporarily copy id_rsa.pub into ~/.ssh/authorized_keys on an internet accessible machine
/tmp $ scp -i ./id_rsa libffi-3.0.tgz me@machine:
# confirm md5s on either end
# remove key from authorized_keys

poof. extraction complete. now to put it on s3 or something.

libffi-3.0.tgz is now hosted with our custom packages.
https://s3.amazonaws.com/karmanotes-buildpack/libffi-3.0.tar.gz

need to update the buildpack repo.