ueno/ruby-gpgme

encoding issues when clearsigning UTF-8 data

ice799 opened this issue · 2 comments

If you GPG-clearsign UTF-8-encoded data, ruby-gpgme raises from inside IOCallbacks:

Encoding::UndefinedConversionError: "\xC3" from ASCII-8BIT to UTF-8
  from /usr/lib/x86_64-linux-gnu/ruby/gems/2.4.0/gems/gpgme-2.0.2/lib/gpgme/io_callbacks.rb:12:in `write'
  from /usr/lib/x86_64-linux-gnu/ruby/gems/2.4.0/gems/gpgme-2.0.2/lib/gpgme/ctx.rb:417:in `gpgme_op_sign'
  from /usr/lib/x86_64-linux-gnu/ruby/gems/2.4.0/gems/gpgme-2.0.2/lib/gpgme/ctx.rb:417:in `sign'
  ...

The issue is that write_cb in the C extension does not associate an encoding with the string it allocates with rb_str_new, so the string gets the default, ASCII-8BIT. That string is then passed to IOCallbacks#write as buffer, still tagged ASCII-8BIT. If the output @io object uses a different encoding, writing the buffer to that @io raises as seen above.
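
The mismatch can be reproduced without gpgme at all: transcoding an ASCII-8BIT string that contains high bytes raises exactly this error, and that is effectively what the write to @io ends up attempting when the destination expects UTF-8.

```ruby
# The same kind of bytes write_cb hands to IOCallbacks#write:
buffer = "caf\xC3\xA9".b   # valid UTF-8 bytes, tagged ASCII-8BIT

buffer.encode(Encoding::UTF_8)
# => Encoding::UndefinedConversionError: "\xC3" from ASCII-8BIT to UTF-8

# Retagging instead of transcoding works, because the bytes really
# are valid UTF-8:
buffer.force_encoding(Encoding::UTF_8)  # => "café"
```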

To fix this, the string in the C extension should ideally at least be associated with the default internal encoding; that way the app author can specify which encoding strings should have, and everything works as planned. Alternatively, you could add a method for library users to set the desired encoding, which would take precedence over the default internal encoding.
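
In Ruby terms, the proposed behavior would look roughly like this. This is a sketch only: the real change belongs in write_cb in the C extension (e.g. via rb_enc_associate), and GPGME.data_encoding is a made-up name for the suggested setter.

```ruby
# Hypothetical: pick the encoding to tag the buffer with. A
# user-settable value (GPGME.data_encoding, invented for this
# sketch) would win over Encoding.default_internal; with neither
# set, behavior stays as it is today (ASCII-8BIT).
encoding = GPGME.data_encoding || Encoding.default_internal
buffer.force_encoding(encoding) if encoding
```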

For an example of how to do this, check out the yajl-ruby code.

For anyone following along, there are two ways to avoid this bug:

  1. Write a wrapper IO object that transcodes the data from ASCII-8BIT to the desired output encoding and then writes it to the "real" IO object for you (see the sketch after this list), or
  2. Apply a patch like this, which converts the string to the Ruby internal encoding. The patch applies cleanly against 2.0.2 for me and fixes the issue.
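
Here is a sketch of option 1, assuming the destination expects UTF-8. Retagging with force_encoding is safe for clearsigned output, since it is ASCII armor wrapped around the original UTF-8 message bytes.

```ruby
# Minimal wrapper: retag the raw ASCII-8BIT buffers that
# ruby-gpgme's IOCallbacks hands us before writing them out.
class TranscodingIO
  def initialize(io, encoding = Encoding::UTF_8)
    @io = io
    @encoding = encoding
  end

  def write(buffer)
    @io.write(buffer.dup.force_encoding(@encoding))
  end

  # Pass through anything else IOCallbacks may call (read, seek, ...).
  def method_missing(name, *args, &block)
    @io.send(name, *args, &block)
  end

  def respond_to_missing?(name, include_private = false)
    @io.respond_to?(name, include_private) || super
  end
end
```

Pass TranscodingIO.new(real_io) wherever you would have passed the IO itself, e.g. as the :output option to the GPGME::Crypto helpers.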

@ice799 I've been looking at this issue.

This error can be avoided if the IO is opened in "binary mode".
Instead of using something like: File.open("file", "w")
we should do: File.open("file", "wb")
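
For example, a sketch assuming a default GnuPG signing key is configured (GPGME::Crypto#sign with mode: GPGME::SIG_MODE_CLEAR is one way to drive clearsigning):

```ruby
require 'gpgme'

crypto = GPGME::Crypto.new

# "wb" opens the file in binary mode: no write-time transcoding,
# so the ASCII-8BIT buffer from the C extension is written verbatim.
File.open("message.asc", "wb") do |file|
  crypto.sign("caf\u00E9 in UTF-8\n",
              mode:   GPGME::SIG_MODE_CLEAR,
              output: file)
end
```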