shesek/spark-wallet

Server fails with EISDIR when directory named config exists in $HOME

hossbeast opened this issue · 4 comments

Just started trying this out. I ran

 npm install -g spark-wallet

Now when I try to start spark-wallet, I get an error.

1 master % spark-wallet
internal/fs/utils.js:332
    throw err;
    ^

Error: EISDIR: illegal operation on a directory, read
    at Object.readSync (fs.js:617:3)
    at tryReadSync (fs.js:382:20)
    at Object.readFileSync (fs.js:419:19)
    at Object.<anonymous> (/usr/lib/node_modules/spark-wallet/dist/cli.js:81:55)
    at Module._compile (internal/modules/cjs/loader.js:1085:14)
    at Object.Module._extensions..js (internal/modules/cjs/loader.js:1114:10)
    at Module.load (internal/modules/cjs/loader.js:950:32)
    at Function.Module._load (internal/modules/cjs/loader.js:790:12)
    at Function.executeUserEntryPoint [as runMain] (internal/modules/run_main.js:76:12)
    at internal/main/run_main_module.js:17:47 {
  errno: -21,
  syscall: 'read',
  code: 'EISDIR'
}

strace shows spark-wallet opening and reading a file named config in my home directory.

openat(AT_FDCWD, "/home/todd/config", O_RDONLY|O_CLOEXEC) = 17
statx(17, "", AT_STATX_SYNC_AS_STAT|AT_EMPTY_PATH, STATX_ALL, {stx_mask=STATX_ALL|STATX_MNT_ID, stx_attributes=0, stx_mode=S_IFDIR|0755, stx_size=4096, ...}) = 0
read(17, 0x55cac7387110, 8192)          = -1 EISDIR (Is a directory)

Well, I happen to have a directory named config there.
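For what it's worth, the failure is easy to reproduce in plain Node (a minimal sketch using my home directory layout): readFileSync() open()s the directory just fine, and the EISDIR comes from the read() that follows, matching the strace output above.

const fs = require('fs');
const os = require('os');
const path = require('path');

// readFileSync() open()s ~/config successfully even though it is a
// directory; the subsequent read() is what fails with EISDIR.
try {
  fs.readFileSync(path.join(os.homedir(), 'config'));
} catch (err) {
  console.log(err.code, err.syscall); // prints: EISDIR read
}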

Looks like it runs when invoked like this:

% spark-wallet --config ~/.spark-wallet/config
...
HTTP server running on http://localhost:9737

This probably happens because Spark uses the CONFIG environment variable (when available) as the path to the config file. I haven't seen this variable set by default in the Linux environments I use, but it is a pretty generic name and apparently some environments do use it :)
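The startup logic amounts to roughly this (a sketch of the failure mode, not the actual cli.js source):

const fs = require('fs');
const path = require('path');

// Sketch: when $CONFIG is set its value wins, and the file is read
// unconditionally, so a $CONFIG pointing at a directory makes
// readFileSync() throw EISDIR before anything else runs.
const configPath = process.env.CONFIG
  || path.join(process.env.HOME, '.spark-wallet', 'config');
const rawConfig = fs.readFileSync(configPath, 'utf8');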

I will look into renaming this to avoid the conflict, but I need to give some thought to backwards compatibility (CONFIG is used by some downstream projects as part of the Docker setup).
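One backwards-compatible option (just a sketch, and SPARK_CONFIG is a hypothetical name) would be to prefer a namespaced variable and only honor the legacy CONFIG when it actually points at a regular file:

const fs = require('fs');
const path = require('path');

// Sketch of a backwards-compatible lookup; SPARK_CONFIG is hypothetical.
function resolveConfigPath() {
  // A namespaced variable avoids clashing with generic $CONFIG values.
  if (process.env.SPARK_CONFIG) return process.env.SPARK_CONFIG;

  // Keep honoring the legacy CONFIG variable (used by the Docker setup),
  // but only when it points at a regular file, so an unrelated $CONFIG
  // directory falls through to the default instead of crashing.
  const legacy = process.env.CONFIG;
  if (legacy) {
    try {
      if (fs.statSync(legacy).isFile()) return legacy;
    } catch (e) { /* missing path: fall through to the default */ }
  }

  return path.join(process.env.HOME, '.spark-wallet', 'config');
}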

In the meantime, could you please confirm that this is indeed the cause by running unset CONFIG before starting Spark?

Thanks for reporting this!

Yep, you're right about $CONFIG.

todd@euclid ~
130 % env | grep CONFIG
CONFIG=/home/todd/config
XDG_CONFIG_HOME=/home/todd/.config

Thanks for confirming. I will think about a fix for this.