scality/Arsenal

S3 server crashes if the access key and secret key are missing

dkilcy opened this issue · 2 comments

GA6.2.0-beta5-rc3

I mistakenly left the access key and secret key empty in my AWS credentials file, and sending a request to S3 with the AWS CLI then crashes the server.

[devops@ws2 S3]$ cat ~/.aws/credentials 
[default]
# acct1/devops
#aws_access_key_id = 1D8ZM2NS0R8V47L474OW
#aws_secret_access_key = mK1tWVk0SL0KNqo7WWT+NbuuxltuPyr9PbXkwJth
#
# acct1
#aws_access_key_id = QYODQNKJ4AQUCC2U7WDS
#aws_secret_access_key = 7PK0rwX5DB6IsVCLIaMS1QAqIbUJTRaN=vRiCEj7
# acct2
aws_access_key_id = UYERP648I7SZLGBUUIB2
aws_secret_access_key = OeIFnKGxiUIqyHE8F1O5GyWym42oFAfTiN5RQgTQ 
# acct3
aws_access_key_id =
aws_secret_access_key =
[devops@ws2 S3]$  aws --endpoint-url http://s3.lab.local s3 ls

An error occurred (502) when calling the ListBuckets operation: Bad Gateway
[devops@ws2 S3]$ 
{"name":"S3","clientIP":"::ffff:10.0.0.11","clientPort":23311,"httpMethod":"GET","httpURL":"/","time":1473384972656,"req_id":"5b5ca79e53067eb1ece5","level":"info","message":"received request","hostname":"app1.lab.local","pid":2545}
{"name":"S3","error":"Cannot read property 'MissingSecurityHeader' of undefined","stack":"TypeError: Cannot read property 'MissingSecurityHeader' of undefined\n    at Object.headerAuthCheck.check (/home/scality/S3/node_modules/arsenal/lib/auth/v2/headerAuthCheck.js:45:29)\n    at Object.auth.setAuthHandler.auth.check (/home/scality/S3/node_modules/arsenal/lib/auth/auth.js:41:43)\n    at Object.auth.setAuthHandler.auth.check.auth.doAuth.vault.authenticateV2Request [as doAuth] (/home/scality/S3/node_modules/arsenal/lib/auth/auth.js:66:22)\n    at Object.callApiMethod (api.js:75:21)\n    at routerGET (routeGET.js:31:13)\n    at checkUnsuportedRoutes (routes.js:33:16)\n    at routes (routes.js:71:12)\n    at Server.<anonymous> (server.js:61:17)\n    at emitTwo (events.js:87:13)\n    at Server.emit (events.js:172:7)\n    at HTTPParser.parserOnIncoming [as onIncoming] (_http_server.js:528:12)\n    at HTTPParser.parserOnHeadersComplete (_http_common.js:103:23)","level":"fatal","message":"caught error","hostname":"app1.lab.local","pid":2545}
{"name":"S3","level":"error","message":"shutdown of worker due to exception","hostname":"app1.lab.local","pid":2545}
{"name":"S3","workerId":277,"level":"error","message":"worker disconnected. making sure exits","hostname":"app1.lab.local","pid":42}
{"name":"S3","workerId":277,"level":"error","message":"worker exited.","hostname":"app1.lab.local","pid":42}
{"name":"S3","workerId":286,"level":"error","message":"new worker forked","hostname":"app1.lab.local","pid":42}
{"name":"S3","bootstrap":["app1.lab.local"],"https":false,"level":"info","message":"bucketclient configuration","hostname":"app1.lab.local","pid":2594}
{"name":"S3","host":"app1.lab.local","port":8500,"https":false,"level":"info","message":"vaultclient configuration","hostname":"app1.lab.local","pid":2594}
{"name":"S3","level":"warn","message":"scality kms selected but unavailable, using file backend","hostname":"app1.lab.local","pid":2594}
{"name":"S3","https":false,"level":"info","message":"Https server configuration","hostname":"app1.lab.local","pid":2594}
{"name":"S3","address":"::","port":8000,"pid":2594,"level":"info","message":"server started","hostname":"app1.lab.local"}
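From the stack trace, the fatal error is an unguarded property access: `headerAuthCheck.js:45` reads `MissingSecurityHeader` off an object that is `undefined` when the request arrives with empty/malformed credentials, and the uncaught `TypeError` kills the worker. Below is a minimal sketch of that crash pattern and a defensive alternative — the function and property names mirror the trace, but this is illustrative code, not Arsenal's actual implementation:

```javascript
// Buggy shape (hypothetical): dereferencing a property on an object that
// can be undefined for empty/malformed auth input. The resulting TypeError,
// if uncaught, takes down the whole worker process.
function checkUnguarded(errors) {
    // Throws "Cannot read property 'MissingSecurityHeader' of undefined"
    // when `errors` is undefined.
    return errors.MissingSecurityHeader;
}

// Defensive shape: validate the input and return an error value, so the
// server can answer with a 4xx auth error instead of crashing.
function checkGuarded(errors) {
    if (!errors || !errors.MissingSecurityHeader) {
        return new Error('MissingSecurityHeader');
    }
    return errors.MissingSecurityHeader;
}

// A request with empty credentials should yield an auth error, not a crash.
const result = checkGuarded(undefined);
console.log(result instanceof Error); // true
```

The general point: per-request auth validation should never be able to throw an uncaught exception, since one bad client request must not restart the worker.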

This issue is fixed by #155.

Fixed by #155