Using the exporter on Kubernetes with Redis on AWS ElastiCache gives the error ```Redis::ConnectionError: Connection lost (ECONNRESET)```
I want to use sidekiq-prometheus-exporter on Kubernetes in the monitoring namespace. The Redis instance whose details I have provided via environment variables is in AWS ElastiCache, and the exporter is giving an error when I run `curl http://127.0.0.1:9292/metrics` on the pod, as follows:
```
~ kubectl version
Client Version: version.Info{Major:"1", Minor:"18", GitVersion:"v1.18.8", GitCommit:"9f2892aab98fe339f3bd70e3c470144299398ace", GitTreeState:"clean", BuildDate:"2020-08-13T16:12:48Z", GoVersion:"go1.13.15", Compiler:"gc", Platform:"darwin/amd64"}
Server Version: version.Info{Major:"1", Minor:"18", GitVersion:"v1.18.8", GitCommit:"9f2892aab98fe339f3bd70e3c470144299398ace", GitTreeState:"clean", BuildDate:"2020-08-13T16:04:18Z", GoVersion:"go1.13.15", Compiler:"gc", Platform:"linux/amd64"}
```
The environment variables for Redis which I am setting in values.yaml are:
REDIS_HOST: "some.endpoint.cache.amazonaws.com"
REDIS_PORT: 6379
REDIS_PASSWORD: "************"
REDIS_DB_NUMBER: "0"
```
127.0.0.1 - - [05/Mar/2021:08:38:36 +0000] "GET /metrics HTTP/1.1" 500 140876 0.0693
Redis::ConnectionError: Connection lost (ECONNRESET)
/usr/local/bundle/gems/redis-4.2.5/lib/redis/client.rb:275:in `rescue in io'
/usr/local/bundle/gems/redis-4.2.5/lib/redis/client.rb:267:in `io'
/usr/local/bundle/gems/redis-4.2.5/lib/redis/client.rb:279:in `read'
/usr/local/bundle/gems/redis-4.2.5/lib/redis/client.rb:131:in `block in call'
/usr/local/bundle/gems/redis-4.2.5/lib/redis/client.rb:248:in `block (2 levels) in process'
/usr/local/bundle/gems/redis-4.2.5/lib/redis/client.rb:389:in `ensure_connected'
/usr/local/bundle/gems/redis-4.2.5/lib/redis/client.rb:238:in `block in process'
/usr/local/bundle/gems/redis-4.2.5/lib/redis/client.rb:325:in `logging'
/usr/local/bundle/gems/redis-4.2.5/lib/redis/client.rb:237:in `process'
/usr/local/bundle/gems/redis-4.2.5/lib/redis/client.rb:131:in `call'
/usr/local/bundle/gems/redis-4.2.5/lib/redis/client.rb:113:in `block in connect'
/usr/local/bundle/gems/redis-4.2.5/lib/redis/client.rb:313:in `with_reconnect'
/usr/local/bundle/gems/redis-4.2.5/lib/redis/client.rb:111:in `connect'
/usr/local/bundle/gems/redis-4.2.5/lib/redis/client.rb:386:in `ensure_connected'
/usr/local/bundle/gems/redis-4.2.5/lib/redis/client.rb:238:in `block in process'
/usr/local/bundle/gems/redis-4.2.5/lib/redis/client.rb:325:in `logging'
/usr/local/bundle/gems/redis-4.2.5/lib/redis/client.rb:237:in `process'
/usr/local/bundle/gems/redis-4.2.5/lib/redis/client.rb:203:in `call_pipelined'
/usr/local/bundle/gems/redis-4.2.5/lib/redis/client.rb:170:in `block in call_pipeline'
/usr/local/bundle/gems/redis-4.2.5/lib/redis/client.rb:313:in `with_reconnect'
/usr/local/bundle/gems/redis-4.2.5/lib/redis/client.rb:168:in `call_pipeline'
/usr/local/bundle/gems/redis-4.2.5/lib/redis.rb:2445:in `block in pipelined'
/usr/local/bundle/gems/redis-4.2.5/lib/redis.rb:69:in `block in synchronize'
/usr/local/lib/ruby/2.7.0/monitor.rb:202:in `synchronize'
/usr/local/lib/ruby/2.7.0/monitor.rb:202:in `mon_synchronize'
/usr/local/bundle/gems/redis-4.2.5/lib/redis.rb:69:in `synchronize'
/usr/local/bundle/gems/redis-4.2.5/lib/redis.rb:2441:in `pipelined'
/usr/local/bundle/gems/sidekiq-5.2.8/lib/sidekiq/api.rb:68:in `block in fetch_stats!'
/usr/local/bundle/gems/sidekiq-5.2.8/lib/sidekiq.rb:97:in `block in redis'
/usr/local/bundle/gems/connection_pool-2.2.3/lib/connection_pool.rb:63:in `block (2 levels) in with'
/usr/local/bundle/gems/connection_pool-2.2.3/lib/connection_pool.rb:62:in `handle_interrupt'
/usr/local/bundle/gems/connection_pool-2.2.3/lib/connection_pool.rb:62:in `block in with'
/usr/local/bundle/gems/connection_pool-2.2.3/lib/connection_pool.rb:59:in `handle_interrupt'
/usr/local/bundle/gems/connection_pool-2.2.3/lib/connection_pool.rb:59:in `with'
/usr/local/bundle/gems/sidekiq-5.2.8/lib/sidekiq.rb:94:in `redis'
/usr/local/bundle/gems/sidekiq-5.2.8/lib/sidekiq/api.rb:67:in `fetch_stats!'
/usr/local/bundle/gems/sidekiq-5.2.8/lib/sidekiq/api.rb:23:in `initialize'
/usr/local/bundle/gems/sidekiq-prometheus-exporter-0.1.15/lib/sidekiq/prometheus/exporter/standard.rb:19:in `new'
/usr/local/bundle/gems/sidekiq-prometheus-exporter-0.1.15/lib/sidekiq/prometheus/exporter/standard.rb:19:in `initialize'
/usr/local/bundle/gems/sidekiq-prometheus-exporter-0.1.15/lib/sidekiq/prometheus/exporter/exporters.rb:38:in `new'
/usr/local/bundle/gems/sidekiq-prometheus-exporter-0.1.15/lib/sidekiq/prometheus/exporter/exporters.rb:38:in `block in to_s'
/usr/local/bundle/gems/sidekiq-prometheus-exporter-0.1.15/lib/sidekiq/prometheus/exporter/exporters.rb:38:in `map'
/usr/local/bundle/gems/sidekiq-prometheus-exporter-0.1.15/lib/sidekiq/prometheus/exporter/exporters.rb:38:in `to_s'
/usr/local/bundle/gems/sidekiq-prometheus-exporter-0.1.15/lib/sidekiq/prometheus/exporter.rb:50:in `call'
/usr/local/bundle/gems/rack-2.0.9/lib/rack/urlmap.rb:68:in `block in call'
/usr/local/bundle/gems/rack-2.0.9/lib/rack/urlmap.rb:53:in `each'
/usr/local/bundle/gems/rack-2.0.9/lib/rack/urlmap.rb:53:in `call'
/usr/local/bundle/gems/rack-2.0.9/lib/rack/tempfile_reaper.rb:15:in `call'
/usr/local/bundle/gems/rack-2.0.9/lib/rack/lint.rb:49:in `_call'
/usr/local/bundle/gems/rack-2.0.9/lib/rack/lint.rb:37:in `call'
/usr/local/bundle/gems/rack-2.0.9/lib/rack/show_exceptions.rb:23:in `call'
/usr/local/bundle/gems/rack-2.0.9/lib/rack/common_logger.rb:33:in `call'
/usr/local/bundle/gems/rack-2.0.9/lib/rack/chunked.rb:54:in `call'
/usr/local/bundle/gems/rack-2.0.9/lib/rack/content_length.rb:15:in `call'
/usr/local/bundle/gems/rack-2.0.9/lib/rack/handler/webrick.rb:86:in `service'
/usr/local/lib/ruby/2.7.0/webrick/httpserver.rb:140:in `service'
/usr/local/lib/ruby/2.7.0/webrick/httpserver.rb:96:in `run'
/usr/local/lib/ruby/2.7.0/webrick/server.rb:307:in `block in start_thread'
```
I have added a ServiceMonitor since I use prometheus-operator, and it shows the target as down as well, with the error `server returned HTTP status 500 Internal Server Error`.
How can I resolve this issue?
Hi, according to the backtrace the error is related to connectivity between the pod running the metrics exporter and the AWS Redis instance. Unfortunately, this issue is more about your infrastructure setup and less about the gem.
If you are using the Docker image from here, I would recommend tracing the connectivity issue from within the pod. You can jump into it and try to establish a connection the same way the exporter does: either use `Sidekiq.redis` directly, or run something like this to test the connection:

```ruby
Sidekiq::Queue.all.map do |queue|
  [queue.name, queue.size, queue.latency]
end
```
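If you want to rule out the queue API itself, a more direct check (a minimal sketch, assuming the pod's console has the same Redis settings loaded as the exporter) is to ping Redis through `Sidekiq.redis`:

```ruby
# Minimal connectivity check, e.g. from an irb session inside the pod.
# Assumes Sidekiq is configured with the same Redis settings as the exporter.
require 'sidekiq'

Sidekiq.redis { |conn| puts conn.ping } # prints "PONG" when the connection works
```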
@Khemdevi If you have ElastiCache with in-transit encryption enabled, then you have to use `rediss://` instead of `redis://`.
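For example, if you are testing from a console inside the pod as suggested above, a minimal sketch (host and password below are placeholders) of pointing Sidekiq at a TLS-enabled endpoint looks like this:

```ruby
# Sketch only: the endpoint and password are placeholders, not real values.
require 'sidekiq'

Sidekiq.configure_client do |config|
  # rediss:// (note the double "s") enables TLS, which ElastiCache requires
  # when in-transit encryption is turned on.
  config.redis = { url: 'rediss://:password@some.endpoint.cache.amazonaws.com:6379/0' }
end
```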
@Strech this needs to be changed in https://github.com/Strech/sidekiq-prometheus-exporter/blob/master/docker/config.ru. I would suggest adding support for a new ENV variable called `REDIS_USE_SSL`. Based on the value of `REDIS_USE_SSL` you can construct a URL with either `redis://` or `rediss://`. Let me know if you need a PR for this fix.
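To illustrate the idea (this is only a hypothetical sketch of the proposal, not the gem's actual config.ru), the scheme switch could look roughly like this:

```ruby
# Hypothetical sketch of the proposed REDIS_USE_SSL handling. The REDIS_* names
# are the existing chart variables; REDIS_USE_SSL is the new one proposed here.
scheme = ENV['REDIS_USE_SSL'] == 'true' ? 'rediss' : 'redis'
auth   = ENV['REDIS_PASSWORD'] ? ":#{ENV['REDIS_PASSWORD']}@" : ''
host   = ENV.fetch('REDIS_HOST', '127.0.0.1')
port   = ENV.fetch('REDIS_PORT', '6379')
db     = ENV.fetch('REDIS_DB_NUMBER', '0')

redis_url = "#{scheme}://#{auth}#{host}:#{port}/#{db}"

Sidekiq.configure_client do |config|
  config.redis = { url: redis_url }
end
```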
@ajinkyapisal Thank you very much. In my case, ElastiCache had in-transit encryption enabled, and using a URL with `rediss://` instead of `redis://` worked fine for me.
Also @Strech it would be great if we could have an environment variable for using SSL when in-transit encryption is enabled.
@ajinkyapisal Thanks for your support, a PR would be nice 💚
@Namrata3991 While the PR is in review, you can also use the `REDIS_URL` variable, which takes priority over all other Redis connection settings.
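For example, in values.yaml (the endpoint and password here are the placeholders from above), that would look like:

REDIS_URL: "rediss://:************@some.endpoint.cache.amazonaws.com:6379/0"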
Thanks @Namrata3991 and @ajinkyapisal for your help, I appreciate it 💚
A new Docker image was published, `0.1.15-1`, and the Helm chart was updated: https://github.com/Strech/sidekiq-prometheus-exporter/releases/tag/v0.1.15-1