[ SUGGESTION ] - Multiple Docker hosts
I would really love a feature where we can add remote Docker sockets, just like in Dozzle, if you know that one.
I am not sure how this could work because I am not a backend developer :/
This would allow users to monitor remote Docker / Podman sockets as well.
Thanks for the suggestion! I'll look into it!
It may be a bit tricky to implement because of the library currently used. I might also need your help for testing.
I can definitely help with testing, since I have like 6 servers to try it on to really stress this tool :D
Super interested in this and want to comment on my setup.
I use https://github.com/Tecnativa/docker-socket-proxy on every Docker host so that I can use tools like https://gethomepage.dev/ to report status without giving them write access.
I believe that supporting access to Docker information through the HTTP interface, even when only one host is used, would be a great first step toward supporting multiple hosts.
As an alternative, you could take input from a YAML or TOML config file with a list of images, Diun-style. Then you wouldn't need access to the socket at all and could check more images than just what's running on the local host or any connected host.
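For illustration, such a list could be as simple as this (purely a sketch; the key and image names here are invented, nothing like this exists in Cup today):

```yaml
# Hypothetical Diun-style config: check these references regardless of what's running anywhere
images:
  - nginx:latest
  - ghcr.io/example/app:1.2.3
  - lscr.io/linuxserver/jellyfin:latest
```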
Hey @Its4Nik! I've looked into this issue a bit, and from what I can see, projects that support multiple hosts run an agent on each host and communicate with it, without exposing any sockets on the network (this is how Dozzle and Portainer do it).
If we went that route, a very simple way to achieve this functionality would be to run a Cup instance on each server and add an option to the config file for extra hosts. The "main" instance could then just call each server's API endpoint and aggregate the results. What do you think?
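As a rough illustration of what I mean (none of this exists yet, and the option name is made up):

```yaml
# Hypothetical config for the "main" instance: query each remote Cup API and merge the results
servers:
  - https://cup.host-a.example.com
  - https://cup.host-b.example.com
```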
Yeah, that could work, but (I'm not sure how Cup works under the hood) would it be possible to add an env var to toggle the frontend on/off? I guess that would improve container reliability and open up more use cases in low-power environments.
Just asking in case the frontend is already very light on the host.
I could add a config option to turn the frontend off; however, all the code for it would still be present in the binary. It's just a few hundred kB.
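Something along these lines, for example (the name isn't final and this option doesn't exist yet):

```yaml
# Hypothetical option to stop serving the web UI; the frontend code would still ship in the binary
frontend: false
```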
I would be more than happy to help test as well. Also, it appears that Dozzle, Homepage, and What'sUpDocker can all access the Docker socket via a TCP port, which is all that https://github.com/Tecnativa/docker-socket-proxy exposes. Sometimes it is denoted by just a host and port, and other times as tcp://10.0.10.1:2375.
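For reference, a typical docker-socket-proxy service looks roughly like this (trim the allowed endpoints to whatever the consumer actually needs):

```yaml
# docker-compose sketch: expose a read-only Docker API on tcp/2375 via docker-socket-proxy
services:
  docker-socket-proxy:
    image: tecnativa/docker-socket-proxy
    environment:
      CONTAINERS: "1"   # allow read access to the /containers endpoints
      IMAGES: "1"       # allow read access to the /images endpoints
      POST: "0"         # deny all write operations
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock:ro
    ports:
      - "2375:2375"
```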
Either way, I have been looking for a project like this for so long!! Thank you so much for all your work!!!
Hey!
I haven't had time to implement this feature yet, but it's planned for version 3, which I'm working on at the moment. I'm kind of stuck on semver support, so it'll be a while until it's ready (I have a very busy schedule, sorry). I'll let you know when it's ready to test.
Thanks for the idea of using something other than the Docker socket. I know the benefits, and I'll try to implement that too if I can!