Leverage `docker-compose.yml` instead of `Dockerfile` for Clients.
abdulrabbani00 opened this issue · 7 comments
Hello Team,
I would like to request the following feature:
- Instead of utilizing a `Dockerfile` for my client, I would like the ability to leverage a `docker-compose.yml` file.
- I am fairly new to the codebase, but I am hoping to create a fork to write integration tests for my org's `geth` fork.
- It would be great to utilize a compose file, as it would allow us to spin up "adjacent" containers necessary for our version of geth.
I am more than happy to look into and add the feature/functionality myself.
- Would it be possible for someone to direct me to where this logic might fit in (I can also look into it myself)?
- Does this seem like a fairly light lift, or do you expect it to require a lot of time/effort?
- Is this functionality something that would be useful to the team?
- If I added this feature, would the team be willing to merge it?
- Any other thoughts or concerns?
I look forward to hearing your feedback 😄
> It would be great to utilize a compose file as it would allow us to spin up "adjacent" containers necessary for our version of geth.
What adjacent containers would that be?
Please expand a bit on why you need this.
@holiman - We run Postgres in an adjacent container, which connects to our `geth-fork`. We would like to keep the Postgres DB and `geth-fork` in separate containers, which is why we want to leverage `docker-compose`. The "dirty" workaround is to build `geth-fork` inside of our Postgres container.
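For reference, a minimal sketch of the kind of compose file this feature would enable. The service names, image tags, and credentials below are illustrative placeholders, not our actual setup:

```yaml
# Hypothetical docker-compose.yml: the client plus an adjacent Postgres container.
# All names, images, and credentials are placeholders.
version: "3.8"
services:
  geth-fork:
    build: .                      # the client's existing Dockerfile
    depends_on:
      - postgres
    environment:
      DATABASE_URL: postgres://hive:hive@postgres:5432/hive
  postgres:
    image: postgres:14
    environment:
      POSTGRES_USER: hive
      POSTGRES_PASSWORD: hive
      POSTGRES_DB: hive
```

The point is that the client and its database stay in separate containers on a shared compose network, which is what we can't express with a single `Dockerfile` today.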
@holiman - Any thoughts, pal? I would be happy to look into the integration on my own time if you guys can't. Any insight into this task would be greatly appreciated.
I think this change will be complicated. Hive does not invoke the `docker` command-line tool, but uses the Docker daemon's HTTP API directly. While I don't know how `docker-compose` works internally, I know it is not a native feature of the Docker API, so we would have to reimplement `docker-compose` inside hive to make this work.
It's not such a big issue to run postgres and geth in the same container. In this project, we prefer simple solutions, so I think you should just try running it in one container first.
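The single-container approach could be sketched roughly like this. The image tags, paths, and entrypoint name are assumptions for illustration, not the actual setup:

```dockerfile
# Hypothetical single-container workaround: ship Postgres and the geth fork
# in one image. Image names, paths, and the entrypoint script are placeholders.
FROM postgres:14
# Copy the client binary from a prebuilt image of the fork.
COPY --from=our-org/geth-fork:latest /usr/local/bin/geth /usr/local/bin/geth
# entrypoint.sh would start postgres in the background, wait for it to accept
# connections, then exec geth so it remains PID 1's child process.
COPY entrypoint.sh /entrypoint.sh
ENTRYPOINT ["/entrypoint.sh"]
```

This keeps the existing single-`Dockerfile` contract with hive while still giving the client a local database to talk to.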
Thanks, pal, I found a workaround that suited my needs. I am working on connecting hive to test our version of geth. In the future, if I have any new questions should I open an issue, or do you guys have another venue for basic questions (telegram or discord)?
Best if you open issues here.