`prefect deploy` command fails if not all dependencies are present at runtime
jamiezieziula opened this issue · 24 comments
First check
- I added a descriptive title to this issue.
- I used the GitHub search to find a similar request and didn't find it.
- I searched the Prefect documentation for this feature.
Prefect Version
2.x
Describe the current behavior
Currently, running prefect deploy... will fail if not all dependencies required by the deployments are present:
jamiedick in ~/flows on platform-refactor ● λ prefect deploy --all
Deploying all deployments for current project...
╭─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╮
│ Deploying Deployment 1 │
╰─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯
Script at 'deployments/deployment1/flow.py' encountered an exception: ModuleNotFoundError("No module named 'slack_sdk'")
jamiedick in ~/flows on main
This can be problematic when deploying multiple flows that have different requirements, or when using virtual environments to isolate dependencies. It also makes deploying via CI/CD cumbersome because, again, all dependencies need to be present (which duplicates work already done in something like a Docker build step).
Describe the proposed behavior
The prefect deploy command would not require all dependencies to be present at runtime (either optionally or by default).
Example Use
No response
Additional context
No response
We think we need to load the flow to generate the OpenAPI schema for its parameters. We could do this at an earlier time and store the schema JSON somewhere?
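A rough sketch of that idea (illustrative only, not how Prefect works today): generate the parameter schema in an environment that does have the dependencies, e.g. during the Docker build, write the JSON into the project, and let a later deploy step read the file instead of importing the flow. The build_schemas.py name and the flow import below are hypothetical; parameter_schema is assumed to be the Prefect 2.x helper under prefect.utilities.callables that builds a flow's parameter schema.

# build_schemas.py -- hypothetical helper, run where the flow's dependencies are installed
import json
from pathlib import Path

from prefect.utilities.callables import parameter_schema  # Prefect 2.x internal helper (assumption)

from deployments.deployment1.flow import my_flow  # hypothetical import path

# Build the OpenAPI-style schema for the flow's parameters and persist it as JSON,
# so a later `prefect deploy` step could read the file instead of importing the flow.
schema = parameter_schema(my_flow.fn)
# .dict() assumes the pydantic-v1 style model Prefect 2.x uses for ParameterSchema
Path("schemas/deployment1.json").write_text(json.dumps(schema.dict(), indent=2))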
This issue is stale because it has been open 30 days with no activity. To keep this issue open remove stale label or comment.
+1 on this - both the previous deployment.apply() and the current prefect deploy require me to have all dependencies installed. This has frustrating implications when I'm running this in any CI/CD fashion, as I have to install packages into the container and then install the packages again in the builder to deploy.
+1 on this.
I'm surprised this seems to be quite a stale issue. How are others solving this? We have this problem when deploying flows from a centralised CI/CD, but each flow runs using a different Docker image, and we don't want the CI/CD to need ALL the requirements just to deploy, nor to build Docker images for the simple task of creating a deployment config.
+1 on this.
In CI/CD we waste a lot of time installing dependencies that are already installed in the Docker image (for example).
Would love to see this behavior changed. My initial understanding of the prefect.yaml / deployment.yaml file was that all necessary metadata was present, rendering it unnecessary to manipulate the Python representation of the flow. It certainly complicates CI/CD.
My workaround will be to install flows in my CI/CD environment and to have a separate build action for each flow, but this is not ideal from a code hygiene and performance perspective.
Any updates on this issue? Having to install all dependencies to create a simple deployment object is kind of cumbersome and conflicts with the purpose of separating flows' dependencies using different images.
Here too. I use a Pipfile with pipenv to ensure everything works properly.
Dockerfile (the image where the flow runs):
...
RUN pip install --no-cache-dir pipenv==${PIPENV_RELEASE} && \
    pipenv sync --clear --system && \
    pip cache purge
CI/CD (GitHub Actions):
- name: Setup Python env
  uses: actions/setup-python@v4
  with:
    python-version: ${{ vars.PYTHON_VERSION }}
- name: Setup Prefect CLI
  run: |
    pip install --no-cache-dir prefect==${{ vars.PREFECT_CLI_VERSION }}
- name: Setup pipenv and Install dependencies
  working-directory: ${{ env.PREFECT_FLOW_NAME }}
  run: |
    pip install --no-cache-dir pipenv==${{ vars.PIPENV_RELEASE }}
    pipenv sync --clear --system
...
# step to run prefect deploy
So I have to ensure that the CI/CD environment installs the same requirements as the Dockerfile used to build the flow image.
+1 on this. Currently I deal with that by importing the external packages only inside functions.
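For reference, that workaround looks roughly like this (the flow name, channel, and slack_sdk usage are illustrative): because the heavy import lives inside the function body, prefect deploy can load the module without slack_sdk installed locally, and the import only runs when the flow executes in its own image.

import os

from prefect import flow


@flow
def notify_flow(channel: str = "#alerts"):
    # Deferred import: only evaluated when the flow actually runs in its image,
    # so `prefect deploy` can load this file even if slack_sdk is not installed locally.
    from slack_sdk import WebClient

    client = WebClient(token=os.environ["SLACK_BOT_TOKEN"])
    client.chat_postMessage(channel=channel, text="flow finished")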
+1 on this.
Thank you for all the +1s. We'll look more into this.
adding another +1
it is such a frustrating issue..
+1
Hi all, this will probably not get into tomorrow's release of 2.19.3, but hopefully one very soon! Thanks to @desertaxle for this
+1
@jamiezieziula @prefectcboyd @ne-warwick @ialejandro @speedyturkey @furkanrollic @pierreloicq @netanelm-upstream @yuriy-arabskyy @tourist @cvvs @MartijnvanElferen @baxen @akfmdl @syakesaba @josefondrej @ChrisPaul33 @nicelgueta @Mdanford97
This will be available after today's release of 2.19.3! Please give it a try :)
Edit: Released
Thank you so much.
I could deploy a flow without its dependencies in 2.19.3.
https://zenn.dev/syakesaba/articles/prefect_2_19_3
But I realized that some flows are not recognized when they are async, like this:

from prefect import flow

@flow
async def main():
    ...
any ideas?
ty
@syakesaba I think this was fixed in a subsequent PR
Thank you so much, that was fixed!
@serinamarie This issue occurs again in versions > 2.20.2
hi @akfmdl - can you explain more what you tried and what you're seeing?
the issue (as expressed here) should be solved as of 2.20.2, but it's possible there's an edge case you're hitting
@zzstoatzz I was testing prefect deploy with some dependencies missing locally. Before version 2.19.3, I encountered an exception as expected:
prefect.exceptions.ScriptError: Script at 'flows/demo_flow.py' encountered an exception: ModuleNotFoundError("No module named 'pandas'")
The deployment worked without dependencies from version 2.19.3 until 2.20.0, when this PR was merged. Starting from version 2.20.0, I encounter the same exception again.
UPD: I also checked v3.0, but it still didn't work for me.
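For anyone trying to reproduce: a minimal flows/demo_flow.py along these lines (the flow body is illustrative) triggers the ScriptError above when pandas is missing from the environment running prefect deploy, because the import is evaluated at module load time.

# flows/demo_flow.py -- minimal repro sketch; the flow body is illustrative
import pandas as pd  # module-level import: evaluated as soon as the script is loaded

from prefect import flow


@flow
def demo_flow():
    return pd.DataFrame({"x": [1, 2, 3]})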