🐛 [BUG] - Unable to import module 'lambda_handler': No module named 'openai'
Closed this issue · 7 comments
Description
Hi,
I was following the instructions for the REST API, and for the most part everything worked great, but the vast majority of endpoints return the following error message in Postman:
"errorMessage": "Unable to import module 'lambda_handler': No module named 'openai'",
"errorType": "Runtime.ImportModuleError",
I verified that I am using Python 3.11 and that everything installs correctly, but I can't figure out what the issue is.
Thank you
Reproduction URL
https://github.com/FullStackWithLawrence/aws-openai
Reproduction steps
1. Follow instructions on https://github.com/FullStackWithLawrence/aws-openai/tree/main/api
2. Try running passthrough.
Screenshots
No response
Logs
No response
Browsers
No response
OS
Mac
Hi @marcelfolaron,
This is the exact line of code to which the import error refers. It is a symptom indicating that your Lambda Layer is either not present or corrupted (I'm guessing the former). More specifically, the 'openai' package is included in the contents of the AWS Lambda Layer that Terraform creates, here.
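For illustration only (a hypothetical sketch, not this repo's actual source): the failure happens at cold start, before your handler ever runs, because the module-level import can't be satisfied without the Layer.

```python
# Hypothetical sketch of the failing module, for illustration only.
import json

import openai  # supplied by the Lambda Layer, not the function package;
               # if the Layer is missing or broken, this line raises
               # Runtime.ImportModuleError before the handler is ever invoked


def handler(event, context):
    # never reached when the import above fails
    return {"statusCode": 200, "body": json.dumps({"ok": True})}
```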
Creation of the Lambda Layer is by far the most technically complex operation that Terraform performs during this build. The Layer essentially contains a run-time copy of the Python virtual environment that both Lambdas use. This minimizes the Lambda package size, which has some nice side benefits: for example, the Lambdas' source code becomes viewable in the AWS console once the package dependencies have been moved to a Layer. It also ensures that both Lambdas use the same Python package dependencies, which helps with troubleshooting. Lastly, the Layer code and the Lambda function source code change for quite different reasons and at different frequencies, so it's nice to keep them separated so that they can be managed independently.
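For reference (a minimal sketch, not code from this repo): at run time Lambda extracts the Layer under /opt, and the Python runtime adds /opt/python (and its site-packages directory) to sys.path, which is why packages shipped in the Layer, like openai, are importable even though they aren't in the function's deployment package.

```python
# Minimal sketch: show where the Python runtime looks for Layer-provided
# packages inside a running Lambda.
import sys


def handler(event, context):
    # Layer contents are extracted under /opt; anything the Layer ships in
    # python/ (or python/lib/python3.11/site-packages/) ends up on sys.path.
    layer_paths = [p for p in sys.path if p.startswith("/opt")]
    return {"layer_paths": layer_paths}
```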
Of note:
you need Docker and Docker Compose, which are used to avoid the strange platform-incompatibility issues that can surface with some of the larger PyPI packages included in the build, such as NumPy and SciPy. You can refer here and here to see how Docker Compose is used to create the Layer.
If the Layer exists and is configured correctly, then you'll be able to match what you see in the AWS console to these two screenshots:
Maybe related: have you seen any issues running everything on a MacBook M1? I did have to go through some additional steps to get Terraform running on my Mac due to ARM incompatibility. In particular, I had to compile the template 2.2.0 provider using https://kreuzwerker.de/en/post/use-m1-terraform-provider-helper-to-compile-terraform-providers-for-mac-m1
Yes. The Terraform run-time issues are resolved at this point, so nothing to discuss on that front. But the platform-specific issues I mention above regarding the Layer build are specific to Python and permanent. At issue is that certain large PyPI packages include Cython as well as C source and header files, which have to be compiled for the platform on which they'll run (i.e., Amazon Linux). Hence, the Dockerfile.
Thank you for your help! I think I am almost there. I was able to fix my Python configuration so that the openai import works. I am now running into a problem with pydantic_core. It appears to be a platform issue as well, since it's pulling the ARM version on my Mac when I need the x86_64 version. Is that part of the Docker build as well?
That sounds familiar. You might want to review this thread: pydantic/pydantic#4699.
More generally, the Terraform variable compatible_architecture should be set to "x86_64" here, and you can confirm it from this window in the AWS console.
Also, your Dockerfile should be using AWS's linux/amd64 image; see here.
The Docker approach should render the fact that you're working on a Mac moot, as the Dockerfile pulls everything based on the platform designated at the top of the file.
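If it helps, here's a quick diagnostic sketch (hypothetical, not part of this repo) that you can run inside the Layer-build container, or against an unzipped copy of the Layer artifact, to confirm which compiled pydantic_core binary actually got bundled:

```python
# Hypothetical diagnostic: list the compiled pydantic_core extension(s) that
# were installed, so you can tell an ARM build from an x86_64 build by the
# filename.
import pathlib

import pydantic_core

pkg_dir = pathlib.Path(pydantic_core.__file__).parent
for so in sorted(pkg_dir.glob("_pydantic_core*.so")):
    # expected for Lambda x86_64:    _pydantic_core.cpython-311-x86_64-linux-gnu.so
    # wrong (built on the Mac host): _pydantic_core.cpython-311-darwin.so
    # wrong (ARM Linux):             _pydantic_core.cpython-311-aarch64-linux-gnu.so
    print(so.name)
```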
Closing this, as there hasn't been any further discussion in the last month.