aws-samples/aws-serverless-saas-workshop

Errors for 'Unable to upload artifact' during deployment in Lab 4 - S3 bucket inaccessible

Closed this issue · 1 comment

In Lab 4, I got errors during deployment when running the workshop in a self-guided manner, at the following step: https://catalog.us-east-1.prod.workshops.aws/v2/workshops/b0c6ad36-0a4b-45d8-856b-8a64f0ac76bb/en-US/lab4/53-deploy-lab4

There were errors uploading resources at both the Bootstrap server code and Tenant server code deployment steps. The error at the Bootstrap server code deployment step is:

Running PythonPipBuilder:ResolveDependencies
Running PythonPipBuilder:CopySource
Error: Unable to upload artifact DynamoDBTables/template.yaml referenced by Location parameter of DynamoDBTables resource.
An error occurred (AccessDenied) when calling the PutObject operation: Access Denied

The error at the Tenant server code deployment stage is:

Running PythonPipBuilder:ResolveDependencies
Running PythonPipBuilder:CopySource
Error: Unable to upload artifact ServerlessSaaSLayers referenced by ContentUri parameter of ServerlessSaaSLayers resource.
An error occurred (AccessDenied) when calling the CreateMultipartUpload operation: Access Denied

I was able to resolve both issues by creating my own S3 bucket for uploading the relevant resources, using the same account that I was using to run the deployment script (cd ~/environment/aws-serverless-saas-workshop/Lab4/scripts/ && ./deployment.sh -s). I then changed the value of s3_bucket to the name of my bucket in both the shared-samconfig.toml and tenant-samconfig.toml config files (the preset value of s3_bucket is aws-saas-sam-cli-ujwbuket).
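The workaround above can be sketched as follows. The bucket name my-saas-workshop-artifacts is an example, not a name from the workshop; the aws s3 mb step is commented out so the snippet only demonstrates the config edit, and the minimal samconfig.toml here is a stand-in for the real files shipped with Lab 4:

```shell
# Create a bucket in the same account/region used by deployment.sh
# (uncomment when running against a real AWS account):
# aws s3 mb s3://my-saas-workshop-artifacts --region us-east-1

# Stand-in for the workshop's shared-samconfig.toml, with the preset bucket:
cat > shared-samconfig.toml <<'EOF'
version = 0.1
[default.deploy.parameters]
s3_bucket = "aws-saas-sam-cli-ujwbuket"
capabilities = "CAPABILITY_IAM"
EOF

# Point s3_bucket at the new bucket (repeat for tenant-samconfig.toml):
sed 's/^s3_bucket = .*/s3_bucket = "my-saas-workshop-artifacts"/' \
  shared-samconfig.toml > shared-samconfig.toml.tmp \
  && mv shared-samconfig.toml.tmp shared-samconfig.toml

grep s3_bucket shared-samconfig.toml
```

After the edit, rerunning deployment.sh -s uploads the SAM artifacts to the account-owned bucket instead of the inaccessible preset one.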

Should the workshop website include a step for creating a new S3 bucket and amending the config files?

@chalcrow - yes, the workshop does this: in Lab 2 it creates a new S3 bucket and updates the config files for all labs.