Region missing for SQS so failing to upload messages
Describe the bug
Line 1231 of stream-producer.py instantiates the SQS connection
self.sqs = boto3.client("sqs")
But this call is missing the region_name parameter, for example:
self.sqs = boto3.client("sqs", region_name="eu-west-2")
To Reproduce
- Start Docker container and set all environment variables
- Connect to the running container and run:
python3 /app/stream-producer.py csv-to-sqs
Expected behavior
Should connect to SQS and upload messages into the queue based on rows in a CSV file
Headless server:
- OS: Linux AMI (running Docker 19.03.4, build 9013bf583a)
Additional context
Suggest extracting the region from the environment variable SENZING_SQS_QUEUE_URL, or adding another environment variable so the user can specify the region explicitly. A sketch of the first option follows.
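As a minimal, hypothetical sketch of that first option, assuming the queue URL follows the standard https://sqs.<region>.amazonaws.com/<account-id>/<queue-name> format:

import os
import boto3

# Hypothetical parsing of the region out of the queue URL, e.g.
# https://sqs.eu-west-2.amazonaws.com/123456789012/my-queue -> "eu-west-2"
queue_url = os.environ.get("SENZING_SQS_QUEUE_URL", "")
region_name = None
if "://" in queue_url:
    host = queue_url.split("/")[2]  # "sqs.eu-west-2.amazonaws.com"
    if host.startswith("sqs."):
        region_name = host.split(".")[1]

# If region_name is still None, boto3 falls back to its normal resolution chain.
sqs = boto3.client("sqs", region_name=region_name)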
Thank you for pointing this out. I will correct this early in the week.
@davdasil The boto3 library respects values placed in environment variables and in ~/.aws/config.
See https://boto3.amazonaws.com/v1/documentation/api/latest/guide/configuration.html
In particular, the environment variable AWS_DEFAULT_REGION can be set and used:
https://boto3.amazonaws.com/v1/documentation/api/latest/guide/configuration.html#using-environment-variables
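For example, a minimal sketch assuming the region eu-west-2 and that the variable is exported in the container's environment:

import os
import boto3

# With AWS_DEFAULT_REGION present in the environment (e.g. via docker run --env),
# boto3 resolves the region without any region_name argument in the code.
os.environ.setdefault("AWS_DEFAULT_REGION", "eu-west-2")

sqs = boto3.client("sqs")
print(sqs.meta.region_name)  # -> eu-west-2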
Rather than "re-inventing the wheel" by adding more code, I'd like to improve the documentation for this repository to highlight the existing methods of configuring boto3
.
Would that satisfy the issue you raised?
@davdasil I added information in:
- https://github.com/Senzing/stream-producer#examples-of-docker
- https://github.com/Senzing/stream-producer#aws-configuration
If you think that is sufficient, go ahead and close the issue. If there's something more that you'd like, please let me know.
Many thanks @docktermj. Very elegant solution.
@davdasil Thank you again for raising the issue.