This repository contains the code and configuration for an AWS workflow built on Lambda, SQS, and S3. The workflow fetches financial data for stock symbols from the Mynet Finans website and stores the output as CSV files in an S3 bucket.
Each SQS message identifies a stock symbol and the URL fragment used to fetch its data from the Mynet Finans website. The consuming Lambda function builds the page URL from the message body:
```python
import json

def lambda_handler(event, context):
    # Each SQS record body is a JSON array: [symbol, url_part].
    body = json.loads(event['Records'][0]['body'])
    symbol = body[0]
    urlpart = body[1]
    url = f'https://finans.mynet.com/borsa/hisseler/{urlpart}'
```
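The page itself can then be fetched over HTTP. A minimal sketch, assuming the `requests` library is bundled with the function (the actual parsing of the Mynet Finans HTML is not shown here):

```python
import requests

def fetch_page(url: str) -> str:
    # Fetch the stock page; raising on HTTP errors makes the Lambda
    # invocation fail, so the SQS message is retried rather than lost.
    response = requests.get(url, timeout=10)
    response.raise_for_status()
    return response.text
```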
---
Step 1: Lambda Function (Step_1)
- Role: Initiates the process by sending messages to an SQS queue.
- Permissions:
  - AmazonSQSFullAccess
- SQS URL: https://sqs.eu-north-1.amazonaws.com/891377159635/step1
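A minimal sketch of what Step_1 might look like; the queue URL is the one above, while the symbol/URL-part pairs are placeholders (the real list depends on which stocks are tracked):

```python
import json
import boto3

QUEUE_URL = 'https://sqs.eu-north-1.amazonaws.com/891377159635/step1'
sqs = boto3.client('sqs')

# Placeholder pairs; each entry is (symbol, url_part).
SYMBOLS = [('A1CAP', 'example-url-part-1'), ('ACSEL', 'example-url-part-2')]

def lambda_handler(event, context):
    for symbol, urlpart in SYMBOLS:
        # The body is a JSON array [symbol, url_part], the format
        # that Step_3 parses on the receiving end.
        sqs.send_message(QueueUrl=QUEUE_URL,
                         MessageBody=json.dumps([symbol, urlpart]))
    return {'sent': len(SYMBOLS)}
```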
---
Step 2: SQS Queue (step1)
- Role: Acts as a buffer, holding messages sent by Step_1 and ensuring they are delivered to Step_3.
- Trigger: Configured as an event source for the Lambda function in Step 3 (see the sketch below).
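One way this trigger could be wired up programmatically is with boto3's `create_event_source_mapping`; the queue ARN below is derived from the queue URL above, and the batch size is an assumption:

```python
import boto3

lambda_client = boto3.client('lambda')

# ARN derived from the step1 queue URL above.
QUEUE_ARN = 'arn:aws:sqs:eu-north-1:891377159635:step1'

lambda_client.create_event_source_mapping(
    EventSourceArn=QUEUE_ARN,
    FunctionName='Step_3',
    BatchSize=1,                              # assumption: one message per invocation
    ScalingConfig={'MaximumConcurrency': 2},  # matches the trigger configuration in Step 3
)
```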
---
Step 3: Lambda Function (Step_3)
- Role: Processes messages from the SQS queue, performing tasks like extracting, transforming, and loading data into an S3 bucket.
- Trigger Configuration:
  - Maximum concurrency: 2
- Permissions:
  - AmazonSQSFullAccess
  - AmazonS3FullAccess
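A minimal sketch of the Step_3 handler under these permissions; the bucket name and the `extract_rows` helper are placeholders for the real scraping and transformation logic:

```python
import csv
import io
import json
import boto3

s3 = boto3.client('s3')
BUCKET = 'example-output-bucket'  # placeholder; set to the real bucket name


def extract_rows(urlpart):
    # Hypothetical stand-in for the extract/transform step that
    # scrapes the Mynet Finans page and returns rows of price data.
    return []


def lambda_handler(event, context):
    for record in event['Records']:
        symbol, urlpart = json.loads(record['body'])
        rows = extract_rows(urlpart)
        # Build the CSV in memory with the column layout described below.
        buf = io.StringIO()
        writer = csv.writer(buf)
        writer.writerow(['date', 'high', 'low', 'close', 'volume'])
        writer.writerows(rows)
        # One CSV per symbol, e.g. A1CAP.csv.
        s3.put_object(Bucket=BUCKET, Key=f'{symbol}.csv', Body=buf.getvalue())
```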
---
Amazon S3 Bucket
- Role: Stores the processed data as CSV files.
- Files: Example files include `A1CAP.csv`, `ACSEL.csv`, `ADEL.csv`, etc.
- Data Structure: Each CSV file contains columns such as `date`, `high`, `low`, `close`, and `volume`.
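For reference, one of these files can be read back into a DataFrame like this (bucket name again a placeholder):

```python
import boto3
import pandas as pd

s3 = boto3.client('s3')
obj = s3.get_object(Bucket='example-output-bucket', Key='A1CAP.csv')
df = pd.read_csv(obj['Body'])
print(df[['date', 'high', 'low', 'close', 'volume']].head())
```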
---
Prerequisites
- AWS account with the necessary permissions
- AWS CLI configured
- Python 3.x installed