Python utilities for AWS. These utilities help save time with different facets of performance testing (RCA, reporting, cost saving), whether as part of DevOps or standalone. This readme will continue to be updated as and when I add new utilities to the repo.
This simple utility allows you to generate images of CloudWatch metrics. There are times when you want an image for reporting purposes (for example, a performance TSR). This utility removes the effort of generating the image manually and saves a lot of time when you have many images to generate.
These instructions will get you a copy of the project up and running on your local machine for development and testing purposes.
What you need to execute the script
1: awscli
2: boto3
3: Python 3.5
4: Set up your AWS access key, secret key, and AWS region
1: Make sure the above prerequisites are met first.
2: Update the FileName variable in the script to the path where you want to save the image.
3: Update the timezone setting in the script.
4: If you want a different AWS region than the one in your AWS access setup, you can override it via the RegionConfig parameter in the script. Otherwise, comment it out.
5: Replace the default JSON payload with the correct Image API JSON payload. The correct payload can be copied from the CloudWatch console.
6: Now execute the Python script, passing in the start and end time in epoch (ms) for which you want to generate the image.
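For reference, a minimal sketch of the core call using the CloudWatch GetMetricWidgetImage API via boto3. The file name, region, and widget definition below are placeholder assumptions; paste your real payload from the console's Source tab:

import json
import sys
from datetime import datetime, timezone

import boto3

# Placeholder values -- replace with your own.
FILE_NAME = "cw_metric.png"
REGION = "us-east-1"

# Convert the start/end command-line arguments from epoch (ms) to ISO 8601.
start = datetime.fromtimestamp(int(sys.argv[1]) / 1000, tz=timezone.utc).isoformat()
end = datetime.fromtimestamp(int(sys.argv[2]) / 1000, tz=timezone.utc).isoformat()

# Sample widget definition -- copy the real one from the CloudWatch console.
widget = {
    "metrics": [["AWS/EC2", "CPUUtilization"]],
    "start": start,
    "end": end,
    "timezone": "+0000",
}

client = boto3.client("cloudwatch", region_name=REGION)
response = client.get_metric_widget_image(MetricWidget=json.dumps(widget))

# The API returns the rendered image as raw bytes.
with open(FILE_NAME, "wb") as f:
    f.write(response["MetricWidgetImage"])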
This simple utility allows you to enable or disable DynamoDB Contributor Insights. Pass the script the name of the table for which you want Contributor Insights enabled or disabled. Contributor Insights helps you identify which DynamoDB partitions are heavily accessed, which is useful for DynamoDB table RCA.
What you need to execute the script
1: awscli
2: boto3
3: Python 3.5
4: Set up your AWS access key, secret key, and AWS region
1: Make sure the above prerequisites are met first.
2: Replace the default value for the variable "TABLE_TO_UPDATE" with your table name.
3: If you want a different AWS region than the one in your AWS access setup, you can override it via the RegionConfig parameter in the script. Otherwise, comment it out.
4: Now execute the Python script.
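The heart of the script is a single UpdateContributorInsights call; a minimal sketch, with a placeholder table name and action:

import boto3

# Placeholder values -- replace with your own.
TABLE_TO_UPDATE = "my-table"
ACTION = "ENABLE"  # or "DISABLE"

client = boto3.client("dynamodb")
response = client.update_contributor_insights(
    TableName=TABLE_TO_UPDATE,
    ContributorInsightsAction=ACTION,
)
print(response["ContributorInsightsStatus"])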
There might be cases where you end up with a lot of DynamoDB tables in your non-prod environment, set to either Provisioned or On-Demand capacity. If they are not properly managed, the cost ($$) of keeping these tables on Provisioned capacity can escalate pretty quickly. This simple Python script goes through all the tables and, if they are on Provisioned capacity, changes them to On-Demand. If they are already on On-Demand capacity, it does nothing.
These instructions will get you a copy of the project up and running on your local machine for development and testing purposes.
What you need to execute the script
1: awscli
2: boto3
3: Python 3.5
4: Set up your AWS access key, secret key, and AWS region
Once the above prerequisites are set up, execute the Python script.
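For reference, a minimal sketch of the loop the script performs (the defensive get() here is the call flagged under the future enhancements below):

import boto3

client = boto3.client("dynamodb")  # region comes from your AWS config

paginator = client.get_paginator("list_tables")
for page in paginator.paginate():
    for table_name in page["TableNames"]:
        table = client.describe_table(TableName=table_name)["Table"]
        # On-Demand tables report a BillingModeSummary; older Provisioned
        # tables may omit it entirely, hence the defensive get().
        billing = table.get("BillingModeSummary", {}).get("BillingMode", "PROVISIONED")
        if billing == "PROVISIONED":
            print(f"Switching {table_name} to On-Demand")
            client.update_table(TableName=table_name, BillingMode="PAY_PER_REQUEST")
        else:
            print(f"{table_name} is already On-Demand")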
Future enhancements:
- Check when the table was last changed to On-Demand capacity. If it was less than 24 hours ago, reduce the Provisioned capacity instead; otherwise change it to On-Demand.
- Improve the get() function call. If AWS changes the JSON structure in the future, this call will fail; need to come up with a better approach.
This simple utility allows you to create or delete CloudWatch dashboards. Useful when you need to create multiple dashboards.
What you need to execute the script
1: awscli
2: boto3
3: Python 3.5
4: Set up your AWS access key, secret key, and AWS region
1: Make sure the above prerequisites are met first.
2: Replace the default payload in the variable "DASHBOARD_JSON" with your dashboard JSON payload. The best way is to take an existing dashboard payload from the console, modify it, and pass it into the script (if you are creating a dashboard).
3: Replace the default value for the variable "DASHBOARD_NAME" with your dashboard name.
4: If you want a different AWS region than the one in your AWS access setup, you can override it via the RegionConfig parameter in the script. Otherwise, comment it out.
5: Now execute the Python script.
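A minimal sketch of the create and delete calls. The dashboard name and payload below are placeholders; note that PutDashboard also overwrites an existing dashboard of the same name:

import json
import boto3

# Placeholder values -- replace with your own.
DASHBOARD_NAME = "my-dashboard"
DASHBOARD_JSON = {"widgets": []}  # paste a payload copied from the console

client = boto3.client("cloudwatch")

# Create (or overwrite) the dashboard.
client.put_dashboard(
    DashboardName=DASHBOARD_NAME,
    DashboardBody=json.dumps(DASHBOARD_JSON),
)

# To delete instead:
# client.delete_dashboards(DashboardNames=[DASHBOARD_NAME])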
Compare DynamoDB Query vs Scan time.
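A minimal sketch of such a comparison, assuming a hypothetical table with partition key pk:

import time
import boto3
from boto3.dynamodb.conditions import Attr, Key

# Hypothetical table and key -- adjust to your schema.
table = boto3.resource("dynamodb").Table("my-table")

start = time.perf_counter()
table.query(KeyConditionExpression=Key("pk").eq("some-key"))
query_ms = (time.perf_counter() - start) * 1000

start = time.perf_counter()
table.scan(FilterExpression=Attr("pk").eq("some-key"))
scan_ms = (time.perf_counter() - start) * 1000

print(f"Query: {query_ms:.1f} ms, Scan: {scan_ms:.1f} ms")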
Compare DynamoDB GetItem and BatchGetItem API calls.
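For reference, the two calls look like this (table name and key schema are hypothetical):

import boto3

client = boto3.client("dynamodb")

# One item per request:
client.get_item(TableName="my-table", Key={"pk": {"S": "key-1"}})

# Several items in a single request:
client.batch_get_item(
    RequestItems={
        "my-table": {"Keys": [{"pk": {"S": f"key-{i}"}} for i in range(1, 4)]}
    }
)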
Get S3 Bucket Size.
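One way to compute it is by summing object sizes with the ListObjectsV2 paginator, as in this sketch (the bucket name is a placeholder; for very large buckets the CloudWatch BucketSizeBytes metric is cheaper):

import boto3

client = boto3.client("s3")
paginator = client.get_paginator("list_objects_v2")

total_bytes = 0
for page in paginator.paginate(Bucket="my-bucket"):  # placeholder bucket
    total_bytes += sum(obj["Size"] for obj in page.get("Contents", []))

print(f"{total_bytes / (1024 ** 3):.2f} GiB")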
This script takes a backup of a DynamoDB table and copies the data to a different DynamoDB table. During the process, the data is first saved to an S3 bucket.
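A minimal sketch of that flow: scan the source, stage the items in S3, then write them to the target. All names are placeholders, and the sketch assumes the tables share a key schema and hold no binary attributes:

import json
import boto3

dynamodb = boto3.client("dynamodb")
s3 = boto3.client("s3")

# Placeholder names -- replace with your own.
SOURCE, TARGET, BUCKET = "source-table", "target-table", "my-backup-bucket"

# 1. Scan the source table and stage the items in S3.
items = []
for page in dynamodb.get_paginator("scan").paginate(TableName=SOURCE):
    items.extend(page["Items"])
s3.put_object(Bucket=BUCKET, Key=f"{SOURCE}.json", Body=json.dumps(items))

# 2. Read the staged data back and write it to the target table.
body = s3.get_object(Bucket=BUCKET, Key=f"{SOURCE}.json")["Body"].read()
for item in json.loads(body):
    dynamodb.put_item(TableName=TARGET, Item=item)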
A sample Python script to execute a SQL statement against DynamoDB.
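DynamoDB accepts SQL-compatible statements through PartiQL; a minimal sketch using the ExecuteStatement API (table name and key are hypothetical):

import boto3

client = boto3.client("dynamodb")

response = client.execute_statement(
    Statement='SELECT * FROM "my-table" WHERE pk = ?',
    Parameters=[{"S": "some-key"}],
)
for item in response["Items"]:
    print(item)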
Script to help extract spot instance information to answer questions such as:
- What is the current spot instance price in each region?
- What type of spot instances are available in a region & availability zone?
- What is the interruption rate for a spot instance?
- What OS is available for a spot instance in a region & availability zone?
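Current prices and availability come from the DescribeSpotPriceHistory API, as in this minimal sketch. The region, instance type, and OS below are placeholder assumptions; interruption rates are published separately in the EC2 Spot Advisor data rather than through this API:

import boto3

client = boto3.client("ec2", region_name="us-east-1")  # placeholder region

response = client.describe_spot_price_history(
    InstanceTypes=["t3.micro"],          # placeholder instance type
    ProductDescriptions=["Linux/UNIX"],  # the OS to price
    MaxResults=10,
)
for entry in response["SpotPriceHistory"]:
    print(entry["AvailabilityZone"], entry["InstanceType"],
          entry["ProductDescription"], entry["SpotPrice"])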
The script gives you the capability to stop and start EC2 instances based on instance ID, instance type, or platform.
./stop_start_ec2.py stop id
./stop_start_ec2.py start id
./stop_start_ec2.py stop type <<instance type>>
./stop_start_ec2.py start type <<instance type>>
Example:
./stop_start_ec2.py stop type t2.micro
./stop_start_ec2.py start type t2.micro
Pass "windows" if you want to stop or start Windows platform. Otherwise pass other.
./stop_start_ec2.py stop platform windows
./stop_start_ec2.py start platform windows
or
./stop_start_ec2.py stop platform other
./stop_start_ec2.py start platform other
Pass "windows" if you want to stop or start Windows platform. Otherwise pass other. Also pass in what Instance type you want to stop or start.
./stop_start_ec2.py stop windows {InstanceType}
./stop_start_ec2.py start windows {InstanceType}
or
./stop_start_ec2.py stop other {InstanceType}
./stop_start_ec2.py start other {InstanceType}
Example:
./stop_start_ec2.py stop windows t2.micro
./stop_start_ec2.py start other t2.micro
./stop_start_ec2.py start windows t2.small
./stop_start_ec2.py stop other c4.2xlarge
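A minimal sketch of the type-based path; the ID and platform paths follow the same pattern. (In the EC2 API, non-Windows instances simply omit the platform field, which is why the script uses "other".)

import boto3

client = boto3.client("ec2")

def stop_by_type(instance_type):
    # Find running instances of the given type, then stop them.
    reservations = client.describe_instances(
        Filters=[
            {"Name": "instance-type", "Values": [instance_type]},
            {"Name": "instance-state-name", "Values": ["running"]},
        ]
    )["Reservations"]
    instance_ids = [i["InstanceId"] for r in reservations for i in r["Instances"]]
    if instance_ids:
        client.stop_instances(InstanceIds=instance_ids)
    return instance_ids

print(stop_by_type("t2.micro"))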
If you have never used Amazon Web Services with Python before, you have to install two additional modules:
pip install boto3 botocore
or
pip3 install boto3 botocore
Save your AWS credentials in your home folder:
Linux:
/home/[username]/.aws
Windows:
C:\Users\[username]\.aws
For more information about the content of the .aws folder check the AWS documentation: Configuration and Credential Files.
Instead of creating the .aws folder manually you can use the AWS Command Line Interface:
After you've installed the AWS CLI, open PowerShell (or the Command Prompt) on Windows; on UNIX-like systems, open a shell. Then run the following command:
aws configure
Enter
- your AWS Access Key ID,
- your AWS Secret Access Key,
- your AWS Region (for example, us-east-1) as the default region name, and
- "json" as the default output format.
If you would like to contribute to this project, please reach out to me. Issues and pull requests are welcome too.
This project is licensed under the Apache License - see the LICENSE file for details