Terraform module to create a storage bucket on Google Cloud with default settings.
We eat, drink, sleep, and most importantly love DevOps. We are working towards strategies for standardizing architecture while ensuring security for the infrastructure. We are strong believers in the philosophy that bigger problems are always solved by breaking them into smaller, manageable ones. Resonating with microservices architecture, it is considered best practice to run databases, clusters, and storage as smaller, connected yet manageable pieces within the infrastructure.
This module is a combination of open-source Terraform code, automated tests, and examples. It helps you create and improve your infrastructure with minimal code instead of maintaining the whole infrastructure codebase yourself.
We have more than fifty Terraform modules. Some of them are completed and available for open-source use, while a few others are still in progress.
This module has a few dependencies:
- Terraform 1.x.x
- Go
- github.com/stretchr/testify/assert
- github.com/gruntwork-io/terratest/modules/terraform
IMPORTANT: Since the master branch used in the source varies based on new modifications, we suggest that you use the release versions listed here.
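As a sketch, you can pin the module either through the registry `version` argument (as in the example below) or with a `?ref=` tag on a Git source. The repository URL and the `1.0.0` tag in the following snippet are assumptions; confirm them against the module's releases before use.

```hcl
# A minimal sketch of pinning this module to a release tag via a Git source,
# as an alternative to the registry source shown in the example below.
# The repository URL and the "1.0.0" tag are assumptions; verify them first.
module "bucket" {
  source = "git::https://github.com/clouddrove/terraform-google-storage.git?ref=1.0.0"

  name        = "storage-bucket"
  environment = "test"
  label_order = ["name", "environment"]
  location    = "US"             # required input
  project_id  = "my-gcp-project" # hypothetical project ID
}
```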
Here are some examples of how you can use this module in your inventory structure:
module "bucket" {
source = "clouddrove/storage/google"
version = "1.0.0"
name = "storage-bucket"
environment = "test-main-bukcet"
label_order = ["name", "environment"]
location = "US"
project_id = "clouddrove"
#website
website = {
main_page_suffix = "index.html"
not_found_page = "404.html"
}
#logging
logging = {
log_bucket = module.bucket_logs.bucket.id
log_object_prefix = "gcs-log"
}
#cors
cors = [{
origin = ["http://image-store.com"]
method = ["GET", "HEAD", "PUT", "POST", "DELETE"]
response_header = ["*"]
max_age_seconds = 3600
}]
# versioning
versioning = true
#lifecycle_rules
lifecycle_rules = [{
action = {
type = "SetStorageClass"
storage_class = "NEARLINE"
}
condition = {
age = 60
created_before = "2018-08-20"
with_state = "LIVE"
matches_storage_class = ["STANDARD"]
num_newer_versions = 10
custom_time_before = "1970-01-01"
days_since_custom_time = 1
days_since_noncurrent_time = 1
noncurrent_time_before = "1970-01-01"
}
}]
}
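The logging block above references a separate `module.bucket_logs` instance that holds the access logs. A minimal sketch of that companion module is shown below; the name, environment, and project ID are illustrative assumptions.

```hcl
# A minimal sketch of the companion log bucket referenced by module.bucket_logs
# in the example above. All values here are illustrative assumptions.
module "bucket_logs" {
  source  = "clouddrove/storage/google"
  version = "1.0.0"

  name        = "storage-bucket-logs"
  environment = "test"
  label_order = ["name", "environment"]
  location    = "US"
  project_id  = "clouddrove"
}
```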
This module accepts the following input variables:

Name | Description | Type | Default | Required |
---|---|---|---|---|
attributes | Additional attributes (e.g. `1`). | `list(string)` | `[]` | no |
bucket_id | Used to find the parent resource to bind the IAM policy to. | `string` | `""` | no |
cors | The bucket's Cross-Origin Resource Sharing (CORS) configuration. Multiple blocks of this type are permitted. | `any` | `[]` | no |
default_event_based_hold | (Optional) Whether or not to automatically apply an `eventBasedHold` to new objects added to the bucket. | `bool` | `null` | no |
default_kms_key_name | The bucket's encryption configuration. | `string` | `null` | no |
enabled | Set to false to prevent the module from creating any resources. | `bool` | `true` | no |
environment | Environment (e.g. `prod`, `dev`, `staging`). | `string` | `""` | no |
force_destroy | When deleting a bucket, this boolean option will delete all contained objects. If you try to delete a bucket that contains objects, Terraform will fail that run. | `bool` | `true` | no |
google_storage_bucket_iam_member_enabled | Set to false to prevent the module from creating the `google_storage_bucket_iam_member` resource. | `bool` | `true` | no |
label_order | Label order, e.g. a sequence of application name and environment (`name`, `environment`, `attribute`), such as `["webserver", "qa", "devops", "public"]`. | `list(any)` | `[]` | no |
lifecycle_rules | The bucket's Lifecycle Rules configuration. | `any` | `[]` | no |
location | (Required) The GCS location. | `string` | n/a | yes |
logging | The bucket's Access & Storage Logs configuration. | `any` | `null` | no |
member | Identities that will be granted the privilege in `role`. | `string` | `""` | no |
name | Name (e.g. `app` or `cluster`). | `string` | `""` | no |
project_id | GCS Project ID. | `string` | `""` | no |
public_access_prevention | Prevents public access to a bucket. Acceptable values are `inherited` or `enforced`. If `inherited`, the bucket uses public access prevention only if the bucket is subject to the public access prevention organization policy constraint. Defaults to `inherited`. | `string` | `""` | no |
requester_pays | Enables Requester Pays on a storage bucket. | `string` | `true` | no |
retention_policy | Configuration of the bucket's data retention policy for how long objects in the bucket should be retained. | `any` | `null` | no |
storage_class | (Required if action type is `SetStorageClass`) The target Storage Class of objects affected by this Lifecycle Rule. Supported values include: STANDARD, MULTI_REGIONAL, REGIONAL, NEARLINE, COLDLINE, ARCHIVE. | `string` | `"STANDARD"` | no |
uniform_bucket_level_access | Enables uniform bucket-level access on a bucket. | `bool` | `true` | no |
versioning | The bucket's Versioning configuration. | `bool` | `true` | no |
website | Map of website values. Supported attributes: `main_page_suffix`, `not_found_page`. | `map(any)` | `null` | no |
The module exposes the following outputs:

Name | Description |
---|---|
bucket | All attributes of the created `google_storage_bucket` resource. |
id | GCS bucket id. |
name | GCS bucket name. |
self_link | URI of the GCS bucket. |
url | The base URL of the bucket, in the format `gs://<bucket-name>`. |
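For instance, these outputs can be re-exposed from the calling configuration. The sketch below assumes the `module "bucket"` block from the example above; the attribute names come from the outputs table.

```hcl
# Illustrative sketch: expose the module's outputs from the root configuration.
# "module.bucket" matches the usage example above.
output "bucket_name" {
  value = module.bucket.name
}

output "bucket_url" {
  value = module.bucket.url
}

output "bucket_self_link" {
  value = module.bucket.self_link
}
```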
Testing for this module is performed with Terratest. It creates a small piece of infrastructure, matches outputs such as the bucket ID and name, and then destroys that infrastructure in your Google Cloud account. The tests are written in Go, so you need a Go environment set up on your system.
You need to run the following command in the testing folder:
```bash
go test -run Test
```
If you come across a bug or have any feedback, please log it in our issue tracker, or feel free to drop us an email at hello@clouddrove.com.
If you have found it worth your time, go ahead and give us a ★ on our GitHub!
At CloudDrove, we offer expert guidance, implementation support and services to help organisations accelerate their journey to the cloud. Our services include Docker and container orchestration, cloud migration and adoption, infrastructure automation, application modernisation and remediation, and performance engineering.
We are The Cloud Experts!
We ❤️ Open Source and you can check out our other modules to get help with your new Cloud ideas.