
How To Configure Terraform AWS Backend With S3 And DynamoDB Table

Managing state with Terraform is crucial when multiple developers work on a project, with remote operations and sensitive data involved. Let’s see how to use an AWS backend with S3 and a DynamoDB table for remote state and locking in a Terraform project. Here, an S3 bucket and a folder are used as the primary location of the state file, and DynamoDB is used to maintain state locking, avoiding configuration conflicts when multiple or remote operations run against the infrastructure.

Personally, I create these resources with Terraform itself using my backend repository, which can be found here. Applying that Terraform configuration creates a DynamoDB table named “tf-remote-state-lock” with a “LockID” hash key to maintain a state lock while an “apply” is running against the environment. An S3 bucket is also created with a random number at the end of its name, “tc-remotestate-xxxx”, and the state file is saved inside a folder called “terraform-aws”. These resources can also be created manually if you prefer, or an existing S3 bucket or DynamoDB table can be used.
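If you prefer to create them by hand, the equivalent AWS CLI calls look roughly like this (a sketch; the bucket name, region, and billing mode here are assumptions, adjust them to your setup):

# create the state bucket (us-east-1 assumed; pick your own region)
aws s3 mb s3://tc-remotestate-xxxx --region us-east-1

# create the lock table with the LockID hash key Terraform expects
aws dynamodb create-table \
  --table-name tf-remote-state-lock \
  --attribute-definitions AttributeName=LockID,AttributeType=S \
  --key-schema AttributeName=LockID,KeyType=HASH \
  --billing-mode PAY_PER_REQUEST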

If you have multiple environments in the same AWS account, you can use the same DynamoDB table to maintain the state lock; you don’t need individual DynamoDB tables for different environments.
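For example, each environment can simply write its state to a different key in the bucket while sharing the same lock table; the backend keeps a separate lock entry per state path. A minimal sketch (the dev/prod key paths are hypothetical, not from the repository):

# dev environment backend config
terraform {
  backend "s3" {
    key = "terraform-aws/dev/terraform.tfstate"
  }
}

# prod environment backend config (same bucket and DynamoDB table, different key)
terraform {
  backend "s3" {
    key = "terraform-aws/prod/terraform.tfstate"
  }
}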


Here is my Terraform code (the “remotestate.tf” file only; refer to the repository for the “variable.tf” file):

provider "aws" {
  region  = "${var.aws_region}"
}
resource "random_id" "tc-rmstate" {
  byte_length = 2
}
resource "aws_s3_bucket" "tfrmstate" {
  bucket        = "tc-remotestate-${random_id.tc-rmstate.dec}"
  acl           = "private"
  force_destroy = true

  tags = {
    Name = "tf remote state"
  }
}

resource "aws_s3_bucket_object" "rmstate_folder" {
  bucket = "${aws_s3_bucket.tfrmstate.id}"
  key = "terraform-aws/"
}

resource "aws_dynamodb_table" "terraform_statelock" {
  name = "${var.aws_dynamodb_table}"
  read_capacity = 20
  write_capacity = 20
  hash_key = "LockID"

  attribute {
      name = "LockID"
      type = "S"
  }
}

Here is the S3 bucket created with the random number and the folder, after applying the Terraform configuration:

[Image: S3 bucket created with the random number and the terraform-aws folder]

Here is the DynamoDB table with the “LockID” key; initially the table is empty and no entries can be seen.

[Image: DynamoDB table with the LockID key, initially empty]

Now that we have set up the AWS backend to work with Terraform, this configuration should be added to your source infrastructure deployment code. What I usually do is create a separate file called “terraform.tf” with the config below, so I don’t need to add this code snippet to the source Terraform code.

terraform {
  backend "s3" {
    key = "terraform-aws/terraform.tfstate"
  }
}
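If you prefer not to pass the bucket and table names at init time, the backend block can also be fully specified; a minimal sketch, assuming the names created above and a placeholder region (adjust to your own):

terraform {
  backend "s3" {
    bucket         = "tc-remotestate-xxxx"
    key            = "terraform-aws/terraform.tfstate"
    region         = "us-east-1"            # assumption: replace with your bucket's region
    dynamodb_table = "tf-remote-state-lock"
  }
}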

When initializing the project, the “terraform init” command below should be used (update the generated random number in the command):

terraform init -backend-config="dynamodb_table=tf-remote-state-lock" -backend-config="bucket=tc-remotestate-xxxx"

It will initialize the backend so that the state is stored in the S3 bucket and locking is handled by the DynamoDB table.
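One thing to note: the S3 backend also needs to know the bucket’s region. If it cannot pick it up from your environment (for example from AWS_DEFAULT_REGION or your AWS profile), it can be passed the same way; a sketch with a placeholder region:

terraform init \
  -backend-config="dynamodb_table=tf-remote-state-lock" \
  -backend-config="bucket=tc-remotestate-xxxx" \
  -backend-config="region=us-east-1"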

[Image: terraform init output initializing the S3 backend]

When applying the Terraform configuration, it will check the state lock and acquire it if it is free. Once the lock is acquired the deployment proceeds; otherwise Terraform will not allow you to continue with the changes.
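If another run already holds the lock, Terraform exits with a state lock error instead of waiting. You can optionally tell it to retry for a while before giving up; a sketch:

# wait up to five minutes for the state lock instead of failing immediately
terraform apply -lock-timeout=5m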

[Image: state lock being acquired during terraform apply]

The “LockID” entry is created while the Terraform deployment runs; wait until the command execution completes to see the state file in your S3 bucket.

[Image: LockID entry shown in the DynamoDB table during the run]

The lock is released after completion, or if the deployment fails for any reason.
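If a run is killed before it can release the lock, the stale entry can be removed manually using the lock ID printed in the error message (use with care, and only when you are sure no other run is active); a sketch with a placeholder lock ID:

terraform force-unlock <LOCK_ID>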

[Image: lock released after the run completes]

At the same time, the state file is saved to the S3 bucket, inside the folder. There won’t be any local state copy; everything is saved in the bucket.
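To confirm that the state really lives in the bucket, you can list the object with the AWS CLI or read it back through Terraform itself; a sketch, assuming the bucket created earlier:

# the state object under the terraform-aws/ folder
aws s3 ls s3://tc-remotestate-xxxx/terraform-aws/

# resources read from the remote state
terraform state list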

[Image: state file stored inside the terraform-aws folder in the S3 bucket]

The state digest is also saved in the DynamoDB table; if you are updating the environment, a second lock entry will be listed below it.
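The digest (and any active lock entry) can be inspected directly with the AWS CLI; a sketch, assuming the table name created earlier:

aws dynamodb scan --table-name tf-remote-state-lock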

[Image: digest entry in the DynamoDB table]

I hope this article clearly explains how to use a Terraform AWS backend with S3 and DynamoDB for remote state files and locking for your Terraform infrastructure.


Aruna Fernando

"Sharing knowledge doesn't put your job at risk - iron sharpen iron" I heard this and it's true.
