AWS with Terraform (Day 03)

 

Provisioning My First Real Resource (S3 Bucket)

Today was exciting. After two days of learning fundamentals like providers, versioning, and workflow, Day-03 was finally hands-on — provisioning my first actual AWS resource using Terraform: an S3 bucket.

No console clicks. No manual setup. Just clean Infrastructure as Code. This is the moment where Terraform stops being theory and becomes real power.


Why Start with an S3 Bucket?

S3 is simple but fundamental. Almost every application touches S3 — for static hosting, logs, backups, artifacts, or data pipelines.

Starting with S3 makes sense:

  • Easy to understand

  • Clear outcome

  • Immediate feedback in AWS console

  • Perfect place to validate workflow

If IaC were learning to drive, this would be starting the engine for the first time.


Environment Setup Before Coding

Inside VS Code, I opened my Day03 folder from the challenge repo and copied my existing main.tf from Day-02. Terraform automatically loads every file in the working directory that ends in .tf, which keeps the layout flexible.
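
At this point the folder layout was roughly this (a sketch, not copied from the repo):

Day03/
└── main.tf    # copied over from Day-02, then customized below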

Before running Terraform, I ensured my AWS authentication was configured:

aws configure

Entered Access Key, Secret Key, Region, Output Format.
Tested it by listing buckets:

aws s3 ls

Credentials → secure and ready. Never hardcode them inside Terraform files.
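
Side note: instead of aws configure, the Terraform AWS provider also reads the standard AWS environment variables, so secrets never touch your .tf files. A minimal sketch with placeholder values:

# Standard AWS SDK environment variables; the Terraform AWS provider
# picks these up automatically, so nothing lands in version control.
export AWS_ACCESS_KEY_ID="<your-access-key>"
export AWS_SECRET_ACCESS_KEY="<your-secret-key>"
export AWS_DEFAULT_REGION="us-east-1"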


Writing the Resource Block – Creating the S3 Bucket

Went to the Terraform Registry → searched aws_s3_bucket → copied example structure and customized it.

My sample code:

provider "aws" { region = "us-east-1" } resource "aws_s3_bucket" "first_bucket" { bucket = "adnan-terraform-learning-bucket-demo"
tags = { Name = "MyFirstTerraformBucket" Environment = "dev" } }

A couple of things clicked for me here:

  • aws_s3_bucket = resource type

  • first_bucket = logical name for referencing (see the sketch just after this list)

  • Bucket name must be globally unique

  • Tags make infra readable in production
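
To make the "logical name" point concrete, here is a minimal sketch of referencing the bucket elsewhere in the config (the output name bucket_arn is my own example, not part of the original post):

output "bucket_arn" {
  # Reference pattern: <resource_type>.<logical_name>.<attribute>
  value = aws_s3_bucket.first_bucket.arn
}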


Executing the Terraform Workflow

Inside terminal:

Init

terraform init

Downloads the AWS provider plugin and prepares the working directory.
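
Abbreviated, the init output ends roughly like this (exact provider version will differ):

Initializing the backend...

Initializing provider plugins...
- Finding latest version of hashicorp/aws...
- Installing hashicorp/aws v5.x...

Terraform has been initialized successfully!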

Plan

terraform plan

A dry-run preview: it shows the + create action and the exact resource details.
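
Trimmed to the interesting lines, the plan for this config looks roughly like this:

  # aws_s3_bucket.first_bucket will be created
  + resource "aws_s3_bucket" "first_bucket" {
      + bucket = "adnan-terraform-learning-bucket-demo"
      ...
    }

Plan: 1 to add, 0 to change, 0 to destroy.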

Apply

terraform apply

Confirmed with yes and within seconds…
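
The output looks along these lines (timing approximate):

aws_s3_bucket.first_bucket: Creating...
aws_s3_bucket.first_bucket: Creation complete after 1s [id=adnan-terraform-learning-bucket-demo]

Apply complete! Resources: 1 added, 0 changed, 0 destroyed.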

S3 bucket created successfully in AWS Console 

There is something magical about seeing real infra appear from a few lines of code.


Understanding Change Detection & State

I updated a tag and ran plan again:

Name = "UpdatedBucketName"

Terraform showed:

~ update in-place
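
In fuller (still trimmed) form, the diff looks roughly like this:

  # aws_s3_bucket.first_bucket will be updated in-place
  ~ resource "aws_s3_bucket" "first_bucket" {
      ~ tags = {
          ~ "Name" = "MyFirstTerraformBucket" -> "UpdatedBucketName"
        }
      ...
    }

Plan: 0 to add, 1 to change, 0 to destroy.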

That ~ symbol now makes total sense: Terraform updates existing infra in place without recreating it.
The state file tracks everything; Terraform compares real infrastructure against the desired configuration.


Destroying Infrastructure

Cleanup was simple:

terraform destroy
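
After one more yes at the prompt, the output ends roughly with:

aws_s3_bucket.first_bucket: Destroying... [id=adnan-terraform-learning-bucket-demo]
aws_s3_bucket.first_bucket: Destruction complete after 1s

Destroy complete! Resources: 1 destroyed.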

Bucket gone. Infra clean.
From creation → update → deletion — full lifecycle understood.


Day-03 Key Takeaways

Concept           | Learning
------------------|--------------------------------------
Providers         | The real engine behind cloud actions
terraform init    | Prepares plugins and backend
terraform plan    | Safe preview of upcoming changes
terraform apply   | Provisions actual cloud resources
terraform destroy | Removes infra cleanly
State file        | Tracks reality vs configuration


What’s Next?

Day-04 moves into more AWS resources and dependencies, building on this foundational victory.
Excited to scale up from single resources to complex architectures.

This journey already feels powerful — and this is just Day-03.


Final Thought

Creating infrastructure with zero console clicks felt like the true spirit of DevOps —
automation over manual, repeatable over risky, code over chaos.

Terraform is already proving its value.

See you in Day-04 

Here is the video link:

