AWS with Terraform (Day 03)
Provisioning My First Real Resource (S3 Bucket)
Today was exciting. After two days of learning fundamentals like providers, versioning, and workflow, Day-03 was finally hands-on — provisioning my first actual AWS resource using Terraform: an S3 bucket.
No console clicks. No manual setup. Just clean Infrastructure as Code. This is the moment where Terraform stops being a theory and becomes real power.
Why Start with an S3 Bucket?
S3 is simple but fundamental. Almost every application touches S3 — for static hosting, logs, backups, artifacts, or data pipelines.
Starting with S3 makes sense:
- Easy to understand
- Clear outcome
- Immediate feedback in the AWS console
- Perfect place to validate the workflow
If IaC were learning to drive, this is starting the engine for the first time.
Environment Setup Before Coding
Inside VS Code, I opened my Day03 folder from the challenge repo and copied my existing main.tf from Day-02. Terraform detects any file ending with .tf, which keeps things flexible.
Before running Terraform, I made sure my AWS authentication was configured via the AWS CLI, entering the Access Key, Secret Key, default Region, and Output Format.
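For reference, the CLI setup looked like this (keys redacted; the region shown is just my choice):

```shell
aws configure
# AWS Access Key ID [None]:     <access-key-id>
# AWS Secret Access Key [None]: <secret-access-key>
# Default region name [None]:   us-east-1
# Default output format [None]: json
```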
Tested it by listing buckets:
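The check is a single command; a bucket list (or an empty result with no error) means the credentials work:

```shell
aws s3 ls
```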
Credentials secure and ready. Never hardcode them inside Terraform files.
Writing the Resource Block – Creating the S3 Bucket
I went to the Terraform Registry, searched for `aws_s3_bucket`, copied the example structure, and customized it.
My sample code:
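The exact snippet isn't reproduced here, so this is a reconstruction following the registry example; the bucket name and tag values are placeholders:

```hcl
# Creates an S3 bucket; "first_bucket" is the logical name used
# to reference this resource elsewhere in the configuration.
resource "aws_s3_bucket" "first_bucket" {
  bucket = "my-day03-demo-bucket-12345" # must be globally unique

  tags = {
    Name        = "Day03-Bucket"
    Environment = "Dev"
  }
}
```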
A couple of things clicked for me here:

- `aws_s3_bucket` = resource type
- `first_bucket` = logical name for referencing
- Bucket names must be globally unique
- Tags make infra readable in production
Executing the Terraform Workflow
Inside the terminal:
Init
Downloads the AWS provider plugin and prepares the working directory.
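The command itself:

```shell
terraform init
# On success, the output ends with:
# "Terraform has been successfully initialized!"
```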
Plan
A dry-run preview that shows the `+ create` action and the exact resource details.
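For a single new bucket, the summary line of the preview reads:

```shell
terraform plan
# Plan: 1 to add, 0 to change, 0 to destroy.
```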
Apply
Confirmed with yes and within seconds…
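The apply step, with its confirmation prompt:

```shell
terraform apply
# Type "yes" at the confirmation prompt.
# Final line: Apply complete! Resources: 1 added, 0 changed, 0 destroyed.
```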
S3 bucket created successfully in AWS Console
There is something magical about seeing real infra appear from a few lines of code.
Understanding Change Detection & State
I updated a tag and ran plan again:
Terraform showed:
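The plan output looked roughly like this — I'm assuming here that the changed tag was `Environment`; the exact diff depends on the edit:

```
Terraform will perform the following actions:

  # aws_s3_bucket.first_bucket will be updated in-place
  ~ resource "aws_s3_bucket" "first_bucket" {
      ~ tags = {
          ~ "Environment" = "Dev" -> "Staging"
            # (other attributes unchanged)
        }
    }

Plan: 0 to add, 1 to change, 0 to destroy.
```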
That `~` symbol now makes total sense: it means updating existing infra in place, without recreating it.
The state file tracks everything. Terraform compares the real state against the desired configuration.
Destroying Infrastructure
Cleanup was simple:
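One command tears it all down:

```shell
terraform destroy
# Type "yes" to confirm.
# Final line: Destroy complete! Resources: 1 destroyed.
```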
Bucket gone. Infra clean.
From creation → update → deletion — full lifecycle understood.
Day-03 Key Takeaways
| Concept | Learning |
|---|---|
| Providers | The real engine behind cloud actions |
| terraform init | Prepares plugins and backend |
| terraform plan | Safe preview of upcoming changes |
| terraform apply | Provisions actual cloud resources |
| terraform destroy | Removes infra cleanly |
| State file | Tracks reality vs configuration |
What’s Next?
Day-04 moves into more AWS resources and dependencies, building on this foundational victory.
Excited to scale up from single resources to complex architectures.
This journey already feels powerful — and this is just Day-03.
Final Thought
Creating infrastructure with zero console clicks felt like the true spirit of DevOps —
automation over manual, repeatable over risky, code over chaos.
Terraform is already proving its value.
See you in Day-04
Here is the video link: