Infrastructure as Code with Terraform: From Manual to Automated Cloud Management
Organizations using Infrastructure as Code (IaC) provision environments 90% faster and experience 60% fewer configuration-related outages than those managing infrastructure manually. Terraform has emerged as the dominant IaC tool, with over 3,000 providers supporting every major cloud platform and SaaS service.
This guide covers practical Terraform usage for web applications, ERP systems, and eCommerce platforms --- from your first resource definition to production-grade multi-environment deployments.
Key Takeaways
- Terraform makes infrastructure changes reviewable, testable, and reversible through version control
- Remote state management prevents conflicts when multiple engineers modify infrastructure
- Modules encapsulate reusable patterns, reducing configuration from hundreds of lines to a few parameters
- Terraform Cloud or CI/CD integration enforces plan-before-apply discipline for safe changes
Why Terraform for SMBs
The Manual Infrastructure Problem
Without IaC, your infrastructure knowledge lives in:
- AWS Console click paths that no one documented
- SSH commands run months ago that no one remembers
- Configuration files edited directly on servers
- One engineer's mental model of "how the network works"
With Terraform, your infrastructure lives in Git. Every change is a pull request. Every deployment is reproducible. Every engineer can understand the full picture.
Core Concepts
| Concept | Description |
|---|---|
| Provider | Plugin that interfaces with a cloud platform (AWS, GCP, Azure, Cloudflare) |
| Resource | A single infrastructure component (EC2 instance, RDS database, S3 bucket) |
| Data source | Read-only reference to existing infrastructure |
| Variable | Input parameter for reusable configuration |
| Output | Exported value from a Terraform configuration |
| State | Record of what Terraform manages and its current attributes |
| Module | Reusable group of resources with a defined interface |
First Terraform Configuration
AWS VPC and EC2 for a Web Application
```hcl
# providers.tf
terraform {
  required_version = ">= 1.7"

  required_providers {
    aws = {
      source  = "hashicorp/aws"
      version = "~> 5.0"
    }
  }

  backend "s3" {
    bucket         = "ecosire-terraform-state"
    key            = "production/terraform.tfstate"
    region         = "us-east-1"
    encrypt        = true
    dynamodb_table = "terraform-locks"
  }
}

provider "aws" {
  region = var.aws_region
}
```
```hcl
# variables.tf
variable "aws_region" {
  type    = string
  default = "us-east-1"
}

variable "environment" {
  type    = string
  default = "production"
}

variable "instance_type" {
  type    = string
  default = "t3.large"
}

# Referenced by the RDS instance in main.tf; supply via TF_VAR_db_password
variable "db_password" {
  type      = string
  sensitive = true
}
```
```hcl
# main.tf
resource "aws_vpc" "main" {
  cidr_block           = "10.0.0.0/16"
  enable_dns_hostnames = true
  enable_dns_support   = true

  tags = {
    Name        = "${var.environment}-vpc"
    Environment = var.environment
    ManagedBy   = "terraform"
  }
}

resource "aws_subnet" "public" {
  count                   = 2
  vpc_id                  = aws_vpc.main.id
  cidr_block              = "10.0.${count.index + 1}.0/24"
  availability_zone       = data.aws_availability_zones.available.names[count.index]
  map_public_ip_on_launch = true

  tags = {
    Name = "${var.environment}-public-${count.index + 1}"
  }
}

resource "aws_instance" "app" {
  ami                    = data.aws_ami.ubuntu.id
  instance_type          = var.instance_type
  subnet_id              = aws_subnet.public[0].id
  vpc_security_group_ids = [aws_security_group.app.id]
  key_name               = aws_key_pair.deploy.key_name

  root_block_device {
    volume_size = 50
    volume_type = "gp3"
    encrypted   = true
  }

  tags = {
    Name        = "${var.environment}-app"
    Environment = var.environment
  }
}

resource "aws_db_instance" "postgres" {
  identifier                = "${var.environment}-db"
  engine                    = "postgres"
  engine_version            = "17"
  instance_class            = "db.t3.medium"
  allocated_storage         = 50
  max_allocated_storage     = 200
  storage_encrypted         = true
  db_name                   = "ecosire"
  username                  = "app"
  password                  = var.db_password
  vpc_security_group_ids    = [aws_security_group.db.id]
  db_subnet_group_name      = aws_db_subnet_group.main.name
  backup_retention_period   = 7
  backup_window             = "03:00-04:00"
  maintenance_window        = "sun:04:00-sun:05:00"
  skip_final_snapshot       = false
  final_snapshot_identifier = "${var.environment}-db-final"

  tags = {
    Environment = var.environment
  }
}
```
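The configuration above references several supporting resources that are omitted for brevity: the availability-zones and AMI data sources, the security groups, the key pair, and the DB subnet group. A minimal sketch of the two data sources, assuming Canonical's official Ubuntu 22.04 AMIs (the remaining resources are still assumed to be defined elsewhere):

```hcl
# data.tf -- supporting lookups referenced by main.tf (a sketch)
data "aws_availability_zones" "available" {
  state = "available"
}

data "aws_ami" "ubuntu" {
  most_recent = true
  owners      = ["099720109477"] # Canonical's AWS account ID

  filter {
    name   = "name"
    values = ["ubuntu/images/hvm-ssd/ubuntu-jammy-22.04-amd64-server-*"]
  }
}
```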
Modules for Reusable Infrastructure
Creating a Web Application Module
```hcl
# modules/web-app/main.tf
variable "name" {
  type = string
}

variable "environment" {
  type = string
}

variable "instance_type" {
  type    = string
  default = "t3.medium"
}

variable "vpc_id" {
  type = string
}

variable "subnet_ids" {
  type = list(string)
}

resource "aws_lb" "app" {
  name               = "${var.name}-${var.environment}-alb"
  internal           = false
  load_balancer_type = "application"
  security_groups    = [aws_security_group.alb.id]
  subnets            = var.subnet_ids
}

resource "aws_lb_target_group" "app" {
  name     = "${var.name}-${var.environment}-tg"
  port     = 3000
  protocol = "HTTP"
  vpc_id   = var.vpc_id

  health_check {
    path                = "/health"
    healthy_threshold   = 2
    unhealthy_threshold = 3
    interval            = 30
  }
}

resource "aws_autoscaling_group" "app" {
  name                = "${var.name}-${var.environment}-asg"
  min_size            = 2
  max_size            = 10
  desired_capacity    = 2
  vpc_zone_identifier = var.subnet_ids
  target_group_arns   = [aws_lb_target_group.app.arn]

  launch_template {
    id      = aws_launch_template.app.id
    version = "$Latest"
  }

  tag {
    key                 = "Name"
    value               = "${var.name}-${var.environment}"
    propagate_at_launch = true
  }
}

output "alb_dns_name" {
  value = aws_lb.app.dns_name
}
```
Using the Module
```hcl
# environments/production/main.tf
module "web" {
  source        = "../../modules/web-app"
  name          = "ecosire-web"
  environment   = "production"
  instance_type = "t3.large"
  vpc_id        = module.network.vpc_id
  subnet_ids    = module.network.public_subnet_ids
}

module "api" {
  source        = "../../modules/web-app"
  name          = "ecosire-api"
  environment   = "production"
  instance_type = "t3.large"
  vpc_id        = module.network.vpc_id
  subnet_ids    = module.network.public_subnet_ids
}
```
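Module outputs flow back to the caller, so the root configuration can consume or re-export them. For example, a sketch that publishes the web ALB's hostname (the output name below is a hypothetical addition; module.web.alb_dns_name matches the module defined earlier):

```hcl
# environments/production/outputs.tf
output "web_url" {
  value = "http://${module.web.alb_dns_name}"
}
```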
State Management
Remote State with S3
```hcl
# Bootstrap: create the state bucket and DynamoDB table manually or with a separate config
resource "aws_s3_bucket" "terraform_state" {
  bucket = "ecosire-terraform-state"

  lifecycle {
    prevent_destroy = true
  }
}

resource "aws_s3_bucket_versioning" "terraform_state" {
  bucket = aws_s3_bucket.terraform_state.id

  versioning_configuration {
    status = "Enabled"
  }
}

resource "aws_dynamodb_table" "terraform_locks" {
  name         = "terraform-locks"
  billing_mode = "PAY_PER_REQUEST"
  hash_key     = "LockID"

  attribute {
    name = "LockID"
    type = "S"
  }
}
```
State locking via DynamoDB prevents two engineers from running terraform apply simultaneously, which could corrupt state.
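The bootstrap configuration above enables versioning but not encryption. A sketch of the matching encryption resource, assuming SSE-S3 (AES-256) is sufficient for your compliance needs:

```hcl
resource "aws_s3_bucket_server_side_encryption_configuration" "terraform_state" {
  bucket = aws_s3_bucket.terraform_state.id

  rule {
    apply_server_side_encryption_by_default {
      sse_algorithm = "AES256"
    }
  }
}
```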
State File Security
The Terraform state file contains sensitive information including database passwords, API keys, and resource IDs. Protect it:
- Encrypt at rest: S3 bucket versioning + server-side encryption
- Encrypt in transit: HTTPS only for state access
- Restrict access: IAM policies limiting who can read/write state
- Never commit to Git: State files must never be in version control
- Enable versioning: S3 versioning allows recovering from corrupted state
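The access restriction can be sketched as an IAM policy scoped to the state bucket and lock table (the region and account ID below are placeholders):

```hcl
# Grant only what Terraform needs to read/write state and take the lock
data "aws_iam_policy_document" "terraform_state_access" {
  statement {
    actions = ["s3:ListBucket", "s3:GetObject", "s3:PutObject"]
    resources = [
      "arn:aws:s3:::ecosire-terraform-state",
      "arn:aws:s3:::ecosire-terraform-state/*",
    ]
  }

  statement {
    actions   = ["dynamodb:GetItem", "dynamodb:PutItem", "dynamodb:DeleteItem"]
    resources = ["arn:aws:dynamodb:us-east-1:123456789012:table/terraform-locks"]
  }
}

resource "aws_iam_policy" "terraform_state_access" {
  name   = "terraform-state-access"
  policy = data.aws_iam_policy_document.terraform_state_access.json
}
```

Attach this policy only to the CI role and the engineers who run Terraform.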
CI/CD Integration
GitHub Actions Terraform Pipeline
```yaml
name: Terraform

on:
  pull_request:
    paths: ['infrastructure/**']
  push:
    branches: [main]
    paths: ['infrastructure/**']

jobs:
  plan:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: hashicorp/setup-terraform@v3
      - name: Terraform Init
        run: terraform init
        working-directory: infrastructure/environments/production
      - name: Terraform Plan
        run: |
          terraform plan -out=tfplan
          terraform show -no-color tfplan > tfplan.txt
        working-directory: infrastructure/environments/production
      - name: Comment PR with plan
        if: github.event_name == 'pull_request'
        uses: actions/github-script@v7
        with:
          script: |
            const plan = require('fs').readFileSync('infrastructure/environments/production/tfplan.txt', 'utf8');
            github.rest.issues.createComment({
              issue_number: context.issue.number,
              owner: context.repo.owner,
              repo: context.repo.repo,
              body: `## Terraform Plan\n\`\`\`\n${plan}\n\`\`\``
            });

  apply:
    needs: plan
    if: github.ref == 'refs/heads/main'
    runs-on: ubuntu-latest
    environment: production
    steps:
      - uses: actions/checkout@v4
      - uses: hashicorp/setup-terraform@v3
      - name: Terraform Init
        run: terraform init
        working-directory: infrastructure/environments/production
      - name: Terraform Apply
        run: terraform apply -auto-approve
        working-directory: infrastructure/environments/production
```
Multi-Environment Strategy
| Environment | Purpose | Instance Sizes | Cost Target |
|---|---|---|---|
| Development | Feature testing | t3.micro / t3.small | <$100/month |
| Staging | Pre-production validation | Mirrors production (smaller) | ~30% of production |
| Production | Live traffic | Right-sized for load | Optimized |
Use Terraform workspaces or separate directories per environment:
```
infrastructure/
  modules/
    web-app/
    database/
    network/
  environments/
    development/
      main.tf
      terraform.tfvars
    staging/
      main.tf
      terraform.tfvars
    production/
      main.tf
      terraform.tfvars
```
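With this layout, each environment differs only in its variable values. A sketch of a development tfvars file (values are illustrative):

```hcl
# environments/development/terraform.tfvars
aws_region    = "us-east-1"
environment   = "development"
instance_type = "t3.micro"
```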
Frequently Asked Questions
Terraform or Pulumi --- which should we choose?
Terraform if your team includes operations engineers who prefer declarative configuration. Pulumi if your team is developer-heavy and prefers writing infrastructure in TypeScript or Python. Terraform has a larger ecosystem and more community modules. Pulumi has a steeper initial learning curve but is more flexible for complex logic.
How do we import existing infrastructure into Terraform?
Use terraform import to bring existing resources under Terraform management. For example: terraform import aws_instance.app i-1234567890abcdef0. After importing, write the matching configuration. Terraform 1.5+ supports import blocks in configuration files for bulk imports.
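With Terraform 1.5+, the same import can be declared in configuration and applied alongside other changes (reusing the example instance ID from above):

```hcl
import {
  to = aws_instance.app
  id = "i-1234567890abcdef0"
}
```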
How do we handle secrets in Terraform?
Never commit secrets to Terraform files. Use terraform.tfvars (excluded from Git), environment variables (TF_VAR_db_password), or a secrets manager (AWS Secrets Manager, HashiCorp Vault). Mark sensitive variables with sensitive = true to prevent them from appearing in plan output.
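A sketch of the secrets-manager approach, assuming the password is stored under a hypothetical secret name in AWS Secrets Manager:

```hcl
data "aws_secretsmanager_secret_version" "db_password" {
  secret_id = "production/db-password" # hypothetical secret name
}

resource "aws_db_instance" "postgres" {
  # ... other arguments as shown earlier ...
  password = data.aws_secretsmanager_secret_version.db_password.secret_string
}
```

Note that the value still ends up in the state file, which is another reason to encrypt and restrict access to state.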
What is the cost of managing Terraform?
Terraform itself is free to use (since version 1.6 it is source-available under the Business Source License rather than open source). HCP Terraform (formerly Terraform Cloud) offers a free tier that includes remote state and plan/apply runs. The main cost is the learning curve (20-40 hours for an experienced engineer) and ongoing maintenance (2-4 hours per month), which is offset by the time saved on manual infrastructure management.
What Comes Next
Terraform provides the foundation for automated infrastructure. Combine it with CI/CD pipelines for automated deployment, monitoring for operational visibility, and disaster recovery for resilience.
Contact ECOSIRE for infrastructure automation consulting, or explore our DevOps guide for small businesses for the complete roadmap.
Published by ECOSIRE -- helping businesses automate cloud infrastructure.
Written by
ECOSIRE Research and Development Team
Building enterprise-grade digital products at ECOSIRE. Sharing insights on Odoo integrations, e-commerce automation, and AI-powered business solutions.