akhil mittal

Building an EKS Cluster with Terraform: A Modular and Scalable Approach

Below is a production-standard Terraform folder structure for provisioning EKS across multiple environments with reusable modules. The layout keeps each environment's configuration and state separate while sharing the same module code. I'll also walk through the steps and pipeline code for deploying this infrastructure with both GitHub Actions and Jenkins.

Production-Standard Terraform Folder Structure

terraform/
├── modules/                     # Reusable modules
│   ├── eks/                     # EKS module
│   │   ├── main.tf              # EKS resource definitions
│   │   ├── variables.tf         # Input variables
│   │   ├── outputs.tf           # Output values
│   │   └── versions.tf          # Required providers and versions
│   └── vpc/                     # VPC module
│       ├── main.tf              # VPC resource definitions
│       ├── variables.tf         # Input variables
│       ├── outputs.tf           # Output values
│       └── versions.tf          # Required providers and versions
├── envs/                        # Per-environment configurations
│   ├── dev/                     # Development environment
│   │   ├── main.tf              # Environment-specific resources
│   │   ├── variables.tf         # Environment-specific variables
│   │   ├── backend.tf           # S3/DynamoDB backend for Terraform state
│   │   └── tfvars.json          # Environment-specific variable values
│   ├── staging/                 # Staging environment
│   │   ├── main.tf
│   │   ├── variables.tf
│   │   ├── backend.tf
│   │   └── tfvars.json
│   └── prod/                    # Production environment
│       ├── main.tf
│       ├── variables.tf
│       ├── backend.tf
│       └── tfvars.json
└── pipelines/                   # CI/CD pipelines for Terraform
    ├── github-actions.yml       # GitHub Actions workflow
    └── jenkins-pipeline.groovy  # Jenkins pipeline script
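
Each module also carries a versions.tf that pins its providers. A minimal sketch for modules/eks/versions.tf, assuming the AWS provider and the Terraform version pinned in the pipelines below (the exact constraints are placeholders):

terraform {
  required_version = ">= 1.5.0"

  required_providers {
    aws = {
      source  = "hashicorp/aws"
      version = "~> 5.0" # assumed constraint; pin to the version you test against
    }
  }
}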

Key Files

1. modules/eks/main.tf

Defines the EKS control plane and a managed node group:

# EKS control plane
resource "aws_eks_cluster" "this" {
  name     = var.cluster_name
  role_arn = var.cluster_role_arn

  vpc_config {
    subnet_ids = var.subnet_ids
  }
}

# Managed node group attached to the cluster
resource "aws_eks_node_group" "this" {
  cluster_name    = aws_eks_cluster.this.name
  node_group_name = "${var.cluster_name}-nodes" # optional, but keeps the node group name predictable
  node_role_arn   = var.node_role_arn
  subnet_ids      = var.subnet_ids

  scaling_config {
    desired_size = var.desired_size
    max_size     = var.max_size
    min_size     = var.min_size
  }
}
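
The module's inputs and outputs aren't shown in the post; a minimal sketch of modules/eks/variables.tf and modules/eks/outputs.tf that lines up with the module call in envs/dev/main.tf could look like this (the output names are my own choice):

# modules/eks/variables.tf
variable "cluster_name"     { type = string }
variable "cluster_role_arn" { type = string }
variable "node_role_arn"    { type = string }
variable "subnet_ids"       { type = list(string) }
variable "desired_size"     { type = number }
variable "max_size"         { type = number }
variable "min_size"         { type = number }

# modules/eks/outputs.tf
output "cluster_name" {
  value = aws_eks_cluster.this.name
}

output "cluster_endpoint" {
  # API server endpoint, useful for configuring kubectl
  value = aws_eks_cluster.this.endpoint
}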

2. envs/dev/main.tf

Calls the EKS module and other required modules (e.g., VPC):

module "vpc" {
  source      = "../../modules/vpc"
  cidr_block  = var.cidr_block
  env         = var.env
}

module "eks" {
  source            = "../../modules/eks"
  cluster_name      = var.cluster_name
  cluster_role_arn  = var.cluster_role_arn
  node_role_arn     = var.node_role_arn
  subnet_ids        = module.vpc.subnet_ids
  desired_size      = var.desired_size
  max_size          = var.max_size
  min_size          = var.min_size
}
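
The VPC module is consumed through module.vpc.subnet_ids, so it has to expose that output. A minimal sketch of modules/vpc/outputs.tf, assuming the module creates its subnets as aws_subnet.private with count (the resource name is hypothetical):

# modules/vpc/outputs.tf
output "subnet_ids" {
  # Subnets the EKS cluster and node group are placed into
  value = aws_subnet.private[*].id
}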

3. envs/dev/backend.tf

Defines remote backend configuration for state storage:

terraform {
  backend "s3" {
    bucket         = "your-terraform-state-bucket"
    key            = "eks/dev/terraform.tfstate"
    region         = "us-east-1"
    dynamodb_table = "terraform-lock-table"
    encrypt        = true
  }
}
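
The bucket and lock table must already exist before terraform init runs (see Key Considerations below). A one-time bootstrap with the AWS CLI, reusing the names from backend.tf above, might look like:

aws s3api create-bucket \
  --bucket your-terraform-state-bucket \
  --region us-east-1

aws dynamodb create-table \
  --table-name terraform-lock-table \
  --attribute-definitions AttributeName=LockID,AttributeType=S \
  --key-schema AttributeName=LockID,KeyType=HASH \
  --billing-mode PAY_PER_REQUEST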

4. envs/dev/tfvars.json

Environment-specific values:

{
  "cluster_name": "eks-dev",
  "cidr_block": "10.0.0.0/16",
  "desired_size": 2,
  "max_size": 5,
  "min_size": 1,
  "env": "dev",
  "cluster_role_arn": "arn:aws:iam::123456789012:role/EKS-Cluster-Role",
  "node_role_arn": "arn:aws:iam::123456789012:role/EKS-Node-Role"
}
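
Before wiring up CI, you can run the same workflow locally from the environment directory:

cd terraform/envs/dev
terraform init
terraform plan -var-file=tfvars.json
terraform apply -var-file=tfvars.json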

Steps to Deploy Using GitHub Actions

1. GitHub Actions Workflow (pipelines/github-actions.yml)

name: Deploy Terraform

on:
  push:
    branches:
      - main

jobs:
  terraform:
    runs-on: ubuntu-latest

    steps:
    - name: Checkout repository
      uses: actions/checkout@v2

    - name: Setup Terraform
      uses: hashicorp/setup-terraform@v2
      with:
        terraform_version: 1.5.0

    - name: Terraform Init
      run: |
        cd terraform/envs/dev
        terraform init

    - name: Terraform Plan
      run: |
        cd terraform/envs/dev
        terraform plan -var-file=tfvars.json

    - name: Terraform Apply
      if: github.ref == 'refs/heads/main'
      run: |
        cd terraform/envs/dev
        terraform apply -auto-approve -var-file=tfvars.json
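
Note that the workflow above never configures AWS credentials, which terraform init needs in order to reach the S3 backend. One common approach is to add an aws-actions/configure-aws-credentials step before "Terraform Init"; the secret names below are assumptions for your repository:

    - name: Configure AWS credentials
      uses: aws-actions/configure-aws-credentials@v4
      with:
        # Repository secrets; the names are placeholders you create yourself
        aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
        aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
        aws-region: us-east-1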

Steps to Deploy Using Jenkins

2. Jenkins Pipeline Script (pipelines/jenkins-pipeline.groovy)

pipeline {
    agent any
    environment {
        // Standard AWS credential variables picked up by the AWS provider and the S3 backend
        AWS_ACCESS_KEY_ID     = credentials('aws-access-key-id')
        AWS_SECRET_ACCESS_KEY = credentials('aws-secret-access-key')
    }
    stages {
        stage('Checkout') {
            steps {
                checkout scm
            }
        }
        stage('Terraform Init') {
            steps {
                sh '''
                cd terraform/envs/dev
                terraform init
                '''
            }
        }
        stage('Terraform Plan') {
            steps {
                sh '''
                cd terraform/envs/dev
                terraform plan -var-file=tfvars.json
                '''
            }
        }
        stage('Terraform Apply') {
            steps {
                input "Apply changes?"
                sh '''
                cd terraform/envs/dev
                terraform apply -auto-approve -var-file=tfvars.json
                '''
            }
        }
    }
}

How to Use

With GitHub Actions:

  1. Push changes to the main branch to trigger the workflow.
  2. Review the GitHub Actions logs for progress.

With Jenkins:

  1. Trigger the pipeline manually or on code commits.
  2. Approve the Terraform Apply stage if required.

Key Considerations

  1. State Management:
    • Ensure the S3 bucket and DynamoDB table for state locking are created beforehand.
  2. Environment Variables:
    • Use environment variables for sensitive information like AWS keys.
  3. Separate Environments:
    • Use a different backend.tf (at minimum, a different state key) for dev, staging, and prod, as shown in the sketch after this list.
  4. IAM Roles:
    • Attach least-privilege IAM roles for Terraform execution.
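
For example, envs/prod/backend.tf would differ from the dev backend only in the state key (and possibly the bucket), so each environment keeps isolated state:

terraform {
  backend "s3" {
    bucket         = "your-terraform-state-bucket"
    key            = "eks/prod/terraform.tfstate" # per-environment state key
    region         = "us-east-1"
    dynamodb_table = "terraform-lock-table"
    encrypt        = true
  }
}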

This setup gives you a robust, modular approach to managing EKS infrastructure across environments. Let me know in the comments if you have questions!
