Delete EBS Volume Using Boto3 Script

Managing resources in the AWS cloud infrastructure is a critical task for any AWS user. As part of this process, it is essential to have the ability to efficiently delete unused resources to optimize costs and maintain a clean environment. In this blog post, we will explore how to delete Elastic Block Store (EBS) volumes using a Python script with the Boto3 library, a powerful tool for interacting with AWS services programmatically. We will build upon the provided script and guide you through the process of deleting EBS volumes in the "available" state, ensuring a streamlined infrastructure management workflow.

Why Deleting Unused EBS Volumes is Important:
As your AWS infrastructure grows, you may find yourself with unused EBS volumes that are no longer needed. These volumes can consume storage space and incur costs without providing any value to your applications. By regularly identifying and removing these unused EBS volumes, you can optimize costs, simplify your infrastructure, and maintain better control over your AWS resources.

In the realm of cloud computing, managing and interacting with AWS resources is a crucial aspect of building and maintaining a robust infrastructure. Enter Boto3, the official AWS SDK for Python, which provides a powerful and intuitive way to interact with AWS services programmatically. Boto3 acts as a bridge between your Python applications and the vast array of AWS resources, enabling you to automate tasks, integrate with services, and streamline your operations.
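To give a feel for how little code that takes, here is a minimal, illustrative example (it assumes your AWS credentials are already configured, for example via the AWS CLI or environment variables) that lists the EBS volumes in one region:

import boto3

# Minimal example: create a low-level EC2 client and list volumes in one region
ec2_client = boto3.client('ec2', region_name='us-east-1')

for vol in ec2_client.describe_volumes()['Volumes']:
    print(vol['VolumeId'], vol['State'], vol['Size'])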

Boto3 has become the go-to choice for developers and system administrators working with AWS because it simplifies resource management and accelerates cloud development. Whether you are a seasoned AWS professional or just getting started with cloud computing, understanding its capabilities will help you harness the full potential of AWS services, and it is all we need for the task at hand: cleaning up unused EBS volumes.

Using Boto3 to Delete EBS Volumes:
Boto3, the AWS SDK for Python, provides a comprehensive set of tools and functionalities to interact with AWS services. Leveraging the power of Boto3, we can automate the process of deleting EBS volumes through a simple Python script.

import boto3
import csv

# AWS region to operate in; change this to match where your volumes live
region_name = 'us-east-1'

# Create a session bound to that region (credentials are picked up from your
# environment, shared credentials file, or IAM role)
session = boto3.Session(region_name=region_name)

# High-level EC2 resource object
ec2_resource = session.resource('ec2')

# Collect all EBS volumes in the "available" state (i.e. not attached to any instance)
volumes = ec2_resource.volumes.filter(Filters=[{'Name': 'status', 'Values': ['available']}])

This code demonstrates the usage of the Boto3 library, the Python SDK for interacting with Amazon Web Services (AWS), focusing on EC2 (Elastic Compute Cloud) resources. First, it imports the two modules it needs: boto3, to communicate with AWS, and csv, to write the report file later on.

The region_name variable is set to 'us-east-1', the AWS region where the EC2 resources are located; change it to match your own environment. A Boto3 session is then created with the Session class, initialized with that region, and acts as the foundation for the subsequent AWS operations. From the session, an EC2 resource object is instantiated with the resource() method; it represents the EC2 service and exposes the EC2-related operations we need.

Finally, the code filters the EBS volumes with the volumes.filter() method, asking only for volumes in the "available" state (that is, volumes not attached to any instance), and stores the result in the volumes variable. At this point we have a collection of EBS (Elastic Block Store) volumes matching the filter criteria. The snippet does not act on them yet, but the collection can be iterated over to perform further actions or to extract information about the volumes.
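For instance, before doing anything destructive you can simply iterate over the filtered collection to see what matched. A read-only sanity check, assuming the same session and filter as above, might look like this:

# Read-only sanity check: list the volumes that matched the "available" filter.
# Nothing here modifies or deletes any resource.
for volume in volumes:
    print(f"{volume.id} | {volume.size} GiB | {volume.availability_zone} | created {volume.create_time}")

Because Boto3 collections are lazy, iterating here simply issues a DescribeVolumes call behind the scenes, and the same volumes object can be reused by the deletion loop that follows.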

deleted_volumes = []
deleted_snapshots = []  # (volume ID, snapshot ID) pairs for snapshots tied to deleted volumes

for volume in volumes:
    try:
        # Record the IDs of the snapshots associated with the volume.
        # Note: deleting a volume does NOT delete its snapshots; the IDs are
        # captured here only for the report.
        snapshot_ids = [snapshot.id for snapshot in volume.snapshots.all()]

        # Delete the volume
        volume.delete()
        deleted_volumes.append(volume.id)

        # Pair each snapshot with the volume it belonged to so the CSV rows line up
        deleted_snapshots.extend((volume.id, snapshot_id) for snapshot_id in snapshot_ids)

    except Exception as e:
        print(f"Error deleting volume {volume.id}: {str(e)}")

# Print the list of deleted volumes
print("Deleted volumes:")
for volume_id in deleted_volumes:
    print(volume_id)

# Create a CSV file recording each deleted volume and its associated snapshots
with open("deleted_snapshots.csv", 'w', newline='') as csvfile:
    csvwriter = csv.writer(csvfile)
    csvwriter.writerow(['VolumeID', 'SnapshotID'])

    # Write one row per (volume ID, snapshot ID) pair
    for volume_id, snapshot_id in deleted_snapshots:
        csvwriter.writerow([volume_id, snapshot_id])

This code extends the previous snippet and performs the actual deletion. It starts by initializing two empty lists: deleted_volumes, which will hold the IDs of the volumes that are removed, and deleted_snapshots, which will hold (volume ID, snapshot ID) pairs for the snapshots associated with those volumes. Keep in mind that deleting a volume does not delete its snapshots; the snapshot IDs are recorded purely for reporting.

The loop then processes each volume in the volumes collection inside a try-except block. For every volume, it first collects the IDs of the associated snapshots by iterating over volume.snapshots.all() with a list comprehension. It then calls the delete() method on the volume object, appends the volume's ID to deleted_volumes, and extends deleted_snapshots with one (volume ID, snapshot ID) pair per snapshot, so each snapshot stays tied to the volume it belonged to. If an exception occurs during deletion, the except block prints an error message containing the volume ID and the error details, and the loop moves on to the next volume.

After all volumes have been processed, the script prints the list of deleted volume IDs. Finally, it creates a CSV file named "deleted_snapshots.csv" using open() in write mode, wraps it in a csv.writer object called csvwriter, writes a header row with the column names 'VolumeID' and 'SnapshotID', and then writes one row per (volume ID, snapshot ID) pair collected earlier.
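If you would rather keep the report in S3 than on local disk, a couple of extra lines using the same session are enough. This is only a sketch: the bucket name below is a placeholder that you would replace with a bucket you own, and your credentials need write access to it.

# Optional: upload the CSV report to an S3 bucket using the same session.
# "my-ebs-cleanup-reports" is a placeholder; replace it with your own bucket name.
s3_client = session.client('s3')
s3_client.upload_file("deleted_snapshots.csv", "my-ebs-cleanup-reports", "deleted_snapshots.csv")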

Benefits of Automating EBS Volume Deletion:
Automating the deletion of unused EBS volumes brings several benefits to your AWS infrastructure management workflow:

Cost Optimization: By removing unused EBS volumes, you can eliminate unnecessary storage costs and optimize your overall AWS spending.

Simplified Resource Management: Deleting unused volumes reduces clutter and simplifies your infrastructure, making it easier to manage and maintain.

Improved Efficiency: Automating the deletion process using Boto3 eliminates manual, time-consuming tasks, allowing you to focus on more critical aspects of your AWS infrastructure.

Conclusion:
Efficiently managing your AWS infrastructure includes regularly cleaning up unused resources. Deleting EBS volumes that are no longer needed is a crucial step in optimizing costs and maintaining a streamlined environment. With Boto3 and the provided script as a reference, you can automate the process of deleting EBS volumes, making infrastructure management simpler and more efficient.

By embracing automation and leveraging the power of Boto3, you can ensure that your AWS resources are clean, up-to-date, and aligned with your application requirements. Start implementing these practices today to enhance your AWS infrastructure management workflow and maximize the benefits of the cloud.
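Before pointing the script at a real account, it is worth previewing what it would remove. EC2 supports a DryRun flag for exactly this; the rough sketch below reuses the volumes collection from earlier and should be run before the actual deletion loop. AWS signals a successful dry run by raising a DryRunOperation error, which is why that specific error code is treated as the "would be deleted" case.

from botocore.exceptions import ClientError

# Preview pass: DryRun=True checks permissions and what would happen,
# without actually deleting anything. Run this before the real deletion loop.
for volume in volumes:
    try:
        volume.delete(DryRun=True)
    except ClientError as e:
        if e.response['Error']['Code'] == 'DryRunOperation':
            print(f"{volume.id} would be deleted")
        else:
            print(f"{volume.id}: {e.response['Error']['Code']}")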

Remember, always exercise caution when deleting resources and double-check that the volumes you are deleting are no longer needed. Happy automating!
