Introduction
Ever found yourself in a situation where you needed a large amount of dummy data for testing, but didn't want to spend hours writing scripts or manually inserting records? Or perhaps you've been curious about how to leverage new features in MySQL 8.0 to streamline your database tasks? Well, you're in for a treat! In this post, we'll explore how to use Common Table Expressions (CTEs) to effortlessly generate and insert vast amounts of dummy data into your MySQL database.
Imagine needing to populate a table with a million hash values for load testing or performance benchmarking. Sounds like a nightmare, right? Not anymore! With the advent of CTEs in MySQL 8.0, you can achieve this in a matter of seconds. Let's dive into how this works and how you can use this powerful feature to simplify your data generation needs.
TL;DR
Common Table Expressions (CTEs), a feature added in MySQL 8.0, make it easy to insert large amounts of simple dummy data. For example, if you want to insert 1 million dummy rows into a table called `hashes` that stores hash values, you can do so with the following steps:
Table Definitions
First, create the table:
```sql
CREATE TABLE hashes (
  id INT PRIMARY KEY AUTO_INCREMENT,
  hash CHAR(64)
);
```
Query Execution
Set the session variable to allow a higher recursion depth:
```sql
SET SESSION cte_max_recursion_depth = 1000000;
```
Then, execute the CTE to insert 1 million rows:
```sql
INSERT INTO hashes(hash)
WITH RECURSIVE cte (n) AS
(
  SELECT 1
  UNION ALL
  SELECT n + 1 FROM cte WHERE n < 1000000
)
SELECT SHA2(n, 256) FROM cte;
```
This method leverages a recursive common table expression to generate the dummy data.
Understanding CTEs
A Common Table Expression (CTE) is a named temporary result set that can be referenced within a single statement multiple times. CTEs are particularly useful for simplifying complex queries and improving readability.
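As a quick illustration of the non-recursive form, a CTE simply names an intermediate result so you can reference it like a table. This sketch reuses the `hashes` table from this post (the CTE name `recent_hashes` is just illustrative):

```sql
-- Name a result set once, then query it by name.
WITH recent_hashes AS (
  SELECT id, hash
  FROM hashes
  ORDER BY id DESC
  LIMIT 10
)
SELECT COUNT(*) FROM recent_hashes;
```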
Syntax Breakdown
Setting Recursion Depth
```sql
SET SESSION cte_max_recursion_depth = 1000000;
```
The `cte_max_recursion_depth` system variable sets the upper limit on CTE recursion. By default it is 1000, so to recurse further you need to raise it. Here, we set it to 1 million.
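You can inspect the current limit before changing it, and note that `SET SESSION` affects only the current connection:

```sql
-- Inspect the current session limit (defaults to 1000).
SELECT @@SESSION.cte_max_recursion_depth;

-- Raise it for this session only; other connections are unaffected.
SET SESSION cte_max_recursion_depth = 1000000;
```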
The CTE Query
```sql
INSERT INTO hashes(hash)
WITH RECURSIVE cte (n) AS
(
  SELECT 1
  UNION ALL
  SELECT n + 1 FROM cte WHERE n < 1000000
)
SELECT SHA2(n, 256) FROM cte;
```
Let's break down this query:
- `WITH RECURSIVE cte (n)`: This starts the CTE definition. `cte` is the name of the temporary result set, and `n` is its column.
- `SELECT 1`: This is the non-recursive part of the CTE, serving as the starting point (initial value).
- `UNION ALL SELECT n + 1 FROM cte WHERE n < 1000000`: This is the recursive part, which increments the value of `n` by 1 until it reaches 1,000,000.
- `SELECT SHA2(n, 256) FROM cte`: This final part of the query selects the SHA-256 hash of each `n` value, generating the dummy data for insertion.
How It Works
The CTE recursively generates the numbers 1 through 1,000,000. The outer query computes the SHA-256 hash of each number, and the `INSERT` writes the results into the `hashes` table. This approach is efficient and leverages MySQL's recursive capabilities to handle large data volumes seamlessly.
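The same pattern adapts to other kinds of dummy data. As a small sketch (a 1,000-row variant, not from the original post), you can hash `RAND()` instead of the counter to get random-looking values rather than a deterministic sequence:

```sql
-- Sketch: random-looking hashes instead of hashes of a counter.
-- RAND() returns a new pseudo-random float per row, so each hash differs.
INSERT INTO hashes(hash)
WITH RECURSIVE cte (n) AS
(
  SELECT 1
  UNION ALL
  SELECT n + 1 FROM cte WHERE n < 1000
)
SELECT SHA2(RAND(), 256) FROM cte;
```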
Processing Speed
Verification Environment
To understand the impact of this feature, I used a Gitpod Enterprise workspace, leveraging the powerful and ephemeral environment to avoid the hassle of setups and installations. Here’s a glance at the setup:
- Machine: Gitpod Enterprise XXLarge workspace
- OS: Ubuntu 22.04.4 LTS (Jammy Jellyfish)
- Containerization: Docker version 26.0.1
- MySQL Version: Official MySQL 8.0 Docker image
Results
For 1 million rows, the query execution time is approximately 4.43 seconds:
```sql
mysql> INSERT INTO hashes(hash)
    -> WITH RECURSIVE cte (n) AS
    -> (
    ->   SELECT 1
    ->   UNION ALL
    ->   SELECT n + 1 FROM cte WHERE n < 1000000
    -> )
    -> SELECT SHA2(n, 256) FROM cte;
Query OK, 1000000 rows affected (4.43 sec)
Records: 1000000  Duplicates: 0  Warnings: 0
```
Performance Metrics
| Number of Rows | Execution Time |
|---|---|
| 1,000 | 0.03 sec |
| 10,000 | 0.07 sec |
| 100,000 | 0.42 sec |
| 1,000,000 | 4.43 sec |
| 10,000,000 | 48.53 sec |
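If you would rather not raise the recursion limit at all, one common alternative (a sketch, not benchmarked here) is to cross-join a small sequence CTE with itself, so the row count multiplies while the recursion stays within the default depth of 1000:

```sql
-- 1,000 x 1,000 = 1,000,000 rows with only 1,000 recursion steps,
-- so the default cte_max_recursion_depth of 1000 is never exceeded.
INSERT INTO hashes(hash)
WITH RECURSIVE seq (n) AS
(
  SELECT 1
  UNION ALL
  SELECT n + 1 FROM seq WHERE n < 1000
)
SELECT SHA2((a.n - 1) * 1000 + b.n, 256)
FROM seq AS a
CROSS JOIN seq AS b;
```

The expression `(a.n - 1) * 1000 + b.n` maps each pair to a unique number from 1 to 1,000,000, so the inserted hashes match the single-CTE version.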
Conclusion
Using CTEs in MySQL 8.0 is a game-changer for generating large amounts of dummy data quickly. It's especially handy for load testing and performance benchmarking. With just a few lines of code, you can effortlessly populate your tables and then get back to the other important parts of your project.
Don't hesitate to play around with different data generation strategies and recursion depths to see what works best for you. For more tips and insights on security and log analysis, follow me on Twitter @Siddhant_K_code and stay updated with the latest & detailed tech content like this. Happy coding!
Comments
For those using MariaDB, you can do the same with:
```sql
MariaDB> INSERT INTO hashes(id, hash) SELECT seq, SHA2(seq, 256) FROM seq_1_to_1000000;
```
A little shorter and easier to remember.