dkelxldk

Handling concurrent MySQL transaction requests with Sequelize

In this post, I just want to share my experience handling hundreds of concurrent requests on SQL transactions using Sequelize.

At first, when I built my application, there was no problem during testing. The problem came when the third party I wanted to integrate with ran a stress test against my app.

As a result, my app couldn't handle hundreds of concurrent requests. There was no problem with Node.js itself. The problem was that the app would try to get a connection from the database pool, but no connection was ever released back to the pool, because they were all in use.

My DB pool config was:

pool: {
  max: 38,        // maximum number of pooled connections
  min: 0,
  acquire: 60000, // 60-second acquire timeout
  idle: 5000,     // release connections idle for 5 seconds
},
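
For context, here's a minimal sketch of how a pool config like this plugs into the Sequelize constructor (the database name, credentials, and host here are placeholders, not from my actual setup):

const { Sequelize } = require('sequelize');

const sequelize = new Sequelize('mydb', 'user', 'password', {
  host: 'localhost',
  dialect: 'mysql',
  pool: {
    max: 38,
    min: 0,
    acquire: 60000,
    idle: 5000,
  },
});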

And here's the example code:

// inside a sequelize.transaction callback:
await Promise.all([
  User.findByPk(1, { transaction }),
  Gamelog.bulkCreate(gamelog, { transaction }),
  Balancelog.bulkCreate(balancelog, { transaction })
]);

The solution was quite simple; it turned out to be a known issue in Sequelize itself: https://github.com/sequelize/sequelize/issues/11024

Only pool.max transactions can run simultaneously, and the (pool.max + 1)th transaction will cause a deadlock that completely blocks new transactions.
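
To make that concrete, here's a rough sketch of the failure mode (the pool size and model are the ones from this post; the exact trigger depends on your Sequelize version):

const POOL_MAX = 38; // same as pool.max above

// Fire more concurrent transactions than the pool can hold.
// Per the linked issue, transaction number POOL_MAX + 1 could
// deadlock instead of simply waiting for a free connection.
const tasks = Array.from({ length: POOL_MAX + 1 }, (_, i) =>
  sequelize.transaction(async transaction => {
    await User.findByPk(i + 1, { transaction });
  })
);

Promise.all(tasks).catch(err => console.error(err));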

So we can't use async/await in this scenario; instead, we use promises.

sequelize.transaction(transaction => {
  // return the combined promise so the transaction
  // only commits after every query has finished
  return Promise.all([
    User.findByPk(1, { transaction }),
    Gamelog.bulkCreate(gamelog, { transaction }),
    Balancelog.bulkCreate(balancelog, { transaction })
  ]);
})
.then(() => true)
.catch(err => false);
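
For completeness, here's a sketch of how this pattern could sit inside a request handler (the Express route and payload shape are just assumptions for illustration):

app.post('/gamelog', (req, res) => {
  const { gamelog, balancelog } = req.body;

  sequelize.transaction(transaction => Promise.all([
    User.findByPk(1, { transaction }),
    Gamelog.bulkCreate(gamelog, { transaction }),
    Balancelog.bulkCreate(balancelog, { transaction })
  ]))
  .then(() => res.json({ success: true }))
  .catch(() => res.status(500).json({ success: false }));
});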

Top comments (1)

Tobias Nickel

We also had a similar issue when using transactions. Our problem was that some code running within a transaction actually used a new connection from the pool, and this caused a deadlock.

The code that you posted definitely looks like the queries run concurrently; however, it only takes one connection from the pool, executes all the queries on that connection, and then commits.

According to your configuration, your server can handle 38 transactions concurrently, and I think that is good enough for your use case.

The underlying driver (the mysql package) executes only one query at a time per transaction. I mean, it sends one query to the DB, waits for the response, and then sends the next. A large pool size still gives you good use of the network (even if it is localhost or Docker), but it requires a lot of resources on the DB server side.
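
You can see this serialization with a quick sketch (using MySQL's SLEEP() and Sequelize's raw query API; timings are approximate):

console.time('three queries');

sequelize.transaction(transaction => Promise.all([
  sequelize.query('SELECT SLEEP(1)', { transaction }),
  sequelize.query('SELECT SLEEP(1)', { transaction }),
  sequelize.query('SELECT SLEEP(1)', { transaction })
]))
.then(() => {
  // prints roughly 3s, not 1s: the queries are queued one
  // after another on the single transaction connection
  console.timeEnd('three queries');
});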

I don't know how to truly run the 3 queries concurrently with MySQL, but the postgres module can begin the transaction, send three queries, send the commit, and then wait for the server response before giving the connection back to the connection pool.

Thanks for sharing your experience; I just want to help explain which parts of a database transaction run concurrently.

By the way, I like the syntax you are showing, because I think the postgres module could learn from it, so that it can begin the next transaction before waiting for the last transaction to commit or fail.