I am trying to figure out if I have a problem with how I use a MySQL database in an application. I am running Node.js, Express, and mysql2, and I use connection pooling with multi-statement queries. I have multiple servers, running different services, all accessing the same database.
At the beginning of each set of MySQL statements, I set some user variables to check user credentials, the latest status, etc. In the following statements, I use these variables to conditionally read some data, do inserts, or modify data. There are different players involved here (Node.js, the mysql2 driver, the MySQL specs, the InnoDB engine), and I am not sure which one has the definitive answer to the following:
- Are all the statements in my multi-statement query going to run in one shot, or do I need to worry about statements from different queries and/or servers interleaving?
- User variables are maintained per connection. If statements can interleave, I could end up with different transactions messing up each other's user variables. Would wrapping a query in a
START TRANSACTION; .... COMMIT;
block fix the problem? (A rough sketch of what I mean is after my current code below.)
Right now, a high-level view of my queries looks like this:
const mysql2 = require('mysql2');

const pool = mysql2.createPool({
  multipleStatements: true, // boolean, not the string 'true'
  // other options...
}).promise();

// Inside an async function:
const [rows, fields] = await pool.query(`
  -- Clear variables, in case no result is found in the following SELECTs:
  SET @enable = NULL, @status = NULL, ...;
  -- Check credentials, status, others:
  SELECT blah_blah INTO @enable FROM ...;
  SELECT more_blah INTO @status FROM ...;
  -- Do stuff based on these variables:
  SELECT some_stuff FROM somewhere WHERE @enable IS NOT NULL;
  INSERT INTO someplace ... -- conditioned on @status and others
`);
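
For reference, the transaction-wrapped variant I'm asking about in the second bullet would look roughly like this. It's an untested sketch using the same placeholder table/column names as above, just with the whole multi-statement string bracketed by START TRANSACTION and COMMIT:

// Inside an async function, same pool as above:
const [rows, fields] = await pool.query(`
  START TRANSACTION;
  SET @enable = NULL, @status = NULL, ...;
  SELECT blah_blah INTO @enable FROM ...;
  SELECT more_blah INTO @status FROM ...;
  SELECT some_stuff FROM somewhere WHERE @enable IS NOT NULL;
  INSERT INTO someplace ... -- conditioned on @status and others
  COMMIT;
`);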
Thanks!