I need to simultaneously import data from 55 .tsv files into a MySQL database.
I have a job that reads a file and upserts rows in the database using the updateOrCreate() method.
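Roughly, each job looks like this (a simplified sketch: the class name, TSV parsing, and metric columns are illustrative; the updateOrCreate() keys are the real ones):

<?php

namespace App\Jobs;

use App\AdsPerformance;
use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Queue\InteractsWithQueue;

class ImportAdsPerformance implements ShouldQueue
{
    use InteractsWithQueue, Queueable;

    protected $path;

    public function __construct($path)
    {
        $this->path = $path;
    }

    public function handle()
    {
        $handle = fopen($this->path, 'r');

        // One upsert per TSV line.
        while (($row = fgetcsv($handle, 0, "\t")) !== false) {
            $performance_obj = [
                'campaign_id' => $row[0],
                'date'        => $row[1],
                'clicks'      => $row[2], // hypothetical metric columns
                'cost'        => $row[3],
            ];

            AdsPerformance::updateOrCreate(
                [
                    'campaign_id' => $performance_obj['campaign_id'],
                    'date'        => $performance_obj['date'],
                    'ad_provider' => 'adwords',
                ],
                $performance_obj
            );
        }

        fclose($handle);
    }
}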
My Supervisor config:
[program:laravel-worker]
process_name=%(program_name)s_%(process_num)02d
command=php /Users/denis/Code/my_project/artisan queue:work --tries=3 --timeout=3600 --daemon
autostart=true
autorestart=true
numprocs=55
stdout_logfile=/Users/denis/Code/my_project/storage/logs/workers.log
stderr_logfile=/Users/denis/Code/my_project/storage/logs/workers_error.log
But when I do, all the jobs fail with this error:
PDOException: SQLSTATE[40001]: Serialization failure: 1213 Deadlock found when trying to get lock; try restarting transaction in /Users/denis/Code/my_project/vendor/laravel/framework/src/Illuminate/Database/Connection.php:474 ...
I can work around it with a try-catch retry loop:
$updated = null;
$db_max_retries = 10;
$db_retry_count = 0;

// Retry the upsert until it succeeds or we give up.
while ($updated === null && $db_retry_count < $db_max_retries) {
    try {
        $updated = AdsPerformance::updateOrCreate(
            [
                'campaign_id' => $performance_obj['campaign_id'],
                'date'        => $performance_obj['date'],
                'ad_provider' => 'adwords',
            ],
            $performance_obj
        );
    } catch (\Exception $e) {
        // Note: this also swallows non-deadlock errors.
        sleep(1);
        $db_retry_count++;
    }
}
$lines_proceded++;
...but with this workaround the import takes a very long time, since I hit roughly 200 deadlocks per ~2000 imported lines.
- When the table is empty, there are no deadlocks. Deadlocks only appear once the table already contains data (basically, when updateOrCreate() has to update rather than insert).
- I tried different sleep times. With sleep(5) there are fewer deadlocks; with a 1/10-second sleep (I used usleep(100000)) there are more. Either way, the total execution time is about the same. (A jittered-backoff variant of the retry loop is sketched after this list.)
- I tried other queue drivers (redis and beanstalkd), but that made no big difference either.
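For completeness, this is the kind of backoff variant I mean; a hypothetical sketch rather than code I have benchmarked, using MySQL error 1213 to detect deadlocks:

$db_max_retries = 10;

for ($attempt = 0; $attempt < $db_max_retries; $attempt++) {
    try {
        AdsPerformance::updateOrCreate(
            [
                'campaign_id' => $performance_obj['campaign_id'],
                'date'        => $performance_obj['date'],
                'ad_provider' => 'adwords',
            ],
            $performance_obj
        );
        break; // success, stop retrying
    } catch (\Illuminate\Database\QueryException $e) {
        // MySQL error 1213 = deadlock; rethrow anything else.
        if (!isset($e->errorInfo[1]) || $e->errorInfo[1] != 1213) {
            throw $e;
        }
        // Exponential backoff with random jitter: ~0.1s, 0.2s, 0.4s, ...
        usleep(100000 * pow(2, $attempt) + mt_rand(0, 100000));
    }
}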
Is there some other solution I can try, like a MySQL configuration change or something else? I can't believe there is no way to run simultaneous updates in MySQL without deadlocks.
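For example, would wrapping the write in Laravel's DB::transaction() with its retry argument behave any differently from my manual loop? An untested sketch (assuming a Laravel version where transaction() accepts an attempt count and re-runs the closure on deadlock):

use Illuminate\Support\Facades\DB;

// The framework re-runs the closure automatically when the
// transaction deadlocks, up to the given number of attempts.
DB::transaction(function () use ($performance_obj) {
    AdsPerformance::updateOrCreate(
        [
            'campaign_id' => $performance_obj['campaign_id'],
            'date'        => $performance_obj['date'],
            'ad_provider' => 'adwords',
        ],
        $performance_obj
    );
}, 10);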