1
votes

I'm working on a PHP project related to web scraping, and my aim is to store the data in a MySQL database. I'm using a unique key index on 3 columns in a 9-column table, and there are more than 5k records.

  1. Should I check for unique data at the program level, e.g. by putting values in arrays and comparing them before inserting into the database?
  2. Is there any way to speed up my database insertions?
5
Can you elaborate more on what you are inserting, whether these are single queries, and whether you are doing matches first? - azngunit81
I'm inserting names of businesses (30 characters on average), street addresses (40 characters on average), websites, etc. I'm matching at the database level, not in the program. - Khalid Khan

5 Answers

1
votes

Never ever create a duplicate table; this is an SQL anti-pattern and it makes it more difficult to work with your data.

PDO and a prepared statement may give you a little boost, but don't expect wonders from it.
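For example, a minimal sketch with PDO and a prepared statement, assuming a placeholder businesses table with name, address and website columns (the DSN and credentials are placeholders too):

<?php
// Placeholder connection details.
$pdo = new PDO('mysql:host=localhost;dbname=scraper;charset=utf8mb4', 'user', 'password');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

// Prepare once, execute per scraped record; the unique index rejects duplicates.
$stmt = $pdo->prepare('INSERT IGNORE INTO businesses (name, address, website) VALUES (?, ?, ?)');

foreach ($records as $row) {
    $stmt->execute([$row['name'], $row['address'], $row['website']]);
}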

Multiple INSERT IGNORE statements may also give you a little boost, but again don't expect wonders.

You should generate a multi-insert query like so:

INSERT INTO database.table (columns) VALUES (values),(values),(values);

Keep in mind to stay under MySQL's max_allowed_packet size. This way the index file only has to be updated once.
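A rough sketch of building such a batch from PHP with PDO, assuming the same placeholder businesses table; the batch size of 500 is only an example chosen to stay well below max_allowed_packet:

<?php
// Placeholder connection; adjust DSN, credentials, table and column names.
$pdo = new PDO('mysql:host=localhost;dbname=scraper;charset=utf8mb4', 'user', 'password');

foreach (array_chunk($records, 500) as $batch) {
    $placeholders = [];
    $values = [];
    foreach ($batch as $row) {
        $placeholders[] = '(?, ?, ?)';
        $values[] = $row['name'];
        $values[] = $row['address'];
        $values[] = $row['website'];
    }
    // One multi-row INSERT per batch; IGNORE skips rows that violate the unique key.
    $sql = 'INSERT IGNORE INTO businesses (name, address, website) VALUES ' . implode(',', $placeholders);
    $pdo->prepare($sql)->execute($values);
}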

0
votes

You could create a duplicate of the table you currently have, except with no indexes on any field, and store the data in this table.

Then use events to move the data from the temp table into the main table. Once the data is moved to the main table, delete it from the temp table.
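A hedged sketch of such an event, created once through PDO; the businesses_temp staging table, the businesses main table and the event name are placeholders, the event scheduler must be enabled, and this simplified version assumes nothing writes to the staging table while the event runs:

<?php
// Placeholder connection details.
$pdo = new PDO('mysql:host=localhost;dbname=scraper;charset=utf8mb4', 'user', 'password');

// Every 5 minutes: copy rows into the indexed main table (IGNORE skips
// duplicates on the unique key), then empty the unindexed staging table.
$pdo->exec("
    CREATE EVENT IF NOT EXISTS move_scraped_rows
    ON SCHEDULE EVERY 5 MINUTE
    DO
    BEGIN
        INSERT IGNORE INTO businesses (name, address, website)
            SELECT name, address, website FROM businesses_temp;
        DELETE FROM businesses_temp;
    END
");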

0
votes

You can follow your updates with a trigger: update the table, and write a trigger for this table.

0
votes

Use PDO or the mysqli_* functions to speed up insertion into the database.
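If you go the mysqli_* route, a rough equivalent with a prepared statement (host, credentials, table and column names are placeholders):

<?php
// Placeholder connection details.
$mysqli = mysqli_connect('localhost', 'user', 'password', 'scraper');

// Prepare once; bind_param binds by reference, so re-assigning the variables
// inside the loop is enough before each execute.
$stmt = mysqli_prepare($mysqli, 'INSERT IGNORE INTO businesses (name, address, website) VALUES (?, ?, ?)');
$name = $address = $website = '';
mysqli_stmt_bind_param($stmt, 'sss', $name, $address, $website);

foreach ($records as $row) {
    $name = $row['name'];
    $address = $row['address'];
    $website = $row['website'];
    mysqli_stmt_execute($stmt);
}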

0
votes

You could use "INSERT IGNORE" in your query. That way the record will not be inserted if any unique constraints are violated.

Example:

INSERT IGNORE INTO table_name SET name = 'foo', value = 'bar', id = 12345;