I am trying to insert 13,000 rows into my table. My table structure is as follows:

ID | Col1 | Col2 | Col3 | Col4 | Col5 | Col6 |

ID is the primary (auto-increment) column.

I need Col1 to reflect number 1 to 13000.

The rest of the columns will remain blank.

I am sure this question has been asked before, but I am really confused about how to do this efficiently. Thanks.

Comment (3 upvotes): There is no "efficient" way: do 13,000 inserts, either with the MySQL extended syntax or as 13,000 separate INSERT statements. — Marc B
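The "extended syntax" the comment refers to is MySQL's multi-row VALUES list, where one INSERT carries many rows and is much faster than 13,000 single-row statements. A sketch in Python that generates batched statements (the table and column names come from the question; the 1,000-row batch size is an arbitrary choice):

```python
def build_multirow_insert(table, column, start, end, batch_size=1000):
    """Build extended-syntax INSERT statements covering start..end inclusive."""
    statements = []
    for lo in range(start, end + 1, batch_size):
        hi = min(lo + batch_size - 1, end)
        # One VALUES list per batch: (lo), (lo+1), ..., (hi)
        values = ", ".join(f"({n})" for n in range(lo, hi + 1))
        statements.append(f"INSERT INTO {table} ({column}) VALUES {values};")
    return statements

stmts = build_multirow_insert("some_table", "Col1", 1, 13000)
print(len(stmts))  # 13 statements of 1,000 rows each
```

You would then feed the generated statements to the mysql client or run them from your application.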

3 Answers

3 votes

http://naspinski.net/post/Running-a-For-Loop-in-MS-SQL.aspx

Follow the example posted there. Your WHILE (@count < 40) becomes WHILE (@count < 13001), SET @count = 0 becomes SET @count = 1, and instead of INSERT INTO some_table ([columnA], [columnB]) VALUES ('val1', 'val2') you insert only [columnA], with @count in place of 'val1'. The modified example below should suffice.

SQL Command:

DECLARE @count INT
SET @count = 1
WHILE (@count < 13001)
BEGIN
  INSERT INTO some_table ([Col1]) VALUES (@count)
  SET @count = @count + 1
END

MySQL Command (in MySQL, WHILE loops only work inside a stored program, so the loop is wrapped in a stored procedure; note the DECLARE of a local variable rather than a @user variable, the DO ... END WHILE syntax, and backtick quoting of identifiers):

DELIMITER //
CREATE PROCEDURE fill_rows()
BEGIN
  DECLARE v_count INT DEFAULT 1;
  WHILE v_count < 13001 DO
    INSERT INTO `some_table` (`Col1`) VALUES (v_count);
    SET v_count = v_count + 1;
  END WHILE;
END //
DELIMITER ;

CALL fill_rows();
1 vote

The most efficient way to load data into MySQL is to load it from a delimited file. MySQL's LOAD DATA INFILE statement does this (add the LOCAL keyword if the file lives on the client machine rather than the server):

LOAD DATA INFILE '/tmp/data.csv' INTO TABLE mytable
FIELDS TERMINATED BY '|'
LINES TERMINATED BY '\n' (Col1);

Where your input file is just a list of the Col1 values, one per line:

8377
3888
3244
...
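For this question, the input file would simply hold the numbers 1 through 13000, one per line. A minimal Python sketch to generate it (the /tmp/data.csv path matches the LOAD DATA example above):

```python
# Generate the input file for LOAD DATA INFILE: the values 1..13000
# that Col1 should hold, one value per line.
with open("/tmp/data.csv", "w") as f:
    for n in range(1, 13001):
        f.write(f"{n}\n")
```

After writing the file, run the LOAD DATA INFILE statement shown above to bulk-load all 13,000 rows in one pass.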
0 votes

You need to specify the source of your information.

Each platform has its own ways of doing this.

Microsoft has its Integration Services, for example. But if it is simple data, you could use a regex find-and-replace to wrap the data in a series of INSERT statements.

But I would need to know more about which database you are using and where the data comes from to help you further.