I'm creating a VB.Net application that stores data in a SQLite backend. The data arrives in Excel workbooks that the user can import. Each workbook has one worksheet (about 30,000 rows) that gets reformatted a bit and imported into a new table. What's the most efficient way to do this?
I'm currently reading the entire range from Excel into a 2D array, looping over the rows in that array, and appending each row to a long SQL statement that gets executed every thousand rows. But this is painfully slow, both in the loop over the array and in the push to SQLite. I can't help but think there must be a more efficient way of doing this.
Thanks,
Code below:

'First open the xls reformatter book and read in our data
Dim xlApp As New Excel.Application
Dim xlWorkBook As Excel.Workbook
Dim xlWorkSheet As Excel.Worksheet
xlWorkBook = xlApp.Workbooks.Open(strFile)
xlWorkSheet = xlWorkBook.Worksheets("ToDSS")
Dim r As Excel.Range = xlWorkSheet.UsedRange
Dim array(,) As Object = r.Value(Excel.XlRangeValueDataType.xlRangeValueDefault)
xlWorkBook.Close()
xlApp.Quit()
'release the COM objects in reverse order of creation
releaseObject(xlWorkSheet)
releaseObject(xlWorkBook)
releaseObject(xlApp)
SQLconnect.Open()
SQLcommand = SQLconnect.CreateCommand()
'now loop through the rows inserting each into the db
Dim curDate As Date
strSQL = ""
Dim batch As Integer = 0
For row As Integer = 16 To array.GetUpperBound(0)
    strSQL &= "INSERT INTO scenario_" & strScenarioName & " VALUES ('"
    curDate = CDate(array(row, 1))
    strSQL &= curDate.ToString("yyyy-MM-dd") & "'"
    For col As Integer = 2 To 30
        strSQL &= ", " & array(row, col)
    Next
    strSQL &= ");" & vbCrLf
    'push the accumulated statements every ~1000 rows, and on the final row
    If batch > 1000 OrElse row = array.GetUpperBound(0) Then
        Debug.Print(row.ToString())
        SQLcommand.CommandText = strSQL
        SQLcommand.ExecuteNonQuery()
        Debug.Print("pushed")
        strSQL = ""
        batch = 0
    Else
        batch += 1
    End If
Next
SQLcommand.Dispose()
SQLconnect.Close()
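For reference, this is the direction I've been wondering about: one parameterized INSERT reused for every row, inside a single explicit transaction, instead of concatenating SQL text. This is only a sketch, assuming System.Data.SQLite (so SQLiteTransaction, SQLiteCommand, and SQLiteParameter) and the same array and strScenarioName as above; I haven't profiled it.

'Sketch only: reuse one parameterized INSERT inside a single transaction
Using tx As SQLiteTransaction = SQLconnect.BeginTransaction()
    Using cmd As SQLiteCommand = SQLconnect.CreateCommand()
        cmd.Transaction = tx
        'one placeholder per column: @p1 .. @p30
        Dim names As New List(Of String)
        For i As Integer = 1 To 30
            names.Add("@p" & i)
            cmd.Parameters.Add(New SQLiteParameter("@p" & i))
        Next
        cmd.CommandText = "INSERT INTO scenario_" & strScenarioName &
            " VALUES (" & String.Join(", ", names) & ")"
        'bind each row's values and execute; the statement is only parsed once
        For row As Integer = 16 To array.GetUpperBound(0)
            cmd.Parameters("@p1").Value = CDate(array(row, 1)).ToString("yyyy-MM-dd")
            For col As Integer = 2 To 30
                cmd.Parameters("@p" & col).Value = array(row, col)
            Next
            cmd.ExecuteNonQuery()
        Next
    End Using
    tx.Commit()
End Using

Is something along these lines the right way to go, or is there a better option still (for example, skipping Excel automation entirely for the read)?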