Does anyone know the fastest way to get data from an Excel table (VBA array) to a table on SQL Server 2008 without using an external utility (e.g. bcp)? Keep in mind my data…
By far the fastest way to do this is via T-SQL's BULK INSERT.
There are a few caveats:
You'll need to export your data to a text/CSV file first (when you use the BULK INSERT command and specify a filename, remember that the filename will be resolved on the machine where SQL Server is running).
Set the FIELDTERMINATOR and ROWTERMINATOR values to match your CSV.
It took some trial and error for me to get this set up initially, but the performance increase was phenomenal compared to every other technique I had tried.
As far as I remember, you can create a linked server to the Excel file (as long as the server can find the path; it's best to put the file on the server's local disk) and then use SQL to retrieve data from it.
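A sketch of that idea, using the ad-hoc OPENROWSET variant rather than a permanent linked server. The workbook path, sheet name, and target table are assumptions; 'Ad Hoc Distributed Queries' must be enabled and the ACE OLE DB provider installed on the server, and the .xlsx path is resolved on the SQL Server machine:

Sub InsertViaOpenRowset()
    'Sketch only: let SQL Server read the workbook itself and insert server-side.
    Dim con As Object
    Set con = CreateObject("ADODB.Connection")
    con.Open "Provider=SQLOLEDB;Data Source=MyServer;" & _
             "Initial Catalog=MyDB;Integrated Security=SSPI;"
    'Assumes the sheet's column layout matches dbo.TargetTable.
    con.Execute "INSERT INTO dbo.TargetTable " & _
                "SELECT * FROM OPENROWSET('Microsoft.ACE.OLEDB.12.0', " & _
                "'Excel 12.0;Database=C:\Data\Book1.xlsx;HDR=YES', " & _
                "'SELECT * FROM [Sheet1$]')"
    con.Close
    Set con = Nothing
End Sub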
The following code transfers thousands of rows in just a few seconds (2-3 sec).
'Exports the "DataSheet" range to a SQL Server table via a disconnected ADO Recordset.
Public Function ExportRangeToSQL() As Long
Dim sheet As Worksheet
Set sheet = ThisWorkbook.Sheets("DataSheet")
Dim Con As Object
Dim cmd As Object
Dim ServerName As String
Dim Table As String
Dim arr As Variant
Dim row As Long
Dim rowCount As Long
Dim endRow As Long
Dim flag As Boolean
Set Con = CreateObject("ADODB.Connection")
Set cmd = CreateObject("ADODB.Command")
ServerName = "192.164.1.11"
'Creating a connection
Con.ConnectionString = "Provider=SQLOLEDB;" & _
"Data Source=" & ServerName & ";" & _
"Initial Catalog=Adventure;" & _
"UID=sa; PWD=123;"
'Opening connection
Con.Open
cmd.CommandType = 1 ' adCmdText
Dim Rst As Object
Set Rst = CreateObject("ADODB.Recordset")
Table = "EmployeeDetails" 'This must match the database table name.
With Rst
Set .ActiveConnection = Con
.Source = "SELECT * FROM " & Table
.CursorLocation = 3 ' adUseClient
.LockType = 4 ' adLockBatchOptimistic
.CursorType = 0 ' adOpenForwardOnly
.Open
Dim tableFields(200) As Integer
Dim rangeFields(200) As Integer
Dim exportFieldsCount As Integer
exportFieldsCount = 0
Dim col As Integer
Dim index As Integer
index = 1
For col = 1 To .Fields.Count
exportFieldsCount = exportFieldsCount + 1
tableFields(exportFieldsCount) = col
rangeFields(exportFieldsCount) = index
index = index + 1
Next
If exportFieldsCount = 0 Then
ExportRangeToSQL = 1
GoTo ConnectionEnd
End If
endRow = sheet.Cells(sheet.Rows.Count, 1).End(xlUp).row 'Last row with data in column A.
arr = sheet.Range("A1:CE" & endRow).Value 'The column count of this range must match the column count of the database table.
rowCount = UBound(arr, 1)
Dim val As Variant
For row = 1 To rowCount
.AddNew
For col = 1 To exportFieldsCount
val = arr(row, rangeFields(col))
.Fields(tableFields(col) - 1) = val 'The ADO Fields collection is 0-based.
Next
Next
.UpdateBatch
End With
flag = True
ConnectionEnd:
'Closing the Recordset.
If Rst.State = 1 Then
Rst.Close
End If
'Closing Connection Object.
If Con.State = 1 Then
Con.Close
End If
'Setting empty for the RecordSet & Connection Objects
Set Rst = Nothing
Set Con = Nothing
End Function
This works pretty well. On the other hand, to improve speed we can still modify the query:
Instead of: Source = "SELECT * FROM " & Table
we can use: Source = "SELECT TOP 1 * FROM " & Table
Here we only need the column names, so there is no need to query the entire table, which takes longer and longer as more data is imported.
There is no single fastest way, as it depends on a number of factors. Make sure the indexes in SQL Server are configured and optimized: lots of indexes will kill insert/update performance, since each insert will need to update them. Make sure you only make one connection to the database, and do not open/close it during the operation. Run the update when the server is under minimal load. The only other method you haven't tried is to use an ADO Command object and issue a direct INSERT statement. When using the AddNew method of the Recordset object, be sure to issue only one UpdateBatch call at the end of the inserts. Short of that, the VBA can only run as fast as the SQL Server accepting the inputs.
EDIT: It seems like you've tried everything. There is also what is known as 'Bulk-Logged' recovery mode in SQL Server, which reduces the overhead of writing so much to the transaction log. It might be something worth looking into. It can be troublesome since it requires fiddling with the database recovery model a bit, but it could be useful for you.
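For reference, a minimal sketch of the ADO Command route mentioned above, with a single parameterized INSERT; the table dbo.EmployeeDetails, its columns, and the sample values are made-up placeholders:

Sub InsertViaAdoCommand()
    'Sketch only: one parameterized INSERT issued through an ADO Command object.
    Dim con As Object, cmd As Object
    Set con = CreateObject("ADODB.Connection")
    con.Open "Provider=SQLOLEDB;Data Source=MyServer;" & _
             "Initial Catalog=MyDB;Integrated Security=SSPI;"
    Set cmd = CreateObject("ADODB.Command")
    With cmd
        Set .ActiveConnection = con
        .CommandType = 1 ' adCmdText
        .CommandText = "INSERT INTO dbo.EmployeeDetails (Name, Age) VALUES (?, ?)"
        .Parameters.Append .CreateParameter("Name", 200, 1, 50, "Alice") ' adVarChar, adParamInput
        .Parameters.Append .CreateParameter("Age", 3, 1, , 30)           ' adInteger, adParamInput
        .Execute
    End With
    con.Close
    Set con = Nothing
End Sub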
Having just tried a few methods, I came back to a relatively simple but speedy one. It's fast because it makes the SQL server do all the work, including an efficient execution plan.
I just build a long string containing a script of INSERT statements.
Public Sub Upload()
Const Tbl As String = "YourTbl"
Dim InsertQuery As String, xlRow As Long, xlCol As Integer
Dim DBconnection As New ADODB.Connection
DBconnection.Open "Provider=SQLOLEDB.1;Password=MyPassword" & _
";Persist Security Info=false;User ID=MyUserID" & _
";Initial Catalog=MyDB;Data Source=MyServer"
InsertQuery = ""
xlRow = 2
While Cells(xlRow, 1) <> ""
InsertQuery = InsertQuery & "INSERT INTO " & Tbl & " VALUES('"
For xlCol = 1 To 6 'Must match the table structure
InsertQuery = InsertQuery & Replace(Cells(xlRow, xlCol), "'", "''") & "', '" 'Includes mitigation for apostrophes in the data
Next xlCol
InsertQuery = InsertQuery & Format(Now(), "M/D/YYYY") & "')" & vbCrLf 'The last column is a date stamp, either way, don't forget to close that parenthesis
xlRow = xlRow + 1
Wend
DBconnection.Execute InsertQuery 'I'll leave any error trapping to you
DBconnection.Close 'But do be tidy :-)
Set DBconnection = Nothing
End Sub