I'm trying to load my database with a large amount of data from a 1.4 GB .csv file, but when I run my code I get errors.
Here's my code:
USE [Intradata N
I got the same error when my CSV had a different number of delimited fields than my table had columns. Check that you have the right number of fields in intramerge.csv.
To determine which rows have issues (for example, the rows with fewer columns than expected), one option is sketched below.
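A minimal sketch, assuming a hypothetical staging table dbo.IntradataStaging and hypothetical file paths: BULK INSERT's ERRORFILE option writes every rejected row (including ones with too few columns) to a separate file, and raising MAXERRORS keeps the load running past them so you can inspect the rejects afterwards.

BULK INSERT dbo.IntradataStaging          -- hypothetical staging table
FROM 'C:\data\intramerge.csv'             -- hypothetical path to the source file
WITH (
    FIELDTERMINATOR = ',',
    ROWTERMINATOR   = '\n',
    FIRSTROW        = 2,                  -- skip the header row, if there is one
    MAXERRORS       = 100000,             -- keep loading past bad rows
    ERRORFILE       = 'C:\data\intramerge_rejects.txt'  -- rejected rows end up here
);

SQL Server also writes a companion .Error.Txt control file next to the ERRORFILE that describes why each row was rejected.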
I encountered a similar issue, but in this case the file being loaded contained some blank lines. Removing the blank lines solved it.
Alternatively, as the file was delimited, I added the correct number of delimiters to the blank lines, which again allowed the file to import successfully - use this option if the blank lines need to be loaded.
BULK INSERT will not tell you whether the imported values actually "fit" the data types of the target table's columns.
For example: I tried to import decimal values into a float column, but because the values all used a comma as the decimal separator, they could not be inserted into the table (a point was expected).
These unexpected results often happen when the provided CSV is an export from an Excel file. Your computer's regional settings decide which decimal separator is used when saving an Excel file as a CSV, so CSVs provided by different people can give different results.
Solution: import all fields as VARCHAR, and try to deal with the values afterwards.
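A minimal sketch of that approach, with hypothetical column names (Symbol, Price, TradeDate), a hypothetical staging table and a hypothetical dbo.Intradata target table: load everything as VARCHAR first, then fix the decimal comma and convert, letting TRY_CONVERT return NULL for values that still don't parse.

-- Stage every field as VARCHAR so BULK INSERT never rejects a row over data types
CREATE TABLE dbo.IntradataStaging (
    Symbol    VARCHAR(50),
    Price     VARCHAR(50),
    TradeDate VARCHAR(50)
);

BULK INSERT dbo.IntradataStaging
FROM 'C:\data\intramerge.csv'              -- hypothetical path
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);

-- Clean up afterwards: swap the decimal comma for a point before converting.
-- TRY_CONVERT returns NULL instead of failing when a value still won't parse.
INSERT INTO dbo.Intradata (Symbol, Price, TradeDate)
SELECT Symbol,
       TRY_CONVERT(float, REPLACE(Price, ',', '.')),
       TRY_CONVERT(date, TradeDate, 104)   -- style 104 = dd.mm.yyyy; adjust to your data
FROM dbo.IntradataStaging;

You can then query the staging table for rows where the converted value comes back NULL to find the values that still need attention.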
This is my solution: just give up.
I always end up using SSMS and [ Tasks > Import Data ].
I have never managed to get a real-world .csv file to import using this method. It is an utterly useless function that only works on pristine datasets that don't exist in the real world. Perhaps I've never had any luck because the datasets I deal with are quite messy and are generated by third parties.
And if it goes wrong, it doesn't give any clue as to why. Microsoft, you sadden me with your utter incompetence in this area.
Microsoft, perhaps add some error messages that say why a row was rejected? It's almost impossible to fix the issue if you don't know why it failed!
This is an old question, but it seems my finding might enlighten some other people having a similar issue.
The default SSIS timeout value appears to be 30 seconds, so any service-bound or IO-bound operation in your package that runs well beyond that value causes a timeout. Increasing that timeout value (change it to "0" for no timeout) will resolve the issue.
This can also happen if your file's columns are separated with ";" but you are using "," as the FIELDTERMINATOR (or the other way around).
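For example, a semicolon-delimited file needs a matching terminator (table name and path here are hypothetical):

BULK INSERT dbo.Intradata
FROM 'C:\data\intramerge.csv'
WITH (
    FIELDTERMINATOR = ';',   -- must match the separator actually used in the file
    ROWTERMINATOR   = '\n'
);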