I'm trying to insert a large CSV file (several gigs) into SQL Server, but once I go through the Import Wizard and finally try to import the file, I get a truncation error on one of the text columns.
The Advanced Editor did not resolve my issue; instead I was forced to edit the dtsx file in Notepad (or your favorite text/XML editor) and manually replace the attribute values with length="0" and dataType="nText" (I'm using Unicode).
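For reference, here is a rough sketch of what that edit can look like inside the dtsx XML. The element name, the column name "Comments", and the "before" values are illustrative (they vary by SSIS version and package); the part that matters is the length/dataType attribute pair:

    <!-- Before: the wizard stored the column as a fixed-length Unicode string -->
    <outputColumn name="Comments" dataType="wstr" length="255" />
    <!-- After the manual edit: Unicode text stream, so long values are no longer truncated -->
    <outputColumn name="Comments" dataType="nText" length="0" />

Note that the same column is usually defined in more than one place in the package (the source output, the destination's external metadata, and so on), so the attributes typically need to be changed everywhere they occur.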
Always make a backup of the dtsx file before you edit it in text/XML mode.
Running SQL Server 2008 R2
Go to the Advanced tab -> data type of the column -> here change the data type from DT_STR to DT_TEXT and the column width to 255. Now if you check, it will work.
Issue: The Jet OLE DB provider reads a registry value (TypeGuessRows) to determine how many rows to read when guessing the type of a source column. By default this value is 8, so the provider scans the first 8 rows of the source data to determine each column's data type. If a field looks like text and its data is longer than 255 characters, the column is typed as a memo field; so if no value in the first 8 rows is longer than 255 characters, Jet cannot accurately determine the nature of the data. Because the first 8 rows of the exported sheet are all shorter than 255 characters, it treats the source column as VARCHAR(255) and then cannot read the rows where that column is longer.
Fix: The solution is simply to sort the comment column in descending order, so that long values land within the rows Jet samples. From SQL Server 2012 onwards you can also update the values in the Advanced tab of the Import Wizard.
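A related option is the workaround from KB281517 (linked in the next answer): raise the TypeGuessRows registry value so Jet samples more than the default 8 rows. A sketch of the .reg change, assuming the 32-bit Jet 4.0 provider (the key lives under Wow6432Node on 64-bit Windows, and under the Access Connectivity Engine keys if you use the ACE provider instead):

    Windows Registry Editor Version 5.00

    [HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Jet\4.0\Engines\Excel]
    ; 0 = sample up to 16,384 rows instead of the default 8
    "TypeGuessRows"=dword:00000000

Restart the wizard after changing the value so the provider picks it up.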
I think it's a bug; please apply the workaround and then try again: http://support.microsoft.com/kb/281517.
Also, go into the Advanced tab and confirm that the target column is varchar(max).
This answer may not apply universally, but it fixed this error when I hit it importing a small text file. The flat file provider was importing the source as fixed 50-character text columns, which was incorrect. No amount of remapping the destination columns affected the issue.
To solve the issue: in the "Choose a Data Source" step for the flat-file provider, after selecting the file, a "Suggest Types.." button appears beneath the input column list. After hitting this button, even if no changes are made in the ensuing dialog, the flat file provider re-queries the source .csv file and correctly determines the lengths of the fields in the source file.
Once this was done, the import proceeded with no further issues.
In the SQL Server Import and Export Wizard you can adjust the source data types in the Advanced tab (these become the data types of the output if creating a new table, but otherwise are just used for handling the source data).
The data types are annoyingly different from those in MS SQL. Instead of VARCHAR(255) it's DT_STR, and the output column width can be set to 255. For VARCHAR(MAX) it's DT_TEXT.
So, on the Data Source selection, in the Advanced tab, change the data type of any offending columns from DT_STR to DT_TEXT (you can select multiple columns and change them all at once).
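If you have the wizard save its work as an SSIS package, this Advanced-tab change shows up in the dtsx as the same dataType/length attributes discussed in the dtsx-editing answer above. Roughly (element name, ids and the column name are illustrative):

    <!-- DT_STR with an output column width of 255 -->
    <outputColumn name="Comments" dataType="str" length="255" codePage="1252" />
    <!-- DT_TEXT, i.e. the VARCHAR(MAX) case described above -->
    <outputColumn name="Comments" dataType="text" length="0" codePage="1252" />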