1 vote

I'm trying to import a text file of city names into SQL Server 2012. The file is 1.45 GB, and I'm using the SQL Server Import and Export Wizard to do the job, but each time I get these errors:

  • Copying to [dbo].[worldcitiespop] (Error) Messages Error 0xc02020a1: Data Flow Task 1: Data conversion failed. The data conversion for column "City" returned status value 4 and status text "Text was truncated or one or more characters had no match in the target code page.". (SQL Server Import and Export Wizard)

    Error 0xc020902a: Data Flow Task 1: The "Source - worldcitiespop_txt.Outputs[Flat File Source Output].Columns[City]" failed because truncation occurred, and the truncation row disposition on "Source - worldcitiespop_txt.Outputs[Flat File Source Output].Columns[City]" specifies failure on truncation. A truncation error occurred on the specified object of the specified component. (SQL Server Import and Export Wizard)

    Error 0xc0202092: Data Flow Task 1: An error occurred while processing file "C:\Users\elsho\Downloads\worldcitiespop.txt\worldcitiespop.txt" on data row 114091. (SQL Server Import and Export Wizard)

    Error 0xc0047038: Data Flow Task 1: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED. The PrimeOutput method on Source - worldcitiespop_txt returned error code 0xC0202092. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing. There may be error messages posted before this with more information about the failure. (SQL Server Import and Export Wizard)

I changed the table fields to the text data type to avoid any data-length problem, but this did not help. Please advise.

Thanks

Sometimes it helps to open the file and look for any inconsistencies. I am pretty sure that Microsoft Notepad will not be able to open this file, or at least you won't be able to work with it even if you manage to open it. A tool I use to inspect such big files, EmEditor, has many useful features and lets you work with files hundreds of gigabytes in size. It also has features that help you find inconsistencies in large text and CSV files. Try the free version; you will surely find it helpful. - M.Ali
Thanks, I will try it. - Daina Hodges

1 Answer

3 votes

I strongly recommend that you use a staging table. Import the file into a staging table with wide nvarchar() text columns.
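A minimal sketch of such a staging table is below. The column names other than City (Country, AccentCity, Region, Population, Latitude, Longitude) are assumptions based on typical world-cities files; only City appears in your error messages, so adjust the list to match the actual layout of worldcitiespop.txt. Also widen the source columns on the Advanced tab of the wizard's flat-file source, since its default width of 50 characters is usually what triggers the truncation error.

    -- Staging table: every column is a wide nvarchar so no value can be
    -- truncated or fail conversion during the initial load.
    -- NOTE: the column list below is an assumption; match it to your file.
    CREATE TABLE dbo.worldcitiespop_staging (
        Country     NVARCHAR(500) NULL,
        City        NVARCHAR(500) NULL,
        AccentCity  NVARCHAR(500) NULL,
        Region      NVARCHAR(500) NULL,
        Population  NVARCHAR(500) NULL,   -- kept as text here; converted later
        Latitude    NVARCHAR(500) NULL,
        Longitude   NVARCHAR(500) NULL
    );

Point the wizard at dbo.worldcitiespop_staging instead of dbo.worldcitiespop for the initial import.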

If you cannot do this successfully, then the data file itself is probably corrupted.

After you have loaded the data into the staging table, insert it (with appropriate conversions) into its final form, and then build indexes. It is much easier to find such problems when the data is already sitting in a table than by frustratingly re-running the import over and over; see the sketch below.
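Once the rows are queryable, a sketch like the following (again assuming the hypothetical column names above and a 100-character target for City) shows both steps: a query to spot problem rows, such as whatever is on data row 114091, and the converted insert into the final table.

    -- 1) Look for rows that would be truncated or fail numeric conversion.
    SELECT *
    FROM dbo.worldcitiespop_staging
    WHERE LEN(City) > 100                        -- wider than the target column
       OR TRY_CONVERT(float, Latitude)  IS NULL
       OR TRY_CONVERT(float, Longitude) IS NULL;

    -- 2) Load the final table with explicit conversions (TRY_CONVERT is
    --    available from SQL Server 2012 onward and returns NULL for values
    --    that cannot be converted instead of raising an error).
    INSERT INTO dbo.worldcitiespop (Country, City, AccentCity, Region, Population, Latitude, Longitude)
    SELECT Country,
           City,
           AccentCity,
           Region,
           TRY_CONVERT(int,   Population),
           TRY_CONVERT(float, Latitude),
           TRY_CONVERT(float, Longitude)
    FROM dbo.worldcitiespop_staging;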