BULK INSERT kbm_8311.dbo.cqm_mstr FROM 'D:\sanu\cqm_mstr.txt' WITH (DATAFILETYPE='native')  

I'm getting an error message like this:

Msg 4866, Level 16, State 7, Line 2
The bulk load failed. The column is too long in the data file for row 1, column 15. Verify that the field terminator and row terminator are specified correctly.
Msg 7399, Level 16, State 1, Line 2
The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.
Msg 7330, Level 16, State 2, Line 2
Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".

This is a sample of the file I'm trying to upload:

é   MUII CMS2 (NQF0418) Percentage of patients aged 12 years and older screened for clinical depression on the date of the encounter using an age appropriate standardized depression screening tool AND if positive, a follow-up plan is documented on the date of the positive screen.T Preventive Care and Screening:  Screening for Clinical Depression and Follow-Up Plan+ Screening for Depression and Follow-Up Plan Depression Screenõ Document a depression assessment using the Screening tools template. Open health promotion plan, and document the plan of care or referral information. Enter medications via the medication module.  For adolescents, need to screen with the PHQ-9.
1
Do you know which table column field 15 is loading into? Perhaps you could make it larger (e.g. VARCHAR(MAX)) and see if that helps. - Nick.McDermaid
The problem here is that I created the BCP file from that same table. - Sanu Antony
I see. Well, if you load it into a different table with more forgiving data types, you'll be able to see the data, and it might become clear what the problem is. - Nick.McDermaid
The thing is, I'm uploading the same .txt file that I created from that same table. What should I do about this? - Sanu Antony
Until yesterday it ran fine; now I'm getting this error. - Sanu Antony

1 Answer


The table structure had been altered. Explicitly inserting the row_timestamp column was causing the issue: a native-format export encodes the table's column layout at export time, so after the schema change the old file no longer matched the table.
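
A minimal sketch of one way to recover (only the table name and file path come from the question; the bcp switches shown are assumptions about the environment): regenerate the native export against the current table definition, then re-run the original load.

-- Re-export the data in native format so the file matches the table's
-- current structure. Run from a command prompt; -n = native format,
-- -T = trusted connection, -S = server name (the last two are assumptions):
--   bcp kbm_8311.dbo.cqm_mstr out D:\sanu\cqm_mstr.txt -n -T -S <server>

-- With a freshly generated file, the original load should succeed again:
BULK INSERT kbm_8311.dbo.cqm_mstr
FROM 'D:\sanu\cqm_mstr.txt'
WITH (DATAFILETYPE = 'native');

Alternatively, a non-XML format file (generated with bcp's format option) can map the data file's fields onto the table's current columns and skip the row_timestamp column entirely.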