I'm running into a situation in a data flow where I think I'm hitting the size limit of the SqlCommand property of an OLE DB Command transformation. Here's the setup:
I have a text file source containing a few columns. One of the columns is a string that can run upwards of 50,000 characters. I wrote a stored procedure that accepts these columns, including the large string, which gets parsed and inserted as several new rows in a SQL table. It works fine when the stored proc is called from SQL Server Management Studio, even when I pass in large amounts of text (50k+ characters).
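For reference, the proc is shaped roughly like this (the parameter names and types here are assumptions for illustration, not my actual definition); the point is that the large string parameter is NVARCHAR(MAX), so the proc itself has no 32,767-character limit:

```sql
-- Hypothetical signature; only the proc name comes from my real code.
CREATE PROCEDURE usp_ParseDataColumns
    @KeyColumn   NVARCHAR(50),
    @OtherColumn NVARCHAR(50),
    @LargeText   NVARCHAR(MAX)  -- maps to DT_NTEXT in the data flow
AS
BEGIN
    SET NOCOUNT ON;
    -- parse @LargeText and INSERT the resulting rows into the target table ...
END
```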
Now, in my data flow in SSIS, I have an OLE DB Command transformation that calls the stored proc, passing in the columns as parameters, including the large string (DT_NTEXT data type). If the text file column containing the string has fewer than about 33,000 characters, the data flow works fine. Once it hits a row in which the string column exceeds 32,767 characters, the data flow throws a truncation error and that row is never processed.
The SqlCommand property contains only one line, the stored proc call:
EXEC usp_ParseDataColumns ?,?,?
The last parameter is the string, which can get very large. I'm assuming the transformation substitutes each row's data for the parameters at run time, and that if the resulting value exceeds 32,767 characters, it truncates and raises the error.
The error output generated by the OLE DB Command transformation is:
The data value cannot be converted for reasons other than sign mismatch or data overflow.
I searched for alternatives, including using a variable, but I'm not sure that's the right path. I'm using an OLE DB Command because I need to perform further transformations on each row afterwards.
How can I resolve this, or is there a better alternative?