I have a simple data flow. The source is a small flat file with approximately 16k rows in it. The destination is an OLE DB destination: a SQL Server 2008 table with a 3-part unique key on it. The data flow passes through some simple transformations: Row Count, Derived Column, Data Conversion, etc.
All simple and all that works fine.
My problem is that within this data there are 2 rows that are duplicates in terms of that unique key: 2 duplicate rows that violate the key, so 4 rows in total. On the OLE DB destination I have set the error output to Redirect Row, and the failing rows are sent to an error table that has enough columns for me to identify the bad rows.
The problem is that even though there are only 4 culprits, the transformation keeps writing 1268 rows to the error table.
Any ideas?
Thanks.
**
Just to add: if I remove the 2 duplicate rows, the whole file imports successfully, all 16,875 rows. There is no question that only 2 rows violate the key, yet the error redirection affects 1268.
**
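In case it helps, this is roughly how I confirmed that only 2 rows in the file duplicate the 3-part key. It's just a sketch: the sample data and the key column names (`col_a`, `col_b`, `col_c`) are stand-ins, not the real file layout.

```python
from collections import Counter
import csv
import io

# Hypothetical sample standing in for the real flat file; the actual
# file and key column names are assumptions, not the real schema.
sample = """col_a,col_b,col_c,value
1,A,X,100
1,A,X,100
2,B,Y,200
3,C,Z,300
3,C,Z,300
"""

reader = csv.DictReader(io.StringIO(sample))

# Count each 3-part key tuple across the whole file.
key_counts = Counter(
    (row["col_a"], row["col_b"], row["col_c"]) for row in reader
)

# Any key appearing more than once violates the 3-part unique key.
dupes = {key: n for key, n in key_counts.items() if n > 1}
print(dupes)
# → {('1', 'A', 'X'): 2, ('3', 'C', 'Z'): 2}
```

Against the real file (swapping in `open(path)` and the actual key columns), this reports exactly 2 offending key values, which is what makes the 1268 redirected rows so puzzling.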