As @Diego mentioned, you would use a Lookup transformation
within the Data Flow task to achieve this.
Your data flow task would look something like this. Here, the Flat File source reads the CSV file and passes the data to the Lookup transformation. This transformation checks for existing data in the destination table (say the table is named dbo.Destination). The configuration of the Lookup transformation is shown in the next screenshots. If there are no matching records, the data from the CSV file is sent to the OLE DB Destination; otherwise, the data is discarded.

In the Lookup transformation, you choose the destination database table on the Connection tab. On the Columns tab, you map the columns that you want to check for existing data. In this case, the columns eid, name and asofdate coming from the CSV file are matched against columns of the same names in the database table dbo.Destination. If the incoming values for these three columns match an existing row in the table, the row is not sent further down the stream.
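
For reference, here is a set-based sketch of the same check the Lookup performs. It is only an illustration: it assumes the CSV rows have already been landed in a hypothetical staging table dbo.Staging with the same three columns, and inserts only the rows whose (eid, name, asofdate) combination is not already in dbo.Destination.

    -- Hypothetical staging table holding the rows read from the CSV file.
    -- (In the SSIS package this role is played by the data flow pipeline itself.)
    -- CREATE TABLE dbo.Staging (eid INT, name VARCHAR(100), asofdate DATE);

    -- Insert only the rows whose (eid, name, asofdate) combination is not
    -- already present in dbo.Destination -- the same comparison configured
    -- on the Columns tab of the Lookup transformation.
    INSERT INTO dbo.Destination (eid, name, asofdate)
    SELECT s.eid, s.name, s.asofdate
    FROM dbo.Staging AS s
    WHERE NOT EXISTS (
        SELECT 1
        FROM dbo.Destination AS d
        WHERE d.eid = s.eid
          AND d.name = s.name
          AND d.asofdate = s.asofdate
    );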
Hope that gives you an idea.


INSERT INTO target_table(...) SELECT ... FROM temp_table;
If there are duplicates, the insert will fail (and the whole batch is ignored).

BTW: You are not very clear about your asofdate
column. If that is to describe the validity of a single row, why do you want to reject the whole batch? – wildplasser
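
If the goal is to skip duplicate rows rather than fail the whole batch, the insert can be written as an anti-join against the target. This is only a sketch: the column names eid, name and asofdate are taken from the CSV described above, and it assumes eid is not nullable in target_table.

    -- Skip rows already present in target_table instead of letting a
    -- duplicate-key error abort the whole INSERT.
    -- Column names (eid, name, asofdate) are assumed from the CSV above.
    INSERT INTO target_table (eid, name, asofdate)
    SELECT DISTINCT t.eid, t.name, t.asofdate
    FROM temp_table AS t
    LEFT JOIN target_table AS d
           ON  d.eid      = t.eid
          AND  d.name     = t.name
          AND  d.asofdate = t.asofdate
    WHERE d.eid IS NULL;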