
I have a package where I load CSV source files into a table. I have two types of source files, Employee and EmployeeDetails.

  1. For example, the Employee CSV file has 4 columns: eid, ename, designation, age.

  2. The EmployeeDetails CSV file has 6 columns: eid, ename, designation, age, depatname, deptid.

  3. My destination table has eid, ename, designation, age, depatname, deptid.

Now, I want to load these two files based on a file-type variable in my Variables collection. I have completed that part as well.

So instead of using two Data Flow Tasks, I used only one. My DFT contains the following transformations:

Raw File Source
|
Derived Column Transformation
|
OLE DB Destination

In the Control Flow I have used an Execute Package Task which calls a child package. When I set the file type to Employee, it runs the Employee package, which takes the Employee CSV file and loads it into a raw file destination.

If I select the file type EmployeeDetails, it runs the EmployeeDetails package, which takes the EmployeeDetails CSV file and loads it into a raw file destination.

Now I take this raw file and need to load it into the destination table. For one file I am able to load it, but when I load the second file, be it either the Employee file or the EmployeeDetails file, a metadata problem occurs.

Because of this I need to go into the package and change each and every transformation.

I tried setting ValidateExternalMetadata to false for all the transformations, and at the package level as well.

How do I overcome this problem and load both files?

Comment from billinkc: "Take a peek at my answer on stackoverflow.com/questions/10686785/… I think you're running into the same problem."

1 Answer


Since the two files have different columns, you need two data flows. You could have a step in the control flow that chooses the correct one based on the filename you pass in as a variable.
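Outside of SSIS, that branching can be sketched in plain Python as an illustration only; the table name `employees`, the assumption that the CSVs have no header row, and the column lists (taken from the question, including the `depatname` spelling) are all assumptions here:

```python
import csv
import sqlite3

# Column lists as given in the question; "depatname" is spelled as in the source.
EMPLOYEE_COLS = ["eid", "ename", "designation", "age"]
DETAILS_COLS = ["eid", "ename", "designation", "age", "depatname", "deptid"]

def load_file(conn, path, file_type):
    """Pick the column mapping from the file-type variable, mirroring a
    control-flow branch that routes to one of two data flows."""
    cols = EMPLOYEE_COLS if file_type == "Employee" else DETAILS_COLS
    placeholders = ", ".join("?" * len(cols))
    sql = f"INSERT INTO employees ({', '.join(cols)}) VALUES ({placeholders})"
    with open(path, newline="") as f:
        # fieldnames is passed explicitly, so the CSV is assumed headerless.
        for row in csv.DictReader(f, fieldnames=cols):
            conn.execute(sql, [row[c] for c in cols])
    conn.commit()
```

Each branch inserts only the columns its file actually has, so the two shapes never share one metadata definition, which is the root of the error in the question.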

But since the details file has all the columns of the employee file, why not just process that one and ignore the other?
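The same subset relationship also allows a single loader: read whatever columns a file provides and pad the rest with NULL. Again a plain-Python sketch rather than an SSIS solution, and here the CSVs are assumed to carry a header row naming their columns:

```python
import csv
import sqlite3

# Destination table columns from the question.
DEST_COLS = ["eid", "ename", "designation", "age", "depatname", "deptid"]

def load_any(conn, path):
    """One loader for both file types: read the file's own header,
    then fill the missing destination columns with NULL."""
    placeholders = ", ".join("?" * len(DEST_COLS))
    sql = f"INSERT INTO employees ({', '.join(DEST_COLS)}) VALUES ({placeholders})"
    with open(path, newline="") as f:
        for row in csv.DictReader(f):  # header row supplies the field names
            # row.get(c) is None for columns absent from this file type.
            conn.execute(sql, [row.get(c) for c in DEST_COLS])
    conn.commit()
```

With this shape there is nothing to branch on at all: an Employee file simply yields NULL for depatname and deptid.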