I have an awkward flat file input that can be virtually any length. It is a comma delimited file, but it has embedded tables delimited by "[{" and "}]" or by "{" and "}", depending on the table type. I cannot use the off-the-shelf SSIS comma-delimited flat file source because some records have no embedded tables at all. To get around this I've set the flat file input to ragged right with a single column of 8,000 characters.
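For illustration, a record might look something like the line below (the field names, values and table layout are entirely made up; only the delimiters match my real data):

A123,Smith,2015-06-01,[{1,Widget,2.50|2,Gadget,3.75}],{Priority,Express},Complete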
I then do the string splitting in a script component and send the embedded table data to separate output streams.
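For context, the splitting logic is roughly along these lines (a simplified sketch; the real component handles both table types and several outputs, and the column and output buffer names here are just placeholders):

Public Overrides Sub Input0_ProcessInputRow(ByVal Row As Input0Buffer)
    ' Read the whole 8,000 character column as a single string
    Dim rowText As String = Row.Column0
    ' Look for an embedded "[{ ... }]" table (the "{ ... }" type is handled the same way)
    Dim tableStart As Integer = rowText.IndexOf("[{")
    If tableStart >= 0 Then
        Dim tableEnd As Integer = rowText.IndexOf("}]", tableStart)
        ' Send the embedded table to its own output
        Dim tableData As String = rowText.Substring(tableStart + 2, tableEnd - tableStart - 2)
        TableOutputBuffer.AddRow()
        TableOutputBuffer.TableData = tableData
        ' Strip the table out so the remainder is plain comma delimited data
        rowText = rowText.Remove(tableStart, tableEnd - tableStart + 2)
    End If
    MainOutputBuffer.AddRow()
    MainOutputBuffer.RecordData = rowText
End Sub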
However, I am now receiving files with records that exceed 8,000 characters, which has broken my process.
I've tried converting the flat file connection from "1252 (ANSI Latin 1)" to Unicode, with the column defined as NTEXT.
I've then inserted the following code to convert the text stream to a string (see http://www.bimonkey.com/2010/09/convert-text-stream-to-string/):
Dim TextStream As Byte() ' To hold Text Stream
Dim TextStreamAsString As String ' To Hold Text Stream converted to String
' Load Text Stream into variable
TextStream = Row.CopyofColumn0.GetBlobData(0, CInt(Row.CopyofColumn0.Length))
' Convert Text Stream to string
TextStreamAsString = System.Text.Encoding.Unicode.GetString(TextStream)
But when I look at the resulting string, I appear to get a lot of kanji-type characters and no line feeds.
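The only other thing I can think of is that the bytes in the blob are perhaps still the original single-byte 1252 data rather than UTF-16, in which case I guess I'd need to decode them with the original code page instead, something like this (untested on my side):

' Decode the blob using the 1252 code page rather than UTF-16
TextStreamAsString = System.Text.Encoding.GetEncoding(1252).GetString(TextStream)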
Any ideas what I can try next?