@kuldeep - You can't load this data as-is, since newline characters appear both as the row delimiter and as values within the data. You need to escape the newlines before exporting the data, and then declare the escape character in Snowflake's file format before loading it into the DW.
When you export this data from the source system into S3/blob storage, make sure you add quotes and an escape character. This ensures that (1) the newline characters in row 1 are enclosed in quotes, and (2) the tag quotes in row 2 are escaped and then enclosed in quotes.
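As a sketch of that export step, the snippet below uses Python's stdlib csv module to produce output matching the file format further down: every field wrapped in double quotes, embedded quotes escaped with a backslash, and rows delimited by '\n'. The sample rows are hypothetical stand-ins for the data in the question (one value containing a newline, one containing double quotes).

```python
import csv
import io

# Hypothetical sample rows mirroring the question:
# row 1 has a value containing a newline, row 2 a value containing double quotes.
rows = [
    (1, "line one\nline two"),
    (2, 'a "quoted" tag'),
]

buf = io.StringIO()
writer = csv.writer(
    buf,
    quoting=csv.QUOTE_ALL,   # enclose every field in double quotes
    doublequote=False,       # don't double embedded quotes ("")...
    escapechar="\\",         # ...prefix them with backslash, matching ESCAPE = '\134'
    lineterminator="\n",     # matches RECORD_DELIMITER = '\n'
)
writer.writerows(rows)
print(buf.getvalue())
```

With every field enclosed in quotes, the embedded newline in row 1 doesn't need its own escape: Snowflake treats a record delimiter inside an enclosed field as data, not as a row boundary.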
If the data is specific to a particular table, you can create a dedicated file format in Snowflake for that table and use it with the COPY statement. For instance, choose the CSV type with backslash as the escape character and FIELD_OPTIONALLY_ENCLOSED_BY set to double quote.
CREATE FILE FORMAT CSV_ESC_DQ
  TYPE = 'CSV'
  COMPRESSION = 'AUTO'
  FIELD_DELIMITER = ','
  RECORD_DELIMITER = '\n'
  SKIP_HEADER = 0
  FIELD_OPTIONALLY_ENCLOSED_BY = '\042'
  TRIM_SPACE = FALSE
  ERROR_ON_COLUMN_COUNT_MISMATCH = TRUE
  ESCAPE = '\134'
  ESCAPE_UNENCLOSED_FIELD = '\134'
  DATE_FORMAT = 'AUTO'
  TIMESTAMP_FORMAT = 'AUTO'
  NULL_IF = ('\\N');
COPY INTO table_name
FROM @stage/path_to_file/
FILE_FORMAT = (FORMAT_NAME = 'CSV_ESC_DQ');