1
vote

I sqooped data from Teradata to Hive using the Sqoop import command.

One of the Teradata tables has a date field. After sqooping, that field appears in Hive as a timestamp stored as a bigint.

But I need the field as a date datatype in the Hive table. Can anyone suggest how to achieve this?

What does a "timestamp with bigint datatype" look like? Please add some samples to your post. - Andrew
I came across this last year; the only way around it I found was to cast the field as a varchar as part of the Sqoop export. - shaine
My date field in Hue is a 13-digit timestamp of bigint type, e.g. 1450051200000. But I'm expecting a date type instead of a bigint in Hue after sqooping. - Stack 777
I've tried to cast the bigint into a string field using the to_char function. But instead of a string, is there any way to convert directly from bigint to a datetime type? - Stack 777
Why don't you use the Teradata cast(column_name as date) function in the Sqoop query? - sumitya

2 Answers

1
vote

I've had this problem. My approach was to create the Hive table first, mapping each Teradata datatype to the equivalent datatype in your Hive version. After that you can use the Sqoop argument --hive-table <table-name> to load into that table.
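As a rough sketch of that approach (the connection string, table, and column names are placeholders, and --map-column-hive is one way to force a specific Hive type for the problematic column):

```shell
# Pre-create the Hive table with the desired types, then import into it.
# Assumes the date column is named ORDER_DATE; adjust names to your schema.
sqoop import \
  --connect jdbc:teradata://<td-host>/DATABASE=<db> \
  --username <user> --password <password> \
  --table SOURCE_TABLE \
  --hive-import \
  --hive-table target_db.target_table \
  --map-column-hive ORDER_DATE=date
```

Whether date maps cleanly depends on your Hive version; on older versions you may need to land the column as string and convert afterwards.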

3
votes
select to_date(from_unixtime(your_timestamp_column));

example:

select to_date(from_unixtime(1490985000));
output: 2017-04-01

I hope this works. Please let me know if I am wrong.
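One caveat for the asker's data: from_unixtime expects seconds, but the value shown in the comments (1450051200000) is a 13-digit millisecond epoch, so divide by 1000 first, e.g. select to_date(from_unixtime(cast(1450051200000/1000 as bigint))). A quick check of the arithmetic in Python:

```python
from datetime import datetime, timezone

# 13-digit epoch value from the question, in milliseconds
ms = 1450051200000

# from_unixtime takes seconds, so divide by 1000 first;
# using UTC here — Hive's from_unixtime uses the session time zone,
# so the day can shift by one depending on your cluster's zone.
d = datetime.fromtimestamp(ms / 1000, tz=timezone.utc).date()
print(d)  # 2015-12-14
```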