
How can I convert a unix epoch with milliseconds to a timestamp with milliseconds in Hive? Neither cast() nor from_unixtime() is giving me the timestamp with milliseconds.

I tried .SSS, but the function just increases the year and doesn't treat the last three digits as milliseconds.

scala> spark.sql("select from_unixtime(1598632101000, 'yyyy-MM-dd hh:mm:ss.SSS')").show(false)
+-----------------------------------------------------+
|from_unixtime(1598632101000, yyyy-MM-dd hh:mm:ss.SSS)|
+-----------------------------------------------------+
|52628-08-20 02:00:00.000                             |
+-----------------------------------------------------+


2 Answers


I think you can just cast():

select cast(1598632101000 / 1000.0 as timestamp)

Note that this produces a timestamp data type rather than the string that from_unixtime() returns.
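
As a quick sanity check (assuming a UTC session timezone, since the displayed value depends on it), dividing by 1000.0 keeps the fractional seconds, so the milliseconds survive the cast:

select cast(1598632101123 / 1000.0 as timestamp);
-- expected (UTC session): 2020-08-28 16:28:21.123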


from_unixtime() works with seconds, not milliseconds. Convert the seconds part to a timestamp string with from_unixtime(ts div 1000), concatenate '.' plus the milliseconds (mod(ts, 1000)), and cast the result as timestamp. Tested in Hive:

with your_data as (
select stack(2,1598632101123, 1598632101000) as ts
)

select cast(concat(from_unixtime(ts div 1000),'.',mod(ts,1000)) as timestamp)
from your_data;

Result:

2020-08-28 16:28:21.123
2020-08-28 16:28:21.0
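
One caveat, not covered by the test data above: mod(ts, 1000) drops leading zeros, so a value ending in 007 milliseconds would be concatenated as '.7'. A possible variant (a sketch using Hive's lpad(), with hypothetical sample values) zero-pads the millisecond part to three digits:

with your_data as (
select stack(2, 1598632101007, 1598632101123) as ts
)

select cast(concat(from_unixtime(ts div 1000), '.', lpad(cast(mod(ts, 1000) as string), 3, '0')) as timestamp)
from your_data;
-- expected (UTC session): 2020-08-28 16:28:21.007 and 2020-08-28 16:28:21.123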