1
votes

My HBase table has columns containing bigint values. Those bigints were declared on the Hive side, as I used Hive to generate all of HBase's HFiles for bulk loading.
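For context, the HFiles were produced with Hive's HiveHFileOutputFormat, roughly as sketched below; the table name, column list, and output path are placeholders, not my actual job:

-- staging table whose writes land as HFiles for column family cf
CREATE TABLE hfiles_1month (uuid string, p_le_id bigint)
STORED AS
INPUTFORMAT 'org.apache.hadoop.mapred.TextInputFormat'
OUTPUTFORMAT 'org.apache.hadoop.hive.hbase.HiveHFileOutputFormat'
TBLPROPERTIES ('hfile.family.path' = '/tmp/hfiles_1month/cf');

-- the generated HFiles are then loaded into HBase with completebulkload;
-- rows must be emitted in total row-key order for that step to succeed
INSERT OVERWRITE TABLE hfiles_1month
SELECT uuid, p_le_id FROM source_1month CLUSTER BY uuid;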

From the HBase shell I can print the row and see the expected integer value:

...
00000020-079e-4e9f-800b-e71937a78b5d    column=cf:p_le_id, timestamp=1428571993408, value=1395243843
...

From Phoenix, selecting the same row returns a negative value:

select "p_le_id" from "bulk_1month" where UUID = '00000020-079e-4e9f-800b-e71937a78b5d';

I tried several types when declaring the column in Phoenix. None of them matched the HBase value:

-- BIGINT: -5678131804545731784
-- UNSIGNED_INT: 825440565
-- UNSIGNED_LONG: 3545240232309044024
-- UNSIGNED_FLOAT: 2.6080447E-9
-- INTEGER: -1322043083

One interesting point, however: the HBase value is 1,395,243,843, and the Phoenix type whose result looks most similar in magnitude is INTEGER.

Thanks in advance for your suggestions!


1 Answer

2
votes

I noticed that I can read the correct value when declaring the numeric column as VARCHAR in Phoenix.

This is odd, since the source column was declared as bigint on the Hive side. Most likely the bulk-load pipeline wrote the value as its decimal text rather than as an 8-byte binary long, which would also explain why the HBase shell prints a readable number instead of escaped bytes.
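If a numeric value is still needed on the Phoenix side, one option is to keep the VARCHAR mapping and convert at query time with TO_NUMBER. A minimal sketch, assuming a view over the existing table (only the table, family, and column names come from the question):

-- map the text-encoded cell as VARCHAR
CREATE VIEW "bulk_1month" (
    UUID VARCHAR PRIMARY KEY,
    "cf"."p_le_id" VARCHAR
);

-- convert the text back to a number when querying
SELECT TO_NUMBER("p_le_id") AS p_le_id
FROM "bulk_1month"
WHERE UUID = '00000020-079e-4e9f-800b-e71937a78b5d';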