I have 2 tables in Hive, ORDER and ORDER_DETAILS (a 1:n relation, joined on order_id), which I am trying to load into a single table by taking advantage of Hive complex data types (a map of structs).

Say ORDER has the data below:

Order_id  total_amount  customer
123       10.00         1
456       12.00         2

and ORDER_DETAILS has

Order_id  Order_Item_id  Item_amount  Item_type
123       1              5.00         A
123       2              5.00         B
456       1              6.00         A
456       2              3.00         B
456       3              3.00         C

I would like to create a single table, ORDERS, with all of the order columns plus the order-detail columns as a map of structs. This keeps related data together so it can be queried in one place, avoiding frequent joins. I have loaded tables with complex data types from txt/json input files using the respective SerDes, and that works well. In this scenario, however, I want to load data from the two existing Hive tables (stored as ORC) into the new table. I tried a basic insert using the named_struct function, but it loads each row separately and does not combine rows with the same order_id into a single row.
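Roughly what I tried looks like the sketch below (the exact statement may differ; the column names are assumed from the sample data above):

INSERT INTO TABLE ORDERS
SELECT
  o.order_id,
  -- map() here builds a one-entry map per detail row, which is why rows are not combined
  map(d.order_item_id, named_struct('item_amount', d.item_amount, 'item_type', d.item_type)),
  o.total_amount,
  o.customer
FROM ORDER_DETAILS d
JOIN ORDER o ON (d.order_id = o.order_id);

Because the join produces one output row per ORDER_DETAILS row, I get the per-item result shown further below instead of one row per order.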

The expected output is something like:

123 10.00 1 {1:{5.00,A},2:{5.00,B}}
456 12.00 2 {1:{6.00,A},2:{3.00,B},3:{3.00,C}}

but I get:

123 10.00 1 {1:{5.00,A}}
123 10.00 1 {2:{5.00,B}}
456 12.00 2 {1:{6.00,A}}
456 12.00 2 {2:{3.00,B}}
456 12.00 2 {3:{3.00,C}}

How can I achieve this with just an INSERT INTO ... SELECT from the two tables? Thanks in advance.

1 Answer

I found a way to do this using the map and named_struct functions together with the custom to_map UDF posted by David Worms on his to_map UDF blog. Here is a sample:

CREATE TABLE ORDER(
  order_id bigint,
  total_amount bigint,
  customer bigint)
ROW FORMAT DELIMITED
FIELDS TERMINATED BY ','
STORED AS INPUTFORMAT 'org.apache.hadoop.mapred.TextInputFormat'
OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat';

CREATE TABLE ORDER_DETAILS(
  order_id bigint,
  Order_Item_id bigint,
  Item_amount bigint,
  Item_type string)
ROW FORMAT DELIMITED
FIELDS TERMINATED BY ','
STORED AS INPUTFORMAT
  'org.apache.hadoop.mapred.TextInputFormat'
OUTPUTFORMAT
  'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat';

CREATE TABLE ORDERS(
  order_id bigint,
  Order_Items map<bigint, struct<Item_amount:bigint, Item_type:string>>,
  total_amount bigint,
  customer bigint)
ROW FORMAT SERDE 'org.apache.hadoop.hive.ql.io.orc.OrcSerde'
STORED AS INPUTFORMAT 'org.apache.hadoop.hive.ql.io.orc.OrcInputFormat'
OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.orc.OrcOutputFormat';
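
Since to_map is not a built-in function, it has to be registered in the Hive session before the INSERT below will run. The jar path and class name here are placeholders; use whatever the build of the to_map UDF actually produces:

-- placeholder path to the jar built from the to_map UDF source
ADD JAR /path/to/to_map-udf.jar;
-- placeholder class name; substitute the real fully qualified class of the UDF
CREATE TEMPORARY FUNCTION to_map AS 'com.example.hive.udaf.ToMap';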

INSERT OVERWRITE TABLE ORDERS
SELECT
  a.order_id,
  a.order_items,
  b.total_amount,
  b.customer
FROM
  (SELECT order_id,
          to_map(order_item_id, named_struct("item_amount", item_amount, "item_type", item_type)) AS order_items
   FROM ORDER_DETAILS
   GROUP BY order_id) a
JOIN ORDER b ON (a.order_id = b.order_id);

select * from ORDERS;

123 {1:{"Item_amount":5,"Item_type":"A"},2:{"Item_amount":5,"Item_type":"B"}} 10 1

456 {1:{"Item_amount":6,"Item_type":"A"},2:{"Item_amount":3,"Item_type":"B"},3:{"Item_amount":3,"Item_type":"C"}} 12 2
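
Once the data is loaded this way, individual items can be read straight from the map without joining back to ORDER_DETAILS. For example, this returns the Item_type of item 2 for each order:

-- map access by key, then struct field access by name
SELECT order_id, total_amount, Order_Items[2].Item_type FROM ORDERS;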

Hope this helps everyone.