From your question I assume that you already have your data in HDFS, so you don't need LOAD DATA, which would move the files to the default Hive warehouse location /user/hive/warehouse.
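For comparison, loading the files into a managed table would look like this (the path and table name are placeholders); note that Hive moves the files rather than copying them:
LOAD DATA INPATH '/my/location/in/hdfs' INTO TABLE table_name;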
Instead, you can define the table using the EXTERNAL keyword, which leaves the files in place but creates the table definition in the Hive metastore. See here:
Create Table DDL
e.g.:
CREATE EXTERNAL TABLE table_name (
  id INT,
  myfields STRING
)
LOCATION '/my/location/in/hdfs';
Note that your file format might differ from Hive's default (as JigneshRawal mentioned in the comments). You can specify your own delimiter, for example when importing comma-delimited data with Sqoop:
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
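Putting the two together (with the same placeholder table name, columns, and path as above), the full DDL for comma-delimited files would be:
CREATE EXTERNAL TABLE table_name (
  id INT,
  myfields STRING
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
LOCATION '/my/location/in/hdfs';
Note that the ROW FORMAT clause has to come before LOCATION.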