3
votes

What is the difference between Apache Sqoop and Hive? I know that Sqoop is used to import/export data between an RDBMS and HDFS, and Hive is a SQL abstraction layer on top of Hadoop. Can I use Sqoop to import data into HDFS and then use Hive to query it?

4 Answers

4
votes

Yes, you can. In fact, many people use Sqoop and Hive together for exactly the workflow you describe.

In my project, I had to load historical data from my RDBMS (Oracle) and move it into HDFS. I defined Hive external tables over that path, which let me run Hive queries to do transformations. We also wrote MapReduce programs on top of this data to produce various analyses.
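That workflow can be sketched roughly like this (a minimal example, not the exact commands from the project — the connection string, credentials, table, and path names are all placeholders, and it assumes a running Hadoop cluster with Sqoop and Hive installed):

```shell
# Import a table from Oracle into HDFS (all names here are hypothetical)
sqoop import \
  --connect jdbc:oracle:thin:@dbhost:1521:ORCL \
  --username etl --password-file /user/etl/.oracle_pw \
  --table SALES_HISTORY \
  --target-dir /data/raw/sales_history \
  --fields-terminated-by ','

# Overlay a Hive external table on the imported files so they can be queried
hive -e "
CREATE EXTERNAL TABLE sales_history (
  sale_id   BIGINT,
  sale_date STRING,
  amount    DOUBLE
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
LOCATION '/data/raw/sales_history';"
```

Because the table is external, dropping it in Hive leaves the imported files in HDFS untouched.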

2
votes

Sqoop transfers data between HDFS and relational databases. You can use Sqoop to transfer data from a relational database management system (RDBMS) such as MySQL or Oracle into HDFS and run MapReduce on the transferred data. Sqoop can export the transformed data back into an RDBMS as well. More info: http://sqoop.apache.org/docs/1.4.3/index.html
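A round trip might look like this (the MySQL connection string, credentials, table names, and paths are hypothetical, and the commands assume a running cluster with Sqoop installed):

```shell
# Pull a MySQL table into HDFS
sqoop import \
  --connect jdbc:mysql://dbhost/shop \
  --username etl --password-file /user/etl/.mysql_pw \
  --table orders \
  --target-dir /data/raw/orders

# ...run MapReduce or Hive transformations that write to /data/out/orders_agg...

# Push the transformed data back into a MySQL table
sqoop export \
  --connect jdbc:mysql://dbhost/shop \
  --username etl --password-file /user/etl/.mysql_pw \
  --table orders_agg \
  --export-dir /data/out/orders_agg
```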

Hive is data warehouse software that facilitates querying and managing large datasets residing in HDFS. Hive applies schema on read (as opposed to the schema on write of an RDBMS) and provides the ability to query the data using a SQL-like language called HiveQL. More info: https://hive.apache.org/
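To illustrate schema on read (the table name, columns, and HDFS path below are hypothetical): the raw files can land in HDFS first, and the schema is only applied when a query reads them.

```shell
# The files already sit in HDFS; the table definition just overlays a schema.
hive -e "
CREATE EXTERNAL TABLE weblogs (
  ts      STRING,
  user_id BIGINT,
  url     STRING
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
LOCATION '/data/raw/weblogs';

-- HiveQL looks like SQL; Hive compiles it down to distributed jobs
SELECT user_id, COUNT(*) AS hits
FROM weblogs
GROUP BY user_id;"
```

If the files don't match the declared schema, the mismatch surfaces at query time (typically as NULLs), not at load time — that is the practical difference from schema on write.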

1
votes

Yes, you can. As a matter of fact, that's exactly how it is meant to be used.

0
votes

I) Sqoop:
1. With this tool we can integrate HDFS with external data sources — SQL databases, NoSQL stores, and data warehouses — and it works in both directions, so we can export data as well as import it.
2. Sqoop can also move data from a relational database into HBase.

Hive:
1. As per my understanding, we can import data from SQL databases into Hive, but not from NoSQL databases.
2. We can't export data from HDFS into SQL databases.
II) We can use both together, in two steps:

1. Generate the Hive table from the RDBMS schema:

```shell
sqoop create-hive-table --connect jdbc:mysql://<hostname>/<dbname> --table <table name> --fields-terminated-by ','
```

The command above generates a Hive table with the same name and schema as the source table.

2. Load the data:

```shell
hive> LOAD DATA INPATH <filename> INTO TABLE <table name>;
```

This can be shortened to one step if you know that you want to import straight from a database directly into Hive:

```shell
sqoop import --connect jdbc:mysql://<hostname>/<dbname> --table <table name> -m 1 --hive-import
```