I've got Hadoop / HBase / Hive set up and running (I can create files on HDFS, run MapReduce jobs, and create a "table" in HBase and also in Hive) on my Mac with OS X 10.9. I'm now trying to import data from a MySQL table into HBase via Sqoop (using --query, not --table, etc.). I am getting this error with this command.
--COMMAND
sqoop import --connect jdbc:mysql:///joshLocal --username root --query "SELECT * FROM BITLOG WHERE \$CONDITIONS" --split-by oozie_job.id --hbase-table bitlogTest --hbase-create-table --column-family bitLogColumn
--ERROR
ERROR tool.ImportTool: Error during import: HBase jars are not present in classpath, cannot import to HBase!
I believe that all the export vars are correctly set up. I have the following in sqoop-env.sh:
export HADOOP_HOME="/usr/local/Cellar/hadoop/2.4.0"
export HBASE_HOME="/usr/local/Cellar/hbase/0.98.1"
export HIVE_HOME="/usr/local/Cellar/hive/0.13.0"
export ZOOCFGDIR="/usr/local/etc/zookeeper"
export HCAT_HOME="/usr/local/Cellar/hive/0.13.0/libexec/hcatalog"
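To sanity-check those paths, I put together a small shell snippet (the directory layout it assumes is just my guess at how Homebrew arranges things: the bare Cellar version directory is a wrapper, and the real install with its lib/ folder sits under libexec):

```shell
# Check whether a candidate HBASE_HOME actually contains the HBase jars
# Sqoop needs on its classpath. With Homebrew, the jars live under
# libexec/lib, so pointing HBASE_HOME at the bare version directory
# may be why Sqoop says the jars are missing.
check_hbase_home() {
  if ls "$1"/lib/hbase-*.jar >/dev/null 2>&1; then
    echo "HBase jars found under $1/lib"
  else
    echo "no HBase jars under $1/lib"
  fi
}
check_hbase_home "/usr/local/Cellar/hbase/0.98.1"          # wrapper dir
check_hbase_home "/usr/local/Cellar/hbase/0.98.1/libexec"  # real install?
```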
One thing I did that gave a different message was changing HBASE_HOME to point to HBASE_HOME/libexec/lib in sqoop-env.sh. That gave me a
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/util/Tool
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
error. I have seen some advice saying I need to copy Hadoop security jar files into HBase's installation. I don't know exactly which files need to go over, or whether that is even the issue; the only reason I suspect it is that java.security.SecureClassLoader appears in the stack trace.
I'm sorry if this is a really basic Java question, but I'm a complete novice with it.
One other, even more basic, Java question: when we define HADOOP_HOME, HBASE_HOME, etc., what are we "telling" the other Java programs that rely on that info? Are we saying "here is the executable Java file", or "here are the jar files in the lib folder"? I don't quite understand what I should actually be pointing to, because I don't know how that path is used.
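My current (possibly wrong) understanding is that these *_HOME variables name each project's install root, and that launcher scripts like bin/sqoop build a Java classpath out of the jars underneath them, roughly like this sketch:

```shell
# Rough sketch of how I understand the *_HOME variables get used: the
# launcher globs every jar under $SOME_HOME/lib and joins them into a
# classpath string. So the variable should point at a directory that
# contains bin/ and lib/, not at a single executable or jar.
build_classpath() {
  cp=""
  for jar in "$1"/lib/*.jar; do
    [ -e "$jar" ] && cp="$cp:$jar"
  done
  echo "${cp#:}"
}
# e.g. java -cp "$(build_classpath "$HBASE_HOME")" SomeMainClass
```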
UPDATE-1: I tried changing my connection string in the Sqoop import to a new user I created in MySQL (sqoop@192.168.1.5). The new call to Sqoop is
sqoop import --connect jdbc:mysql://wolfs-MacBook-Pro.local:3306/joshLocal --username sqoop --password sqoop --query "SELECT * FROM BITLOG WHERE \$CONDITIONS" --split-by oozie_job.id --hbase-table bitlogTest --hbase-create-table --column-family bitLogColumn
Same error as before: HBase jars are not present. Also, I created some HBase 'tables' with 'columns' that match my Sqoop request, thinking that may be part of the issue.
UPDATE-2: I updated the export paths in my sqoop-env.sh config to
export HADOOP_HOME="/usr/local/Cellar/hadoop/2.4.0"
export HBASE_HOME="/usr/local/Cellar/hbase/0.98.1/libexec"
export HIVE_HOME="/usr/local/Cellar/hive/0.13.0/libexec"
export ZOOCFGDIR="/usr/local/etc/zookeeper"
export HCAT_HOME="/usr/local/Cellar/hive/0.13.0/libexec/hcatalog"
and now I'm getting this error
Exception in thread "main" java.util.ServiceConfigurationError: org.apache.hadoop.fs.FileSystem: Provider org.apache.hadoop.hdfs.DistributedFileSystem could not be instantiated: java.lang.NoSuchMethodError: org.apache.hadoop.conf.Configuration.addDeprecations([Lorg/apache/hadoop/conf/Configuration$DeprecationDelta;)V
at java.util.ServiceLoader.fail(ServiceLoader.java:207)
at java.util.ServiceLoader.access$100(ServiceLoader.java:164)
at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:360)
at java.util.ServiceLoader$1.n
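That NoSuchMethodError made me suspect (just a guess) that two different versions of hadoop-common end up on the classpath: Hadoop 2.4.0's own jar plus an older one bundled inside the HBase or Hive install. This snippet lists every hadoop-common jar across the installs so the versions can be compared:

```shell
# List every hadoop-common jar under the given install roots. If the
# version numbers disagree, the Configuration.addDeprecations error
# would be explained by an older jar shadowing the Hadoop 2.4.0 one.
list_hadoop_common_jars() {
  for d in "$@"; do
    [ -d "$d" ] && find "$d" -name 'hadoop-common-*.jar'
  done
  return 0
}
list_hadoop_common_jars \
  /usr/local/Cellar/hadoop/2.4.0 \
  /usr/local/Cellar/hbase/0.98.1 \
  /usr/local/Cellar/hive/0.13.0
```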
UPDATE-3: I updated the export path for Hadoop in sqoop-env.sh to
export HADOOP_HOME="/usr/local/Cellar/hadoop/2.4.0/libexec"
Getting close! The import job created the HBase table, but no data made it in. Now I get this error:
jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
Exception in thread "main" java.lang.IncompatibleClassChangeError: Found interface org.apache.hadoop.mapreduce.JobContext, but class was expected
at org.apache.sqoop.mapreduce.DelegatingOutputFormat.checkOutputSpecs(DelegatingOutputFormat.java:63)
at org.apache.hadoop.mapreduce.JobSubmitter.checkSpecs(JobSubmitter.java:458)
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:343)
BTW - I'm running Sqoop 1.4.4.
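From what I've read (not certain it applies here), "Found interface org.apache.hadoop.mapreduce.JobContext, but class was expected" is the signature of a Sqoop built against Hadoop 1.x running on Hadoop 2.x, because JobContext changed from a class to an interface between the two lines. Sqoop 1.4.4 ships separate builds whose artifact names say which Hadoop line they target; a tiny helper to tell them apart:

```shell
# Classify a Sqoop 1.4.4 artifact name by the Hadoop line it was built
# for. The Hadoop 1.x build throws IncompatibleClassChangeError on a
# Hadoop 2.x cluster, which (I think) matches the error above.
hadoop_line_of() {
  case "$1" in
    *__hadoop-2*)              echo "built for Hadoop 2.x (should work with hadoop 2.4.0)" ;;
    *__hadoop-1*|*__hadoop-0*) echo "built for Hadoop 1.x/0.x (breaks on hadoop 2.4.0)" ;;
    *)                         echo "unknown build" ;;
  esac
}
hadoop_line_of "sqoop-1.4.4.bin__hadoop-1.0.0"        # the Hadoop 1 artifact
hadoop_line_of "sqoop-1.4.4.bin__hadoop-2.0.4-alpha"  # the Hadoop 2 artifact
```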