2 votes

I'm using Azure Blob storage and Data Factory with an HDInsight cluster. I have a shell script that contains Hadoop- and Hive-related code, and I'm trying to create a Hive/Pig activity in ADF. From the Pig/Hive code I'm calling the shell script, as follows:

myFile.pig

sh /myFolder/myscript.sh

==========================

myFile.hql

!/myFolder/myscript.sh

While executing, I'm getting a java.io.IOException: No such file or directory. As per the exception, the Pig/Hive script is not able to resolve the shell script path.

Has anyone faced a similar issue, or deployed a Pig/Hive activity along with a shell script from ADF?

I've tried multiple approaches and every path combination I could think of to pass the location of the shell script, but it was not picked up. Any help, suggestion, or pointer will be highly appreciated.

Thanks in advance.

1
Did you get the answer? I'm having the same problem. – Vignesh I
Yes, we've got the solution. – Prashant
What was it? Could you please share? – Vignesh I
1) Create a Hive activity in ADF. 2) Set hive.execution.engine=mr; in the Hive script. 3) Call your shell script as: !sh myscript.sh – Prashant
Could you please expand on how to make ADF recognize the shell script path? – Vignesh I
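Putting the steps from the comments above together, a minimal myFile.hql sketch might look like the following; the engine setting comes from the comments, while the script path is an illustrative assumption, not from the original thread:

myFile.hql

    -- Force the MapReduce execution engine, as suggested in the comments
    set hive.execution.engine=mr;

    -- Invoke the shell script through Hive's shell escape (path is a placeholder)
    !sh /myFolder/myscript.sh;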

1 Answer

0 votes

Upload the shell script to blob storage and then invoke that script from Pig or Hive. Below are the steps.

Hive

    !sh hadoop fs -ls wasbs://containerName@StorageAccountName.blob.core.windows.net/pathToScript/testshell.ksh

Pig

    sh hadoop fs -ls wasbs://containerName@StorageAccountName.blob.core.windows.net/pathToScript/testshell.ksh
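Note that hadoop fs -ls only confirms the script is reachable at the wasbs:// path. A minimal sketch of one way to actually run it from the Hive script, assuming placeholder container, storage account, and file names, is to copy it to the node's local filesystem first and execute it from there:

Hive

    -- Pull the script down from blob storage to the local filesystem of the node
    !sh hadoop fs -copyToLocal wasbs://containerName@StorageAccountName.blob.core.windows.net/pathToScript/testshell.ksh /tmp/testshell.ksh;

    -- Make it executable and run it
    !sh chmod +x /tmp/testshell.ksh;
    !sh /tmp/testshell.ksh;

The same pattern should work in Pig by replacing Hive's ! shell escape with Pig's sh command.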