0
votes

Hi, I have a shell script which I am trying to run as a Pig activity in Azure Data Factory. The Pig script content is

sh containername/testshell.ksh

And it throws "No such file or directory". I tried using the full path as well, but no luck.

Could someone give some input on how to specify the correct path?
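For context, the activity is defined roughly like the ADF (v1) HDInsight Pig activity fragment below; this is a sketch only, and the linked-service names and script path here are placeholders rather than values from my actual pipeline:

```json
{
    "name": "RunPigScript",
    "type": "HDInsightPig",
    "typeProperties": {
        "scriptPath": "containername/pigscript.pig",
        "scriptLinkedService": "StorageLinkedService"
    },
    "linkedServiceName": "HDInsightOnDemandLinkedService"
}
```

The Pig script referenced by `scriptPath` is the one containing the `sh` line above.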

Maybe this link could help you. – Shui shengbao
@Walter-MSFT Thanks for the attention. That link is about how to launch Pig jobs from a Linux shell; mine is the inverse — I am trying to run a shell script inside a Pig script. And I am creating an on-demand Hadoop cluster to perform the task. – Vignesh I
Sounds like you are trying to do something that ADF isn't really designed for. You might be better off using a Custom Activity rather than a Pig job. – Paul Andrew

2 Answers

0
votes

Upload the shell script to blob storage and then invoke it from Pig or Hive. Below are the steps.

Hive:

!sh hadoop fs -ls wasbs://containerName@StorageAccountName.blob.core.windows.net/pathToScript/testshell.ksh

Pig:

sh hadoop fs -ls wasbs://containerName@StorageAccountName.blob.core.windows.net/pathToScript/testshell.ksh
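Note that `hadoop fs -ls` only verifies the script is reachable; to actually execute it from Pig, one approach (a sketch only — `containerName`, `StorageAccountName`, and the path are placeholders, and this is untested against an on-demand cluster) is to copy the script to the node's local disk first and then run it:

```
fs -copyToLocal wasbs://containerName@StorageAccountName.blob.core.windows.net/pathToScript/testshell.ksh testshell.ksh
sh chmod +x testshell.ksh
sh ./testshell.ksh
```

Here `fs` passes its arguments to the Hadoop file system shell, and `sh` runs the command on the local node, so the relative path `./testshell.ksh` resolves against the job's working directory where the copy landed.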
0
votes

Try the following solution:

sh bash containername/testshell.ksh

and make sure your shebang directive is set accordingly.
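As a minimal local illustration of the shebang point (independent of the cluster; the script name and its contents here are hypothetical): a `.ksh` file whose first line names an interpreter can be run directly once it is marked executable, which is what the `sh` call in the Pig script ultimately relies on.

```shell
# create a hypothetical test script with an explicit shebang
cat > testshell.ksh <<'EOF'
#!/bin/bash
echo "hello from testshell"
EOF

chmod +x testshell.ksh   # without the execute bit, ./testshell.ksh is refused
./testshell.ksh          # the shebang selects /bin/bash as the interpreter
# prints "hello from testshell"
```

Prefixing the call with an explicit interpreter, as in the answer above (`bash script.ksh`), sidesteps both the execute bit and a missing or wrong shebang.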