Oozie suppresses logging from shell job action?


I have a simple workflow (see below) that runs a shell script. The shell script runs a PySpark script and then moves files from a local folder to HDFS.

When I run the shell script by itself, it works perfectly, and the logs are redirected to a file with > spark.txt 2>&1 right inside the shell script.
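For context, driver-script.sh is essentially just a spark-submit call followed by an hdfs put, with everything redirected into spark.txt. A rough sketch (the commands and paths below are placeholders, not my exact script):

    #!/bin/bash
    # rough placeholder for driver-script.sh, not the real script
    # $1 = mode flag ("s"), $2 = pyspark script, $3 = hdfs path, $4 = local path
    {
        spark-submit "$2" "$3" "$4"      # run the pyspark script
        hdfs dfs -put -f "$4" "$3"       # copy the local folder to hdfs
    } > spark.txt 2>&1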

But when I submit an Oozie job with the workflow below, the shell output seems to be suppressed. I tried to redirect any possible Oozie client logs (-verbose, -log) with > oozie.txt 2>&1, but that didn't help.
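The submit command looked roughly like this (a sketch only; it assumes OOZIE_URL is set, and job.properties and the job id are placeholders):

    # submit the workflow and capture any client-side output
    oozie job -config job.properties -run -verbose > oozie.txt 2>&1
    # fetch the oozie job log for the returned job id
    oozie job -log <job-id> >> oozie.txt 2>&1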

The workflow finishes successfully (status SUCCEEDED, no error log), but as far as I can see the folder is not copied to HDFS, whereas when the script runs alone (not through Oozie) it works fine.

<action name="forceloadfromlocal2hdfs">
    <shell xmlns="uri:oozie:shell-action:0.1">
        <job-tracker>${jobtracker}</job-tracker>
        <name-node>${namenode}</name-node>
        <configuration>
            <property>
                <name>mapred.job.queue.name</name>
                <value>${queuename}</value>
            </property>
        </configuration>
        <exec>driver-script.sh</exec>
        <argument>s</argument>
        <argument>script.py</argument>
        <!-- arguments for the py script -->
        <argument>hdfspath</argument>
        <argument>localpath</argument>
        <file>driver-script.sh#driver-script.sh</file>
    </shell>
    <ok to="end"/>
    <error to="killaction"/>
</action>

Thanks a lot!

Edit: Thanks for the advice, I found the full log under

    yarn logs -applicationId [application_xxxxxx_xxxx]
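With YARN log aggregation enabled, the shell action's stdout/stderr show up inside those container logs, and the output can be saved to a file with the same redirect pattern as above, e.g.:

    yarn logs -applicationId [application_xxxxxx_xxxx] > yarn-app.txt 2>&1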
