1 vote

I have successfully executed MapReduce Java code on this machine, and now I am trying to execute MapReduce code written in Python on the same machine. For this I am using Hadoop 3.2.1 and hadoop-streaming-3.2.1.jar.

I have tested the code locally with the command:

[dsawale@localhost ~]$ cat Desktop/sample.txt | python PycharmProjects/MapReduceCode/com/code/wordcount/WordCountMapper.py | sort | python PycharmProjects/MapReduceCode/com/code/wordcount/WordCountReducer.py

and it displays the correct output.
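(The mapper and reducer scripts themselves are not shown in the question. A typical Hadoop Streaming word-count pair reads stdin and writes tab-separated records to stdout; the sketch below is an illustration of that contract, not the actual `WordCountMapper.py`/`WordCountReducer.py`:)

```python
import sys
from itertools import groupby

def map_lines(lines):
    # Mapper side: emit one "word<TAB>1" record per token.
    for line in lines:
        for word in line.split():
            yield f"{word}\t1"

def reduce_records(records):
    # Reducer side: sum counts per word. This relies on the input being
    # sorted by key, which `sort` provides locally and the Hadoop shuffle
    # provides on the cluster.
    parsed = (rec.rsplit("\t", 1) for rec in records)
    for word, group in groupby(parsed, key=lambda kv: kv[0]):
        yield f"{word}\t{sum(int(n) for _, n in group)}"

if __name__ == "__main__":
    # Simulate `cat sample.txt | mapper | sort | reducer` in-process.
    sample = ["hadoop streaming hadoop"]
    for out in reduce_records(sorted(map_lines(sample))):
        print(out)
```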

But when I try to execute it on the Hadoop cluster using the command:

[dsawale@localhost ~]$ hadoop jar Desktop/JAR/hadoop-streaming-3.2.1.jar -mapper mapper.py -reducer reducer.py -file PycharmProjects/MapReduceCode/com/code/wordcount/WordCountMapper.py -file PycharmProjects/MapReduceCode/com/code/wordcount/WordCountMapper.py -input /sample.txt -output pysamp

I get the following output:

packageJobJar: [PycharmProjects/MapReduceCode/com/code/wordcount/WordCountMapper.py, PycharmProjects/MapReduceCode/com/code/wordcount/WordCountMapper.py, /tmp/hadoop-unjar6715579504628929924/] [] /tmp/streamjob3211585412475799030.jar tmpDir=null
Streaming Command Failed!

This is my very first Python MapReduce program. Could you please help me get rid of this error? Thanks!

Configuration files:

mapred-site.xml:

<configuration>
    <property>
        <name>mapreduce.framework.name</name>
        <value>yarn</value>
    </property>
    <property>
        <name>yarn.app.mapreduce.am.env</name>
        <value>HADOOP_MAPRED_HOME=${HADOOP_HOME}</value>
    </property>
    <property>
        <name>mapreduce.map.env</name>
        <value>HADOOP_MAPRED_HOME=${HADOOP_HOME}</value>
    </property>
    <property>
        <name>mapreduce.reduce.env</name>
        <value>HADOOP_MAPRED_HOME=${HADOOP_HOME}</value>
    </property>
</configuration>

core-site.xml:

<configuration>
    <property>
        <name>fs.defaultFS</name>
        <value>hdfs://localhost:9000</value>
    </property>
</configuration>

hdfs-site.xml:

<configuration>
    <property>
        <name>dfs.replication</name>
        <value>1</value>
    </property>
    <property>
        <name>dfs.permission</name>
        <value>false</value>
    </property>
    <property>
        <name>dfs.namenode.name.dir</name>
        <value>/home/dsawale/hadoop-3.2.1/hadoop2_data/hdfs/namenode</value>
    </property>
    <property>
        <name>dfs.datanode.data.dir</name>
        <value>/home/dsawale/hadoop-3.2.1/hadoop2_data/hdfs/datanode</value>
    </property>
</configuration>

yarn-site.xml:

<configuration>
    <!-- Site specific YARN configuration properties -->
    <property>
        <name>yarn.nodemanager.aux-services</name>
        <value>mapreduce_shuffle</value>
    </property>
    <property>
        <name>yarn.nodemanager.auxservices.mapreduce.shuffle.class</name>
        <value>org.apache.hadoop.mapred.ShuffleHandler</value>
    </property>
</configuration>
Comments:

Which is your mapper script and which is your reducer script? The file paths you have passed to -file are different from those passed to -mapper and -reducer. – franklinsijo

Mapper script: PycharmProjects/MapReduceCode/com/code/wordcount/WordCountMapper.py; Reducer script: PycharmProjects/MapReduceCode/com/code/wordcount/WordCountReducer.py – Digres

They must be passed to the -mapper and -reducer arguments. Updated it as the answer. – franklinsijo

1 Answer

1 vote

You have passed incorrect file paths to the -mapper and -reducer arguments.

Try:

hadoop jar Desktop/JAR/hadoop-streaming-3.2.1.jar \
-mapper PycharmProjects/MapReduceCode/com/code/wordcount/WordCountMapper.py \
-reducer PycharmProjects/MapReduceCode/com/code/wordcount/WordCountReducer.py \
-file PycharmProjects/MapReduceCode/com/code/wordcount/WordCountMapper.py \
-file PycharmProjects/MapReduceCode/com/code/wordcount/WordCountReducer.py \
-input /sample.txt \
-output pysamp
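If the job still fails after this fix, a common follow-up cause is that the scripts are not executable on the task nodes or lack a shebang line. A frequently used variant (a sketch, assuming python3 is on the PATH of the cluster nodes; note -mapper and -reducer reference only the file basenames because shipped files land in each task's working directory):

```shell
chmod +x PycharmProjects/MapReduceCode/com/code/wordcount/WordCountMapper.py
chmod +x PycharmProjects/MapReduceCode/com/code/wordcount/WordCountReducer.py

hadoop jar Desktop/JAR/hadoop-streaming-3.2.1.jar \
  -files PycharmProjects/MapReduceCode/com/code/wordcount/WordCountMapper.py,PycharmProjects/MapReduceCode/com/code/wordcount/WordCountReducer.py \
  -mapper "python3 WordCountMapper.py" \
  -reducer "python3 WordCountReducer.py" \
  -input /sample.txt \
  -output pysamp
```

(-files is the generic-options replacement for the deprecated -file; both ship the scripts to the cluster. This invocation requires a running cluster, so it cannot be verified offline.)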