Thursday, April 23, 2015

return code 2 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask

Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask

I have created a partitioned table in Hive. When I ran an INSERT into the partitioned table, I got stuck with this error:

Error

    Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask

....
[Fatal Error] Operator FS_2 (id=2): Number of dynamic partitions exceeded hive.exec.max.dynamic.partitions.pernode.

The error in the first line is a little confusing; if you scroll up the console you will find the second line, which is the real cause of the exception.

Solution

  bdalab@solai:/opt$ hive

  hive> set hive.exec.max.dynamic.partitions.pernode=500;

By default, hive.exec.max.dynamic.partitions.pernode is set to 100. If the number of dynamic partitions created on any node exceeds this limit, you will get this error. Just increase the value to suit your requirement to get rid of the error.
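Setting the value at the hive> prompt only lasts for that session. To make the change permanent, you can set it in hive-site.xml instead. A minimal sketch; the values here are examples (500 per node, and the related cluster-wide cap hive.exec.max.dynamic.partitions), so tune them to your data:

```xml
<!-- hive-site.xml: sketch, example values; tune to your partition counts -->
<property>
  <name>hive.exec.max.dynamic.partitions.pernode</name>
  <value>500</value>
</property>
<property>
  <name>hive.exec.max.dynamic.partitions</name>
  <value>2000</value>
</property>
```

Note that hive.exec.max.dynamic.partitions limits the total number of dynamic partitions across the whole job, so it should be at least as large as the per-node setting.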

Wednesday, April 22, 2015

how to : Execute HDFS commands from DataNode

how to : Work on NameNode (HDFS) from DataNode

These commands operate on the NameNode (HDFS) from a DataNode or any other system with Hadoop installed (which may or may not be part of the Hadoop cluster).

Execute an HDFS 'fs' command from a DataNode

    bdalab@solai:/opt$ hadoop fs -fs hdfs://masterNodeIP:9000/ -rm /input/log.csv


The above command is run from the DataNode; the file '/input/log.csv' is removed from HDFS on the NameNode.
Here, masterNodeIP is the IP address of the remote NameNode.

List all the files in HDFS (NameNode) from a DataNode

    bdalab@solai:/opt$ hadoop fs -fs hdfs://masterNodeIP:9000/ -ls /


Create the directory 'pjt' in HDFS (NameNode) from a DataNode

    bdalab@solai:/opt$ hadoop fs -fs hdfs://masterNodeIP:9000/ -mkdir /pjt

All of the above commands are run from the DataNode and executed against the NameNode.
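If you always want commands from a given machine to target the remote NameNode, you can set fs.defaultFS in that machine's core-site.xml instead of passing -fs on every command. A sketch; masterNodeIP:9000 is the same placeholder NameNode address used in the examples above (older Hadoop 1.x releases use the property name fs.default.name instead):

```xml
<!-- core-site.xml on the client machine: sketch; points the default
     filesystem at the remote NameNode from the examples above -->
<property>
  <name>fs.defaultFS</name>
  <value>hdfs://masterNodeIP:9000/</value>
</property>
```

With this in place, a plain `hadoop fs -ls /` on the client resolves against the remote NameNode.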