Running WordCount on a Hadoop multi-node cluster


$ start-all.sh

$ jps

$ cd Desktop

$ sudo mkdir www

$ cd www

$ jps >> example.txt
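Any plain-text file works as WordCount input; the `jps >> example.txt` line above simply dumps the list of running JVM processes into a file. A small sketch (the file name `sample.txt` is my own, not from the tutorial) of adding a second input file with a known word distribution:

```shell
# Hypothetical second input file; the word "hadoop" appears three times.
printf 'hadoop mapreduce hadoop\nhdfs yarn hadoop\n' > sample.txt
cat sample.txt
```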

$ cd /usr/local/hadoop

$ bin/hdfs dfs -mkdir /user

$ bin/hdfs dfs -mkdir /user/chaalpritam

$ bin/hdfs dfs -put /home/chaalpritam/Desktop/www input
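If this `-put` step fails with "Name node is in safe mode", the NameNode has not yet left its read-only startup phase. A hedged sketch of checking and, if necessary, forcing it out; these `dfsadmin` commands must be run as the user that started the NameNode (the HDFS superuser), since `-safemode leave` is refused for other users:

```shell
# Run from /usr/local/hadoop as the HDFS superuser.
bin/hdfs dfsadmin -safemode get    # report whether safe mode is ON or OFF
bin/hdfs dfsadmin -safemode wait   # block until the NameNode leaves safe mode on its own
bin/hdfs dfsadmin -safemode leave  # or force it out immediately
```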

$ bin/hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-2.6.0.jar wordcount input output

$ bin/hdfs dfs -cat output/*
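Each output line from the WordCount job is a word, a tab, and its count. As a rough local sanity check (my own sketch, not part of the tutorial), the same counting can be approximated with standard shell tools; note that `uniq -c` prints the count *before* the word, the reverse of WordCount's column order:

```shell
# Approximate WordCount locally on a throwaway file.
printf 'hadoop runs wordcount\nhadoop counts words\n' > check.txt
tr -s ' ' '\n' < check.txt | sort | uniq -c
```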


2 comments:

  1. I tried to run the test but had no success.

    When I try "bin/hdfs dfs -put /home/myuser/www input", I get the message: "put: Cannot create directory /user/hadoop/input. Name node is in safe mode."
    When I try "bin/hadoop dfsadmin -safemode leave", I get: "safemode: Access denied for user hadoop. Superuser privilege is required".
    I have tried as root, but with no success. I don't know what to do to make it run.
    I followed all your steps, on single node and multi node, but I could not make it work in multi-node mode. Can you help me?

    Ah, I have formatted all machines and reinstalled Hadoop, one by one.


    Can we talk via email?
