- Download Hadoop 0.18.2
$ wget http://ftp.twaren.net/Unix/Web/apache/hadoop/core/hadoop-0.18.2/hadoop-0.18.2.tar.gz
$ tar zxvf hadoop-0.18.2.tar.gz
- [Note] The JAVA_HOME environment variable must be set before hadoop namenode will run
$ echo "export JAVA_HOME=/usr/lib/jvm/java-6-sun/jre" >> ~/.bash_profile
$ cd hadoop-0.18.2
~/hadoop-0.18.2$ bin/hadoop namenode -format
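- [Note] With the default hadoop.tmp.dir of /tmp/hadoop-${user.name}, the format step writes the namenode metadata under /tmp; assuming that default, a quick sanity check is:
~/hadoop-0.18.2$ ls /tmp/hadoop-$(whoami)/dfs/name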
- Edit conf/hadoop-site.xml and add the following settings inside the <configuration> section
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>localhost:5000</value>
  </property>
</configuration>
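- [Note] The same property may also be written with an explicit URI scheme, the form used in later Hadoop documentation; both variants point the default filesystem at the namenode's RPC port:
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:5000</value>
  </property>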
- Start the namenode with bin/hadoop namenode
~/hadoop-0.18.2$ bin/hadoop namenode
Press <Ctrl-C> to stop it and return to the shell prompt
- Run bin/start-dfs.sh
~/hadoop-0.18.2$ bin/start-dfs.sh
starting namenode, logging to /home/jazz/hadoop-0.18.2/bin/../logs/hadoop-jazz-namenode-hadoop.out
The authenticity of host 'localhost (127.0.0.1)' can't be established.
RSA key fingerprint is 70:3f:8b:f2:b9:a8:de:ea:90:f4:bf:ce:cb:85:7a:eb.
Are you sure you want to continue connecting (yes/no)? yes
localhost: Warning: Permanently added 'localhost' (RSA) to the list of known hosts.
- These messages show that Hadoop uses SSH for its internal connections, so an SSH key exchange (passwordless login to localhost) needs to be set up
~$ ssh-keygen
~$ cp .ssh/id_rsa.pub .ssh/authorized_keys
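- When ssh-keygen asks for a passphrase, just press Enter (an empty passphrase is needed for passwordless login). To confirm the key exchange works and then bring DFS up again:
~$ ssh localhost
~$ exit
~/hadoop-0.18.2$ bin/stop-dfs.sh
~/hadoop-0.18.2$ bin/start-dfs.sh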
- If you run into JAVA_HOME problems, the DFS services can also be started manually: run the datanode in the foreground with bin/hadoop, or as a background daemon with hadoop-daemon.sh (the matching stop commands are shown below)
~/hadoop-0.18.2$ bin/hadoop datanode
~/hadoop-0.18.2$ bin/hadoop-daemon.sh start datanode
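- A daemon started this way logs to logs/ and can later be stopped with the matching hadoop-daemon.sh command, or all DFS daemons at once with stop-dfs.sh:
~/hadoop-0.18.2$ bin/hadoop-daemon.sh stop datanode
~/hadoop-0.18.2$ bin/stop-dfs.sh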
- The NameNode web interface is now available at http://localhost:50070/
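- To double-check that the daemons are running, list the Java processes with jps (part of the Sun JDK; assuming it is on the PATH) and try a simple HDFS command:
~/hadoop-0.18.2$ jps
~/hadoop-0.18.2$ bin/hadoop dfs -ls /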