Changes between Version 9 and Version 10 of waue/2009/1005


Timestamp: Oct 6, 2009, 7:15:52 PM
Author: waue
{{{
#!html
<div style="text-align: center; color: blue"><big
 style="font-weight: bold;"><big><big> fuse hdfs 0.20.1 complete </big></big></big></div>
}}}
[[PageOutline]]

= Environment =

 * kernel 2.6.24-24-generic (the original default was 2.6.28-15)
   * reboot and switch to the third kernel entry in grub, otherwise `modprobe fuse` will fail
 * hadoop 0.20.1 (installed in /opt/hadoop-0.20.1)
 * fuse (FUSE_HOME = /usr/local)
   * tarball version 2.7.4, built so that FUSE_HOME ends up in /usr/local (this step can probably be skipped)
   * the fuse module '''/lib/modules/2.6.24-24-generic/kernel/fs/fuse/fuse.ko''' is already present after installing the '''linux-image-2.6.24-24-generic''' .deb
 * automake 1.9.x or later
   * sudo apt-get install automake gives 1.10.2
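Before going further, it may help to confirm the environment matches the list above. This is a minimal sanity-check sketch, not from the original page; it only reports what it finds:

```shell
# Hedged sanity check (not from the original page): confirm the running
# kernel and whether a fuse module is available for it.
KERNEL=$(uname -r)
echo "running kernel: $KERNEL"
# modinfo exits non-zero when no fuse module exists for this kernel
if modinfo fuse >/dev/null 2>&1; then
  echo "fuse module found for $KERNEL"
else
  echo "no fuse module for $KERNEL -- reboot into 2.6.24-24-generic first"
fi
```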
= Requirements =

 * Hadoop with compiled libhdfs.so
 * Linux kernel > 2.6.9 with fuse, which is the default, or Fuse 2.7.x/2.8.x installed. See: [http://fuse.sourceforge.net/]
 * modprobe fuse to load it
 * fuse-dfs executable (see below)
 * fuse_dfs_wrapper.sh installed in /bin or another appropriate location (see below)
= Installation =

== 0. Preparation ==
{{{
$ sudo apt-get install linux-image-2.6.24-24-generic fuse-utils libfuse-dev libfuse2 automake
}}}
Reboot, and in grub pick linux-image-2.6.24-24-generic (the third entry).

After booting, load the fuse kernel module (sudo is optional):

{{{
$ modprobe fuse
}}}

== 1. build fuse-hdfs ==

Edit /opt/hadoop-0.20.1/build.xml around line 1046 and remove the doc-related entries, so it reads as follows:

{{{
#!text
  <target name="package" depends="compile, jar, examples, tools-jar, jar-test, ant-tasks, package-librecordio"
}}}
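The edit above can also be scripted. This is a hedged sketch, not from the original page: the entry names "docs" and "cn-docs" are assumptions about what the stock package target listed before the edit, so inspect the result before trusting it. Point BUILD_XML at /opt/hadoop-0.20.1/build.xml for the real edit; with no real file present the sketch works on a sample line:

```shell
# Hedged sketch: strip assumed doc-related entries ("docs", "cn-docs")
# from the package target's depends list with sed.
BUILD_XML=${BUILD_XML:-$(mktemp)}
# seed a sample line when no real build.xml is given
[ -s "$BUILD_XML" ] || echo '  <target name="package" depends="compile, jar, examples, tools-jar, jar-test, ant-tasks, docs, cn-docs, package-librecordio"' > "$BUILD_XML"
cp "$BUILD_XML" "$BUILD_XML.bak"               # keep a backup
sed -i 's/, docs, cn-docs//' "$BUILD_XML"
grep '<target name="package"' "$BUILD_XML"     # inspect the result
```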
Then start compiling:

{{{
...
}}}
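The actual build commands are collapsed in this revision's diff. As a hedged sketch (an assumption based on the Hadoop 0.20.x contrib build conventions, not the original page), the invocation is printed here rather than run, since it needs the full Hadoop source tree and toolchain:

```shell
# Hedged sketch: likely ant invocations for libhdfs and fuse-dfs on
# Hadoop 0.20.x (assumption -- verify against your tree's README).
cat <<'EOF'
cd /opt/hadoop-0.20.1
ant compile -Dcompile.c++=true -Dlibhdfs=true     # builds libhdfs.so
ant compile-contrib -Dlibhdfs=1 -Dfusedfs=1       # builds fuse_dfs
EOF
```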

When it finishes, a build directory appears under /opt/hadoop-0.20.1/ containing the compiled hadoop, versioned 0.20.2.

/opt/hadoop-0.20.1/build/contrib/ then contains the fuse-dfs folder; the files fuse_dfs and fuse_dfs_wrapper.sh inside it are the key programs for mounting hdfs.

Next, turn /opt/hadoop-0.20.1/build into the hadoop working directory and restart hadoop (assuming the configuration files in $hadoop/conf are already set up):

{{{
...
$ mkdir /opt/hadoop/logs
$ cd /opt/hadoop
$ rm -rf /tmp/hadoop* ; rm logs/*
$ bin/hadoop namenode -format ; bin/start-all.sh
}}}
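The commands that turn the build tree into the working directory are collapsed in this revision's diff. One hedged way to do it (the symlink approach and all paths are assumptions, not the original page's method) is to point /opt/hadoop at the build directory; the sketch defaults to temp paths so it can run harmlessly:

```shell
# Hedged sketch: expose the built tree as the hadoop working directory
# via a symlink. Override the defaults for real use.
BUILD_DIR=${BUILD_DIR:-$(mktemp -d)}   # real use: /opt/hadoop-0.20.1/build
WORK_DIR=${WORK_DIR:-$(mktemp -u)}     # real use: /opt/hadoop
ln -sfn "$BUILD_DIR" "$WORK_DIR"       # working dir now tracks the build
[ "$(readlink "$WORK_DIR")" = "$BUILD_DIR" ] && echo "linked: $WORK_DIR -> $BUILD_DIR"
```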

== 2. configure ==

Check that the settings inside /opt/hadoop/contrib/fuse-dfs/fuse_dfs_wrapper.sh match your system environment:
{{{
#!text
export HADOOP_HOME=/opt/hadoop
export OS_ARCH=i386
export JAVA_HOME=/usr/lib/jvm/java-6-sun
export LD_LIBRARY_PATH=$JAVA_HOME/jre/lib/$OS_ARCH/server:/opt/hadoop/libhdfs:/usr/local/lib
}}}
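A quick way to confirm the wrapper's environment will work is to walk LD_LIBRARY_PATH and look for libhdfs.so. This check is a sketch, not from the original page; the default path list mirrors the export above:

```shell
# Hedged sketch: verify libhdfs.so is reachable via LD_LIBRARY_PATH.
LD_LIBRARY_PATH=${LD_LIBRARY_PATH:-/opt/hadoop/libhdfs:/usr/local/lib}
found=no
# walk the colon-separated directory list
IFS=:
for dir in $LD_LIBRARY_PATH; do
  [ -e "$dir/libhdfs.so" ] && { found=yes; echo "libhdfs.so found in $dir"; }
done
unset IFS
[ "$found" = yes ] || echo "libhdfs.so not on LD_LIBRARY_PATH -- copy it in first"
```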

Locate libhdfs.so and put it somewhere on $LD_LIBRARY_PATH:

{{{
...
$ mkdir libhdfs
$ sudo cp /opt/hadoop-0.20.1/c++/Linux-i386-32/lib/libhdfs.so ./libhdfs/
}}}

{{{
$ mkdir /tmp/fusehdfs
$ ln -sf /opt/hadoop/build/contrib/fuse-dfs/fuse-dfs* /usr/local/bin/
}}}

Running fuse_dfs_wrapper.sh should then print the following message:

{{{
$ cd /opt/hadoop/build/contrib/fuse-dfs/
$ ./fuse_dfs_wrapper.sh
USAGE: /opt/hadoop/contrib/fuse-dfs/fuse_dfs [debug] [--help] [--version] [-oprotected=<colon_seped_list_of_paths] [rw] [-onotrash] [-ousetrash] [-obig_writes] [-oprivate (single user)] [ro] [-oserver=<hadoop_servername>] [-oport=<hadoop_port>] [-oentry_timeout=<secs>] [-oattribute_timeout=<secs>] [-odirect_io] [-onopoermissions] [-o<other fuse option>] <mntpoint> [fuse options]
NOTE: debugging option for fuse is -debug
}}}

P.S. If an error appears, go back and re-check the previous steps.

= Run =

 * mount hdfs-fuse

Method 1:
{{{
...
}}}
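The actual mount commands are collapsed in this revision's diff. As a hedged sketch (not from the original page), a typical wrapper invocation takes the namenode URI and the mount point; localhost:9000 is an assumption and must match fs.default.name in your hadoop conf. It is printed, not executed, since it needs a running namenode:

```shell
# Hedged sketch: the general shape of a fuse_dfs_wrapper.sh mount call.
# dfs://localhost:9000 is an assumption -- use your namenode's address.
MOUNTPOINT=/tmp/fusehdfs
echo "fuse_dfs_wrapper.sh dfs://localhost:9000 $MOUNTPOINT"
```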

 * unmount hdfs-fuse
{{{
$ fusermount -u /tmp/fusehdfs
}}}
= bug fix =

 * check java5
Edit build.xml around line 1046 and remove the doc-related entries (as in step 1 above):
{{{
#!text
...
}}}

 * libhdfs.so.0 cannot be found, e.g. fuse_dfs: error while loading shared libraries: libhdfs.so.0: cannot open shared object file: No such file or directory

Check that LD_LIBRARY_PATH is set correctly, copy libhdfs.so (/opt/hadoop-0.20.1/c++/Linux-i386-32/lib/libhdfs.so) into one of its directories, then link libhdfs.so.0 -> libhdfs.so:

{{{
$ ln -sf libhdfs.so libhdfs.so.0
}}}
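The symlink fix above can be sketched end to end. This is an illustration, not the original page's commands: it uses a temp directory and an empty stand-in file; for real use, run the ln inside the directory on LD_LIBRARY_PATH that holds the real libhdfs.so:

```shell
# Hedged sketch: reproduce the libhdfs.so.0 fix with a dummy library.
LIBDIR=${LIBDIR:-$(mktemp -d)}   # real use: e.g. /opt/hadoop/libhdfs
cd "$LIBDIR"
[ -e libhdfs.so ] || touch libhdfs.so    # stand-in for the real library
ln -sf libhdfs.so libhdfs.so.0           # the soname the loader asks for
ls -l libhdfs.so.0
```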