Changes between Version 9 and Version 10 of waue/2009/1005
Timestamp: Oct 6, 2009, 7:15:52 PM
{{{
#!html
<div style="text-align: center; color: blue"><big style="font-weight: bold;"><big><big> fuse hdfs 0.20.1 complete </big></big></big></div>
}}}
[[PageOutline]]

= Environment =

…
 * kernel 2.6.24-24-generic (the previous default was 2.6.28-15)
 * reboot and switch to the third kernel entry in grub, otherwise `modprobe fuse` will fail
 * hadoop 0.20.1 (installed in /opt/hadoop-0.20.1)
 * fuse (FUSE_HOME = /usr/local)
 * the 2.7.4 tarball was built so that FUSE_HOME ends up in /usr/local (this step can probably be skipped)
 * the fuse module file '''/lib/modules/2.6.24-24-generic/kernel/fs/fuse/fuse.ko''' is already provided when the '''linux-image-2.6.24-24-generic'''.deb is installed
 * automake 1.9.x or newer
 * sudo apt-get install automake gives you 1.10.2

= Installation =

== 0. Prepare ==

{{{
$ sudo apt-get install linux-image-2.6.24-24-generic fuse-utils libfuse-dev libfuse2 automake
}}}

Reboot and pick linux-image-2.6.24-24-generic (the third entry) in grub.

After booting, load the fuse kernel module (sudo is optional here):

{{{
$ modprobe fuse
}}}

== 1. Build fuse-dfs ==

Edit /opt/hadoop-0.20.1/build.xml around line 1046 and remove the doc-related parts, leaving:

{{{
#!text
<target name="package" depends="compile, jar, examples, tools-jar, jar-test, ant-tasks, package-librecordio"
}}}

Then start the build:

{{{
…
}}}

When it finishes, /opt/hadoop-0.20.1/ gains a build directory holding the compiled Hadoop (it reports version 0.20.2).

/opt/hadoop-0.20.1/build/contrib/ now contains a fuse-dfs directory; the files fuse_dfs and fuse_dfs_wrapper.sh inside it are the programs that mount HDFS.

Next, make /opt/hadoop-0.20.1/build the Hadoop working directory and restart Hadoop (assuming the configuration files in $hadoop/conf are already set up):

{{{
…
$ mkdir /opt/hadoop/logs
$ cd /opt/hadoop
$ rm -rf /tmp/hadoop* ; rm logs/*
$ bin/hadoop namenode -format ; bin/start-all.sh
}}}

== 2. Configure the wrapper ==

Check that the settings in /opt/hadoop/contrib/fuse-dfs/fuse_dfs_wrapper.sh match your system environment:

{{{
#!text
export HADOOP_HOME=/opt/hadoop
export OS_ARCH=i386
export JAVA_HOME=/usr/lib/jvm/java-6-sun
export LD_LIBRARY_PATH=$JAVA_HOME/jre/lib/$OS_ARCH/server:/opt/hadoop/libhdfs:/usr/local/lib
}}}

Find libhdfs.so and put it into a directory on $LD_LIBRARY_PATH:

{{{
…
$ mkdir libhdfs
$ sudo cp /opt/hadoop-0.20.1/c++/Linux-i386-32/lib/libhdfs.so ./libhdfs/
}}}

{{{
$ mkdir /tmp/fusehdfs
$ ln -sf /opt/hadoop/build/contrib/fuse-dfs/fuse_dfs* /usr/local/bin/
}}}

Running fuse_dfs_wrapper.sh should then print the following message:

{{{
$ cd /opt/hadoop/build/contrib/fuse-dfs/
$ ./fuse_dfs_wrapper.sh
USAGE: /opt/hadoop/contrib/fuse-dfs/fuse_dfs [debug] [--help] [--version] [-oprotected=<colon_seped_list_of_paths] [rw] [-onotrash] [-ousetrash] [-obig_writes] [-oprivate (single user)] [ro] [-oserver=<hadoop_servername>] [-oport=<hadoop_port>] [-oentry_timeout=<secs>] [-oattribute_timeout=<secs>] [-odirect_io] [-onopoermissions] [-o<other fuse option>] <mntpoint> [fuse options]
NOTE: debugging option for fuse is -debug
}}}

P.S. If anything goes wrong, go back and re-check the earlier steps.

= Running =

 * mount hdfs-fuse

Method 1.
{{{
…
}}}

…

 * unmount hdfs-fuse
{{{
$ fusermount -u /tmp/fusehdfs
…
}}}

…

= bug fix =

 * check java5
Edit build.xml around line 1046 and remove the doc-related parts, as above:
{{{
#!text
…
}}}

 * libhdfs.so.0 cannot be found, e.g. `fuse_dfs: error while loading shared libraries: libhdfs.so.0:`

Check that LD_LIBRARY_PATH is set correctly;

then copy libhdfs.so (/opt/hadoop-0.20.1/c++/Linux-i386-32/lib/libhdfs.so) into a directory on that path;

then symlink libhdfs.so.0 -> libhdfs.so
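The libhdfs.so.0 fix described above can be sketched as follows. This is only an illustration, not the real install: `LIBDIR` is a hypothetical stand-in for a directory on LD_LIBRARY_PATH (such as /opt/hadoop/libhdfs), and `touch` stands in for copying the real library, so substitute your actual paths.

```shell
# Hypothetical stand-in for a directory on LD_LIBRARY_PATH, e.g. /opt/hadoop/libhdfs
LIBDIR=/tmp/demo-libhdfs
mkdir -p "$LIBDIR"

# Stand-in for copying the real library:
#   cp /opt/hadoop-0.20.1/c++/Linux-i386-32/lib/libhdfs.so "$LIBDIR"/
touch "$LIBDIR/libhdfs.so"

# The dynamic loader asks for libhdfs.so.0, so point that name at the library.
ln -sf libhdfs.so "$LIBDIR/libhdfs.so.0"

# Verify the symlink resolves to the library.
readlink "$LIBDIR/libhdfs.so.0"
```

After doing this with the real paths, make sure the directory appears in the LD_LIBRARY_PATH exported by fuse_dfs_wrapper.sh before running it again.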