
fuse hdfs 0.20.1 (work in progress)

Environment

  • ubuntu 9.04
  • kernel 2.6.24-24-generic (the default was 2.6.28-15)
    • Reboot and select the third kernel entry in GRUB; otherwise modprobe fuse will fail (see the check after this list)
  • hadoop 0.20.1 (installed in /opt/hadoop)
  • fuse (FUSE_HOME = /usr/local)
    • tarball version 2.7.4, built with FUSE_HOME set to /usr/local
    • the module /lib/modules/2.6.24-24-generic/kernel/fs/fuse/fuse.ko is already present once linux-image-2.6.24-24-generic.deb is installed
  • automake 1.9.x or later
    • sudo apt-get install automake provides 1.10.2
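
Before building anything, it is worth confirming that the machine actually booted into the downgraded kernel; a quick check, assuming the versions listed above:

$ uname -r    # should print 2.6.24-24-generic after switching in GRUB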

Requirements

  • Hadoop with compiled libhdfs.so
  • Linux kernel > 2.6.9 with fuse (the default), or Fuse 2.7.x/2.8.x installed. See: http://fuse.sourceforge.net/
  • modprobe fuse to load it (see below)
  • fuse-dfs executable (see below)
  • fuse_dfs_wrapper.sh installed in /bin or other appropriate location (see below)
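
Loading the fuse module and confirming it is present works as follows; lsmod should list fuse afterwards:

$ sudo modprobe fuse
$ lsmod | grep fuse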

Steps

1. Build

Edit build.xml around line 1046 and remove the doc-related dependencies from the package target, so that it reads:

  <target name="package" depends="compile, jar, examples, tools-jar, jar-test, ant-tasks, package-librecordio"
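
For comparison, the unedited target also names the documentation targets; it looks roughly like the line below (the exact entries may differ in your copy of build.xml, but the doc-related ones such as javadoc, docs and api-report are what get removed):

  <target name="package" depends="compile, jar, javadoc, docs, api-report, examples, tools-jar, jar-test, ant-tasks, package-librecordio"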
Then, from the Hadoop home directory, build libhdfs, the distribution package, and the fuse-dfs contrib module:

$ cd /opt/hadoop
$ ant compile-c++-libhdfs -Dlibhdfs=1
$ ant package
$ ant compile-contrib -Dlibhdfs=1 -Dfusedfs=1

When the build completes, /opt/hadoop/build/contrib/ will contain a fuse-dfs directory.

2. Install the wrapper and libraries

Check that the settings in /opt/hadoop/build/contrib/fuse-dfs/fuse_dfs_wrapper.sh match your environment:

export HADOOP_HOME=/opt/hadoop
export PATH=$HADOOP_HOME/contrib/fuse_dfs:$PATH
export OS_ARCH=i386
export JAVA_HOME=/usr/lib/jvm/java-6-sun
export LD_LIBRARY_PATH=$JAVA_HOME/jre/lib/$OS_ARCH/server:/usr/local/share/hdfs/libhdfs/:/usr/local/lib
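
The value most likely to differ between machines is JAVA_HOME; a minimal check of the path assumed above:

$ test -d /usr/lib/jvm/java-6-sun && echo JAVA_HOME ok

Then install the wrapper, libraries and binaries where the settings above expect them: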
$ cp -rf ./build/contrib/fuse-dfs contrib/    # run from /opt/hadoop
$ sudo mkdir /usr/local/share/hdfs/
$ sudo cp -r /opt/hadoop/build/libhdfs /usr/local/share/hdfs/
$ sudo ln -sf /opt/hadoop/build/contrib/fuse-dfs/fuse_dfs* /usr/local/bin/
$ cd /opt/hadoop/build/contrib/fuse-dfs/
$ ./fuse_dfs_wrapper.sh 
USAGE: /opt/hadoop/contrib/fuse-dfs/fuse_dfs [debug] [--help] [--version] [-oprotected=<colon_seped_list_of_paths] [rw] [-onotrash] [-ousetrash] [-obig_writes] [-oprivate (single user)] [ro] [-oserver=<hadoop_servername>] [-oport=<hadoop_port>] [-oentry_timeout=<secs>] [-oattribute_timeout=<secs>] [-odirect_io] [-onopoermissions] [-o<other fuse option>] <mntpoint> [fuse options]
NOTE: debugging option for fuse is -debug

3. Mount HDFS

$ mkdir /tmp/fusehdfs
$ fuse_dfs_wrapper.sh dfs://secuse.nchc.org.tw:9000 /tmp/fusehdfs -d    # -d keeps it in the foreground with debug output

In another terminal window:

$ ls /tmp/fusehdfs
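
When finished, unmount with the standard FUSE tool:

$ fusermount -u /tmp/fusehdfs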

4. Mount via /etc/fstab

To mount HDFS automatically, add the following line to /etc/fstab (adjust the server name and mount point to your site):

fuse_dfs#dfs://hadoop_server.foo.com:9000 /export/hdfs fuse -oallow_other,rw,-ousetrash,-oinitchecks 0 0
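
With that entry in place, the filesystem can be mounted like any other fstab entry; a sketch using the example mount point above:

$ sudo mkdir -p /export/hdfs
$ sudo mount /export/hdfs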

Bug fixes

  • check java5

ant package fails at the check java5 step because the stock package target pulls in the documentation targets, which require Java 5 and Forrest. The fix is the build.xml edit from step 1: remove the doc-related dependencies from the package target around line 1046.
  • libhdfs.so does not exist: /opt/hadoop-0.20.1/build/libhdfs/libhdfs.so

Locate libhdfs.so and copy it to the path the error message expects:

$ mkdir /opt/hadoop-0.20.1/build/libhdfs/
$ cp /opt/hadoop-0.20.1/c++/Linux-i386-32/lib/libhdfs.so /opt/hadoop-0.20.1/build/libhdfs/
  • fuse_dfs: error while loading shared libraries: libhdfs.so.0: cannot open shared object file: No such file or directory

Create a libhdfs.so.0 -> libhdfs.so symlink in the libhdfs directory on LD_LIBRARY_PATH (/usr/local/share/hdfs/libhdfs/ in this setup):

$ cd /usr/local/share/hdfs/libhdfs/
$ ln -sf libhdfs.so libhdfs.so.0
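
Assuming LD_LIBRARY_PATH is exported as in fuse_dfs_wrapper.sh, ldd can confirm that the loader now resolves the library:

$ ldd /opt/hadoop/build/contrib/fuse-dfs/fuse_dfs | grep libhdfs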