Changes between Version 1 and Version 2 of NCTU110329/Lab5


Timestamp: Apr 19, 2011, 10:04:20 AM
Author: jazz

Legend:

  Unmodified
+ Added
- Removed

  • NCTU110329/Lab5 (v1 → v2)
{{{
- $ cd /opt/hadoop
- $ bin/hadoop fs -put conf lab3_input
- $ bin/hadoop fs -ls lab3_input
- $ bin/hadoop jar hadoop-*-examples.jar grep lab3_input lab3_out1 'dfs[a-z.]+'
+ $ hadoop fs -put /etc/hadoop/conf lab5_input
+ $ hadoop fs -ls lab5_input
+ $ hadoop jar hadoop-examples.jar grep lab5_input lab5_out1 'dfs[a-z.]+'
}}}
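The grep example job extracts every match of the regular expression 'dfs[a-z.]+' from the input files, counts each distinct match, and sorts the counts in decreasing order. The same computation can be sketched with local tools; the sample file below and its contents are made up for illustration:

```shell
# Hypothetical stand-in for the Hadoop configuration files uploaded above.
cat > /tmp/lab5_sample.xml <<'EOF'
<property><name>dfs.replication</name><value>2</value></property>
<property><name>dfs.name.dir</name><value>/var/lib/hadoop/name</value></property>
<property><name>dfs.replication</name></property>
EOF

# Extract every regex match, count distinct matches, and sort by count
# in decreasing order, mirroring what the grep example's jobs compute.
grep -oE 'dfs[a-z.]+' /tmp/lab5_sample.xml | sort | uniq -c | sort -rn
```

On this made-up input the pipeline puts `dfs.replication` (count 2) above `dfs.name.dir` (count 1), the same count-then-sort shape as the real lab5_out1.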
     

{{{
- 09/03/24 12:33:45 INFO mapred.FileInputFormat: Total input paths to process : 9
- 09/03/24 12:33:45 INFO mapred.FileInputFormat: Total input paths to process : 9
- 09/03/24 12:33:45 INFO mapred.JobClient: Running job: job_200903232025_0003
- 09/03/24 12:33:46 INFO mapred.JobClient:  map 0% reduce 0%
- 09/03/24 12:33:47 INFO mapred.JobClient:  map 10% reduce 0%
- 09/03/24 12:33:49 INFO mapred.JobClient:  map 20% reduce 0%
- 09/03/24 12:33:51 INFO mapred.JobClient:  map 30% reduce 0%
- 09/03/24 12:33:52 INFO mapred.JobClient:  map 40% reduce 0%
- 09/03/24 12:33:54 INFO mapred.JobClient:  map 50% reduce 0%
- 09/03/24 12:33:55 INFO mapred.JobClient:  map 60% reduce 0%
- 09/03/24 12:33:57 INFO mapred.JobClient:  map 70% reduce 0%
- 09/03/24 12:33:59 INFO mapred.JobClient:  map 80% reduce 0%
- 09/03/24 12:34:00 INFO mapred.JobClient:  map 90% reduce 0%
- 09/03/24 12:34:02 INFO mapred.JobClient:  map 100% reduce 0%
- 09/03/24 12:34:10 INFO mapred.JobClient:  map 100% reduce 10%
- 09/03/24 12:34:12 INFO mapred.JobClient:  map 100% reduce 13%
- 09/03/24 12:34:15 INFO mapred.JobClient:  map 100% reduce 20%
- 09/03/24 12:34:20 INFO mapred.JobClient:  map 100% reduce 23%
- 09/03/24 12:34:22 INFO mapred.JobClient: Job complete: job_200903232025_0003
- 09/03/24 12:34:22 INFO mapred.JobClient: Counters: 16
- 09/03/24 12:34:22 INFO mapred.JobClient:   File Systems
- 09/03/24 12:34:22 INFO mapred.JobClient:     HDFS bytes read=48245
- 09/03/24 12:34:22 INFO mapred.JobClient:     HDFS bytes written=1907
- 09/03/24 12:34:22 INFO mapred.JobClient:     Local bytes read=1549
- 09/03/24 12:34:22 INFO mapred.JobClient:     Local bytes written=3584
- 09/03/24 12:34:22 INFO mapred.JobClient:   Job Counters
- ......
+ 11/04/19 10:00:20 INFO mapred.FileInputFormat: Total input paths to process : 25
+ 11/04/19 10:00:20 INFO mapred.JobClient: Running job: job_201104120101_0645
+ 11/04/19 10:00:21 INFO mapred.JobClient:  map 0% reduce 0%
+ ( ... skip ... )
}}}

{{{
- $ bin/hadoop fs -ls lab3_out1
- $ bin/hadoop fs -cat lab3_out1/part-00000
+ $ hadoop fs -ls lab5_out1
+ Found 2 items
+ drwx------   - hXXXX supergroup          0 2011-04-19 10:00 /user/hXXXX/lab5_out1/_logs
+ -rw-r--r--   2 hXXXX supergroup       1146 2011-04-19 10:00 /user/hXXXX/lab5_out1/part-00000
+ $ hadoop fs -cat lab5_out1/part-00000
}}}
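In the `hadoop fs -ls` listing above, the second column of a file entry is its HDFS replication factor (directories show `-`) and the fifth column is the size in bytes. As a quick sketch, the fields can be pulled out of a saved listing with awk (the sample line is copied from the output above; `hXXXX` stands for the student account):

```shell
# One file line from the "hadoop fs -ls lab5_out1" output shown above.
line='-rw-r--r--   2 hXXXX supergroup       1146 2011-04-19 10:00 /user/hXXXX/lab5_out1/part-00000'

# awk splits on runs of whitespace: $2 = replication factor, $5 = size, $8 = path.
echo "$line" | awk '{print "replication:", $2, "size:", $5, "path:", $8}'
```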

{{{
- 3       dfs.class
+ 4       dfs.permissions
+ 4       dfs.replication
+ 4       dfs.name.dir
+ 3       dfs.namenode.decommission.interval.
+ 3       dfs.namenode.decommission.nodes.per.interval
3       dfs.
- 2       dfs.period
- 1       dfs.http.address
- 1       dfs.balance.bandwidth
- 1       dfs.block.size
- 1       dfs.blockreport.initial
- 1       dfs.blockreport.interval
- 1       dfs.client.block.write.retries
- 1       dfs.client.buffer.dir
- 1       dfs.data.dir
- 1       dfs.datanode.address
- 1       dfs.datanode.dns.interface
- 1       dfs.datanode.dns.nameserver
- 1       dfs.datanode.du.pct
- 1       dfs.datanode.du.reserved
- 1       dfs.datanode.handler.count
- 1       dfs.datanode.http.address
- 1       dfs.datanode.https.address
- 1       dfs.datanode.ipc.address
- 1       dfs.default.chunk.view.size
- 1       dfs.df.interval
- 1       dfs.file
- 1       dfs.heartbeat.interval
- 1       dfs.hosts
- 1       dfs.hosts.exclude
- 1       dfs.https.address
- 1       dfs.impl
- 1       dfs.max.objects
- 1       dfs.name.dir
- 1       dfs.namenode.decommission.interval
- 1       dfs.namenode.decommission.interval.
- 1       dfs.namenode.decommission.nodes.per.interval
- 1       dfs.namenode.handler.count
- 1       dfs.namenode.logging.level
- 1       dfs.permissions
- 1       dfs.permissions.supergroup
- 1       dfs.replication
- 1       dfs.replication.consider
- 1       dfs.replication.interval
- 1       dfs.replication.max
- 1       dfs.replication.min
- 1       dfs.replication.min.
- 1       dfs.safemode.extension
- 1       dfs.safemode.threshold.pct
- 1       dfs.secondary.http.address
- 1       dfs.servers
- 1       dfs.web.ugi
- 1       dfsmetrics.log
+ ( ... skip ... )
}}}

= Sample 2 : WordCount =
     
 * 如名稱,WordCount會對所有的字作字數統計,並且從a-z作排列[[BR]]As its name suggests, the WordCount example counts every word that appears in the input documents and sorts the words from a to z.

{{{
- /opt/hadoop$ bin/hadoop jar hadoop-*-examples.jar wordcount lab3_input lab3_out2
+ ~$ hadoop jar hadoop-examples.jar wordcount lab5_input lab5_out2
}}}
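Locally, the same count-and-sort that the wordcount job performs can be sketched with standard tools; the sample text below is made up for illustration, and the output format (word, a tab, then the count) matches wordcount's part-r-00000:

```shell
# Made-up input standing in for lab5_input.
printf 'apple banana apple\ncherry banana apple\n' > /tmp/lab5_words.txt

# Split into one word per line, count each distinct word, and emit
# "word<TAB>count" sorted from a to z, like wordcount's reducer output.
tr -s ' \t' '\n' < /tmp/lab5_words.txt | sort | uniq -c | awk '{printf "%s\t%s\n", $2, $1}'
```

On this sample the first output line is `apple` with a count of 3, followed by `banana` and `cherry`.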

檢查輸出結果的方法同之前方法[[BR]]Check the computed result of '''wordcount''' from HDFS the same way as before:

{{{
- $ bin/hadoop fs -ls lab3_out2
- $ bin/hadoop fs -cat lab3_out2/part-r-00000
+ $ hadoop fs -ls lab5_out2
+ $ hadoop fs -cat lab5_out2/part-r-00000
}}}

- = Browsing MapReduce and HDFS via Web GUI =
-  * [http://localhost:50030 JobTracker Web Interface]
-  * [http://localhost:50070 NameNode Web Interface]

= More Examples =