Date: 2014-05-16

Setting up Hadoop on Linux

Linux environment

1. Configure /etc/hosts (may not be strictly required)

Reference: http://hi.baidu.com/2enjoy/blog/item/28e4e721a24d62419922ed75.html

Note: the machines use dynamic IP addresses, so the mappings have to be kept up to date.

cat ./a >> ./b
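
The post does not show what ./a and ./b contain; presumably name-to-IP mappings are being appended to /etc/hosts. A minimal sketch of such entries, using the hostnames and addresses that appear in the transcripts below:

192.168.123.24   devstation24     # namenode
192.168.123.61   teststation61    # datanode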

3. Set up passwordless SSH login

Passwordless SSH from the namenode to itself (localhost)

[djboss@DevStation24 hdtest]$ pwd
/home/djboss/hdtest

[djboss@DevStation24 hdtest]$ ssh-keygen -t dsa -P '' -f ~/.ssh/id_dsa
Generating public/private dsa key pair.
Your identification has been saved in /home/djboss/.ssh/id_dsa.
Your public key has been saved in /home/djboss/.ssh/id_dsa.pub.
The key fingerprint is:
9e:1d:39:87:dc:7f:e4:31:8d:df:82:ff:7a:fb:83:ab djboss@DevStation24

[djboss@DevStation24 .ssh]$ pwd
/home/djboss/.ssh
[djboss@DevStation24 .ssh]$ ls -a
.  ..  id_dsa  id_dsa.pub  known_hosts

[djboss@DevStation24 hdtest]$ cat ~/.ssh/id_dsa.pub >> ~/.ssh/authorized_keys
[djboss@DevStation24 .ssh]$ ls -a
.  ..  authorized_keys  id_dsa  id_dsa.pub  known_hosts
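
If ssh still prompts for a password after this, overly loose permissions are the usual cause: sshd ignores keys when ~/.ssh or authorized_keys is group- or world-writable. Tightening them is safe in any case:

$chmod 700 ~/.ssh
$chmod 600 ~/.ssh/authorized_keys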



[djboss@DevStation24 .ssh]$ ssh localhost
The authenticity of host 'localhost (192.168.123.24)' can't be established.
RSA key fingerprint is f5:ba:aa:82:fd:e2:cb:34:03:9b:4d:69:bf:66:3e:a9.
Are you sure you want to continue connecting (yes/no)? yes
Warning: Permanently added 'localhost' (RSA) to the list of known hosts.
Last login: Thu May 17 13:43:49 2012 from 172.16.10.24

[djboss@DevStation24 ~]$ ssh localhost
Last login: Thu May 17 15:29:15 2012 from devstation24


Passwordless SSH from the namenode to the datanode
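
One way to get the namenode's public key onto the datanode (a sketch, assuming the same djboss account exists on 192.168.123.61; the remote file name namenode_id_dsa.pub is arbitrary):

$scp ~/.ssh/id_dsa.pub djboss@192.168.123.61:~/namenode_id_dsa.pub
$ssh djboss@192.168.123.61 "mkdir -p ~/.ssh && chmod 700 ~/.ssh && cat ~/namenode_id_dsa.pub >> ~/.ssh/authorized_keys && chmod 600 ~/.ssh/authorized_keys"

After that, logging in from the namenode works without a password prompt: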

[djboss@DevStation24 ~]$ ssh 192.168.123.61
Last login: Thu May 17 15:43:12 2012 from teststation61
[djboss@TestStation61 ~]$ 

The datanode cannot yet log in to the namenode without a password (the start scripts only need SSH from the namenode to the slaves, so this is not required for now).

Set environment variables

Edit ~/.bash_profile for the djboss user, then run source ~/.bash_profile so the changes take effect:

$vi ~/.bash_profile

export HADOOP_HOME=/home/djboss/hd_test/hadoop-1.0.2
export PATH=$PATH:$ANT_HOME/bin:$HADOOP_HOME/bin

$source ~/.bash_profile
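
To confirm the PATH change took effect, the hadoop command should now resolve from any directory:

$hadoop version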

Configuration file 1: core-site.xml

<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>

<!-- Put site-specific property overrides in this file. -->

<configuration>
<property>
    <name>hadoop.tmp.dir</name>
    <value>/home/djboss/hdtest/tmp/</value>
    <description>Base directory for Hadoop's temporary and HDFS data files.</description>
</property>
<property>
    <name>fs.default.name</name>
    <value>hdfs://192.168.123.24:54310/</value>
    <description>URI of the namenode; clients and datanodes use this to reach HDFS.</description>
</property>
<property>
  <name>dfs.block.size</name>
  <value>5120000</value>
  <description>The default block size for new files.</description>
</property>
</configuration>
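
For reference, the dfs.block.size value of 5120000 bytes is roughly 5 MB, far below the Hadoop 1.x default of 67108864 bytes (64 MB); such a small block size only makes sense for producing multi-block files from small test data.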

Configuration file 2: hdfs-site.xml

<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>

<!-- Put site-specific property overrides in this file. -->

<configuration>
<property>
   <name>dfs.replication</name>
   <value>1</value>
</property>
</configuration>
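
With only one datanode, dfs.replication has to be lowered to 1; at the default of 3, HDFS could never satisfy the replication factor and every file would remain under-replicated.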

?

Configuration file 3: mapred-site.xml

<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>

<!-- Put site-specific property overrides in this file. -->

<configuration>
<property>
   <name>mapred.job.tracker</name>
   <value>hdfs://192.168.123.24:54311/</value>
</property>
</configuration>
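
Hadoop 1.x also reads two files under conf/ that the steps above do not cover: conf/hadoop-env.sh (JAVA_HOME is normally set there unless it is already exported in the shell) and conf/slaves (one datanode host per line). A sketch, with the JDK path being an assumption:

# conf/hadoop-env.sh  (example path; point it at the locally installed JDK)
export JAVA_HOME=/usr/java/jdk1.6.0_30

# conf/slaves  (datanode list; address taken from the transcripts above)
192.168.123.61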


Format the HDFS namenode (run once, before starting the cluster):

$hadoop namenode -format

Reference: http://dikar.iteye.com/blog/941877

Warning: $HADOOP_HOME is deprecated.

12/05/18 13:09:58 INFO namenode.NameNode: STARTUP_MSG: 
/************************************************************
STARTUP_MSG: Starting N