Date: 2014-05-16  Views: 20982

Hadoop error: org.apache.hadoop.hdfs.server.namenode.LeaseExpiredException

The error:

org.apache.hadoop.hdfs.DFSClient: Failed to close file
org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hdfs.server.namenode.LeaseExpiredException)

The NameNode has expired the client's write lease, so the file can no longer be closed. In this case the trigger was resource exhaustion: the open-file limits on the nodes were too low, so raising them (and the DataNode transfer-thread limit) resolves it.

Solution:

Raise the Linux open-file limits:

echo "fs.file-max = 65535" >> /etc/sysctl.conf          # system-wide file handle limit
echo "* - nofile 65535" >> /etc/security/limits.conf    # per-user open-file limit (all users)
sysctl -p                                               # apply the sysctl change now

ulimit -n                                               # check the current per-process limit
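Note that `sysctl -p` takes effect immediately, but the `limits.conf` entry only applies to new login sessions. A quick sanity check (standard Linux commands; the exact values depend on the host):

```shell
# System-wide file handle ceiling; should print 65535 after sysctl -p.
cat /proc/sys/fs/file-max

# Per-process open-file limit for the current shell. Log in again to pick
# up the new limits.conf value before restarting the Hadoop daemons.
ulimit -n
```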

Modify the Hadoop configuration:

vi hdfs-site.xml

<property>
  <name>dfs.datanode.max.xcievers</name>
  <value>8192</value>
</property>
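The property name above (`xcievers`, sic) is the legacy, misspelled key. On Hadoop 2.x and later the same setting is named `dfs.datanode.max.transfer.threads`, with the old name kept as a deprecated alias; a sketch of the equivalent entry, assuming Hadoop 2.x:

```xml
<!-- Hadoop 2.x+ name for the same DataNode setting; the misspelled
     dfs.datanode.max.xcievers remains as a deprecated alias. -->
<property>
  <name>dfs.datanode.max.transfer.threads</name>
  <value>8192</value>
</property>
```

Either way, the DataNodes need a restart to pick up the new value.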
