HDFS: insufficient permissions
tpj11 2014-07-16 05:44:29  While installing Hadoop 2.4.0 I need to run `hdfs namenode -format`, but it fails with an error. How do I get the necessary permissions?
The environment is Ubuntu 14.04.
ww@HDName:~$ sudo gedit /etc/hostname
(gedit:4742): IBUS-WARNING **: The owner of /home/ww/.config/ibus/bus is not root!
ww@HDName:~$ hdfs namenode -format
14/07/16 17:35:26 INFO namenode.NameNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting NameNode
STARTUP_MSG: host = HDName/192.168.56.168
STARTUP_MSG: args = [-format]
STARTUP_MSG: version = 2.4.0
STARTUP_MSG: classpath = (略)
STARTUP_MSG: build = http://svn.apache.org/repos/asf/hadoop/common -r 1583262; compiled by 'jenkins' on 2014-03-31T08:29Z
STARTUP_MSG: java = 1.7.0_55
************************************************************/
14/07/16 17:35:26 INFO namenode.NameNode: registered UNIX signal handlers for [TERM, HUP, INT]
14/07/16 17:35:26 INFO namenode.NameNode: createNameNode [-format]
14/07/16 17:35:27 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Formatting using clusterid: CID-a39bc3cc-a877-4267-8683-f51c351c362e
14/07/16 17:35:28 INFO namenode.FSNamesystem: fsLock is fair:true
14/07/16 17:35:28 INFO namenode.HostFileManager: read includes:
HostSet(
)
14/07/16 17:35:28 INFO namenode.HostFileManager: read excludes:
HostSet(
)
14/07/16 17:35:28 INFO blockmanagement.DatanodeManager: dfs.block.invalidate.limit=1000
14/07/16 17:35:28 INFO blockmanagement.DatanodeManager: dfs.namenode.datanode.registration.ip-hostname-check=true
14/07/16 17:35:28 INFO util.GSet: Computing capacity for map BlocksMap
14/07/16 17:35:28 INFO util.GSet: VM type = 64-bit
14/07/16 17:35:28 INFO util.GSet: 2.0% max memory 966.7 MB = 19.3 MB
14/07/16 17:35:28 INFO util.GSet: capacity = 2^21 = 2097152 entries
14/07/16 17:35:28 INFO blockmanagement.BlockManager: dfs.block.access.token.enable=false
14/07/16 17:35:28 INFO blockmanagement.BlockManager: defaultReplication = 1
14/07/16 17:35:28 INFO blockmanagement.BlockManager: maxReplication = 512
14/07/16 17:35:28 INFO blockmanagement.BlockManager: minReplication = 1
14/07/16 17:35:28 INFO blockmanagement.BlockManager: maxReplicationStreams = 2
14/07/16 17:35:29 INFO blockmanagement.BlockManager: shouldCheckForEnoughRacks = false
14/07/16 17:35:29 INFO blockmanagement.BlockManager: replicationRecheckInterval = 3000
14/07/16 17:35:29 INFO blockmanagement.BlockManager: encryptDataTransfer = false
14/07/16 17:35:29 INFO blockmanagement.BlockManager: maxNumBlocksToLog = 1000
14/07/16 17:35:29 INFO namenode.FSNamesystem: fsOwner = ww (auth:SIMPLE)
14/07/16 17:35:29 INFO namenode.FSNamesystem: supergroup = supergroup
14/07/16 17:35:29 INFO namenode.FSNamesystem: isPermissionEnabled = true
14/07/16 17:35:29 INFO namenode.FSNamesystem: HA Enabled: false
14/07/16 17:35:29 INFO namenode.FSNamesystem: Append Enabled: true
14/07/16 17:35:29 INFO util.GSet: Computing capacity for map INodeMap
14/07/16 17:35:29 INFO util.GSet: VM type = 64-bit
14/07/16 17:35:29 INFO util.GSet: 1.0% max memory 966.7 MB = 9.7 MB
14/07/16 17:35:29 INFO util.GSet: capacity = 2^20 = 1048576 entries
14/07/16 17:35:29 INFO namenode.NameNode: Caching file names occuring more than 10 times
14/07/16 17:35:29 INFO util.GSet: Computing capacity for map cachedBlocks
14/07/16 17:35:29 INFO util.GSet: VM type = 64-bit
14/07/16 17:35:29 INFO util.GSet: 0.25% max memory 966.7 MB = 2.4 MB
14/07/16 17:35:29 INFO util.GSet: capacity = 2^18 = 262144 entries
14/07/16 17:35:29 INFO namenode.FSNamesystem: dfs.namenode.safemode.threshold-pct = 0.9990000128746033
14/07/16 17:35:29 INFO namenode.FSNamesystem: dfs.namenode.safemode.min.datanodes = 0
14/07/16 17:35:29 INFO namenode.FSNamesystem: dfs.namenode.safemode.extension = 30000
14/07/16 17:35:29 INFO namenode.FSNamesystem: Retry cache on namenode is enabled
14/07/16 17:35:29 INFO namenode.FSNamesystem: Retry cache will use 0.03 of total heap and retry cache entry expiry time is 600000 millis
14/07/16 17:35:29 INFO util.GSet: Computing capacity for map NameNodeRetryCache
14/07/16 17:35:29 INFO util.GSet: VM type = 64-bit
14/07/16 17:35:29 INFO util.GSet: 0.029999999329447746% max memory 966.7 MB = 297.0 KB
14/07/16 17:35:29 INFO util.GSet: capacity = 2^15 = 32768 entries
14/07/16 17:35:29 INFO namenode.AclConfigFlag: ACLs enabled? false
14/07/16 17:35:29 INFO namenode.FSImage: Allocated new BlockPoolId: BP-210837843-192.168.56.168-1405503329548
14/07/16 17:35:29 WARN namenode.NameNode: Encountered exception during format:
java.io.IOException: Cannot create directory /usr/local/hadoop_store/hdfs/namenode/current
at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.clearDirectory(Storage.java:334)
at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:546)
at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:567)
at org.apache.hadoop.hdfs.server.namenode.FSImage.format(FSImage.java:148)
at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:845)
at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1256)
at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1370)
14/07/16 17:35:29 FATAL namenode.NameNode: Exception in namenode join
java.io.IOException: Cannot create directory /usr/local/hadoop_store/hdfs/namenode/current
at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.clearDirectory(Storage.java:334)
at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:546)
at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:567)
at org.apache.hadoop.hdfs.server.namenode.FSImage.format(FSImage.java:148)
at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:845)
at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1256)
at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1370)
14/07/16 17:35:29 INFO util.ExitUtil: Exiting with status 1
14/07/16 17:35:29 INFO namenode.NameNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down NameNode at HDName/192.168.56.168
************************************************************/
ww@HDName:~$ start-dfs.sh
14/07/16 17:36:50 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Starting namenodes on [HDName]
HDName: mkdir: cannot create directory '/usr/local/hadoop/logs': Permission denied
HDName: chown: cannot access '/usr/local/hadoop/logs': No such file or directory
HDName: starting namenode, logging to /usr/local/hadoop/logs/hadoop-ww-namenode-HDName.out
HDName: /usr/local/hadoop/sbin/hadoop-daemon.sh: line 151: /usr/local/hadoop/logs/hadoop-ww-namenode-HDName.out: No such file or directory
HDName: head: cannot open '/usr/local/hadoop/logs/hadoop-ww-namenode-HDName.out' for reading: No such file or directory
HDName: /usr/local/hadoop/sbin/hadoop-daemon.sh: line 166: /usr/local/hadoop/logs/hadoop-ww-namenode-HDName.out: No such file or directory
HDName: /usr/local/hadoop/sbin/hadoop-daemon.sh: line 167: /usr/local/hadoop/logs/hadoop-ww-namenode-HDName.out: No such file or directory
localhost: mkdir: cannot create directory '/usr/local/hadoop/logs': Permission denied
localhost: chown: cannot access '/usr/local/hadoop/logs': No such file or directory
localhost: starting datanode, logging to /usr/local/hadoop/logs/hadoop-ww-datanode-HDName.out
localhost: /usr/local/hadoop/sbin/hadoop-daemon.sh: line 151: /usr/local/hadoop/logs/hadoop-ww-datanode-HDName.out: No such file or directory
localhost: head: cannot open '/usr/local/hadoop/logs/hadoop-ww-datanode-HDName.out' for reading: No such file or directory
localhost: /usr/local/hadoop/sbin/hadoop-daemon.sh: line 166: /usr/local/hadoop/logs/hadoop-ww-datanode-HDName.out: No such file or directory
localhost: /usr/local/hadoop/sbin/hadoop-daemon.sh: line 167: /usr/local/hadoop/logs/hadoop-ww-datanode-HDName.out: No such file or directory
192.168.5.201: ssh: connect to host 192.168.5.201 port 22: No route to host
Starting secondary namenodes [0.0.0.0]
0.0.0.0: mkdir: cannot create directory '/usr/local/hadoop/logs': Permission denied
0.0.0.0: chown: cannot access '/usr/local/hadoop/logs': No such file or directory
0.0.0.0: starting secondarynamenode, logging to /usr/local/hadoop/logs/hadoop-ww-secondarynamenode-HDName.out
0.0.0.0: /usr/local/hadoop/sbin/hadoop-daemon.sh: line 151: /usr/local/hadoop/logs/hadoop-ww-secondarynamenode-HDName.out: No such file or directory
0.0.0.0: head: cannot open '/usr/local/hadoop/logs/hadoop-ww-secondarynamenode-HDName.out' for reading: No such file or directory
0.0.0.0: /usr/local/hadoop/sbin/hadoop-daemon.sh: line 166: /usr/local/hadoop/logs/hadoop-ww-secondarynamenode-HDName.out: No such file or directory
0.0.0.0: /usr/local/hadoop/sbin/hadoop-daemon.sh: line 167: /usr/local/hadoop/logs/hadoop-ww-secondarynamenode-HDName.out: No such file or directory
14/07/16 17:37:15 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
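Both failures in the log, `Cannot create directory /usr/local/hadoop_store/hdfs/namenode/current` during format and the `mkdir`/`chown` errors on `/usr/local/hadoop/logs` during `start-dfs.sh`, point at directories the user `ww` cannot write to. The usual remedy is not to run Hadoop with sudo, but to create those directories once as root and hand ownership to the Hadoop user. The sketch below is an assumption based on the paths visible in the log, not a confirmed answer from this thread; in particular the `datanode` directory is guessed from the `namenode` one. `ROOT` lets you rehearse under a scratch prefix; on the real machine run the same `mkdir`/`chown` with `sudo`, `ROOT` unset, and `HADOOP_USER=ww`.

```shell
# Rehearse under a scratch prefix; on the real machine: sudo, ROOT unset,
# HADOOP_USER=ww (the user that runs the Hadoop daemons).
ROOT="${ROOT:-$(mktemp -d)}"
HADOOP_USER="${HADOOP_USER:-$(id -un)}"

# NameNode/DataNode storage directories (the datanode path is a guess
# mirroring the namenode path from the exception message)
mkdir -p "$ROOT/usr/local/hadoop_store/hdfs/namenode" \
         "$ROOT/usr/local/hadoop_store/hdfs/datanode"

# Log directory that start-dfs.sh failed to create
mkdir -p "$ROOT/usr/local/hadoop/logs"

# Give ownership to the Hadoop user instead of running Hadoop as root
chown -R "$HADOOP_USER" "$ROOT/usr/local/hadoop_store" "$ROOT/usr/local/hadoop/logs"
echo "prepared directories under: $ROOT"
```

After fixing ownership, re-run `hdfs namenode -format` as `ww`, not via sudo: formatting as root would leave root-owned files in the storage directory and recreate the same permission problem. The `No route to host` error for 192.168.5.201 is a separate issue (a stale entry in the slaves file or an unreachable host).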