Hadoop streaming failed with error code 5

Asked by 7gcisfzg on 2021-06-04 in Hadoop

The RHadoop word-count program:

Sys.setenv(HADOOP_CMD="/usr/local/hadoop/bin/hadoop")
Sys.setenv(HADOOP_STREAMING="/usr/local/hadoop/share/hadoop/tools/lib/hadoop-streaming-2.4.1.jar")
Sys.setenv(HADOOP_HOME="/usr/local/hadoop")
library(rmr2) 

## map function

map <- function(k,lines) {
  words.list <- strsplit(lines, '\\s') 
  words <- unlist(words.list)
  return( keyval(words, 1) )
}

## reduce function

reduce <- function(word, counts) { 
  keyval(word, sum(counts))
}

wordcount <- function (input, output=NULL) { 
  mapreduce(input=input, output=output, input.format="text", 
            map=map, reduce=reduce)
}

## Submit job

hdfs.root <- 'input'

# hdfs.data <- file.path(hdfs.root, 'data')

hdfs.out <- file.path(hdfs.root, 'out') 
out <- wordcount(hdfs.root, hdfs.out)

## Fetch results from HDFS

results <- from.dfs(out)

## check top 2 frequent words

results.df <- as.data.frame(results, stringsAsFactors=F) 
colnames(results.df) <- c('word', 'count') 
head(results.df[order(results.df$count, decreasing=T), ], 2)

To check the RHadoop integration, I ran the word-count program above from an R script, but I get the error shown below.

15/01/21 13:48:52 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
packageJobJar: [/usr/local/hadoop/data/hadoop-unjar5866699842450503195/] [] /tmp/streamjob7335081573862861018.jar tmpDir=null
15/01/21 13:48:53 INFO client.RMProxy: Connecting to ResourceManager at localhost/127.0.0.1:8050
15/01/21 13:48:53 INFO client.RMProxy: Connecting to ResourceManager at localhost/127.0.0.1:8050
15/01/21 13:48:53 ERROR streaming.StreamJob: Error Launching job : Permission denied: user=pgl-26, access=EXECUTE, inode="/tmp":hduser:supergroup:drwxrwx---
	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkFsPermission(FSPermissionChecker.java:265)
	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:251)
	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkTraverse(FSPermissionChecker.java:205)
	at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:168)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:5523)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:3521)
	at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(NameNodeRpcServer.java:779)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getFileInfo(ClientNamenodeProtocolServerSideTranslatorPB.java:764)
	at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585)
	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:928)
	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2013)
	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2009)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1556)
	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2007)

Streaming Command Failed!
Error in mr(map = map, reduce = reduce, combine = combine, vectorized.reduce,  : 
  hadoop streaming failed with error code 5

Please help me with this error. I am new to both R and Hadoop, and I cannot figure out where I went wrong.

wztqucjr #1

I think this is a permission problem. The log says "ERROR streaming.StreamJob: Error Launching job : Permission denied". Please check the HDFS permissions.
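
Since everything in this thread is driven from R, one way to act on that is to submit the job as the owner of /tmp (hduser in the log above). With simple (non-Kerberos) authentication the Hadoop client honours the HADOOP_USER_NAME environment variable. A minimal sketch, assuming that authentication mode and reusing the objects from the question:

# impersonate the HDFS user that owns /tmp (hduser, per the error log);
# only works when the cluster does not use Kerberos
Sys.setenv(HADOOP_USER_NAME = "hduser")
# re-submit the streaming job from the question
out <- wordcount(hdfs.root, hdfs.out)

On a secured cluster this workaround is not available and the permissions themselves have to be fixed, as the next answer suggests.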

slhcrj9b #2

Grant permissions on the temporary directory, e.g. hadoop fs -chown -R rhadoop /tmp,
where rhadoop is the user name.
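
Note that the chown (or a chmod) has to be issued by the HDFS superuser, which the log shows is hduser. A rough sketch of applying it from R via system(), assuming that account can be reached through sudo and using the paths and user names from the question and log:

hadoop <- Sys.getenv("HADOOP_CMD")  # /usr/local/hadoop/bin/hadoop, set earlier
# hand /tmp to the submitting user (pgl-26 in the log), as suggested above
system(paste("sudo -u hduser", hadoop, "fs -chown -R pgl-26 /tmp"))
# or open /tmp to all users with the sticky bit set, the usual layout for /tmp
system(paste("sudo -u hduser", hadoop, "fs -chmod 1777 /tmp"))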
