Error when running the R wordcount example code on Hadoop

z5btuh9x asked on 2021-05-30 in Hadoop

The R wordcount example code:

library(rmr2)

# map: split each input line on whitespace and emit a (word, 1) pair per word
map <- function(k, lines) {
    words.list <- strsplit(lines, '\\s')
    words <- unlist(words.list)
    return(keyval(words, 1))
}

# reduce: sum up the counts emitted for each word
reduce <- function(word, counts) {
    keyval(word, sum(counts))
}

# wire map and reduce into a streaming job over plain-text input
wordcount <- function(input, output = NULL) {
    mapreduce(input = input, output = output, input.format = "text",
              map = map, reduce = reduce)
}

# clear any previous output directory, then build the HDFS paths
system("/opt/hadoop/hadoop-2.5.1/bin/hadoop fs -rm -r /wordcount/out")
hdfs.root <- 'wordcount'
hdfs.data <- file.path(hdfs.root, 'data')
hdfs.out <- file.path(hdfs.root, 'out')
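
The job itself is then started with a call along the lines of:

out <- wordcount(hdfs.data, hdfs.out)   # reconstruction; the exact invocation is not shown in the snippet above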

When I execute the last statement of the R code, it gives the following error message.

Error: java.lang.RuntimeException: PipeMapRed.waitOutputThreads(): subprocess failed with code 1
at org.apache.hadoop.streaming.PipeMapRed.waitOutputThreads(PipeMapRed.java:320)
at org.apache.hadoop.streaming.PipeMapRed.mapRedFinished(PipeMapRed.java:533)
at org.apache.hadoop.streaming.PipeMapper.close(PipeMapper.java:130)
at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:61)
at org.apache.hadoop.streaming.PipeMapRunner.run(PipeMapRunner.java:34)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:430)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:342)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)

After the error occurs, the console shows:

INFO mapreduce.Job:  map 100% reduce 100%

as well as:

ERROR streaming.StreamJob: Job not Successful! Streaming Command Failed!

The output folder is created in HDFS, but no results are produced in it. Any idea what is causing the problem?

Update 1:

I found an error log that Hadoop provides for this particular job at localhost:8042:

Dec 11, 2014 3:26:38 PM com.google.inject.servlet.InternalServletModule$BackwardsCompatibleServletContextProvider get
WARNING: You are attempting to use a deprecated API (specifically, attempting to @Inject ServletContext inside an eagerly created singleton. While we allow this for backwards compatibility, be warned that this MAY have unexpected behavior if you have more than one injector (with ServletModule) running in the same JVM. Please consult the Guice documentation at http://code.google.com/p/google-guice/wiki/Servlets for more information.
Dec 11, 2014 3:26:40 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.mapreduce.v2.app.webapp.JAXBContextResolver as a provider class
Dec 11, 2014 3:26:40 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.yarn.webapp.GenericExceptionHandler as a provider class
Dec 11, 2014 3:26:40 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.mapreduce.v2.app.webapp.AMWebServices as a root resource class
Dec 11, 2014 3:26:40 PM com.sun.jersey.server.impl.application.WebApplicationImpl _initiate
INFO: Initiating Jersey application, version 'Jersey: 1.9 09/02/2011 11:17 AM'
Dec 11, 2014 3:26:40 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.mapreduce.v2.app.webapp.JAXBContextResolver to GuiceManagedComponentProvider with the scope "Singleton"
Dec 11, 2014 3:26:43 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.yarn.webapp.GenericExceptionHandler to GuiceManagedComponentProvider with the scope "Singleton"
Dec 11, 2014 3:26:45 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.mapreduce.v2.app.webapp.AMWebServices to GuiceManagedComponentProvider with the scope "PerRequest"
log4j:WARN No appenders could be found for logger (org.apache.hadoop.ipc.Server).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.

Does anyone know what the problem is?

Update 2:

I found additional log information at $HADOOP_HOME/logs/userlogs/[application_id]/[container_id]/stderr:

...
Error : .onLoad failed in loadNamespace() for 'rhdfs', details:
call: fun(libname, pkgname)
  error: Environment variable HADOOP_CMD must be set before loading package rhdfs
Warning in FUN(c("base", "methods", "datasets", "utils", "grDevices", "graphics",  :
can't load rhdfs
Loading required package: rmr2
Error in loadNamespace(j <- i[[1L]], c(lib.loc, .libPaths()), versionCheck = vI[[j]]) : 
there is no package called ‘stringr’
...
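
For completeness: the first error above means rhdfs needs the HADOOP_CMD environment variable exported before the package is loaded (and rmr2 similarly expects HADOOP_STREAMING). A minimal setup sketch, with paths assumed from the Hadoop 2.5.1 installation used above:

Sys.setenv(HADOOP_CMD = "/opt/hadoop/hadoop-2.5.1/bin/hadoop")
# stock location of the streaming jar in a 2.5.1 tarball install (an assumption):
Sys.setenv(HADOOP_STREAMING = "/opt/hadoop/hadoop-2.5.1/share/hadoop/tools/lib/hadoop-streaming-2.5.1.jar")
library(rhdfs)   # loads only once HADOOP_CMD is set
hdfs.init()
library(rmr2)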

Answer 1 (by 3vpjnl9f):

After digging deeper into the error logs, it appears that I had installed the R libraries at the user level, while I should have installed them at the system level. Details on how to install R libraries system-wide can be found in this thread. (The "devtools" package may come in handy; just remember to run R under sudo, or if you prefer, sudo R CMD INSTALL [package_name].)
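
For example, a sketch of a system-level install from inside sudo R (the CRAN mirror and the RHadoop tarball names/versions are assumptions; use whatever matches your setup):

# run inside "sudo R" so packages land in the system library (.Library)
install.packages("stringr", lib = .Library, repos = "https://cloud.r-project.org")
# rmr2 and rhdfs are not on CRAN; install them from their release tarballs:
install.packages("rmr2_3.3.1.tar.gz", repos = NULL, type = "source", lib = .Library)
install.packages("rhdfs_1.0.8.tar.gz", repos = NULL, type = "source", lib = .Library)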
You can check where a package is installed in R via system.file(package="[package_name]"), but this always reports the first library on the package search path, so I strongly recommend removing the previously installed user-level libraries first.
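
A quick way to see the search order and which copy of a package wins (the package name here is just an example):

.libPaths()                        # the user library, if present, is listed first
system.file(package = "stringr")   # reports only the first match on that search path
remove.packages("stringr", lib = Sys.getenv("R_LIBS_USER"))   # drop the user-level copy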
Run it a few more times to double-check the error logs, and make sure the packages are properly installed in the R system library. The stderr log is very useful, but nobody pointed out its actual location before :-(
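
As a faster feedback loop than resubmitting to the cluster, rmr2 can also run the same job on its local backend, where a missing package fails with a plain R error right in the console. A sketch (to.dfs() writes rmr2's native format, so input.format is left at its default here):

rmr.options(backend = "local")   # execute map/reduce as in-process R code
res <- from.dfs(mapreduce(input = to.dfs(c("a b", "b c")), map = map, reduce = reduce))
res                              # the (word, count) pairs
rmr.options(backend = "hadoop")  # switch back for the real run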
