io - Building Hadoop on Windows 7

niknxzdl posted 2021-06-02 in Hadoop

I am following this tutorial to build Apache Hadoop in a Windows 7 environment. Long story short: mvn compile succeeds, and so does mvn package -DskipTests, but mvn package -Pdist,native-win -DskipTests -Dtar fails with I/O exceptions that I cannot resolve. I do not hit these exceptions when I build Hadoop without the -Dtar parameter.
Can anyone help me resolve these exceptions?

[INFO] Executing tasks
main:
      [get] Destination already exists (skipping): C:\hadoop\hadoop-hdfs-project\hadoop-hdfs-httpfs\downloads\tomcat.tar.gz
    [mkdir] Created dir: C:\hadoop\hadoop-hdfs-project\hadoop-hdfs-httpfs\target\tomcat.exp
 [exec] tar (child): C\:hadoophadoop-hdfs-projecthadoop-hdfs-httpfs/downloads/tomcat.tar.gz: Cannot open: I/O error
 [exec] tar (child): Error is not recoverable: exiting now
 [exec]
 [exec] gzip: stdin: unexpected end of file
 [exec] tar: Child returned status 2
 [exec] tar: Error exit delayed from previous errors
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop Main ................................ SUCCESS [  1.018 s]
[INFO] Apache Hadoop Project POM ......................... SUCCESS [  1.653 s]
[INFO] Apache Hadoop Annotations ......................... SUCCESS [  2.181 s]
[INFO] Apache Hadoop Assemblies .......................... SUCCESS [  0.200 s]
[INFO] Apache Hadoop Project Dist POM .................... SUCCESS [  2.889 s]
[INFO] Apache Hadoop Auth ................................ SUCCESS [  1.957 s]
[INFO] Apache Hadoop Auth Examples ....................... SUCCESS [  1.570 s]
[INFO] Apache Hadoop Common .............................. SUCCESS [ 50.085 s]
[INFO] Apache Hadoop Common Project ...................... SUCCESS [  0.090 s]
[INFO] Apache Hadoop HDFS ................................ SUCCESS [ 35.510 s]
[INFO] Apache Hadoop HttpFS .............................. FAILURE [  5.155 s]
[INFO] Apache Hadoop HDFS Project ........................ SKIPPED
[INFO] hadoop-yarn ....................................... SKIPPED
[INFO] hadoop-yarn-api ................................... SKIPPED
[INFO] hadoop-yarn-common ................................ SKIPPED
[INFO] hadoop-yarn-server ................................ SKIPPED
[INFO] hadoop-yarn-server-common ......................... SKIPPED
[INFO] hadoop-yarn-server-nodemanager .................... SKIPPED
[INFO] hadoop-yarn-server-web-proxy ...................... SKIPPED
[INFO] hadoop-yarn-server-resourcemanager ................ SKIPPED
[INFO] hadoop-yarn-server-tests .......................... SKIPPED
[INFO] hadoop-yarn-client ................................ SKIPPED
[INFO] hadoop-mapreduce-client ........................... SKIPPED
[INFO] hadoop-mapreduce-client-core ...................... SKIPPED
[INFO] hadoop-yarn-applications .......................... SKIPPED
[INFO] hadoop-yarn-applications-distributedshell ......... SKIPPED
[INFO] hadoop-yarn-site .................................. SKIPPED
[INFO] hadoop-yarn-project ............................... SKIPPED
[INFO] hadoop-mapreduce-client-common .................... SKIPPED
[INFO] hadoop-mapreduce-client-shuffle ................... SKIPPED
[INFO] hadoop-mapreduce-client-app ....................... SKIPPED
[INFO] hadoop-mapreduce-client-hs ........................ SKIPPED
[INFO] hadoop-mapreduce-client-jobclient ................. SKIPPED
[INFO] hadoop-mapreduce-client-hs-plugins ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] hadoop-mapreduce .................................. SKIPPED
[INFO] Apache Hadoop MapReduce Streaming ................. SKIPPED
[INFO] Apache Hadoop Distributed Copy .................... SKIPPED
[INFO] Apache Hadoop Archives ............................ SKIPPED
[INFO] Apache Hadoop Rumen ............................... SKIPPED
[INFO] Apache Hadoop Gridmix ............................. SKIPPED
[INFO] Apache Hadoop Data Join ........................... SKIPPED
[INFO] Apache Hadoop Extras .............................. SKIPPED    
[INFO] Apache Hadoop Pipes ............................... SKIPPED
[INFO] Apache Hadoop Tools Dist .......................... SKIPPED
[INFO] Apache Hadoop Tools ............................... SKIPPED
[INFO] Apache Hadoop Distribution ........................ SKIPPED
[INFO] Apache Hadoop Client .............................. SKIPPED
[INFO] Apache Hadoop Mini-Cluster ........................ SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:43 min
[INFO] Finished at: 2014-05-19T11:24:25+00:00
[INFO] Final Memory: 49M/179M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "native-win" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (dist) on project hadoop-hdfs-httpfs: An Ant BuildException has occured: exec returned: 2 -> [Help 1]

[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-hdfs-httpfs
c:\hadoop>
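One clue, though this is only my guess and not verified: the mangled path in the tar error (C:hadoophadoop-hdfs-projecthadoop-hdfs-httpfs/downloads/tomcat.tar.gz) looks like the Windows backslashes were stripped when Ant's exec task handed the path to a POSIX shell such as Cygwin's sh. That stripping can be reproduced in any POSIX shell:

```shell
# A POSIX shell treats an unquoted backslash as an escape character,
# so every separator in a Windows path is silently consumed.
printf '%s\n' C:\hadoop\downloads\tomcat.tar.gz
# prints: C:hadoopdownloadstomcat.tar.gz

# Quoting the path preserves the backslashes:
printf '%s\n' 'C:\hadoop\downloads\tomcat.tar.gz'
# prints: C:\hadoop\downloads\tomcat.tar.gz
```

That would explain why tar reports "Cannot open: I/O error" on a file that the earlier [get] step confirmed already exists.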
wxclj1h5 (answer 1):

If you are using a later version of Hadoop, i.e. hadoop-2.6, 2.7 or 2.8, you do not need to build the Hadoop source to get native Hadoop for Windows. There is a GitHub link containing winutils for recent Hadoop versions.
I ran into similar problems myself when building the Hadoop source with Maven, and these steps worked for me.
Download and install Java under c:/java/ (make sure the path looks like that; if Java is installed under Program Files, hadoop-env.cmd will not recognize the Java path because of the space).
Download a Hadoop binary distribution.
(I used the binary distribution hadoop-2.8.1.)
Set the environment variables:

JAVA_HOME=C:\Java
HADOOP_HOME=<your hadoop home>
Path=%PATH%;%JAVA_HOME%\bin;%HADOOP_HOME%\bin
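A cmd sketch of those variables for the current session (C:\Java and C:\hadoop-2.8.1 are assumed install locations; adjust them to yours):

```bat
:: Sketch for one cmd session; C:\Java and C:\hadoop-2.8.1 are assumed paths.
set JAVA_HOME=C:\Java
set HADOOP_HOME=C:\hadoop-2.8.1
:: Append both bin directories to PATH for this session.
set PATH=%PATH%;%JAVA_HOME%\bin;%HADOOP_HOME%\bin
```

Variables set with "set" last only for that cmd window; to make them permanent, use the Environment Variables dialog (or setx, noting that setx values only appear in newly opened prompts).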

Hadoop will work on Windows if the Hadoop source is built with Maven on a Windows machine: building the source (the distribution) creates a Hadoop binary distribution that serves as the native Windows version.
But if you do not want to do that, download the pre-built winutils of a Hadoop distribution. There is a GitHub link containing winutils for several Hadoop versions.
(If the version you use is not in that list, follow the usual method of setting up Hadoop on Windows - link.)
If your version is found, copy-paste the entire contents of that folder into <HADOOP_HOME>/bin/.
Set up all the .xml configuration files (link) and set the JAVA_HOME path in the hadoop-env.cmd file.
Then from the command prompt run:

<HADOOP_HOME>/bin> hdfs namenode -format
<HADOOP_HOME>/sbin> start-all.cmd
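For reference, the .xml configuration mentioned above can start from a minimal single-node core-site.xml; the localhost address and port 9000 here are assumed conventional values for a local setup, not taken from the answer:

```xml
<!-- core-site.xml: minimal single-node sketch (assumed values) -->
<configuration>
  <property>
    <!-- Where HDFS clients and daemons find the NameNode -->
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
```

On a single node, hdfs-site.xml typically also sets dfs.replication to 1.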

Hope this helps.
