Is it possible to run Flume on an EC2 instance with HDFS in EMR as the sink?

snvhrwxg · asked 2021-05-29 · in Hadoop

I have a Hadoop cluster in EMR and a separate EC2 instance. I need to install Flume on the EC2 node and use HDFS in EMR as the sink. I have been trying for a while but cannot get it to work. I get the following exception:

2017-12-19 10:50:32,428 (SinkRunner-PollingRunner-DefaultSinkProcessor) [WARN - org.apache.flume.sink.hdfs.HDFSEventSink.process(HDFSEventSink.java:443)] HDFS IO error
org.apache.hadoop.ipc.RemoteException: Server IPC version 9 cannot communicate with client version 3
     at org.apache.hadoop.ipc.Client.call(Client.java:740)
     at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:220)
     at com.sun.proxy.$Proxy4.getProtocolVersion(Unknown Source)
     at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:359)
     at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:106)
     at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:207)
     at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:170)
     at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:82)
     at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1378)
     at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:66)
     at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1390)
     at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:196)
     at org.apache.hadoop.fs.Path.getFileSystem(Path.java:175)
     at org.apache.flume.sink.hdfs.BucketWriter$1.call(BucketWriter.java:260)
     at org.apache.flume.sink.hdfs.BucketWriter$1.call(BucketWriter.java:252)
     at org.apache.flume.sink.hdfs.BucketWriter$9$1.run(BucketWriter.java:701)
     at org.apache.flume.auth.SimpleAuthenticator.execute(SimpleAuthenticator.java:50)
     at org.apache.flume.sink.hdfs.BucketWriter$9.call(BucketWriter.java:698)
     at java.util.concurrent.FutureTask.run(FutureTask.java:266)
     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
     at java.lang.Thread.run(Thread.java:748)
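For reference, a Flume agent with an HDFS sink pointing at a remote EMR cluster would look roughly like the sketch below. The agent name, source type, and the NameNode hostname and port are placeholders, not details taken from the question. Note that the exception above is about the client jars rather than the sink configuration itself: "Server IPC version 9" corresponds to Hadoop 2.x on the EMR side, while "client version 3" suggests very old Hadoop 0.20-era jars on Flume's classpath, so the Hadoop client libraries visible to Flume likely need to match the EMR cluster's Hadoop version.

```properties
# Minimal Flume agent sketch (names and hostnames are hypothetical).
agent.sources = src1
agent.channels = ch1
agent.sinks = hdfsSink

# Simple netcat source for illustration only.
agent.sources.src1.type = netcat
agent.sources.src1.bind = 0.0.0.0
agent.sources.src1.port = 44444
agent.sources.src1.channels = ch1

agent.channels.ch1.type = memory
agent.channels.ch1.capacity = 10000

# HDFS sink writing to the EMR cluster's NameNode.
# "emr-master-node" and port 8020 are placeholders; use the EMR
# master's private DNS name and the NameNode RPC port of the cluster.
agent.sinks.hdfsSink.type = hdfs
agent.sinks.hdfsSink.hdfs.path = hdfs://emr-master-node:8020/flume/events/%Y-%m-%d
agent.sinks.hdfsSink.hdfs.fileType = DataStream
agent.sinks.hdfsSink.hdfs.useLocalTimeStamp = true
agent.sinks.hdfsSink.channel = ch1
```

The EC2 instance's security group must also allow outbound access to the NameNode and DataNode ports on the EMR cluster for the sink to write successfully.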

No answers yet.
