Unable to stream Twitter data into HDFS with Flume

hsgswve4 · posted 2021-07-13 in Hadoop

I am trying to stream data from Twitter into HDFS with Flume on the Cloudera QuickStart VM 5.13. I get no errors, but the target directory stays empty.
Here is my flume.conf file:

TwitterAgent.channels = MemChannel
TwitterAgent.sinks = HDFS

TwitterAgent.sources.Twitter.type = org.apache.flume.source.twitter.TwitterSource
TwitterAgent.sources.Twitter.channels = MemChannel
TwitterAgent.sources.Twitter.consumerKey = Sp0ti7peTvFPDJSWMGk2ChMZM
TwitterAgent.sources.Twitter.consumerSecret = Cncmq5b6rKxWPb6qNSPkqpzIR7L3EcQ8WUCeG0gX4L9sPIzflN
TwitterAgent.sources.Twitter.accessToken = 1370386818609377287-IsLuhCt54wK4T2Ua9Cb0TC14rrs1c5
TwitterAgent.sources.Twitter.accessTokenSecret = AL7oYsVUQXz5KXtQSj0tu36R85MyvAsBjcgktdZD63Ou6
TwitterAgent.sources.Twitter.keywords = hadoop, big data, analytics, bigdata, cloudera, data science, data scientist, business intelligence, mapreduce, data warehouse, data warehousing, mahout, hbase, nosql, newsql, businessintelligence, cloudcomputing

TwitterAgent.sinks.HDFS.channel = MemChannel
TwitterAgent.sinks.HDFS.type = hdfs
TwitterAgent.sinks.HDFS.hdfs.path = hdfs://quickstart.cloudera:8020/user/flume/tweets/
TwitterAgent.sinks.HDFS.hdfs.fileType = DataStream
TwitterAgent.sinks.HDFS.hdfs.writeFormat = text 
TwitterAgent.sinks.HDFS.hdfs.batchSize = 1000
TwitterAgent.sinks.HDFS.hdfs.rollSize = 0
TwitterAgent.sinks.HDFS.hdfs.rollCount = 10000
TwitterAgent.sinks.HDFS.hdfs.rollInterval = 600

TwitterAgent.channels.MemChannel.type = memory
TwitterAgent.channels.MemChannel.capacity = 10000
TwitterAgent.channels.MemChannel.transitionCapacity = 100
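
For reference, the Flume configuration format expects each agent to declare its sources alongside its channels and sinks before the per-component properties are set; the file as posted declares channels and sinks but does not show a sources line. A minimal sketch of that declaration block, reusing the component names from this config, would be:

TwitterAgent.sources = Twitter
TwitterAgent.channels = MemChannel
TwitterAgent.sinks = HDFS

(Also worth noting: the memory channel's documented property name is transactionCapacity, whereas the file above sets transitionCapacity.)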

I start the stream with this command:

flume-ng agent --conf ./conf/ -f /home/cloudera/flume.conf -n TwitterAgent

Please help me figure out where I am going wrong; any suggestions would be much appreciated.
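
As a diagnostic sketch (these commands are not from the original post, just the standard Flume and HDFS CLIs applied to the paths above), running the agent with console logging shows whether the Twitter source connects and whether the HDFS sink writes anything, and the HDFS shell confirms the state of the target directory:

flume-ng agent --conf ./conf/ -f /home/cloudera/flume.conf -n TwitterAgent -Dflume.root.logger=DEBUG,console
hdfs dfs -ls /user/flume/tweets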
Thanks in advance.

No answers yet.

