hadoop: copying CSV using Flume spool dir to send files to HDFS, error: INFO source.SpoolDirectorySource: Spooling Directory Source runner has shutdown

bvn4nwqk posted on 2021-06-02 in Hadoop

I am trying to copy CSV files to HDFS using the Flume spooling directory source. I am a beginner with Hadoop concepts, so please help me resolve the problem below.
HDFS directory: /home/hdfs
Flume directory: /etc/flume/
The flume-hwdgteam01.conf file is shown below:


# Define a source, a channel, and a sink

hwdgteam01.sources = src1
hwdgteam01.channels = chan1
hwdgteam01.sinks = sink1

# Set the source type to Spooling Directory and set the directory
# location to /home/hwdgteam01/nandan/input-data

hwdgteam01.sources.src1.type = spooldir
hwdgteam01.sources.src1.spoolDir = /home/hwdgteam01/nandan/input-data
hwdgteam01.sources.src1.basenameHeader = true

# Configure the channel as a simple in-memory queue

hwdgteam01.channels.chan1.type = memory

# Define the HDFS sink and set its path to your target HDFS directory

hwdgteam01.sinks.sink1.type = hdfs
hwdgteam01.sinks.sink1.hdfs.path = /home/datalanding
hwdgteam01.sinks.sink1.hdfs.fileType = DataStream

# Disable rollover functionality as we want to keep the original files
# (these settings need the hdfs. prefix, or the sink silently ignores them)

hwdgteam01.sinks.sink1.hdfs.rollCount = 0
hwdgteam01.sinks.sink1.hdfs.rollInterval = 0
hwdgteam01.sinks.sink1.hdfs.rollSize = 0
hwdgteam01.sinks.sink1.hdfs.idleTimeout = 0

# Name each output file after the original file (uses the basename header set on the source)

hwdgteam01.sinks.sink1.hdfs.filePrefix = %{basename}

# Connect source and sink

hwdgteam01.sources.src1.channels = chan1
hwdgteam01.sinks.sink1.channel = chan1
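
Before starting the agent, it may help to confirm that the directories this configuration assumes actually exist. A minimal sketch (the local spool directory and the HDFS path are taken from the config above; that your user has the needed permissions is an assumption):

# The local spool directory must exist and be readable before the agent starts
mkdir -p /home/hwdgteam01/nandan/input-data

# Create the HDFS sink path from hdfs.path and confirm it is reachable
hdfs dfs -mkdir -p /home/datalanding
hdfs dfs -ls /home/datalanding

If the HDFS path is missing or not writable by the user running Flume, the sink cannot deliver events.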

Here is how I ran the command:
/usr/bin/flume-ng agent --conf conf --conf-file /home/hwdgteam01/nandan/config/flume-hwdgteam01.conf -Dflume.root.logger=DEBUG,console --name hwdgteam01

/usr/bin/flume-ng agent -n hwdgteam01 -f /home/hwdgteam01/nandan/config/flume-hwdgteam01.conf

/home/hwdgteam01/nandan/config/flume-ng agent -n hwdgteam01 -f /home/hwdgteam01/nandan/config/flume-hwdgteam01.conf
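
For reference, the stock launcher script is flume-ng (hyphenated), and the agent name given with --name/-n must match the property prefix used in the file (hwdgteam01 here). A minimal sketch of the intended invocation, assuming flume-ng is on the PATH and that /etc/flume/conf holds flume-env.sh and the log4j settings:

flume-ng agent \
  --conf /etc/flume/conf \
  --conf-file /home/hwdgteam01/nandan/config/flume-hwdgteam01.conf \
  --name hwdgteam01 \
  -Dflume.root.logger=DEBUG,console

Running with DEBUG,console keeps the log on the terminal, which makes it easier to see why the spooling directory source shut down.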
But none of these worked, and I get the error shown in flume error msg.
Please tell me where I am going wrong. Thanks for your help.
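
Flume selects components by agent name, so every property in the file must start with the same name passed via --name (hwdgteam01 here); a mismatch leaves the agent with nothing to run. A quick check that the file uses exactly one prefix:

grep -v -e '^#' -e '^$' /home/hwdgteam01/nandan/config/flume-hwdgteam01.conf | cut -d. -f1 | sort -u

This should print only hwdgteam01.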
