[Help] Using Flume to collect data from Kafka and persist it to HDFS
Start the Kafka console producer: ./kafka-console-producer --topic topic2017 --broker-list mfs-master:9092,mfs-log:9092,chunk0:9092
I type messages into the producer, but Flume never collects any data. I've been stuck on this for quite a while. Does anyone know what's wrong? Any pointers would be appreciated.
Here is my Flume configuration:
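Before debugging Flume itself, it is worth confirming that the messages actually land in the topic. A quick sanity check (assuming the standard Kafka CLI tools sit in the same bin directory as the producer script) is to attach a console consumer:

```shell
# Read the topic from the beginning; if nothing shows up here,
# the problem is on the producer/broker side, not in Flume.
./kafka-console-consumer --topic topic2017 \
    --bootstrap-server mfs-master:9092,mfs-log:9092,chunk0:9092 \
    --from-beginning
```

If messages appear here but still never reach HDFS, the issue is in the Flume configuration or in how the agent is started.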
agent1.sources = kafkaSource
agent1.channels = memoryChannel
agent1.sinks = hdfsSink
agent1.sources.kafkaSource.channels = memoryChannel
agent1.sources.kafkaSource.type=org.apache.flume.source.kafka.KafkaSource
agent1.sources.kafkaSource.kafka.bootstrap.servers= mfs-master:9092,mfs-log:9092,chunk0:9092
agent1.sources.kafkaSource.kafka.topics=topic2017
agent1.sources.kafkaSource.kafka.consumer.group.id=flume
agent1.channels.memoryChannel.type=memory
agent1.channels.memoryChannel.capacity=1000
agent1.channels.memoryChannel.transactionCapacity=100
# HDFS sink
agent1.sinks.hdfsSink.type=hdfs
agent1.sinks.hdfsSink.channel = memoryChannel
agent1.sinks.hdfsSink.hdfs.path=hdfs://mfs-master:9000/user/root/kafkaAndFlumeToHDFS/test/%Y-%m-%d
agent1.sinks.hdfsSink.hdfs.writeFormat=Text
agent1.sinks.hdfsSink.hdfs.fileType=DataStream
# %Y-%m-%d in hdfs.path needs a timestamp; use the agent's local time
# in case the events carry no timestamp header
agent1.sinks.hdfsSink.hdfs.useLocalTimeStamp=true
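For reference, this is how I would start the agent and check the result (assuming the config above is saved as kafka-to-hdfs.conf; note that --name must match the agent name used in the property keys, here agent1, or the whole config is silently ignored):

```shell
# Run the agent in the foreground with console logging so
# startup errors and ignored-property warnings are visible.
flume-ng agent \
    --conf $FLUME_HOME/conf \
    --conf-file kafka-to-hdfs.conf \
    --name agent1 \
    -Dflume.root.logger=INFO,console

# In another terminal, check whether files are being written.
hdfs dfs -ls /user/root/kafkaAndFlumeToHDFS/test/
```

A mismatched --name value, or a typo in a property prefix, produces no error at all: Flume simply drops the unrecognized keys, which matches the "no data arrives" symptom.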