Problem after integrating Storm with Kafka

ekekyn 2015-09-23 03:19:01
The topology class I created is shown below. The KafkaSpout and KafkaBolt used are the classes from the storm-kafka package:
	public static void main(String[] args) throws Exception {
		// Configure the ZooKeeper addresses
		BrokerHosts brokerHosts = new ZkHosts("redis-slave-001:2181,redis-slave-002:2181,redis-master-002:2181");
		// Configure the Kafka topic to subscribe to, plus the ZooKeeper node path and id used for offsets
		SpoutConfig spoutConfig = new SpoutConfig(brokerHosts, "topic1", "/zkkafkaspout", "kafkaspout");

		// Configure kafka.broker.properties for the KafkaBolt
		Config conf = new Config();
		Map<String, String> map = new HashMap<String, String>();
		// Kafka broker address
		map.put("metadata.broker.list", "redis-slave-002:9092");
		// serializer.class is the message serializer class
		map.put("serializer.class", "kafka.serializer.StringEncoder");
		conf.put("kafka.broker.properties", map);
		// Topic that the KafkaBolt writes to
		conf.put("topic", "topic2");

		spoutConfig.scheme = new SchemeAsMultiScheme(new MessageScheme());
		TopologyBuilder builder = new TopologyBuilder();
		builder.setSpout("spout", new KafkaSpout(spoutConfig));
		builder.setBolt("bolt", new SenqueceBolt()).shuffleGrouping("spout");
		builder.setBolt("kafkabolt", new KafkaBolt<String, Integer>()).shuffleGrouping("bolt");

		if (args != null && args.length > 0) {
			conf.setNumWorkers(3);
			StormSubmitter.submitTopology(args[0], conf, builder.createTopology());
		} else {
			LocalCluster cluster = new LocalCluster();
			cluster.submitTopology("Topo", conf, builder.createTopology());
			Utils.sleep(100000);
			cluster.killTopology("Topo");
			cluster.shutdown();
		}
	}
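
The MessageScheme assigned to spoutConfig.scheme above is not shown in the post. For reference, a minimal sketch of such a scheme, assuming it only decodes each Kafka message as a UTF-8 string and emits it as a single field, might look like this (field name "msg" is an assumption):

import java.io.UnsupportedEncodingException;
import java.util.List;

import backtype.storm.spout.Scheme;
import backtype.storm.tuple.Fields;
import backtype.storm.tuple.Values;

public class MessageScheme implements Scheme {

	// Decode the raw Kafka payload bytes into a one-field tuple.
	@Override
	public List<Object> deserialize(byte[] ser) {
		try {
			return new Values(new String(ser, "UTF-8"));
		} catch (UnsupportedEncodingException e) {
			throw new RuntimeException(e);
		}
	}

	// Field name that downstream bolts (e.g. SenqueceBolt) read from.
	@Override
	public Fields getOutputFields() {
		return new Fields("msg");
	}
}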

After submitting it to Storm, the worker log reports the following error:
b.s.d.worker [ERROR] Error on initialization of server mk-worker
java.lang.NoClassDefFoundError: Lkafka/javaapi/consumer/ConsumerConnector;
at java.lang.Class.getDeclaredFields0(Native Method) ~[na:1.7.0_80]
at java.lang.Class.privateGetDeclaredFields(Class.java:2509) ~[na:1.7.0_80]
at java.lang.Class.getDeclaredField(Class.java:1959) ~[na:1.7.0_80]
at java.io.ObjectStreamClass.getDeclaredSUID(ObjectStreamClass.java:1659) ~[na:1.7.0_80]
at java.io.ObjectStreamClass.access$700(ObjectStreamClass.java:72) ~[na:1.7.0_80]
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:480) ~[na:1.7.0_80]
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:468) ~[na:1.7.0_80]
at java.security.AccessController.doPrivileged(Native Method) ~[na:1.7.0_80]
at java.io.ObjectStreamClass.<init>(ObjectStreamClass.java:468) ~[na:1.7.0_80]
at java.io.ObjectStreamClass.lookup(ObjectStreamClass.java:365) ~[na:1.7.0_80]
at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:602) ~[na:1.7.0_80]
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1622) ~[na:1.7.0_80]
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1517) ~[na:1.7.0_80]
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1771) ~[na:1.7.0_80]
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350) ~[na:1.7.0_80]
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370) ~[na:1.7.0_80]
at backtype.storm.serialization.DefaultSerializationDelegate.deserialize(DefaultSerializationDelegate.java:52) ~[storm-core-0.9.5.jar:0.9.5]
at backtype.storm.utils.Utils.deserialize(Utils.java:89) ~[storm-core-0.9.5.jar:0.9.5]
at backtype.storm.utils.Utils.getSetComponentObject(Utils.java:228) ~[storm-core-0.9.5.jar:0.9.5]
at backtype.storm.daemon.task$get_task_object.invoke(task.clj:73) ~[storm-core-0.9.5.jar:0.9.5]
at backtype.storm.daemon.task$mk_task_data$fn__6337.invoke(task.clj:180) ~[storm-core-0.9.5.jar:0.9.5]
at backtype.storm.util$assoc_apply_self.invoke(util.clj:850) ~[storm-core-0.9.5.jar:0.9.5]
at backtype.storm.daemon.task$mk_task_data.invoke(task.clj:173) ~[storm-core-0.9.5.jar:0.9.5]
at backtype.storm.daemon.task$mk_task.invoke(task.clj:184) ~[storm-core-0.9.5.jar:0.9.5]
at backtype.storm.daemon.executor$mk_executor$fn__6516.invoke(executor.clj:323) ~[storm-core-0.9.5.jar:0.9.5]
at clojure.core$map$fn__4207.invoke(core.clj:2485) ~[clojure-1.5.1.jar:na]
at clojure.lang.LazySeq.sval(LazySeq.java:42) ~[clojure-1.5.1.jar:na]
at clojure.lang.LazySeq.seq(LazySeq.java:60) ~[clojure-1.5.1.jar:na]
at clojure.lang.RT.seq(RT.java:484) ~[clojure-1.5.1.jar:na]
at clojure.core$seq.invoke(core.clj:133) ~[clojure-1.5.1.jar:na]
at clojure.core.protocols$seq_reduce.invoke(protocols.clj:30) ~[clojure-1.5.1.jar:na]
at clojure.core.protocols$fn__6026.invoke(protocols.clj:54) ~[clojure-1.5.1.jar:na]
at clojure.core.protocols$fn__5979$G__5974__5992.invoke(protocols.clj:13) ~[clojure-1.5.1.jar:na]
at clojure.core$reduce.invoke(core.clj:6177) ~[clojure-1.5.1.jar:na]
at clojure.core$into.invoke(core.clj:6229) ~[clojure-1.5.1.jar:na]
at backtype.storm.daemon.executor$mk_executor.invoke(executor.clj:323) ~[storm-core-0.9.5.jar:0.9.5]
at backtype.storm.daemon.worker$fn__6959$exec_fn__1103__auto____6960$iter__6965__6969$fn__6970.invoke(worker.clj:424) ~[storm-core-0.9.5.jar:0.9.5]
at clojure.lang.LazySeq.sval(LazySeq.java:42) ~[clojure-1.5.1.jar:na]
at clojure.lang.LazySeq.seq(LazySeq.java:60) ~[clojure-1.5.1.jar:na]
at clojure.lang.Cons.next(Cons.java:39) ~[clojure-1.5.1.jar:na]
at clojure.lang.RT.next(RT.java:598) ~[clojure-1.5.1.jar:na]
at clojure.core$next.invoke(core.clj:64) ~[clojure-1.5.1.jar:na]
at clojure.core$dorun.invoke(core.clj:2781) ~[clojure-1.5.1.jar:na]
at clojure.core$doall.invoke(core.clj:2796) ~[clojure-1.5.1.jar:na]
at backtype.storm.daemon.worker$fn__6959$exec_fn__1103__auto____6960.invoke(worker.clj:424) ~[storm-core-0.9.5.jar:0.9.5]
at clojure.lang.AFn.applyToHelper(AFn.java:185) [clojure-1.5.1.jar:na]
at clojure.lang.AFn.applyTo(AFn.java:151) [clojure-1.5.1.jar:na]
at clojure.core$apply.invoke(core.clj:617) ~[clojure-1.5.1.jar:na]
at backtype.storm.daemon.worker$fn__6959$mk_worker__7015.doInvoke(worker.clj:391) [storm-core-0.9.5.jar:0.9.5]
at clojure.lang.RestFn.invoke(RestFn.java:512) [clojure-1.5.1.jar:na]
at backtype.storm.daemon.worker$_main.invoke(worker.clj:502) [storm-core-0.9.5.jar:0.9.5]
at clojure.lang.AFn.applyToHelper(AFn.java:172) [clojure-1.5.1.jar:na]
at clojure.lang.AFn.applyTo(AFn.java:151) [clojure-1.5.1.jar:na]
at backtype.storm.daemon.worker.main(Unknown Source) [storm-core-0.9.5.jar:0.9.5]
Caused by: java.lang.ClassNotFoundException: kafka.javaapi.consumer.ConsumerConnector
at java.net.URLClassLoader$1.run(URLClassLoader.java:366) ~[na:1.7.0_80]
at java.net.URLClassLoader$1.run(URLClassLoader.java:355) ~[na:1.7.0_80]
at java.security.AccessController.doPrivileged(Native Method) ~[na:1.7.0_80]
at java.net.URLClassLoader.findClass(URLClassLoader.java:354) ~[na:1.7.0_80]
at java.lang.ClassLoader.loadClass(ClassLoader.java:425) ~[na:1.7.0_80]
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308) ~[na:1.7.0_80]
at java.lang.ClassLoader.loadClass(ClassLoader.java:358) ~[na:1.7.0_80]
... 54 common frames omitted
2015-09-22T16:24:19.920+0800 b.s.util [ERROR] Halting process: ("Error on initialization")
java.lang.RuntimeException: ("Error on initialization")

3 replies
ZwRiven 2018-11-09
Check the error message to see which classes are missing, then upload the corresponding jars from your local machine to the storm lib directory on the cluster.
Just upload the required jars to storm's lib directory and it will work.
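
A common alternative, not mentioned in the replies above, is to bundle the Kafka client into the topology jar itself rather than copying jars into every supervisor's lib directory: keep storm-core at provided scope, add storm-kafka and a Kafka client as normal dependencies, and build a fat jar with the maven-shade-plugin. A sketch of the dependency section, assuming Maven is used (treat the exact versions below as examples to be matched against your brokers):

<!-- storm-kafka ships KafkaSpout/KafkaBolt but expects you to supply the Kafka client yourself -->
<dependency>
	<groupId>org.apache.storm</groupId>
	<artifactId>storm-kafka</artifactId>
	<version>0.9.5</version>
</dependency>
<!-- The Kafka client that provides kafka.javaapi.consumer.ConsumerConnector;
     pick the Scala/Kafka version matching your cluster (2.10/0.8.2.1 is only an example) -->
<dependency>
	<groupId>org.apache.kafka</groupId>
	<artifactId>kafka_2.10</artifactId>
	<version>0.8.2.1</version>
	<exclusions>
		<!-- Storm already provides ZooKeeper and logging on the worker classpath -->
		<exclusion>
			<groupId>org.apache.zookeeper</groupId>
			<artifactId>zookeeper</artifactId>
		</exclusion>
		<exclusion>
			<groupId>log4j</groupId>
			<artifactId>log4j</artifactId>
		</exclusion>
	</exclusions>
</dependency>

With the Kafka classes packaged into the shaded topology jar, the worker no longer depends on extra jars being present in storm/lib on every node.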
qq_17436835 2016-04-15
OP, did you ever solve this? I've just run into the same problem.
