Error when creating an HBase-linked table in Hive

xz43 2014-03-28 05:25:32
I set up a 3-node environment integrating hadoop-2.2.0, hbase-0.96.1.1-hadoop2, and hive-0.12.0, with one namenode and two datanodes. After finishing the configuration I pulled the Hive source, recompiled it, and replaced the jars under hive/lib with the newly built ones. After starting the metastore service, Hive connects fine and creating ordinary tables works, but creating an HBase-managed table fails with the error below:
hive> create table hivetest(key int, val string) STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,cf1:val") TBLPROPERTIES ("hbase.table.name" = "hivetest");
14/03/28 16:08:15 INFO log.PerfLogger: <PERFLOG method=Driver.run from=org.apache.hadoop.hive.ql.Driver>
14/03/28 16:08:15 INFO log.PerfLogger: <PERFLOG method=TimeToSubmit from=org.apache.hadoop.hive.ql.Driver>
14/03/28 16:08:15 INFO log.PerfLogger: <PERFLOG method=compile from=org.apache.hadoop.hive.ql.Driver>
14/03/28 16:08:15 DEBUG parse.VariableSubstitution: Substitution is on: create table hivetest(key int, val string) STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,cf1:val") TBLPROPERTIES ("hbase.table.name" = "hivetest")
14/03/28 16:08:15 INFO log.PerfLogger: <PERFLOG method=parse from=org.apache.hadoop.hive.ql.Driver>
14/03/28 16:08:15 INFO parse.ParseDriver: Parsing command: create table hivetest(key int, val string) STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,cf1:val") TBLPROPERTIES ("hbase.table.name" = "hivetest")
14/03/28 16:08:15 INFO parse.ParseDriver: Parse Completed
14/03/28 16:08:15 INFO log.PerfLogger: </PERFLOG method=parse start=1395994095285 end=1395994095548 duration=263 from=org.apache.hadoop.hive.ql.Driver>
14/03/28 16:08:15 INFO log.PerfLogger: <PERFLOG method=semanticAnalyze from=org.apache.hadoop.hive.ql.Driver>
14/03/28 16:08:15 INFO parse.SemanticAnalyzer: Starting Semantic Analysis
14/03/28 16:08:15 INFO parse.SemanticAnalyzer: Creating table hivetest position=13
14/03/28 16:08:15 INFO ql.Driver: Semantic Analysis Completed
14/03/28 16:08:15 DEBUG parse.SemanticAnalyzer: validation start
14/03/28 16:08:15 DEBUG parse.SemanticAnalyzer: not validating writeEntity, because entity is neither table nor partition
14/03/28 16:08:15 INFO log.PerfLogger: </PERFLOG method=semanticAnalyze start=1395994095548 end=1395994095686 duration=138 from=org.apache.hadoop.hive.ql.Driver>
14/03/28 16:08:15 INFO ql.Driver: Returning Hive schema: Schema(fieldSchemas:null, properties:null)
14/03/28 16:08:15 INFO log.PerfLogger: </PERFLOG method=compile start=1395994095225 end=1395994095701 duration=476 from=org.apache.hadoop.hive.ql.Driver>
14/03/28 16:08:15 INFO ql.Driver: Concurrency mode is disabled, not creating a lock manager
14/03/28 16:08:15 INFO log.PerfLogger: <PERFLOG method=Driver.execute from=org.apache.hadoop.hive.ql.Driver>
14/03/28 16:08:15 INFO ql.Driver: Starting command: create table hivetest(key int, val string) STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,cf1:val") TBLPROPERTIES ("hbase.table.name" = "hivetest")
Query ID = hadoop_20140328160808_7f47197f-7322-4aa0-8445-eaecdcfa4717
14/03/28 16:08:15 INFO ql.Driver: Query ID = hadoop_20140328160808_7f47197f-7322-4aa0-8445-eaecdcfa4717
14/03/28 16:08:15 INFO log.PerfLogger: </PERFLOG method=TimeToSubmit start=1395994095225 end=1395994095721 duration=496 from=org.apache.hadoop.hive.ql.Driver>
14/03/28 16:08:15 INFO log.PerfLogger: <PERFLOG method=runTasks from=org.apache.hadoop.hive.ql.Driver>
14/03/28 16:08:15 INFO log.PerfLogger: <PERFLOG method=task.DDL.Stage-0 from=org.apache.hadoop.hive.ql.Driver>
14/03/28 16:08:15 INFO exec.DDLTask: Use StorageHandler-supplied org.apache.hadoop.hive.hbase.HBaseSerDe for table hivetest
14/03/28 16:08:15 DEBUG security.Groups: Returning fetched groups for 'hadoop'
14/03/28 16:08:15 DEBUG security.Groups: Returning cached groups for 'hadoop'
14/03/28 16:08:15 WARN conf.HiveConf: DEPRECATED: hive.metastore.ds.retry.* no longer has any effect. Use hive.hmshandler.retry.* instead
14/03/28 16:08:15 DEBUG session.SessionState: Session is using authorization class class org.apache.hadoop.hive.ql.security.authorization.DefaultHiveAuthorizationProvider
14/03/28 16:08:15 DEBUG hive.log: DDL: struct hivetest { i32 key, string val}
14/03/28 16:08:16 DEBUG hbase.HBaseSerDe: HBaseSerDe initialized with : columnNames = [key, val] columnTypes = [int, string] hbaseColumnMapping = :key,cf1:val
14/03/28 16:08:16 DEBUG hive.log: DDL: struct hivetest { i32 key, string val}
14/03/28 16:08:16 DEBUG hbase.HBaseSerDe: HBaseSerDe initialized with : columnNames = [key, val] columnTypes = [int, string] hbaseColumnMapping = :key,cf1:val
14/03/28 16:08:16 INFO zookeeper.ZooKeeper: Client environment:zookeeper.version=3.4.5-1392090, built on 09/30/2012 17:52 GMT
14/03/28 16:08:16 INFO zookeeper.ZooKeeper: Client environment:host.name=testserver4
14/03/28 16:08:16 INFO zookeeper.ZooKeeper: Client environment:java.version=1.7.0_01
14/03/28 16:08:16 INFO zookeeper.ZooKeeper: Client environment:java.vendor=Oracle Corporation
14/03/28 16:08:16 INFO zookeeper.ZooKeeper: Client environment:java.home=/usr/java/jdk1.7.0_01/jre

14/03/28 16:08:16 INFO zookeeper.ZooKeeper: Client environment:java.class.path=(omitted: too long to paste)
14/03/28 16:08:16 INFO zookeeper.ZooKeeper: Client environment:java.library.path=/home/hadoop/hadoop-2.2.0/lib/native
14/03/28 16:08:16 INFO zookeeper.ZooKeeper: Client environment:java.io.tmpdir=/tmp
14/03/28 16:08:16 INFO zookeeper.ZooKeeper: Client environment:java.compiler=<NA>
14/03/28 16:08:16 INFO zookeeper.ZooKeeper: Client environment:os.name=Linux
14/03/28 16:08:16 INFO zookeeper.ZooKeeper: Client environment:os.arch=amd64
14/03/28 16:08:16 INFO zookeeper.ZooKeeper: Client environment:os.version=2.6.32-131.0.15.el6.x86_64
14/03/28 16:08:16 INFO zookeeper.ZooKeeper: Client environment:user.name=hadoop
14/03/28 16:08:16 INFO zookeeper.ZooKeeper: Client environment:user.home=/home/hadoop
14/03/28 16:08:16 INFO zookeeper.ZooKeeper: Client environment:user.dir=/home/hadoop
14/03/28 16:08:16 INFO zookeeper.ZooKeeper: Initiating client connection, connectString=javadb11g:2181,java11g2:2181 sessionTimeout=90000 watcher=hconnection-0x2896bed3, quorum=javadb11g:2181,java11g2:2181, baseZNode=/hbase
14/03/28 16:08:16 DEBUG zookeeper.ClientCnxn: zookeeper.disableAutoWatchReset is false
14/03/28 16:08:16 INFO zookeeper.RecoverableZooKeeper: Process identifier=hconnection-0x2896bed3 connecting to ZooKeeper ensemble=javadb11g:2181,java11g2:2181
14/03/28 16:08:16 INFO zookeeper.ClientCnxn: Opening socket connection to server java11g2/192.168.0.230:2181. Will not attempt to authenticate using SASL (unknown error)
14/03/28 16:08:16 INFO zookeeper.ClientCnxn: Socket connection established to java11g2/192.168.0.230:2181, initiating session
14/03/28 16:08:16 DEBUG zookeeper.ClientCnxn: Session establishment request sent on java11g2/192.168.0.230:2181
Could anyone take a look and tell me whether something in my configuration is wrong, or whether the recompiled jars I swapped in are the problem? Thanks.
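For reference, the stack traces in the replies show that HBaseStorageHandler opens an HBase client connection already at CREATE TABLE time (preCreateTable → getHBaseAdmin), so the HBase client jars and their dependencies, including htrace-core, must be visible to the Hive CLI. A minimal sketch of how that is usually wired up for Hive 0.12, with the install paths assumed from the environment described above rather than taken from the actual machines:

# Assumed install locations -- adjust to the real layout.
HIVE_HOME=/home/hadoop/hive-0.12.0
HBASE_HOME=/home/hadoop/hbase-0.96.1.1-hadoop2

# Build a comma-separated list of the jars the handler needs at DDL time:
# the hive-hbase-handler itself plus the HBase client jars and htrace-core.
AUX_JARS=$(ls "$HIVE_HOME"/lib/hive-hbase-handler-*.jar \
              "$HBASE_HOME"/lib/hbase-*.jar \
              "$HBASE_HOME"/lib/htrace-core-*.jar 2>/dev/null | tr '\n' ',')
AUX_JARS=${AUX_JARS%,}

# Pass the jars and the ZooKeeper quorum (taken from the log above) to the CLI ...
hive --auxpath "$AUX_JARS" --hiveconf hbase.zookeeper.quorum=javadb11g,java11g2

# ... or make it permanent, e.g. in hive-env.sh:
#   export HIVE_AUX_JARS_PATH="$AUX_JARS"
# or via the hive.aux.jars.path property in hive-site.xml.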
13 replies
明枫叶林丶 2016-01-11
Exception in thread "main" java.lang.AssertionError: Unknown token: [@-1,0:0='TOK_STORAGEHANDLER',<823>,0:-1] at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeCreateTable(SemanticAnalyzer.java:10895) at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genResolvedParseTree(SemanticAnalyzer.java:10103) at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:10147) at org.apache.hadoop.hive.ql.parse.CalcitePlanner.analyzeInternal(CalcitePlanner.java:192) at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:222) at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:421) at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:307) at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1112) at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1160) at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1049) at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1039) at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:207) at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:159) at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:370) at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:754) at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:675) at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:615) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:606) at org.apache.hadoop.util.RunJar.run(RunJar.java:221) at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
shunzhi1988 2015-01-21
My setup isn't missing any jars, so why do I get this error too?
SG90 2014-08-25
Caused by: java.lang.ClassNotFoundException: org.cloudera.htrace.Trace. It still looks like a jar problem.
ewwerpm 2014-08-25
You are missing something like htrace-core-2.04.jar; the exact version may differ.
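If that is the cause here, a minimal sketch of the usual fix, assuming (not confirmed on the poster's machines) that HBase 0.96 ships an htrace-core jar under its lib directory:

# Assumed locations -- adjust to the actual installs.
HBASE_LIB=/home/hadoop/hbase-0.96.1.1-hadoop2/lib
HIVE_LIB=/home/hadoop/hive-0.12.0/lib

# See which htrace jar HBase bundles (the version may differ from 2.04).
ls "$HBASE_LIB"/htrace-core-*.jar

# Option 1: copy it next to Hive's own libraries; it is picked up when the CLI restarts.
cp "$HBASE_LIB"/htrace-core-*.jar "$HIVE_LIB"/

# Option 2: register it only for the current session from the Hive prompt:
#   hive> ADD JAR /home/hadoop/hbase-0.96.1.1-hadoop2/lib/htrace-core-2.04.jar;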
As the poster above said, it is probably the missing htrace jar; also, integrating HBase 0.96.0 with Hive 0.12.0 is known to cause problems.
仙人掌花开 2014-05-19
May I ask how you finally solved this problem?
wingerliwei 2014-05-09
I suggest integrating hbase-0.96.2 with hive-0.13.0; Hive 0.12 has bugs in the HBase integration.
zuochanxiaoheshang 2014-03-31
You are probably missing a jar: htrace-×××.jar.
xz43 2014-03-28
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:MetaException(message:java.io.IOException: java.lang.reflect.InvocationTargetException at org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:389) at org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:366) at org.apache.hadoop.hbase.client.HConnectionManager.getConnection(HConnectionManager.java:247) at org.apache.hadoop.hbase.client.HBaseAdmin.<init>(HBaseAdmin.java:183) at org.apache.hadoop.hive.hbase.HBaseStorageHandler.getHBaseAdmin(HBaseStorageHandler.java:84) at org.apache.hadoop.hive.hbase.HBaseStorageHandler.preCreateTable(HBaseStorageHandler.java:162) at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:546) at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:539) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:601) at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:89) at $Proxy7.createTable(Unknown Source) at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:609) at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:4057) at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:276) at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:153) at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:85) at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1472) at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1239) at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1057) at org.apache.hadoop.hive.ql.Driver.run(Driver.java:884) at org.apache.hadoop.hive.ql.Driver.run(Driver.java:874) at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:268) at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:220) at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:424) at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:793) at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:687) at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:626) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:601) at org.apache.hadoop.util.RunJar.main(RunJar.java:212) Caused by: java.lang.reflect.InvocationTargetException at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57) at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) at java.lang.reflect.Constructor.newInstance(Constructor.java:525) at org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:387) ... 
34 more Caused by: java.lang.NoClassDefFoundError: org/cloudera/htrace/Trace at org.apache.hadoop.hbase.zookeeper.RecoverableZooKeeper.exists(RecoverableZooKeeper.java:195) at org.apache.hadoop.hbase.zookeeper.ZKUtil.checkExists(ZKUtil.java:479) at org.apache.hadoop.hbase.zookeeper.ZKClusterId.readClusterIdZNode(ZKClusterId.java:65) at org.apache.hadoop.hbase.client.ZooKeeperRegistry.getClusterId(ZooKeeperRegistry.java:83) at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.retrieveClusterId(HConnectionManager.java:801) at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.<init>(HConnectionManager.java:633) ... 39 more Caused by: java.lang.ClassNotFoundException: org.cloudera.htrace.Trace at java.net.URLClassLoader$1.run(URLClassLoader.java:366) at java.net.URLClassLoader$1.run(URLClassLoader.java:355) at java.security.AccessController.doPrivileged(Native Method) at java.net.URLClassLoader.findClass(URLClassLoader.java:354) at java.lang.ClassLoader.loadClass(ClassLoader.java:423) at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308) at java.lang.ClassLoader.loadClass(ClassLoader.java:356) ... 45 more )
xz43 2014-03-28
at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:615) at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:4057) at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:276) at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:153) at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:85) at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1472) at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1239) at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1057) at org.apache.hadoop.hive.ql.Driver.run(Driver.java:884) at org.apache.hadoop.hive.ql.Driver.run(Driver.java:874) at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:268) at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:220) at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:424) at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:793) at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:687) at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:626) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:601) at org.apache.hadoop.util.RunJar.main(RunJar.java:212) Caused by: MetaException(message:MetaException(message:java.io.IOException: java.lang.reflect.InvocationTargetException at org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:389) at org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:366) at org.apache.hadoop.hbase.client.HConnectionManager.getConnection(HConnectionManager.java:247) at org.apache.hadoop.hbase.client.HBaseAdmin.<init>(HBaseAdmin.java:183) at org.apache.hadoop.hive.hbase.HBaseStorageHandler.getHBaseAdmin(HBaseStorageHandler.java:84) at org.apache.hadoop.hive.hbase.HBaseStorageHandler.preCreateTable(HBaseStorageHandler.java:162) at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:546) at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:539) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:601) at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:89) at $Proxy7.createTable(Unknown Source) at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:609) at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:4057) at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:276) at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:153) at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:85) at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1472) at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1239) at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1057) at org.apache.hadoop.hive.ql.Driver.run(Driver.java:884) at org.apache.hadoop.hive.ql.Driver.run(Driver.java:874) at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:268) at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:220) at 
org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:424) at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:793) at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:687) at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:626) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:601) at org.apache.hadoop.util.RunJar.main(RunJar.java:212) Caused by: java.lang.reflect.InvocationTargetException at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57) at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) at java.lang.reflect.Constructor.newInstance(Constructor.java:525) at org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:387) ... 34 more Caused by: java.lang.NoClassDefFoundError: org/cloudera/htrace/Trace at org.apache.hadoop.hbase.zookeeper.RecoverableZooKeeper.exists(RecoverableZooKeeper.java:195) at org.apache.hadoop.hbase.zookeeper.ZKUtil.checkExists(ZKUtil.java:479) at org.apache.hadoop.hbase.zookeeper.ZKClusterId.readClusterIdZNode(ZKClusterId.java:65) at org.apache.hadoop.hbase.client.ZooKeeperRegistry.getClusterId(ZooKeeperRegistry.java:83) at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.retrieveClusterId(HConnectionManager.java:801) at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.<init>(HConnectionManager.java:633) ... 39 more Caused by: java.lang.ClassNotFoundException: org.cloudera.htrace.Trace at java.net.URLClassLoader$1.run(URLClassLoader.java:366) at java.net.URLClassLoader$1.run(URLClassLoader.java:355) at java.security.AccessController.doPrivileged(Native Method) at java.net.URLClassLoader.findClass(URLClassLoader.java:354) at java.lang.ClassLoader.loadClass(ClassLoader.java:423) at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308) at java.lang.ClassLoader.loadClass(ClassLoader.java:356) ... 
45 more ) at org.apache.hadoop.hive.hbase.HBaseStorageHandler.getHBaseAdmin(HBaseStorageHandler.java:88) at org.apache.hadoop.hive.hbase.HBaseStorageHandler.preCreateTable(HBaseStorageHandler.java:162) at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:546) at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:539) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:601) at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:89) at $Proxy7.createTable(Unknown Source) at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:609) at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:4057) at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:276) at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:153) at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:85) at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1472) at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1239) at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1057) at org.apache.hadoop.hive.ql.Driver.run(Driver.java:884) at org.apache.hadoop.hive.ql.Driver.run(Driver.java:874) at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:268) at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:220) at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:424) at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:793) at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:687) at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:626) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:601) at org.apache.hadoop.util.RunJar.main(RunJar.java:212) ) at org.apache.hadoop.hive.hbase.HBaseStorageHandler.preCreateTable(HBaseStorageHandler.java:212) at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:546) at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:539) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:601) at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:89) at $Proxy7.createTable(Unknown Source) at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:609) ... 20 more
xz43 2014-03-28
14/03/28 16:08:16 ERROR exec.DDLTask: org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:MetaException(message:java.io.IOException: java.lang.reflect.InvocationTargetException at org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:389) at org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:366) at org.apache.hadoop.hbase.client.HConnectionManager.getConnection(HConnectionManager.java:247) at org.apache.hadoop.hbase.client.HBaseAdmin.<init>(HBaseAdmin.java:183) at org.apache.hadoop.hive.hbase.HBaseStorageHandler.getHBaseAdmin(HBaseStorageHandler.java:84) at org.apache.hadoop.hive.hbase.HBaseStorageHandler.preCreateTable(HBaseStorageHandler.java:162) at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:546) at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:539) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:601) at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:89) at $Proxy7.createTable(Unknown Source) at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:609) at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:4057) at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:276) at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:153) at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:85) at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1472) at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1239) at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1057) at org.apache.hadoop.hive.ql.Driver.run(Driver.java:884) at org.apache.hadoop.hive.ql.Driver.run(Driver.java:874) at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:268) at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:220) at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:424) at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:793) at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:687) at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:626) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:601) at org.apache.hadoop.util.RunJar.main(RunJar.java:212) Caused by: java.lang.reflect.InvocationTargetException at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57) at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) at java.lang.reflect.Constructor.newInstance(Constructor.java:525) at org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:387) ... 
34 more Caused by: java.lang.NoClassDefFoundError: org/cloudera/htrace/Trace at org.apache.hadoop.hbase.zookeeper.RecoverableZooKeeper.exists(RecoverableZooKeeper.java:195) at org.apache.hadoop.hbase.zookeeper.ZKUtil.checkExists(ZKUtil.java:479) at org.apache.hadoop.hbase.zookeeper.ZKClusterId.readClusterIdZNode(ZKClusterId.java:65) at org.apache.hadoop.hbase.client.ZooKeeperRegistry.getClusterId(ZooKeeperRegistry.java:83) at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.retrieveClusterId(HConnectionManager.java:801) at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.<init>(HConnectionManager.java:633) ... 39 more Caused by: java.lang.ClassNotFoundException: org.cloudera.htrace.Trace at java.net.URLClassLoader$1.run(URLClassLoader.java:366) at java.net.URLClassLoader$1.run(URLClassLoader.java:355) at java.security.AccessController.doPrivileged(Native Method) at java.net.URLClassLoader.findClass(URLClassLoader.java:354) at java.lang.ClassLoader.loadClass(ClassLoader.java:423) at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308) at java.lang.ClassLoader.loadClass(ClassLoader.java:356) ... 45 more ) at org.apache.hadoop.hive.hbase.HBaseStorageHandler.getHBaseAdmin(HBaseStorageHandler.java:88) at org.apache.hadoop.hive.hbase.HBaseStorageHandler.preCreateTable(HBaseStorageHandler.java:162) at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:546) at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:539) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:601) at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:89) at $Proxy7.createTable(Unknown Source) at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:609) at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:4057) at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:276) at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:153) at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:85) at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1472) at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1239) at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1057) at org.apache.hadoop.hive.ql.Driver.run(Driver.java:884) at org.apache.hadoop.hive.ql.Driver.run(Driver.java:874) at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:268) at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:220) at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:424) at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:793) at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:687) at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:626) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:601) at org.apache.hadoop.util.RunJar.main(RunJar.java:212) )