Error when Spark accesses the Hive metastore

z821193713 2017-06-07 05:49:56
My code is:

val spark = SparkSession
  .builder()
  .appName("Spark Hive Example")
  .config(sparkConf)
  .enableHiveSupport()
  .getOrCreate()
import spark.implicits._
import spark.sql

sql("show databases").show()
sql("show tables").show()

The error log from compiling and running it is:
17/06/07 17:42:33 INFO util.log: Logging initialized @2826ms
17/06/07 17:42:33 INFO server.Server: jetty-9.2.z-SNAPSHOT
17/06/07 17:42:33 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@36d27f24{/jobs,null,AVAILABLE}
17/06/07 17:42:33 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3599309a{/jobs/json,null,AVAILABLE}
17/06/07 17:42:33 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@5c45e9f3{/jobs/job,null,AVAILABLE}
17/06/07 17:42:33 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@72027fd8{/jobs/job/json,null,AVAILABLE}
17/06/07 17:42:33 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@2177862{/stages,null,AVAILABLE}
17/06/07 17:42:33 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@52e06b00{/stages/json,null,AVAILABLE}
17/06/07 17:42:33 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@b90f782{/stages/stage,null,AVAILABLE}
17/06/07 17:42:33 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@5a697dbc{/stages/stage/json,null,AVAILABLE}
17/06/07 17:42:33 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@43bdf003{/stages/pool,null,AVAILABLE}
17/06/07 17:42:33 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6ad877b3{/stages/pool/json,null,AVAILABLE}
17/06/07 17:42:33 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@299abf92{/storage,null,AVAILABLE}
17/06/07 17:42:33 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6e971f94{/storage/json,null,AVAILABLE}
17/06/07 17:42:33 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@39a6645{/storage/rdd,null,AVAILABLE}
17/06/07 17:42:33 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@db03ddc{/storage/rdd/json,null,AVAILABLE}
17/06/07 17:42:33 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@2dad3d89{/environment,null,AVAILABLE}
17/06/07 17:42:33 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@48a304cc{/environment/json,null,AVAILABLE}
17/06/07 17:42:33 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@46901255{/executors,null,AVAILABLE}
17/06/07 17:42:33 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1ca3aea4{/executors/json,null,AVAILABLE}
17/06/07 17:42:33 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3e1e9fac{/executors/threadDump,null,AVAILABLE}
17/06/07 17:42:33 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@4c84f510{/executors/threadDump/json,null,AVAILABLE}
17/06/07 17:42:33 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@2cf514af{/static,null,AVAILABLE}
17/06/07 17:42:33 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@e210438{/,null,AVAILABLE}
17/06/07 17:42:33 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@199bffc7{/api,null,AVAILABLE}
17/06/07 17:42:33 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@50b56ef3{/stages/stage/kill,null,AVAILABLE}
17/06/07 17:42:33 INFO server.ServerConnector: Started ServerConnector@1d6777eb{HTTP/1.1}{0.0.0.0:4040}
17/06/07 17:42:33 INFO server.Server: Started @2944ms
17/06/07 17:42:33 INFO util.Utils: Successfully started service 'SparkUI' on port 4040.
17/06/07 17:42:33 INFO ui.SparkUI: Bound SparkUI to 0.0.0.0, and started at http://192.168.109.2:4040
17/06/07 17:42:33 INFO executor.Executor: Starting executor ID driver on host localhost
17/06/07 17:42:33 INFO util.Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 55408.
17/06/07 17:42:33 INFO netty.NettyBlockTransferService: Server created on 192.168.109.2:55408
17/06/07 17:42:33 INFO storage.BlockManager: external shuffle service port = 7337
17/06/07 17:42:33 INFO storage.BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 192.168.109.2, 55408)
17/06/07 17:42:33 INFO storage.BlockManagerMasterEndpoint: Registering block manager 192.168.109.2:55408 with 1993.8 MB RAM, BlockManagerId(driver, 192.168.109.2, 55408)
17/06/07 17:42:33 INFO storage.BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 192.168.109.2, 55408)
17/06/07 17:42:33 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@573d8208{/metrics/json,null,AVAILABLE}
17/06/07 17:42:33 WARN spark.SparkContext: Use an existing SparkContext, some configuration may not take effect.
17/06/07 17:42:33 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@568d1fd4{/SQL,null,AVAILABLE}
17/06/07 17:42:33 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@5aab7e5a{/SQL/json,null,AVAILABLE}
17/06/07 17:42:33 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@69f0d277{/SQL/execution,null,AVAILABLE}
17/06/07 17:42:33 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@38ce81be{/SQL/execution/json,null,AVAILABLE}
17/06/07 17:42:33 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@799701da{/static/sql,null,AVAILABLE}
17/06/07 17:42:33 INFO hive.HiveSharedState: Warehouse path is 'spark-warehouse'.
17/06/07 17:42:33 INFO execution.SparkSqlParser: Parsing command: show databases
17/06/07 17:42:33 INFO hive.HiveUtils: Initializing HiveMetastoreConnection version 1.1.0 using Spark classes.
17/06/07 17:42:34 INFO hive.metastore: Trying to connect to metastore with URI thrift://redhat208.life.com:9083
17/06/07 17:42:34 INFO hive.metastore: Opened a connection to metastore, current connections: 1
17/06/07 17:42:34 INFO hive.metastore: Connected to metastore.
Exception in thread "main" java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:354)
at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:258)
at org.apache.spark.sql.hive.HiveSharedState.metadataHive$lzycompute(HiveSharedState.scala:39)
at org.apache.spark.sql.hive.HiveSharedState.metadataHive(HiveSharedState.scala:38)
at org.apache.spark.sql.hive.HiveSharedState.externalCatalog$lzycompute(HiveSharedState.scala:46)
at org.apache.spark.sql.hive.HiveSharedState.externalCatalog(HiveSharedState.scala:45)
at org.apache.spark.sql.hive.HiveSessionState.catalog$lzycompute(HiveSessionState.scala:50)
at org.apache.spark.sql.hive.HiveSessionState.catalog(HiveSessionState.scala:48)
at org.apache.spark.sql.hive.HiveSessionState$$anon$1.<init>(HiveSessionState.scala:63)
at org.apache.spark.sql.hive.HiveSessionState.analyzer$lzycompute(HiveSessionState.scala:63)
at org.apache.spark.sql.hive.HiveSessionState.analyzer(HiveSessionState.scala:62)
at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:49)
at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:64)
at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:582)
at com.chinalife.cbdp.realtime.etl.SparkHiveExample$.main(SparkHiveExample.scala:68)
at com.chinalife.cbdp.realtime.etl.SparkHiveExample.main(SparkHiveExample.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:147)
Caused by: java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.createDirectoryWithMode0(Ljava/lang/String;I)V
at org.apache.hadoop.io.nativeio.NativeIO$Windows.createDirectoryWithMode0(Native Method)
at org.apache.hadoop.io.nativeio.NativeIO$Windows.createDirectoryWithMode(NativeIO.java:524)
at org.apache.hadoop.fs.RawLocalFileSystem.mkOneDirWithMode(RawLocalFileSystem.java:465)
at org.apache.hadoop.fs.RawLocalFileSystem.mkdirsWithOptionalPermission(RawLocalFileSystem.java:518)
at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:496)
at org.apache.hadoop.fs.FilterFileSystem.mkdirs(FilterFileSystem.java:309)
at org.apache.hadoop.hive.ql.session.SessionState.createPath(SessionState.java:651)
at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:583)
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:517)
at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:189)
... 26 more
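For what it's worth, the final `Caused by` (`UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.createDirectoryWithMode0`) typically means the job is being launched from a Windows machine (here, from IntelliJ) without the native Hadoop binaries (`winutils.exe` / `hadoop.dll`) matching the Hadoop version on the classpath, so Hive session directory creation fails before any query runs. A common workaround, assuming the binaries are unpacked under an illustrative path like `C:\hadoop`, is to point `HADOOP_HOME` at them before launching:

```shell
rem Assumption: winutils.exe and hadoop.dll built for your Hadoop
rem version are placed under C:\hadoop\bin (path is illustrative).
set HADOOP_HOME=C:\hadoop
set PATH=%HADOOP_HOME%\bin;%PATH%
```

Equivalently, calling `System.setProperty("hadoop.home.dir", "C:\\hadoop")` at the top of `main`, before building the `SparkSession`, should have the same effect for a run from the IDE.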