Problem integrating Spark into a web project

岸上的小鱼吐泡泡 2017-02-09 04:37:00
I need to integrate a Spark program into a web application and read data from Hive. My demo code is as follows:
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

SparkSession spark = SparkSession.builder()
        .master("spark://10.2.4.41:7077")
        .appName("Java Spark Hive Example")
        .enableHiveSupport()
        .getOrCreate();
Dataset<Row> sqlDF = spark.sql("SELECT * FROM test");
sqlDF.show();
Running the code throws the following exception (partial stack trace):
AlreadyExistsException(message:Database default already exists)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.create_database(HiveMetaStore.java:891)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:107)
at com.sun.proxy.$Proxy491.create_database(Unknown Source)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createDatabase(HiveMetaStoreClient.java:644)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)

10.2.4.41 is the IP of the remote Spark master.
To be clear: I have no Spark or Hive services running locally; I am connecting to a remote Spark cluster. Yet after running the code above, the program created Hive data files under my local C: drive, while what I actually want is to query the remote Hive service. How should this code be written? Is there anything wrong with the code above? Thanks in advance.
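What the question describes (Hive data files appearing under the local C: drive, plus the Derby/AlreadyExistsException noise in the trace) is the usual sign that the driver found no Hive metastore configuration: with enableHiveSupport() but no hive-site.xml on the classpath, Spark spins up an embedded Derby metastore and a local spark-warehouse directory on whatever machine runs the driver. A common fix is to put the cluster's hive-site.xml on the web app's classpath (e.g. WEB-INF/classes). A minimal sketch, assuming the remote metastore is exposed as a thrift service; the host and port below are placeholders (9083 is only the Hive default, the actual cluster may differ):

```xml
<!-- hive-site.xml on the driver's classpath; host/port are placeholders -->
<configuration>
  <property>
    <name>hive.metastore.uris</name>
    <value>thrift://10.2.4.41:9083</value>
  </property>
</configuration>
```

The same setting can also be passed programmatically via SparkSession.builder().config("hive.metastore.uris", "thrift://host:9083") before getOrCreate().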

4 replies
LinkSe7en 2017-02-10
It looks like you are repeatedly creating a SparkSession/SparkContext inside a single JVM:
2017/02/09-15:02:46 >> WARN >> http-bio-8089-exec-3 >> org.apache.spark.internal.Logging$class.logWarning(Logging.scala:66) >> Use an existing SparkContext, some configuration may not take effect.
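That WARN line is emitted when SparkSession.builder()...getOrCreate() finds a SparkContext already running in the JVM: the existing context is reused and some of the new builder's settings are silently ignored. In a servlet container, every request thread that calls the builder can trigger this, so the usual fix is to construct the session exactly once per JVM. A minimal sketch of the initialization-on-demand holder idiom; since Spark jars are not assumed here, a plain String stands in for the SparkSession, and comments mark where the real builder call would go:

```java
// Initialization-on-demand holder idiom: the JVM class loader guarantees
// Holder.SESSION is initialized exactly once, on first use, thread-safely.
// A String stands in for the real SparkSession so this sketch is
// self-contained; in the web app, build() would instead call
// SparkSession.builder().master(...).enableHiveSupport().getOrCreate().
class SparkSessionProvider {
    private static int buildCount = 0;   // counts how many times build() ran

    private static String build() {
        buildCount++;                    // in the real app: the one builder call
        return "spark-session";
    }

    private static class Holder {
        static final String SESSION = build();
    }

    public static String get() {
        return Holder.SESSION;           // every caller shares the same instance
    }

    public static int builds() {
        return buildCount;
    }
}
```

All servlets then call SparkSessionProvider.get() instead of invoking the builder themselves, so the master/appName configuration is applied exactly once.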
Full log, part three:
Started o.s.j.s.ServletContextHandler@2b88ff{/metrics/json,null,AVAILABLE}
2017/02/09-15:02:46 >> INFO >> http-bio-8089-exec-3 >> org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54) >> SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.0
2017/02/09-15:02:46 >> WARN >> http-bio-8089-exec-3 >> org.apache.spark.internal.Logging$class.logWarning(Logging.scala:66) >> Use an existing SparkContext, some configuration may not take effect.
2017/02/09-15:02:46 >> INFO >> http-bio-8089-exec-3 >> org.spark_project.jetty.server.handler.ContextHandler.doStart(ContextHandler.java:744) >> Started o.s.j.s.ServletContextHandler@fdd67c{/SQL,null,AVAILABLE}
2017/02/09-15:02:46 >> INFO >> http-bio-8089-exec-3 >> org.spark_project.jetty.server.handler.ContextHandler.doStart(ContextHandler.java:744) >> Started o.s.j.s.ServletContextHandler@1802aef{/SQL/json,null,AVAILABLE}
2017/02/09-15:02:46 >> INFO >> http-bio-8089-exec-3 >> org.spark_project.jetty.server.handler.ContextHandler.doStart(ContextHandler.java:744) >> Started o.s.j.s.ServletContextHandler@8862e4{/SQL/execution,null,AVAILABLE}
2017/02/09-15:02:46 >> INFO >> http-bio-8089-exec-3 >> org.spark_project.jetty.server.handler.ContextHandler.doStart(ContextHandler.java:744) >> Started o.s.j.s.ServletContextHandler@14e1fb7{/SQL/execution/json,null,AVAILABLE}
2017/02/09-15:02:46 >> INFO >> http-bio-8089-exec-3 >> org.spark_project.jetty.server.handler.ContextHandler.doStart(ContextHandler.java:744) >> Started o.s.j.s.ServletContextHandler@36b9bf{/static/sql,null,AVAILABLE}
2017/02/09-15:02:46 >> INFO >> http-bio-8089-exec-3 >> org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54) >> Warehouse path is 'C:/Program Files/Apache Software Foundation/Tomcat 7.0/spark-warehouse'.
2017/02/09-15:02:46 >> INFO >> http-bio-8089-exec-3 >> org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54) >> Parsing command: SELECT * FROM test 2017/02/09-15:02:47 >> INFO >> dispatcher-event-loop-0 >> org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54) >> Registered executor NettyRpcEndpointRef(null) (10.2.4.35:53691) with ID 9 2017/02/09-15:02:47 >> INFO >> dispatcher-event-loop-0 >> org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54) >> Registered executor NettyRpcEndpointRef(null) (10.2.4.35:53692) with ID 6 2017/02/09-15:02:47 >> INFO >> dispatcher-event-loop-0 >> org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54) >> Registered executor NettyRpcEndpointRef(null) (10.2.4.36:49807) with ID 7 2017/02/09-15:02:47 >> INFO >> dispatcher-event-loop-0 >> org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54) >> Registered executor NettyRpcEndpointRef(null) (10.2.4.39:34555) with ID 8 2017/02/09-15:02:47 >> INFO >> dispatcher-event-loop-1 >> org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54) >> Registering block manager 10.2.4.35:42837 with 434.4 MB RAM, BlockManagerId(9, 10.2.4.35, 42837) 2017/02/09-15:02:47 >> INFO >> dispatcher-event-loop-0 >> org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54) >> Registered executor NettyRpcEndpointRef(null) (10.2.4.36:49806) with ID 2 2017/02/09-15:02:47 >> INFO >> dispatcher-event-loop-0 >> org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54) >> Registered executor NettyRpcEndpointRef(null) (10.2.4.37:41183) with ID 1 2017/02/09-15:02:47 >> INFO >> dispatcher-event-loop-1 >> org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54) >> Registering block manager 10.2.4.35:42299 with 434.4 MB RAM, BlockManagerId(6, 10.2.4.35, 42299) 2017/02/09-15:02:47 >> INFO >> dispatcher-event-loop-1 >> org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54) >> Registering block manager 10.2.4.36:49751 with 434.4 MB 
RAM, BlockManagerId(7, 10.2.4.36, 49751) 2017/02/09-15:02:47 >> INFO >> dispatcher-event-loop-1 >> org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54) >> Registering block manager 10.2.4.39:9221 with 434.4 MB RAM, BlockManagerId(8, 10.2.4.39, 9221) 2017/02/09-15:02:47 >> INFO >> dispatcher-event-loop-1 >> org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54) >> Registered executor NettyRpcEndpointRef(null) (10.2.4.39:34556) with ID 5 2017/02/09-15:02:47 >> INFO >> dispatcher-event-loop-1 >> org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54) >> Registering block manager 10.2.4.36:49592 with 434.4 MB RAM, BlockManagerId(2, 10.2.4.36, 49592) 2017/02/09-15:02:47 >> INFO >> dispatcher-event-loop-0 >> org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54) >> Registering block manager 10.2.4.37:33054 with 434.4 MB RAM, BlockManagerId(1, 10.2.4.37, 33054) 2017/02/09-15:02:47 >> INFO >> dispatcher-event-loop-1 >> org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54) >> Registered executor NettyRpcEndpointRef(null) (10.2.4.37:41184) with ID 4 2017/02/09-15:02:47 >> INFO >> dispatcher-event-loop-1 >> org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54) >> Registering block manager 10.2.4.37:52929 with 434.4 MB RAM, BlockManagerId(4, 10.2.4.37, 52929) 2017/02/09-15:02:47 >> INFO >> dispatcher-event-loop-1 >> org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54) >> Registering block manager 10.2.4.39:58051 with 434.4 MB RAM, BlockManagerId(5, 10.2.4.39, 58051) 2017/02/09-15:02:47 >> INFO >> dispatcher-event-loop-0 >> org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54) >> Registered executor NettyRpcEndpointRef(null) (10.2.4.38:34818) with ID 0 2017/02/09-15:02:47 >> INFO >> dispatcher-event-loop-1 >> org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54) >> Registering block manager 10.2.4.38:60533 with 434.4 MB RAM, BlockManagerId(0, 10.2.4.38, 60533) 
2017/02/09-15:02:47 >> INFO >> dispatcher-event-loop-0 >> org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54) >> Registered executor NettyRpcEndpointRef(null) (10.2.4.38:34819) with ID 3 2017/02/09-15:02:47 >> INFO >> dispatcher-event-loop-0 >> org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54) >> Registering block manager 10.2.4.38:47643 with 434.4 MB RAM, BlockManagerId(3, 10.2.4.38, 47643) 2017/02/09-15:02:47 >> INFO >> http-bio-8089-exec-3 >> org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54) >> Initializing HiveMetastoreConnection version 1.2.1 using Spark classes. 2017/02/09-15:02:50 >> INFO >> http-bio-8089-exec-3 >> org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:589) >> 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore 2017/02/09-15:02:50 >> INFO >> http-bio-8089-exec-3 >> org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:289) >> ObjectStore, initialize called 2017/02/09-15:02:50 >> INFO >> http-bio-8089-exec-3 >> org.datanucleus.util.Log4JLogger.info(Log4JLogger.java:77) >> Property datanucleus.cache.level2 unknown - will be ignored 2017/02/09-15:02:50 >> INFO >> http-bio-8089-exec-3 >> org.datanucleus.util.Log4JLogger.info(Log4JLogger.java:77) >> Property hive.metastore.integral.jdo.pushdown unknown - will be ignored 2017/02/09-15:02:50 >> WARN >> http-bio-8089-exec-3 >> org.datanucleus.util.Log4JLogger.warn(Log4JLogger.java:96) >> BoneCP specified but not present in CLASSPATH (or one of dependencies) 2017/02/09-15:02:51 >> WARN >> http-bio-8089-exec-3 >> org.datanucleus.util.Log4JLogger.warn(Log4JLogger.java:96) >> BoneCP specified but not present in CLASSPATH (or one of dependencies) 2017/02/09-15:02:54 >> INFO >> Timer-1 >> com.neusoft.udolink.logger.Log4jLogger.info(Log4jLogger.java:86) >> [UDOLink IMPORTANT NOTICE] : select * from UP_LOGIN_ACCOUNTLIST , [DataSource=UNIEAP] 2017/02/09-15:02:54 >> INFO >> 
http-bio-8089-exec-3 >> org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:370) >> Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order" 2017/02/09-15:02:57 >> INFO >> http-bio-8089-exec-3 >> org.datanucleus.util.Log4JLogger.info(Log4JLogger.java:77) >> The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table. 2017/02/09-15:02:57 >> INFO >> http-bio-8089-exec-3 >> org.datanucleus.util.Log4JLogger.info(Log4JLogger.java:77) >> The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table. 2017/02/09-15:02:58 >> INFO >> http-bio-8089-exec-3 >> org.datanucleus.util.Log4JLogger.info(Log4JLogger.java:77) >> The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table. 2017/02/09-15:02:58 >> INFO >> http-bio-8089-exec-3 >> org.datanucleus.util.Log4JLogger.info(Log4JLogger.java:77) >> The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table. 2017/02/09-15:02:58 >> INFO >> http-bio-8089-exec-3 >> org.datanucleus.util.Log4JLogger.info(Log4JLogger.java:77) >> Reading in results for query "org.datanucleus.store.rdbms.query.SQLQuery@0" since the connection used is closing 2017/02/09-15:02:58 >> INFO >> http-bio-8089-exec-3 >> org.apache.hadoop.hive.metastore.MetaStoreDirectSql.<init>(MetaStoreDirectSql.java:139) >> Using direct SQL, underlying DB is DERBY 2017/02/09-15:02:58 >> INFO >> http-bio-8089-exec-3 >> org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:272) >> Initialized ObjectStore
Successfully started service 'SparkUI' on port 4040. 2017/02/09-15:02:44 >> INFO >> http-bio-8089-exec-3 >> org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54) >> Bound SparkUI to 0.0.0.0, and started at http://10.2.1.138:4040 2017/02/09-15:02:44 >> INFO >> appclient-register-master-threadpool-0 >> org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54) >> Connecting to master spark://10.2.4.41:7077... 2017/02/09-15:02:44 >> INFO >> netty-rpc-connection-0 >> org.apache.spark.network.client.TransportClientFactory.createClient(TransportClientFactory.java:250) >> Successfully created connection to /10.2.4.41:7077 after 94 ms (0 ms spent in bootstraps) 2017/02/09-15:02:45 >> INFO >> dispatcher-event-loop-1 >> org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54) >> Connected to Spark cluster with app ID app-20170209145946-0008 2017/02/09-15:02:45 >> INFO >> dispatcher-event-loop-0 >> org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54) >> Executor added: app-20170209145946-0008/0 on worker-20170209095106-10.2.4.38-52409 (10.2.4.38:52409) with 2 cores 2017/02/09-15:02:45 >> INFO >> dispatcher-event-loop-0 >> org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54) >> Granted executor ID app-20170209145946-0008/0 on hostPort 10.2.4.38:52409 with 2 cores, 1024.0 MB RAM 2017/02/09-15:02:45 >> INFO >> dispatcher-event-loop-0 >> org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54) >> Executor added: app-20170209145946-0008/1 on worker-20170209095109-10.2.4.37-57598 (10.2.4.37:57598) with 2 cores 2017/02/09-15:02:45 >> INFO >> dispatcher-event-loop-0 >> org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54) >> Granted executor ID app-20170209145946-0008/1 on hostPort 10.2.4.37:57598 with 2 cores, 1024.0 MB RAM 2017/02/09-15:02:45 >> INFO >> dispatcher-event-loop-0 >> org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54) >> Executor added: app-20170209145946-0008/2 on 
worker-20170209095106-10.2.4.36-51672 (10.2.4.36:51672) with 2 cores 2017/02/09-15:02:45 >> INFO >> dispatcher-event-loop-0 >> org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54) >> Granted executor ID app-20170209145946-0008/2 on hostPort 10.2.4.36:51672 with 2 cores, 1024.0 MB RAM 2017/02/09-15:02:45 >> INFO >> dispatcher-event-loop-0 >> org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54) >> Executor added: app-20170209145946-0008/3 on worker-20170209095109-10.2.4.38-55955 (10.2.4.38:55955) with 2 cores 2017/02/09-15:02:45 >> INFO >> dispatcher-event-loop-0 >> org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54) >> Granted executor ID app-20170209145946-0008/3 on hostPort 10.2.4.38:55955 with 2 cores, 1024.0 MB RAM 2017/02/09-15:02:45 >> INFO >> dispatcher-event-loop-0 >> org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54) >> Executor added: app-20170209145946-0008/4 on worker-20170209095106-10.2.4.37-41627 (10.2.4.37:41627) with 2 cores 2017/02/09-15:02:45 >> INFO >> dispatcher-event-loop-0 >> org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54) >> Granted executor ID app-20170209145946-0008/4 on hostPort 10.2.4.37:41627 with 2 cores, 1024.0 MB RAM 2017/02/09-15:02:45 >> INFO >> dispatcher-event-loop-0 >> org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54) >> Executor added: app-20170209145946-0008/5 on worker-20170209095106-10.2.4.39-50470 (10.2.4.39:50470) with 2 cores 2017/02/09-15:02:45 >> INFO >> dispatcher-event-loop-0 >> org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54) >> Granted executor ID app-20170209145946-0008/5 on hostPort 10.2.4.39:50470 with 2 cores, 1024.0 MB RAM 2017/02/09-15:02:45 >> INFO >> dispatcher-event-loop-0 >> org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54) >> Executor added: app-20170209145946-0008/6 on worker-20170209095108-10.2.4.35-51854 (10.2.4.35:51854) with 2 cores 2017/02/09-15:02:45 >> INFO >> 
dispatcher-event-loop-0 >> org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54) >> Granted executor ID app-20170209145946-0008/6 on hostPort 10.2.4.35:51854 with 2 cores, 1024.0 MB RAM 2017/02/09-15:02:45 >> INFO >> dispatcher-event-loop-0 >> org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54) >> Executor added: app-20170209145946-0008/7 on worker-20170209095108-10.2.4.36-60394 (10.2.4.36:60394) with 2 cores 2017/02/09-15:02:45 >> INFO >> dispatcher-event-loop-0 >> org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54) >> Granted executor ID app-20170209145946-0008/7 on hostPort 10.2.4.36:60394 with 2 cores, 1024.0 MB RAM 2017/02/09-15:02:45 >> INFO >> dispatcher-event-loop-0 >> org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54) >> Executor added: app-20170209145946-0008/8 on worker-20170209095109-10.2.4.39-55515 (10.2.4.39:55515) with 2 cores 2017/02/09-15:02:45 >> INFO >> dispatcher-event-loop-0 >> org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54) >> Granted executor ID app-20170209145946-0008/8 on hostPort 10.2.4.39:55515 with 2 cores, 1024.0 MB RAM 2017/02/09-15:02:45 >> INFO >> dispatcher-event-loop-0 >> org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54) >> Executor added: app-20170209145946-0008/9 on worker-20170209095106-10.2.4.35-48440 (10.2.4.35:48440) with 2 cores 2017/02/09-15:02:45 >> INFO >> dispatcher-event-loop-0 >> org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54) >> Granted executor ID app-20170209145946-0008/9 on hostPort 10.2.4.35:48440 with 2 cores, 1024.0 MB RAM 2017/02/09-15:02:45 >> INFO >> dispatcher-event-loop-1 >> org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54) >> Executor updated: app-20170209145946-0008/2 is now RUNNING 2017/02/09-15:02:45 >> INFO >> dispatcher-event-loop-1 >> org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54) >> Executor updated: app-20170209145946-0008/1 is now RUNNING 2017/02/09-15:02:45 
>> INFO >> dispatcher-event-loop-1 >> org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54) >> Executor updated: app-20170209145946-0008/0 is now RUNNING 2017/02/09-15:02:45 >> INFO >> dispatcher-event-loop-1 >> org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54) >> Executor updated: app-20170209145946-0008/4 is now RUNNING 2017/02/09-15:02:45 >> INFO >> dispatcher-event-loop-1 >> org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54) >> Executor updated: app-20170209145946-0008/3 is now RUNNING 2017/02/09-15:02:45 >> INFO >> dispatcher-event-loop-1 >> org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54) >> Executor updated: app-20170209145946-0008/7 is now RUNNING 2017/02/09-15:02:45 >> INFO >> dispatcher-event-loop-1 >> org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54) >> Executor updated: app-20170209145946-0008/6 is now RUNNING 2017/02/09-15:02:45 >> INFO >> dispatcher-event-loop-1 >> org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54) >> Executor updated: app-20170209145946-0008/9 is now RUNNING 2017/02/09-15:02:45 >> INFO >> dispatcher-event-loop-1 >> org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54) >> Executor updated: app-20170209145946-0008/5 is now RUNNING 2017/02/09-15:02:45 >> INFO >> dispatcher-event-loop-1 >> org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54) >> Executor updated: app-20170209145946-0008/8 is now RUNNING 2017/02/09-15:02:45 >> INFO >> http-bio-8089-exec-3 >> org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54) >> Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 46897. 
2017/02/09-15:02:45 >> INFO >> http-bio-8089-exec-3 >> org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54) >> Server created on 10.2.1.138:46897 2017/02/09-15:02:45 >> INFO >> http-bio-8089-exec-3 >> org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54) >> Registering BlockManager BlockManagerId(driver, 10.2.1.138, 46897) 2017/02/09-15:02:45 >> INFO >> dispatcher-event-loop-0 >> org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54) >> Registering block manager 10.2.1.138:46897 with 434.4 MB RAM, BlockManagerId(driver, 10.2.1.138, 46897) 2017/02/09-15:02:45 >> INFO >> http-bio-8089-exec-3 >> org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54) >> Registered BlockManager BlockManagerId(driver, 10.2.1.138, 46897) 2017/02/09-15:02:45 >> INFO >> http-bio-8089-exec-3 >> org.spark_project.jetty.server.handler.ContextHandler.doStart(ContextHandler.java:744) >>
Full log, part one:
Running Spark version 2.0.1
2017/02/09-15:02:40 >> WARN >> http-bio-8089-exec-3 >> org.apache.hadoop.util.NativeCodeLoader.<clinit>(NativeCodeLoader.java:62) >> Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2017/02/09-15:02:40 >> INFO >> http-bio-8089-exec-3 >> org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54) >> Changing view acls to: user
2017/02/09-15:02:40 >> INFO >> http-bio-8089-exec-3 >> org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54) >> Changing modify acls to: user
2017/02/09-15:02:40 >> INFO >> http-bio-8089-exec-3 >> org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54) >> Changing view acls groups to:
2017/02/09-15:02:40 >> INFO >> http-bio-8089-exec-3 >> org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54) >> Changing modify acls groups to:
2017/02/09-15:02:40 >> INFO >> http-bio-8089-exec-3 >> org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54) >> SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(user); groups with view permissions: Set(); users with modify permissions: Set(user); groups with modify permissions: Set()
2017/02/09-15:02:41 >> INFO >> http-bio-8089-exec-3 >> org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54) >> Successfully started service 'sparkDriver' on port 46885.
2017/02/09-15:02:42 >> INFO >> http-bio-8089-exec-3 >> org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54) >> Registering MapOutputTracker 2017/02/09-15:02:42 >> INFO >> http-bio-8089-exec-3 >> org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54) >> Registering BlockManagerMaster 2017/02/09-15:02:42 >> INFO >> http-bio-8089-exec-3 >> org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54) >> Created local directory at C:\Program Files\Apache Software Foundation\Tomcat 7.0\temp\blockmgr-626ddf6b-83dd-4340-9bf3-f99fea0e46d9 2017/02/09-15:02:42 >> INFO >> http-bio-8089-exec-3 >> org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54) >> MemoryStore started with capacity 434.4 MB 2017/02/09-15:02:42 >> INFO >> http-bio-8089-exec-3 >> org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54) >> Registering OutputCommitCoordinator 2017/02/09-15:02:42 >> INFO >> http-bio-8089-exec-3 >> org.spark_project.jetty.util.log.Log.initialized(Log.java:186) >> Logging initialized @263481ms 2017/02/09-15:02:43 >> INFO >> http-bio-8089-exec-3 >> org.spark_project.jetty.server.Server.doStart(Server.java:327) >> jetty-9.2.z-SNAPSHOT 2017/02/09-15:02:43 >> INFO >> http-bio-8089-exec-3 >> org.spark_project.jetty.server.handler.ContextHandler.doStart(ContextHandler.java:744) >> Started o.s.j.s.ServletContextHandler@b94c2d{/jobs,null,AVAILABLE} 2017/02/09-15:02:43 >> INFO >> http-bio-8089-exec-3 >> org.spark_project.jetty.server.handler.ContextHandler.doStart(ContextHandler.java:744) >> Started o.s.j.s.ServletContextHandler@1383a6c{/jobs/json,null,AVAILABLE} 2017/02/09-15:02:43 >> INFO >> http-bio-8089-exec-3 >> org.spark_project.jetty.server.handler.ContextHandler.doStart(ContextHandler.java:744) >> Started o.s.j.s.ServletContextHandler@83a8e1{/jobs/job,null,AVAILABLE} 2017/02/09-15:02:43 >> INFO >> http-bio-8089-exec-3 >> org.spark_project.jetty.server.handler.ContextHandler.doStart(ContextHandler.java:744) >> Started 
o.s.j.s.ServletContextHandler@1c44e17{/jobs/job/json,null,AVAILABLE} 2017/02/09-15:02:43 >> INFO >> http-bio-8089-exec-3 >> org.spark_project.jetty.server.handler.ContextHandler.doStart(ContextHandler.java:744) >> Started o.s.j.s.ServletContextHandler@e2f91f{/stages,null,AVAILABLE} 2017/02/09-15:02:43 >> INFO >> http-bio-8089-exec-3 >> org.spark_project.jetty.server.handler.ContextHandler.doStart(ContextHandler.java:744) >> Started o.s.j.s.ServletContextHandler@1518bb5{/stages/json,null,AVAILABLE} 2017/02/09-15:02:43 >> INFO >> http-bio-8089-exec-3 >> org.spark_project.jetty.server.handler.ContextHandler.doStart(ContextHandler.java:744) >> Started o.s.j.s.ServletContextHandler@ab4c0a{/stages/stage,null,AVAILABLE} 2017/02/09-15:02:43 >> INFO >> http-bio-8089-exec-3 >> org.spark_project.jetty.server.handler.ContextHandler.doStart(ContextHandler.java:744) >> Started o.s.j.s.ServletContextHandler@d4bd9{/stages/stage/json,null,AVAILABLE} 2017/02/09-15:02:43 >> INFO >> http-bio-8089-exec-3 >> org.spark_project.jetty.server.handler.ContextHandler.doStart(ContextHandler.java:744) >> Started o.s.j.s.ServletContextHandler@ee949c{/stages/pool,null,AVAILABLE} 2017/02/09-15:02:43 >> INFO >> http-bio-8089-exec-3 >> org.spark_project.jetty.server.handler.ContextHandler.doStart(ContextHandler.java:744) >> Started o.s.j.s.ServletContextHandler@15ea9f0{/stages/pool/json,null,AVAILABLE} 2017/02/09-15:02:43 >> INFO >> http-bio-8089-exec-3 >> org.spark_project.jetty.server.handler.ContextHandler.doStart(ContextHandler.java:744) >> Started o.s.j.s.ServletContextHandler@1e7019b{/storage,null,AVAILABLE} 2017/02/09-15:02:43 >> INFO >> http-bio-8089-exec-3 >> org.spark_project.jetty.server.handler.ContextHandler.doStart(ContextHandler.java:744) >> Started o.s.j.s.ServletContextHandler@11a8af7{/storage/json,null,AVAILABLE} 2017/02/09-15:02:43 >> INFO >> http-bio-8089-exec-3 >> org.spark_project.jetty.server.handler.ContextHandler.doStart(ContextHandler.java:744) >> Started 
o.s.j.s.ServletContextHandler@1997f85{/storage/rdd,null,AVAILABLE} 2017/02/09-15:02:43 >> INFO >> http-bio-8089-exec-3 >> org.spark_project.jetty.server.handler.ContextHandler.doStart(ContextHandler.java:744) >> Started o.s.j.s.ServletContextHandler@831040{/storage/rdd/json,null,AVAILABLE} 2017/02/09-15:02:43 >> INFO >> http-bio-8089-exec-3 >> org.spark_project.jetty.server.handler.ContextHandler.doStart(ContextHandler.java:744) >> Started o.s.j.s.ServletContextHandler@a01375{/environment,null,AVAILABLE} 2017/02/09-15:02:43 >> INFO >> http-bio-8089-exec-3 >> org.spark_project.jetty.server.handler.ContextHandler.doStart(ContextHandler.java:744) >> Started o.s.j.s.ServletContextHandler@15d7d36{/environment/json,null,AVAILABLE} 2017/02/09-15:02:43 >> INFO >> http-bio-8089-exec-3 >> org.spark_project.jetty.server.handler.ContextHandler.doStart(ContextHandler.java:744) >> Started o.s.j.s.ServletContextHandler@c785c0{/executors,null,AVAILABLE} 2017/02/09-15:02:43 >> INFO >> http-bio-8089-exec-3 >> org.spark_project.jetty.server.handler.ContextHandler.doStart(ContextHandler.java:744) >> Started o.s.j.s.ServletContextHandler@11e31e3{/executors/json,null,AVAILABLE} 2017/02/09-15:02:43 >> INFO >> http-bio-8089-exec-3 >> org.spark_project.jetty.server.handler.ContextHandler.doStart(ContextHandler.java:744) >> Started o.s.j.s.ServletContextHandler@15d549c{/executors/threadDump,null,AVAILABLE} 2017/02/09-15:02:43 >> INFO >> http-bio-8089-exec-3 >> org.spark_project.jetty.server.handler.ContextHandler.doStart(ContextHandler.java:744) >> Started o.s.j.s.ServletContextHandler@5de401{/executors/threadDump/json,null,AVAILABLE} 2017/02/09-15:02:43 >> INFO >> http-bio-8089-exec-3 >> org.spark_project.jetty.server.handler.ContextHandler.doStart(ContextHandler.java:744) >> Started o.s.j.s.ServletContextHandler@243d70{/static,null,AVAILABLE} 2017/02/09-15:02:43 >> INFO >> http-bio-8089-exec-3 >> org.spark_project.jetty.server.handler.ContextHandler.doStart(ContextHandler.java:744) >> 
Started o.s.j.s.ServletContextHandler@13dc1af{/,null,AVAILABLE} 2017/02/09-15:02:43 >> INFO >> http-bio-8089-exec-3 >> org.spark_project.jetty.server.handler.ContextHandler.doStart(ContextHandler.java:744) >> Started o.s.j.s.ServletContextHandler@176d207{/api,null,AVAILABLE} 2017/02/09-15:02:43 >> INFO >> http-bio-8089-exec-3 >> org.spark_project.jetty.server.handler.ContextHandler.doStart(ContextHandler.java:744) >> Started o.s.j.s.ServletContextHandler@1d6e810{/stages/stage/kill,null,AVAILABLE} 2017/02/09-15:02:44 >> INFO >> http-bio-8089-exec-3 >> org.spark_project.jetty.server.AbstractConnector.doStart(AbstractConnector.java:266) >> Started ServerConnector@a89d1{HTTP/1.1}{0.0.0.0:4040} 2017/02/09-15:02:44 >> INFO >> http-bio-8089-exec-3 >> org.spark_project.jetty.server.Server.doStart(Server.java:379) >> Started @264516ms
