Running a Spark program locally fails with ERROR SparkContext:91 - Error initializing SparkContext

lzw2016 2019-02-03 12:26:52
I've just started with Spark and have verified that the environment is set up correctly. Scala is 2.11.12 and Spark is 2.4. I installed the Scala plugin in Eclipse, created a Scala project, imported the local Scala jars plus all jars under Spark's jars directory, and ran the following WordCount program:
import org.apache.spark.SparkConf
import org.apache.spark.SparkContext
import org.apache.spark.rdd.RDD

object HelloWorld {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setMaster("local").setAppName("WC")
    val sc = new SparkContext(conf)
    val rdd = sc.textFile("hello.txt", 1)
    rdd.flatMap(_.split(" ")).map((_, 1)).reduceByKey(_ + _).foreach(println)
  }
}


The error output is:

2019-02-03 00:05:46 WARN Utils:66 - Your hostname, josonlee-PC resolves to a loopback address: 127.0.1.1; using 192.168.0.106 instead (on interface wlp2s0)
2019-02-03 00:05:46 WARN Utils:66 - Set SPARK_LOCAL_IP if you need to bind to another address
2019-02-03 00:05:46 INFO SparkContext:54 - Running Spark version 2.4.0
2019-02-03 00:05:46 WARN NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2019-02-03 00:05:46 INFO SparkContext:54 - Submitted application: WC
2019-02-03 00:05:46 INFO SecurityManager:54 - Changing view acls to: josonlee
2019-02-03 00:05:46 INFO SecurityManager:54 - Changing modify acls to: josonlee
2019-02-03 00:05:46 INFO SecurityManager:54 - Changing view acls groups to:
2019-02-03 00:05:46 INFO SecurityManager:54 - Changing modify acls groups to:
2019-02-03 00:05:46 INFO SecurityManager:54 - SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(josonlee); groups with view permissions: Set(); users with modify permissions: Set(josonlee); groups with modify permissions: Set()
2019-02-03 00:05:47 INFO Utils:54 - Successfully started service 'sparkDriver' on port 40621.
2019-02-03 00:05:47 INFO SparkEnv:54 - Registering MapOutputTracker
2019-02-03 00:05:47 INFO SparkEnv:54 - Registering BlockManagerMaster
2019-02-03 00:05:47 INFO BlockManagerMasterEndpoint:54 - Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
2019-02-03 00:05:47 INFO BlockManagerMasterEndpoint:54 - BlockManagerMasterEndpoint up
2019-02-03 00:05:47 INFO DiskBlockManager:54 - Created local directory at /tmp/blockmgr-c8dd57f7-cf15-4823-a44e-b9a17fa86443
2019-02-03 00:05:47 INFO MemoryStore:54 - MemoryStore started with capacity 1403.1 MB
2019-02-03 00:05:47 ERROR MetricsConfig:91 - Error loading configuration file metrics.properties
java.lang.NullPointerException
at org.apache.spark.metrics.MetricsConfig.loadPropertiesFromFile(MetricsConfig.scala:133)
at org.apache.spark.metrics.MetricsConfig.initialize(MetricsConfig.scala:55)
at org.apache.spark.metrics.MetricsSystem.<init>(MetricsSystem.scala:95)
at org.apache.spark.metrics.MetricsSystem$.createMetricsSystem(MetricsSystem.scala:233)
at org.apache.spark.SparkEnv$.create(SparkEnv.scala:357)
at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:175)
at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:257)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:424)
at HelloWorld$.main(HelloWorld.scala:9)
at HelloWorld.main(HelloWorld.scala)
2019-02-03 00:05:47 INFO SparkEnv:54 - Registering OutputCommitCoordinator
2019-02-03 00:05:47 INFO log:192 - Logging initialized @1049ms
2019-02-03 00:05:47 ERROR SparkContext:91 - Error initializing SparkContext.
java.lang.NullPointerException
at org.apache.spark.ui.JettyUtils$.createStaticHandler(JettyUtils.scala:193)
at org.apache.spark.ui.WebUI.addStaticHandler(WebUI.scala:121)
at org.apache.spark.ui.SparkUI.initialize(SparkUI.scala:68)
at org.apache.spark.ui.SparkUI.<init>(SparkUI.scala:80)
at org.apache.spark.ui.SparkUI$.create(SparkUI.scala:175)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:444)
at HelloWorld$.main(HelloWorld.scala:9)
at HelloWorld.main(HelloWorld.scala)
2019-02-03 00:05:47 INFO MapOutputTrackerMasterEndpoint:54 - MapOutputTrackerMasterEndpoint stopped!
2019-02-03 00:05:47 INFO MemoryStore:54 - MemoryStore cleared
2019-02-03 00:05:47 INFO BlockManager:54 - BlockManager stopped
2019-02-03 00:05:47 INFO BlockManagerMaster:54 - BlockManagerMaster stopped
2019-02-03 00:05:47 WARN MetricsSystem:66 - Stopping a MetricsSystem that is not running
2019-02-03 00:05:47 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint:54 - OutputCommitCoordinator stopped!
2019-02-03 00:05:47 INFO SparkContext:54 - Successfully stopped SparkContext
Exception in thread "main" java.lang.NullPointerException
at org.apache.spark.ui.JettyUtils$.createStaticHandler(JettyUtils.scala:193)
at org.apache.spark.ui.WebUI.addStaticHandler(WebUI.scala:121)
at org.apache.spark.ui.SparkUI.initialize(SparkUI.scala:68)
at org.apache.spark.ui.SparkUI.<init>(SparkUI.scala:80)
at org.apache.spark.ui.SparkUI$.create(SparkUI.scala:175)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:444)
at HelloWorld$.main(HelloWorld.scala:9)
at HelloWorld.main(HelloWorld.scala)
2019-02-03 00:05:47 INFO ShutdownHookManager:54 - Shutdown hook called
2019-02-03 00:05:47 INFO ShutdownHookManager:54 - Deleting directory /tmp/spark-ce3ab442-0846-4c61-84f0-01b48c2556f9


Has anyone run into this before? I'm sure the environment itself is fine — I wrote a Java WordCount that runs locally without any problem.
野男孩 2019-02-03
From the error message, loading metrics.properties failed during initialization. Check whether that file exists under the conf directory; if it doesn't, see whether there is a metrics.properties.template there and make a copy of it.
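The suggestion above can be sketched as a small script. This is only an illustrative sketch, assuming SPARK_HOME points at the Spark installation your IDE runs against (the /opt/spark fallback below is a placeholder, not from the thread):

```shell
# Restore conf/metrics.properties from the template Spark ships with,
# if the real config file is missing.
restore_metrics_conf() {
  conf_dir="$1/conf"
  if [ ! -f "$conf_dir/metrics.properties" ] \
     && [ -f "$conf_dir/metrics.properties.template" ]; then
    cp "$conf_dir/metrics.properties.template" "$conf_dir/metrics.properties"
  fi
}

# Adjust the fallback path to your actual install location.
restore_metrics_conf "${SPARK_HOME:-/opt/spark}"
```

If the file already exists, the function leaves it untouched, so the script is safe to re-run.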
