Error when calling a Spark program

ztFanTuan 2014-05-15 05:14:57
10.0.15.104 is the master, and 10.0.15.105 and 10.0.15.106 are the workers.

The slaves file is set to: 10.0.15.105 and 10.0.15.106

spark-env.sh is set to:

export SCALA_HOME=/app/spark/scala-2.10.3
export HADOOP_HOME=/app/hadoop-2.2.0
export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop
SPARK_WORKER_INSTANCES=3
SPARK_MASTER_PORT=8081
SPARK_MASTER_WEBUI_PORT=8090
SPARK_WORKER_PORT=8091
SPARK_MASTER_IP=10.0.15.104
SPARK_WORKER_DIR=/app/spark/spark-0.9.1-bin-hadoop2/worker

But when I create a JavaSparkContext as shown below, it always fails with an error.
String master = "spark://10.0.15.104:8081";
String sparkHome = "/app/spark/spark-0.9.1-bin-hadoop2";
String appName = "JavaWordCount";
String[] jarArray = JavaSparkContext.jarOfClass(WordCount.class);
JavaSparkContext ctx = new JavaSparkContext(master, appName, sparkHome, jarArray);

It keeps failing with the error below. I have no idea where the address /220.250.64.18:0 comes from; I never configured it anywhere.
Hoping someone can point me in the right direction.

Exception in thread "main" org.jboss.netty.channel.ChannelException: Failed to bind to: /220.250.64.18:0
at org.jboss.netty.bootstrap.ServerBootstrap.bind(ServerBootstrap.java:272)
at akka.remote.transport.netty.NettyTransport$$anonfun$listen$1.apply(NettyTransport.scala:391)
at akka.remote.transport.netty.NettyTransport$$anonfun$listen$1.apply(NettyTransport.scala:388)
at scala.util.Success$$anonfun$map$1.apply(Try.scala:206)
at scala.util.Try$.apply(Try.scala:161)
at scala.util.Success.map(Try.scala:206)
at scala.concurrent.Future$$anonfun$map$1.apply(Future.scala:235)
at scala.concurrent.Future$$anonfun$map$1.apply(Future.scala:235)
at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:32)
at akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.processBatch$1(BatchingExecutor.scala:67)
at akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.apply$mcV$sp(BatchingExecutor.scala:82)
at akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.apply(BatchingExecutor.scala:59)
at akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.apply(BatchingExecutor.scala:59)
at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:72)
at akka.dispatch.BatchingExecutor$Batch.run(BatchingExecutor.scala:58)
at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:42)
at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:386)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.net.BindException: Cannot assign requested address
at sun.nio.ch.Net.bind(Native Method)
at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:124)
at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:59)
at org.jboss.netty.channel.socket.nio.NioServerBoss$RegisterTask.run(NioServerBoss.java:193)
at org.jboss.netty.channel.socket.nio.AbstractNioSelector.processTaskQueue(AbstractNioSelector.java:366)
at org.jboss.netty.channel.socket.nio.AbstractNioSelector.run(AbstractNioSelector.java:290)
at org.jboss.netty.channel.socket.nio.NioServerBoss.run(NioServerBoss.java:42)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:895)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:918)
at java.lang.Thread.run(Thread.java:662)
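
The bind failure above happens on the driver side: in Spark 0.9.x the driver's Akka endpoint binds to whatever address the local hostname resolves to unless SPARK_LOCAL_IP or spark.driver.host is set, so a stray hostname resolution (for example, an ISP DNS answer for a hostname missing from /etc/hosts) can surface as an address like 220.250.64.18 that was never configured anywhere. Below is a minimal sketch of pinning that address explicitly through SparkConf; it assumes the Spark 0.9.1 API, and the class name WordCountDriver and the placeholder 10.0.15.xxx are hypothetical and would be replaced with the real, reachable address of the machine running the driver.

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class WordCountDriver {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf()
                .setMaster("spark://10.0.15.104:8081")
                .setAppName("JavaWordCount")
                .setSparkHome("/app/spark/spark-0.9.1-bin-hadoop2")
                // WordCount is the application class from the question
                .setJars(JavaSparkContext.jarOfClass(WordCount.class))
                // Hypothetical placeholder: the address of the machine running this driver,
                // reachable from the master and the workers.
                .set("spark.driver.host", "10.0.15.xxx");
        JavaSparkContext ctx = new JavaSparkContext(conf);
        // ... build RDDs and run the word count here ...
        ctx.stop();
    }
}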
2 replies
ibigdatas 2014-08-01
I suggest connecting to the master by hostname rather than by IP; otherwise the connection may fail.
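
As a rough illustration of this suggestion: the spark:// URL passed to the driver generally has to match the host string the master itself registered with (the one shown at the top of the master's web UI), so you would use a hostname that resolves to 10.0.15.104 on the driver and on every node (for example via /etc/hosts). The name master104 below is hypothetical.

String master = "spark://master104:8081"; // hypothetical hostname resolving to 10.0.15.104 everywhere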
hawk2036 2014-06-11
export SPARK_MASTER_IP=localhost
export SPARK_LOCAL_IP=localhost
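
The localhost values only make sense when everything runs on one machine. Applied to this cluster, the same idea would be to point each node at its own reachable address in conf/spark-env.sh; a sketch based on the addresses from the question, for the master node (each worker would use its own IP instead):

export SPARK_MASTER_IP=10.0.15.104
export SPARK_LOCAL_IP=10.0.15.104

On the machine running the driver, SPARK_LOCAL_IP serves the same purpose as the spark.driver.host setting sketched after the stack trace above.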
