Running a WordCount job from Eclipse on Windows against a Hadoop cluster in VMs throws an exception. Looking for a fix.

coolbabybing 2014-02-27 08:40:01
The cluster consists of two virtual machines: one NameNode (NN) and one DataNode (DN).

The Windows account name is the same as the Linux account name.

Running the WordCount program in Eclipse throws the following exception:

Exception in thread "main" java.lang.NullPointerException
at java.lang.ProcessBuilder.start(ProcessBuilder.java:442)
at org.apache.hadoop.util.Shell.runCommand(Shell.java:404)
at org.apache.hadoop.util.Shell.run(Shell.java:379)
at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:589)
at org.apache.hadoop.util.Shell.execCommand(Shell.java:678)
at org.apache.hadoop.util.Shell.execCommand(Shell.java:661)
at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:639)
at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:435)
at org.apache.hadoop.fs.FilterFileSystem.mkdirs(FilterFileSystem.java:277)
at org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:125)
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:344)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1268)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1265)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1265)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1286)
at com.armslee.study.test.WordCount.main(WordCount.java:85)



This WordCount program is taken straight from the examples bundled with Hadoop 2.2.0; I only changed the package name. The code is as follows:


/**
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package com.armslee.study.test;

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.util.GenericOptionsParser;

public class WordCount {

  public static class TokenizerMapper
       extends Mapper<Object, Text, Text, IntWritable> {

    private final static IntWritable one = new IntWritable(1);
    private Text word = new Text();

    public void map(Object key, Text value, Context context
                    ) throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, one);
      }
    }
  }

  public static class IntSumReducer
       extends Reducer<Text,IntWritable,Text,IntWritable> {
    private IntWritable result = new IntWritable();

    public void reduce(Text key, Iterable<IntWritable> values,
                       Context context
                       ) throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    String[] otherArgs = new GenericOptionsParser(conf, args).getRemainingArgs();
    if (otherArgs.length != 2) {
      System.err.println("Usage: wordcount <in> <out>");
      System.exit(2);
    }
    @SuppressWarnings("deprecation")
    Job job = new Job(conf, "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class);
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(otherArgs[0]));
    FileOutputFormat.setOutputPath(job, new Path(otherArgs[1]));
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}



When I export the program as a jar and run it directly on Linux, there is no exception. Any help would be greatly appreciated!
shuizhongyue098 2014-06-02
OP, did you solve this? I have the same problem and have been stuck on it for days. Could you share the fix? Thanks a lot.
五哥 2014-05-30
Compile it, then upload the jar to the Linux Hadoop node and run it there.
mykingbull 2014-05-22
The winutils.exe file is missing from Hadoop/bin. I verified this myself: putting winutils.exe into Hadoop/bin solves the problem. Search for "win7 32 winutils" to download it.
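For anyone hitting this from an IDE, here is a minimal sketch of the workaround described above. On Windows, Hadoop's Shell utility looks up winutils.exe under the directory named by the hadoop.home.dir system property (or the HADOOP_HOME environment variable); if neither is set, the resolved path is null and the later ProcessBuilder.start call throws the NullPointerException shown in the stack trace. The C:\hadoop path below is an assumption; point it at whatever directory actually contains bin\winutils.exe.

```java
public class WinutilsWorkaround {
    public static void main(String[] args) {
        // Hypothetical path: must be the directory that CONTAINS bin\winutils.exe,
        // not the bin directory itself. Set it before any Hadoop class is touched.
        System.setProperty("hadoop.home.dir", "C:\\hadoop");

        // ...then create the Configuration and Job as in the WordCount main() above.
        System.out.println(System.getProperty("hadoop.home.dir"));
    }
}
```

Setting the property in code is equivalent to defining the HADOOP_HOME environment variable for the Eclipse launch configuration; either way the jar exported and run on Linux is unaffected.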
路远兮 2014-05-21
Any solution yet, OP?
狂热的土豆 2014-05-04
I stepped through the Hadoop source with breakpoints. Hadoop checks the operating system type, and on Windows it reads the winutils.exe file from the bin directory under hadoop_home. After downloading that file and putting it there, the NullPointerException went away, but other errors appeared.
wangjinyang_123 2014-04-16
Better to just switch to Linux for development; you won't run into so many inexplicable problems.
strikeg62 2014-04-15
Same problem here. One post said to add the *-site.xml files to the project path; I tried that and it didn't work. Looking for a fix too! +1
blooding79 2014-04-01
I ran into the same problem. Looking for a solution!
狂热的土豆 2014-03-28
Quoting reply #5 from lcltmac: "OP, did you solve it? I have the same problem. Does your hadoop location report any errors?"
A lot of people are hitting this problem. Has anyone solved it?
lcltmac 2014-03-18
OP, did you solve it? I have the same problem. Does your hadoop location report any errors?
coolbabybing 2014-02-27
Quoting reply #1 from tntzbzc: "OP, add the plugin. The way you are doing it, you are effectively running MR on Windows, and that will never work."
I did add the plugin:
hadoop-eclipse-plugin-2.2.0.jar
coolbabybing 2014-02-27
Quoting reply #1 from tntzbzc: "OP, add the plugin. The way you are doing it, you are effectively running MR on Windows, and that will never work."
I did add the plugin!
撸大湿 2014-02-27
OP, add the plugin. The way you are doing it, you are effectively running MR on Windows, and that will never work.
