Running WordCount on a Hadoop cluster from Eclipse throws a NullPointerException [Bounty: 40 points; thread closed by u012760284]

Submitting the wordcount program from Eclipse on Windows to a remote cluster fails at org.apache.hadoop.util.Shell.runCommand(Shell.java:545)
Remote Hadoop 2.7, local Windows 7. The program fails with: 2015-09-28 22:04:21,423 WARN  [main] util.NativeCodeLoader (NativeCodeLoader.java:(62)) - Unable to load native-hadoop library for your platform... using builtin-java
A Java application hangs at log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Started a jar with the java -jar command today and it just sat at that line. When a program hangs like this, some connection usually cannot be established. I noticed an h2 file had been created in the jar's directory and realized it was probably a database connection issue: fixing the database connection settings in the configuration file solved it...
A Java project hangs at: log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
A freshly assembled project starts with no error and no message, just hanging at log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info., and the logs show nothing useful. After some digging there are three usual causes: a database driver problem (a broken JDBC driver keeps the XML mappings from connecting to the database; if another program can connect, try changing the JDBC version)...
Notes on running the small wordCount program on a hadoop cluster
1. Write the program on Linux. WordCountMapper assigns the work: import java.io.IOException; import org.apache.hadoop.io.LongWritable; import org.apache.hadoop.io.Text; import org.apache.hadoop.mapreduce.Mapper; import org.apache.ha...
Configuring a three-machine hadoop cluster and running wordcount
1. Install three CentOS machines and configure the network so that all three can reach each other. This walkthrough uses VirtualBox: after installing CentOS 6.5 on the three VMs, give each machine two network adapters so it can talk to the host on the internal network and still reach the internet, one in NAT mode and one in host-only mode. 2. Change the hostname in /etc/sysconfig/network
The WordCount example on a Hadoop cluster
The basic WordCount example run on a Hadoop cluster
Configuring a Hadoop cluster + a WordCount example
Configure the Hadoop cluster. Set the environment variables in /etc/profile: export HADOOP_HOME=/bigData/hadoop-2.8.0 export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin 8. Configure hadoop-env.sh: 1> vim hadoop-env.sh 2> add to the file...
Hadoop cluster WordCount explained (2)
Hadoop WordCount explained (2): the source code and the WordCount processing flow. 1. Source code: package org.apache.hadoop.examples; import java.io.IOException; import java.util.StringTokenizer; import org.apache.hadoop.conf.Configuration; import org.ap...
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Ran into this warning out of nowhere. Creating a log4j.properties file under resources and pasting the following in fixes it: log4j.rootLogger=DEBUG,A1 log4j.appender.A1=org.apache.log4j.ConsoleAppender log4j.appender.A1.layout=org.apache.log4j.PatternLayout log4j.app...
A Java project starts without errors but hangs at log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Fix: put a log4j.properties file in the -web or -service module of the project being started (or both) and the actual error will be printed. File contents: log4j.rootLogger=DEBUG,A1 log4j.appender.A1=org.apache.log4j.ConsoleAppender log4j.appender.A1.layout=org.apache.log4j.PatternLa...
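A minimal programmatic alternative (an assumption on my part, not from the posts above): log4j 1.x can also be configured in code with BasicConfigurator, which is handy for quick debugging when no log4j.properties is on the classpath yet.

[code=java]
import org.apache.log4j.BasicConfigurator;
import org.apache.log4j.Level;
import org.apache.log4j.Logger;

// Hedged sketch: installs a console appender in code so the "no appenders" warning goes
// away and the real startup error becomes visible; a log4j.properties on the classpath
// remains the usual long-term fix.
public class QuickLog4jSetup {
    public static void main(String[] args) {
        BasicConfigurator.configure();                 // console appender, simple layout
        Logger.getRootLogger().setLevel(Level.DEBUG);  // match the DEBUG level used above
        Logger.getLogger(QuickLog4jSetup.class).debug("log4j configured programmatically");
    }
}
[/code]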
Problems after upgrading from log4j to log4j2: the log4j configuration file cannot be found
-
Spring startup hangs at log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more
In this situation the application has almost certainly already failed, but your log4j configuration never loaded, so you cannot see the log. Move the log4j listener ahead of the Spring listener in web.xml, run again, and the error log will appear...
A log4j problem, please take a look
log4j configuration: # Output pattern : date priority category - message log4j.rootLogger=WARN, Console, Rolling
A generic wordcount test program for Hadoop clusters
A generic wordcount program that does not depend on a specific Hadoop platform; very useful for research.
Running WordCount from Eclipse on Windows against a Hadoop cluster in virtual machines throws an exception, please help
The cluster consists of two virtual machines, one NN and one DN.

The Windows account and the Linux account have the same name.

Running the WordCount program from eclipse throws the following exception:
[code=java]
Exception in thread "main" java.lang.NullPointerException
    at java.lang.ProcessBuilder.start(ProcessBuilder.java:442)
    at org.apache.hadoop.util.Shell.runCommand(Shell.java:404)
    at org.apache.hadoop.util.Shell.run(Shell.java:379)
    at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:589)
    at org.apache.hadoop.util.Shell.execCommand(Shell.java:678)
    at org.apache.hadoop.util.Shell.execCommand(Shell.java:661)
    at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:639)
    at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:435)
    at org.apache.hadoop.fs.FilterFileSystem.mkdirs(FilterFileSystem.java:277)
    at org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:125)
    at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:344)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1268)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1265)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
    at org.apache.hadoop.mapreduce.Job.submit(Job.java:1265)
    at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1286)
    at com.armslee.study.test.WordCount.main(WordCount.java:85)
[/code]

The WordCount program is taken from the examples shipped with Hadoop 2.2.0; only the package name was changed. Code:

[code=java]
/**
 * Licensed to the Apache Software Foundation (ASF) under one
 * or more contributor license agreements.  See the NOTICE file
 * distributed with this work for additional information
 * regarding copyright ownership.  The ASF licenses this file
 * to you under the Apache License, Version 2.0 (the
 * "License"); you may not use this file except in compliance
 * with the License.  You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
package com.armslee.study.test;

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.util.GenericOptionsParser;

public class WordCount {

  public static class TokenizerMapper
       extends Mapper<Object, Text, Text, IntWritable> {

    private final static IntWritable one = new IntWritable(1);
    private Text word = new Text();

    public void map(Object key, Text value, Context context
                    ) throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, one);
      }
    }
  }

  public static class IntSumReducer
       extends Reducer<Text, IntWritable, Text, IntWritable> {
    private IntWritable result = new IntWritable();

    public void reduce(Text key, Iterable<IntWritable> values,
                       Context context
                       ) throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    String[] otherArgs = new GenericOptionsParser(conf, args).getRemainingArgs();
    if (otherArgs.length != 2) {
      System.err.println("Usage: wordcount <in> <out>");
      System.exit(2);
    }
    @SuppressWarnings("deprecation")
    Job job = new Job(conf, "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class);
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(otherArgs[0]));
    FileOutputFormat.setOutputPath(job, new Path(otherArgs[1]));
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}
[/code]

If I export the program as a jar and run it directly on Linux there is no exception. Please help, much appreciated!!!
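A minimal sketch of a commonly reported workaround (an assumption, not a confirmed answer from this thread): on Windows, Shell.runCommand tends to throw this NullPointerException when the Hadoop client cannot locate winutils.exe, so pointing hadoop.home.dir at a local directory that contains bin\winutils.exe before the job is created often gets past it. The local path below is a placeholder.

[code=java]
// Hedged sketch: assumes an unpacked Hadoop distribution at C:\hadoop-2.2.0 with
// bin\winutils.exe present; adjust the path to your own layout.
public class WordCountWindowsDriver {
  public static void main(String[] args) throws Exception {
    // Must run before any Hadoop class touches org.apache.hadoop.util.Shell
    System.setProperty("hadoop.home.dir", "C:\\hadoop-2.2.0");
    com.armslee.study.test.WordCount.main(args); // then run the unmodified example
  }
}
[/code]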
Running wordCount from eclipse fails, help!!
The error is as follows:

13/04/22 09:38:19 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
13/04/22 09:38:20 WARN mapred.JobClient: No job jar file set. User classes may not be found. See JobConf(Class) or JobConf#setJar(String).
Exception in thread "main" java.io.IOException: Call to localhost/127.0.0.1:9100 failed on local exception: java.io.EOFException
    at org.apache.hadoop.ipc.Client.wrapException(Client.java:775)
    at org.apache.hadoop.ipc.Client.call(Client.java:743)
    at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:220)
    at $Proxy0.getProtocolVersion(Unknown Source)
    at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:359)
    at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:106)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:207)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:170)
    at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:82)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1378)
    at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:66)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1390)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:196)
    at org.apache.hadoop.fs.Path.getFileSystem(Path.java:175)
    at org.apache.hadoop.mapreduce.lib.output.FileOutputFormat.checkOutputSpecs(FileOutputFormat.java:122)
    at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:770)
    at org.apache.hadoop.mapreduce.Job.submit(Job.java:432)
    at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:447)
    at WordCount.main(WordCount.java:66)
Caused by: java.io.EOFException
    at java.io.DataInputStream.readInt(DataInputStream.java:375)
    at org.apache.hadoop.ipc.Client$Connection.receiveResponse(Client.java:501)
    at org.apache.hadoop.ipc.Client$Connection.run(Client.java:446)

Where is the mistake?
Installing the hadoop plugin for eclipse and running WordCount
1. The package includes the hadoop plugin for the eclipse Oxygen release, along with winutils. 2. The document describes the Windows hadoop configuration in detail. 3. The example program runs end to end, with detailed screenshots.
Running wordcount from eclipse reports Connection refused, please help!!!
Pseudo-distributed setup.
Running wordcount from eclipse fails with:
INFO ipc.Client: Retrying connect to server: xlk-dell/192.168.200.177:9001. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1 SECONDS)
14/05/11 23:13:13
INFO ipc.Client: Retrying connect to server: xlk-dell/192.168.200.177:9001. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1 SECONDS)
14/05/11 23:13:13
ERROR security.UserGroupInformation: PriviledgedActionException as:xlk cause:java.net.ConnectException: Call to xlk-dell/192.168.200.177:9001 failed on connection exception: java.net.ConnectException: Connection refused

In main I added:
Configuration conf = new Configuration();
conf.set("mapred.job.tracker", "192.168.200.177:9001");

If I use conf.set("mapred.job.tracker", "127.0.0.1:9001") instead, the error does not appear and the job runs normally, but isn't the MapReduce address supposed to be set explicitly when running from eclipse (so that the job really runs on the MapReduce cluster)?
Also, does /etc/hosts need to be configured? Mine is:
127.0.0.1 localhost
#127.0.1.1 xlk-dell
192.168.200.177 xlk-dell

My core-site.xml and mapred-site.xml both use localhost:9000 and localhost:9001; I also tried the machine's own address for 9000 and 9001, and eclipse still reports the connection error.
Adding
Configuration conf = new Configuration();
conf.set("mapred.job.tracker", "192.168.200.177:9001");
to main used to work fine, but it stopped working after I reinstalled hadoop.
Could someone point out what exactly is wrong with running this from eclipse?
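A minimal sketch of a first check (an assumption, not the thread's accepted answer): when submitting from eclipse, the client-side Configuration has to agree with the addresses the daemons actually bind to, and the hostname in /etc/hosts has to resolve to the interface they listen on. The hostname and ports below mirror the post; everything else is assumed.

[code=java]
import org.apache.hadoop.conf.Configuration;

// Hedged sketch: the addresses must match fs.default.name in core-site.xml and
// mapred.job.tracker in mapred-site.xml on the cluster (old MRv1 property names).
public class ClusterConf {
  public static Configuration create() {
    Configuration conf = new Configuration();
    conf.set("fs.default.name", "hdfs://xlk-dell:9000"); // NameNode address
    conf.set("mapred.job.tracker", "xlk-dell:9001");     // JobTracker address
    return conf;
  }
}
[/code]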
SSM integration: the Tomcat server will not start, console shows log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more
Problem: while wiring up the SSM stack, the login feature was all done and ready for testing, but the server simply would not start... it timed out after 45 seconds... Console output: searching online turned up nothing... In the end, after slowly eliminating causes over a long time, the mistake was found in usermapper.xml... On closer inspection it was Umail = #{umail,jdbcType="VARCHAR"} and Upa...
Tomcat startup hangs on the line log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
As in the screenshot, there are generally three causes: the database is unreachable, the ZooKeeper registry is unreachable, or the reverse-engineered Mapper files are wrong (if you use them). In my case the database had not been started.
Error message: Initializing Spring root WebApplicationContext log4j:WARN No appenders could be found for logger
The full message: INFO: Initializing Spring root WebApplicationContext log4j:WARN No appenders could be found for logger (org.springframework.web.context.ContextLoader). log4j:WARN Please initialize the log4j syst...
Initializing Spring root WebApplicationContext log4j:WARN No appenders could be found for logger (
Error message: the following appears: INFO: Initializing Spring root WebApplicationContext log4j:WARN No appenders could be found for logger (org.springframework.web.context.ContextLoader). log4j:WARN Please initialize t...
A maven project hangs at Initializing Spring root WebApplicationContext
I have been following the taotao mall project tutorials online, and while deploying the SSM stack as instructed I hit this hang at Initializing Spring root WebApplicationContext. At first the project could still be deployed norm...
Input.available() throws a NullPointerException on a socket connection
Reading data from the socket's input stream keeps throwing a NullPointerException. getInputStream() and the socket creation run in one thread, and a second thread is created to process the data from the input stream; then it crashes. public class ReceiveThread implements Runnable { @Override public void run() { while (true) {
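A minimal sketch of one way to avoid that NPE (an assumption about the cause, since the excerpt is cut off): open the socket and obtain the InputStream first, and only then hand the stream to the reader thread, so the worker never touches a half-initialized socket field.

[code=java]
import java.io.IOException;
import java.io.InputStream;
import java.net.Socket;

// Hedged sketch: host/port are placeholders; the point is that the InputStream is fully
// created before the reader thread starts, instead of being read from a field that
// another thread may not have assigned yet.
public class ReceiveThread implements Runnable {
    private final InputStream in;

    public ReceiveThread(InputStream in) {
        this.in = in;
    }

    @Override
    public void run() {
        byte[] buf = new byte[1024];
        try {
            int n;
            while ((n = in.read(buf)) != -1) {   // blocks instead of polling available()
                System.out.println("received " + n + " bytes");
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    public static void main(String[] args) throws IOException {
        Socket socket = new Socket("127.0.0.1", 9000); // hypothetical endpoint
        new Thread(new ReceiveThread(socket.getInputStream())).start();
    }
}
[/code]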
Running MapReduce from Eclipse throws a NullPointerException, but running on Linux does not
Running MapReduce from Eclipse reports this error, while running it on Linux does not:
[img=https://img-bbs.csdn.net/upload/201402/11/1392083428_258471.jpg][/img]
[img=https://img-bbs.csdn.net/upload/201402/11/1392083449_996159.jpg][/img]
eclipse reports a NullPointerException while building the project, help!!!
Internal verification error
java.lang.NullPointerException
    at org.eclipse.wst.jsdt.internal.compiler.lookup.CompilationUnitBinding.sourceMethod(CompilationUnitBinding.java:83)
    at org.eclipse.wst.jsdt.internal.compiler.lookup.MethodBinding.sourceMethod(MethodBinding.java:553)
    at org.eclipse.wst.jsdt.internal.compiler.lookup.SourceTypeBinding.resolveTypesFor(SourceTypeBinding.java:1068)
    at org.eclipse.wst.jsdt.internal.compiler.lookup.SourceTypeBinding.resolveTypesFor(SourceTypeBinding.java:1054)
    at org.eclipse.wst.jsdt.internal.compiler.lookup.SourceTypeBinding.methods(SourceTypeBinding.java:779)
    at org.eclipse.wst.jsdt.internal.compiler.lookup.MethodBinding.ensureBindingsAreComplete(MethodBinding.java:623)
    at org.eclipse.wst.jsdt.internal.compiler.lookup.Scope.findMethod(Scope.java:638)
    at org.eclipse.wst.jsdt.internal.compiler.lookup.Scope.getImplicitMethod(Scope.java:1635)
    at org.eclipse.wst.jsdt.internal.compiler.ast.MessageSend.resolveType(MessageSend.java:325)
    at org.eclipse.wst.jsdt.internal.compiler.ast.MessageSend.resolveType(MessageSend.java:267)
    at org.eclipse.wst.jsdt.internal.compiler.ast.BinaryExpression.resolveType(BinaryExpression.java:262)
    at org.eclipse.wst.jsdt.internal.compiler.ast.MessageSend.resolveType(MessageSend.java:279)
    at org.eclipse.wst.jsdt.internal.compiler.ast.Expression.resolve(Expression.java:477)
    at org.eclipse.wst.jsdt.internal.compiler.ast.Block.resolve(Block.java:89)
    at org.eclipse.wst.jsdt.internal.compiler.ast.IfStatement.resolve(IfStatement.java:191)
    at org.eclipse.wst.jsdt.internal.compiler.ast.AbstractMethodDeclaration.resolveStatements(AbstractMethodDeclaration.java:350)
    at org.eclipse.wst.jsdt.internal.compiler.ast.MethodDeclaration.resolveStatements(MethodDeclaration.java:137)
    at org.eclipse.wst.jsdt.internal.compiler.ast.AbstractMethodDeclaration.resolve(AbstractMethodDeclaration.java:304)
    at org.eclipse.wst.jsdt.internal.compiler.ast.AbstractMethodDeclaration.resolve(AbstractMethodDeclaration.java:375)
    at org.eclipse.wst.jsdt.internal.compiler.ast.CompilationUnitDeclaration.resolve(CompilationUnitDeclaration.java:394)
    at org.eclipse.wst.jsdt.internal.compiler.Compiler.process(Compiler.java:604)
    at org.eclipse.wst.jsdt.internal.compiler.Compiler.compile(Compiler.java:356)
    at org.eclipse.wst.jsdt.internal.core.builder.AbstractImageBuilder.compile(AbstractImageBuilder.java:288)
    at org.eclipse.wst.jsdt.internal.core.builder.BatchImageBuilder.compile(BatchImageBuilder.java:86)
    at org.eclipse.wst.jsdt.internal.core.builder.AbstractImageBuilder.compile(AbstractImageBuilder.java:227)
    at org.eclipse.wst.jsdt.internal.core.builder.BatchImageBuilder.build(BatchImageBuilder.java:58)
    at org.eclipse.wst.jsdt.internal.core.builder.JavaBuilder.buildAll(JavaBuilder.java:291)
    at org.eclipse.wst.jsdt.internal.core.builder.JavaBuilder.build(JavaBuilder.java:199)
    at org.eclipse.core.internal.events.BuildManager$2.run(BuildManager.java:728)
    at org.eclipse.core.runtime.SafeRunner.run(SafeRunner.java:42)
    at org.eclipse.core.internal.events.BuildManager.basicBuild(BuildManager.java:199)
    at org.eclipse.core.internal.events.BuildManager.basicBuild(BuildManager.java:239)
    at org.eclipse.core.internal.events.BuildManager$1.run(BuildManager.java:292)
    at org.eclipse.core.runtime.SafeRunner.run(SafeRunner.java:42)
    at org.eclipse.core.internal.events.BuildManager.basicBuild(BuildManager.java:295)
    at org.eclipse.core.internal.events.BuildManager.basicBuildLoop(BuildManager.java:351)
    at org.eclipse.core.internal.events.BuildManager.build(BuildManager.java:374)
    at org.eclipse.core.internal.events.AutoBuildJob.doBuild(AutoBuildJob.java:143)
    at org.eclipse.core.internal.events.AutoBuildJob.run(AutoBuildJob.java:241)
    at org.eclipse.core.internal.jobs.Worker.run(Worker.java:54)
Starting Tomcat from eclipse reports a NullPointerException
After starting Tomcat I get the following error; how can I fix it?

Jan 23, 2018 12:58:48 PM org.apache.tomcat.util.digester.Digester startElement
SEVERE: Begin event threw error
java.lang.ExceptionInInitializerError
    at org.apache.catalina.mbeans.GlobalResourcesLifecycleListener.(GlobalResourcesLifecycleListener.java:66)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at java.lang.Class.newInstance(Class.java:442)
    at org.apache.tomcat.util.digester.ObjectCreateRule.begin(ObjectCreateRule.java:145)
    at org.apache.tomcat.util.digester.Digester.startElement(Digester.java:1303)
    at com.sun.org.apache.xerces.internal.parsers.AbstractSAXParser.startElement(AbstractSAXParser.java:509)
    at com.sun.org.apache.xerces.internal.parsers.AbstractXMLDocumentParser.emptyElement(AbstractXMLDocumentParser.java:182)
    at com.sun.org.apache.xerces.internal.impl.XMLDocumentFragmentScannerImpl.scanStartElement(XMLDocumentFragmentScannerImpl.java:1339)
    at com.sun.org.apache.xerces.internal.impl.XMLDocumentFragmentScannerImpl$FragmentContentDriver.next(XMLDocumentFragmentScannerImpl.java:2784)
    at com.sun.org.apache.xerces.internal.impl.XMLDocumentScannerImpl.next(XMLDocumentScannerImpl.java:602)
    at com.sun.org.apache.xerces.internal.impl.XMLDocumentFragmentScannerImpl.scanDocument(XMLDocumentFragmentScannerImpl.java:505)
    at com.sun.org.apache.xerces.internal.parsers.XML11Configuration.parse(XML11Configuration.java:842)
    at com.sun.org.apache.xerces.internal.parsers.XML11Configuration.parse(XML11Configuration.java:771)
    at com.sun.org.apache.xerces.internal.parsers.XMLParser.parse(XMLParser.java:141)
    at com.sun.org.apache.xerces.internal.parsers.AbstractSAXParser.parse(AbstractSAXParser.java:1213)
    at com.sun.org.apache.xerces.internal.jaxp.SAXParserImpl$JAXPSAXParser.parse(SAXParserImpl.java:643)
    at org.apache.tomcat.util.digester.Digester.parse(Digester.java:1576)
    at org.apache.catalina.startup.Catalina.load(Catalina.java:616)
    at org.apache.catalina.startup.Catalina.load(Catalina.java:667)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.catalina.startup.Bootstrap.load(Bootstrap.java:253)
    at org.apache.catalina.startup.Bootstrap.main(Bootstrap.java:427)
Caused by: java.lang.NullPointerException
    at org.apache.tomcat.util.modeler.Registry.loadDescriptors(Registry.java:766)
    at org.apache.catalina.mbeans.MBeanUtils.createRegistry(MBeanUtils.java:1058)
    at org.apache.catalina.mbeans.MBeanUtils.(MBeanUtils.java:93)
    ... 28 more

(The same java.lang.ExceptionInInitializerError stack trace is then printed a second time, ending with the same Caused by: java.lang.NullPointerException and "... 28 more".)
Fixing a NullPointerException in the batch run-status query
On the server, the batch run-status query sometimes reported an unknown system error. The logs showed the batch side returning rejcode=null, so mweb could not identify the error type. But why would rejcode be null? Tracing the code showed that batchtemplate has a try/catch and rejcode is reassigned inside the catch block, so the code inside the try was examined: try { context.setD...
Fast-Track Big Data: Spark SQL summary (24)
Spark SQL summary. Overview: Spark SQL is a module for processing structured data; it provides a programming abstraction called DataFrame and acts as a distributed SQL query engine. Features: Spark SQL runs faster than Hive because it does not go through MapReduce to execute the program, which reduces execution complexity, and it can turn data into RDDs (in memory), greatly improving execution efficiency...
A question about log4j
Tomcat 6. I put the log4j.properties file under the application's WEB-INF\classes\, but no log file is produced and it reports log4j:WARN No appenders could be fou...
How to fix log4j:WARN Please initialize the log4j system properly
Symptom: while testing a ZooKeeper connection to the server, the console printed nothing else, only the log4j hints: log4j:WARN No appenders could be found for logger (org.apache.zookeeper.ZooKeeper). log4j:WARN Please initialize the log4j system properly. log4j:WARN See h...
The tomcat7 plugin hangs at log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info. when starting the project
After starting the project with the tomcat7 maven plugin, it hangs at the log4j:WARN line; below is what it looked like when stuck. In my case the cause was that the MyBatis generator had produced the interfaces and XML files twice; the files moved into the project were not duplicated, but Tomcat startup still hung, presumably because the files did not match...
About the message No Spring WebApplicationInitializer types detected on classpath
This message appears when starting Tomcat via maven, after which startup stalls halfway: May 17, 2017 11:45:41 AM org.apache.catalina.core.StandardEngine startInternal INFO: Starting Servlet Engine: Apache Tomcat/7.0.47 May 17, 2017 11:45:46 AM org.apache.catalina.core.ApplicationContext log INFO: No Spring Web
Deployed under Tomcat, running fails with a null pointer error
action/list
2017-8-5 10:49:31 org.apache.catalina.core.StandardWrapperValve invoke
SEVERE: Servlet.service() for servlet [actionServlet] in context with path [/dhcc] threw exception [Servlet execution threw an exception] with root cause
java.lang.NoClassDefFoundError: Could not initialize class Util.DBUtil
    at DAO.StudentDAO.findAll(StudentDAO.java:95)
    at web.ActionServlet.service(ActionServlet.java:41)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:731)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
    at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:218)
    at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:110)
    at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:506)
    at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:169)
    at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
    at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:962)
    at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116)
    at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:445)
    at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1115)
    at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:637)
    at org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:316)
    at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
    at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
    at java.lang.Thread.run(Thread.java:662)
Running wordcount
After enabling Kerberos authentication, the root account no longer has write permission on HDFS. [root@masternode1 centos]# hadoop jar /opt/cloudera/parcels/CDH-5.7.1-1.cdh5.7.1.p0.11/jars/hadoop-mapreduce-examples-2.6.0-cdh5.7.1.jar wordcount /input /outpu...
eclipse NullPointerException
1. Error log: !SESSION 2014-04-04 17:52:14.756 ———————————————– eclipse.buildId=v22.6.2-1085508 java.version=1.6.0_43 java.vendor=Sun Microsystems Inc. BootLoader constants: OS=win32, ARCH=x86_64, WS=win...
NullPointerException running WordCount with eclipse and hadoop 2.4.0 on win7
eclipse on win7, hadoop 2.4.0 on an Ubuntu virtual machine.
eclipse can operate on HDFS (deleting a file reports "no permission").
Running WordCount throws a NullPointerException:
Exception in thread "main" java.lang.NullPointerException
    at java.lang.ProcessBuilder.start(ProcessBuilder.java:441)
    at org.apache.hadoop.util.Shell.runCommand(Shell.java:445)
    at org.apache.hadoop.util.Shell.run(Shell.java:418)
    at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:650)
    at org.apache.hadoop.util.Shell.execCommand(Shell.java:739)
    at org.apache.hadoop.util.Shell.execCommand(Shell.java:722)
    at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:633)
    at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:421)
    at org.apache.hadoop.fs.FilterFileSystem.mkdirs(FilterFileSystem.java:281)
    at org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:125)
    at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:348)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1285)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1282)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
    at org.apache.hadoop.mapreduce.Job.submit(Job.java:1282)
    at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1303)
    at WordCount.main(WordCount.java:89)

I searched for many fixes online and added winutils.exe and hadoop.dll, but the error is still there. Please help.
NullPointerException running WordCount with eclipse and hadoop 2.3.0 on win7
Linux is CentOS 6.5, eclipse 4.4.0, hadoop 2.3.0. Running WordCount throws:
Exception in thread "main" java.lang.NullPointerException
    at java.lang.ProcessBuilder.start(ProcessBuilder.java:1011)
    at org.apache.hadoop.util.Shell.runCommand(Shell.java:445)
    at org.apache.hadoop.util.Shell.run(Shell.java:418)
    at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:650)
    at org.apache.hadoop.util.Shell.execCommand(Shell.java:739)
    at org.apache.hadoop.util.Shell.execCommand(Shell.java:722)
    at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:631)
    at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:421)
    at org.apache.hadoop.fs.FilterFileSystem.mkdirs(FilterFileSystem.java:277)
    at org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:125)
    at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:348)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1285)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1282)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
    at org.apache.hadoop.mapreduce.Job.submit(Job.java:1282)
    at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1303)
    at com.hsqw.hadoop.action.WordCount.main(WordCount.java:84)
hadoop.dll and winutils.exe are already in the bin directory, the relevant path is on PATH, and hadoop.dll is also in System32, but it still fails. Please advise.
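A minimal diagnostic sketch (an assumption, not a confirmed fix for either thread above): before submitting, print where the Hadoop client will look for winutils.exe. Shell.runCommand tends to NPE on Windows when neither HADOOP_HOME nor hadoop.home.dir resolves to a directory containing bin\winutils.exe, or when eclipse was started before those variables were set.

[code=java]
// Hedged sketch: run this in the same JVM/launch configuration as the job.
public class WinutilsCheck {
  public static void main(String[] args) {
    System.out.println("HADOOP_HOME     = " + System.getenv("HADOOP_HOME"));
    System.out.println("hadoop.home.dir = " + System.getProperty("hadoop.home.dir"));
    // If both are null, set one of them (and restart eclipse after changing an
    // environment variable) so that <dir>\bin\winutils.exe exists.
  }
}
[/code]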
A simple fix for a log4j warning
Today: log4j:WARN No appenders could be found for logger (org.apache.http.impl.conn.tsccm.ThreadSafeClientConnManager). log4j:WARN Please initialize the log4j system properly. log4j:WARN See http:/...
log4j2 configuration at startup: Unregistering but no MBeans found matching 'org.apache.logging.log4j2
Configuration: log4j2.xml monitorInterva...
Detailed use of log4j2 and troubleshooting [summary]
log4j2 is a complete overhaul compared with log4j 1.x; its site advertises roughly ten times the multi-threaded throughput of log4j 1.x and logback, configurable audit-style logging, and flexible plugin-based configuration. If you already know log4j 1.x, log4j2 is still very easy to use. 1. Basic configuration: <dependencies> <dependency> <gro...
Fixes for log4j WARN and SLF4J WARN
log4j:WARN Please initialize the log4j system properly. SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
Using log4j in eclipse
I have been writing on the Java platform for a while, from J2EE back then to back-end services now, and log4j has always been used for log output to some degree, but every time someone else had configured it and told me how to call it, so I only ever half understood it. Today, in the spirit of knowing not just that it works but why, I studied log4j carefully. 1. Using log4j in eclipse: for a programmer the fastest route is to get hands-on and see the effect first, so step one is getting your own program in eclipse to...
An error when running the WordCount program from eclipse
Sometimes, for convenience of study or coding, or because the Linux server is simply not at hand, we can only learn and use hadoop remotely, which means connecting from Windows to the hadoop service on the server to run jobs. I hit many problems during setup; here I only describe one error, the one that appears while running the WordCount program, shown in the figure below. Fairly clear, right? It is in FileUtil, in checkReturnValue...
Running the hadoop 2.2.0 WordCount example from eclipse on a fully distributed cluster
[img]http://dl2.iteye.com/upload/attachment/0097/7395/179e518f-ec2b-3ca7-b040-858087afdeb4.png[/img] [img]http://dl2.iteye.com/upload/attachment/0097/7397/ec5176e0-e319-3be3-91c5-aa53745761d3.png[/im...
Installing hadoop 2.2.0 on Windows
Good news for Hadoop developers who want to use Microsoft Windows OS for their development activities. Finally Apache Hadoop 2.2.0 release officially supports for running Hadoop on Microsoft Windows a
Problems running wordcount after configuring eclipse for hadoop
Hi all, today I configured the eclipse development environment for hadoop, and running the wordcount example hit the following problem; please help:

15/01/30 20:53:18 INFO input.FileInputFormat: Total input paths to process : 1
15/01/30 20:53:19 INFO mapred.JobClient: Running job: job_local_0001
15/01/30 20:53:19 WARN mapred.LocalJobRunner: job_local_0001
org.apache.hadoop.security.AccessControlException: org.apache.hadoop.security.AccessControlException: Permission denied: user=lizhihui, access=WRITE, inode="root":root:supergroup:rwxr-xr-x
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
    at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:95)
    at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:57)
    at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:1213)
    at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:321)
    at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1126)
    at org.apache.hadoop.mapred.FileOutputCommitter.setupJob(FileOutputCommitter.java:52)
    at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:184)
Caused by: org.apache.hadoop.ipc.RemoteException: org.apache.hadoop.security.AccessControlException: Permission denied: user=lizhihui, access=WRITE, inode="root":root:supergroup:rwxr-xr-x
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:199)
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:180)
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:128)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:5212)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkAncestorAccess(FSNamesystem.java:5186)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirsInternal(FSNamesystem.java:2058)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:2027)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.mkdirs(NameNode.java:817)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:563)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1388)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1384)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1093)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1382)

    at org.apache.hadoop.ipc.Client.call(Client.java:1066)
    at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:225)
    at com.sun.proxy.$Proxy1.mkdirs(Unknown Source)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:82)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:59)
    at com.sun.proxy.$Proxy1.mkdirs(Unknown Source)
    at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:1211)
    ... 4 more
15/01/30 20:53:20 INFO mapred.JobClient: map 0% reduce 0%
15/01/30 20:53:20 INFO mapred.JobClient: Job complete: job_local_0001
15/01/30 20:53:20 INFO mapred.JobClient: Counters: 0
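A minimal sketch of one commonly used workaround (an assumption, not this thread's answer): the AccessControlException means the job is submitted as the local Windows user (lizhihui here) while the target HDFS directory is owned by root, so either relax the directory permissions on the cluster (hdfs dfs -chmod / -chown) or tell the client which user name to present. Newer Hadoop clients read HADOOP_USER_NAME from the environment or a system property; whether this applies depends on the Hadoop version and only makes sense without Kerberos.

[code=java]
// Hedged sketch: present a different user name to HDFS when submitting from eclipse.
public class SubmitAsRoot {
  public static void main(String[] args) throws Exception {
    System.setProperty("HADOOP_USER_NAME", "root"); // must run before any Hadoop login happens
    // ... then build the Configuration/Job and submit as usual
  }
}
[/code]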
Setting up a MapReduce development environment in eclipse and running wordcount
1. My test environment: win7, eclipse Neon.2 Release (4.6.2), hadoop 2.7.3 deployed on a cluster with 1 master and 3 slaves. 2. You need winutils.exe and hadoop-eclipse-plugin-2.7.3.jar (download link in the original post). 3. Find the plugins directory under the eclipse root and copy the downloaded hadoop-eclipse-...
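A small sanity check that can be useful at this point (an assumption, not part of the quoted write-up): before running a MapReduce job from the Windows side, list an HDFS directory from a plain Java main to confirm that the client configuration and the network path to the cluster work. The URI below is a placeholder for the NameNode address.

[code=java]
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

// Hedged sketch: hdfs://master:9000 is a placeholder for your NameNode address.
public class HdfsSmokeTest {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    FileSystem fs = FileSystem.get(URI.create("hdfs://master:9000"), conf);
    for (FileStatus status : fs.listStatus(new Path("/"))) {
      System.out.println(status.getPath()); // prints the top-level HDFS entries
    }
    fs.close();
  }
}
[/code]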
A Log4j primer
1. About Log4j: log4j serves the same purpose as java.util.logging, recording useful information; it is a logging framework (log4j == log for Java). What a logging framework is for: (1) checking whether function arguments are correct; (2) recording every user action after the software ships; (3) recording where the program fails at runtime. log4j lives at http://logging.apache.org/log4j/1.2/do...
Configuring Log4j for JUnit test cases
Testing with JUnit is very convenient, but sometimes you want the logs to help troubleshoot. With spring+log4j, many people do not know how to see the logs from a JUnit test, i.e. how to make JUnit and log4j work together. It took me quite a while to find a method, so I am sharing it here. The integration is awkward because the spring/log4j setup normally lives in web.xml, and spring only loads log4j after Tomcat starts...
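A minimal sketch of the usual way around that (assumed, since the excerpt is cut off): configure log4j yourself in a @BeforeClass hook so the tests get log output without Tomcat or web.xml being involved. The properties path below is an assumption.

[code=java]
import org.apache.log4j.PropertyConfigurator;
import org.junit.BeforeClass;
import org.junit.Test;

public class LoggingSmokeTest {

    @BeforeClass
    public static void initLog4j() {
        // Assumed location of the log4j 1.x configuration used by the application
        PropertyConfigurator.configure("src/test/resources/log4j.properties");
    }

    @Test
    public void logsAreVisible() {
        org.apache.log4j.Logger.getLogger(LoggingSmokeTest.class).info("log4j is configured");
    }
}
[/code]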
Fixing the hang at INFO: Initializing Spring root WebApplicationContext followed by three log4j WARN lines
Tomcat startup hangs here with no further output: INFO: Initializing Spring root WebApplicationContext log4j:WARN No appenders could be found for logger (org.springframework.web.context.ContextLoader). log4j:WARN Please i...
How to fix SLF4J and log4j error warnings
A log4j problem when starting hbase. 1. Problem description: when starting hbase I once ran into: SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder". SLF4J: Defaulting to no-operation (NOP) logger implementation SLF4J...
Using commons-logging
Introduction: commons-logging is a member of the Apache Commons libraries, a general-purpose collection providing basic functionality such as commons-fileupload, commons-httpclient, commons-io and commons-codec. commons-logging can choose between Log4j and JDK logging, and it does not depend...
How do I fix the following errors when running WordCount on a Hadoop cluster??
Java HotSpot(TM) Client VM warning: You have loaded library /home/hadoop/hadoop-2.5.2/lib/native/libhadoop.so.1.0.0 which might have disabled stack guard. The VM will try to fix the stack guard now.
It's highly recommended that you fix the library with 'execstack -c ', or link it with '-z noexecstack'.
16/12/15 04:48:08 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/12/15 04:48:09 INFO client.RMProxy: Connecting to ResourceManager at master/192.168.1.147:18040
16/12/15 04:48:12 INFO input.FileInputFormat: Total input paths to process : 1
16/12/15 04:48:12 INFO mapreduce.JobSubmitter: number of splits:1
16/12/15 04:48:12 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1481804617285_0002
16/12/15 04:48:13 INFO impl.YarnClientImpl: Submitted application application_1481804617285_0002
16/12/15 04:48:13 INFO mapreduce.Job: The url to track the job: http://master:18088/proxy/application_1481804617285_0002/
16/12/15 04:48:13 INFO mapreduce.Job: Running job: job_1481804617285_0002
16/12/15 04:48:34 INFO mapreduce.Job: Job job_1481804617285_0002 running in uber mode : false
16/12/15 04:48:34 INFO mapreduce.Job: map 0% reduce 0%
16/12/15 04:48:36 INFO mapreduce.Job: Task Id : attempt_1481804617285_0002_m_000000_0, Status : FAILED
Container launch failed for container_1481804617285_0002_01_000002 : org.apache.hadoop.yarn.exceptions.YarnException: Unauthorized request to start container.
This token is expired. current time is 1481811355417 found 1481806715733
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    at org.apache.hadoop.yarn.api.records.impl.pb.SerializedExceptionPBImpl.instantiateException(SerializedExceptionPBImpl.java:168)
    at org.apache.hadoop.yarn.api.records.impl.pb.SerializedExceptionPBImpl.deSerialize(SerializedExceptionPBImpl.java:106)
    at org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl$Container.launch(ContainerLauncherImpl.java:155)
    at org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl$EventProcessor.run(ContainerLauncherImpl.java:369)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)

16/12/15 04:48:38 INFO mapreduce.Job: Task Id : attempt_1481804617285_0002_m_000000_1, Status : FAILED
Container launch failed for container_1481804617285_0002_01_000003 : org.apache.hadoop.yarn.exceptions.YarnException: Unauthorized request to start container.
This token is expired. current time is 1481811357277 found 1481806717783
    (same stack trace as above)

16/12/15 04:48:40 INFO mapreduce.Job: Task Id : attempt_1481804617285_0002_m_000000_2, Status : FAILED
Container launch failed for container_1481804617285_0002_01_000004 : org.apache.hadoop.yarn.exceptions.YarnException: Unauthorized request to start container.
This token is expired. current time is 1481811359287 found 1481806719811
    (same stack trace as above)

16/12/15 04:48:44 INFO mapreduce.Job: map 100% reduce 100%
16/12/15 04:48:44 INFO mapreduce.Job: Job job_1481804617285_0002 failed with state FAILED due to: Task failed task_1481804617285_0002_m_000000
Job failed as tasks failed. failedMaps:1 failedReduces:0

16/12/15 04:48:44 INFO mapreduce.Job: Counters: 4
    Job Counters
        Other local map tasks=3
        Data-local map tasks=1
        Total time spent by all maps in occupied slots (ms)=0
        Total time spent by all reduces in occupied slots (ms)=0
Some problems running MapReduce from eclipse against a Hadoop cluster
One problem: with the distributed cluster configured, packaging the MapReduce program as a jar and running it from the command line works and produces results, but running it from eclipse fails with (class $map not found), meaning the map class cannot be found. Root cause and fix: import the three configuration files core-site.xml, hdfs-site.xml and log4j.properties into the project's src directory, and by no means import others such as yarn-site...
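A related sketch (an assumption, and a different remedy from the one the excerpt describes): when a job is submitted straight from eclipse, no job jar is shipped with it, so the cluster-side tasks cannot load the Mapper/Reducer classes; explicitly pointing the job at an exported jar is one way around that. The jar path below is a placeholder.

[code=java]
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.mapreduce.Job;

// Hedged sketch: ship an exported jar with the job so the remote tasks can load the
// Mapper/Reducer classes (the path is a placeholder for a jar exported from eclipse).
public class JobWithJar {
  public static Job create(Configuration conf) throws Exception {
    conf.set("mapred.jar", "C:/work/wordcount.jar");
    Job job = Job.getInstance(conf, "word count");
    job.setJarByClass(JobWithJar.class);
    return job;
  }
}
[/code]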
Running a jar on a Hadoop cluster fails (jar exported from eclipse)
Error log: Exception in thread "main" java.lang.UnsupportedClassVersionError: com/hdfs/wordcount/WordcountDriver has been compiled by a more recent version of the Java Runtime (class file version 53.0), th...
Compiling a MapReduce program in Eclipse and running it on a Hadoop cluster
This article shows how to develop a MapReduce program with Eclipse on Windows and package it as a jar to run on an already-built Hadoop cluster. Environment: Windows 7, Hadoop 2.7.3, Eclipse Mars Release (4.5.0). Installing the Hadoop-Eclipse-Plugin: since configuring the JDK and installing Eclipse on Windows is straightforward, those steps are omitted. ...
Running the wordcount example on a Hadoop cluster produces no output, please advise
Setup:
VM: VirtualBox
hadoop: hadoop-2.2.0
jdk: jdk7
Cluster layout:

ip / hostname / OS
192.168.1.252 hadoop-master centos6.4 namenode, resourcemanager, secondarynamenode
192.168.1.251 hadoop-slave1 centos6.4 datanode

Development machine:
192.168.1.250 centos centos6.4 with eclipse and jdk installed for remote debugging; not a cluster node

Situation:
The cluster can be reached through the eclipse plugin, files can be uploaded and downloaded, and the usual cluster ports are all reachable (except that the 50030 web page does not open). With the run parameters configured in eclipse, wordcount.java runs without errors, but nothing is produced under the output folder. Please advise.
The wordcount.java file is the standard one, so it is not pasted here.
Hibernate basics: the table was not created, log4j:WARN No appenders could be found for logger (org.jboss.logging)
Hibernate configuration file error: The content of element type "list" must match "(meta*,subselect?,cache?,synchronize*,comment?,key,(index|list-index),(element|one-to-many|many-to-many|composite-element|many-to-any)" — how do I fix this? Exc...
Using logback for log output in a dubbo project
When you first create a dubbo project and start the service provider you will see the following warning: log4j:WARN No appenders could be found for logger (com.alibaba.dubbo.common.logger.LoggerFactory). log4j:WARN Please initialize the log4j system properly. log4j:WARN See h...
Learning Log4J (2): the first logging example
First download the Log4J package; the latest release of the 1.x line is Log4J 1.2. Log4j 2 has also been released, but it is less widely used; we will cover it later. Download from http://logging.apache.org/log4j/1.2/download.html and unpack. Log4J is a very simple package with no dependencies on other frameworks, so just copy log4j-1.2.17.jar onto the application classpath...
Running hadoop programs from eclipse
I meant to write this up myself, but someone else already wrote it well, so I am reposting it directly and adding a few problems I ran into. 1. Mine is hadoop 1.2.1 + eclipse 4.2. If you are on Hadoop 1.2.1, try eclipse 4.2 or older: I tried eclipse 4.7, 4.6, 4.4 and 4.3 and none worked; with the Hadoop 1.2.1 jars dropped into the target directory, DFS Locations would not show up, so I kept switching versions until 4.2...
How to develop hadoop programs with eclipse
Environment. Server side: CentOS 6.8 in a virtual machine, jdk1.7, hadoop 2.7.5. Client: win10, jdk1.7, hadoop 2.7.5, eclipse Mars.2 Release (4.5.2). 1. Install and configure the software: download the eclipse plugin for Windows, hadoop-eclipse-plugin-2.7.2.jar, along with winut...
Bug: setContentView throws a NullPointerException
Today an Activity simply would not open, although it compiled without errors. After a good while I found that a tag's case was mixed up (probably typed enter too fast and the auto-complete did not keep up): <view android:layout_width="match_parent" android:background="@color/backgroundGray" android:layout_height="1d...
The Alipay demo throws a NullPointerException
After filling in the credentials and running the demo, the line sign = URLEncoder.encode(sign, "UTF-8"); throws a NullPointerException. This is caused by Alipay targeting a different version. Fix: in the SignUtils class, add "BC": KeyFactory keyf = KeyFactory.getInstance(ALGORITHM, "BC"); run again and everything works.
NullPointerException
Why does pid = Integer.parseInt(strPid); throw a NullPointerException on that line?
(The rest of the post is the page markup, stripped by the forum: a JSP page titled "Insert title here" with an "Add root category" form containing "Category name" and "Category description" fields.)
Assigning into a two-dimensional array throws a NullPointerException
[code=java]
table = new String[td.size()][];
for (int i=0;i ...
 td = new ArrayList();
[/code]
The goal is to convert the container's contents into a two-dimensional array, but assigning from the container into the array throws a NullPointerException. What is the reason?

Exception in thread "main" java.lang.NullPointerException
    at bean.T_History.s_History(T_History.java:51)
    at bean.T_History.main(T_History.java:21)
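A minimal sketch of the usual cause (an assumption, since the original loop is partly cut off): new String[n][] only allocates the outer array, so every table[i] is still null; each row has to be assigned or allocated before its elements are written.

[code=java]
import java.util.ArrayList;
import java.util.List;

public class TwoDArrayCopy {
    public static void main(String[] args) {
        List<String[]> td = new ArrayList<String[]>();
        td.add(new String[] {"a", "b"});
        td.add(new String[] {"c", "d", "e"});

        String[][] table = new String[td.size()][]; // rows are still null here
        for (int i = 0; i < td.size(); i++) {
            table[i] = td.get(i);                   // assign (or allocate) each row first
        }
        System.out.println(table[1][2]);            // prints "e"
    }
}
[/code]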
hibernate load throws a NullPointerException
Loading by primary key throws a NullPointerException.
show_sql=true, but no HQL is printed during the load.
The ID definitely exists in the database.
The configuration looks fine; login works.
[code=java]
SEVERE: Servlet.service() for servlet action threw exception
java.lang.NullPointerException
    at org.hibernate.tuple.AbstractEntityTuplizer.createProxy(AbstractEntityTuplizer.java:372)
    at org.hibernate.persister.entity.AbstractEntityPersister.createProxy(AbstractEntityPersister.java:3121)
    at org.hibernate.event.def.DefaultLoadEventListener.createProxyIfNecessary(DefaultLoadEventListener.java:232)
    at org.hibernate.event.def.DefaultLoadEventListener.proxyOrLoad(DefaultLoadEventListener.java:173)
    at org.hibernate.event.def.DefaultLoadEventListener.onLoad(DefaultLoadEventListener.java:87)
    at org.hibernate.impl.SessionImpl.fireLoad(SessionImpl.java:862)
    at org.hibernate.impl.SessionImpl.load(SessionImpl.java:781)
    at org.hibernate.impl.SessionImpl.load(SessionImpl.java:774)
    at org.springframework.orm.hibernate3.HibernateTemplate$3.doInHibernate(HibernateTemplate.java:508)
    at org.springframework.orm.hibernate3.HibernateTemplate.execute(HibernateTemplate.java:372)
    at org.springframework.orm.hibernate3.HibernateTemplate.load(HibernateTemplate.java:502)
    at org.springframework.orm.hibernate3.HibernateTemplate.load(HibernateTemplate.java:496)
    at com.shinythink.DbOperration.PackagMethod.querys(PackagMethod.java:53)
    at com.shinythink.Dao.Impl.UserListDaoImpl.loadUserListWhereId(UserListDaoImpl.java:31)
    at com.shinythink.Service.Impl.UserListServiceImpl.updatePassWord(UserListServiceImpl.java:51)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:304)
    at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:182)
    at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:149)
    at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:106)
    at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:171)
    at org.springframework.aop.interceptor.ExposeInvocationInterceptor.invoke(ExposeInvocationInterceptor.java:89)
    at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:171)
    at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:204)
    at $Proxy1.updatePassWord(Unknown Source)
    at com.shinythink.Action.UserListAction.UpdatePassWord(UserListAction.java:77)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.struts.actions.DispatchAction.dispatchMethod(DispatchAction.java:270)
    at org.apache.struts.actions.DispatchAction.execute(DispatchAction.java:187)
    at org.springframework.web.struts.DelegatingActionProxy.execute(DelegatingActionProxy.java:110)
    at org.apache.struts.action.RequestProcessor.processActionPerform(RequestProcessor.java:431)
    at org.apache.struts.action.RequestProcessor.process(RequestProcessor.java:236)
    at org.apache.struts.action.ActionServlet.process(ActionServlet.java:1196)
    at org.apache.struts.action.ActionServlet.doPost(ActionServlet.java:432)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:637)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:717)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:290)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
    at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:233)
    at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:191)
    at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:127)
    at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:102)
    at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:109)
    at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:298)
    at org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:859)
    at org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler.process(Http11Protocol.java:588)
    at org.apache.tomcat.util.net.JIoEndpoint$Worker.run(JIoEndpoint.java:489)
    at java.lang.Thread.run(Thread.java:662)
[/code]

Could someone take a look at what the problem is? Thanks.
Importing the jars and configuring the local environment to run hadoop from eclipse
------------ Importing the jars to run in eclipse: 1) Copy the hadoop folder under the unpacked hadoop distribution (E:\hadoop-2.6.4\share) to the root of drive E and rename it hadoopjars. 2) -------- Import the hadoop jars into the new project: right-click the project > properties > java build path > add library > user library ...
A simple hadoop example run from eclipse
Hadoop itself is written in Java, so Hadoop programs are developed in Java, and on Linux eclipse is the most convenient IDE. 1. Download: go to the official site http://eclipse.org/downloads/ and download the matching build; I used the eclipse-SDK-3.7.1-linux-gtk version. 2. Unpack: the download is usually a tar.gz file; run:
Running the Hadoop WordCount example from Eclipse
Hadoop development environment setup: Eclipse plugin configuration (part 2)
Running your first hadoop program from eclipse
Getting started with hadoop: install hadoop and eclipse on Ubuntu 14 and run the first wordcount project, recording the common problems met along the way while self-studying.
Running hadoop programs from eclipse
Follow hadoop-0.20.2/docs/quickstart.html. Note: ssh-copy-id -i ~/.ssh/id_rsa.pub localhost; my user name is fansxnet. Configure hadoop in pseudo-distributed mode; if the pages below open, the configuration succeeded. NameNode - http://localhost:50070/ JobTracker...
XML parsing throws a NullPointerException
Could the veterans please advise how to fix a NullPointerException during XML parsing? I have been looking for a whole day without finding a solution.
[img=https://img-bbs.csdn.net/upload/201601/07/1452155445_737406.png][/img]
The log says the NPE is on the highlighted (blue) line.
This is the parsing class:
package com.hande.publichealth.util;

import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

import org.xmlpull.v1.XmlPullParser;

import android.util.Log;
import android.util.Xml;

import com.hande.publichealth.global.Global;

/**
 * Parse the file
 *
 * @author Admin
 */
public class ParsingUtil {
    public static final String TAG = "ParsingUtil";
    /** Parse the public-health format.xml file **/
    public static List<List<Map<String, String>>> parseHealthConsult() {

        List<List<Map<String, String>>> list = null;
        File file = new File(Global.GGJK_XML_PATH);
        InputStream in = null;
        if (file.exists()) {
            try {
                in = new FileInputStream(file);
                XmlPullParser mXmlPullParser = Xml.newPullParser();
                mXmlPullParser.setInput(in, "UTF-8");
                int type = mXmlPullParser.getEventType();
                List<Map<String, String>> data = null;
                Map<String, String> map = null;
                while (XmlPullParser.END_DOCUMENT != type) {
                    String name = mXmlPullParser.getName(); // tag name
                    switch (type) {
                    case XmlPullParser.START_DOCUMENT:
                        list = new ArrayList<List<Map<String, String>>>();
                        break;
                    case XmlPullParser.START_TAG:
                        if ("arrayList".equals(name)) {
                            data = new ArrayList<Map<String, String>>();
                        }
                        if ("Item".equals(name)) {
                            map = new HashMap<String, String>();
                        }
                        if ("Title".equals(name)) {
                            map.put("title", mXmlPullParser.nextText());
                        }
                        if ("Source".equals(name)) {
                            map.put("source", mXmlPullParser.nextText());
                        }
                        if ("Number".equals(name)) {
                            map.put("number", mXmlPullParser.nextText());
                        }
                        if ("Time".equals(name)) {
                            map.put("time", mXmlPullParser.nextText());
                        }
                        if ("Imageurl".equals(name)) {
                            map.put("img", mXmlPullParser.nextText());
                        }
                        if ("Htmlurl".equals(name)) {
                            map.put(name, mXmlPullParser.nextText());
                        }
                        break;
                    case XmlPullParser.END_TAG:
                        if ("Item".equals(name)) {
                            data.add(map);
                            map = null;
                        }
                        if ("arrayList".equals(name)) {
                            list.add(data);
                            data = null;
                        }
                        break;
                    }
                    type = mXmlPullParser.next();
                }
            } catch (Exception e) {
                e.printStackTrace();
            } finally {
                if (in != null) {
                    try {
                        in.close();
                        in = null;
                    } catch (IOException e) {
                        e.printStackTrace();
                    }
                }
            }
        } else {
            Log.e(TAG, "公共健康==>format.xml文件丢失!");
        }
        return list;
    }
}

According to the printed log, the parser only ever executes the Log.e(TAG, "公共健康==>format.xml文件丢失!") line ("public health ==> format.xml file missing"). Some people suggested the node names might be misspelled, but I copied each node name over one by one and still get the NPE. Please advise, much appreciated.
Spring injection results in a NullPointerException
(The Spring bean configuration XML was stripped by the forum.)
public class LoginAction extends ActionSupport {
    private String username; // user name
    private String password; // password
    private UserService userService;
    public void setUserService(UserService userService) {
        this.userService = userService;
    }
    public String execute() throws Exception {
        if (userService.login(username, password)) { // check the login
            return "success"; // success string
        } else {
            return "fail";    // failure string
        }
    }
}
java.lang.NullPointerException
    com.gy.action.LoginAction.execute(LoginAction.java:32)
Part of the test code is omitted. It looks like a Spring injection problem: reading application.xml directly from the console injects fine, but once deployed to Tomcat, userService is null and it fails. Please advise!
NullPointerException when using ButterKnife
1. First, which version are you on? For versions before 8.0.1, just add compile 'com.jakewharton:butterknife:7.0.1' to build.gradle and you are done. But if you simply replace 7.0.1 with a version after 8.0.1, it still compiles, and you get a NullPointerException at runtime when the bound views are used. Fix: 1. in the project's outermost build.gradle add classpath 'com.nee...
CXF throws a NullPointerException
I have been developing with CXF lately, integrating hibernate and spring into one web project, but when executing HQL in a particular action, some HQL statements run fine while others fail with:
2011-4-22 17:22:27 org.apache.cxf.phase.PhaseInterceptorChain doDefaultLogging
WARNING: Application http://impl.cxfService.com/PortServiceImplService#http://cxfService.com/portMessage has thrown exception, unwinding now
org.apache.cxf.interceptor.Fault
Caused by: java.lang.NullPointerException
Very frustrating, and I am certain hibernateDAO is definitely not null! It is the same HQL statement: with parameter value 'A' it runs without error, but with 'B' it throws the error above, and the statement returns results fine when run directly against the database. Any advice would be appreciated!