Sqoop2: problem importing from MySQL into HDFS

老9 2018-07-09 02:57:44
Has anyone else run into this problem? I've spent a whole day on it and still can't find the cause. Waiting online for help.

My job configuration (a sketch of the equivalent sqoop2-shell commands follows the listing):
Job with name sqoop2 (Enabled: true, Created by sqoop2 at 7/9/18 10:40 AM, Updated by sqoop2 at 7/9/18 2:04 PM)
Throttling resources
Extractors: 2
Loaders: 2
Classpath configuration
Extra mapper jars:
From link: jdbc-link
Database source
Schema name: test
Table name: product
SQL statement:
Column names:
Partition column:
Partition column nullable:
Boundary query:
Incremental read
Check column:
Last value:
To link: hdfs-link
Target configuration
Override null value:
Null value:
File format: TEXT_FILE
Compression codec: NONE
Custom codec:
Output directory: hdfs://192.168.18.164:9000/test/sqoop2/test/product
Append mode:
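
For reference, a job like this is typically created and run from sqoop2-shell. A minimal sketch, assuming the two links above (jdbc-link, hdfs-link) already exist; the shell then prompts interactively for the From/To values shown in the listing:

sqoop:000> create job -f "jdbc-link" -t "hdfs-link"
sqoop:000> start job -name sqoop2
sqoop:000> status job -name sqoop2

The job starts, but the loader (reduce) side then fails with the log below: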




2018-07-09 12:06:20,968 ERROR [IPC Server handler 17 on 38532] org.apache.hadoop.mapred.TaskAttemptListenerImpl: Task: attempt_1530862456517_0016_r_000000_3 - exited : org.apache.sqoop.common.SqoopException: MAPRED_EXEC_0018:Error occurs during loader run
at org.apache.sqoop.job.mr.SqoopOutputFormatLoadExecutor$ConsumerThread.run(SqoopOutputFormatLoadExecutor.java:292)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.sqoop.common.SqoopException: GENERIC_HDFS_CONNECTOR_0005:Error occurs during loader run
at org.apache.sqoop.connector.hdfs.HdfsLoader$1.run(HdfsLoader.java:115)
at org.apache.sqoop.connector.hdfs.HdfsLoader$1.run(HdfsLoader.java:60)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1686)
at org.apache.sqoop.connector.hdfs.HdfsLoader.load(HdfsLoader.java:60)
at org.apache.sqoop.connector.hdfs.HdfsLoader.load(HdfsLoader.java:45)
at org.apache.sqoop.job.mr.SqoopOutputFormatLoadExecutor$ConsumerThread$1.call(SqoopOutputFormatLoadExecutor.java:279)
at org.apache.sqoop.job.mr.SqoopOutputFormatLoadExecutor$ConsumerThread$1.call(SqoopOutputFormatLoadExecutor.java:260)
at org.apache.sqoop.utils.ClassUtils.executeWithClassLoader(ClassUtils.java:281)
at org.apache.sqoop.job.mr.SqoopOutputFormatLoadExecutor$ConsumerThread.run(SqoopOutputFormatLoadExecutor.java:259)
... 5 more
Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.authorize.AuthorizationException): User: hadoop is not allowed to impersonate sqoop2
at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1491)
at org.apache.hadoop.ipc.Client.call(Client.java:1437)
at org.apache.hadoop.ipc.Client.call(Client.java:1347)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:228)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:116)
at com.sun.proxy.$Proxy17.create(Unknown Source)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.create(ClientNamenodeProtocolTranslatorPB.java:350)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:422)
at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:165)
at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandler.java:157)
at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:95)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:359)
at com.sun.proxy.$Proxy18.create(Unknown Source)
at org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:273)
at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1170)
at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1149)
at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1087)
at org.apache.hadoop.hdfs.DistributedFileSystem$8.doCall(DistributedFileSystem.java:463)
at org.apache.hadoop.hdfs.DistributedFileSystem$8.doCall(DistributedFileSystem.java:460)
at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:474)
at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:401)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:1103)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:1083)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:972)
at org.apache.sqoop.connector.hdfs.hdfsWriter.HdfsTextWriter.initialize(HdfsTextWriter.java:41)
at org.apache.sqoop.connector.hdfs.HdfsLoader$1.run(HdfsLoader.java:93)
... 15 more
3 replies (newest first)
老9 2018-07-09
Quoting reply #2 from sunshingheavy:
Post your JDBC link config and we'll take a look.
Solved it. It was a user-permission problem: Hadoop and Sqoop2 run as different users, so the reduce task had no permission to write the output files.
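
For anyone else hitting the same "User: hadoop is not allowed to impersonate sqoop2" error: the stack trace shows the HDFS write being done as the Sqoop login user (sqoop2) via Hadoop's proxy-user mechanism, so the OS user running the Sqoop2 server (hadoop here) must be allowed to impersonate it. A minimal sketch of the usual core-site.xml fix, assuming the server runs as the hadoop user; adjust the user name to your setup and restart HDFS afterwards:

<!-- core-site.xml: let the "hadoop" superuser (running the Sqoop2 server)
     impersonate other users such as "sqoop2" -->
<property>
  <name>hadoop.proxyuser.hadoop.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.hadoop.groups</name>
  <value>*</value>
</property>

The wildcard values are only to get the import working; restricting hosts and groups to specific values is safer in production.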
sunshingheavy 2018-07-09
Post your JDBC link config and we'll take a look.
老9 2018-07-09
Still waiting online. Has anyone run into this before? The Sqoop version is 1.99.7.
