Sqoop import of MySQL data into Hive -- job fails with an error

dream0352 2017-05-27 05:55:03
The table has a few million rows. How can I fix this? Increasing the number of MR map tasks doesn't help either.

Error: GC overhead limit exceeded
17/05/27 15:47:25 INFO mapreduce.Job: Task Id : attempt_149555519_0057_m_000000_1, Status : FAILED
Error: GC overhead limit exceeded
17/05/27 15:48:16 INFO mapreduce.Job: Task Id : attempt_149555519_0057_m_000000_2, Status : FAILED
Error: GC overhead limit exceeded
17/05/27 15:49:10 INFO mapreduce.Job: map 100% reduce 0%
17/05/27 15:49:10 INFO mapreduce.Job: Job job_149555519_0057 failed with state FAILED due to: Task failed task_149555519_0057_m_000000
Job failed as tasks failed. failedMaps:1 failedReduces:0

17/05/27 15:49:10 INFO mapreduce.Job: Counters: 11
	Job Counters
		Failed map tasks=4
		Launched map tasks=4
		Other local map tasks=4
		Total time spent by all maps in occupied slots (ms)=197894
		Total time spent by all reduces in occupied slots (ms)=0
		Total time spent by all map tasks (ms)=197894
		Total vcore-seconds taken by all map tasks=197894
		Total megabyte-seconds taken by all map tasks=202643456
	Map-Reduce Framework
		CPU time spent (ms)=0
		Physical memory (bytes) snapshot=0
		Virtual memory (bytes) snapshot=0
17/05/27 15:49:10 WARN mapreduce.Counters: Group FileSystemCounters is deprecated. Use org.apache.hadoop.mapreduce.FileSystemCounter instead
17/05/27 15:49:10 INFO mapreduce.ImportJobBase: Transferred 0 bytes in 214.2624 seconds (0 bytes/sec)
17/05/27 15:49:10 INFO mapreduce.ImportJobBase: Retrieved 0 records.
17/05/27 15:49:10 ERROR tool.ImportTool: Error during import: Import job failed
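
A frequent cause of "GC overhead limit exceeded" in Sqoop map tasks is that the MySQL Connector/J driver reads the entire query result into the mapper's heap before any rows are written out; with a few million rows this exhausts the default heap no matter how many mappers are added. Below is a minimal sketch of one common remedy: raise the map-task heap and switch the driver to cursor-based fetching so rows stream in batches. The host, database, user, table name, and memory values are illustrative placeholders, not details taken from this thread.

# Sketch only: stream rows in batches and give each map task more heap.
# dbhost, mydb, myuser, and mytable are hypothetical placeholders.
sqoop import \
  -Dmapreduce.map.memory.mb=4096 \
  -Dmapreduce.map.java.opts=-Xmx3276m \
  --connect 'jdbc:mysql://dbhost:3306/mydb?useCursorFetch=true' \
  --username myuser \
  -P \
  --table mytable \
  --hive-import \
  --fetch-size 10000 \
  -m 4

useCursorFetch=true tells Connector/J to use a server-side cursor, so --fetch-size 10000 actually bounds how many rows sit in memory at once (without it, MySQL ignores positive fetch sizes and buffers the whole result set). With -m 4, Sqoop splits the query on the table's primary key; if the table has none, add --split-by with a suitably uniform column.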