Hive on Spark query has run for 60 hours with no result and no error
[root@master opt]# hive
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/hive2.1.1/lib/log4j-slf4j-impl-2.4.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/hive2.1.1/lib/spark-assembly-1.6.3-hadoop2.4.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/hive2.1.1/lib/spark-examples-1.6.3-hadoop2.4.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Logging initialized using configuration in file:/opt/hive2.1.1/conf/hive-log4j2.properties Async: true
hive> select * from t1;
OK
1 wangming
2 wangfang
3 songyun
4 xiaoling
5 huiming
6 sunlu
Time taken: 3.758 seconds, Fetched: 6 row(s)
hive> select count(*) from t1;
Query ID = root_20171113114006_fc8ac12a-563c-4770-8f39-8e72f15b209b
Total jobs = 1
Launching Job 1 out of 1
In order to change the average load for a reducer (in bytes):
set hive.exec.reducers.bytes.per.reducer=<number>
In order to limit the maximum number of reducers:
set hive.exec.reducers.max=<number>
In order to set a constant number of reducers:
set mapreduce.job.reduces=<number>
Starting Spark Job = ad0e3f12-503b-4d2e-bc95-f89b54b44b76
Query Hive on Spark job[0] stages:
0
1
Status: Running (Hive on Spark job[0])
Job Progress Format
CurrentTime StageId_StageAttemptId: SucceededTasksCount(+RunningTasksCount-FailedTasksCount)/TotalTasksCount [StageCost]
2017-11-13 11:40:33,290 Stage-0_0: 0/1 Stage-1_0: 0/1
2017-11-13 11:40:36,355 Stage-0_0: 0/1 Stage-1_0: 0/1
2017-11-13 11:40:39,407 Stage-0_0: 0/1 Stage-1_0: 0/1
2017-11-13 11:40:42,865 Stage-0_0: 0/1 Stage-1_0: 0/1
2017-11-13 11:40:45,953 Stage-0_0: 0/1 Stage-1_0: 0/1
2017-11-13 11:40:48,995 Stage-0_0: 0/1 Stage-1_0: 0/1
2017-11-13 11:40:52,043 Stage-0_0: 0/1 Stage-1_0: 0/1
2017-11-13 11:40:55,097 Stage-0_0: 0/1 Stage-1_0: 0/1
2017-11-13 11:40:58,142 Stage-0_0: 0/1 Stage-1_0: 0/1
......
The query has been running for 60 hours with no result and no error reported.
Version info:
hadoop 2.7.1
hive 2.1.1
spark 1.6.3
The Spark master log shows no errors either:
17/11/13 10:44:38 INFO master.Master: Received unregister request from application app-20171110193202-0001
17/11/13 10:44:38 INFO master.Master: Removing app app-20171110193202-0001
17/11/13 10:44:38 INFO master.Master: master:40485 got disassociated, removing it.
17/11/13 10:44:38 INFO master.Master: 192.168.50.130:35575 got disassociated, removing it.
17/11/13 10:44:38 INFO spark.SecurityManager: Changing view acls to: root
17/11/13 10:44:38 INFO spark.SecurityManager: Changing modify acls to: root
17/11/13 10:44:38 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); users with modify permissions: Set(root)
17/11/13 11:14:36 INFO master.Master: Registering app Hive on Spark
17/11/13 11:14:36 INFO master.Master: Registered app Hive on Spark with ID app-20171113111436-0002
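For reference, note that the master log ends right after the app registers: every stage above stays at 0/1, which is the pattern seen when the Hive on Spark application registers with the master but never gets any executors allocated (so tasks are never scheduled, and nothing fails). A minimal, illustrative hive-site.xml fragment of the executor-related settings one might double-check, assuming Spark standalone mode; the property names are standard Hive on Spark settings, but the values here are placeholders, not a known fix:

```xml
<!-- Illustrative fragment only; values are examples to check against
     the cluster's actual worker resources, not a verified fix. -->
<property>
  <name>spark.master</name>
  <value>spark://master:7077</value>
</property>
<property>
  <!-- Must not exceed the memory a single worker can offer,
       or no executor will ever be launched. -->
  <name>spark.executor.memory</name>
  <value>1g</value>
</property>
<property>
  <name>spark.executor.cores</name>
  <value>1</value>
</property>
```

If the requested executor memory or cores exceed what any worker advertises, the app stays registered but idle indefinitely, matching the symptoms above; the Spark master web UI (port 8080 by default in standalone mode) shows whether the app has zero executors.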