Help: problem installing Spark on Hadoop
I have already installed Hadoop 2.9 and downloaded Spark 2.2.0, and I configured spark-env.sh, but I hit an error on the next step. I'm a Hadoop newbie :)
Running the command bin/spark-submit examples/src/main/python/pi.py produces the following error:
Exception in thread "main" java.lang.UnsupportedClassVersionError: org/apache/spark/launcher/Main : Unsupported major.minor version 52.0
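For context on what this error means: a class-file major version of 52 corresponds to Java 8 (the JDK release is the class-file major number minus 44, so 51 = Java 7, 50 = Java 6). Spark 2.2.0 ships class files compiled for Java 8, so this error typically means the JVM running spark-submit is Java 7 or older. A minimal sketch of how to check and fix this (the JAVA_HOME path below is an example, not from the post, adjust it to your own JDK 8 install):

```shell
# Class-file major version -> JDK release: subtract 44.
# 52 - 44 = 8, so the Spark jars need at least Java 8.
major=52
echo "needs Java $((major - 44)) or newer"

# Check which Java the shell (and therefore spark-submit) picks up:
# java -version
# If it reports "1.7" or older, point Spark at a Java 8 JDK in
# conf/spark-env.sh (example path, verify on your machine):
# export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
```

After setting JAVA_HOME in spark-env.sh (or in your shell profile), re-run the same spark-submit command.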