Spark job throws an error during analysis — has anyone run into this?

MS_MOKAI 2025-07-02 17:03:08
<parent>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-parent</artifactId>
    <version>3.4.1</version>
    <relativePath/>
</parent>

<java.version>17</java.version>
<scala.binary.version>2.13</scala.binary.version>
<scala.version>2.13.12</scala.version>
<fastjson.version>2.0.53</fastjson.version>
<cassandra.version>3.4.1</cassandra.version>
<!-- Core component versions -->
<spark.version>4.0.0</spark.version>
<jackson.version>2.16.1</jackson.version>
<dependency>
    <groupId>com.datastax.spark</groupId>
    <artifactId>spark-cassandra-connector_${scala.binary.version}</artifactId>
    <version>3.5.1</version>
</dependency>
<!-- Spark core dependencies -->
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_${scala.binary.version}</artifactId>
    <version>${spark.version}</version>
    <exclusions>
        <exclusion>
            <groupId>com.fasterxml.jackson.core</groupId>
            <artifactId>jackson-databind</artifactId>
        </exclusion>
    </exclusions>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_${scala.binary.version}</artifactId>
    <version>${spark.version}</version>
</dependency>

<dependency>
    <groupId>com.fasterxml.jackson.module</groupId>
    <artifactId>jackson-module-scala_${scala.binary.version}</artifactId>
    <version>${jackson.version}</version>
</dependency>
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-data-cassandra</artifactId>
    <version>${cassandra.version}</version>
    <exclusions>
        <exclusion>
            <groupId>com.datastax.oss</groupId>
            <artifactId>java-driver-core</artifactId>
        </exclusion>
    </exclusions>
</dependency>

The pom.xml dependencies are shown above.
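One thing worth noting about the pom: spark-cassandra-connector follows Spark's own version line, so 3.5.1 is built against Spark 3.5.x, while spark.version here is 4.0.0. A hedged sketch of aligning the two (assuming staying on the Spark 3.5 line is acceptable, rather than waiting for a connector release that targets Spark 4.x):

```xml
<!-- Sketch: pin the Spark artifacts to the connector's Spark line.
     3.5.1 is illustrative; any 3.5.x release should match
     spark-cassandra-connector 3.5.1. -->
<spark.version>3.5.1</spark.version>
```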
Error: java.lang.ClassCastException: cannot assign instance of scala.collection.generic.DefaultSerializationProxy to field org.apache.spark.sql.execution.datasources.v2.DataSourceRDDPartition.inputPartitions of type scala.collection.immutable.Seq in instance of org.apache.spark.sql.execution.datasources.v2.DataSourceRDDPartition
    at java.base/java.io.ObjectStreamClass$FieldReflector.setObjFieldValues(ObjectStreamClass.java:2227)
    at java.base/java.io.ObjectStreamClass$FieldReflector.checkObjectFieldValueTypes(ObjectStreamClass.java:2191)
    at java.base/java.io.ObjectStreamClass.checkObjFieldValueTypes(ObjectStreamClass.java:1478)
    at java.base/java.io.ObjectInputStream$FieldValues.defaultCheckFieldValues(ObjectInputStream.java:2657)
    at java.base/java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2471)
    at java.base/java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2242)
    at java.base/java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1742)
    at java.base/java.io.ObjectInputStream$FieldValues.<init>(ObjectInputStream.java:2584)
    at java.base/java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2442)
    at java.base/java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2242)
    at java.base/java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1742)
    at java.base/java.io.ObjectInputStream.readObject(ObjectInputStream.java:514)
    at java.base/java.io.ObjectInputStream.readObject(ObjectInputStream.java:472)
    at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:88)
    at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:136)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:602)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
    at java.base/java.lang.Thread.run(Thread.java:833)
 [task-result-getter-0]
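For context, this kind of DefaultSerializationProxy cast failure usually indicates a Scala/Spark binary mismatch between the driver classpath and the executors: DefaultSerializationProxy belongs to Scala 2.13's collection serialization, so the error typically appears when jars built for Scala 2.12 and 2.13 are mixed, or when the cluster runs a different Spark build than the application bundles. Two hedged checks, assuming a Maven build and spark-submit on the PATH:

```shell
# List every Scala-version-suffixed artifact the build resolves;
# all suffixes should agree (_2.13 here) and the Spark artifacts
# should all carry the same version.
mvn -q dependency:tree -Dincludes='*:*_2.12,*:*_2.13'

# Compare against the cluster itself: spark-submit --version prints
# both the Spark version and the Scala version Spark was built with.
spark-submit --version
```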