New to Spark, I set up the environment with JDK 17, Scala 2.13, and spark-core 3.3.2. A test run to verify the connection to the Spark framework failed with:

class org.apache.spark.storage.StorageUtils$ (in unnamed module @0x16eb3ea3) cannot access class sun.nio.ch.DirectBuffer (in module java.base) because module java.base does not export sun.nio.ch to unnamed module @0x16eb3ea3

The root cause is the JDK version: starting with JDK 16, the Java module system strongly encapsulates internal packages, so java.base no longer exports sun.nio.ch to code on the classpath, which Spark's StorageUtils relies on. After switching the JDK to 11, everything ran normally.
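If staying on JDK 17 is preferred, a commonly cited workaround (an assumption based on the JVM flags newer Spark releases set for themselves, not something tested in this post) is to re-export the internal package to unnamed modules via a VM option in the run configuration:

```shell
# Hypothetical run command: export sun.nio.ch to classpath code on JDK 17.
# <classpath> and MainApp are placeholders for your own project settings.
java --add-exports java.base/sun.nio.ch=ALL-UNNAMED -cp <classpath> MainApp
```

In an IDE, the same `--add-exports java.base/sun.nio.ch=ALL-UNNAMED` line can be added to the VM options of the run configuration. Downgrading to JDK 11, as done above, avoids the issue entirely.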
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.12</artifactId>
    <version>3.3.2</version>
</dependency>

Note that the artifactId suffix must match the project's Scala binary version: spark-core_2.12 is built for Scala 2.12, so a project on Scala 2.13 should use spark-core_2.13 instead.
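For reference, the kind of connectivity smoke test described above can be sketched like this (names are illustrative; it assumes the spark-core dependency declared above, with a matching Scala version, is on the classpath):

```scala
import org.apache.spark.{SparkConf, SparkContext}

object SparkSmokeTest {
  def main(args: Array[String]): Unit = {
    // local[*] runs Spark inside the current JVM, so no cluster is needed
    val conf = new SparkConf().setAppName("smoke-test").setMaster("local[*]")
    val sc   = new SparkContext(conf)
    try {
      // A trivial job: if this sum prints, the Spark setup works
      val sum = sc.parallelize(1 to 100).sum()
      println(s"sum = $sum")
    } finally {
      sc.stop()
    }
  }
}
```

Running this on JDK 17 without the workaround reproduces the sun.nio.ch.DirectBuffer error; on JDK 11 it completes and prints the sum.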