I followed the steps in the Neo4j Spark Connector quickstart (https://neo4j.com/developer/spark/quickstart/) and launched the shell with the following package:
$SPARK_HOME/bin/spark-shell --packages neo4j-contrib:neo4j-connector-apache-spark_2.12:4.0.1_for_spark_3
Now, in the Spark shell, when I try to run the following (as shown on the same page):
import org.apache.spark.sql.{SaveMode, SparkSession}
val spark = SparkSession.builder().getOrCreate()
val df = spark.read.format("org.neo4j.spark.DataSource")
.option("url", "bolt://localhost:7687")
.option("authentication.basic.username", "neo4j")
.option("authentication.basic.password", "neo4j")
.option("labels", "Person")
.load()
I get the following error:
java.lang.NoClassDefFoundError: org/apache/spark/sql/sources/v2/ReadSupport
How can I fix this?
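For context (this is my addition, not part of the original question): `org.apache.spark.sql.sources.v2.ReadSupport` belongs to the Spark 2.4 DataSource V2 API and no longer exists in Spark 3.x, so this error typically means the connector build on the classpath was compiled against a different Spark major version than the one actually running. A sketch of the check, where the `spark_version` value and the `_for_spark_2.4` coordinate suffix are assumptions to verify against `$SPARK_HOME/bin/spark-submit --version` and the connector's release page:

```shell
# Substitute the real version reported by: $SPARK_HOME/bin/spark-shell --version
spark_version="3.1.2"

# Pick the connector build suffix matching the Spark major version.
# Mixing, e.g., a "_for_spark_3" jar with a Spark 2.4 runtime (or the
# reverse) is a common cause of NoClassDefFoundError on DataSource V2
# classes such as org/apache/spark/sql/sources/v2/ReadSupport.
case "${spark_version%%.*}" in
  3) suffix="for_spark_3"   ;;
  2) suffix="for_spark_2.4" ;;  # assumed coordinate; confirm it exists for your connector version
  *) suffix="UNKNOWN"       ;;
esac

echo "neo4j-contrib:neo4j-connector-apache-spark_2.12:4.0.1_${suffix}"
```

If the printed coordinate differs from the one passed to `--packages`, re-launch the shell with the matching build.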