PostgreSQL driver for the Hive metastore not found in Spark

guz6ccqo · posted 2022-11-16 in Apache

To install the PostgreSQL driver used by Spark, I did two things:

  • Copied the driver jar to the $SPARK_HOME/jars directory:
$ ll $SPARK_HOME/jars/post*
-rw-r--r--  1 stephenboesch  staff  1046274 Nov 10 14:44 /usr/local/Cellar/apache-spark/3.3.1/libexec/jars/postgresql-42.5.0.jar
  • Added the jar to the SparkConf (an equivalent spark-submit invocation is sketched after this list):
from pyspark.sql import SparkSession  # import needed to run this snippet

spark = (SparkSession.builder
    .appName(appName)
    .config('spark.jars','/usr/local/Cellar/apache-spark/3.3.1/libexec/jars/postgresql-42.5.0.jar')
    .master(master)
    .enableHiveSupport()
    .getOrCreate())
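For reference, the same jar can also be supplied at launch time with spark-submit; --driver-class-path places it on the driver JVM's classpath at startup, which is where java.sql.DriverManager looks for drivers. The script name below is a placeholder:

$ spark-submit \
    --jars /usr/local/Cellar/apache-spark/3.3.1/libexec/jars/postgresql-42.5.0.jar \
    --driver-class-path /usr/local/Cellar/apache-spark/3.3.1/libexec/jars/postgresql-42.5.0.jar \
    my_script.py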

But when running the pyspark script, the driver is not found:

Caused by: java.sql.SQLException: No suitable driver found for jdbc:postgresql://localhost:5432/hive
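One way to narrow this down is to print the driver JVM's launch classpath; jars passed only through spark.jars are attached after the JVM starts and generally do not appear there. This relies on py4j's private _jvm gateway, so treat it as a diagnostic sketch rather than a supported API:

# Diagnostic sketch: show the classpath the driver JVM was launched with
print(spark.sparkContext._jvm.java.lang.System.getProperty('java.class.path'))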

nvbavucw1#

Apparently enableHiveSupport() kills it(?). After commenting it out, everything now works:

from pyspark.sql import SparkSession

spark = (SparkSession.builder
    .appName(appName)
    .config('spark.jars','/usr/local/Cellar/apache-spark/3.3.1/libexec/jars/postgresql-42.5.0.jar')
    .master("local")
    # .enableHiveSupport()   #  Not sure why enabling this loses the driver..
    .getOrCreate())
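A plausible explanation for the behavior above: "No suitable driver found" comes from java.sql.DriverManager, which only sees drivers visible to the classloader that launched the JVM. Jars supplied via spark.jars are attached after the driver JVM is already running, and the Hive metastore connection that enableHiveSupport() sets in motion goes through DriverManager, so it can miss them. Below is a minimal sketch that keeps Hive support, assuming the same jar path (the app name is a placeholder); note that when launching with spark-submit, spark.driver.extraClassPath must instead be set on the command line or in spark-defaults.conf, since it has to be in place before the JVM starts:

from pyspark.sql import SparkSession

jar = '/usr/local/Cellar/apache-spark/3.3.1/libexec/jars/postgresql-42.5.0.jar'

spark = (SparkSession.builder
    .appName('metastore-demo')   # placeholder app name
    .master('local')
    # Put the driver on the driver JVM's launch classpath so that
    # java.sql.DriverManager (used when connecting to the metastore)
    # can see it; spark.jars alone is applied after JVM startup.
    .config('spark.driver.extraClassPath', jar)
    .config('spark.jars', jar)   # still ships the jar to executors
    .enableHiveSupport()
    .getOrCreate())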
