To install the PostgreSQL JDBC driver for Spark, I did two things:
- Copied the driver jar into the `$SPARK_HOME/jars` directory:
```
$ ll $SPARK_HOME/jars/post*
-rw-r--r-- 1 stephenboesch staff 1046274 Nov 10 14:44 /usr/local/Cellar/apache-spark/3.3.1/libexec/jars/postgresql-42.5.0.jar
```
- Added the jar to the `SparkConf`:
```python
from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .appName(appName)
         .config('spark.jars', '/usr/local/Cellar/apache-spark/3.3.1/libexec/jars/postgresql-42.5.0.jar')
         .master(master)
         .enableHiveSupport()
         .getOrCreate())
```
But when I run the pyspark script, the driver is not found:

```
Caused by: java.sql.SQLException: No suitable driver found for jdbc:postgresql://localhost:5432/hive
```
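For context, when this error comes from an application-level JDBC read (rather than from session startup), a common remedy is to name the driver class explicitly. A minimal sketch, where the table name and credentials are hypothetical and not from the question:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName('jdbc-demo').getOrCreate()  # hypothetical app name

# Naming the driver class explicitly avoids relying on java.sql.DriverManager
# to auto-discover org.postgresql.Driver from the classpath.
df = (spark.read.format('jdbc')
      .option('url', 'jdbc:postgresql://localhost:5432/hive')
      .option('dbtable', 'example_table')   # hypothetical table name
      .option('user', 'postgres')           # hypothetical credentials
      .option('password', 'secret')
      .option('driver', 'org.postgresql.Driver')
      .load())
```

Here, though, the stack trace points at the session itself, as the answer below explains.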
1 Answer
Apparently, `enableHiveSupport()` was killing it. After commenting it out, everything now works.
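A minimal sketch of the working builder with that call commented out (the app name and master values below are placeholders, not from the question; the jar path is the one from the question):

```python
from pyspark.sql import SparkSession

app_name = 'postgres-demo'  # placeholder value
master = 'local[*]'         # placeholder value

spark = (SparkSession.builder
         .appName(app_name)
         .config('spark.jars', '/usr/local/Cellar/apache-spark/3.3.1/libexec/jars/postgresql-42.5.0.jar')
         .master(master)
         # .enableHiveSupport()  # commented out: this call triggered the failing
         #                       # jdbc:postgresql://localhost:5432/hive connection
         .getOrCreate())
```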