How to pass a PySpark string argument containing spaces in spark-on-k8s-operator

uz75evzq · posted 2021-07-12 in Spark

One of the PySpark arguments is a SQL query, i.e. a string containing spaces. I have tried passing it both as \"select * from table\" and as "select * from table", but it never arrives as a single string: the bash entrypoint word-splits it, which corrupts the SQL.
Example: the query above ends up as \"select' folder1 file1.zip from 'table\". Driver log:

PYSPARK_ARGS=
+ '[' -n 'process  --query \"select * from table\"' ']'
+ PYSPARK_ARGS='process --query \"select * from table\"'
+ R_ARGS=
+ '[' -n '' ']'
+ '[' 3 == 2 ']'
+ '[' 3 == 3 ']'
++ python3 -V
+ pyv3='Python 3.7.3'
+ export PYTHON_VERSION=3.7.3
+ PYTHON_VERSION=3.7.3
+ export PYSPARK_PYTHON=python3
+ PYSPARK_PYTHON=python3
+ export PYSPARK_DRIVER_PYTHON=python3
+ PYSPARK_DRIVER_PYTHON=python3
+ case "$SPARK_K8S_CMD" in
+ CMD=("$SPARK_HOME/bin/spark-submit" --conf "spark.driver.bindAddress=$SPARK_DRIVER_BIND_ADDRESS" --deploy-mode client "$@" $PYSPARK_PRIMARY $PYSPARK_ARGS)
+ exec /usr/bin/tini -s -- /opt/spark/bin/spark-submit --conf spark.driver.bindAddress=xx.xx.xx.xx --deploy-mode client --class org.apache.spark.deploy.PythonRunner file:/usr/local/bin/process_sql.py process 
--query '\"select' folder1 file1.zip from 'table\"'
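The trace explains the corruption: the entrypoint expands $PYSPARK_ARGS unquoted inside the CMD array, so the shell word-splits the value and then glob-expands the bare *, pulling in whatever files sit in the working directory (here apparently folder1 and file1.zip). A minimal reproduction, assuming a directory containing those two files:

```shell
# Reproduce the mangling caused by an unquoted variable expansion.
cd "$(mktemp -d)"
touch folder1 file1.zip   # illustrative files standing in for the workdir contents

PYSPARK_ARGS='--query "select * from table"'

# This mimics the entrypoint's unquoted $PYSPARK_ARGS: the value is
# word-split on whitespace, then the bare '*' is glob-expanded.
set -- $PYSPARK_ARGS
printf '<%s>\n' "$@"
```

The query is received as six separate arguments, with the * replaced by the matching filenames, which is exactly the breakage visible in the driver log.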

Is there a way to safely pass a string argument that contains spaces, single quotes, or double quotes?
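One workaround (my suggestion, not something from the operator docs) is to keep whitespace out of the argument entirely: base64-encode the query when building the SparkApplication's arguments, and decode it at the top of the driver script. The encoded token contains no spaces, quotes, or glob characters, so the unquoted expansion passes it through intact. A sketch, with illustrative helper names:

```python
import base64

# Submitting side: encode the query before placing it into the
# SparkApplication's arguments list (helper name is illustrative).
def encode_arg(text: str) -> str:
    return base64.urlsafe_b64encode(text.encode("utf-8")).decode("ascii")

# Driver side (e.g. at the top of process_sql.py): decode it back.
def decode_arg(token: str) -> str:
    return base64.urlsafe_b64decode(token.encode("ascii")).decode("utf-8")

query = "select * from table"
token = encode_arg(query)
assert " " not in token and "*" not in token  # safe under unquoted expansion
assert decode_arg(token) == query
```

The urlsafe base64 alphabet (letters, digits, `-`, `_`, `=`) contains nothing the shell word-splits or glob-expands, so the round trip survives the entrypoint unchanged.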
